Join the Center for Applied Scientific Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamblin, Todd; Bremer, Timo; Van Essen, Brian
The Center for Applied Scientific Computing serves as Livermore Lab’s window to the broader computer science, computational physics, applied mathematics, and data science research communities. In collaboration with academic, industrial, and other government laboratory partners, we conduct world-class scientific research and development on problems critical to national security. CASC applies the power of high-performance computing and the efficiency of modern computational methods to the realms of stockpile stewardship, cyber and energy security, and knowledge discovery for intelligence applications.
Bethel, E. Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division and Scientific Visualization Group]
2018-05-07
Summer Lecture Series 2008: Scientific visualization transforms abstract data into readily comprehensible images, provides a vehicle for "seeing the unseeable," and plays a central role in both experimental and computational sciences. Wes Bethel, who heads the Scientific Visualization Group in the Computational Research Division, presents an overview of visualization and computer graphics, current research challenges, and future directions for the field.
Ontology-Driven Discovery of Scientific Computational Entities
ERIC Educational Resources Information Center
Brazier, Pearl W.
2010-01-01
Many geoscientists use modern computational resources, such as software applications, Web services, scientific workflows and datasets that are readily available on the Internet, to support their research and many common tasks. These resources are often shared via human contact and sometimes stored in data portals; however, they are not necessarily…
Optimizing Engineering Tools Using Modern Ground Architectures
2017-12-01
Considerations," International Journal of Computer Science & Engineering Survey, vol. 5, no. 4, 2014. [10] R. Bell. (n.d.). A beginner's guide to big O notation...scientific community. Traditional computing architectures were not capable of processing the data efficiently, or in some cases, could not process the...thesis investigates how these modern computing architectures could be leveraged by industry and academia to improve the performance and capabilities of
Translations on Eastern Europe, Scientific Affairs, Number 542.
1977-04-18
transplanting human tissue has not as yet been given a final juridical approval like euthanasia, artificial insemination, abortion, birth control, and others...and data teleprocessing. This computer may also be used as a satellite computer for complex systems. The IZOT 310 has a large instruction...a well-known truth that modern science is using the most modern and leading technical facilities—from bathyscaphes to satellites, from gigantic
Through Kazan ASPERA to Modern Projects
NASA Astrophysics Data System (ADS)
Gusev, Alexander; Kitiashvili, Irina; Petrova, Natasha
The European Union has now formed the Sixth Framework Programme. One of the objectives of the EU Programme is to open up national research and training programmes. Russian PhD students and young astronomers face organizational and financial difficulties in accessing modern databases and astronomical projects, and so they have not been included in the European overview of priorities. Modern requirements for organizing observational projects on powerful telescopes assume painstaking scientific computer preparation of the application. Stiff competition for observing time requires preliminary computer modeling of the target object if an application is to succeed. Kazan AstroGeoPhysics Partnership
NASA Astrophysics Data System (ADS)
Bird, Robert; Nystrom, David; Albright, Brian
2017-10-01
The ability of scientific simulations to effectively deliver performant computation is increasingly being challenged by successive generations of high-performance computing architectures. Code development to support efficient computation on these modern architectures is both expensive and highly complex; if it is approached without due care, it may also not be directly transferable between subsequent hardware generations. Previous works have discussed techniques to support the process of adapting a legacy code for modern hardware generations, but despite breakthroughs in the areas of mini-app development, performance portability, and cache-oblivious algorithms the problem still remains largely unsolved. In this work we demonstrate how a focus on platform-agnostic modern code development can be applied to Particle-in-Cell (PIC) simulations to facilitate effective scientific delivery. This work builds directly on our previous work optimizing VPIC, in which we replaced intrinsic-based vectorisation with compiler-generated auto-vectorization to improve the performance and portability of VPIC. In this work we present the use of a specialized SIMD queue for processing some particle operations, and also preview a GPU-capable OpenMP variant of VPIC. Finally, we include lessons learned. Work performed under the auspices of the U.S. Dept. of Energy by Los Alamos National Security, LLC, Los Alamos National Laboratory under contract DE-AC52-06NA25396 and supported by the LANL LDRD program.
NASA Astrophysics Data System (ADS)
Kashansky, Vladislav V.; Kaftannikov, Igor L.
2018-02-01
Modern numerical modeling experiments and data analytics problems in various fields of science and technology reveal a wide variety of serious requirements for distributed computing systems. Many scientific computing projects sometimes exceed the available resource pool limits, requiring extra scalability and sustainability. In this paper we share our own experience and findings on combining the power of SLURM, BOINC and GlusterFS as a software system for scientific computing. In particular, we suggest a complete architecture and highlight important aspects of systems integration.
Environmental models are products of the computer architecture and software tools available at the time of development. Scientifically sound algorithms may persist in their original state even as system architectures and software development approaches evolve and progress. Dating...
NASA Astrophysics Data System (ADS)
Clementi, Enrico
2012-06-01
This is the introductory chapter to the AIP Proceedings volume "Theory and Applications of Computational Chemistry: The First Decade of the Second Millennium", where we discuss the evolution of "computational chemistry". Very early variational computational chemistry developments are reported in Sections 1 to 7, 11, and 12 by recalling some of the computational chemistry contributions by the author and his collaborators (from the late 1950s to the mid 1990s); perturbation techniques are not considered in this already extended work. Present-day computational chemistry is partly considered in Sections 8 to 10, where more recent studies by the author and his collaborators are discussed, including the Hartree-Fock-Heitler-London method; a more general discussion of present-day computational chemistry is presented in Section 14. The following chapters of this AIP volume provide a view of modern computational chemistry. Future computational chemistry developments can be extrapolated from the chapters of this AIP volume; further, Sections 13 and 15 present an overall analysis of computational chemistry, obtained from the Global Simulation approach, by considering the evolution of scientific knowledge confronted with the opportunities offered by modern computers.
NASA Astrophysics Data System (ADS)
Schulthess, Thomas C.
2013-03-01
The continued thousand-fold improvement in sustained application performance per decade on modern supercomputers keeps opening new opportunities for scientific simulations. But supercomputers have become very complex machines, built with thousands or tens of thousands of complex nodes consisting of multiple CPU cores or, most recently, a combination of CPU and GPU processors. Efficient simulations on such high-end computing systems require tailored algorithms that optimally map numerical methods to particular architectures. These intricacies will be illustrated with simulations of strongly correlated electron systems, where the development of quantum cluster methods, Monte Carlo techniques, as well as their optimal implementation by means of algorithms with improved data locality and high arithmetic density have gone hand in hand with evolving computer architectures. The present work would not have been possible without continued access to computing resources at the National Center for Computational Science of Oak Ridge National Laboratory, which is funded by the Facilities Division of the Office of Advanced Scientific Computing Research, and the Swiss National Supercomputing Center (CSCS) that is funded by ETH Zurich.
Astrobiology for the 21st Century
NASA Astrophysics Data System (ADS)
Oliveira, C.
2008-02-01
We live in a scientific world. Science is all around us. We take scientific principles for granted every time we use a piece of technological apparatus, such as a car, a computer, or a cellphone. In today's world, citizens frequently have to make decisions that require them to have some basic scientific knowledge. To be a contributing citizen in a modern democracy, a person needs to understand the general principles of science.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kevrekidis, Ioannis G.
The work explored the linking of modern, still-developing machine learning techniques (manifold learning, and in particular diffusion maps) with traditional PDE modeling/discretization/scientific computation techniques via the equation-free methodology developed by the PI. The result (in addition to several PhD degrees, two of them by CSGF Fellows) was a sequence of strong developments - in part on the algorithmic side, linking data mining with scientific computing, and in part on applications, ranging from PDE discretizations to molecular dynamics and complex network dynamics.
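As a generic illustration of the manifold-learning ingredient mentioned above, the sketch below computes diffusion-map coordinates for a point cloud with NumPy/SciPy. It is a standard textbook construction, not code from the project; the kernel bandwidth epsilon and the example data are assumptions for illustration.

    import numpy as np
    from scipy.spatial.distance import cdist
    from scipy.linalg import eigh

    # Minimal diffusion-map sketch (generic illustration, not the project's code):
    # embed samples into a few slow "diffusion coordinates".
    def diffusion_map(X, epsilon, n_coords=2):
        d2 = cdist(X, X, "sqeuclidean")
        K = np.exp(-d2 / epsilon)            # Gaussian affinity kernel
        q = K.sum(axis=1)
        K = K / np.outer(q, q)               # alpha = 1 normalization (removes density bias)
        d = K.sum(axis=1)
        A = K / np.sqrt(np.outer(d, d))      # symmetric conjugate of the Markov matrix
        vals, vecs = eigh(A)
        vals, vecs = vals[::-1], vecs[:, ::-1]
        psi = vecs / vecs[:, [0]]            # right eigenvectors of the Markov operator
        return psi[:, 1:n_coords + 1] * vals[1:n_coords + 1]

    # Example: points on a noisy circle collapse onto (roughly) one slow coordinate.
    t = np.linspace(0.0, 2.0 * np.pi, 300, endpoint=False)
    X = np.c_[np.cos(t), np.sin(t)] + 0.01 * np.random.randn(300, 2)
    print(diffusion_map(X, epsilon=0.1).shape)   # (300, 2)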
Green Day? An Old Mill City Leads a New Revolution in Massachusetts
ERIC Educational Resources Information Center
Brown, Robert A.
2012-01-01
The Northeast United States just experienced one of the region's worst natural disasters. Fortunately, because of the confluence of modern computing power and scientific computing methods, weather forecasting models predicted Sandy's very complicated trajectory and development with a precision that would not have been possible even a decade ago.…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Almgren, Ann; DeMar, Phil; Vetter, Jeffrey
The widespread use of computing in the American economy would not be possible without a thoughtful, exploratory research and development (R&D) community pushing the performance edge of operating systems, computer languages, and software libraries. These are the tools and building blocks — the hammers, chisels, bricks, and mortar — of the smartphone, the cloud, and the computing services on which we rely. Engineers and scientists need ever-more specialized computing tools to discover new material properties for manufacturing, make energy generation safer and more efficient, and provide insight into the fundamentals of the universe, for example. The research division of the U.S. Department of Energy's (DOE's) Office of Advanced Scientific Computing Research (ASCR Research) ensures that these tools and building blocks are being developed and honed to meet the extreme needs of modern science. See also http://exascaleage.org/ascr/ for additional information.
New Trends in E-Science: Machine Learning and Knowledge Discovery in Databases
NASA Astrophysics Data System (ADS)
Brescia, Massimo
2012-11-01
Data mining, or Knowledge Discovery in Databases (KDD), while being the main methodology for extracting the scientific information contained in Massive Data Sets (MDS), needs to tackle crucial problems, since it has to orchestrate the complex challenges posed by transparent access to different computing environments, scalability of algorithms, and reusability of resources. To achieve a leap forward for the progress of e-science in the data avalanche era, the community needs to implement an infrastructure capable of performing data access, processing and mining in a distributed but integrated context. The increasing complexity of modern technologies has produced a huge volume of data, whose warehouse management, together with the need to optimize analysis and mining procedures, has led to a change of concept in modern science. Classical data exploration, based on local, user-owned data storage and limited computing infrastructures, is no longer efficient in the case of MDS spread worldwide over inhomogeneous data centres and requiring teraflop processing power. In this context modern experimental and observational science requires a good understanding of computer science, network infrastructures, Data Mining, etc., i.e. of all those techniques which fall into the domain of so-called e-science (recently assessed also by the Fourth Paradigm of Science). Such understanding is almost completely absent in the older generations of scientists, and this is reflected in the inadequacy of most academic and research programs. A paradigm shift is needed: statistical pattern recognition, object-oriented programming, distributed computing, and parallel programming need to become an essential part of the scientific background. A possible practical solution is to provide the research community with easy-to-understand, easy-to-use tools, based on Web 2.0 technologies and Machine Learning methodology: tools where almost all the complexity is hidden from the final user, but which are still flexible and able to produce efficient and reliable scientific results. All these considerations will be described in detail in the chapter. Moreover, examples of modern applications offering a wide variety of e-science communities a large spectrum of computational facilities to exploit the wealth of available massive data sets and powerful machine learning and statistical algorithms will also be introduced.
Multidisciplinary Approaches in Evolutionary Linguistics
ERIC Educational Resources Information Center
Gong, Tao; Shuai, Lan; Wu, Yicheng
2013-01-01
Studying language evolution has become resurgent in modern scientific research. In this revival field, approaches from a number of disciplines other than linguistics, including (paleo)anthropology and archaeology, animal behaviors, genetics, neuroscience, computer simulation, and psychological experimentation, have been adopted, and a wide scope…
Component architecture in drug discovery informatics.
Smith, Peter M
2002-05-01
This paper reviews the characteristics of a new model of computing that has been spurred on by the Internet, known as Netcentric computing. Developments in this model led to distributed component architectures, which, although not new ideas, are now realizable with modern tools such as Enterprise Java. The application of this approach to scientific computing, particularly in pharmaceutical discovery research, is discussed and highlighted by a particular case involving the management of biological assay data.
Fast Acceleration of 2D Wave Propagation Simulations Using Modern Computational Accelerators
Wang, Wei; Xu, Lifan; Cavazos, John; Huang, Howie H.; Kay, Matthew
2014-01-01
Recent developments in modern computational accelerators like Graphics Processing Units (GPUs) and coprocessors provide great opportunities for making scientific applications run faster than ever before. However, efficient parallelization of scientific code using new programming tools like CUDA requires a high level of expertise that is not available to many scientists. This, plus the fact that parallelized code is usually not portable to different architectures, creates major challenges for exploiting the full capabilities of modern computational accelerators. In this work, we sought to overcome these challenges by studying how to achieve both automated parallelization using OpenACC and enhanced portability using OpenCL. We applied our parallelization schemes using GPUs as well as Intel Many Integrated Core (MIC) coprocessor to reduce the run time of wave propagation simulations. We used a well-established 2D cardiac action potential model as a specific case-study. To the best of our knowledge, we are the first to study auto-parallelization of 2D cardiac wave propagation simulations using OpenACC. Our results identify several approaches that provide substantial speedups. The OpenACC-generated GPU code achieved more than speedup above the sequential implementation and required the addition of only a few OpenACC pragmas to the code. An OpenCL implementation provided speedups on GPUs of at least faster than the sequential implementation and faster than a parallelized OpenMP implementation. An implementation of OpenMP on Intel MIC coprocessor provided speedups of with only a few code changes to the sequential implementation. We highlight that OpenACC provides an automatic, efficient, and portable approach to achieve parallelization of 2D cardiac wave simulations on GPUs. Our approach of using OpenACC, OpenCL, and OpenMP to parallelize this particular model on modern computational accelerators should be applicable to other computational models of wave propagation in multi-dimensional media. PMID:24497950
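For orientation, the sketch below shows the kind of explicit 2D stencil update that such accelerator ports target, written serially in NumPy. It is purely illustrative (a generic scalar wave equation, not the paper's cardiac action potential model or its OpenACC/OpenCL/OpenMP code), and the grid size and coefficients are arbitrary assumptions.

    import numpy as np

    # Serial NumPy sketch of an explicit 2D wave-propagation stencil
    # (illustrative only; not the paper's cardiac model or accelerator code).
    nx = ny = 256
    c, dt, dx = 1.0, 0.1, 1.0
    u_prev = np.zeros((ny, nx))
    u = np.zeros((ny, nx))
    u[ny // 2, nx // 2] = 1.0            # initial point disturbance

    coeff = (c * dt / dx) ** 2
    for _ in range(100):
        lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0)
               + np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)  # 5-point Laplacian
        u_next = 2.0 * u - u_prev + coeff * lap                   # leapfrog time update
        u_prev, u = u, u_next
    print(float(np.abs(u).max()))

The doubly nested grid loop hidden inside these array expressions is exactly the region that OpenACC pragmas, OpenCL kernels, or OpenMP offload directives would parallelize in a compiled implementation.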
1978-10-11
REQUIREMENTS OF COMPUTER USERS Warsaw INFORMATYKA in Polish Vol 12 No 8, 1977 pp 12-14 CHELCHOWSKI, JERZY, Academy of Economics, Wroclaw [Abstract...Western. 11 E. Hardware POLAND SQUARE-LOOP FERRITE CORES IN THE WORKING STORAGE OF MODERN COMPUTERS Warsaw INFORMATYKA in Polish Vol 12 No 5...INDUSTRY PLANT Warsaw INFORMATYKA in Polish Vol 12 No 10, 1977 pp 20-22 BERNATOWICZ, KRYSTYN [Text] Next to mines, steelworks and shipyards, The H
Design Trade-off Between Performance and Fault-Tolerance of Space Onboard Computers
NASA Astrophysics Data System (ADS)
Gorbunov, M. S.; Antonov, A. A.
2017-01-01
It is well known that there is a trade-off between performance and power consumption in onboard computers. Fault tolerance is another important factor affecting performance, chip area and power consumption. Using special SRAM cells and error-correcting codes is often too expensive relative to the performance needed. We discuss the possibility of finding optimal solutions for a modern onboard computer for scientific apparatus, focusing on multi-level cache memory design.
NASA STI Program Coordinating Council Eleventh Meeting: NASA STI Modernization Plan
NASA Technical Reports Server (NTRS)
1993-01-01
The theme of this NASA Scientific and Technical Information Program Coordinating Council Meeting was the modernization of the STI Program. Topics covered included the activities of the Engineering Review Board in the creation of the Infrastructure Upgrade Plan, the progress of the RECON Replacement Project, the use and status of Electronic SCAN (Selected Current Aerospace Notices), the Machine Translation Project, multimedia, electronic document interchange, the NASA Access Mechanism, computer network upgrades, and standards in the architectural effort.
Scalable Automated Model Search
2014-05-20
machines. Categories and Subject Descriptors Big Data [Distributed Computing]: Large scale optimization 1. INTRODUCTION Modern scientific and...from Continuum Analytics [1], and Apache Spark 0.8.1. Additionally, we made use of Hadoop 1.0.4 configured on local disks as our data store for the large...Borkar et al. Hyracks: A flexible and extensible foundation for data-intensive computing. In ICDE, 2011. [16] J. Canny and H. Zhao. Big data
2008-02-01
journal article. Didactic coursework requirements for the PhD degree have been completed at this time as well as successful presentation of the...Libraries", Modern Software Tools in Scientific Computing. Birkhauser Press, pp. 163-202, 1997. [5] Doyley MM, Weaver JB, Van Houten EEW, Kennedy FE...data from MR, x-ray computed tomography (CT) and digital photography have been used to successfully drive the algorithm in two-dimensional (2D) work
Computer Synthesis Approaches of Hyperboloid Gear Drives with Linear Contact
NASA Astrophysics Data System (ADS)
Abadjiev, Valentin; Kawasaki, Haruhisa
2014-09-01
Computer-aided design has advanced, producing different types of software for scientific research in the field of gearing theory as well as providing adequate scientific support for gear drive manufacture. Attached here are computer programs that are based on mathematical models resulting from scientific research. Modern gear transmissions require the construction of new mathematical approaches to their geometric, technological and strength analysis. The process of optimization, synthesis and design is based on adequate iteration procedures to find an optimal solution by varying definite parameters. The study is dedicated to the accepted methodology in the creation of software for the synthesis of a class of high-reduction hyperboloid gears - Spiroid and Helicon ones (Spiroid and Helicon are trademarks registered by the Illinois Tool Works, Chicago, Ill). The basic computer products developed belong to software based on original mathematical models. They rely on two mathematical models for the synthesis: "upon a pitch contact point" and "upon a mesh region". Computer programs are worked out on the basis of the described mathematical models, and the relations between them are shown. The application of the described approaches to the synthesis of the gear drives in question is illustrated.
A Parallel Numerical Micromagnetic Code Using FEniCS
NASA Astrophysics Data System (ADS)
Nagy, L.; Williams, W.; Mitchell, L.
2013-12-01
Many problems in the geosciences depend on understanding the ability of magnetic minerals to provide stable paleomagnetic recordings. Numerical micromagnetic modelling allows us to calculate the domain structures found in naturally occurring magnetic materials. However the computational cost rises exceedingly quickly with respect to the size and complexity of the geometries that we wish to model. This problem is compounded by the fact that the modern processor design no longer focuses on the speed at which calculations are performed, but rather on the number of computational units amongst which we may distribute our calculations. Consequently to better exploit modern computational resources our micromagnetic simulations must "go parallel". We present a parallel and scalable micromagnetics code written using FEniCS. FEniCS is a multinational collaboration involving several institutions (University of Cambridge, University of Chicago, The Simula Research Laboratory, etc.) that aims to provide a set of tools for writing scientific software; in particular software that employs the finite element method. The advantages of this approach are the leveraging of pre-existing projects from the world of scientific computing (PETSc, Trilinos, Metis/Parmetis, etc.) and exposing these so that researchers may pose problems in a manner closer to the mathematical language of their domain. Our code provides a scriptable interface (in Python) that allows users to not only run micromagnetic models in parallel, but also to perform pre/post processing of data.
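To give a flavour of that workflow, the minimal sketch below poses a Poisson problem in the legacy dolfin Python interface (an assumption on the interface version; this is not the authors' micromagnetics code). Launching it as, say, mpirun -n 4 python poisson.py distributes the mesh, the assembly and the PETSc solve across processes without changes to the script.

    # Minimal FEniCS sketch (legacy dolfin interface assumed; not the authors'
    # micromagnetics code): a Poisson problem written close to its weak form.
    from dolfin import *

    mesh = UnitCubeMesh(16, 16, 16)
    V = FunctionSpace(mesh, "CG", 1)

    u = TrialFunction(V)
    v = TestFunction(V)
    f = Constant(1.0)

    a = dot(grad(u), grad(v)) * dx       # bilinear form
    L = f * v * dx                       # linear form
    bc = DirichletBC(V, Constant(0.0), "on_boundary")

    uh = Function(V)
    solve(a == L, uh, bc)                # assembled and solved in parallel under MPI
    File("solution.pvd") << uh           # parallel-aware output for visualization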
Modeling and Simulation of High Dimensional Stochastic Multiscale PDE Systems at the Exascale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kevrekidis, Ioannis
2017-03-22
The thrust of the proposal was to exploit modern data-mining tools in a way that will create a systematic, computer-assisted approach to the representation of random media -- and also to the representation of the solutions of an array of important physicochemical processes that take place in/on such media. A parsimonious representation/parametrization of the random media links directly (via uncertainty quantification tools) to good sampling of the distribution of random media realizations. It also links directly to modern multiscale computational algorithms (like the equation-free approach that has been developed in our group) and plays a crucial role in accelerating the scientific computation of solutions of nonlinear PDE models (deterministic or stochastic) in such media – both solutions in particular realizations of the random media, and estimation of the statistics of the solutions over multiple realizations (e.g. expectations).
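One standard way to build such a parsimonious parametrization is a truncated Karhunen-Loève expansion of the random field. The sketch below is a generic NumPy illustration under assumed choices (1D domain, exponential covariance, correlation length 0.1), not necessarily the representation used in the project.

    import numpy as np

    # Minimal sketch: a truncated Karhunen-Loeve expansion gives a
    # low-dimensional parametrization of a 1D Gaussian random field.
    n = 200                                  # grid points on [0, 1]
    x = np.linspace(0.0, 1.0, n)
    ell = 0.1                                # assumed correlation length
    C = np.exp(-np.abs(x[:, None] - x[None, :]) / ell)   # exponential covariance

    vals, vecs = np.linalg.eigh(C)           # discrete KL modes
    idx = np.argsort(vals)[::-1]             # sort modes by decreasing variance
    vals, vecs = vals[idx], vecs[:, idx]

    k = 10                                   # keep the k leading modes
    xi = np.random.standard_normal(k)        # the low-dimensional parameters
    field = vecs[:, :k] @ (np.sqrt(vals[:k]) * xi)   # one random-medium realization
    print(field.shape)                       # (200,)

Sampling the k coefficients xi then corresponds to sampling realizations of the medium, which is the link to uncertainty quantification described above.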
NASA Astrophysics Data System (ADS)
2014-10-01
The active involvement of young researchers in scientific processes and the acquisition of scientific experience by gifted youth are currently of great value for the development of science. One of the research activities of National Research Tomsk Polytechnic University aimed at preparing and forming the next generation of scientists is the International Conference of Students and Young Scientists ''Modern Techniques and Technologies'', which in 2014 was held for the twentieth time. Great experience in organizing scientific events has been acquired over the years of running the conference. All the necessary resources are in place: a team of organizers - employees of Tomsk Polytechnic University - premises provided with modern office and demonstration equipment, and leading scientists - professors of TPU - as well as the status of the university as a leading research university in Russia. In this way the conference is able to attract leading world scientists for collaboration. Over the previous years the conference has proved itself a major scientific event at the international level, attracting more than 600 students and young scientists from Russia, the CIS and other countries. The conference provides oral plenary and section reports. It is organized around lectures, in which leading Russian and foreign scientists deliver plenary presentations to young audiences. An important indicator of this scientific event is the breadth of its coverage of scientific fields: energy, heat and power, instrument making, engineering, systems and devices for medical purposes, electromechanics, materials science, computer science and control in technical systems, nanotechnologies and nanomaterials, physical methods in science and technology, control and quality management, and design and technology of artistic materials processing. The main issues considered by young researchers at the conference were related to the analysis of contemporary problems using new techniques and the application of new technologies.
Unstructured Adaptive (UA) NAS Parallel Benchmark. Version 1.0
NASA Technical Reports Server (NTRS)
Feng, Huiyu; VanderWijngaart, Rob; Biswas, Rupak; Mavriplis, Catherine
2004-01-01
We present a complete specification of a new benchmark for measuring the performance of modern computer systems when solving scientific problems featuring irregular, dynamic memory accesses. It complements the existing NAS Parallel Benchmark suite. The benchmark involves the solution of a stylized heat transfer problem in a cubic domain, discretized on an adaptively refined, unstructured mesh.
Cellular automaton supercomputing
NASA Technical Reports Server (NTRS)
Wolfram, Stephen
1987-01-01
Many of the models now used in science and engineering are over a century old. And most of them can be implemented on modern digital computers only with considerable difficulty. Some new basic models are discussed which are much more directly suitable for digital computer simulation. The fundamental principle is that the models considered herein are as suitable as possible for implementation on digital computers. It is then a matter of scientific analysis to determine whether such models can reproduce the behavior seen in physical and other systems. Such analysis was carried out in several cases, and the results are very encouraging.
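As a concrete, purely illustrative example of a model that is directly suited to digital computation, the sketch below runs an elementary one-dimensional cellular automaton of the kind Wolfram studies. The rule number, grid size and step count are arbitrary choices, and the code is not taken from the paper.

    import numpy as np

    # Elementary 1D cellular automaton (illustrative sketch, not from the paper).
    def step(cells, rule=30):
        """Apply one synchronous update of an elementary cellular automaton."""
        left = np.roll(cells, 1)
        right = np.roll(cells, -1)
        neighborhood = (left << 2) | (cells << 1) | right      # 3-bit pattern, 0..7
        return np.right_shift(rule, neighborhood) & 1          # look up bit of the rule table

    cells = np.zeros(81, dtype=int)
    cells[40] = 1                        # single seed cell in the middle
    for _ in range(20):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells)

The update is a pure table lookup on local neighborhoods, which is exactly why such models map so directly onto digital computers.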
Evolution of the SOFIA tracking control system
NASA Astrophysics Data System (ADS)
Fiebig, Norbert; Jakob, Holger; Pfüller, Enrico; Röser, Hans-Peter; Wiedemann, Manuel; Wolf, Jürgen
2014-07-01
The airborne observatory SOFIA (Stratospheric Observatory for Infrared Astronomy) is undergoing a modernization of its tracking system. This included new, highly sensitive tracking cameras, control computers, filter wheels and other equipment, as well as a major redesign of the control software. The experiences along the migration path from an aged 19" VMbus based control system to the application of modern industrial PCs, from VxWorks real-time operating system to embedded Linux and a state of the art software architecture are presented. Further, the concept is presented to operate the new camera also as a scientific instrument, in parallel to tracking.
Building a Data Science capability for USGS water research and communication
NASA Astrophysics Data System (ADS)
Appling, A.; Read, E. K.
2015-12-01
Interpreting and communicating water issues in an era of exponentially increasing information requires a blend of domain expertise, computational proficiency, and communication skills. The USGS Office of Water Information has established a Data Science team to meet these needs, providing challenging careers for diverse domain scientists and innovators in the fields of information technology and data visualization. Here, we detail the experience of building a Data Science capability as a bridging element between traditional water resources analyses and modern computing tools and data management techniques. This approach includes four major components: 1) building reusable research tools, 2) documenting data-intensive research approaches in peer reviewed journals, 3) communicating complex water resources issues with interactive web visualizations, and 4) offering training programs for our peers in scientific computing. These components collectively improve the efficiency, transparency, and reproducibility of USGS data analyses and scientific workflows.
Bringing Legacy Visualization Software to Modern Computing Devices via Application Streaming
NASA Astrophysics Data System (ADS)
Fisher, Ward
2014-05-01
Planning software compatibility across forthcoming generations of computing platforms is a problem commonly encountered in software engineering and development. While this problem can affect any class of software, data analysis and visualization programs are particularly vulnerable. This is due in part to their inherent dependency on specialized hardware and computing environments. A number of strategies and tools have been designed to aid software engineers with this task. While generally embraced by developers at 'traditional' software companies, these methodologies are often dismissed by the scientific software community as unwieldy, inefficient and unnecessary. As a result, many important and storied scientific software packages can struggle to adapt to a new computing environment; for example, one in which much work is carried out on sub-laptop devices (such as tablets and smartphones). Rewriting these packages for a new platform often requires significant investment in terms of development time and developer expertise. In many cases, porting older software to modern devices is neither practical nor possible. As a result, replacement software must be developed from scratch, wasting resources better spent on other projects. Enabled largely by the rapid rise and adoption of cloud computing platforms, 'Application Streaming' technologies allow legacy visualization and analysis software to be operated wholly from a client device (be it laptop, tablet or smartphone) while retaining full functionality and interactivity. It mitigates much of the developer effort required by other more traditional methods while simultaneously reducing the time it takes to bring the software to a new platform. This work will provide an overview of Application Streaming and how it compares against other technologies which allow scientific visualization software to be executed from a remote computer. We will discuss the functionality and limitations of existing application streaming frameworks and how a developer might prepare their software for application streaming. We will also examine the secondary benefits realized by moving legacy software to the cloud. Finally, we will examine the process by which a legacy Java application, the Integrated Data Viewer (IDV), is to be adapted for tablet computing via Application Streaming.
Defining Computational Thinking for Mathematics and Science Classrooms
NASA Astrophysics Data System (ADS)
Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri
2016-02-01
Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new urgency has come to the challenge of defining computational thinking and providing a theoretical grounding for what form it should take in school science and mathematics classrooms. This paper presents a response to this challenge by proposing a definition of computational thinking for mathematics and science in the form of a taxonomy consisting of four main categories: data practices, modeling and simulation practices, computational problem solving practices, and systems thinking practices. In formulating this taxonomy, we draw on the existing computational thinking literature, interviews with mathematicians and scientists, and exemplary computational thinking instructional materials. This work was undertaken as part of a larger effort to infuse computational thinking into high school science and mathematics curricular materials. In this paper, we argue for the approach of embedding computational thinking in mathematics and science contexts, present the taxonomy, and discuss how we envision the taxonomy being used to bring current educational efforts in line with the increasingly computational nature of modern science and mathematics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arkin, Adam; Bader, David C.; Coffey, Richard
Understanding the fundamentals of genomic systems or the processes governing impactful weather patterns are examples of the types of simulation and modeling performed on the most advanced computing resources in America. High-performance computing and computational science together provide a necessary platform for the mission science conducted by the Biological and Environmental Research (BER) office at the U.S. Department of Energy (DOE). This report reviews BER’s computing needs and their importance for solving some of the toughest problems in BER’s portfolio. BER’s impact on science has been transformative. Mapping the human genome, including the U.S.-supported international Human Genome Project that DOE began in 1987, initiated the era of modern biotechnology and genomics-based systems biology. And since the 1950s, BER has been a core contributor to atmospheric, environmental, and climate science research, beginning with atmospheric circulation studies that were the forerunners of modern Earth system models (ESMs) and by pioneering the implementation of climate codes onto high-performance computers. See http://exascaleage.org/ber/ for more information.
Science in the cloud (SIC): A use case in MRI connectomics
Gorgolewski, Krzysztof J.; Kleissas, Dean; Roncal, William Gray; Litt, Brian; Wandell, Brian; Poldrack, Russel A.; Wiener, Martin; Vogelstein, R. Jacob; Burns, Randal
2017-01-01
Modern technologies are enabling scientists to collect extraordinary amounts of complex and sophisticated data across a huge range of scales like never before. With this onslaught of data, we can allow the focal point to shift from data collection to data analysis. Unfortunately, lack of standardized sharing mechanisms and practices often make reproducing or extending scientific results very difficult. With the creation of data organization structures and tools that drastically improve code portability, we now have the opportunity to design such a framework for communicating extensible scientific discoveries. Our proposed solution leverages these existing technologies and standards, and provides an accessible and extensible model for reproducible research, called ‘science in the cloud’ (SIC). Exploiting scientific containers, cloud computing, and cloud data services, we show the capability to compute in the cloud and run a web service that enables intimate interaction with the tools and data presented. We hope this model will inspire the community to produce reproducible and, importantly, extensible results that will enable us to collectively accelerate the rate at which scientific breakthroughs are discovered, replicated, and extended. PMID:28327935
Science in the cloud (SIC): A use case in MRI connectomics.
Kiar, Gregory; Gorgolewski, Krzysztof J; Kleissas, Dean; Roncal, William Gray; Litt, Brian; Wandell, Brian; Poldrack, Russel A; Wiener, Martin; Vogelstein, R Jacob; Burns, Randal; Vogelstein, Joshua T
2017-05-01
Modern technologies are enabling scientists to collect extraordinary amounts of complex and sophisticated data across a huge range of scales like never before. With this onslaught of data, we can allow the focal point to shift from data collection to data analysis. Unfortunately, lack of standardized sharing mechanisms and practices often make reproducing or extending scientific results very difficult. With the creation of data organization structures and tools that drastically improve code portability, we now have the opportunity to design such a framework for communicating extensible scientific discoveries. Our proposed solution leverages these existing technologies and standards, and provides an accessible and extensible model for reproducible research, called 'science in the cloud' (SIC). Exploiting scientific containers, cloud computing, and cloud data services, we show the capability to compute in the cloud and run a web service that enables intimate interaction with the tools and data presented. We hope this model will inspire the community to produce reproducible and, importantly, extensible results that will enable us to collectively accelerate the rate at which scientific breakthroughs are discovered, replicated, and extended. © The Author 2017. Published by Oxford University Press.
Chen, Xiaodong; Ren, Liqiang; Zheng, Bin; Liu, Hong
2013-01-01
Conventional optical microscopes have been used widely in scientific research and in clinical practice. Modern digital microscopic devices combine the power of optical imaging with computerized analysis, archiving and communication techniques. They have great potential in pathological examinations for improving the efficiency and accuracy of clinical diagnosis. This chapter reviews the basic optical principles of conventional microscopes, fluorescence microscopes and electron microscopes. The recent developments and future clinical applications of advanced digital microscopic imaging methods and computer-assisted diagnosis schemes are also discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krasheninnikov, Sergei I.; Angus, Justin; Lee, Wonjae
The goal of the Edge Simulation Laboratory (ESL) multi-institutional project is to advance scientific understanding of the edge plasma region of magnetic fusion devices via a coordinated effort utilizing modern computing resources, advanced algorithms, and ongoing theoretical development. The UCSD team was involved in the development of the COGENT code for kinetic studies across a magnetic separatrix. This work included a kinetic treatment of electrons and multiple ion species (impurities) and accurate collision operators.
The seven sins in academic behavior in the natural sciences.
van Gunsteren, Wilfred F
2013-01-02
"Seven deadly sins" in modern academic research and publishing can be condensed into a list ranging from poorly described experimental or computational setups to falsification of data. This Essay describes these sins and their ramifications, and serves as a code of best practice for researchers in their quest for scientific truth. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Visualization and Interaction in Research, Teaching, and Scientific Communication
NASA Astrophysics Data System (ADS)
Ammon, C. J.
2017-12-01
Modern computing provides many tools for exploring observations, numerical calculations, and theoretical relationships. The number of options is, in fact, almost overwhelming. But the choices give those with modest programming skills opportunities to create unique views of scientific information and to develop deeper insights into their data, their computations, and the underlying theoretical data-model relationships. I present simple examples of using animation and human-computer interaction to explore scientific data and scientific-analysis approaches. I illustrate how a little programming ability can free scientists from the constraints of existing tools and can foster a deeper appreciation of data and models. I present examples from a suite of programming languages ranging from C to JavaScript, including the Wolfram Language. JavaScript is valuable for sharing tools and insight (hopefully) with others because it is integrated into one of the most powerful communication tools in human history, the web browser. Although too much of that power is often spent on distracting advertisements, the underlying computation and graphics engines are efficient, flexible, and almost universally available on desktop and mobile computing platforms. Many are working to fulfill the browser's potential to become the most effective tool for interactive study. Open-source frameworks for visualizing everything from algorithms to data are available, but advance rapidly. One strategy for dealing with swiftly changing tools is to adopt common, open data formats that are easily adapted (often by framework or tool developers). I illustrate the use of animation and interaction in research and teaching with examples from earthquake seismology.
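As a rough illustration of the kind of lightweight animation described here, the sketch below animates a traveling waveform with Python and matplotlib; this is an assumed stand-in (the talk's own examples use C, JavaScript and the Wolfram Language), and the waveform is synthetic.

    import numpy as np
    import matplotlib.pyplot as plt
    from matplotlib.animation import FuncAnimation

    # Animate a traveling wave as a stand-in for, e.g., seismic wavefield snapshots.
    x = np.linspace(0.0, 10.0, 500)
    fig, ax = plt.subplots()
    line, = ax.plot(x, np.sin(x))
    ax.set_xlabel("distance")
    ax.set_ylabel("amplitude")

    def update(frame):
        line.set_ydata(np.sin(x - 0.1 * frame))   # shift the wave each frame
        return line,

    anim = FuncAnimation(fig, update, frames=200, interval=30, blit=True)
    plt.show()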
Database Design and Management in Engineering Optimization.
1988-02-01
scientific and engineering applications. The paper highlights the difference... method in the mid-1950s along with modern digital computers, have made...is continuously redefined in an application program, DDL must have...software can call standard subroutines from the DBMS library to define...operations... type data usually encountered in engineering applications. GFDGT: Computes the number of digits needed to display... A user
Physics of Spin-Polarized Media
2011-03-06
below, and we will provide citations where more details can be found from papers we have published. Most of the work supported by this AFOSR grant has...important for imaging of space objects, and much of the early work on this important technology was done at the Starfire Optical Range at Kirtland Air... space, together with modern scientific computing software, makes it practical to analyze the full, multilevel system of optically pumped atoms. Sections
Scientific Tourism Centres in Armenia
NASA Astrophysics Data System (ADS)
Mickaelian, A. M.; Farmanyan, S. V.; Mikayelyan, G. A.; Mikayelyan, A. A.
2016-12-01
Armenia is rich in scientific sites, among which archaeological sites of scientific interest, modern scientific institutions and science-related museums can be mentioned. Examples of archaeological sites are ancient observatories and petroglyphs of astronomical nature, as well as intangible heritage such as Armenian calendars. Modern institutions with tools or laboratories that can be presented to visitors are considered scientific tourism sites. Science-related museums include the Museum of Science and Technology, the Space Museum, the Geological Museum and other museums. Despite the fact that scientific tourism is a new field, it has great prospects, and Armenia has great potential in this area. It is very important to introduce Armenia from this angle, including scientific archaeological sites as well as modern institutions and museums. This article presents the major scientific tourism centres of Armenia.
NASA Astrophysics Data System (ADS)
Alameda, J. C.
2011-12-01
Development and optimization of computational science models, particularly on high performance computers - and, with the advent of ubiquitous multicore processor systems, on practically every system - has been accomplished with basic software tools: typically, command-line based compilers, debuggers, and performance tools that have not changed substantially from the days of serial and early vector computers. However, model complexity, including the complexity added by modern message passing libraries such as MPI, and the need for hybrid code models (such as OpenMP plus MPI) to take full advantage of high performance computers with an increasing core count per shared memory node, has made development and optimization of such codes an increasingly arduous task. Additional architectural developments, such as many-core processors, only complicate the situation further. In this paper, we describe how our NSF-funded project, "SI2-SSI: A Productive and Accessible Development Workbench for HPC Applications Using the Eclipse Parallel Tools Platform" (WHPC), seeks to improve the Eclipse Parallel Tools Platform, an environment designed to support scientific code development targeted at a diverse set of high performance computing systems. Our WHPC project takes an application-centric view to improving PTP. We are using a set of scientific applications, each with a variety of challenges, and using PTP both to drive further improvements to the scientific applications and to understand shortcomings in Eclipse PTP from an application developer perspective, which in turn drives the list of improvements we seek to make. We are also partnering with performance tool providers to drive higher quality performance tool integration. We have partnered with the Cactus group at Louisiana State University to improve Eclipse's ability to work with computational frameworks and extremely complex build systems, as well as to develop educational materials to incorporate into computational science and engineering codes. Finally, we are partnering with the lead PTP developers at IBM to ensure we are as effective as possible within the Eclipse development community. We are also conducting training and outreach to our user community, including conference BOF sessions, monthly user calls, and an annual user meeting, so that we can best inform the improvements we make to Eclipse PTP. With these activities we endeavor to encourage the use of modern software engineering practices, as enabled through the Eclipse IDE, with computational science and engineering applications. These practices include proper use of source code repositories, tracking and rectifying issues, measuring and monitoring code performance changes against both optimizations and ever-changing software stacks and configurations on HPC systems, and ultimately encouraging development and maintenance of testing suites -- things that have become commonplace in many software endeavors, but have lagged in the development of science applications. We believe that the increased complexity of both HPC systems and science applications demands the use of better software engineering methods, preferably enabled by modern tools such as Eclipse PTP, to help the computational science community thrive as the HPC landscape evolves.
Ship Structure Committee Long-Range Research Plan - Guidelines for Program Development.
1982-01-01
many scientists and engineers who contributed their time and expertise. We are indebted especially to Mr. J. J. Hopkinson, Dr. J. G. Giannotti, Mr...projection is better than assuming extension of the status quo. There is a long lead time in the use of new knowledge. Scientific research maturation...They even promise to remove traditional constraints previously too intractable to be labelled problems. The modern computer is an outstanding example
High Performance Object-Oriented Scientific Programming in Fortran 90
NASA Technical Reports Server (NTRS)
Norton, Charles D.; Decyk, Viktor K.; Szymanski, Boleslaw K.
1997-01-01
We illustrate how Fortran 90 supports object-oriented concepts by example of plasma particle computations on the IBM SP. Our experience shows that Fortran 90 and object-oriented methodology give high performance while providing a bridge from Fortran 77 legacy codes to modern programming principles. All of our object-oriented Fortran 90 codes execute more quickly than the equivalent C++ versions, yet the abstraction modelling capabilities used for scientific programming are comparably powerful.
Scientific Computing Strategic Plan for the Idaho National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whiting, Eric Todd
Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory’s (INL’s) challenge and charge, and is central to INL’s ongoing success. Computing is an essential part of INL’s future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.
A need for a code of ethics in science communication?
NASA Astrophysics Data System (ADS)
Benestad, R. E.
2009-09-01
Modern western civilization and its high standard of living are to a large extent the 'fruits' of scientific endeavor over generations. Some examples include longer life expectancy due to progress in the medical sciences, and changes in infrastructure associated with the utilization of electromagnetism. Modern meteorology is not possible without state-of-the-art digital computers, satellites, remote sensing, and communications. Science is also of relevance for policy making, e.g. the present hot topic of climate change. Climate scientists have recently become much exposed to media focus and mass communications, a task for which many are not trained. Furthermore, science, communication, and politics have different objectives, and do not necessarily mix. Scientists have an obligation to provide unbiased information, and a code of ethics is needed to give guidance on acceptable and unacceptable conduct. Some examples of questionable conduct in Norway include using the title 'Ph.D.' to imply scientific authority when the person had never obtained such an academic degree, or writing biased and one-sided articles in a Norwegian encyclopedia that do not reflect the scientific consensus. It is proposed here that a set of guidelines (for scientists and journalists) and a code of conduct could provide recommendations on how to act in the media - similar to a code of conduct with respect to carrying out research - to which everyone could agree, even when disagreeing on specific scientific questions.
Visions of the Future - the Changing Role of Actors in Data-Intensive Science
NASA Astrophysics Data System (ADS)
Schäfer, L.; Klump, J. F.
2013-12-01
Around the world, scientific disciplines are increasingly facing the challenge of a burgeoning volume of research data. This data avalanche consists of a stream of information generated from sensors and scientific instruments, digital recordings, social-science surveys, or drawn from the World Wide Web. All areas of the scientific economy are affected by this rapid growth in data, from the logging of digs in Archaeology and telescope observations of distant galaxies in Astrophysics to data from polls and surveys in the Social Sciences. The challenge for science is not only to process the data through analysis, reduction and visualization, but also to set up infrastructures for provisioning and storing the data. The rise of new technologies and developments also poses new challenges for the actors in the area of research data infrastructures. Libraries, as one of the actors, enable access to digital media and support the publication of research data and its long-term archiving. Digital media and research data, however, introduce new aspects into the libraries' range of activities. How are we to imagine the library of the future? The library as an interface to the computer centers? Will library and computer center fuse into a new service unit? What role will scientific publishers play in future? Currently the traditional forms of publication - articles for conferences and journals - still carry greater weight. But will this still be the case in future? New forms of publication are already making their presence felt. The tasks of the computer centers may also change. Yesterday their remit was the provisioning of fast hardware; now everything revolves around data and services. Finally, how about the researchers themselves? Not such a long time ago, Geoscience was not necessarily seen as linked to Computer Science. Nowadays, modern Geoscience relies heavily on IT and its techniques. So to what extent will the profile of the modern geoscientist change? This gives rise to the question of what tools are required to locate and pursue the correct course in a networked world. One tool from the area of innovation management is the scenario technique. This poster will outline visions of the future as possible developments of the scientific world in 2020 (or later). The scenarios presented will show possible developments - both positive and negative. It is then up to the actors themselves to define their own position in this context, to rethink it and consider steps that can achieve a positive development for the future.
Modernization of the NASA scientific and technical information program
NASA Technical Reports Server (NTRS)
Cotter, Gladys A.; Hunter, Judy F.; Ostergaard, K.
1993-01-01
The NASA Scientific and Technical Information Program utilizes a technology infrastructure assembled in the mid 1960s to late 1970s to process and disseminate its information products. When this infrastructure was developed it placed NASA as a leader in processing STI. The retrieval engine for the STI database was the first of its kind and was used as the basis for developing commercial, other U.S., and foreign government agency retrieval systems. Due to the combination of changes in user requirements and the tremendous increase in technological capabilities readily available in the marketplace, this infrastructure is no longer the most cost-effective or efficient methodology available. Consequently, the NASA STI Program is pursuing a modernization effort that applies new technology to current processes to provide near-term benefits to the user. In conjunction with this activity, we are developing a long-term modernization strategy designed to transition the Program to a multimedia, global 'library without walls.' Critical pieces of the long-term strategy include streamlining access to sources of STI by using advances in computer networking and graphical user interfaces; creating and disseminating technical information in various electronic media including optical disks, video, and full text; and establishing a Technology Focus Group to maintain a current awareness of emerging technology and to plan for the future.
Data Mining and Machine Learning in Time-Domain Discovery and Classification
NASA Astrophysics Data System (ADS)
Bloom, Joshua S.; Richards, Joseph W.
2012-03-01
The changing heavens have played a central role in the scientific effort of astronomers for centuries. Galileo's synoptic observations of the moons of Jupiter and the phases of Venus, starting in 1610, provided strong refutation of Ptolemaic cosmology. These observations came soon after the discovery of Kepler's supernova had challenged the notion of an unchanging firmament. In more modern times, the discovery of a relationship between period and luminosity in some pulsational variable stars [41] led to the inference of the size of the Milky Way, the distance scale to the nearest galaxies, and the expansion of the Universe (see Ref. [30] for review). Distant explosions of supernovae were used to uncover the existence of dark energy and provide a precise numerical account of dark matter (e.g., [3]). Repeat observations of pulsars [71] and nearby main-sequence stars revealed the presence of the first extrasolar planets [17,35,44,45]. Indeed, time-domain observations of transient events and variable stars, as a technique, influence a broad diversity of pursuits in the entire astronomy endeavor [68]. While, at a fundamental level, the nature of the scientific pursuit remains unchanged, the advent of astronomy as a data-driven discipline presents fundamental challenges to the way in which the scientific process must now be conducted. Digital images (and data cubes) are not only getting larger; there are also more of them. On logistical grounds, this taxes storage and transport systems. But it also implies that the intimate connection that astronomers have always enjoyed with their data - from collection to processing to analysis to inference - necessarily must evolve. Figure 6.1 highlights some of the ways that the pathway to scientific inference is now influenced (if not driven) by modern automation processes, computing, data mining, and machine learning (ML). The emerging reliance on computation and ML is a general one - a central theme of this book - but the time-domain aspect of the data and the objects of interest present some unique challenges. First, any collection, storage, transport, and computational framework for processing the streaming data must be able to keep up with the dataflow. This is not necessarily true, for instance, with static sky science, where metrics of interest can be computed off-line and on a timescale much longer than the time required to obtain the data. Second, many types of transient (one-off) events evolve quickly in time and require more observations to fully understand the nature of the events. This demands that time-changing events be quickly discovered, classified, and broadcast to other follow-up facilities. All of this must happen robustly with, in some cases, very limited data. Last, the process of discovery and classification must be calibrated to the available resources for computation and follow-up. That is, the precision of classification must be weighed against the computational cost of producing that level of precision. Likewise, the cost of being wrong about the classification of some sorts of sources must be balanced against the scientific gains of being right about the classification of other types of sources. Quantifying these trade-offs, especially in the presence of a limited amount of follow-up resources (such as the availability of larger telescope observations), is not straightforward and involves domain-specific imperatives that will, in general, differ from astronomer to astronomer.
This chapter presents an overview of the current directions in ML and data-mining techniques in the context of time-domain astronomy. Ultimately the goal - if not just the necessity, given the data rates and the diversity of questions to be answered - is to abstract the traditional role of the astronomer in the entire scientific process. In some sense, this takes us full circle back to the pre-modern view of the scientific pursuit presented in Vermeer's "The Astronomer" (Figure 6.2): in broad daylight, he contemplates the nighttime heavens from depictions presented to him on a globe, based on observations that others have made. He is an abstract thinker, far removed from data collection and processing; his most visceral connection to the skies is just the feel of the orb under his fingers. Substitute the globe for a plot on a screen generated from a structured query language (SQL) query to a massive public database in the cloud, and we have a picture of the modern astronomer benefitting from ML and data-mining tools operating on an almost unfathomable amount of raw data.
Trends in Programming Languages for Neuroscience Simulations
Davison, Andrew P.; Hines, Michael L.; Muller, Eilif
2009-01-01
Neuroscience simulators allow scientists to express models in terms of biological concepts, without having to concern themselves with low-level computational details of their implementation. The expressiveness, power and ease-of-use of the simulator interface is critical in efficiently and accurately translating ideas into a working simulation. We review long-term trends in the development of programmable simulator interfaces, and examine the benefits of moving from proprietary, domain-specific languages to modern dynamic general-purpose languages, in particular Python, which provide neuroscientists with an interactive and expressive simulation development environment and easy access to state-of-the-art general-purpose tools for scientific computing. PMID:20198154
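The abstract above argues for Python as the common interface to neuroscience simulators. As a purely illustrative sketch (a textbook single-compartment example, not taken from the paper), the snippet below shows the kind of interactive model construction the NEURON simulator's Python package allows; it assumes NEURON is installed and the usual standard-run library is loaded.

```python
# Minimal single-compartment Hodgkin-Huxley simulation via NEURON's Python API.
# Illustrative only; requires the NEURON simulator (pip install neuron).
from neuron import h

h.load_file("stdrun.hoc")              # standard run system (finitialize/continuerun)

soma = h.Section(name="soma")          # one cylindrical compartment
soma.L = soma.diam = 20                # length and diameter in microns
soma.insert("hh")                      # Hodgkin-Huxley channels

stim = h.IClamp(soma(0.5))             # current clamp at the middle of the section
stim.delay, stim.dur, stim.amp = 5, 1, 0.5

v = h.Vector().record(soma(0.5)._ref_v)   # record membrane potential
t = h.Vector().record(h._ref_t)           # record time

h.finitialize(-65)                     # initialize to -65 mV
h.continuerun(40)                      # run for 40 ms
print("peak Vm (mV):", v.max())
```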
Computational Physics in a Nutshell
NASA Astrophysics Data System (ADS)
Schillaci, Michael
2001-11-01
Too often students of science are expected to "pick up" what they need to know about the Art of Science. A description of the two-semester Computational Physics course being taught by the author offers a remedy to this situation. The course teaches students the three pillars of modern scientific research: Problem Solving, Programming, and Presentation. Using FORTRAN, LaTeX, MAPLE V, HTML, and JAVA, students learn the fundamentals of algorithm development, how to implement classes and packages written by others, how to produce publication-quality graphics and documents, and how to publish them on the World Wide Web. The course content is outlined and project examples are offered.
Trends in programming languages for neuroscience simulations.
Davison, Andrew P; Hines, Michael L; Muller, Eilif
2009-01-01
Neuroscience simulators allow scientists to express models in terms of biological concepts, without having to concern themselves with low-level computational details of their implementation. The expressiveness, power and ease-of-use of the simulator interface is critical in efficiently and accurately translating ideas into a working simulation. We review long-term trends in the development of programmable simulator interfaces, and examine the benefits of moving from proprietary, domain-specific languages to modern dynamic general-purpose languages, in particular Python, which provide neuroscientists with an interactive and expressive simulation development environment and easy access to state-of-the-art general-purpose tools for scientific computing.
Levitating Trains and Kamikaze Genes: Technological Literacy for the Future
NASA Astrophysics Data System (ADS)
Brennan, Richard P.
1994-08-01
A lively survey of the horizons of modern technology. Provides easy-to-read summaries of the state of the art in space science, biotechnology, computer science, exotic energy sources and materials engineering as well as life-enhancing medical advancements and environmental, transportation and defense/weapons technologies. Each chapter explains how a current or future technology works and provides an understanding of the underlying scientific concepts. Includes an extensive self-test to review your knowledge.
Neuromorphic Computing – From Materials Research to Systems Architecture Roundtable
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schuller, Ivan K.; Stevens, Rick; Pino, Robinson
2015-10-29
Computation in its many forms is the engine that fuels our modern civilization. Modern computation—based on the von Neumann architecture—has allowed, until now, the development of continuous improvements, as predicted by Moore’s law. However, computation using current architectures and materials will inevitably—within the next 10 years—reach a limit because of fundamental scientific reasons. DOE convened a roundtable of experts in neuromorphic computing systems, materials science, and computer science in Washington on October 29-30, 2015 to address the following basic questions: Can brain-like (“neuromorphic”) computing devices based on new material concepts and systems be developed to dramatically outperform conventional CMOS-based technology? If so, what are the basic research challenges for materials science and computing? The overarching answer that emerged was: the development of novel functional materials and devices incorporated into unique architectures will allow a revolutionary technological leap toward the implementation of a fully “neuromorphic” computer. To address this challenge, the following issues were considered: (1) the main differences between neuromorphic and conventional computing as related to signaling models, timing/clock, non-volatile memory, architecture, fault tolerance, integrated memory and compute, noise tolerance, analog vs. digital, and in situ learning; (2) the new neuromorphic architectures needed to produce lower energy consumption, potential novel nanostructured materials, and enhanced computation; (3) the device and materials properties needed to implement functions such as hysteresis, stability, and fault tolerance; and (4) comparisons of different implementations (spin torque, memristors, resistive switching, phase change, and optical schemes) for enhanced breakthroughs in performance, cost, fault tolerance, and/or manufacturability.
Machine learning of molecular electronic properties in chemical compound space
NASA Astrophysics Data System (ADS)
Montavon, Grégoire; Rupp, Matthias; Gobre, Vivekanand; Vazquez-Mayagoitia, Alvaro; Hansen, Katja; Tkatchenko, Alexandre; Müller, Klaus-Robert; Anatole von Lilienfeld, O.
2013-09-01
The combination of modern scientific computing with electronic structure theory can lead to an unprecedented amount of data amenable to intelligent data analysis for the identification of meaningful, novel and predictive structure-property relationships. Such relationships enable high-throughput screening for relevant properties in an exponentially growing pool of virtual compounds that are synthetically accessible. Here, we present a machine learning model, trained on a database of ab initio calculation results for thousands of organic molecules, that simultaneously predicts multiple electronic ground- and excited-state properties. The properties include atomization energy, polarizability, frontier orbital eigenvalues, ionization potential, electron affinity and excitation energies. The machine learning model is based on a deep multi-task artificial neural network, exploiting the underlying correlations between various molecular properties. The input is identical to ab initio methods, i.e. nuclear charges and Cartesian coordinates of all atoms. For small organic molecules, the accuracy of such a ‘quantum machine’ is similar, and sometimes superior, to modern quantum-chemical methods—at negligible computational cost.
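The abstract above describes a multi-task neural network that predicts several molecular properties from a single descriptor input. As a minimal, hedged sketch of that general idea (not the authors' model, data, or descriptor), the snippet below trains one scikit-learn multi-output network on synthetic stand-ins for descriptor vectors and four correlated "properties".

```python
# A minimal multi-task regression sketch: one network, several targets at once.
# Synthetic data only; real applications would use molecular descriptors
# (e.g. Coulomb-matrix-like features) and ab initio reference properties.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 100))                  # stand-in descriptor vectors
W = rng.normal(size=(100, 4))
y = X @ W + 0.1 * rng.normal(size=(2000, 4))      # 4 correlated target "properties"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(200, 200), max_iter=500, random_state=0)
model.fit(X_tr, y_tr)                             # multi-output y trains all tasks jointly
print("test R^2:", model.score(X_te, y_te))
```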
JINR cloud infrastructure evolution
NASA Astrophysics Data System (ADS)
Baranov, A. V.; Balashov, N. A.; Kutovskiy, N. A.; Semenov, R. N.
2016-09-01
To fulfil JINR commitments in various national and international projects related to the use of modern information technologies, such as cloud and grid computing, and to provide a modern tool for JINR users in their scientific research, a cloud infrastructure was deployed at the Laboratory of Information Technologies of the Joint Institute for Nuclear Research. OpenNebula software was chosen as the cloud platform. Initially it was set up in a simple configuration with a single front-end host and a few cloud nodes. Some custom development was done to tune the JINR cloud installation to local needs: a web form in the cloud web interface for resource requests, a menu item with cloud utilization statistics, user authentication via Kerberos, and a custom driver for OpenVZ containers. Because of high demand for the cloud service and over-utilization of its resources, it was re-designed to cover users' increasing needs for capacity, availability and reliability. Recently a new cloud instance has been deployed in a high-availability configuration with a distributed network file system and additional computing power.
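For readers unfamiliar with OpenNebula, the sketch below shows, in broad strokes, how a virtual machine is described by a plain-text template and submitted with the standard `onevm create` command. This is a generic, hedged illustration only: the image and network names are hypothetical placeholders, it is not JINR's configuration, and it assumes the OpenNebula CLI tools are installed and configured on the submitting host.

```python
# Compose a minimal OpenNebula VM template and submit it via the CLI.
# Hypothetical image/network names; requires configured OpenNebula CLI tools.
import subprocess
import tempfile

template = """
NAME   = "analysis-node"
CPU    = 2
VCPU   = 2
MEMORY = 4096
DISK   = [ IMAGE = "sl7-base" ]
NIC    = [ NETWORK = "lab-private" ]
"""

with tempfile.NamedTemporaryFile("w", suffix=".tmpl", delete=False) as f:
    f.write(template)
    path = f.name

# "onevm create <file>" is OpenNebula's standard way to instantiate a VM from a template file.
subprocess.run(["onevm", "create", path], check=True)
```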
NASA Astrophysics Data System (ADS)
Lebedev, A. A.; Ivanova, E. G.; Komleva, V. A.; Klokov, N. M.; Komlev, A. A.
2017-01-01
The considered method of learning the basics of microelectronic amplifier circuits and systems enables students to understand electrical processes more deeply, to grasp the relationship between static and dynamic characteristics and, ultimately, to bring the learning process closer to a cognitive process. The scheme of problem-based learning can be represented by the following sequence of procedures: a contradiction is perceived and revealed; cognitive motivation is provided by creating a problematic situation (the mental state of the student) that stirs the desire to solve the problem and to ask "why?"; a hypothesis is formulated; solutions are sought; an answer is found. Because of the complexity of the architectures involved, modern methods of computer-aided analysis and synthesis are also considered in the work. Examples are given of analog circuits with improved performance designed by students, within the framework of students' scientific research work, using standard software and software developed at the Department of Microelectronics of MEPhI.
Multicore Architecture-aware Scientific Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Srinivasa, Avinash
Modern high-performance systems are becoming increasingly complex and powerful due to advancements in processor and memory architecture. In order to keep up with this increasing complexity, applications have to be augmented with certain capabilities to fully exploit such systems. These may be at the application level, such as static or dynamic adaptations, or at the system level, like having strategies in place to override some of the default operating system policies, the main objective being to improve the computational performance of the application. The current work proposes two such capabilities with respect to multi-threaded scientific applications, in particular a large-scale physics application computing ab initio nuclear structure. The first involves using a middleware tool to invoke dynamic adaptations in the application, so as to be able to adjust to changing computational resource availability at run-time. The second involves a strategy for effective placement of data in main memory, to optimize memory access latencies and bandwidth. These capabilities, when included, were found to have a significant impact on application performance, resulting in average speedups of as much as two to four times.
Baber, Z
2001-03-01
In this paper, the role of scientific knowledge, institutions and colonialism in mutually co-producing each other is analysed. Under the overarching rubric of colonial structures and imperatives, amateur scientists sought to deploy scientific expertise to expand the empire while at the same time seeking to take advantage of the opportunities to develop their careers as 'scientists'. The role of a complex interplay of structure and agency in the development of modern science, not just in India but in Britain too, is analysed. The role of science and technology in the incorporation of South Asia into the modern world system, as well as the consequences of the emergent structures for understanding the trajectory of modern science in post-colonial India, is examined. Overall, colonial rule did not simply diffuse modern science from the core to the periphery. Rather, the colonial encounter led to the development of new forms of scientific knowledge and institutions both in the periphery and the core.
Novel 3D/VR interactive environment for MD simulations, visualization and analysis.
Doblack, Benjamin N; Allis, Tim; Dávila, Lilian P
2014-12-18
The increasing development of computing (hardware and software) in the last decades has impacted scientific research in many fields including materials science, biology, chemistry and physics among many others. A new computational system for the accurate and fast simulation and 3D/VR visualization of nanostructures is presented here, using the open-source molecular dynamics (MD) computer program LAMMPS. This alternative computational method uses modern graphics processors, NVIDIA CUDA technology and specialized scientific codes to overcome processing speed barriers common to traditional computing methods. In conjunction with a virtual reality system used to model materials, this enhancement allows the addition of accelerated MD simulation capability. The motivation is to provide a novel research environment which simultaneously allows visualization, simulation, modeling and analysis. The research goal is to investigate the structure and properties of inorganic nanostructures (e.g., silica glass nanosprings) under different conditions using this innovative computational system. The work presented outlines a description of the 3D/VR Visualization System and basic components, an overview of important considerations such as the physical environment, details on the setup and use of the novel system, a general procedure for the accelerated MD enhancement, technical information, and relevant remarks. The impact of this work is the creation of a unique computational system combining nanoscale materials simulation, visualization and interactivity in a virtual environment, which is both a research and teaching instrument at UC Merced.
Novel 3D/VR Interactive Environment for MD Simulations, Visualization and Analysis
Doblack, Benjamin N.; Allis, Tim; Dávila, Lilian P.
2014-01-01
The increasing development of computing (hardware and software) in the last decades has impacted scientific research in many fields including materials science, biology, chemistry and physics among many others. A new computational system for the accurate and fast simulation and 3D/VR visualization of nanostructures is presented here, using the open-source molecular dynamics (MD) computer program LAMMPS. This alternative computational method uses modern graphics processors, NVIDIA CUDA technology and specialized scientific codes to overcome processing speed barriers common to traditional computing methods. In conjunction with a virtual reality system used to model materials, this enhancement allows the addition of accelerated MD simulation capability. The motivation is to provide a novel research environment which simultaneously allows visualization, simulation, modeling and analysis. The research goal is to investigate the structure and properties of inorganic nanostructures (e.g., silica glass nanosprings) under different conditions using this innovative computational system. The work presented outlines a description of the 3D/VR Visualization System and basic components, an overview of important considerations such as the physical environment, details on the setup and use of the novel system, a general procedure for the accelerated MD enhancement, technical information, and relevant remarks. The impact of this work is the creation of a unique computational system combining nanoscale materials simulation, visualization and interactivity in a virtual environment, which is both a research and teaching instrument at UC Merced. PMID:25549300
Modern Education in China. Bulletin, 1919, No. 44
ERIC Educational Resources Information Center
Edmunds, Charles K.
1919-01-01
The Chinese conception of life's values is so different from that of western peoples that they have failed to develop modern technique and scientific knowledge. Now that they have come to see the value of these, rapid and fundamental changes are taking place. When modern scientific knowledge is added to the skill which the Chinese already have in…
Towards a Unified Architecture for Data-Intensive Seismology in VERCE
NASA Astrophysics Data System (ADS)
Klampanos, I.; Spinuso, A.; Trani, L.; Krause, A.; Garcia, C. R.; Atkinson, M.
2013-12-01
Modern seismology involves managing, storing and processing large datasets, typically geographically distributed across organisations. Performing computational experiments using these data generates more data, which in turn have to be managed, further analysed and frequently made available within or outside the scientific community. As part of the EU-funded project VERCE (http://verce.eu), we research and develop a number of use-cases and interfacing technologies to satisfy the data-intensive requirements of modern seismology. Our solution seeks to support: (1) familiar programming environments to develop and execute experiments, in particular via Python/ObsPy, (2) a unified view of heterogeneous computing resources, public or private, through the adoption of workflows, (3) monitoring the experiments and validating the data products at varying granularities, via a comprehensive provenance system, (4) reproducibility of experiments and consistency in collaboration, via a shared registry of processing units and contextual metadata (computing resources, data, etc.) Here, we provide a brief account of these components and their roles in the proposed architecture. Our design integrates heterogeneous distributed systems, while allowing researchers to retain current practices and control data handling and execution via higher-level abstractions. At the core of our solution lies the workflow language Dispel. While Dispel can be used to express workflows at fine detail, it may also be used as part of meta- or job-submission workflows. User interaction can be provided through a visual editor or through custom applications on top of parameterisable workflows, which is the approach VERCE follows. According to our design, the scientist may use versions of Dispel/workflow processing elements offered by the VERCE library or override them by introducing custom scientific code, using ObsPy. This approach has the advantage that, while the scientist uses a familiar tool, the resulting workflow can be executed on a number of underlying stream-processing engines, such as STORM or OGSA-DAI, transparently. While making efficient use of arbitrarily distributed resources and large datasets is a priority, such processing requires adequate provenance tracking and monitoring. Hiding computation and orchestration details via a workflow system allows us to embed provenance harvesting where appropriate without impeding the user's regular working patterns. Our provenance model is based on the W3C PROV standard and can provide information of varying granularity regarding execution, systems and data consumption/production. A video demonstrating a prototype provenance exploration tool can be found at http://bit.ly/15t0Fz0. Keeping experimental methodology and results open and accessible, as well as encouraging reproducibility and collaboration, is of central importance to modern science. As our users are expected to be based at different geographical locations, to have access to different computing resources and to employ customised scientific codes, the use of a shared registry of workflow components, implementations, data and computing resources is critical.
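The abstract above mentions that scientists can supply their own processing steps via ObsPy. As a minimal, hedged sketch of the kind of step that might be wrapped as a workflow processing element (illustrative only, not VERCE's code), the snippet below loads ObsPy's bundled example stream and applies standard preprocessing.

```python
# A minimal ObsPy processing step: detrend and band-pass filter a waveform stream.
# Uses ObsPy's built-in example data, so no external files are needed.
from obspy import read

st = read()                                       # no argument: load the bundled example stream
st.detrend("demean")                              # remove the mean from each trace
st.filter("bandpass", freqmin=0.5, freqmax=5.0)   # band-pass filter (Hz)

for tr in st:
    print(tr.id, tr.stats.sampling_rate, float(tr.data.max()))
```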
Basic energy sciences: Summary of accomplishments
NASA Astrophysics Data System (ADS)
1990-05-01
For more than four decades, the Department of Energy, including its predecessor agencies, has supported a program of basic research in nuclear- and energy related sciences, known as Basic Energy Sciences. The purpose of the program is to explore fundamental phenomena, create scientific knowledge, and provide unique user facilities necessary for conducting basic research. Its technical interests span the range of scientific disciplines: physical and biological sciences, geological sciences, engineering, mathematics, and computer sciences. Its products and facilities are essential to technology development in many of the more applied areas of the Department's energy, science, and national defense missions. The accomplishments of Basic Energy Sciences research are numerous and significant. Not only have they contributed to Departmental missions, but have aided significantly the development of technologies which now serve modern society daily in business, industry, science, and medicine. In a series of stories, this report highlights 22 accomplishments, selected because of their particularly noteworthy contributions to modern society. A full accounting of all the accomplishments would be voluminous. Detailed documentation of the research results can be found in many thousands of articles published in peer-reviewed technical literature.
Basic Energy Sciences: Summary of Accomplishments
DOE R&D Accomplishments Database
1990-05-01
For more than four decades, the Department of Energy, including its predecessor agencies, has supported a program of basic research in nuclear- and energy-related sciences, known as Basic Energy Sciences. The purpose of the program is to explore fundamental phenomena, create scientific knowledge, and provide unique "user" facilities necessary for conducting basic research. Its technical interests span the range of scientific disciplines: physical and biological sciences, geological sciences, engineering, mathematics, and computer sciences. Its products and facilities are essential to technology development in many of the more applied areas of the Department's energy, science, and national defense missions. The accomplishments of Basic Energy Sciences research are numerous and significant. Not only have they contributed to Departmental missions, but have aided significantly the development of technologies which now serve modern society daily in business, industry, science, and medicine. In a series of stories, this report highlights 22 accomplishments, selected because of their particularly noteworthy contributions to modern society. A full accounting of all the accomplishments would be voluminous. Detailed documentation of the research results can be found in many thousands of articles published in peer-reviewed technical literature.
Education through the prism of computation
NASA Astrophysics Data System (ADS)
Kaurov, Vitaliy
2014-03-01
With the rapid development of technology, computation claims its irrevocable place among the research components of modern science. Thus, to foster a successful future scientist, engineer or educator, we need to add computation to the foundations of scientific education. We will discuss what type of paradigm shifts it brings to these foundations using the example of the Wolfram Science Summer School. It is one of the most advanced computational outreach programs run by the Wolfram Foundation, welcoming participants of almost all ages and backgrounds. Centered on complexity science and physics, it also covers numerous adjacent and interdisciplinary fields such as finance, biology, medicine and even music. We will talk about educational and research experiences in this program during the 12 years of its existence. We will review statistics and outputs the program has produced. Among these are interactive electronic publications at the Wolfram Demonstrations Project and contributions to the computational knowledge engine Wolfram|Alpha.
The application of computer image analysis in life sciences and environmental engineering
NASA Astrophysics Data System (ADS)
Mazur, R.; Lewicki, A.; Przybył, K.; Zaborowicz, M.; Koszela, K.; Boniecki, P.; Mueller, W.; Raba, B.
2014-04-01
The main aim of the article was to present research on the application of computer image analysis in Life Science and Environmental Engineering. The authors used different methods of computer image analysis in the development of an innovative biotest for modern biomonitoring of water quality. The created tools were based on live organisms, the bioindicators Lemna minor L. and Hydra vulgaris Pallas, together with computer image analysis methods for assessing negative reactions during the exposure of the organisms to selected water toxicants. All of these methods belong to acute toxicity tests and are particularly essential in the ecotoxicological assessment of water pollutants. The developed bioassays can be used not only in scientific research but are also applicable in environmental engineering and agriculture in the study of adverse effects on water quality of various compounds used in agriculture and industry.
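To make the image-analysis idea concrete, the sketch below shows one plausible step such a biotest might use: estimating the green frond area of Lemna minor in a photograph by color thresholding. It is a hedged illustration, not the authors' implementation; the file name and HSV thresholds are hypothetical placeholders.

```python
# Estimate frond coverage by thresholding "green" pixels in HSV color space.
# Illustrative only; the image path and threshold values are placeholders.
import cv2

img = cv2.imread("lemna_day3.jpg")                       # BGR image of the test vessel
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv, (35, 40, 40), (85, 255, 255))    # broad "green" hue range

area_px = cv2.countNonZero(mask)                         # frond pixels
coverage = area_px / mask.size
print(f"frond area: {area_px} px ({coverage:.1%} of frame)")
```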
AstroGrid-D: Grid technology for astronomical science
NASA Astrophysics Data System (ADS)
Enke, Harry; Steinmetz, Matthias; Adorf, Hans-Martin; Beck-Ratzka, Alexander; Breitling, Frank; Brüsemeister, Thomas; Carlson, Arthur; Ensslin, Torsten; Högqvist, Mikael; Nickelt, Iliya; Radke, Thomas; Reinefeld, Alexander; Reiser, Angelika; Scholl, Tobias; Spurzem, Rainer; Steinacker, Jürgen; Voges, Wolfgang; Wambsganß, Joachim; White, Steve
2011-02-01
We present the status and results of AstroGrid-D, a joint effort of astrophysicists and computer scientists to employ grid technology for scientific applications. AstroGrid-D provides access to a network of distributed machines via a set of commands as well as software interfaces. It allows simple use of computer and storage facilities and the scheduling and monitoring of compute tasks and data management. It is based on the Globus Toolkit middleware (GT4). Chapter 1 describes the context which led to the demand for advanced software solutions in Astrophysics, and we state the goals of the project. We then present characteristic astrophysical applications that have been implemented on AstroGrid-D in chapter 2. We describe simulations of different complexity, compute-intensive calculations running on multiple sites (Section 2.1), and advanced applications for specific scientific purposes (Section 2.2), such as a connection to robotic telescopes (Section 2.2.3). We show from these examples how grid execution improves, for example, the scientific workflow. Chapter 3 explains the software tools and services that we adapted or newly developed. Section 3.1 focuses on the administrative aspects of the infrastructure, to manage users and monitor activity. Section 3.2 characterises the central components of our architecture: the AstroGrid-D information service to collect and store metadata, a file management system, the data management system, and a job manager for automatic submission of compute tasks. We summarise the successfully established infrastructure in chapter 4, concluding with our future plans to establish AstroGrid-D as a platform of modern e-Astronomy.
Concept of JINR Corporate Information System
NASA Astrophysics Data System (ADS)
Filozova, I. A.; Bashashin, M. V.; Korenkov, V. V.; Kuniaev, S. V.; Musulmanbekov, G.; Semenov, R. N.; Shestakova, G. V.; Strizh, T. A.; Ustenko, P. V.; Zaikina, T. N.
2016-09-01
The article presents the concept of the JINR Corporate Information System (JINR CIS). Special attention is given to the information support of scientific research - a Current Research Information System as a part of the corporate information system. The objectives of such a system are to ensure effective implementation of research by using modern information technology, computer technology and automation, and the creation, development and integration of digital resources within a common conceptual framework. The project assumes continuous system development and the introduction of new information technologies to ensure the technological relevance of the system.
Bonsai: an event-based framework for processing and controlling data streams
Lopes, Gonçalo; Bonacchi, Niccolò; Frazão, João; Neto, Joana P.; Atallah, Bassam V.; Soares, Sofia; Moreira, Luís; Matias, Sara; Itskov, Pavel M.; Correia, Patrícia A.; Medina, Roberto E.; Calcaterra, Lorenza; Dreosti, Elena; Paton, Joseph J.; Kampff, Adam R.
2015-01-01
The design of modern scientific experiments requires the control and monitoring of many different data streams. However, the serial execution of programming instructions in a computer makes it a challenge to develop software that can deal with the asynchronous, parallel nature of scientific data. Here we present Bonsai, a modular, high-performance, open-source visual programming framework for the acquisition and online processing of data streams. We describe Bonsai's core principles and architecture and demonstrate how it allows for the rapid and flexible prototyping of integrated experimental designs in neuroscience. We specifically highlight some applications that require the combination of many different hardware and software components, including video tracking of behavior, electrophysiology and closed-loop control of stimulation. PMID:25904861
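Bonsai itself is a visual programming language built on reactive (Rx-style) streams, so it cannot be quoted directly here. Purely as a conceptual Python analogue (not Bonsai's actual interface), the sketch below chains small operators over a simulated event stream, which is the pattern the abstract describes: sources feeding transformations feeding event detectors.

```python
# Conceptual analogue of stream-operator chaining; not Bonsai code.
import random

def sensor(n=100):                       # source: simulated noisy samples
    for _ in range(n):
        yield random.gauss(0.0, 1.0)

def smooth(stream, alpha=0.2):           # operator: exponential smoothing
    avg = None
    for x in stream:
        avg = x if avg is None else alpha * x + (1 - alpha) * avg
        yield avg

def threshold(stream, level=0.5):        # operator: emit only above-threshold events
    for x in stream:
        if x > level:
            yield x

events = list(threshold(smooth(sensor())))
print(f"{len(events)} above-threshold events detected")
```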
On Stable Marriages and Greedy Matchings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manne, Fredrik; Naim, Md; Lerring, Hakon
2016-12-11
Research on stable marriage problems has a long and mathematically rigorous history, while that of exploiting greedy matchings in combinatorial scientific computing is a younger and less developed research field. In this paper we consider the relationships between these two areas. In particular we show that several problems related to computing greedy matchings can be formulated as stable marriage problems and as a consequence several recently proposed algorithms for computing greedy matchings are in fact special cases of well known algorithms for the stable marriage problem. However, in terms of implementations and practical scalable solutions on modern hardware, the greedy matching community has made considerable progress. We show that due to the strong relationship between these two fields many of these results are also applicable for solving stable marriage problems.
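For readers unfamiliar with the stable marriage problem referenced above, the sketch below implements the classical Gale-Shapley proposal algorithm, the standard baseline the paper relates to greedy matching; it is an illustrative implementation, not the paper's code.

```python
# Gale-Shapley: each free proposer proposes down his preference list;
# a woman keeps her best proposal so far. Terminates with a stable matching.
def gale_shapley(men_prefs, women_prefs):
    free = list(men_prefs)                                 # men not yet engaged
    next_choice = {m: 0 for m in men_prefs}                # next index to propose to
    rank = {w: {m: i for i, m in enumerate(p)} for w, p in women_prefs.items()}
    engaged = {}                                           # woman -> man
    while free:
        m = free.pop()
        w = men_prefs[m][next_choice[m]]
        next_choice[m] += 1
        if w not in engaged:                               # w is free: accept
            engaged[w] = m
        elif rank[w][m] < rank[w][engaged[w]]:             # w prefers the new proposer
            free.append(engaged[w])
            engaged[w] = m
        else:                                              # w rejects m
            free.append(m)
    return {m: w for w, m in engaged.items()}

men = {"a": ["x", "y"], "b": ["y", "x"]}
women = {"x": ["b", "a"], "y": ["a", "b"]}
print(gale_shapley(men, women))    # prints a stable matching: a-x and b-y
```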
Plant, Richard R
2016-03-01
There is an ongoing 'replication crisis' across the field of psychology in which researchers, funders, and members of the public are questioning the results of some scientific studies and the validity of the data they are based upon. However, few have considered that a growing proportion of research in modern psychology is conducted using a computer. Could it simply be that the hardware and software, or experiment generator, used to run the experiment is itself a cause of millisecond timing error and subsequent replication failure? This article serves as a reminder that millisecond timing accuracy in psychology studies remains an important issue and that care needs to be taken to ensure that studies can be replicated on current computer hardware and software.
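A minimal sketch of the kind of self-check the article calls for: measuring how precisely a nominal stimulus duration can actually be reproduced on the current hardware and software stack using a monotonic high-resolution clock. This is generic illustrative code, not the article's benchmarking tool; real studies would time actual stimulus presentation rather than `time.sleep`.

```python
# Measure the overshoot of a nominal presentation duration on this machine.
import time
import statistics

nominal_ms = 16.7        # e.g. one frame at roughly 60 Hz
errors = []
for _ in range(200):
    start = time.perf_counter()
    time.sleep(nominal_ms / 1000.0)            # stands in for "present stimulus"
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    errors.append(elapsed_ms - nominal_ms)

print(f"mean overshoot {statistics.mean(errors):.2f} ms, "
      f"sd {statistics.stdev(errors):.2f} ms")
```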
Postdoctoral Fellow | Center for Cancer Research
The Neuro-Oncology Branch (NOB), Center for Cancer Research (CCR), National Cancer Institute (NCI) of the National Institutes of Health (NIH) is seeking outstanding postdoctoral candidates interested in studying metabolic and cell signaling pathways in the context of brain cancers through construction of computational models amenable to formal computational analysis and simulation. The ability to closely collaborate with the modern metabolomics center developed at CCR provides a unique opportunity for a postdoctoral candidate with a strong theoretical background and interest in demonstrating the incredible potential of computational approaches to solve problems from scientific disciplines and improve lives. The candidate will be given the opportunity to both construct data-driven models, as well as biologically validate the models by demonstrating the ability to predict the effects of altering tumor metabolism in laboratory and clinical settings.
Leveraging e-Science infrastructure for electrochemical research.
Peachey, Tom; Mashkina, Elena; Lee, Chong-Yong; Enticott, Colin; Abramson, David; Bond, Alan M; Elton, Darrell; Gavaghan, David J; Stevenson, Gareth P; Kennedy, Gareth F
2011-08-28
As in many scientific disciplines, modern chemistry involves a mix of experimentation and computer-supported theory. Historically, these skills have been provided by different groups, and range from traditional 'wet' laboratory science to advanced numerical simulation. Increasingly, progress is made by global collaborations, in which new theory may be developed in one part of the world and applied and tested in the laboratory elsewhere. e-Science, or cyber-infrastructure, underpins such collaborations by providing a unified platform for accessing scientific instruments, computers and data archives, and collaboration tools. In this paper we discuss the application of advanced e-Science software tools to electrochemistry research performed in three different laboratories--two at Monash University in Australia and one at the University of Oxford in the UK. We show that software tools that were originally developed for a range of application domains can be applied to electrochemical problems, in particular Fourier voltammetry. Moreover, we show that, by replacing ad-hoc manual processes with e-Science tools, we obtain more accurate solutions automatically.
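The abstract above singles out Fourier voltammetry, where a periodic perturbation is applied and the measured current is decomposed into harmonics. As a minimal, hedged numpy sketch of that underlying Fourier analysis (a synthetic signal standing in for a measured current, not the paper's data or tools):

```python
# Decompose a synthetic periodic "current" into its harmonics with an FFT.
import numpy as np

fs, f0, T = 1000.0, 9.0, 4.0                        # sample rate (Hz), drive freq, duration (s)
t = np.arange(0, T, 1 / fs)
current = (1.0 * np.sin(2 * np.pi * f0 * t)          # fundamental response
           + 0.2 * np.sin(2 * np.pi * 2 * f0 * t)    # 2nd harmonic from nonlinearity
           + 0.05 * np.random.default_rng(0).normal(size=t.size))

spectrum = np.fft.rfft(current)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
for k in (1, 2, 3):
    idx = np.argmin(np.abs(freqs - k * f0))
    amp = 2 * np.abs(spectrum[idx]) / t.size         # amplitude estimate at harmonic k
    print(f"harmonic {k}: amplitude ~ {amp:.3f}")
```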
Molecular structure input on the web.
Ertl, Peter
2010-02-02
A molecule editor, that is, a program for the input and editing of molecules, is an indispensable part of every cheminformatics or molecular processing system. This review focuses on a special type of molecule editor, namely those that are used for molecule structure input on the web. Scientific computing is now moving more and more in the direction of web services and cloud computing, with servers scattered all around the Internet. Thus a web browser has become the universal scientific user interface, and a tool to edit molecules directly within the web browser is essential. The review covers a history of web-based structure input, starting with simple text entry boxes and early molecule editors based on clickable maps, before moving to the current situation dominated by Java applets. One typical example - the popular JME Molecule Editor - will be described in more detail. Modern Ajax server-side molecule editors are also presented. Finally, the possible future direction of web-based molecule editing, based on technologies like JavaScript and Flash, is discussed.
dREL: a relational expression language for dictionary methods.
Spadaccini, Nick; Castleden, Ian R; du Boulay, Doug; Hall, Sydney R
2012-08-27
The provision of precise metadata is an important but largely underrated challenge for modern science [Nature 2009, 461, 145]. We describe here a dictionary methods language, dREL, that has been designed to enable complex data relationships to be expressed as formulaic scripts in data dictionaries written in DDLm [Spadaccini and Hall, J. Chem. Inf. Model. 2012, doi:10.1021/ci300075z]. dREL describes data relationships in a simple but powerful canonical form that is easy to read and understand and can be executed computationally to evaluate or validate data. The execution of dREL expressions is not a substitute for traditional scientific computation; it provides precise data dependency information for domain-specific definitions and a means for cross-validating data. Some scientific fields apply conventional programming languages to methods scripts, but these tend to inhibit both dictionary development and accessibility. dREL removes the programming barrier and encourages the production of the metadata needed for seamless data archiving and exchange in science.
Figures of Disengagement: Charles Taylor, Scientific Parenting, and the Paradox of Late Modernity
ERIC Educational Resources Information Center
Van den Berge, Luc; Ramaekers, Stefan
2014-01-01
In this essay Luc Van den Berge and Stefan Ramaekers take the idea(l) of "scientific parenting" as an example of ambiguities that are typical of our late-modern condition. On the one hand, parenting seems like a natural thing to do, which makes "scientific parenting" sound like an oxymoron; on the other hand, a disengaged…
Enhancements to VTK enabling Scientific Visualization in Immersive Environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Leary, Patrick; Jhaveri, Sankhesh; Chaudhary, Aashish
Modern scientific, engineering and medical computational simulations, as well as experimental and observational data sensing/measuring devices, produce enormous amounts of data. While statistical analysis provides insight into this data, scientific visualization is tactically important for scientific discovery, product design and data analysis. These benefits are impeded, however, when scientific visualization algorithms are implemented from scratch—a time-consuming and redundant process in immersive application development. This process can greatly benefit from leveraging the state-of-the-art open-source Visualization Toolkit (VTK) and its community. Over the past two (almost three) decades, integrating VTK with a virtual reality (VR) environment has only been attempted with varying degrees of success. In this paper, we demonstrate two new approaches to simplify this amalgamation of an immersive interface with visualization rendering from VTK. In addition, we cover several enhancements to VTK that provide near real-time updates and efficient interaction. Finally, we demonstrate the combination of VTK with both Vrui and OpenVR immersive environments in example applications.
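For context, the snippet below is a minimal, standard VTK rendering pipeline in Python (source, mapper, actor, renderer, interactor); it illustrates the kind of visualization pipeline the paper couples to immersive environments and is not the authors' Vrui/OpenVR integration.

```python
# A classic VTK pipeline: geometry source -> mapper -> actor -> renderer -> window.
# Opens an interactive desktop window; requires the vtk Python package and a display.
import vtk

cone = vtk.vtkConeSource()
cone.SetResolution(32)

mapper = vtk.vtkPolyDataMapper()
mapper.SetInputConnection(cone.GetOutputPort())

actor = vtk.vtkActor()
actor.SetMapper(mapper)

renderer = vtk.vtkRenderer()
renderer.AddActor(actor)

window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)

interactor = vtk.vtkRenderWindowInteractor()
interactor.SetRenderWindow(window)

window.Render()
interactor.Start()
```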
MDTraj: A Modern Open Library for the Analysis of Molecular Dynamics Trajectories.
McGibbon, Robert T; Beauchamp, Kyle A; Harrigan, Matthew P; Klein, Christoph; Swails, Jason M; Hernández, Carlos X; Schwantes, Christian R; Wang, Lee-Ping; Lane, Thomas J; Pande, Vijay S
2015-10-20
As molecular dynamics (MD) simulations continue to evolve into powerful computational tools for studying complex biomolecular systems, the necessity of flexible and easy-to-use software tools for the analysis of these simulations is growing. We have developed MDTraj, a modern, lightweight, and fast software package for analyzing MD simulations. MDTraj reads and writes trajectory data in a wide variety of commonly used formats. It provides a large number of trajectory analysis capabilities including minimal root-mean-square-deviation calculations, secondary structure assignment, and the extraction of common order parameters. The package has a strong focus on interoperability with the wider scientific Python ecosystem, bridging the gap between MD data and the rapidly growing collection of industry-standard statistical analysis and visualization tools in Python. MDTraj is a powerful and user-friendly software package that simplifies the analysis of MD data and connects these datasets with the modern interactive data science software ecosystem in Python. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
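As a brief, hedged illustration of the analyses the abstract lists (trajectory loading, RMSD, secondary structure), the snippet below uses MDTraj's public API; the trajectory and topology file names are hypothetical placeholders.

```python
# Minimal MDTraj analysis sketch; "traj.xtc" and "system.pdb" are placeholders.
import mdtraj as md

traj = md.load("traj.xtc", top="system.pdb")    # read trajectory plus topology
print(traj)                                     # summary: frames, atoms, residues

rmsd = md.rmsd(traj, traj, frame=0)             # RMSD of every frame to frame 0 (nm)
dssp = md.compute_dssp(traj)                    # secondary structure per residue per frame
print(rmsd[:5], dssp.shape)
```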
MDTraj: A Modern Open Library for the Analysis of Molecular Dynamics Trajectories
McGibbon, Robert T.; Beauchamp, Kyle A.; Harrigan, Matthew P.; Klein, Christoph; Swails, Jason M.; Hernández, Carlos X.; Schwantes, Christian R.; Wang, Lee-Ping; Lane, Thomas J.; Pande, Vijay S.
2015-01-01
As molecular dynamics (MD) simulations continue to evolve into powerful computational tools for studying complex biomolecular systems, the necessity of flexible and easy-to-use software tools for the analysis of these simulations is growing. We have developed MDTraj, a modern, lightweight, and fast software package for analyzing MD simulations. MDTraj reads and writes trajectory data in a wide variety of commonly used formats. It provides a large number of trajectory analysis capabilities including minimal root-mean-square-deviation calculations, secondary structure assignment, and the extraction of common order parameters. The package has a strong focus on interoperability with the wider scientific Python ecosystem, bridging the gap between MD data and the rapidly growing collection of industry-standard statistical analysis and visualization tools in Python. MDTraj is a powerful and user-friendly software package that simplifies the analysis of MD data and connects these datasets with the modern interactive data science software ecosystem in Python. PMID:26488642
ISCB Ebola Award for Important Future Research on the Computational Biology of Ebola Virus
Karp, Peter D.; Berger, Bonnie; Kovats, Diane; Lengauer, Thomas; Linial, Michal; Sabeti, Pardis; Hide, Winston; Rost, Burkhard
2015-01-01
Speed is of the essence in combating Ebola; thus, computational approaches should form a significant component of Ebola research. As for the development of any modern drug, computational biology is uniquely positioned to contribute through comparative analysis of the genome sequences of Ebola strains as well as 3-D protein modeling. Other computational approaches to Ebola may include large-scale docking studies of Ebola proteins with human proteins and with small-molecule libraries, computational modeling of the spread of the virus, computational mining of the Ebola literature, and creation of a curated Ebola database. Taken together, such computational efforts could significantly accelerate traditional scientific approaches. In recognition of the need for important and immediate solutions from the field of computational biology against Ebola, the International Society for Computational Biology (ISCB) announces a prize for an important computational advance in fighting the Ebola virus. ISCB will confer the ISCB Fight against Ebola Award, along with a prize of US$2,000, at its July 2016 annual meeting (ISCB Intelligent Systems for Molecular Biology (ISMB) 2016, Orlando, Florida). PMID:26097686
ISCB Ebola Award for Important Future Research on the Computational Biology of Ebola Virus.
Karp, Peter D; Berger, Bonnie; Kovats, Diane; Lengauer, Thomas; Linial, Michal; Sabeti, Pardis; Hide, Winston; Rost, Burkhard
2015-01-01
Speed is of the essence in combating Ebola; thus, computational approaches should form a significant component of Ebola research. As for the development of any modern drug, computational biology is uniquely positioned to contribute through comparative analysis of the genome sequences of Ebola strains as well as 3-D protein modeling. Other computational approaches to Ebola may include large-scale docking studies of Ebola proteins with human proteins and with small-molecule libraries, computational modeling of the spread of the virus, computational mining of the Ebola literature, and creation of a curated Ebola database. Taken together, such computational efforts could significantly accelerate traditional scientific approaches. In recognition of the need for important and immediate solutions from the field of computational biology against Ebola, the International Society for Computational Biology (ISCB) announces a prize for an important computational advance in fighting the Ebola virus. ISCB will confer the ISCB Fight against Ebola Award, along with a prize of US$2,000, at its July 2016 annual meeting (ISCB Intelligent Systems for Molecular Biology (ISMB) 2016, Orlando, Florida).
Patient Privacy in the Era of Big Data.
Kayaalp, Mehmet
2018-01-20
Privacy was defined as a fundamental human right in the Universal Declaration of Human Rights at the 1948 United Nations General Assembly. However, there is still no consensus on what constitutes privacy. In this review, we look at the evolution of privacy as a concept from the era of Hippocrates to the era of social media and big data. To appreciate the modern measures of patient privacy protection and correctly interpret the current regulatory framework in the United States, we need to analyze and understand the concepts of individually identifiable information, individually identifiable health information, protected health information, and de-identification. The Privacy Rule of the Health Insurance Portability and Accountability Act defines the regulatory framework and strikes a balance between protective measures and access to health information for secondary (scientific) use. The rule defines the conditions when health information is protected by law and how protected health information can be de-identified for secondary use. With the advent of artificial intelligence and computational linguistics, computational text de-identification algorithms produce de-identified results nearly as well as those produced by human experts, but much faster, more consistently and basically for free. Modern clinical text de-identification systems now pave the road to big data and enable scientists to access de-identified clinical information while firmly protecting patient privacy. However, clinical text de-identification is not a perfect process. In order to maximize the protection of patient privacy and to free clinical and scientific information from the confines of electronic healthcare systems, all stakeholders, including patients, health institutions and institutional review boards, scientists and the scientific communities, as well as regulatory and law enforcement agencies must collaborate closely. On the one hand, public health laws and privacy regulations define rules and responsibilities such as requesting and granting only the amount of health information that is necessary for the scientific study. On the other hand, developers of de-identification systems provide guidelines to use different modes of operations to maximize the effectiveness of their tools and the success of de-identification. Institutions with clinical repositories need to follow these rules and guidelines closely to successfully protect patient privacy. To open the gates of big data to scientific communities, healthcare institutions need to be supported in their de-identification and data sharing efforts by the public, scientific communities, and local, state, and federal legislators and government agencies.
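Purely to make the idea of text de-identification concrete, the sketch below replaces a few invented identifier patterns in a fabricated note with category tags. Real systems such as those the review discusses rely on much richer NLP and machine-learning methods; the patterns, labels, and example text here are illustrative placeholders only.

```python
# Toy rule-based de-identification: replace matched identifiers with tags.
import re

PATTERNS = {
    "DATE":  r"\b\d{1,2}/\d{1,2}/\d{2,4}\b",
    "PHONE": r"\b\d{3}[-.]\d{3}[-.]\d{4}\b",
    "MRN":   r"\bMRN[:\s]*\d+\b",
}

def deidentify(text):
    for label, pattern in PATTERNS.items():
        text = re.sub(pattern, f"[{label}]", text, flags=re.IGNORECASE)
    return text

note = "Pt seen 03/14/2017, MRN: 48291, call 555-867-5309 with results."
print(deidentify(note))
# -> "Pt seen [DATE], [MRN], call [PHONE] with results."
```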
Patient Privacy in the Era of Big Data
Kayaalp, Mehmet
2018-01-01
Privacy was defined as a fundamental human right in the Universal Declaration of Human Rights at the 1948 United Nations General Assembly. However, there is still no consensus on what constitutes privacy. In this review, we look at the evolution of privacy as a concept from the era of Hippocrates to the era of social media and big data. To appreciate the modern measures of patient privacy protection and correctly interpret the current regulatory framework in the United States, we need to analyze and understand the concepts of individually identifiable information, individually identifiable health information, protected health information, and de-identification. The Privacy Rule of the Health Insurance Portability and Accountability Act defines the regulatory framework and casts a balance between protective measures and access to health information for secondary (scientific) use. The rule defines the conditions when health information is protected by law and how protected health information can be de-identified for secondary use. With the advents of artificial intelligence and computational linguistics, computational text de-identification algorithms produce de-identified results nearly as well as those produced by human experts, but much faster, more consistently and basically for free. Modern clinical text de-identification systems now pave the road to big data and enable scientists to access de-identified clinical information while firmly protecting patient privacy. However, clinical text de-identification is not a perfect process. In order to maximize the protection of patient privacy and to free clinical and scientific information from the confines of electronic healthcare systems, all stakeholders, including patients, health institutions and institutional review boards, scientists and the scientific communities, as well as regulatory and law enforcement agencies must collaborate closely. On the one hand, public health laws and privacy regulations define rules and responsibilities such as requesting and granting only the amount of health information that is necessary for the scientific study. On the other hand, developers of de-identification systems provide guidelines to use different modes of operations to maximize the effectiveness of their tools and the success of de-identification. Institutions with clinical repositories need to follow these rules and guidelines closely to successfully protect patient privacy. To open the gates of big data to scientific communities, healthcare institutions need to be supported in their de-identification and data sharing efforts by the public, scientific communities, and local, state, and federal legislators and government agencies. PMID:28903886
GPU-Accelerated Molecular Modeling Coming Of Age
Stone, John E.; Hardy, David J.; Ufimtsev, Ivan S.
2010-01-01
Graphics processing units (GPUs) have traditionally been used in molecular modeling solely for visualization of molecular structures and animation of trajectories resulting from molecular dynamics simulations. Modern GPUs have evolved into fully programmable, massively parallel co-processors that can now be exploited to accelerate many scientific computations, typically providing about one order of magnitude speedup over CPU code and in special cases providing speedups of two orders of magnitude. This paper surveys the development of molecular modeling algorithms that leverage GPU computing, the advances already made and remaining issues to be resolved, and the continuing evolution of GPU technology that promises to become even more useful to molecular modeling. Hardware acceleration with commodity GPUs is expected to benefit the overall computational biology community by bringing teraflops performance to desktop workstations and in some cases potentially changing what were formerly batch-mode computational jobs into interactive tasks. PMID:20675161
The computational challenges of Earth-system science.
O'Neill, Alan; Steenman-Clark, Lois
2002-06-15
The Earth system--comprising atmosphere, ocean, land, cryosphere and biosphere--is an immensely complex system, involving processes and interactions on a wide range of space- and time-scales. To understand and predict the evolution of the Earth system is one of the greatest challenges of modern science, with success likely to bring enormous societal benefits. High-performance computing, along with the wealth of new observational data, is revolutionizing our ability to simulate the Earth system with computer models that link the different components of the system together. There are, however, considerable scientific and technical challenges to be overcome. This paper will consider four of them: complexity, spatial resolution, inherent uncertainty and time-scales. Meeting these challenges requires a significant increase in the power of high-performance computers. The benefits of being able to make reliable predictions about the evolution of the Earth system should, on their own, amply repay this investment.
GPU-accelerated molecular modeling coming of age.
Stone, John E; Hardy, David J; Ufimtsev, Ivan S; Schulten, Klaus
2010-09-01
Graphics processing units (GPUs) have traditionally been used in molecular modeling solely for visualization of molecular structures and animation of trajectories resulting from molecular dynamics simulations. Modern GPUs have evolved into fully programmable, massively parallel co-processors that can now be exploited to accelerate many scientific computations, typically providing about one order of magnitude speedup over CPU code and in special cases providing speedups of two orders of magnitude. This paper surveys the development of molecular modeling algorithms that leverage GPU computing, the advances already made and remaining issues to be resolved, and the continuing evolution of GPU technology that promises to become even more useful to molecular modeling. Hardware acceleration with commodity GPUs is expected to benefit the overall computational biology community by bringing teraflops performance to desktop workstations and in some cases potentially changing what were formerly batch-mode computational jobs into interactive tasks. (c) 2010 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Khodachenko, Maxim; Miller, Steven; Stoeckler, Robert; Topf, Florian
2010-05-01
Computational modeling and observational data analysis are two major aspects of modern scientific research. Both are nowadays undergoing extensive development and application. Many of the scientific goals of planetary space missions require robust models of planetary objects and environments as well as efficient data analysis algorithms, to predict conditions for mission planning and to interpret the experimental data. Europe has great strength in these areas, but it is insufficiently coordinated; individual groups, models, techniques and algorithms need to be coupled and integrated. The existing level of scientific cooperation and the technical capabilities for operative communication allow considerable progress in the development of a distributed international Research Infrastructure (RI) which is based on the computational modelling and data analysis centers existing in Europe, providing the scientific community with dedicated services in the fields of their computational and data analysis expertise. These services will appear as a product of the collaborative communication and joint research efforts of the numerical and data analysis experts together with planetary scientists. The major goal of the EUROPLANET-RI / EMDAF is to make computational models and data analysis algorithms associated with particular national RIs and teams, as well as their outputs, more readily available to their potential user community and more tailored to scientific user requirements, without compromising front-line specialized research on model and data analysis algorithm development and software implementation. This objective will be met through four key subdivisions/tasks of EMDAF: 1) an Interactive Catalogue of Planetary Models; 2) a Distributed Planetary Modelling Laboratory; 3) a Distributed Data Analysis Laboratory; and 4) enabling Models and Routines for High Performance Computing Grids. Using the advantages of coordinated operation and efficient communication between the involved computational modelling, research and data analysis expert teams and their related research infrastructures, EMDAF will provide a 1) flexible, 2) scientific-user-oriented, 3) continuously developing and fast-upgrading computational and data analysis service to support and intensify European planetary scientific research. At the beginning, EMDAF will create a set of demonstrators and operational tests of this service in key areas of European planetary science. This work will aim at the following objectives: (a) Development and implementation of tools for remote interactive communication between planetary scientists and computing experts (including related RIs); (b) Development of standard routine packages and user-friendly interfaces for the operation of existing numerical codes and data analysis algorithms by specialized planetary scientists; (c) Development of a prototype of numerical modelling services "on demand" for space missions and planetary researchers; (d) Development of a prototype of data analysis services "on demand" for space missions and planetary researchers; (e) Development of a prototype of coordinated, interconnected simulations of planetary phenomena and objects (global multi-model simulators); (f) Provision of demonstrators of the coordinated use of high-performance computing facilities (super-computer networks), done in cooperation with the European HPC Grid DEISA.
NASA Astrophysics Data System (ADS)
Davis, J. B.; Rigsby, C. A.; Muston, C.; Robinson, Z.; Morehead, A.; Stellwag, E. J.; Shinpaugh, J.; Thompson, A.; Teller, J.
2010-12-01
Graduate students and faculty at East Carolina University are working with area high schools to address the common science and mathematics deficiencies of many high school students. Project RaN (Reasoning about Nature), an interdisciplinary science/math/education research project, addresses these deficiencies by focusing on the history of science and the relationship between that history and modern scientific thought and practice. The geological sciences portion of project RaN has three specific goals: (1) to elucidate the relationships among the history of scientific discovery, the geological sciences, and modern scientific thought; (2) to develop, and utilize in the classroom, instructional modules that are relevant to the modern geological sciences curriculum and that relate fundamental scientific discoveries and principles to multiple disciplines and to modern societal issues; and (3) to use these activity-based modules to heighten students’ interest in science disciplines and to generate enthusiasm for doing science in both students and instructors. The educational modules that result from this linkage of modern and historical scientific thought are activity-based, directly related to the National Science Standards for the high school sciences curriculum, and adaptable to fit each state’s standard course of study for the sciences and math. They integrate historic sciences and mathematics with modern science, contain relevant background information on both the concept(s) and scientist(s) involved, present questions that compel students to think more deeply (both qualitatively and quantitatively) about the subject matter, and include threads that branch off to related topics. Modules on topics ranging from density to cladistics to Kepler’s laws of planetary motion have been developed and tested. Pre- and post-module data suggest that both students and teachers benefit from these interdisciplinary, historically based classroom experiences.
Supercomputing resources empowering superstack with interactive and integrated systems
NASA Astrophysics Data System (ADS)
Rückemann, Claus-Peter
2012-09-01
This paper presents the results from the development and implementation of Superstack algorithms to be used dynamically with integrated systems and supercomputing resources. Processing of geophysical data, here termed geoprocessing, is an essential part of the analysis of geoscientific data. The theory of Superstack algorithms, and their practical application on modern computing architectures, was inspired by developments that began with the processing of seismic data on mainframes and have led in recent years to high-end scientific computing applications. Several stacking algorithms are known, but when seismic data have a low signal-to-noise ratio, iterative algorithms like the Superstack can support analysis and interpretation. The new Superstack algorithms are in use, together with wave theory and optical phenomena, on high-performance computing resources for very large data sets as well as for sophisticated application scenarios in the geosciences and archaeology.
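The abstract does not give the Superstack iteration itself, so the sketch below illustrates only the baseline idea it builds on: stacking repeated noisy traces raises the signal-to-noise ratio roughly with the square root of the fold. The synthetic pulse, noise level, and fold are invented for illustration; this is a plain mean stack, not the Superstack algorithm.

```python
# Plain (mean) stacking of noisy traces; a simple stand-in for the idea behind
# iterative stacking schemes such as the Superstack.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 1000)
signal = np.exp(-((t - 0.5) ** 2) / 0.001)                  # one reflection pulse
traces = signal + 2.0 * rng.standard_normal((64, t.size))   # 64 noisy recordings of it

stack = traces.mean(axis=0)                                 # simple stack over the fold

noise_before = traces[:, :300].std()     # noise estimated in a signal-free window
noise_after = stack[:300].std()
print(noise_before / noise_after)        # close to sqrt(64) = 8
```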
Remote control system for high-performance computer simulation of crystal growth by the PFC method
NASA Astrophysics Data System (ADS)
Pavlyuk, Evgeny; Starodumov, Ilya; Osipov, Sergei
2017-04-01
Modeling of the crystallization process by the phase field crystal (PFC) method is one of the important directions of modern computational materials science. In this paper, the practical side of computer simulation of the crystallization process by the PFC method is investigated. To solve problems using this method, it is necessary to use high-performance computing clusters, data storage systems and other, often expensive, complex computer systems. Access to such resources is often limited, unstable and accompanied by various administrative problems. In addition, the variety of software and settings across different computing clusters sometimes does not allow researchers to use a unified program code; the code must be adapted to each configuration of the computing complex. The authors' practical experience has shown that a dedicated control system for such computations, with the possibility of remote use, can greatly simplify the execution of simulations and increase the productivity of scientific research. In the current paper we present the principal idea of such a system and justify its efficiency.
Zilsel's Thesis, Maritime Culture, and Iberian Science in Early Modern Europe.
Leitão, Henrique; Sánchez, Antonio
2017-01-01
Zilsel's thesis on the artisanal origins of modern science remains one of the most original proposals about the emergence of scientific modernity. We propose to inspect the scientific developments in Iberia in the early modern period using Zilsel's ideas as a guideline. Our purpose is to show that his ideas illuminate the situation in Iberia but also that the Iberian case is a remarkable illustration of Zilsel's thesis. Furthermore, we argue that Zilsel's thesis is essentially a sociological explanation that cannot be applied to isolated cases; its use implies global events that involve extended societies over large periods of time.
Provenance Challenges for Earth Science Dataset Publication
NASA Technical Reports Server (NTRS)
Tilmes, Curt
2011-01-01
Modern science is increasingly dependent on computational analysis of very large data sets. Organizing, referencing, and publishing those data has become a complex problem. Published research that depends on such data often fails to cite the data in sufficient detail to allow an independent scientist to reproduce the original experiments and analyses. This paper explores some of the challenges related to data identification, equivalence and reproducibility in the domain of data-intensive scientific processing. It will use the example of Earth Science satellite data, but the challenges also apply to other domains.
Modelling decision-making by pilots
NASA Technical Reports Server (NTRS)
Patrick, Nicholas J. M.
1993-01-01
Our scientific goal is to understand the process of human decision-making. Specifically, we seek a model of human decision-making in piloting modern commercial aircraft that prescribes optimal behavior and against which we can measure human sub-optimality. This model should help us understand such diverse aspects of piloting as strategic decision-making and the implicit decisions involved in attention allocation. Our engineering goal is to provide design specifications for (1) better computer-based decision aids, and (2) better training programs for the human pilot (or human decision-maker, DM).
NASA Astrophysics Data System (ADS)
Tyupikova, T. V.; Samoilov, V. N.
2003-04-01
Modern information technologies push the natural sciences toward further development. This, however, goes hand in hand with the evolution of infrastructures, which must create favorable conditions for the development of science and its financial base and must legally establish and protect new research. Any scientific development entails accounting and legal protection. In this report, we consider a new direction in the software, organization and control of shared databases, taking as an example the electronic document handling system that operates in several departments of the Joint Institute for Nuclear Research.
QMachine: commodity supercomputing in web browsers.
Wilkinson, Sean R; Almeida, Jonas S
2014-06-09
Ongoing advancements in cloud computing provide novel opportunities in scientific computing, especially for distributed workflows. Modern web browsers can now be used as high-performance workstations for querying, processing, and visualizing genomics' "Big Data" from sources like The Cancer Genome Atlas (TCGA) and the International Cancer Genome Consortium (ICGC) without local software installation or configuration. The design of QMachine (QM) was driven by the opportunity to use this pervasive computing model in the context of the Web of Linked Data in Biomedicine. QM is an open-sourced, publicly available web service that acts as a messaging system for posting tasks and retrieving results over HTTP. The illustrative application described here distributes the analyses of 20 Streptococcus pneumoniae genomes for shared suffixes. Because all analytical and data retrieval tasks are executed by volunteer machines, few server resources are required. Any modern web browser can submit those tasks and/or volunteer to execute them without installing any extra plugins or programs. A client library provides high-level distribution templates including MapReduce. This stark departure from the current reliance on expensive server hardware running "download and install" software has already gathered substantial community interest, as QM received more than 2.2 million API calls from 87 countries in 12 months. QM was found adequate to deliver the sort of scalable bioinformatics solutions that computation- and data-intensive workflows require. Paradoxically, the sandboxed execution of code by web browsers was also found to enable them, as compute nodes, to address critical privacy concerns that characterize biomedical environments.
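The abstract describes QM as a messaging system for posting tasks and retrieving results over HTTP. The sketch below shows that post-and-poll pattern from a client's point of view; the base URL, endpoint paths, and JSON field names are hypothetical placeholders, not QMachine's documented API.

```python
# Hypothetical client for a QM-style task-messaging service; endpoints and field
# names are invented to illustrate the post-task / poll-result pattern only.
import time
import requests

BASE = "https://example.org/qm"   # placeholder URL, not the real QM service

# Post a task for volunteer browsers to execute.
job = requests.post(f"{BASE}/tasks",
                    json={"code": "shared_suffixes.js",
                          "inputs": ["genome_01.fa", "genome_02.fa"]}).json()

# Poll until some volunteer browser has returned a result.
while True:
    status = requests.get(f"{BASE}/tasks/{job['id']}").json()
    if status.get("state") == "done":
        print(status["result"])
        break
    time.sleep(5)
```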
Challenges of Big Data Analysis.
Fan, Jianqing; Han, Fang; Liu, Han
2014-06-01
Big Data bring new opportunities to modern society and challenges to data scientists. On the one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features drive paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in high-confidence sets and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.
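The "spurious correlation" challenge mentioned above is easy to reproduce numerically: with many more predictors than samples, some predictors correlate strongly with a completely unrelated response purely by chance. The following small simulation is my own illustration, not the authors' code.

```python
# Spurious correlation in high dimensions: 5000 predictors, 50 samples, and a
# response that is independent of all of them by construction.
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 5000
X = rng.standard_normal((n, p))
y = rng.standard_normal(n)            # no true association with any column of X

Xc = X - X.mean(axis=0)
yc = y - y.mean()
corr = Xc.T @ yc / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc))
print(np.abs(corr).max())             # typically well above 0.5 despite zero true signal
```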
NASA Astrophysics Data System (ADS)
Nelson, E.; L'Ecuyer, T. S.; Douglas, A.; Hansen, Z.
2017-12-01
In the modern computing age, scientists must utilize a wide variety of skills to carry out scientific research. Programming, including a focus on collaborative development, has become more prevalent in both academic and professional career paths. Faculty in the Department of Atmospheric and Oceanic Sciences at the University of Wisconsin—Madison recognized this need and recently approved a new course offering for undergraduates and postgraduates in computational methods that was first held in Spring 2017. Three programming languages were covered in the inaugural course semester and development themes such as modularization, data wrangling, and conceptual code models were woven into all of the sections. In this presentation, we will share successes and challenges in developing a research project-focused computational course that leverages hands-on computer laboratory learning and open-sourced course content. Improvements and changes in future iterations of the course based on the first offering will also be discussed.
'Introspectionism' and the mythical origins of scientific psychology.
Costall, Alan
2006-12-01
According to the majority of the textbooks, the history of modern, scientific psychology can be tidily encapsulated in the following three stages. Scientific psychology began with a commitment to the study of mind, but based on the method of introspection. Watson rejected introspectionism as both unreliable and effete, and redefined psychology, instead, as the science of behaviour. The cognitive revolution, in turn, reinstated the mind as the subject of study, and rejected both behaviourism and a reliance on introspection. This paper argues that all three stages of this history are largely mythical. Introspectionism was never a dominant movement within modern psychology, and the method of introspection never went away. Furthermore, this version of psychology's history obscures some deep conceptual problems, not least surrounding the modern conception of "behaviour," that continue to make the scientific study of consciousness seem so weird.
Golubeva, E Yu
Modern terminology on active and healthy aging used in scientific and project activities is discussed. The WHO concept of active aging, which has no precise, universally agreed definition, and its main determinants are analyzed. The directions of scientific expertise in the major European projects are outlined: INNOVAGE, the assessment of potentially profitable social innovations relating to welfare, quality of life and health in old age; and MOPACT, the interplay between demographic development and the main dimensions of the economic and social contribution of older persons. The approach to implementing a policy of active and healthy longevity as a valuable asset of modern society is underlined.
Reflections on the history of indoor air science, focusing on the last 50 years.
Sundell, J
2017-07-01
The scientific articles and Indoor Air conference publications of the indoor air sciences (IAS) during the last 50 years are summarized. In total 7524 presentations, from 79 countries, have been made at Indoor Air conferences held between 1978 (49 presentations) and 2014 (1049 presentations). In the Web of Science, 26 992 articles on indoor air research (with the word "indoor" as a search term) have been found (as of 1 Jan 2016) of which 70% were published during the last 10 years. The modern scientific history started in the 1970s with a question: "did indoor air pose a threat to health as did outdoor air?" Soon it was recognized that indoor air is more important, from a health point of view, than outdoor air. Topics of concern were first radon, environmental tobacco smoke, and lung cancer, followed by volatile organic compounds, formaldehyde and sick building syndrome, house dust-mites, asthma and allergies, Legionnaires disease, and other airborne infections. Later emerged dampness/mold-associated allergies and today's concern with "modern exposures-modern diseases." Ventilation, thermal comfort, indoor air chemistry, semi-volatile organic compounds, building simulation by computational fluid dynamics, and fine particulate matter are common topics today. From their beginning in Denmark and Sweden, then in the USA, the indoor air sciences now show increasing activity in East and Southeast Asia. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Applied Mathematics at the U.S. Department of Energy: Past, Present and a View to the Future
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, D L; Bell, J; Estep, D
2008-02-15
Over the past half-century, the Applied Mathematics program in the U.S. Department of Energy's Office of Advanced Scientific Computing Research has made significant, enduring advances in applied mathematics that have been essential enablers of modern computational science. Motivated by the scientific needs of the Department of Energy and its predecessors, advances have been made in mathematical modeling, numerical analysis of differential equations, optimization theory, mesh generation for complex geometries, adaptive algorithms and other important mathematical areas. High-performance mathematical software libraries developed through this program have contributed as much or more to the performance of modern scientific computer codes as the high-performance computers on which these codes run. The combination of these mathematical advances and the resulting software has enabled high-performance computers to be used for scientific discovery in ways that could only be imagined at the program's inception. Our nation, and indeed our world, face great challenges that must be addressed in coming years, and many of these will be addressed through the development of scientific understanding and engineering advances yet to be discovered. The U.S. Department of Energy (DOE) will play an essential role in providing science-based solutions to many of these problems, particularly those that involve the energy, environmental and national security needs of the country. As the capability of high-performance computers continues to increase, the types of questions that can be answered by applying this huge computational power become more varied and more complex. It will be essential that we find new ways to develop and apply the mathematics necessary to enable the new scientific and engineering discoveries that are needed. In August 2007, a panel of experts in applied, computational and statistical mathematics met for a day and a half in Berkeley, California to understand the mathematical developments required to meet the future science and engineering needs of the DOE. It is important to emphasize that the panelists were not asked to speculate only on advances that might be made in their own research specialties. Instead, the guidance this panel was given was to consider the broad science and engineering challenges that the DOE faces and identify the corresponding advances that must occur across the field of mathematics for these challenges to be successfully addressed. As preparation for the meeting, each panelist was asked to review strategic planning and other informational documents available for one or more of the DOE Program Offices, including the Offices of Science, Nuclear Energy, Fossil Energy, Environmental Management, Legacy Management, Energy Efficiency & Renewable Energy, Electricity Delivery & Energy Reliability and Civilian Radioactive Waste Management as well as the National Nuclear Security Administration. The panelists reported on science and engineering needs for each of these offices, and then discussed and identified mathematical advances that will be required if these challenges are to be met. A review of DOE challenges in energy, the environment and national security brings to light a broad and varied array of questions that the DOE must answer in the coming years. A representative subset of such questions includes: (1) Can we predict the operating characteristics of a clean coal power plant? (2) How stable is the plasma containment in a tokamak?
(3) How quickly is climate change occurring and what are the uncertainties in the predicted time scales? (4) How quickly can an introduced bio-weapon contaminate the agricultural environment in the US? (5) How do we modify models of the atmosphere and clouds to incorporate newly collected data, possibly of new types? (6) How quickly can the United States recover if part of the power grid became inoperable? (7) What are optimal locations and communication protocols for sensing devices in a remote-sensing network? (8) How can new materials be designed with a specified desirable set of properties? In comparing and contrasting these and other questions of importance to DOE, the panel found that while the scientific breadth of the requirements is enormous, a central theme emerges: Scientists are being asked to identify or provide technology, or to give expert analysis to inform policy-makers, that requires the scientific understanding of increasingly complex physical and engineered systems. In addition, as the complexity of the systems of interest increases, neither experimental observation nor mathematical and computational modeling alone can access all components of the system over the entire range of scales or conditions needed to provide the required scientific understanding.
The challenges of developing computational physics: the case of South Africa
NASA Astrophysics Data System (ADS)
Salagaram, T.; Chetty, N.
2013-08-01
Most modern scientific research problems are complex and interdisciplinary in nature. It is impossible to study such problems in detail without the use of computation in addition to theory and experiment. Although it is widely agreed that students should be introduced to computational methods at the undergraduate level, it remains a challenge to do this in a full traditional undergraduate curriculum. In this paper, we report on a survey that we conducted of undergraduate physics curricula in South Africa to determine the content and the approach taken in the teaching of computational physics. We also considered the pedagogy of computational physics at the postgraduate and research levels at various South African universities, research facilities and institutions. We conclude that the state of computational physics training in South Africa, especially at the undergraduate teaching level, is generally weak and needs to be given more attention at all universities. Failure to do so will impact negatively on the country's capacity to grow its endeavours in the computational sciences, with negative consequences for research, commerce and industry.
Diana, Esther
2008-01-01
The scientific collections of the Florentine Santa Maria Nuova Hospital stimulated new interest in the second half of the eighteenth century. Indeed, the modernization process of the Hospital led to a steadily increasing alienation of its rich historical heritage, including the scientific collections. Archival documents record the sale, or the museum valorization, of a number of collections, including the mathematical instruments and the anatomical, surgical and obstetrical wax collections.
Structure and Evolution of Scientific Collaboration Networks in a Modern Research Collaboratory
ERIC Educational Resources Information Center
Pepe, Alberto
2010-01-01
This dissertation is a study of scientific collaboration at the Center for Embedded Networked Sensing (CENS), a modern, multi-disciplinary, distributed laboratory involved in sensor network research. By use of survey research and network analysis, this dissertation examines the collaborative ecology of CENS in terms of three networks of…
ERIC Educational Resources Information Center
Nikolaevskaya, Olga
2015-01-01
Neuromanagement of higher education is an effective tool for the development of higher education, professional identification of specialist, increase of the professional authority and prestige of modern scientific and research work. The target point of neuromanagement system is competitiveness of the modern university graduate whose competence…
InSAR Scientific Computing Environment
NASA Astrophysics Data System (ADS)
Gurrola, E. M.; Rosen, P. A.; Sacco, G.; Zebker, H. A.; Simons, M.; Sandwell, D. T.
2010-12-01
The InSAR Scientific Computing Environment (ISCE) is a software development effort in its second year within the NASA Advanced Information Systems and Technology program. The ISCE will provide a new computing environment for geodetic image processing for InSAR sensors that will enable scientists to reduce measurements directly from radar satellites and aircraft to new geophysical products without first requiring them to develop detailed expertise in radar processing methods. The environment can serve as the core of a centralized processing center to bring Level-0 raw radar data up to Level-3 data products, but is adaptable to alternative processing approaches for science users interested in new and different ways to exploit mission data. The NRC Decadal Survey-recommended DESDynI mission will deliver data of unprecedented quantity and quality, making possible global-scale studies in climate research, natural hazards, and Earth's ecosystem. The InSAR Scientific Computing Environment is planned to become a key element in processing DESDynI data into higher level data products, and it is expected to enable a new class of analyses that take greater advantage of the long time and large spatial scales of these new data than current approaches do. At the core of ISCE is both legacy processing software from the JPL/Caltech ROI_PAC repeat-pass interferometry package as well as a new InSAR processing package containing more efficient and more accurate processing algorithms being developed at Stanford for this project that is based on experience gained in developing processors for missions such as SRTM and UAVSAR. Around the core InSAR processing programs we are building object-oriented wrappers to enable their incorporation into a more modern, flexible, extensible software package that is informed by modern programming methods, including rigorous componentization of processing codes, abstraction and generalization of data models, and a robust, intuitive user interface with graduated exposure to the levels of sophistication, allowing novices to apply it readily for common tasks and experienced users to mine data with great facility and flexibility. The environment is designed to easily allow user contributions, enabling an open source community to extend the framework into the indefinite future. In this paper we briefly describe both the legacy and the new core processing algorithms and their integration into the new computing environment. We describe the ISCE component and application architecture and the features that permit the desired flexibility, extensibility and ease-of-use. We summarize the state of progress of the environment and the plans for completion of the environment and for its future introduction into the radar processing community.
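The componentization idea described above can be pictured with a toy wrapper that gives a legacy routine a uniform configure/run interface so it can be composed into a pipeline. This is a schematic sketch, not ISCE code; the Component and RangeCompression names and their methods are invented for illustration.

```python
# Schematic component wrapper around a legacy processing step (not actual ISCE code).
class Component:
    """Uniform configure/run interface for wrapped processing steps."""
    def __init__(self, **params):
        self.params = dict(params)

    def configure(self, **updates):
        self.params.update(updates)
        return self

    def run(self, data):
        raise NotImplementedError


class RangeCompression(Component):      # hypothetical component name
    def run(self, raw_samples):
        gain = self.params.get("gain", 1.0)
        return [s * gain for s in raw_samples]   # stand-in for the real focusing step


pipeline = [RangeCompression(gain=0.5)]
data = [1.0, 2.0, 3.0]
for step in pipeline:
    data = step.configure(gain=0.8).run(data)    # components chain through a common interface
print(data)
```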
InSAR Scientific Computing Environment - The Home Stretch
NASA Astrophysics Data System (ADS)
Rosen, P. A.; Gurrola, E. M.; Sacco, G.; Zebker, H. A.
2011-12-01
The Interferometric Synthetic Aperture Radar (InSAR) Scientific Computing Environment (ISCE) is a software development effort in its third and final year within the NASA Advanced Information Systems and Technology program. The ISCE is a new computing environment for geodetic image processing for InSAR sensors enabling scientists to reduce measurements directly from radar satellites to new geophysical products with relative ease. The environment can serve as the core of a centralized processing center to bring Level-0 raw radar data up to Level-3 data products, but is adaptable to alternative processing approaches for science users interested in new and different ways to exploit mission data. Upcoming international SAR missions will deliver data of unprecedented quantity and quality, making possible global-scale studies in climate research, natural hazards, and Earth's ecosystem. The InSAR Scientific Computing Environment has the functionality to become a key element in processing data from NASA's proposed DESDynI mission into higher level data products, supporting a new class of analyses that take advantage of the long time and large spatial scales of these new data. At the core of ISCE is a new set of efficient and accurate InSAR algorithms. These algorithms are placed into an object-oriented, flexible, extensible software package that is informed by modern programming methods, including rigorous componentization of processing codes, abstraction and generalization of data models. The environment is designed to easily allow user contributions, enabling an open source community to extend the framework into the indefinite future. ISCE supports data from nearly all of the available satellite platforms, including ERS, EnviSAT, Radarsat-1, Radarsat-2, ALOS, TerraSAR-X, and Cosmo-SkyMed. The code applies a number of parallelization techniques and sensible approximations for speed. It is configured to work on modern linux-based computers with gcc compilers and python. ISCE is now a complete, functional package, under configuration management, and with extensive documentation and tested use cases appropriate to geodetic imaging applications. The software has been tested with canonical simulated radar data ("point targets") as well as with a variety of existing satellite data, cross-compared with other software packages. Its extensibility has already been proven by the straightforward addition of polarimetric processing and calibration, and derived filtering and estimation routines associated with polarimetry that supplement the original InSAR geodetic functionality. As of October 2011, the software is available for non-commercial use through UNAVCO's WinSAR consortium.
Trajectory Optimization for Missions to Small Bodies with a Focus on Scientific Merit.
Englander, Jacob A; Vavrina, Matthew A; Lim, Lucy F; McFadden, Lucy A; Rhoden, Alyssa R; Noll, Keith S
2017-01-01
Trajectory design for missions to small bodies is tightly coupled both with the selection of targets for a mission and with the choice of spacecraft power, propulsion, and other hardware. Traditional methods of trajectory optimization have focused on finding the optimal trajectory for an a priori selection of destinations and spacecraft parameters. Recent research has expanded the field of trajectory optimization to multidisciplinary systems optimization that includes spacecraft parameters. The logical next step is to extend the optimization process to include target selection based not only on engineering figures of merit but also on scientific value. This paper presents a new technique to solve the multidisciplinary mission optimization problem for small-body missions, including classical trajectory design, the choice of spacecraft power and propulsion systems, and the scientific value of the targets. This technique, when combined with modern parallel computers, enables a holistic view of the small-body mission design process that previously required iteration among several different design processes.
The End of the Rainbow? Color Schemes for Improved Data Graphics
NASA Astrophysics Data System (ADS)
Light, Adam; Bartlein, Patrick J.
2004-10-01
Modern computer displays and printers enable the widespread use of color in scientific communication, but the expertise for designing effective graphics has not kept pace with the technology for producing them. Historically, even the most prestigious publications have tolerated high defect rates in figures and illustrations, and technological advances that make creating and reproducing graphics easier do not appear to have decreased the frequency of errors. Flawed graphics consequently beget more flawed graphics as authors emulate published examples. Color has the potential to enhance communication, but design mistakes can result in color figures that are less effective than gray scale displays of the same data. Empirical research on human subjects can build a fundamental understanding of visual perception and scientific methods can be used to evaluate existing designs, but creating effective data graphics is a design task and not fundamentally a scientific pursuit. Like writing well, creating good data graphics requires a combination of formal knowledge and artistic sensibility tempered by experience: a combination of "substance, statistics, and design".
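A quick way to see the point about design choices is to render the same scalar field with a rainbow palette and with a perceptually uniform one. The sketch below does this with Matplotlib; the synthetic field and the choice of "jet" versus "viridis" are my own illustration, not the authors' figures.

```python
# Side-by-side comparison of a rainbow colormap and a perceptually uniform one.
import numpy as np
import matplotlib.pyplot as plt

x, y = np.meshgrid(np.linspace(-3, 3, 200), np.linspace(-3, 3, 200))
z = np.exp(-(x**2 + y**2)) * np.cos(3 * x)          # synthetic scalar field

fig, axes = plt.subplots(1, 2, figsize=(9, 4))
for ax, cmap in zip(axes, ["jet", "viridis"]):      # rainbow vs. perceptually uniform
    im = ax.pcolormesh(x, y, z, cmap=cmap, shading="auto")
    fig.colorbar(im, ax=ax)
    ax.set_title(cmap)
fig.tight_layout()
plt.show()
```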
Network-based statistical comparison of citation topology of bibliographic databases
Šubelj, Lovro; Fiala, Dalibor; Bajec, Marko
2014-01-01
Modern bibliographic databases provide the basis for scientific research and its evaluation. While their content and structure differ substantially, there exist only informal notions on their reliability. Here we compare the topological consistency of citation networks extracted from six popular bibliographic databases including Web of Science, CiteSeer and arXiv.org. The networks are assessed through a rich set of local and global graph statistics. We first reveal statistically significant inconsistencies between some of the databases with respect to individual statistics. For example, the introduced field bow-tie decomposition of DBLP Computer Science Bibliography substantially differs from the rest due to the coverage of the database, while the citation information within arXiv.org is the most exhaustive. Finally, we compare the databases over multiple graph statistics using the critical difference diagram. The citation topology of DBLP Computer Science Bibliography is the least consistent with the rest, while, not surprisingly, Web of Science is significantly more reliable from the perspective of consistency. This work can serve either as a reference for scholars in bibliometrics and scientometrics or as a scientific evaluation guideline for governments and research agencies. PMID:25263231
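The kind of local and global graph statistics used in such comparisons can be computed with standard tools. The sketch below, which is not the paper's code, evaluates a few of them on a random directed graph standing in for a citation network, using NetworkX.

```python
# A few local and global statistics on a toy directed graph (stand-in for a citation network).
import networkx as nx

G = nx.gnp_random_graph(500, 0.01, seed=1, directed=True)

stats = {
    "nodes": G.number_of_nodes(),
    "edges": G.number_of_edges(),
    "mean_in_degree": sum(d for _, d in G.in_degree()) / G.number_of_nodes(),
    "avg_clustering": nx.average_clustering(G.to_undirected()),
    "largest_scc": len(max(nx.strongly_connected_components(G), key=len)),
}
print(stats)
```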
Computing at the speed limit (supercomputers)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernhard, R.
1982-07-01
The author discusses how unheralded efforts in the United States, mainly in universities, have removed major stumbling blocks to building cost-effective superfast computers for scientific and engineering applications within five years. These computers would have sustained speeds of billions of floating-point operations per second (flops), whereas with the fastest machines today the top sustained speed is only 25 million flops, with bursts to 160 megaflops. Cost-effective superfast machines can be built because of advances in very large-scale integration and the special software needed to program the new machines. VLSI greatly reduces the cost per unit of computing power. The development of such computers would come at an opportune time. Although the US leads the world in large-scale computer technology, its supremacy is now threatened, not surprisingly, by the Japanese. Publicized reports indicate that the Japanese government is funding a cooperative effort by commercial computer manufacturers to develop superfast computers, about 1000 times faster than modern supercomputers. The US computer industry, by contrast, has balked at attempting to boost computer power so sharply because of the uncertain market for the machines and the failure of similar projects in the past to show significant results.
NASA Astrophysics Data System (ADS)
Chulaki, A.; Kuznetsova, M. M.; Rastaetter, L.; MacNeice, P. J.; Shim, J. S.; Pulkkinen, A. A.; Taktakishvili, A.; Mays, M. L.; Mendoza, A. M. M.; Zheng, Y.; Mullinix, R.; Collado-Vega, Y. M.; Maddox, M. M.; Pembroke, A. D.; Wiegand, C.
2015-12-01
Community Coordinated Modeling Center (CCMC) is a NASA affiliated interagency partnership with the primary goal of aiding the transition of modern space science models into space weather forecasting while supporting space science research. Additionally, over the past ten years it has established itself as a global space science education resource supporting undergraduate and graduate education and research, and spreading space weather awareness worldwide. A unique combination of assets, capabilities and close ties to the scientific and educational communities enable this small group to serve as a hub for raising generations of young space scientists and engineers. CCMC resources are publicly available online, providing unprecedented global access to the largest collection of modern space science models (developed by the international research community). CCMC has revolutionized the way simulations are utilized in classroom settings, student projects, and scientific labs and serves hundreds of educators, students and researchers every year. Another major CCMC asset is an expert space weather prototyping team primarily serving NASA's interplanetary space weather needs. Capitalizing on its unrivaled capabilities and experiences, the team provides in-depth space weather training to students and professionals worldwide, and offers an amazing opportunity for undergraduates to engage in real-time space weather monitoring, analysis, forecasting and research. In-house development of state-of-the-art space weather tools and applications provides exciting opportunities to students majoring in computer science and computer engineering fields to intern with the software engineers at the CCMC while also learning about space weather from NASA scientists.
[Elucidating! But how? Insights into the impositions of modern science communication].
Lehmkuh, Markus
2015-01-01
The talk promotes the view that science communication should abandon the claim that scientific information can convince others. This is identified as one of the impositions to which modern science communication is exposed. Instead of convincing others, science communication should focus on identifying societally relevant scientific knowledge and on communicating it accurately and coherently.
Reviews in Modern Astronomy 12, Astronomical Instruments and Methods at the turn of the 21st Century
NASA Astrophysics Data System (ADS)
Schielicke, Reinhard E.
The yearbook series Reviews in Modern Astronomy of the Astronomische Gesellschaft (AG) was established in 1988 in order to bring the scientific events of the meetings of the society to the attention of the worldwide astronomical community. Reviews in Modern Astronomy is devoted exclusively to the invited Reviews, the Karl Schwarzschild Lectures, the Ludwig Biermann Award Lectures, and the highlight contributions from leading scientists reporting on recent progress and scientific achievements at their respective research institutes. Volume 12 continues the yearbook series with 16 contributions which were presented during the International Scientific Conference of the AG on "Astronomical Instruments and Methods at the Turn of the 21st Century" at Heidelberg from September 14 to 19, 1998.
A Framework for the Design of Effective Graphics for Scientific Visualization
NASA Technical Reports Server (NTRS)
Miceli, Kristina D.
1992-01-01
This proposal presents a visualization framework, based on a data model, that supports the production of effective graphics for scientific visualization. Visual representations are effective only if they augment comprehension of the increasing amounts of data being generated by modern computer simulations. These representations are created by taking into account the goals and capabilities of the scientist, the type of data to be displayed, and software and hardware considerations. This framework is embodied in an assistant-based visualization system to guide the scientist in the visualization process. This will improve the quality of the visualizations and decrease the time the scientist is required to spend in generating the visualizations. I intend to prove that such a framework will create a more productive environment for the analysis and interpretation of large, complex data sets.
The Center for Computational Biology: resources, achievements, and challenges.
Toga, Arthur W; Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott
2012-01-01
The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques, to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators includes the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains.
Expanding the use of Scientific Data through Maps and Apps
NASA Astrophysics Data System (ADS)
Shrestha, S. R.; Zimble, D. A.; Herring, D.; Halpert, M.
2014-12-01
The importance of making scientific data more available can't be overstated. There is a wealth of useful scientific data available, and demand for these data is only increasing; however, applying scientific data toward practical uses poses several technical challenges. These challenges arise largely from 1) the complexity, variety and volume of scientific data and 2) the difficulty of applying and operating the techniques and tools needed to visualize and analyze the data. As a result, taking advantage of these data requires highly specialized skill sets that, in total, keep scientific data from being used in more practical day-to-day decision-making activities. While these challenges are daunting, information technologies exist that can help mitigate some of these issues. Many organizations have for years enjoyed the benefits of modern service oriented architectures (SOAs) for everyday enterprise tasks. We can use this approach to modernize how we share and access our scientific data, where much of the specialized tooling needed to handle and present scientific data can be automated and executed by servers in an appropriate way. We will discuss and show an approach for preparing file-based scientific data (e.g. GRIB, netCDF) for use in standards-based scientific web services. These scientific web services are able to encapsulate the logic needed to handle and describe scientific data through a variety of service types, including image, map, feature, and geoprocessing services and their respective service methods. By combining these types of services and leveraging well-documented and modern web development APIs, we can focus our attention on the design and development of user-friendly maps and apps. Our scenario will include developing online maps through these services by integrating various forecast data from the Climate Forecast System (CFSv2). This presentation showcases a collaboration between the National Oceanic and Atmospheric Administration's (NOAA) Climate.gov portal, the Climate Prediction Center, and Esri, Inc. on the implementation of the ArcGIS platform, which is aimed at helping modernize scientific data access through a service oriented architecture.
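As a generic, technology-neutral sketch of wrapping file-based scientific data in a web service (not the ArcGIS implementation described above), the snippet below serves a time series from a netCDF file as JSON. The file name "cfs_forecast.nc", the variable "t2m", and the coordinate names are assumptions for illustration.

```python
# Minimal JSON-over-HTTP wrapper around a local netCDF file (hypothetical names).
from flask import Flask, jsonify, request
import xarray as xr

app = Flask(__name__)
ds = xr.open_dataset("cfs_forecast.nc")   # assumed forecast file with lat/lon/time coords

@app.route("/t2m")
def t2m():
    lat = float(request.args.get("lat", 0.0))
    lon = float(request.args.get("lon", 0.0))
    series = ds["t2m"].sel(lat=lat, lon=lon, method="nearest")  # nearest grid point
    return jsonify(time=[str(t) for t in series["time"].values],
                   values=series.values.tolist())

if __name__ == "__main__":
    app.run()
```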
Master of Puppets: Cooperative Multitasking for In Situ Processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morozov, Dmitriy; Lukic, Zarija
2016-01-01
Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. Here, we present a novel design for running multiple codes in situ: using coroutines and position-independent executables we enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output, as well as to process it on the fly, both in situ and in transit. We present Henson, an implementation of our design, and illustrate its versatility by tackling analysis tasks with different computational requirements. This design differs significantly from the existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The techniques we present can also be integrated into other in situ frameworks.
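A toy Python analogue of the cooperative-multitasking idea (not Henson itself, which works at the level of coroutines and position-independent executables) is shown below: the simulation yields control after every step and the analysis consumes the state while it is still in memory.

```python
# In situ analysis via cooperative hand-off: the simulation never writes to disk.
import numpy as np

def simulation(steps, n):
    """Stand-in simulation: yields control to the consumer after every time step."""
    state = np.zeros(n)
    for t in range(steps):
        state += np.random.standard_normal(n)   # pretend time step
        yield t, state                          # cooperative hand-off, data stays in memory

def analysis(run):
    """In situ analysis: consumes each state as it is produced."""
    for t, state in run:
        print(f"step {t}: mean={state.mean():+.4f}  max={state.max():.3f}")

analysis(simulation(steps=5, n=1_000_000))
```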
Heliophysics Legacy Data Restoration
NASA Astrophysics Data System (ADS)
Candey, R. M.; Bell, E. V., II; Bilitza, D.; Chimiak, R.; Cooper, J. F.; Garcia, L. N.; Grayzeck, E. J.; Harris, B. T.; Hills, H. K.; Johnson, R. C.; Kovalick, T. J.; Lal, N.; Leckner, H. A.; Liu, M. H.; McCaslin, P. W.; McGuire, R. E.; Papitashvili, N. E.; Rhodes, S. A.; Roberts, D. A.; Yurow, R. E.
2016-12-01
The Space Physics Data Facility (SPDF)
General practice--a post-modern specialty?
Mathers, N; Rowland, S
1997-01-01
The 'modern' view of the world is based on the premise that we can discover the essential truth of the world using scientific method. The assumption is made that knowledge so acquired has been 'uncontaminated' by the mind of the investigator. Post-modern theory, however, is concerned with the process of knowing and how our minds are part of the process, i.e. our perceptions of reality and the relationships between different concepts are important influences on our ways of knowing. The values of post-modern theory are those of uncertainty, many different voices and experiences of reality and multifaceted descriptions of truth. These values are closer to our experience of general practice than the 'modern' values of scientific rationalism and should be reflected in a new curriculum for general practice. PMID:9167325
Explicit integration with GPU acceleration for large kinetic networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brock, Benjamin; Computer Science and Mathematics Division, Oak Ridge National Laboratory, Oak Ridge, TN 37830; Belt, Andrew
2015-12-01
We demonstrate the first implementation of recently-developed fast explicit kinetic integration algorithms on modern graphics processing unit (GPU) accelerators. Taking as a generic test case a Type Ia supernova explosion with an extremely stiff thermonuclear network having 150 isotopic species and 1604 reactions coupled to hydrodynamics using operator splitting, we demonstrate the capability to solve of order 100 realistic kinetic networks in parallel in the same time that standard implicit methods can solve a single such network on a CPU. This orders-of-magnitude decrease in computation time for solving systems of realistic kinetic networks implies that important coupled, multiphysics problems in various scientific and technical fields that were intractable, or could be simulated only with highly schematic kinetic networks, are now computationally feasible.
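The stabilized explicit algorithms referenced above are far more sophisticated than plain forward Euler, but the sketch below shows the basic explicit update for a toy three-species network; it is the per-species independence of such updates that maps naturally onto GPU threads. The network, rate constants, and step size are invented for illustration.

```python
# Forward-Euler update for a toy A -> B -> C kinetic network (illustration only).
import numpy as np

def rhs(y, k1=1.0, k2=0.5):
    """Right-hand side of the toy network."""
    a, b, c = y
    return np.array([-k1 * a,            # A is consumed
                     k1 * a - k2 * b,    # B is produced from A, consumed into C
                     k2 * b])            # C accumulates

y = np.array([1.0, 0.0, 0.0])
dt = 1.0e-3
for _ in range(5000):
    y = y + dt * rhs(y)   # explicit update; in the GPU setting, one species per thread
print(y, y.sum())          # abundances after t = 5; total mass stays ~1
```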
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cary, J.R.
During the most recent funding period the authors obtained results important for helical confinement systems and in the use of modern computational methods for modeling of fusion systems. The most recent results include showing that the set of magnetic field functions that are omnigenous (i.e., the bounce-average drift lies within the flux surface) and, therefore, have good transport properties, is much larger than the set of quasihelical systems. This is important as quasihelical systems exist only for large aspect ratio. The authors have also carried out extensive earlier work on developing integrable three-dimensional magnetic fields, on trajectories in three-dimensional configurations, and on the existence of three-dimensional MHD equilibria close to vacuum integrable fields. At the same time they have been investigating the use of object oriented methods for scientific computing.
ChemCalc: a building block for tomorrow's chemical infrastructure.
Patiny, Luc; Borel, Alain
2013-05-24
Web services, as an aspect of cloud computing, are becoming an important part of the general IT infrastructure, and scientific computing is no exception to this trend. We propose a simple approach to develop chemical Web services, through which servers could expose the essential data manipulation functionality that students and researchers need for chemical calculations. These services return their results as JSON (JavaScript Object Notation) objects, which facilitates their use for Web applications. The ChemCalc project http://www.chemcalc.org demonstrates this approach: we present three Web services related with mass spectrometry, namely isotopic distribution simulation, peptide fragmentation simulation, and molecular formula determination. We also developed a complete Web application based on these three Web services, taking advantage of modern HTML5 and JavaScript libraries (ChemDoodle and jQuery).
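Consuming such a JSON-returning service from a script is straightforward; the sketch below shows the pattern with the requests library. The endpoint path and parameter names are assumptions rather than ChemCalc's documented interface, so treat them as placeholders and consult the project site for the real API.

```python
# Pattern for calling a JSON-over-HTTP chemistry service; endpoint and parameters
# below are assumed, not taken from the ChemCalc documentation.
import requests

resp = requests.get("http://www.chemcalc.org/chemcalc/mf",   # assumed endpoint path
                    params={"mf": "C6H12O6"})                 # assumed parameter name
data = resp.json()          # a plain JSON object, easy to reuse in a web application
print(sorted(data.keys()))  # inspect which fields (masses, isotopic pattern, ...) came back
```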
Xarray: multi-dimensional data analysis in Python
NASA Astrophysics Data System (ADS)
Hoyer, Stephan; Hamman, Joe; Maussion, Fabien
2017-04-01
xarray (http://xarray.pydata.org) is an open source project and Python package that provides a toolkit and data structures for N-dimensional labeled arrays, which are the bread and butter of modern geoscientific data analysis. Key features of the package include label-based indexing and arithmetic, interoperability with the core scientific Python packages (e.g., pandas, NumPy, Matplotlib, Cartopy), out-of-core computation on datasets that don't fit into memory, a wide range of input/output options, and advanced multi-dimensional data manipulation tools such as group-by and resampling. In this contribution we will present the key features of the library and demonstrate its great potential for a wide range of applications, from (big-)data processing on super computers to data exploration in front of a classroom.
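A short example of the features named above, on a synthetic dataset: label-based selection, resampling, and group-by arithmetic. The variable names and frequencies are illustrative choices, not part of the cited abstract.

```python
# Labeled indexing, resampling, and group-by arithmetic with xarray on synthetic data.
import numpy as np
import pandas as pd
import xarray as xr

time = pd.date_range("2000-01-01", periods=365, freq="D")
da = xr.DataArray(np.random.rand(365, 3),
                  coords={"time": time, "site": ["a", "b", "c"]},
                  dims=("time", "site"), name="precip")

summer = da.sel(time=slice("2000-06-01", "2000-08-31"))                      # label-based indexing
monthly = da.resample(time="MS").mean()                                      # resample to month starts
anomaly = da.groupby("time.month") - da.groupby("time.month").mean("time")   # group-by arithmetic
print(monthly.sel(site="a").values[:3], float(anomaly.mean()))
```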
Creating technical heritage object replicas in a virtual environment
NASA Astrophysics Data System (ADS)
Egorova, Olga; Shcherbinin, Dmitry
2016-03-01
The paper presents innovative informatics methods for creating virtual replicas of technical heritage objects, which are of significant scientific and practical importance not only to researchers but to the public in general. By performing 3D modeling and animation of aircraft, spaceships, architectural and engineering structures, and other technical objects, learning is supported while the replicas are preserved for future generations. Modern approaches based on the wide use of computer technologies attract a greater number of young people to explore the history of science and technology and renew their interest in the field of mechanical engineering.
The Contingency of Laws of Nature in Science and Theology
NASA Astrophysics Data System (ADS)
Jaeger, Lydia
2010-10-01
The belief that laws of nature are contingent played an important role in the emergence of the empirical method of modern physics. During the scientific revolution, this belief was based on the idea of voluntary creation. Taking up Peter Mittelstaedt’s work on laws of nature, this article explores several alternative answers which do not overtly make use of metaphysics: some laws are laws of mathematics; macroscopic laws can emerge from the interplay of numerous subsystems without any specific microscopic nomic structures (John Wheeler’s “law without law”); laws are the preconditions of scientific experience (Kant); laws are theoretical abstractions which only apply in very limited circumstances (Nancy Cartwright). Whereas Cartwright’s approach is in tension with modern scientific methodology, the first three strategies count as illuminating, though partial answers. It is important for the empirical method of modern physics that these three strategies, even when taken together, do not provide a complete explanation of the order of nature. Thus the question of why laws are valid is still relevant. In the concluding section, I argue that the traditional answer, based on voluntary creation, provides the right balance of contingency and coherence which is in harmony with modern scientific method.
Educational process in modern climatology within the web-GIS platform "Climate"
NASA Astrophysics Data System (ADS)
Gordova, Yulia; Gorbatenko, Valentina; Gordov, Evgeny; Martynova, Yulia; Okladnikov, Igor; Titov, Alexander; Shulgina, Tamara
2013-04-01
The problem, common to all scientific fields, of training scientists in the environmental sciences is nowadays exacerbated by the need to develop new computational and information technology skills in distributed multi-disciplinary teams. To address this and other pressing problems of the Earth system sciences, a software infrastructure for information support of integrated research in the geosciences was created on the basis of modern information and computational technologies, and the software and hardware platform "Climate" (http://climate.scert.ru/) was developed. In addition to the direct analysis of geophysical data archives, the platform is aimed at teaching the basics of the study of changes in regional climate. The educational component of the platform includes a series of lectures on climate, environmental and meteorological modeling and laboratory work cycles on the basics of analysis of current and potential future regional climate change, using the territory of Siberia as an example. The educational process within the platform is implemented using the distance learning system Moodle (www.moodle.org). This work is partially supported by the Ministry of Education and Science of the Russian Federation (contract #8345), SB RAS project VIII.80.2.1, RFBR grant #11-05-01190a, and integrated project SB RAS #131.
Method and computer program product for maintenance and modernization backlogging
Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M
2013-02-19
According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.
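The additive relation described in this patent abstract can be written down directly; the sketch below only illustrates that sum, with made-up figures for one time period.

```python
# Illustrative sketch of the additive relation described above:
# future condition = maintenance cost + modernization factor + backlog factor.
def future_facility_condition(maintenance_cost, modernization_factor, backlog_factor):
    return maintenance_cost + modernization_factor + backlog_factor

# Example for one time period (all figures are invented):
print(future_facility_condition(120_000.0, 45_000.0, 30_000.0))  # 195000.0
```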
The origin of scientific neurology and its consequences for modern and future neuroscience.
Steinberg, David A
2014-01-01
John Hughlings Jackson (1835-1911) created a science of brain function that, in scope and profundity, is among the great scientific discoveries of the 19th century. It is interesting that the magnitude of his achievement is not completely recognized even among his ardent admirers. Although thousands of practitioners around the world use the clinical applications of his science every day, the principles from which bedside neurology is derived have broader consequences, for modern and future science, that remain unrecognized and unexploited. This paper summarizes the scientific formalism that created modern neurology, demonstrates how its direct implications affect a current area of neuroscientific research, and indicates how Hughlings Jackson's ideas form a path toward a novel solution to an important open problem of the brain and mind.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, Y.; Cameron, K.W.
1998-11-24
Workload characterization has proven to be an essential tool for architecture design and performance evaluation in both scientific and commercial computing. Traditional workload characterization techniques include FLOPS rates, cache miss ratios, and CPI (cycles per instruction; or its inverse IPC, instructions per cycle). Given the complexity of sophisticated modern superscalar microprocessors, these traditional techniques are not powerful enough to pinpoint the performance bottleneck of an application on a specific microprocessor, nor can they immediately demonstrate the potential performance benefit of an architectural or functional improvement in a new processor design. To address these problems, many researchers rely on simulators, which have substantial constraints, especially for large-scale scientific computing applications. This paper presents a new technique for characterizing applications at the instruction level using hardware performance counters. It has the advantage of collecting instruction-level characteristics in a few runs with virtually no overhead or slowdown. A variety of instruction counts can be used to calculate average abstract workload parameters corresponding to microprocessor pipelines or functional units. Based on the microprocessor's architectural constraints and these calculated abstract parameters, the architectural performance bottleneck for a specific application can be estimated. In particular, the analysis results provide some insight into why only a small percentage of processor peak performance is achieved even for many very cache-friendly codes. Meanwhile, the bottleneck estimation can suggest viable architectural or functional improvements for certain workloads. Eventually, these abstract parameters can lead to the creation of an analytical microprocessor pipeline model and memory hierarchy model.
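A minimal sketch of the counter-derived abstract parameters the abstract alludes to: given raw cycle and instruction totals, coarse quantities such as CPI, IPC and issue-bandwidth utilization can be computed. The counter names and the issue width below are illustrative assumptions, not the paper's model.

```python
# Sketch: deriving coarse workload parameters from raw hardware-counter
# totals (counter names and the machine issue width are illustrative).
def workload_summary(counts, issue_width=4):
    cycles = counts["cycles"]
    instructions = counts["instructions"]
    cpi = cycles / instructions
    ipc = instructions / cycles
    # Fraction of the theoretical issue bandwidth actually used.
    issue_utilization = ipc / issue_width
    return {
        "CPI": cpi,
        "IPC": ipc,
        "issue_utilization": issue_utilization,
        "fp_ops_per_instr": counts.get("fp_ops", 0) / instructions,
        "loads_per_instr": counts.get("loads", 0) / instructions,
    }

print(workload_summary({"cycles": 8.0e9, "instructions": 6.0e9,
                        "fp_ops": 1.5e9, "loads": 2.0e9}))
```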
Web-GIS platform for monitoring and forecasting of regional climate and ecological changes
NASA Astrophysics Data System (ADS)
Gordov, E. P.; Krupchatnikov, V. N.; Lykosov, V. N.; Okladnikov, I.; Titov, A. G.; Shulgina, T. M.
2012-12-01
The growing volume of environmental data from sensors and model outputs makes the development of a software infrastructure, based on modern information and telecommunication technologies, for the support of integrated scientific research in the Earth sciences an urgent and important task (Gordov et al., 2012; van der Wel, 2005). The inherent heterogeneity of datasets obtained from different sources and institutions not only hampers the interchange of data and analysis results but also complicates their intercomparison, reducing the reliability of the analysis. Modern geophysical data processing techniques, however, allow different technological solutions to be combined when organizing such information resources. It is now generally accepted that an information-computational infrastructure should build on the combined use of web and GIS technologies for creating applied information-computational web systems (Titov et al., 2009; Gordov et al., 2010; Gordov, Okladnikov and Titov, 2011). Using these approaches to develop internet-accessible thematic information-computational systems, and arranging the interchange of data and knowledge between them, is a very promising way to create a distributed information-computational environment supporting multidisciplinary regional and global research in the Earth sciences, including the analysis of climate changes and their impact on the spatial-temporal distribution and state of vegetation. An experimental software and hardware platform is presented that supports the operation of a web-oriented production and research center for regional climate change investigations; it combines a modern Web 2.0 approach, GIS functionality and capabilities for running climate and meteorological models, processing of large geophysical datasets, visualization, joint software development by distributed research groups, scientific analysis, and the education of undergraduate and postgraduate students. The platform software (Shulgina et al., 2012; Okladnikov et al., 2012) includes dedicated modules for the numerical processing of regional and global modeling results for subsequent analysis and visualization. Data preprocessing, runs and visualization of results of the WRF and "Planet Simulator" models integrated into the platform are also provided. All functions of the center are accessible to users through a web portal using a common graphical web browser, in the form of an interactive graphical user interface which provides, in particular, visualization of processing results, selection of the geographical region of interest (pan and zoom) and data-layer manipulation (order, enable/disable, feature extraction). The platform provides users with capabilities for the analysis of heterogeneous geophysical data, including high-resolution data, and for discovering tendencies in climatic and ecosystem changes in the framework of different multidisciplinary studies (Shulgina et al., 2011). Using it, even an unskilled user without specific knowledge can perform computational processing and visualization of large meteorological, climatological and satellite monitoring datasets through a unified graphical web interface.
NASA Astrophysics Data System (ADS)
Debnath, Lokenath
2010-09-01
This article is essentially devoted to a brief historical introduction to Euler's formula for polyhedra, topology, and the theory of graphs and networks, with many examples from the real world. The celebrated Königsberg seven-bridge problem and some of the basic properties of graphs and networks are included to give some understanding of the macroscopic behaviour of real physical systems. We also mention some important and modern applications of graph theory and network problems, from transportation to telecommunications. Graphs and networks are used effectively as powerful tools in industrial, electrical and civil engineering, and in communication networks in the planning of business and industry. Graph theory and combinatorics can be used to understand the changes that occur in many large and complex scientific, technical and medical systems. With the advent of fast large computers and the ubiquitous Internet, itself a very large network of computers, large-scale complex optimization problems can be modelled in terms of graphs or networks and then solved by algorithms available in graph theory. Many large and more complex combinatorial problems dealing with the possible arrangements of situations of various kinds, and with computing the number and properties of such arrangements, can be formulated in terms of networks. The Knight's tour problem, Hamilton's tour problem, the problem of magic squares, the Euler Graeco-Latin squares problem and their modern developments in the twentieth century are also included.
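Two of the classical results mentioned above can be checked in a few lines; the snippet below verifies Euler's polyhedron formula for a cube and confirms (using the networkx package, assumed to be available) that the Königsberg bridge multigraph admits no Eulerian circuit.

```python
# 1) Euler's polyhedron formula V - E + F = 2, here for a cube.
V, E, F = 8, 12, 6
assert V - E + F == 2

# 2) The Königsberg bridge layout has no Eulerian circuit, because every
#    land mass touches an odd number of bridges.
import networkx as nx  # assumes the networkx package is installed

bridges = [("A", "B"), ("A", "B"), ("A", "C"), ("A", "C"),
           ("A", "D"), ("B", "D"), ("C", "D")]   # 7 bridges, 4 land masses
G = nx.MultiGraph(bridges)
print(nx.is_eulerian(G))  # False
```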
SiMon: Simulation Monitor for Computational Astrophysics
NASA Astrophysics Data System (ADS)
Xuran Qian, Penny; Cai, Maxwell Xu; Portegies Zwart, Simon; Zhu, Ming
2017-09-01
Scientific discovery via numerical simulations is important in modern astrophysics. This relatively new branch of astrophysics has become possible due to the development of reliable numerical algorithms and the high performance of modern computing technologies. These enable the analysis of large collections of observational data and the acquisition of new data via simulations at unprecedented accuracy and resolution. Ideally, simulations run until they reach some pre-determined termination condition, but often other factors cause extensive numerical approaches to break down at an earlier stage: processes tend to be interrupted by unexpected events in the software or the hardware. In those cases, the scientist handles the interruption manually, which is time-consuming and prone to errors. We present the Simulation Monitor (SiMon) to automate the farming of large and extensive simulation processes. Our method is lightweight, fully automates the entire workflow management, operates concurrently across multiple platforms and can be installed in user space. Inspired by the process of crop farming, we perceive each simulation as a crop in the field, and running a simulation becomes analogous to growing crops. With the development of SiMon we ease the technical aspects of simulation management. The initial package was developed for extensive parameter searches in numerical simulations, but it turns out to work equally well for automating the computational processing and reduction of observational data.
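The monitoring pattern the abstract describes (poll running simulations, restart interrupted ones) can be sketched roughly as below; the completion-marker file and polling loop are illustrative assumptions, not SiMon's actual implementation.

```python
# Minimal sketch of the monitoring idea: poll a set of simulation processes
# and restart any that stopped before producing a completion marker file.
import os
import subprocess
import time

def farm(commands, workdirs, poll_interval=60):
    procs = {}
    while True:
        for cmd, wd in zip(commands, workdirs):
            if os.path.exists(os.path.join(wd, "DONE")):   # assumed marker
                continue
            p = procs.get(wd)
            if p is None or p.poll() is not None:          # not running / crashed
                procs[wd] = subprocess.Popen(cmd, cwd=wd, shell=True)
        if all(os.path.exists(os.path.join(wd, "DONE")) for wd in workdirs):
            return
        time.sleep(poll_interval)
```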
Distributed storage and cloud computing: a test case
NASA Astrophysics Data System (ADS)
Piano, S.; Delia Ricca, G.
2014-06-01
Since 2003 the computing farm hosted by the INFN Tier3 facility in Trieste has supported the activities of many scientific communities. Hundreds of jobs from 45 different VOs, including those of the LHC experiments, are processed simultaneously. Given that the requirements of the different computational communities are normally not synchronized, the probability that at any given time the resources owned by one of the participants are not fully utilized is quite high. A balanced compensation should in principle allocate the free resources to other users, but there are limits to this mechanism. In fact, the Trieste site may not hold the amount of data needed to attract enough analysis jobs, and even in that case there could be a lack of bandwidth for accessing them. The Trieste ALICE and CMS computing groups, in collaboration with other Italian groups, aim to overcome the limitations of existing solutions using two approaches: sharing the data among all the participants, taking full advantage of the GARR-X wide area network (10 Gb/s), and integrating the resources dedicated to batch analysis with the ones reserved for dynamic interactive analysis through modern solutions such as cloud computing.
Automatic Blocking Of QR and LU Factorizations for Locality
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yi, Q; Kennedy, K; You, H
2004-03-26
QR and LU factorizations for dense matrices are important linear algebra computations that are widely used in scientific applications. To perform these computations efficiently on modern computers, the factorization algorithms need to be blocked when operating on large matrices, to effectively exploit the deep cache hierarchies prevalent in today's computer memory systems. Because both the QR (based on Householder transformations) and LU factorization algorithms contain complex loop structures, few compilers can fully automate the blocking of these algorithms. Although linear algebra libraries such as LAPACK provide manually blocked implementations of these algorithms, automatically generating blocked versions of the computations can bring additional benefits, such as automatic adaptation of different blocking strategies. This paper demonstrates how to apply an aggressive loop transformation technique, dependence hoisting, to produce efficient blockings for both QR and LU with partial pivoting. We present different blocking strategies that can be generated by our optimizer and compare the performance of auto-blocked versions with manually tuned versions in LAPACK, using reference BLAS, ATLAS BLAS and native BLAS specially tuned for the underlying machine architectures.
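Blocking for cache locality, the optimization targeted here, is easiest to see on a dense matrix multiply; the sketch below tiles the update so that cache-sized blocks are reused, and is a generic illustration rather than the paper's dependence-hoisting transformation or a blocked QR/LU with pivoting.

```python
# Blocked (tiled) dense matrix multiply: each cache-sized tile of A, B and C
# is reused while resident in cache instead of streaming whole rows/columns.
import numpy as np

def blocked_matmul(A, B, bs=64):
    n, k = A.shape
    k2, m = B.shape
    assert k == k2
    C = np.zeros((n, m))
    for i in range(0, n, bs):
        for j in range(0, m, bs):
            for p in range(0, k, bs):
                C[i:i+bs, j:j+bs] += A[i:i+bs, p:p+bs] @ B[p:p+bs, j:j+bs]
    return C

A, B = np.random.rand(256, 256), np.random.rand(256, 256)
assert np.allclose(blocked_matmul(A, B), A @ B)
```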
Nam, Yun Sik; Won, Sung-Ok; Lee, Kang-Bong
2014-07-01
A guidebook detailing the process of forensic investigation was written in 1440 C.E. It outlines the fundamentals and details of each element of criminal investigation during the era of the Chosun dynasty in Korea. Because this old guidebook was written on the basis of personal experience rather than on a scientific basis, it includes many fallacies from the perspective of modern forensic science. However, the book describes methods for which a scientific basis can be established by experiment. We demonstrate the modern scientific basis for the ancient methods used to detect trace amounts of blood and lethal arsenic poisoning in a postmortem examination, as described in this old forensic guidebook. Traces of blood and arsenic poisoning were detected according to the respective color changes: brownish red, due to the reaction of ferric ions in blood with the acetate ions of vinegar, and dark blue, due to the reaction of silver with arsenic sulfide. © 2014 American Academy of Forensic Sciences.
sbv IMPROVER: Modern Approach to Systems Biology.
Guryanova, Svetlana; Guryanova, Anna
2017-01-01
The increasing amount and variety of data in the biosciences call for innovative methods of visualization, scientific verification, and pathway analysis. Novel approaches to biological networks and research quality control are important because of their role in the development of new products, the improvement and acceleration of existing health policies, and the search for novel ways of solving scientific challenges. One such approach is sbv IMPROVER, a platform that uses crowdsourcing and verification to create biological networks with easy public access. It contains 120 networks built in the Biological Expression Language (BEL) to interpret data from PubMed articles, with high-quality verification, available for free in the CBN database. Computable, human-readable biological networks with a structured syntax are a powerful way of representing biological information generated from high-density data. This article presents sbv IMPROVER, a crowd-verification approach for the visualization and expansion of biological networks.
[Organization of clinical research: in general and visceral surgery].
Schneider, M; Werner, J; Weitz, J; Büchler, M W
2010-04-01
The structural organization of research facilities within a surgical university center should aim at strengthening the department's research output and likewise provide opportunities for the scientific education of academic surgeons. We suggest a model in which several independent research groups within a surgical department engage in research projects covering various aspects of surgically relevant basic, translational or clinical research. In order to enhance the translational aspects of surgical research, a permanent link needs to be established between the department's scientific research projects and its chief interests in clinical patient care. Importantly, a focus needs to be placed on obtaining evidence-based data to judge the efficacy of novel diagnostic and treatment concepts. Integration of modern technologies from the fields of physics, computer science and molecular medicine into surgical research necessitates cooperation with external research facilities, which can be strengthened by coordinated support programs offered by research funding institutions.
Virtual Observatories, Data Mining, and Astroinformatics
NASA Astrophysics Data System (ADS)
Borne, Kirk
The historical, current, and future trends in knowledge discovery from data in astronomy are presented here. The story begins with a brief history of data gathering and data organization. A description of the development of new information science technologies for astronomical discovery is then presented. Among these are e-Science and the virtual observatory, with its data discovery, access, display, and integration protocols; astroinformatics and data mining for exploratory data analysis, information extraction, and knowledge discovery from distributed data collections; new sky surveys' databases, including rich multivariate observational parameter sets for large numbers of objects; and the emerging discipline of data-oriented astronomical research, called astroinformatics. Astroinformatics is described as the fourth paradigm of astronomical research, following the three traditional research methodologies: observation, theory, and computation/modeling. Astroinformatics research areas include machine learning, data mining, visualization, statistics, semantic science, and scientific data management. Each of these areas is now an active research discipline, with significant science-enabling applications in astronomy. Research challenges and sample research scenarios are presented in these areas, in addition to sample algorithms for data-oriented research. These information science technologies enable scientific knowledge discovery from the increasingly large and complex data collections in astronomy. The education and training of the modern astronomy student must consequently include skill development in these areas, whose practitioners have traditionally been limited to applied mathematicians, computer scientists, and statisticians. Modern astronomical researchers must cross these traditional discipline boundaries, thereby borrowing the best-of-breed methodologies from multiple disciplines. In the era of large sky surveys and numerous large telescopes, the potential for astronomical discovery is equally large, and so the data-oriented research methods, algorithms, and techniques that are presented here will enable the greatest discovery potential from the ever-growing data and information resources in astronomy.
Karp, Peter D; Berger, Bonnie; Kovats, Diane; Lengauer, Thomas; Linial, Michal; Sabeti, Pardis; Hide, Winston; Rost, Burkhard
2015-02-15
Speed is of the essence in combating Ebola; thus, computational approaches should form a significant component of Ebola research. As for the development of any modern drug, computational biology is uniquely positioned to contribute through comparative analysis of the genome sequences of Ebola strains and three-dimensional protein modeling. Other computational approaches to Ebola may include large-scale docking studies of Ebola proteins with human proteins and with small-molecule libraries, computational modeling of the spread of the virus, computational mining of the Ebola literature and creation of a curated Ebola database. Taken together, such computational efforts could significantly accelerate traditional scientific approaches. In recognition of the need for important and immediate solutions from the field of computational biology against Ebola, the International Society for Computational Biology (ISCB) announces a prize for an important computational advance in fighting the Ebola virus. ISCB will confer the ISCB Fight against Ebola Award, along with a prize of US$2000, at its July 2016 annual meeting (ISCB Intelligent Systems for Molecular Biology 2016, Orlando, FL). dkovats@iscb.org or rost@in.tum.de. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
The International Symposium on Grids and Clouds
NASA Astrophysics Data System (ADS)
The International Symposium on Grids and Clouds (ISGC) 2012 will be held at Academia Sinica in Taipei from 26 February to 2 March 2012, with co-located events and workshops. The conference is hosted by the Academia Sinica Grid Computing Centre (ASGC). 2012 is the decennium anniversary of the ISGC which over the last decade has tracked the convergence, collaboration and innovation of individual researchers across the Asia Pacific region to a coherent community. With the continuous support and dedication from the delegates, ISGC has provided the primary international distributed computing platform where distinguished researchers and collaboration partners from around the world share their knowledge and experiences. The last decade has seen the wide-scale emergence of e-Infrastructure as a critical asset for the modern e-Scientist. The emergence of large-scale research infrastructures and instruments that has produced a torrent of electronic data is forcing a generational change in the scientific process and the mechanisms used to analyse the resulting data deluge. No longer can the processing of these vast amounts of data and production of relevant scientific results be undertaken by a single scientist. Virtual Research Communities that span organisations around the world, through an integrated digital infrastructure that connects the trust and administrative domains of multiple resource providers, have become critical in supporting these analyses. Topics covered in ISGC 2012 include: High Energy Physics, Biomedicine & Life Sciences, Earth Science, Environmental Changes and Natural Disaster Mitigation, Humanities & Social Sciences, Operations & Management, Middleware & Interoperability, Security and Networking, Infrastructure Clouds & Virtualisation, Business Models & Sustainability, Data Management, Distributed Volunteer & Desktop Grid Computing, High Throughput Computing, and High Performance, Manycore & GPU Computing.
Evolution and convergence of the patterns of international scientific collaboration.
Coccia, Mario; Wang, Lili
2016-02-23
International research collaboration plays an important role in the social construction and evolution of science. Studies of science increasingly analyze international collaboration across multiple organizations for its impetus in improving research quality, advancing the efficiency of scientific production, and fostering breakthroughs in a shorter time. However, the long-run patterns of international research collaboration across scientific fields, and their structural changes over time, are hardly known. Here we show the convergence of international scientific collaboration across research fields over time. Our study uses a dataset from the National Science Foundation and computes the fraction of papers that have international institutional coauthorships for various fields of science. We compare our results with pioneering studies carried out in the 1970s and 1990s by applying a standardization method that transforms all fractions of internationally coauthored papers into a comparable framework. We find, over 1973-2012, that the evolution of collaboration patterns across scientific disciplines seems to generate a convergence between the applied and basic sciences. We also show that the general architecture of international scientific collaboration, based on the ranking of fractions of international coauthorships for different scientific fields per year, has tended to remain unchanged over time, at least until now. Overall, this study shows, to our knowledge for the first time, the evolution of the patterns of international scientific collaboration starting from the initial results described in the literature of the 1970s and 1990s. We find a convergence of these long-run collaboration patterns between the applied and basic sciences. This convergence might be one of the contributing factors supporting the evolution of modern scientific fields.
The JINR Tier1 Site Simulation for Research and Development Purposes
NASA Astrophysics Data System (ADS)
Korenkov, V.; Nechaevskiy, A.; Ososkov, G.; Pryahina, D.; Trofimov, V.; Uzhinskiy, A.; Voytishin, N.
2016-02-01
Distributed complex computing systems for data storage and processing are in common use in the majority of modern scientific centers. The design of such systems is usually based on recommendations obtained from a preliminary simulation model built and executed only once. However, big experiments last for years and decades, and their computing systems keep developing, not only quantitatively but also qualitatively. Even with the substantial efforts invested in the design phase to understand the system's configuration, it would be hard to develop a system without additional research into its future evolution. The developers and operators face the problem of predicting the system's behaviour after planned modifications. A system for grid and cloud service simulation is being developed at LIT (JINR, Dubna). This simulation system is focused on improving the efficiency of grid/cloud structure development by using the work-quality indicators of a real system. The development of such software is very important for building new grid/cloud infrastructures for big scientific experiments such as the JINR Tier1 site for WLCG. The simulation of some processes of the Tier1 site is considered as an example of our approach.
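The kind of study described here, predicting the behaviour of a shared computing site under modified configurations, typically rests on discrete-event simulation; the toy sketch below uses the SimPy package (an assumed stand-in, not the LIT simulation system) to model jobs queueing for a fixed pool of worker slots.

```python
# Toy discrete-event model of a compute site: jobs arrive, queue for a fixed
# pool of worker slots, and occupy a slot for their wall time.
import random
import simpy

def job(env, name, slots, walltime):
    with slots.request() as req:
        yield req                      # wait for a free worker slot
        yield env.timeout(walltime)    # occupy the slot for the job's duration
        print(f"{name} finished at t={env.now:.1f}")

env = simpy.Environment()
slots = simpy.Resource(env, capacity=8)        # e.g. 8 worker slots
for i in range(50):
    env.process(job(env, f"job{i}", slots, random.expovariate(1 / 30.0)))
env.run()
```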
Enhancing reproducibility in scientific computing: Metrics and registry for Singularity containers.
Sochat, Vanessa V; Prybol, Cameron J; Kurtzer, Gregory M
2017-01-01
Here we present Singularity Hub, a framework to build and deploy Singularity containers for mobility of compute, and the singularity-python software with novel metrics for assessing the reproducibility of such containers. Singularity containers make it possible for scientists and developers to package reproducible software, and Singularity Hub adds automation to this workflow by building, capturing metadata for, visualizing, and serving containers programmatically. Our novel metrics, based on custom filters of content hashes of container contents, allow for comparison of an entire container, including operating system, custom software, and metadata. First we review Singularity Hub's primary use cases and how the infrastructure has been designed to support modern, common workflows. Next, we conduct three analyses to demonstrate build consistency; reproducibility metric performance and interpretability; and potential for discovery. This is the first effort to demonstrate a rigorous assessment of measurable similarity between containers and operating systems. We provide these capabilities within Singularity Hub, as well as in the source software singularity-python that provides the underlying functionality. Singularity Hub is available at https://singularity-hub.org, and we are excited to provide it as an openly available platform for building and deploying scientific containers.
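The hash-based comparison idea can be sketched as follows: hash every file under a container's root and score two containers by the Jaccard similarity of their hash sets. This is a simplified illustration of the concept, not the singularity-python metrics themselves.

```python
# Compare two unpacked container trees by the overlap of their file hashes.
import hashlib
import os

def content_hashes(root):
    hashes = set()
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as fh:
                    hashes.add(hashlib.sha256(fh.read()).hexdigest())
            except OSError:
                continue   # skip unreadable files (sockets, broken links, ...)
    return hashes

def similarity(root_a, root_b):
    a, b = content_hashes(root_a), content_hashes(root_b)
    return len(a & b) / len(a | b) if a | b else 1.0
```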
Cellular intelligence: Microphenomenology and the realities of being.
Ford, Brian J
2017-12-01
Traditions of Eastern thought conceptualised life in a holistic sense, emphasising the processes of maintaining health and conquering sickness as manifestations of an essentially spiritual principle that was of overriding importance in the conduct of living. Western science, which drove the overriding and partial eclipse of Eastern traditions, became founded on a reductionist quest for ultimate realities which, in the modern scientific world, has embraced the notion that every living process can be successfully modelled by a digital computer system. It is argued here that the essential processes of cognition, response and decision-making inherent in living cells transcend conventional modelling, and microscopic studies of organisms like the shell-building amoebae and the rhodophyte alga Antithamnion reveal a level of cellular intelligence that is unrecognized by science and is not amenable to computer analysis. Copyright © 2017. Published by Elsevier Ltd.
Explicit integration with GPU acceleration for large kinetic networks
Brock, Benjamin; Belt, Andrew; Billings, Jay Jay; ...
2015-09-15
In this study, we demonstrate the first implementation of recently developed fast explicit kinetic integration algorithms on modern graphics processing unit (GPU) accelerators. Taking as a generic test case a Type Ia supernova explosion with an extremely stiff thermonuclear network having 150 isotopic species and 1604 reactions coupled to hydrodynamics using operator splitting, we demonstrate the capability to solve on the order of 100 realistic kinetic networks in parallel in the same time that standard implicit methods can solve a single such network on a CPU. In addition, this orders-of-magnitude decrease in computation time for solving systems of realistic kinetic networks implies that important coupled, multiphysics problems in various scientific and technical fields that were intractable, or could be simulated only with highly schematic kinetic networks, are now computationally feasible.
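The structure of an explicit kinetic update is easy to show on a toy network; the sketch below integrates a three-species chain A -> B -> C with forward Euler. The paper's algebraically stabilized explicit methods and GPU batching are, of course, far more elaborate.

```python
# Explicit (forward-Euler) integration of a toy reaction chain A -> B -> C.
import numpy as np

def rhs(y, k1=1.0, k2=0.5):
    a, b, c = y
    return np.array([-k1 * a, k1 * a - k2 * b, k2 * b])

def integrate(y0, dt=1e-3, t_end=10.0):
    y, t = np.array(y0, dtype=float), 0.0
    while t < t_end:
        y = y + dt * rhs(y)   # one explicit step per iteration
        t += dt
    return y

print(integrate([1.0, 0.0, 0.0]))  # total abundance stays ~1 (mass conservation)
```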
LARCRIM user's guide, version 1.0
NASA Technical Reports Server (NTRS)
Davis, John S.; Heaphy, William J.
1993-01-01
LARCRIM is a relational database management system (RDBMS) which performs the conventional duties of an RDBMS with the added feature that it can store attributes which consist of arrays or matrices. This makes it particularly valuable for scientific data management. It is accessible as a stand-alone system and through an application program interface. The stand-alone system may be executed in two modes: menu or command. The menu mode prompts the user for the input required to create, update, and/or query the database. The command mode requires the direct input of LARCRIM commands. Although LARCRIM is an update of an old database family, its performance on modern computers is quite satisfactory. LARCRIM is written in FORTRAN 77 and runs under the UNIX operating system. Versions have been released for the following computers: SUN (3 & 4), Convex, IRIS, Hewlett-Packard, CRAY 2 & Y-MP.
Kligfield, Paul; Gettes, Leonard S; Bailey, James J; Childers, Rory; Deal, Barbara J; Hancock, E William; van Herpen, Gerard; Kors, Jan A; Macfarlane, Peter; Mirvis, David M; Pahlm, Olle; Rautaharju, Pentti; Wagner, Galen S
2007-03-01
This statement examines the relation of the resting ECG to its technology. Its purpose is to foster understanding of how the modern ECG is derived and displayed and to establish standards that will improve the accuracy and usefulness of the ECG in practice. Derivation of representative waveforms and measurements based on global intervals are described. Special emphasis is placed on digital signal acquisition and computer-based signal processing, which provide automated measurements that lead to computer-generated diagnostic statements. Lead placement, recording methods, and waveform presentation are reviewed. Throughout the statement, recommendations for ECG standards are placed in context of the clinical implications of evolving ECG technology.
Manycore Performance-Portability: Kokkos Multidimensional Array Library
Edwards, H. Carter; Sunderland, Daniel; Porter, Vicki; ...
2012-01-01
Large, complex scientific and engineering application codes have a significant investment in computational kernels that implement their mathematical models. Porting these computational kernels to the collection of modern manycore accelerator devices is a major challenge, in that these devices have diverse programming models, application programming interfaces (APIs), and performance requirements. The Kokkos Array programming model provides a library-based approach to implementing computational kernels that are performance-portable to CPU-multicore and GPGPU accelerator devices. This programming model is based upon three fundamental concepts: (1) manycore compute devices, each with its own memory space, (2) data-parallel kernels and (3) multidimensional arrays. Kernel execution performance is, especially for NVIDIA® devices, extremely dependent on data access patterns. The optimal data access pattern can be different for different manycore devices, potentially leading to different implementations of computational kernels specialized for different devices. The Kokkos Array programming model supports performance-portable kernels by (1) separating data access patterns from computational kernels through a multidimensional array API and (2) introducing device-specific data access mappings when a kernel is compiled. An implementation of Kokkos Array is available through Trilinos [Trilinos website, http://trilinos.sandia.gov/, August 2011].
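A rough NumPy illustration (not Kokkos code) of why access patterns matter: the same column-sum kernel is timed against row-major and column-major layouts of the same data, and the preferred layout depends entirely on the traversal order, which is exactly the mapping Kokkos chooses per device rather than fixing it in the kernel.

```python
# Same kernel, two memory layouts: performance differs with traversal order.
import time
import numpy as np

def col_sums(a):
    return np.array([a[:, j].sum() for j in range(a.shape[1])])

a_rowmajor = np.random.rand(4000, 4000)        # C (row-major) layout
a_colmajor = np.asfortranarray(a_rowmajor)     # Fortran (column-major) layout

for label, arr in [("row-major", a_rowmajor), ("col-major", a_colmajor)]:
    t0 = time.perf_counter()
    col_sums(arr)
    print(label, time.perf_counter() - t0)     # column-major is typically faster here
```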
Paul, N
1998-06-01
Neither the question of whether all theoretical medical knowledge can at least be described as scientific, nor the question of how exactly the existing scientific and theoretical medical knowledge is accessed during clinical problem-solving, has so far been sufficiently answered. Scientific theories play an important role in controlling clinical practice and improving the quality of clinical care in modern medicine on the one hand, and in making it vindicable on the other. Therefore, the vagueness of the inexplicit interrelations between medicine's stock of knowledge and medical practice appears as a gap in the theoretical concept of modern medicine, which can be described as a "Hiatus theoreticus" in the anatomy of medicine. A central intention of the paper is to analyze the role of the philosophy of medicine in clarifying the theoretical basis of medical practice. Clinical relevance and normativity, in the sense of the modern theory of science, are suggested as criteria to differentiate between the philosophy of medicine as a primary medical discipline and the application of general philosophy in medicine.
Overview of computational structural methods for modern military aircraft
NASA Technical Reports Server (NTRS)
Kudva, J. N.
1992-01-01
Computational structural methods are essential for designing modern military aircraft. This briefing deals with computational structural methods (CSM) currently used. First a brief summary of modern day aircraft structural design procedures is presented. Following this, several ongoing CSM related projects at Northrop are discussed. Finally, shortcomings in this area, future requirements, and summary remarks are given.
Blazing Signature Filter: a library for fast pairwise similarity comparisons
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Joon-Yong; Fujimoto, Grant M.; Wilson, Ryan
Identifying similarities between datasets is a fundamental task in data mining and has become an integral part of modern scientific investigation. Whether the task is to identify co-expressed genes in large-scale expression surveys or to predict combinations of gene knockouts which would elicit a similar phenotype, the underlying computational task is often a multi-dimensional similarity test. As datasets continue to grow, improvements to the efficiency, sensitivity or specificity of such computation will have broad impacts, as they allow scientists to explore the wealth of scientific data more completely. A significant practical drawback of large-scale data mining is that the vast majority of pairwise comparisons are unlikely to be relevant, meaning that they do not share a signature of interest. It is therefore essential to identify these unproductive comparisons as rapidly as possible and exclude them from more time-intensive similarity calculations. The Blazing Signature Filter (BSF) is a highly efficient pairwise similarity algorithm which enables extensive data mining within a reasonable amount of time. The algorithm transforms datasets into binary metrics, allowing it to utilize computationally efficient bit operators and provide a coarse measure of similarity. As a result, the BSF can scale to high dimensionality and rapidly filter out unproductive pairwise comparisons. Two bioinformatics applications of the tool are presented to demonstrate the ability to scale to billions of pairwise comparisons and the usefulness of this approach.
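The bit-signature filtering idea can be sketched in a few lines: each dataset is reduced to a bitmask and a cheap popcount of the AND of two masks screens out unproductive pairs before any expensive similarity test. The thresholding scheme below is an illustrative assumption, not the BSF implementation.

```python
# Coarse pairwise filter based on bit signatures and popcounts.
def to_signature(values, threshold=0.0):
    # One bit per feature: set if the feature exceeds the threshold.
    sig = 0
    for i, v in enumerate(values):
        if v > threshold:
            sig |= 1 << i
    return sig

def shared_bits(sig_a, sig_b):
    return bin(sig_a & sig_b).count("1")   # popcount of the intersection

a = to_signature([0.2, -1.0, 3.1, 0.0, 5.2])
b = to_signature([1.2, 0.4, -0.1, 0.0, 2.0])
if shared_bits(a, b) >= 2:                 # cheap screen before the full test
    print("candidate pair - run the full similarity calculation")
```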
Upgrading of the LGD cluster at JINR to support DLNP experiments
NASA Astrophysics Data System (ADS)
Bednyakov, I. V.; Dolbilov, A. G.; Ivanov, Yu. P.
2017-01-01
Since its construction in 2005, the Computing Cluster of the Dzhelepov Laboratory of Nuclear Problems has been mainly used to perform calculations (data analysis, simulation, etc.) for various scientific collaborations in which DLNP scientists take an active part. The Cluster also serves to train specialists. Much has changed in the past decades, and the necessity has arisen to upgrade the cluster, increasing its power and replacing the outdated equipment to maintain its reliability and modernity. In this work we describe the experience of performing this upgrading, which can be helpful for system administrators to put new equipment for clusters of this type into operation quickly and efficiently.
Accelerator Based Tools of Stockpile Stewardship
NASA Astrophysics Data System (ADS)
Seestrom, Susan
2017-01-01
The Manhattan Project had to solve difficult challenges in physics and materials science. During the Cold War a large nuclear stockpile was developed. In both cases, the approach was largely empirical. Today that stockpile must be certified without nuclear testing, a task that becomes more difficult as the stockpile ages. I will discuss the role of modern accelerator-based experiments, such as x-ray radiography, proton radiography, and neutron and nuclear physics experiments, in stockpile stewardship. These new tools provide data of exceptional sensitivity and are answering questions about the stockpile, improving our scientific understanding, and providing validation for the computer simulations that are relied upon to certify today's stockpile.
NASA Astrophysics Data System (ADS)
Sushkevich, T. A.; Strelkov, S. A.; Maksakova, S. V.
2017-11-01
We discuss national achievements of world-class level in the theory of radiation transfer in the atmosphere-ocean system, and the modern scientific potential being developed in Russia, which provides an adequate methodological basis for theoretical and computational studies of radiation processes and radiation fields in natural environments using supercomputers and massively parallel processing for problems of remote sensing and the Earth's climate. A model of the radiation field in the "cloudy atmosphere-ocean" system is presented that allows the contributions of clouds, atmosphere and ocean to be separated.
For the greater credibility: Jesuit science and education in modern Portugal (1858-1910).
Malta Romeiras, Francisco
2018-03-01
Upon the restoration of the Society of Jesus in Portugal in 1858, the Jesuits founded two important colleges that made significant efforts in the promotion of hands-on experimental teaching of the natural sciences. At the Colégio de Campolide (Lisbon, 1858-1910) and the Colégio de São Fiel (Louriçal do Campo, 1863-1910) the Jesuits created modern chemistry and physics laboratories, organized significant botanical, zoological and geological collections, promoted scientific expeditions with their students to observe eclipses and to collect novel species of animals and plants, and engaged in original research work in physics, botany, and zoology. The successful implementation of modern scientific practices gained these colleges public recognition as the most prominent secondary institutions in nineteenth-century Portugal, and this made a major contribution to countering the widespread and commonly accepted anti-Jesuit accusations of obscurantism and scientific backwardness.
SIMD Optimization of Linear Expressions for Programmable Graphics Hardware
Bajaj, Chandrajit; Ihm, Insung; Min, Jungki; Oh, Jinsang
2009-01-01
The increased programmability of graphics hardware allows efficient graphics processing unit (GPU) implementations of a wide range of general computations on commodity PCs. An important factor in such implementations is how to fully exploit the SIMD computing capacities offered by modern graphics processors. Linear expressions of the form ȳ = Ax̄ + b̄, where A is a matrix and x̄, ȳ and b̄ are vectors, constitute one of the most basic operations in many scientific computations. In this paper, we propose a SIMD code optimization technique that enables efficient shader codes to be generated for evaluating linear expressions. It is shown that performance can be improved considerably by efficiently packing arithmetic operations into four-wide SIMD instructions through reordering of the operations in linear expressions. We demonstrate that the presented technique can be used effectively for programming both vertex and pixel shaders for a variety of mathematical applications, including integrating differential equations and solving a sparse linear system of equations using iterative methods. PMID:19946569
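The packing idea can be mimicked outside a shader: the sketch below evaluates y = Ax + b four rows at a time, standing in for the float4 SIMD groups discussed in the paper; it is a NumPy illustration, not generated shader code.

```python
# Evaluate y = A x + b in groups of four rows, mimicking 4-wide SIMD packing.
import numpy as np

def linear_expr_packed(A, x, b, width=4):
    n = A.shape[0]
    y = np.empty(n)
    for i in range(0, n, width):
        y[i:i+width] = A[i:i+width] @ x + b[i:i+width]   # one "SIMD group"
    return y

A, x, b = np.random.rand(16, 16), np.random.rand(16), np.random.rand(16)
assert np.allclose(linear_expr_packed(A, x, b), A @ x + b)
```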
Artificial Intelligence in Medical Practice: The Question to the Answer?
Miller, D Douglas; Brown, Eric W
2018-02-01
Computer science advances and ultra-fast computing speeds find artificial intelligence (AI) broadly benefitting modern society: forecasting weather, recognizing faces, detecting fraud, and deciphering genomics. AI's future role in medical practice remains an unanswered question. Machines (computers) learn to detect patterns not decipherable using biostatistics by processing massive datasets (big data) through layered mathematical models (algorithms). Correcting algorithm mistakes (training) adds to AI predictive model confidence. AI is being successfully applied to image analysis in radiology, pathology, and dermatology, with diagnostic speed exceeding, and accuracy paralleling, that of medical experts. While diagnostic confidence never reaches 100%, combining machines plus physicians reliably enhances system performance. Cognitive programs are impacting medical practice by applying natural language processing to read the rapidly expanding scientific literature and to collate years of diverse electronic medical records. In this and other ways, AI may optimize the care trajectory of chronic disease patients, suggest precision therapies for complex illnesses, reduce medical errors, and improve subject enrollment into clinical trials. Copyright © 2018 Elsevier Inc. All rights reserved.
A long history of breakdowns: A historiographical review.
Margócsy, Dániel
2017-06-01
The introduction to this special issue argues that network breakdowns play an important and unacknowledged role in the shaping and emergence of scientific knowledge. It focuses on transnational scientific networks from the early modern Republic of Letters to 21st-century globalized science. It attempts to unite the disparate historiography of the early modern Republic of Letters, the literature on 20th-century globalization, and the scholarship on Actor-Network Theory. We can perceive two, seemingly contradictory, changes to scientific networks over the past four hundred years. At the level of individuals, networks have become increasing fragile, as developments in communication and transportation technologies, and the emergence of regimes of standardization and instrumentation, have made it easier both to create new constellations of people and materials, and to replace and rearrange them. But at the level of institutions, collaborations have become much more extensive and long-lived, with single projects routinely outlasting even the arc of a full scientific career. In the modern world, the strength of institutions and macro-networks often relies on ideological regimes of standardization and instrumentation that can flexibly replace elements and individuals at will.
Two Cultures in Modern Science and Technology: For Safety and Validity Does Medicine Have to Update?
Becker, Robert E
2016-01-11
Two different scientific cultures go unreconciled in modern medicine. Each culture accepts that scientific knowledge and technologies are vulnerable to, and easily invalidated by, the methods and conditions of acquisition, interpretation, and application. How these vulnerabilities are addressed separates the 2 cultures and potentially explains medicine's difficulties in eradicating errors. A traditional culture, dominant in medicine, leaves error control in the hands of individual and group investigators and practitioners. A competing modern scientific culture accepts errors as inevitable, pernicious, and pervasive sources of adverse events throughout medical research and patient care, too malignant for individuals or groups to control. Error risks to the validity of scientific knowledge and to safety in patient care require systemwide programming able to support a culture in medicine grounded in tested, continually updated, widely promulgated, and uniformly implemented standards of practice for research and patient care. Experiences from successes in other sciences and industries strongly support the need for leadership from the Institute of Medicine's recommended Center for Patient Safety within the Federal Executive branch of government.
WASP (Write a Scientific Paper) using Excel - 1: Data entry and validation.
Grech, Victor
2018-02-01
Data collection for the purposes of analysis, after the planning and execution of a research study, commences with data input and validation. The process of data entry and analysis may appear daunting to the uninitiated, but as pointed out in the 1970s in a series of papers by British Medical Journal Deputy Editor TDV Swinscow, modern hardware and software (he was then referring to the availability of hand calculators) permit the performance of statistical testing outside a computer laboratory. In this day and age, modern software, such as the ubiquitous and almost universally familiar Microsoft Excel™, greatly facilitates this process. This paper is the first of a collection that will emulate Swinscow's series, in his own words "addressed to readers who want to start at the beginning, not to those who are already skilled statisticians." These papers will have less focus on the actual arithmetic, and more emphasis on how to actually implement simple statistics, step by step, using Excel, thereby constituting the equivalent of Swinscow's papers in the personal computer age. Data entry can be facilitated by several underutilised features in Excel. This paper explains Excel's little-known form function, data validation implementation at the input stage, simple coding tips and data cleaning tools. Copyright © 2018 Elsevier B.V. All rights reserved.
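The paper's workflow lives inside Excel itself; as a programmatic analogue of the validation step, the sketch below attaches a drop-down data-validation rule to a column using the openpyxl package. The sheet, column and allowed values are illustrative.

```python
# Attach a drop-down validation rule to a data-entry column with openpyxl.
from openpyxl import Workbook
from openpyxl.worksheet.datavalidation import DataValidation

wb = Workbook()
ws = wb.active
ws["B1"] = "Smoker"                       # illustrative column header

dv = DataValidation(type="list", formula1='"Yes,No,Unknown"', allow_blank=True)
dv.errorTitle = "Invalid entry"
dv.error = "Please choose Yes, No or Unknown"
ws.add_data_validation(dv)
dv.add("B2:B1000")                        # restrict entries in column B to the list

wb.save("data_entry.xlsx")
```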
Clocks to Computers: A Machine-Based “Big Picture” of the History of Modern Science.
van Lunteren, Frans
2016-12-01
Over the last few decades there have been several calls for a “big picture” of the history of science. There is a general need for a concise overview of the rise of modern science, with a clear structure allowing for a rough division into periods. This essay proposes such a scheme, one that is both elementary and comprehensive. It focuses on four machines, which can be seen to have mediated between science and society during successive periods of time: the clock, the balance, the steam engine, and the computer. Following an extended developmental phase, each of these machines came to play a highly visible role in Western societies, both socially and economically. Each of these machines, moreover, was used as a powerful resource for the understanding of both inorganic and organic nature. More specifically, their metaphorical use helped to construe and refine some key concepts that would play a prominent role in such understanding. In each case the key concept would at some point be considered to represent the ultimate building block of reality. Finally, in a refined form, each of these machines would eventually make its entry in scientific research, thereby strengthening the ties between these machines and nature.
Micro-Biomechanics of the Kebara 2 Hyoid and Its Implications for Speech in Neanderthals
D’Anastasio, Ruggero; Wroe, Stephen; Tuniz, Claudio; Mancini, Lucia; Cesana, Deneb T.; Dreossi, Diego; Ravichandiran, Mayoorendra; Attard, Marie; Parr, William C. H.; Agur, Anne; Capasso, Luigi
2013-01-01
The description of a Neanderthal hyoid from Kebara Cave (Israel) in 1989 fuelled scientific debate on the evolution of speech and complex language. Gross anatomy of the Kebara 2 hyoid differs little from that of modern humans. However, whether Homo neanderthalensis could use speech or complex language remains controversial. Similarity in overall shape does not necessarily demonstrate that the Kebara 2 hyoid was used in the same way as that of Homo sapiens. The mechanical performance of whole bones is partly controlled by internal trabecular geometries, regulated by bone-remodelling in response to the forces applied. Here we show that the Neanderthal and modern human hyoids also present very similar internal architectures and micro-biomechanical behaviours. Our study incorporates detailed analysis of histology, meticulous reconstruction of musculature, and computational biomechanical analysis with models incorporating internal micro-geometry. Because internal architecture reflects the loadings to which a bone is routinely subjected, our findings are consistent with a capacity for speech in the Neanderthals. PMID:24367509
NASA Astrophysics Data System (ADS)
Craney, Chris; Mazzeo, April; Lord, Kaye
1996-07-01
During the past five years the nation's concern for science education has expanded from a discussion about the future supply of Ph.D. scientists and its impact on the nation's scientific competitiveness to the broader consideration of the science education available to all students. Efforts to improve science education have led many authors to suggest greater collaboration between high school science teachers and their college/university colleagues. This article reviews the experience and outcomes of the Teachers + Occidental = Partnership in Science (TOPS) van program operating in the Los Angeles Metropolitan area. The program emphasizes an extensive ongoing staff development, responsiveness to teachers' concerns, technical and on-site support, and sustained interaction between participants and program staff. Access to modern technology, including computer-driven instruments and commercial data analysis software, coupled with increased teacher content knowledge has led to empowerment of teachers and changes in student interest in science. Results of student and teacher questionnaires are reviewed.
Planetary Exploration in the Classroom
NASA Astrophysics Data System (ADS)
Slivan, S. M.; Binzel, R. P.
1997-07-01
We have developed educational materials to seed a series of undergraduate level exercises on "Planetary Exploration in the Classroom." The goals of the series are to teach modern methods of planetary exploration and discovery to students having both science and non-science backgrounds. Using personal computers in a "hands-on" approach with images recorded by planetary spacecraft, students working through the exercises learn that modern scientific images are digital objects that can be examined and manipulated in quantitative detail. The initial exercises we've developed utilize NIH Image in conjunction with images from the Voyager spacecraft CDs. Current exercises are titled "Using 'NIH IMAGE' to View Voyager Images", "Resolving Surface Features on Io", "Discovery of Volcanoes on Io", and "Topography of Canyons on Ariel." We expect these exercises will be released during Fall 1997 and will be available via 'anonymous ftp'; detailed information about obtaining the exercises will be on the Web at "http://web.mit.edu/12s23/www/pec.html." This curriculum development was sponsored by NSF Grant DUE-9455329.
76 FR 41234 - Advanced Scientific Computing Advisory Committee Charter Renewal
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-13
... Secretariat, General Services Administration, notice is hereby given that the Advanced Scientific Computing... advice and recommendations concerning the Advanced Scientific Computing program in response only to... Advanced Scientific Computing Research program and recommendations based thereon; --Advice on the computing...
PREFACE: IC-MSQUARE 2012: International Conference on Mathematical Modelling in Physical Sciences
NASA Astrophysics Data System (ADS)
Kosmas, Theocharis; Vagenas, Elias; Vlachos, Dimitrios
2013-02-01
The first International Conference on Mathematical Modelling in Physical Sciences (IC-MSQUARE) took place in Budapest, Hungary, from Monday 3 to Friday 7 September 2012. The conference was attended by more than 130 participants, and hosted about 290 oral, poster and virtual papers by more than 460 pre-registered authors. The first IC-MSQUARE consisted of different and diverging workshops and thus covered various research fields in which mathematical modelling is used, such as theoretical/mathematical physics, neutrino physics, non-integrable systems, dynamical systems, computational nanoscience, biological physics, computational biomechanics, complex networks, stochastic modelling, fractional statistics, DNA dynamics, and macroeconomics. The scientific program was rather heavy since after the Keynote and Invited Talks in the morning, two parallel sessions ran every day. However, according to all attendees, the program was excellent with a high level of talks and the scientific environment was fruitful; thus all attendees had a creative time. The mounting question is whether this occurred accidentally, or whether IC-MSQUARE is a necessity in the field of physical and mathematical modelling. For all of us working in the field, the existing and established conferences in this particular field suffer from two distinguished and recognized drawbacks: the first is the increasing orientation, while the second refers to the extreme specialization of the meetings. Therefore, a conference which aims to promote the knowledge and development of high-quality research in mathematical fields concerned with applications of other scientific fields as well as modern technological trends in physics, chemistry, biology, medicine, economics, sociology, environmental sciences etc., appears to be a necessity. This is the key role that IC-MSQUARE will play. We would like to thank the Keynote Speaker and the Invited Speakers for their significant contributions to IC-MSQUARE. We would also like to thank the members of the International Scientific Committee and the members of the Organizing Committee. Conference Chairmen Theocharis Kosmas Department of Physics, University of Ioannina Elias Vagenas RCAAM, Academy of Athens Dimitrios Vlachos Department of Computer Science and Technology, University of Peloponnese The PDF also contains a list of members of the International Scientific Committes and details of the Keynote and Invited Speakers.
NASA Astrophysics Data System (ADS)
Tilley, Richard J. D.
2003-05-01
Colour is an important and integral part of everyday life, and an understanding and knowledge of the scientific principles behind colour, with its many applications and uses, is becoming increasingly important to a wide range of academic disciplines, from physical, medical and biological sciences through to the arts. Colour and the Optical Properties of Materials carefully introduces the science behind the subject, along with many modern and cutting-edge applications, chosen to appeal to today's students. For science students, it provides a broad introduction to the subject and the many applications of colour. For more applied students, such as engineering and arts students, it provides the essential scientific background to colour and the many applications. Features: * Introduces the science behind the subject whilst closely connecting it to modern applications, such as colour displays, optical amplifiers and colour centre lasers * Richly illustrated with full-colour plates * Includes many worked examples, along with problems and exercises at the end of each chapter and selected answers at the back of the book * A Web site, including additional problems and full solutions to all the problems, which may be accessed at: www.cardiff.ac.uk/uwcc/engin/staff/rdjt/colour. Written for students taking an introductory course in colour in a wide range of disciplines such as physics, chemistry, engineering, materials science, computer science, design, photography, architecture and textiles.
Workflows for Full Waveform Inversions
NASA Astrophysics Data System (ADS)
Boehm, Christian; Krischer, Lion; Afanasiev, Michael; van Driel, Martin; May, Dave A.; Rietmann, Max; Fichtner, Andreas
2017-04-01
Despite many theoretical advances and the increasing availability of high-performance computing clusters, full seismic waveform inversions still face considerable challenges regarding data and workflow management. While the community has access to solvers which can harness modern heterogeneous computing architectures, the computational bottleneck has fallen to these often manpower-bounded issues that need to be overcome to facilitate further progress. Modern inversions involve huge amounts of data and require a tight integration between numerical PDE solvers, data acquisition and processing systems, nonlinear optimization libraries, and job orchestration frameworks. To this end we created a set of libraries and applications revolving around Salvus (http://salvus.io), a novel software package designed to solve large-scale full waveform inverse problems. This presentation focuses on solving passive source seismic full waveform inversions from local to global scales with Salvus. We discuss (i) design choices for the aforementioned components required for full waveform modeling and inversion, (ii) their implementation in the Salvus framework, and (iii) how it is all tied together by a usable workflow system. We combine state-of-the-art algorithms ranging from high-order finite-element solutions of the wave equation to quasi-Newton optimization algorithms using trust-region methods that can handle inexact derivatives. All is steered by an automated interactive graph-based workflow framework capable of orchestrating all necessary pieces. This naturally facilitates the creation of new Earth models and hopefully sparks new scientific insights. Additionally, and even more importantly, it enhances reproducibility and reliability of the final results.
NASA Astrophysics Data System (ADS)
Hoang, Trinh Xuan; Ky, Nguyen Anh; Lan, Nguyen Tri; Viet, Nguyen Ai
2015-06-01
This volume contains selected papers presented at the 2nd International Workshop on Theoretical and Computational Physics (IWTCP-2): Modern Methods and Latest Results in Particle Physics, Nuclear Physics and Astrophysics and the 39th National Conference on Theoretical Physics (NCTP-39). Both the workshop and the conference were held from 28th - 31st July 2014 in Dakruco Hotel, Buon Ma Thuot, Dak Lak, Vietnam. The NCTP-39 and the IWTCP-2 were organized under the support of the Vietnamese Theoretical Physics Society, with a motivation to foster scientific exchanges between the theoretical and computational physicists in Vietnam and worldwide, as well as to promote high-standard level of research and education activities for young physicists in the country. The IWTCP-2 was also an External Activity of the Asia Pacific Center for Theoretical Physics (APCTP). About 100 participants coming from nine countries participated in the workshop and the conference. At the IWTCP-2 workshop, we had 16 invited talks presented by international experts, together with eight oral and ten poster contributions. At the NCTP-39, three invited talks, 15 oral contributions and 39 posters were presented. We would like to thank all invited speakers, participants and sponsors for making the workshop and the conference successful. Trinh Xuan Hoang, Nguyen Anh Ky, Nguyen Tri Lan and Nguyen Ai Viet
Increasing the reliability of ecological models using modern software engineering techniques
Robert M. Scheller; Brian R. Sturtevant; Eric J. Gustafson; Brendan C. Ward; David J. Mladenoff
2009-01-01
Modern software development techniques are largely unknown to ecologists. Typically, ecological models and other software tools are developed for limited research purposes, and additional capabilities are added later, usually in an ad hoc manner. Modern software engineering techniques can substantially increase scientific rigor and confidence in ecological models and...
Attitudes of Trainers and Medical Students towards Using Modern Practices
ERIC Educational Resources Information Center
Hadzhiiliev, Vassil Stefanov; Dobreva, Zhaneta Stoykova
2011-01-01
The development of universities as independent scientific centers determines their mission to incorporate the most modern achievements of science into the students' practical training. This research on the attitudes of the participants in this process towards the use of modern practices encompasses both trainers and students, and it consists of…
Buschini, José
2013-12-01
Using documentary sources, this work analyzes the creation and initial functioning of the Instituto de Investigaciones Hematológicas (Institute of Hematological Research) of the National Academy of Medicine (Buenos Aires, Argentina) in the context of the scientific modernization initiated within the country during the mid-1950s. Particular attention is paid to the generation of material bases and institutional and cultural mechanisms for the development of scientific research and of clinical practices guided by procedures and techniques rooted in the basic sciences. The formation and development of a research school in the Experimental Leukemia Section of the institute is explored as a case illustrative of the effective consolidation of initiatives oriented towards the organization of a scientific center.
Achterberg, Peter; de Koster, Willem; van der Waal, Jeroen
2017-08-01
Following up on suggestions that attitudes toward science are multi-dimensional, we analyze nationally representative survey data collected in the United States in 2014 ( N = 2006), and demonstrate the existence of a science confidence gap: some people place great trust in scientific methods and principles, but simultaneously distrust scientific institutions. This science confidence gap is strongly associated with level of education: it is larger among the less educated than among the more educated. We investigate explanations for these educational differences. Whereas hypotheses deduced from reflexive-modernization theory do not pass the test, those derived from theorizing on the role of anomie are corroborated. The less educated are more anomic (they have more modernity-induced cultural discontents), which not only underlies their distrust in scientific institutions, but also fuels their trust in scientific methods and principles. This explains why this science confidence gap is most pronounced among the less educated.
Accelerating EPI distortion correction by utilizing a modern GPU-based parallel computation.
Yang, Yao-Hao; Huang, Teng-Yi; Wang, Fu-Nien; Chuang, Tzu-Chao; Chen, Nan-Kuei
2013-04-01
The combination of phase demodulation and field mapping is a practical method to correct echo planar imaging (EPI) geometric distortion. However, since phase dispersion accumulates in each phase-encoding step, the calculation complexity of phase demodulation is Ny-fold higher than that of conventional image reconstruction. Thus, correcting EPI images via phase demodulation is generally a time-consuming task. Parallel computing employing general-purpose calculations on graphics processing units (GPU) can accelerate scientific computing if the algorithm is parallelized. This study proposes a method that incorporates the GPU-based technique into phase demodulation calculations to reduce computation time. The proposed parallel algorithm was applied to a PROPELLER-EPI diffusion tensor data set. The GPU-based phase demodulation method correctly reduced the EPI distortion and accelerated the computation. The total reconstruction time of the 16-slice PROPELLER-EPI diffusion tensor images with a matrix size of 128 × 128 was reduced from 1,754 seconds to 101 seconds by utilizing the parallelized 4-GPU program. GPU computing is a promising method to accelerate EPI geometric correction. The resulting reduction in the computation time of phase demodulation should accelerate postprocessing for studies performed with EPI, and should help establish the PROPELLER-EPI technique in clinical practice. Copyright © 2011 by the American Society of Neuroimaging.
Bassett, Danielle S; Sporns, Olaf
2017-01-01
Despite substantial recent progress, our understanding of the principles and mechanisms underlying complex brain function and cognition remains incomplete. Network neuroscience proposes to tackle these enduring challenges. Approaching brain structure and function from an explicitly integrative perspective, network neuroscience pursues new ways to map, record, analyze and model the elements and interactions of neurobiological systems. Two parallel trends drive the approach: the availability of new empirical tools to create comprehensive maps and record dynamic patterns among molecules, neurons, brain areas and social systems; and the theoretical framework and computational tools of modern network science. The convergence of empirical and computational advances opens new frontiers of scientific inquiry, including network dynamics, manipulation and control of brain networks, and integration of network processes across spatiotemporal domains. We review emerging trends in network neuroscience and attempt to chart a path toward a better understanding of the brain as a multiscale networked system. PMID:28230844
Kligfield, Paul; Gettes, Leonard S; Bailey, James J; Childers, Rory; Deal, Barbara J; Hancock, E William; van Herpen, Gerard; Kors, Jan A; Macfarlane, Peter; Mirvis, David M; Pahlm, Olle; Rautaharju, Pentti; Wagner, Galen S; Josephson, Mark; Mason, Jay W; Okin, Peter; Surawicz, Borys; Wellens, Hein
2007-03-13
This statement examines the relation of the resting ECG to its technology. Its purpose is to foster understanding of how the modern ECG is derived and displayed and to establish standards that will improve the accuracy and usefulness of the ECG in practice. Derivation of representative waveforms and measurements based on global intervals are described. Special emphasis is placed on digital signal acquisition and computer-based signal processing, which provide automated measurements that lead to computer-generated diagnostic statements. Lead placement, recording methods, and waveform presentation are reviewed. Throughout the statement, recommendations for ECG standards are placed in context of the clinical implications of evolving ECG technology.
The markup is the model: reasoning about systems biology models in the Semantic Web era.
Kell, Douglas B; Mendes, Pedro
2008-06-07
Metabolic control analysis, co-invented by Reinhart Heinrich, is a formalism for the analysis of biochemical networks, and is a highly important intellectual forerunner of modern systems biology. Exchanging ideas and exchanging models are part of the international activities of science and scientists, and the Systems Biology Markup Language (SBML) allows one to perform the latter with great facility. Encoding such models in SBML allows their distributed analysis using loosely coupled workflows, and with the advent of the Internet the various software modules that one might use to analyze biochemical models can reside on entirely different computers and even on different continents. Optimization is at the core of many scientific and biotechnological activities, and Reinhart made many major contributions in this area, stimulating our own activities in the use of the methods of evolutionary computing for optimization.
The service telemetry and control device for space experiment “GRIS”
NASA Astrophysics Data System (ADS)
Glyanenko, A. S.
2016-02-01
The control of scientific instruments (for example, fine control of measuring paths), the collection of auxiliary service information (on operability, the conditions under which the experiment is carried out, etc.) and preliminary data processing are relevant problems for any space instrument. It is impossible to imagine modern space-research instruments without digital data processing methods and specialized or standard interfaces and computing facilities. To realize these functions in the “GRIS” experiment onboard the ISS, and to minimize dimensions and power consumption, a “system-on-chip” concept was chosen and implemented. The computing kernel and all necessary peripherals are created in a programmable logic device from the Microsemi ProASIC3 family with a capacity of up to 3M system gates. In this paper we discuss the structure, capabilities and resources of the service telemetry and control device for the “GRIS” space experiment.
Using NASA Space Imaging Technology to Teach Earth and Sun Topics
NASA Astrophysics Data System (ADS)
Verner, E.; Bruhweiler, F. C.; Long, T.
2011-12-01
We teach an experimental college-level course, directed toward elementary education majors, emphasizing "hands-on" activities that can be easily applied to the elementary classroom. This course, Physics 240: "The Sun-Earth Connection", includes various ways to study selected topics in physics, earth science, and basic astronomy. Our lesson plans and EPO materials make extensive use of NASA imagery and cover topics on magnetism; the solar photospheric, chromospheric, and coronal spectra; and earth science and climate. In addition, we are developing and will cover topics on ecosystem structure, biomass and water on Earth. We strive to free the non-science undergraduate from the "fear of science" and replace it with the excitement of science, such that these future teachers will carry this excitement to their future students. Hands-on experiments, computer simulations, analysis of real NASA data, and vigorous seminar discussions are blended in an inquiry-driven curriculum to instill confident understanding of basic physical science and modern, effective methods for teaching it. The course also demonstrates ways in which scientific thinking and hands-on activities can be implemented in the classroom. We have designed this course to provide the non-science student with a confident basic understanding of physical science and modern, effective methods for teaching it. Most of the topics were selected using the National Science Standards and National Mathematics Standards that are addressed in grades K-8. The course focuses on helping education majors: 1) Build knowledge of scientific concepts and processes; 2) Understand the measurable attributes of objects and the units and methods of measurement; 3) Conduct data analysis (collecting, organizing, and presenting scientific data, and predicting results); 4) Use hands-on approaches to teach science; 5) Be familiar with Internet science teaching resources. Here we share our experiences and the challenges we face while teaching this course.
III International Conference on Laser and Plasma Researches and Technologies
NASA Astrophysics Data System (ADS)
2017-12-01
The III Conference on Plasma and Laser Research and Technologies took place from January 24th to January 27th, 2017 at the National Research Nuclear University "MEPhI" (NRNU MEPhI). The Conference was organized by the Institute for Laser and Plasma Technologies and was supported by the Competitiveness Program of NRNU MEPhI. The conference program consisted of nine sections: • Laser physics and its applications • Plasma physics and its applications • Laser, plasma and radiation technologies in industry • Physics of extreme light fields • Controlled thermonuclear fusion • Modern problems of theoretical physics • Challenges in solid state physics, functional materials and nanosystems • Particle accelerators and radiation technologies • Modern trends in quantum metrology. The conference covers the following scientific fields: • Laser, plasma and radiation technologies in industry, energy and medicine; • Photonics, quantum metrology, optical information processing; • New functional materials, metamaterials, “smart” alloys and quantum systems; • Ultrahigh optical fields, high-power lasers, Mega Science facilities; • High-temperature plasma physics, environmentally friendly energy based on controlled thermonuclear fusion; • Spectroscopic synchrotron, neutron and laser research methods, quantum mechanical calculation and computer modelling of condensed media and nanostructures. More than 250 specialists took part in the Conference. They represented leading Russian scientific research centers and universities (National Research Centre "Kurchatov Institute", A.M. Prokhorov General Physics Institute, P.N. Lebedev Physical Institute, Troitsk Institute for Innovation and Fusion Research, Joint Institute for Nuclear Research, Moscow Institute of Physics and Technology and others) and leading scientific centers and universities from Germany, France, the USA, Canada and Japan. We would like to thank heartily all of the speakers, participants, and organizing and program committee members for their contribution to the conference. A.P. Kuznetsov and S.V. Genisaretskaya
Long, Nguyen Phuoc; Huy, Nguyen Tien; Trang, Nguyen Thi Huyen; Luan, Nguyen Thien; Anh, Nguyen Hoang; Nghi, Tran Diem; Hieu, Mai Van; Hirayama, Kenji; Karbwang, Juntra
2014-09-01
Ethics is one of the main pillars in the development of science. We performed a JoinPoint regression analysis to analyze the trends in research on ethical issues over the past half century. The question is whether ethical issues are neglected despite their importance in modern research. The PubMed electronic library was used to retrieve publications in all fields and on ethical issues. JoinPoint regression analysis was used to identify significant time trends in publications in all fields and on ethical issues, as well as in the proportion of publications on ethical issues relative to all fields over the past half century. Annual percent changes (APC) were computed with their 95% confidence intervals, and a p-value < 0.05 was considered statistically significant. We found that publications on ethical issues increased during the period 1965-1996 but fell slightly in recent years (from 1996 to 2013). When comparing the absolute number of ethics-related articles (APEI) to all publications in all fields (APAF) on PubMed, the results showed that the proportion of APEI to APAF increased significantly during the periods 1965-1974, 1974-1986, and 1986-1993, with APCs of 11.0, 2.1, and 8.8, respectively. However, the trend has gradually dropped since 1993 and shown a marked decrease from 2002 to 2013, with an annual percent change of -7.4%. Scientific productivity in research on ethical issues over the past half century increased rapidly during the first 30-year period but has recently been in decline. Since ethics is an important aspect of scientific research, we suggest that greater attention is needed in order to emphasize the role of ethics in modern research.
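The annual percent change (APC) reported above comes from fitting a log-linear trend to yearly publication counts. Purely as an illustration of how an APC is obtained (this is a simplified single-segment fit, not the NCI JoinPoint software, and the yearly counts below are invented), the calculation might look like the following sketch.

    # Illustrative single-segment APC estimate (hypothetical counts, not the paper's data).
    import numpy as np

    years = np.arange(2002, 2014)                     # a period of declining output
    counts = np.array([520, 500, 470, 455, 430, 410,  # made-up yearly publication counts
                       395, 370, 350, 330, 310, 295])

    # Fit ln(count) = a + b * year; the APC follows from the slope b.
    b, a = np.polyfit(years, np.log(counts), 1)
    apc = 100.0 * (np.exp(b) - 1.0)                   # annual percent change, in percent
    print(f"Estimated APC: {apc:.1f}% per year")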
NASA Astrophysics Data System (ADS)
Khalili, N.; Valliappan, S.; Li, Q.; Russell, A.
2010-07-01
The use of mathematical models of natural phenomena has underpinned science and engineering for centuries, but until the advent of modern computers and computational methods, the full utility of most of these models remained outside the reach of the engineering communities. Since World War II, advances in computational methods have transformed the way engineering and science are undertaken throughout the world. Today, theories of mechanics of solids and fluids, electromagnetism, heat transfer, plasma physics, and other scientific disciplines are implemented through computational methods in engineering analysis, design, manufacturing, and in studying broad classes of physical phenomena. The discipline concerned with the application of computational methods is now a key area of research, education, and application throughout the world. In the early 1980s, the International Association for Computational Mechanics (IACM) was founded to promote activities related to computational mechanics and has made impressive progress. The most important scientific event of IACM is the World Congress on Computational Mechanics. The first was held in Austin (USA) in 1986 and then in Stuttgart (Germany) in 1990, Chiba (Japan) in 1994, Buenos Aires (Argentina) in 1998, Vienna (Austria) in 2002, Beijing (China) in 2004, Los Angeles (USA) in 2006 and Venice (Italy) in 2008. The 9th World Congress on Computational Mechanics is held in conjunction with the 4th Asian Pacific Congress on Computational Mechanics under the auspices of the Australian Association for Computational Mechanics (AACM), the Asian Pacific Association for Computational Mechanics (APACM) and the International Association for Computational Mechanics (IACM). The 1st Asian Pacific Congress was in Sydney (Australia) in 2001, then in Beijing (China) in 2004 and Kyoto (Japan) in 2007. The WCCM/APCOM 2010 publications consist of a printed book of abstracts given to delegates, along with 247 full-length peer-reviewed papers published with free access online in IOP Conference Series: Materials Science and Engineering. The editors acknowledge the help of the paper reviewers in maintaining a high standard of assessment and the co-operation of the authors in complying with the requirements of the editors and the reviewers. We also would like to take this opportunity to thank the members of the Local Organising Committee and the International Scientific Committee for helping make WCCM/APCOM 2010 a successful event. We also thank The University of New South Wales, The University of Newcastle, the Centre for Infrastructure Engineering and Safety (CIES), IACM, APACM and AACM for their financial support, along with the United States Association for Computational Mechanics for the Travel Awards made available. N. Khalili S. Valliappan Q. Li A. Russell 19 July 2010 Sydney, Australia
76 FR 31945 - Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-02
... DEPARTMENT OF ENERGY Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy... teleconference meeting of the Advanced Scientific Computing Advisory Committee (ASCAC). The Federal [email protected] . FOR FURTHER INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing...
Kim, Anita; Tidwell, Natasha
2014-12-01
The present 2 studies involved undergraduate participants and investigated whether various types of sexism and other correlated predictors, such as political conservatism and scientific discounting, can predict people's evaluations of social science research on sex stereotypes, sexism, and sex discrimination. In Study 1, participants high in hostile sexism, scientific discounting, and/or political conservatism were more critical of scientific studies that provided evidence for sexism than identical studies showing null results. Study 2 showed that participants high in modern sexism, hostile sexism, and political conservatism evaluated social scientific studies more negatively; in addition, assessments of social scientific evidence quality mediated the effect of modern sexism on admissibility ratings (b = -0.15, z = -4.16, p = .00). Overall, these results suggest that sexist beliefs can bias one's judgments of social scientific evidence. Future research should explore whether the same psychological processes operate for judges and jurors as they evaluate the admissibility of evidence and examine ways to attenuate the effect of sexism on evaluations. PsycINFO Database Record (c) 2014 APA, all rights reserved.
Theoretical bases of project management in conditions of innovative economy based on fuzzy modeling
NASA Astrophysics Data System (ADS)
Beilin, I. L.; Khomenko, V. V.
2018-05-01
In recent years, more and more Russian enterprises (both private and public) have been trying to organize their activities on the basis of modern scientific research in order to improve the management of economic processes. Business planning, financial and investment analysis, and modern software products based on the latest scientific developments are being introduced everywhere. At the same time, there is a growing demand for market research (at both the microeconomic and macroeconomic levels) and for financial and general economic information.
Extreme Scale Computing to Secure the Nation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, D L; McGraw, J R; Johnson, J R
2009-11-10
Since the dawn of modern electronic computing in the mid-1940s, U.S. national security programs have been dominant users of every new generation of high-performance computer. Indeed, the first general-purpose electronic computer, ENIAC (the Electronic Numerical Integrator and Computer), was used to calculate the expected explosive yield of early thermonuclear weapons designs. Even the U.S. numerical weather prediction program, another early application for high-performance computing, was initially funded jointly by sponsors that included the U.S. Air Force and Navy, agencies interested in accurate weather predictions to support U.S. military operations. For the decades of the cold war, national security requirements continued to drive the development of high performance computing (HPC), including advancement of the computing hardware and development of sophisticated simulation codes to support weapons and military aircraft design, numerical weather prediction, as well as data-intensive applications such as cryptography and cybersecurity. U.S. national security concerns continue to drive the development of high-performance computers and software in the U.S., and in fact events following the end of the cold war have driven an increase in the growth rate of computer performance at the high end of the market. This mainly derives from our nation's observance of a moratorium on underground nuclear testing beginning in 1992, followed by our voluntary adherence to the Comprehensive Test Ban Treaty (CTBT) beginning in 1995. The CTBT prohibits further underground nuclear tests, which in the past had been a key component of the nation's science-based program for assuring the reliability, performance and safety of U.S. nuclear weapons. In response to this change, the U.S. Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship (SBSS) program under the Fiscal Year 1994 National Defense Authorization Act, which requires, 'in the absence of nuclear testing, a program to: (1) Support a focused, multifaceted program to increase the understanding of the enduring stockpile; (2) Predict, detect, and evaluate potential problems of the aging of the stockpile; (3) Refurbish and re-manufacture weapons and components, as required; and (4) Maintain the science and engineering institutions needed to support the nation's nuclear deterrent, now and in the future'. This program continues to fulfill its national security mission by adding significant new capabilities for producing scientific results through large-scale computational simulation coupled with careful experimentation, including sub-critical nuclear experiments permitted under the CTBT. To develop the computational science and the computational horsepower needed to support its mission, SBSS initiated the Accelerated Strategic Computing Initiative, later renamed the Advanced Simulation & Computing (ASC) program (sidebar: 'History of ASC Computing Program Computing Capability'). The modern 3D computational simulation capability of the ASC program supports the assessment and certification of the current nuclear stockpile through calibration with past underground test (UGT) data. While an impressive accomplishment, continued evolution of national security mission requirements will demand computing resources at a significantly greater scale than we have today.
In particular, continued observance and potential Senate confirmation of the Comprehensive Test Ban Treaty (CTBT), together with the U.S. administration's promise of a significant reduction in the size of the stockpile and the inexorable aging and consequent refurbishment of the stockpile, all demand increasing refinement of our computational simulation capabilities. Assessment of the present and future stockpile with increased confidence in its safety and reliability, without reliance upon calibration with past or future test data, is a long-term goal of the ASC program. This will be accomplished through significant increases in the scientific bases that underlie the computational tools. Computer codes must be developed that replace phenomenology with increased levels of scientific understanding, together with an accompanying quantification of uncertainty. These advanced codes will place significantly higher demands on the computing infrastructure than do the current 3D ASC codes. This article not only discusses the need for a future computing capability at the exascale for the SBSS program, but also considers high-performance computing requirements for broader national security questions. For example, the increasing concern over potential nuclear terrorist threats demands a capability to assess threats and potential disablement technologies, as well as a rapid forensic capability for determining a nuclear weapon's design from post-detonation evidence (nuclear counterterrorism).
The Development of Sociocultural Competence with the Help of Computer Technology
ERIC Educational Resources Information Center
Rakhimova, Alina E.; Yashina, Marianna E.; Mukhamadiarova, Albina F.; Sharipova, Astrid V.
2017-01-01
The article describes the process of developing sociocultural knowledge and competences using computer technologies. On the whole, the development of modern computer technologies allows teachers to broaden trainees' sociocultural outlook and trace their progress online. Observation of modern computer technologies and estimation…
75 FR 9887 - Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-04
... DEPARTMENT OF ENERGY Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy... Advanced Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building...
76 FR 9765 - Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-22
... DEPARTMENT OF ENERGY Advanced Scientific Computing Advisory Committee AGENCY: Office of Science... Advanced Scientific Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub. L. 92... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research, SC-21/Germantown Building...
77 FR 45345 - DOE/Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-31
... Recompetition results for Scientific Discovery through Advanced Computing (SciDAC) applications Co-design Public... DEPARTMENT OF ENERGY DOE/Advanced Scientific Computing Advisory Committee AGENCY: Office of... the Advanced Scientific Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub...
75 FR 64720 - DOE/Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-20
... DEPARTMENT OF ENERGY DOE/Advanced Scientific Computing Advisory Committee AGENCY: Department of... the Advanced Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L.... FOR FURTHER INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21...
Computing through Scientific Abstractions in SysBioPS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, George; Stephan, Eric G.; Gracio, Deborah K.
2004-10-13
Today, biologists and bioinformaticists have a tremendous amount of computational power at their disposal. With the availability of supercomputers, burgeoning scientific databases and digital libraries such as GenBank and PubMed, and pervasive computational environments such as the Grid, biologists have access to a wealth of computational capabilities and scientific data at hand. Yet, the rapid development of computational technologies has far exceeded the typical biologist’s ability to effectively apply the technology in their research. Computational sciences research and development efforts such as the Biology Workbench, BioSPICE (Biological Simulation Program for Intra-Cellular Evaluation), and BioCoRE (Biological Collaborative Research Environment) are important in connecting biologists and their scientific problems to computational infrastructures. On the Computational Cell Environment and Heuristic Entity-Relationship Building Environment projects at the Pacific Northwest National Laboratory, we are jointly developing a new breed of scientific problem solving environment called SysBioPSE that will allow biologists to access and apply computational resources in the scientific research context. In contrast to other computational science environments, SysBioPSE operates as an abstraction layer above a computational infrastructure. The goal of SysBioPSE is to allow biologists to apply computational resources in the context of the scientific problems they are addressing and the scientific perspectives from which they conduct their research. More specifically, SysBioPSE allows biologists to capture and represent scientific concepts and theories and experimental processes, and to link these views to scientific applications, data repositories, and computer systems.
Generative models for clinical applications in computational psychiatry.
Frässle, Stefan; Yao, Yu; Schöbi, Dario; Aponte, Eduardo A; Heinzle, Jakob; Stephan, Klaas E
2018-05-01
Despite the success of modern neuroimaging techniques in furthering our understanding of cognitive and pathophysiological processes, translation of these advances into clinically relevant tools has been virtually absent until now. Neuromodeling represents a powerful framework for overcoming this translational deadlock, and the development of computational models to solve clinical problems has become a major scientific goal over the last decade, as reflected by the emergence of clinically oriented neuromodeling fields like Computational Psychiatry, Computational Neurology, and Computational Psychosomatics. Generative models of brain physiology and connectivity in the human brain play a key role in this endeavor, striving for computational assays that can be applied to neuroimaging data from individual patients for differential diagnosis and treatment prediction. In this review, we focus on dynamic causal modeling (DCM) and its use for Computational Psychiatry. DCM is a widely used generative modeling framework for functional magnetic resonance imaging (fMRI) and magneto-/electroencephalography (M/EEG) data. This article reviews the basic concepts of DCM, revisits examples where it has proven valuable for addressing clinically relevant questions, and critically discusses methodological challenges and recent methodological advances. We conclude this review with a more general discussion of the promises and pitfalls of generative models in Computational Psychiatry and highlight the path that lies ahead of us. This article is categorized under: Neuroscience > Computation Neuroscience > Clinical Neuroscience. © 2018 Wiley Periodicals, Inc.
75 FR 43518 - Advanced Scientific Computing Advisory Committee; Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-26
... DEPARTMENT OF ENERGY Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of... Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770...: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S...
Institutional and Individual Influences on Scientists' Data Sharing Behaviors
ERIC Educational Resources Information Center
Kim, Youngseek
2013-01-01
In modern research activities, scientific data sharing is essential, especially in terms of data-intensive science and scholarly communication. Scientific communities are making ongoing endeavors to promote scientific data sharing. Currently, however, data sharing is not always well-deployed throughout diverse science and engineering disciplines.…
NASA Astrophysics Data System (ADS)
Burba, G. G.; Johnson, D.; Velgersdyk, M.; Beaty, K.; Forgione, A.; Begashaw, I.; Allyn, D.
2015-12-01
Significant increases in data generation and computing power in recent years have greatly improved spatial and temporal flux data coverage on multiple scales, from a single station to continental flux networks. At the same time, operating budgets for flux teams and station infrastructure are getting ever more difficult to acquire and sustain. With more stations and networks, larger data flows from each station, and smaller operating budgets, modern tools are needed to effectively and efficiently handle the entire process. This would help maximize time dedicated to answering research questions, and minimize time and expenses spent on data processing, quality control and station management. Cross-sharing the stations with external institutions may also help leverage available funding, increase scientific collaboration, and promote data analyses and publications. FluxSuite, a new advanced tool combining hardware, software and web-service, was developed to address these specific demands. It automates key stages of the flux workflow, minimizes day-to-day site management, and modernizes the handling of data flows: • Each next-generation station measures all parameters needed for flux computations; • The field microcomputer calculates final fully-corrected flux rates in real time, including computation-intensive Fourier transforms, spectra, co-spectra, multiple rotations, stationarity, footprint, etc.; • Final fluxes, radiation, weather and soil data are merged into a single quality-controlled file; • Multiple flux stations are linked into an automated time-synchronized network; • The flux network manager, or PI, can see all stations in real time, including fluxes, supporting data, automated reports, and email alerts; • The PI can assign rights, and allow or restrict access to stations and data: selected stations can be shared via rights-managed access internally or with external institutions; • Researchers without stations could form "virtual networks" for specific projects by collaborating with PIs from different actual networks. This presentation provides detailed examples of FluxSuite as currently utilized by two large flux networks in China (National Academy of Sciences & Agricultural Academy of Sciences), and by smaller networks with stations in the USA, Germany, Ireland, Malaysia and other locations around the globe.
Improvement and speed optimization of numerical tsunami modelling program using OpenMP technology
NASA Astrophysics Data System (ADS)
Chernov, A.; Zaytsev, A.; Yalciner, A.; Kurkin, A.
2009-04-01
Currently, the basic problem of tsunami modelling is the low speed of calculations, which is unacceptable for operational warning services. Existing algorithms for the numerical modelling of the hydrodynamic processes of tsunami waves were developed without taking advantage of modern computing facilities. Considerable acceleration of the calculations is possible with parallel algorithms. We discuss here a new approach to parallelizing tsunami modelling code using OpenMP technology (for multiprocessor systems with shared memory). Nowadays, multiprocessor systems are easily accessible to everyone, and the cost of using such systems is much lower than the cost of clusters. This also allows programmers to apply multithreaded algorithms on researchers' desktop computers. Another important advantage of this approach is the shared-memory mechanism: there is no need to send data over slow networks (for example, Ethernet). All memory is common to all computing threads, which yields almost linear scalability of the program. In the new version of NAMI DANCE, OpenMP technology and the multithreaded algorithm provide an 80% gain in speed compared with the single-threaded version on a dual-processor unit, and a 320% gain was attained on a four-core processor unit. Thus, it was possible to considerably reduce calculation times on scientific workstations (desktops) without a complete rewrite of the program and user interfaces. Further modernization of the algorithms for preparing initial data and processing results using OpenMP looks reasonable. The final version of NAMI DANCE with the increased computational speed can be used not only for research purposes but also in real-time tsunami warning systems.
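The parallelization described here relies on OpenMP directives in the model's native compiled code. Purely as an illustration of the shared-memory, loop-level parallelism involved (and not the NAMI DANCE implementation), a rough Python analogue using Numba's prange over a made-up grid-update kernel might look like this.

    # Rough analogue of shared-memory loop parallelism; not the NAMI DANCE code.
    # Assumes Numba is installed; the update rule is a placeholder stencil.
    import numpy as np
    from numba import njit, prange

    @njit(parallel=True)
    def step(h, h_new):
        ny, nx = h.shape
        for j in prange(1, ny - 1):      # rows are distributed across threads
            for i in range(1, nx - 1):
                h_new[j, i] = 0.25 * (h[j - 1, i] + h[j + 1, i] +
                                      h[j, i - 1] + h[j, i + 1])

    h = np.random.rand(512, 512)
    h_new = np.empty_like(h)
    step(h, h_new)                        # all threads work on the same shared arrays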
Schultze-Petzold, H
1976-01-01
Regulations for the protection of useful animals can be traced to the early history of Law. The reason for such regulations has hardly changed up to the present: the expedient incorporation of the animal into the hierarchy of values of the prevailing times. Decisive impulses invariably originated from the legal conception and the need for legal protection, as well as from the scientific conceptions of society. The development rarely took a linear course and was not without setbacks. The prevention of cruelty to animals has always been faced with particular conflicting situations. Our pluralistic society, with its marked philosophy of profit-making, has to face such a problem, in particular as a result of livestock keeping in modern systems. The necessity and legitimacy of a permanent supply of large quantities of high-grade animal foodstuffs, to be offered to our present industrial society on a competitive and low-cost basis, have contributed to this development. The public and parliament have for some time been demanding a modern federal act for the prevention of cruelty to animals based on a technical conception allowing also those questions of animal protection related to the present keeping of useful animals to be integrated, thus achieving a gradual balancing of interests. Such an Animal Protection Act came into force on October 1, 1972. On account of its scientific orientation it prompts us to give renewed thought to many present-day ideas about the keeping of animals, especially of useful animals, employing modern systems. With this objective in mind, the Act has already strongly influenced the developing international harmonization of provisions for Animal Protection. The problems linked with "Animal Protection/Keeping of Useful Animals" require a harmonization of the ethical, scientific, economic and legal aspects as an indispensable prerequisite. On the basis of expert opinions prepared by a group of specialists of the Federal Ministry of Agriculture on the minimum requirements to be satisfied by modern systems of fowl breeding, the various scientific basic concepts and evaluations are presented. The value of the information yielded by modern research into animal behaviour is emphasized in this connection. Future legal ordinances in accordance with Clause 13, para 1 of the Animal Protection Act of July 24, 1972 for the protection of useful animals kept in modern systems call for a particularly thorough scientific foundation which must also stand up to examination by the courts. The problems to be solved require comprehensive research. An urgent task for the near future will be to give the resolution of these problems a firm scientific base. In addition to the topical approach to the subject "Animal Protection/Keeping of Useful Animals", indications are given for a comprehensive approach which will prove indispensable in the future...
Ancient Egypt and radiology, a future for the past!
NASA Astrophysics Data System (ADS)
Van Tiggelen, R.
2004-11-01
The discovery of X-rays by W.K. Röntgen was a scientific bombshell and was received with extraordinary interest by scientists in all disciplines, including Egyptology: the first radiological study was already made in Germany only 3 months after Röntgen's discovery. Since then, radiological examinations of mummies have been used to detect frauds, to determine sex and age, and to establish the possible cause of death. As a non-destructive tool, radiology can reveal the nature of materials and the presence of jewellery and amulets. The paper gives a brief history of major milestones in Belgium and abroad. More modern technology, such as axial computed tomography and image colouring, will allow better representations and reveal hitherto undiscovered funerary artefacts.
The Weather Forecast Using Data Mining Research Based on Cloud Computing.
NASA Astrophysics Data System (ADS)
Wang, ZhanJie; Mazharul Mujib, A. B. M.
2017-10-01
Weather forecasting has been an important application in meteorology and one of the most scientifically and technologically challenging problems around the world. In this study, we have analyzed the use of data mining techniques in forecasting weather. This paper proposes a modern method to develop a service-oriented architecture for weather information systems which forecast weather using these data mining techniques. This can be carried out by using artificial neural network and decision tree algorithms and meteorological data collected over a specific period of time. The algorithm presented the best results for generating classification rules for the mean weather variables. The results showed that these data mining techniques can be sufficient for weather forecasting.
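As a rough illustration of the decision tree side of this approach (not the paper's model; the weather variables and values below are invented), a classifier can be trained on a few mean weather variables and then inspected as human-readable rules.

    # Hedged sketch: a decision tree over invented mean weather variables.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    # Columns: mean temperature (deg C), mean humidity (%), mean pressure (hPa)
    X = np.array([[30, 85, 1005], [28, 90, 1002], [33, 40, 1015],
                  [25, 95, 1000], [35, 35, 1018], [27, 80, 1008]])
    y = ["rain", "rain", "dry", "rain", "dry", "dry"]   # observed outcome labels

    tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
    print(export_text(tree, feature_names=["temp", "humidity", "pressure"]))
    print(tree.predict([[29, 88, 1003]]))               # classify a new day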
Preserving digital images for legal proceedings.
Benedetto, Anthony R
2007-12-01
The legal principles governing the use of radiologic images in court and other legal proceedings were developed before the introduction of computers in radiology and nuclear medicine imaging equipment. Modern digital images present a wide variety of new concerns that are not adequately addressed by the principles used by most lawyers and courts. This article discusses the most important of these new concerns, such as being able to prove that an image has not been altered and being able to prove that the hardware and software used to create it were scientifically reliable. A nonexhaustive set of recommendations is given to guide radiologists as they begin to review the image preservation procedures of their practices.
ERIC Educational Resources Information Center
Berg, A. I.; And Others
Five articles which were selected from a Russian language book on cybernetics and then translated are presented here. They deal with the topics of: computer-developed computers, heuristics and modern sciences, linguistics and practice, cybernetics and moral-ethical considerations, and computer chess programs. (Author/JY)
The Visualization Toolkit (VTK): Rewriting the rendering code for modern graphics cards
NASA Astrophysics Data System (ADS)
Hanwell, Marcus D.; Martin, Kenneth M.; Chaudhary, Aashish; Avila, Lisa S.
2015-09-01
The Visualization Toolkit (VTK) is an open source, permissively licensed, cross-platform toolkit for scientific data processing, visualization, and data analysis. It is over two decades old, originally developed for a very different graphics card architecture. Modern graphics cards feature fully programmable, highly parallelized architectures with large core counts. VTK's rendering code was rewritten to take advantage of modern graphics cards, maintaining most of the toolkit's programming interfaces. This offers the opportunity to compare the performance of old and new rendering code on the same systems/cards. Significant improvements in rendering speeds and memory footprints mean that scientific data can be visualized in greater detail than ever before. The widespread use of VTK means that these improvements will reap significant benefits.
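Because the rewrite preserved most of VTK's programming interfaces, user-level pipelines remain largely unchanged while the new rendering backend does the work underneath. As a minimal sketch (using VTK's Python bindings and a trivial sphere source rather than real scientific data; both are assumptions of this example), a rendering pipeline looks like this.

    # Minimal VTK rendering pipeline; the OpenGL backend is chosen by the VTK
    # build itself, so user code at this level stays essentially the same.
    import vtk

    source = vtk.vtkSphereSource()          # stand-in for real scientific data
    source.SetThetaResolution(64)
    source.SetPhiResolution(64)

    mapper = vtk.vtkPolyDataMapper()
    mapper.SetInputConnection(source.GetOutputPort())

    actor = vtk.vtkActor()
    actor.SetMapper(mapper)

    renderer = vtk.vtkRenderer()
    renderer.AddActor(actor)

    window = vtk.vtkRenderWindow()
    window.AddRenderer(renderer)
    window.SetSize(400, 400)
    window.Render()                          # draw one frame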
The discovery of circulation and the origin of modern medicine during the italian renaissance.
Thiene, G
1997-03-01
This historical article discusses the dawn of anatomy during the Italian Renaissance, the role of the University of Padua in the origin of modern medicine, milestones in the development of modern medicine, the discovery of circulation, Padua leadership and Galileo's persecution for his scientific theories. Copyright © 1997 Elsevier Science Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; Kalinkin, Alexander A.
2017-02-01
Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to an 8.5x improvement at the selected kernel level with the first approach, and up to a 50% improvement in total simulated time with the latter, for the demonstration cases and target HPC systems employed.
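Amdahl's law gives a quick sense of how a large kernel-level speedup translates into a more modest whole-run improvement when the kernel is only part of the total runtime. The fraction used below is invented for illustration and is not taken from the study.

    # Amdahl's law: overall speedup when a fraction f of runtime is sped up by s.
    def overall_speedup(f, s):
        return 1.0 / ((1.0 - f) + f / s)

    f = 0.40   # hypothetical fraction of total time spent in the optimized kernel
    s = 8.5    # kernel-level speedup of the same order as reported
    print(f"Overall speedup: {overall_speedup(f, s):.2f}x")   # about 1.55x, roughly 35% less time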
Scientific knowledge and modern prospecting
Neuerburg, G.J.
1985-01-01
Modern prospecting is the systematic search for specified and generally ill-exposed components of the Earth's crust known as ore. This prospecting depends entirely on reliable, or scientific, knowledge for guidance and for recognition of the search objects. Improvement in prospecting results from additions and refinements to scientific knowledge. Scientific knowledge is an ordered distillation of observations too numerous and too complex in themselves for easy understanding and for effective management. The ordering of these observations is accomplished by an evolutionary hierarchy of abstractions. These abstractions employ simplified descriptions consisting of characterization by selected properties, sampling to represent much larger parts of a phenomenon, generalized mappings of patterns of geometrical and numerical relations among properties, and explanation (theory) of these patterns as functional relations among the selected properties. Each abstraction is predicated on the mode of abstraction anticipated for the next higher level, so that research is a deductive process in which the highest level, theory, is indispensable for the growth and refinement of scientific knowledge, and therefore of prospecting methodology. © 1985 Springer-Verlag.
ERIC Educational Resources Information Center
Pramling, Niklas; Saljo, Roger
2007-01-01
The article reports an empirical study of how authors in popular science magazines attempt to render scientific knowledge intelligible to wide audiences. In bridging the two domains of "popular" and "scientific" knowledge, respectively, metaphor becomes central. We ask the empirical question of what metaphors are used when communicating about…
A project of a two meter telescope in North Africa
NASA Astrophysics Data System (ADS)
Benkhaldoun, Zouhair
2015-03-01
Site testing undertaken during the last 20 years by Moroccan researchers through international studies has shown that the Atlas mountains in Morocco have potential similar to the sites which host the largest telescopes in the world. Given the quality of the sites and the opportunities to conduct modern research, we believe that the installation of a 2m diameter telescope will open new horizons for astronomy in Morocco and North Africa, allowing our region to enter definitively into the very exclusive club of countries possessing an instrument of that size. A state-of-the-art astrophysical observatory on any good astronomical observation site should be equipped with a modern 2m-class robotic telescope and some smaller telescopes. Our plan should be to operate one of the most efficient robotic 2m-class telescopes worldwide in order to offer optimal scientific opportunities for researchers and maintain the highest standards for the education of students. Besides all categories of astronomical research, students will have the possibility to be educated intensively in the design, manufacturing and operation of modern state-of-the-art computer-controlled instruments. Within the framework of such education and observation studies, several PhD and dissertation work packages are possible. Many of the observations will be published in articles worldwide, and a number of guest observers from other countries will have the possibility to take part in collaborations. This could be the starting point of an international reputation for our region in the field of modern astronomy.
Computational Chemistry Using Modern Electronic Structure Methods
ERIC Educational Resources Information Center
Bell, Stephen; Dines, Trevor J.; Chowdhry, Babur Z.; Withnall, Robert
2007-01-01
Various modern electronic structure methods are nowadays used to teach computational chemistry to undergraduate students. Such quantum calculations can now easily be carried out, even for large molecules.
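As one concrete illustration of how little code such a calculation requires (the abstract does not name a specific package; PySCF and the H2 geometry here are assumptions of this sketch), a restricted Hartree-Fock calculation can be run in a few lines.

    # Hedged example: restricted Hartree-Fock on H2 with PySCF (assumed package).
    from pyscf import gto, scf

    mol = gto.M(atom="H 0 0 0; H 0 0 0.74",   # geometry in Angstrom
                basis="sto-3g")
    mf = scf.RHF(mol)
    energy = mf.kernel()                       # total SCF energy in Hartree
    print(f"RHF/STO-3G energy of H2: {energy:.6f} Ha")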
Early modern mathematical instruments.
Bennett, Jim
2011-12-01
In considering the appropriate use of the terms "science" and "scientific instrument," tracing the history of "mathematical instruments" in the early modern period is offered as an illuminating alternative to the historian's natural instinct to follow the guiding lights of originality and innovation, even if the trail transgresses contemporary boundaries. The mathematical instrument was a well-defined category, shared across the academic, artisanal, and commercial aspects of instrumentation, and its narrative from the sixteenth to the eighteenth century was largely independent from other classes of device, in a period when a "scientific" instrument was unheard of.
Computer-assisted learning in critical care: from ENIAC to HAL.
Tegtmeyer, K; Ibsen, L; Goldstein, B
2001-08-01
Computers are commonly used to serve many functions in today's modern intensive care unit. One of the most intriguing and perhaps most challenging applications of computers has been the attempt to improve medical education. With the introduction of the first computer, medical educators began looking for ways to incorporate its use into the modern curriculum. The cost and complexity of computers, which initially limited their use, have decreased consistently since their introduction, making it increasingly feasible to incorporate computers into medical education. Simultaneously, the capabilities and capacities of computers have increased. Combining the computer with other modern digital technology has allowed the development of more intricate and realistic educational tools. The purpose of this article is to briefly describe the history and use of computers in medical education with special reference to critical care medicine. In addition, we examine the role of computers in teaching and learning and discuss the types of interaction between the computer user and the computer.
PyNEST: A Convenient Interface to the NEST Simulator.
Eppler, Jochen Martin; Helias, Moritz; Muller, Eilif; Diesmann, Markus; Gewaltig, Marc-Oliver
2008-01-01
The neural simulation tool NEST (http://www.nest-initiative.org) is a simulator for heterogeneous networks of point neurons or neurons with a small number of compartments. It aims at simulations of large neural systems with more than 10^4 neurons and 10^7 to 10^9 synapses. NEST is implemented in C++ and can be used on a large range of architectures from single-core laptops over multi-core desktop computers to super-computers with thousands of processor cores. Python (http://www.python.org) is a modern programming language that has recently received considerable attention in Computational Neuroscience. Python is easy to learn and has many extension modules for scientific computing (e.g. http://www.scipy.org). In this contribution we describe PyNEST, the new user interface to NEST. PyNEST combines NEST's efficient simulation kernel with the simplicity and flexibility of Python. Compared to NEST's native simulation language SLI, PyNEST makes it easier to set up simulations, generate stimuli, and analyze simulation results. We describe how PyNEST connects NEST and Python and how it is implemented. With a number of examples, we illustrate how it is used.
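To give a flavor of the interface style the abstract describes, the following is a minimal sketch of a PyNEST script. It is written against the NEST 3.x form of the API; model and device names (e.g. "iaf_psc_alpha", "spike_recorder") vary slightly between NEST releases, so treat the exact identifiers as assumptions rather than a definitive recipe.

```python
import nest  # assumes a working NEST installation with the PyNEST bindings

nest.ResetKernel()

# A small network: 100 leaky integrate-and-fire point neurons driven by Poisson noise.
neurons = nest.Create("iaf_psc_alpha", 100)
noise = nest.Create("poisson_generator", params={"rate": 8000.0})  # 8 kHz input
spikes = nest.Create("spike_recorder")  # called "spike_detector" in older NEST releases

nest.Connect(noise, neurons)   # drive every neuron with the same noise source
nest.Connect(neurons, spikes)  # record all emitted spikes

nest.Simulate(200.0)           # simulate 200 ms of biological time
print(nest.GetStatus(spikes, "n_events"))
```

Setting up the same network in NEST's native SLI would require considerably more boilerplate, which is the convenience the abstract emphasizes.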
Renkawitz, Tobias; Tingart, Markus; Grifka, Joachim; Sendtner, Ernst; Kalteis, Thomas
2009-09-01
This article outlines the scientific basis and a state-of-the-art application of computer-assisted orthopedic surgery in total hip arthroplasty (THA) and provides a future perspective on this technology. Computer-assisted orthopedic surgery in primary THA has the potential to couple 3D simulations with real-time evaluations of surgical performance, which has brought these developments from the research laboratory all the way to clinical use. Nonimage- or imageless-based navigation systems without the need for additional pre- or intra-operative image acquisition have stood the test to significantly reduce the variability in positioning the acetabular component and have shown precise measurement of leg length and offset changes during THA. More recently, computer-assisted orthopedic surgery systems have opened a new frontier for accurate surgical practice in minimally invasive, tissue-preserving THA. The future generation of imageless navigation systems will switch from simple measurement tasks to real navigation tools. These software algorithms will consider the cup and stem as components of a coupled biomechanical system, navigating the orthopedic surgeon to find an optimized complementary component orientation rather than target values intraoperatively, and are expected to have a high impact on clinical practice and postoperative functionality in modern THA.
NASA Astrophysics Data System (ADS)
Georgiou, Harris
2009-10-01
Medical Informatics and the application of modern signal processing in the assistance of the diagnostic process in medical imaging is one of the more recent and active research areas today. This thesis addresses a variety of issues related to the general problem of medical image analysis, specifically in mammography, and presents a series of algorithms and design approaches for all the intermediate levels of a modern system for computer-aided diagnosis (CAD). The diagnostic problem is analyzed with a systematic approach, first defining the imaging characteristics and features that are relevant to probable pathology in mammograms. Next, these features are quantified and fused into new, integrated radiological systems that exhibit embedded digital signal processing, in order to improve the final result and minimize the radiological dose for the patient. At a higher level, special algorithms are designed for detecting and encoding these clinically interesting imaging features, in order to be used as input to advanced pattern classifiers and machine learning models. Finally, these approaches are extended in multi-classifier models under the scope of Game Theory and optimum collective decision, in order to produce efficient solutions for combining classifiers with minimum computational costs for advanced diagnostic systems. The material covered in this thesis is related to a total of 18 published papers, 6 in scientific journals and 12 in international conferences.
Valleron, Alain-Jacques
2017-08-15
Automation of laboratory tests, bioinformatic analysis of biological sequences, and professional data management are used routinely in a modern university hospital-based infectious diseases institute. This dates back to at least the 1980s. However, the scientific methods of this 21st century are changing with the increased power and speed of computers, with the "big data" revolution having already happened in genomics and environment, and eventually arriving in medical informatics. The research will be increasingly "data driven," and the powerful machine learning methods whose efficiency is demonstrated in daily life will also revolutionize medical research. A university-based institute of infectious diseases must therefore not only gather excellent computer scientists and statisticians (as in the past, and as in any medical discipline), but also fully integrate the biologists and clinicians with these computer scientists, statisticians, and mathematical modelers having a broad culture in machine learning, knowledge representation, and knowledge discovery. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.
Computational ecology as an emerging science
Petrovskii, Sergei; Petrovskaya, Natalia
2012-01-01
It has long been recognized that numerical modelling and computer simulations can be used as a powerful research tool to understand, and sometimes to predict, the tendencies and peculiarities in the dynamics of populations and ecosystems. It has been, however, much less appreciated that the context of modelling and simulations in ecology is essentially different from those that normally exist in other natural sciences. In our paper, we review the computational challenges arising in modern ecology in the spirit of computational mathematics, i.e. with our main focus on the choice and use of adequate numerical methods. Somewhat paradoxically, the complexity of ecological problems does not always require the use of complex computational methods. This paradox, however, can be easily resolved if we recall that application of sophisticated computational methods usually requires clear and unambiguous mathematical problem statement as well as clearly defined benchmark information for model validation. At the same time, many ecological problems still do not have mathematically accurate and unambiguous description, and available field data are often very noisy, and hence it can be hard to understand how the results of computations should be interpreted from the ecological viewpoint. In this scientific context, computational ecology has to deal with a new paradigm: conventional issues of numerical modelling such as convergence and stability become less important than the qualitative analysis that can be provided with the help of computational techniques. We discuss this paradigm by considering computational challenges arising in several specific ecological applications. PMID:23565336
Traditional Chinese rehabilitative therapy in the process of modernization.
Zhuo, D H
1988-01-01
In the past few years modalities of traditional Chinese rehabilitative therapy have changed from an experimental approach towards the shaping of a modernized and scientific system. The landmark of this process is characterized by adoption of scientific methods in the appraisal of efficacy, provision of experimental evidence to unveil the mechanisms for the treatments and development of new modalities by innovation with modern technology. Recent advances in clinical and experimental studies on acupuncture, Chinese massage and manipulation, qigong, and Tai Ji exercise are reviewed, with a focus on new findings in physiological mechanisms and effects on anti-senility. Comments are made on new modalities such as 'physical therapy on acupoints'. Progress in the use of qigong (meditation therapy) in tapping mental potentials and remediating mental deficiency is also reported.
Are Opinions Based on Science: Modelling Social Response to Scientific Facts
Iñiguez, Gerardo; Tagüeña-Martínez, Julia; Kaski, Kimmo K.; Barrio, Rafael A.
2012-01-01
As scientists we like to think that modern societies and their members base their views, opinions and behaviour on scientific facts. This is not necessarily the case, even though we are all (over-) exposed to information flow through various channels of media, i.e. newspapers, television, radio, internet, and web. It is thought that this is mainly due to the conflicting information on the mass media and to the individual attitude (formed by cultural, educational and environmental factors), that is, one external factor and another personal factor. In this paper we will investigate the dynamical development of opinion in a small population of agents by means of a computational model of opinion formation in a co-evolving network of socially linked agents. The personal and external factors are taken into account by assigning an individual attitude parameter to each agent, and by subjecting all to an external but homogeneous field to simulate the effect of the media. We then adjust the field strength in the model by using actual data on scientific perception surveys carried out in two different populations, which allow us to compare two different societies. We interpret the model findings with the aid of simple mean field calculations. Our results suggest that scientifically sound concepts are more difficult to acquire than concepts not validated by science, since opposing individuals organize themselves in close communities that prevent opinion consensus. PMID:22905117
Towards prediction of correlated material properties using quantum Monte Carlo methods
NASA Astrophysics Data System (ADS)
Wagner, Lucas
Correlated electron systems offer a richness of physics far beyond noninteracting systems. If we would like to pursue the dream of designer correlated materials, or, even to set a more modest goal, to explain in detail the properties and effective physics of known materials, then accurate simulation methods are required. Using modern computational resources, quantum Monte Carlo (QMC) techniques offer a way to directly simulate electron correlations. I will show some recent results on a few extremely challenging materials including the metal-insulator transition of VO2, the ground state of the doped cuprates, and the pressure dependence of magnetic properties in FeSe. By using a relatively simple implementation of QMC, at least some properties of these materials can be described truly from first principles, without any adjustable parameters. Using the QMC platform, we have developed a way of systematically deriving effective lattice models from the simulation. This procedure is particularly attractive for correlated electron systems because the QMC methods treat the one-body and many-body components of the wave function and Hamiltonian on completely equal footing. I will show some examples of using this downfolding technique and the high accuracy of QMC to connect our intuitive ideas about interacting electron systems with high fidelity simulations. The work in this presentation was supported in part by NSF DMR 1206242, the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research, Scientific Discovery through Advanced Computing (SciDAC) program under Award Number FG02-12ER46875, and the Center for Emergent Superconductivity, Department of Energy Frontier Research Center under Grant No. DEAC0298CH1088. Computing resources were provided by a Blue Waters Illinois grant and INCITE PhotSuper and SuperMatSim allocations.
Quantifying falsifiability of scientific theories
NASA Astrophysics Data System (ADS)
Nemenman, Ilya
I argue that the notion of falsifiability, a key concept in defining a valid scientific theory, can be quantified using Bayesian Model Selection, which is a standard tool in modern statistics. This relates falsifiability to the quantitative version of the statistical Occam's razor, and allows transforming some long-running arguments about validity of scientific theories from philosophical discussions to rigorous mathematical calculations.
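As a concrete, hedged illustration of the idea (an example constructed here, not taken from the talk itself), consider comparing a sharp, highly falsifiable hypothesis about a coin (p = 0.5 exactly) with a flexible one (p uniform on [0, 1]) via their marginal likelihoods; the Bayes factor automatically penalizes the flexible model, which is the quantitative Occam's razor mentioned above.

```python
from math import comb, log

def log_marginal_sharp(k, n, p0=0.5):
    # Sharp, falsifiable model: the success probability is fixed at p0.
    return k * log(p0) + (n - k) * log(1 - p0)

def log_marginal_flexible(k, n):
    # Flexible model: p is uniform on [0, 1].
    # integral_0^1 p^k (1-p)^(n-k) dp = Beta(k+1, n-k+1) = 1 / ((n+1) * C(n, k))
    return -log((n + 1) * comb(n, k))

k, n = 60, 100  # 60 successes in 100 trials: only mildly at odds with p = 0.5
log_bayes_factor = log_marginal_sharp(k, n) - log_marginal_flexible(k, n)
print(f"log Bayes factor (sharp vs. flexible): {log_bayes_factor:.2f}")
# A positive value means the data do not yet justify abandoning the sharper,
# more falsifiable hypothesis, even though the flexible model fits k/n = 0.6 better.
```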
[Trueness of modern natural science (1): the scientific revolution and the problem of philosophy].
Maeda, Y
2001-12-01
How can one characterize modern Europe? This problem is essentially related to the meaning of modern natural science, which was developed during the scientific revolution. How, then, did viewpoints change during this revolution? The answer to this question also determined the basic character of modern philosophy. Through the examination of Aristotle's geocentric theory and kinematics, I have come to believe that the defect of Aristotle's view was his conclusion that a visible sense image is an actual reflection of reality as it is. From this point of view, the traditional theory of truth called "correspondence theory" is found to be insufficient. Therefore, in this paper I show that the methodological and philosophical question "How do we see reality among phenomena?" is a very important one. This question is the one Plato struggled with, and also the one which guided Kant. It may be said that this can be seen as a ground for a new metaphysics as a basic theory of reality.
Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; ...
2017-03-20
Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, which is an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to 8.5x improvement at the selected kernel level with the first approach, and up to 50% improvement in total simulated time with the latter, for the demonstration cases and target HPC systems employed.
Development of Clinical Pharmacology in the Russian Federation.
Petrov, V I; Kagramanyan, I N; Khokhlov, A L; Frolov, M U; Lileeva, E G
2016-05-01
The article aims to provide the history, organization, and approaches to clinical pharmacology in the Russian Federation. This article is based on major international and Russian documents, along with groundbreaking historical facts and scientific articles related to the development of modern clinical pharmacology in the Russian Federation. Improving the quality of drug therapy is the main goal of clinical pharmacology in the Russian Federation. Decisions of the World Health Organization, scientific achievements, and the work of well-known scientists in the world community and in the Russian Federation have strongly influenced the development of clinical pharmacology in the Russian Federation. Clinical pharmacology in the Russian Federation addresses a wide range of problems; it actively engages in modern scientific research, education, and clinical practice. Clinical pharmacologists participate in studies of new drugs and often have a specific area of expertise. The future development of clinical pharmacology in the Russian Federation will be related to improvements in training, refinement of the framework that regulates clinical pharmacologists, and the creation of clinical pharmacology laboratories with modern equipment. Copyright © 2016 Elsevier HS Journals, Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Rachmatullah, Arif; Diana, Sariwulan; Rustaman, Nuryani Y.
2016-02-01
Along with the development of science and technology, the basic ability to read, write, and count is no longer enough to survive in a modern era surrounded by the products of science and technology. Scientific literacy might therefore be added as a basic ability for the modern era. Recently, Fives et al. developed a new scientific literacy assessment for students, named the Scientific Literacy Assessment (SLA). A pilot study using the SLA was conducted to investigate the scientific literacy achievement of 223 middle school students in Sumedang and to compare the outcomes by gender (159 girls and 64 boys) and school accreditation (A and B), using a quantitative, descriptive school-survey method. Based on the results, the average scientific literacy achievement of Sumedang middle school students is 45.21, which falls in the low category. Of the five components of scientific literacy, only one (science motivation and beliefs) is in the medium category; the other four components are in the low and very low categories. Boys have higher scientific literacy, but the difference is not statistically significant. Students' scientific literacy in A-accredited schools is higher than in B-accredited schools, and the difference is statistically significant. Recommendations for further research are to involve more subjects, add more questions for each indicator, and conduct independent research on each component.
The Scientific and Technical Revolution in the Socialist Republic of Viet Nam.
ERIC Educational Resources Information Center
Vien, Nguyen Khac
1979-01-01
Discussed are the reasons for the Socialist Republic of Viet Nam's scientific backwardness. A development project which will enable this country to become a modern, economically self-sufficient country by the year 2000 is outlined. (BT)
In the maw of the Ouroboros: an analysis of scientific literacy and democracy
NASA Astrophysics Data System (ADS)
Bang, Lars
2017-10-01
This paper explores the concept of scientific literacy through its relation to democracy and citizenship. Scientific literacy has received international attention in the twenty-first century as demonstrated by the Programme for International Student Assessment survey of 2006. It is no longer just a concept but has become a stated and testable outcome in the science education research community. This paper problematizes the 'marriage' between scientific literacy and democracy, particularly the idea that scientific literacy is a presupposed necessity to proper citizenship and awareness of the role of science in modern society. A perusal of the science education literature can provide a history of scientific literacy, as it exists as a research category. Through Gilles Deleuze's notion of the Dogmatic Image of Thought and its relation to a Spinozist understanding of individuation/Becoming, it is argued that scientific literacy is not a recent invention and is problematic in its relation to democracy. This article is thus intended to act more as a vehicle to move, stimulate, and dramatize thought, and potentially to reconceptualise scientific literacy, than as a comprehensive historical analysis. The concept of scientific literacy has undergone specific transformations in the last two centuries and has been enacted in different manifestations throughout modernity. Here the analysis draws upon Deleuze's reading of Michel Foucault and the notion of the Diagram related to Foucault's oeuvre, and specifically uses Foucault's notion of rationalities as actualized threads or clusters of discourse. The obvious link between science and democracy is an effect of specific rationalities within the epistemological field of science, rather than of intrinsic, essential characteristics of science or scientific literacy. There is nothing intrinsic in its function for democracy. Through a case study of the work of Charles W. Eliot and Herbert Spencer and the modern enactment of scientific literacy in contemporary science education, this paper shows the cultural and historical contingencies on which the relation between scientific literacy and democracy has been constructed, through a rationality this article calls the Man of Science. The mythical Ouroboros will be used as a Fresh Image of Thought to explore the movements and folds within the discursive formation of Scientific Literacy, the rationality of the Man of Science, and their relation to democracy.
Physical Attraction: The Mysteries of Magnetism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stohr, Joachim
2004-12-14
Most people have intuitive associations with the word 'magnetism' based on everyday life: refrigerator magnets, the compass, north and south poles, or someone's 'magnetic personality'. Few people, however, realize how complicated the phenomenon really is, how much research still deals with the topic today, and how much it penetrates our modern industrialized world - from electricity, wireless communication at the speed of light to magnetic sensors in cars and data storage in computers. Stohr's lecture will provide a glimpse at the magic and science behind magnetism: its long history, scientific breakthroughs in its understanding, and its use in our modern society. In the process Stohr will show how research at SSRL/SLAC is addressing some of the forefront issues in magnetism research and technology today.
NASA Technical Reports Server (NTRS)
Feng, Hui-Yu; VanderWijngaart, Rob; Biswas, Rupak; Biegel, Bryan (Technical Monitor)
2001-01-01
We describe the design of a new method for the measurement of the performance of modern computer systems when solving scientific problems featuring irregular, dynamic memory accesses. The method involves the solution of a stylized heat transfer problem on an unstructured, adaptive grid. A Spectral Element Method (SEM) with an adaptive, nonconforming mesh is selected to discretize the transport equation. The relatively high order of the SEM lowers the fraction of wall clock time spent on inter-processor communication, which eases the load balancing task and allows us to concentrate on the memory accesses. The benchmark is designed to be three-dimensional. Parallelization and load balance issues of a reference implementation will be described in detail in future reports.
State estimation improves prospects for ocean research
NASA Astrophysics Data System (ADS)
Stammer, Detlef; Wunsch, C.; Fukumori, I.; Marshall, J.
Rigorous global ocean state estimation methods can now be used to produce dynamically consistent time-varying model/data syntheses, the results of which are being used to study a variety of important scientific problems. Figure 1 shows a schematic of a complete ocean observing and synthesis system that includes global observations and state-of-the-art ocean general circulation models (OGCM) run on modern computer platforms. A global observing system is described in detail in Smith and Koblinsky [2001], and the present status of ocean modeling and anticipated improvements are addressed by Griffies et al. [2001]. Here, the focus is on the third component of state estimation: the synthesis of the observations and a model into a unified, dynamically consistent estimate.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spong, D.A.
The design techniques and physics analysis of modern stellarator configurations for magnetic fusion research rely heavily on high performance computing and simulation. Stellarators, which are fundamentally 3-dimensional in nature, offer significantly more design flexibility than more symmetric devices such as the tokamak. By varying the outer boundary shape of the plasma, a variety of physics features, such as transport, stability, and heating efficiency can be optimized. Scientific visualization techniques are an important adjunct to this effort as they provide a necessary ergonomic link between the numerical results and the intuition of the human researcher. The authors have developed a variety of visualization techniques for stellarators which both facilitate the design optimization process and allow the physics simulations to be more readily understood.
NASA Technical Reports Server (NTRS)
Denning, Peter J.
1989-01-01
In 1983 and 1984, the Infrared Astronomical Satellite (IRAS) detected 5,425 stellar objects and measured their infrared spectra. In 1987 a program called AUTOCLASS used Bayesian inference methods to discover the classes present in these data and determine the most probable class of each object, revealing unknown phenomena in astronomy. AUTOCLASS has rekindled the old debate on the suitability of Bayesian methods, which are computationally intensive, interpret probabilities as plausibility measures rather than frequencies, and appear to depend on a subjective assessment of the probability of a hypothesis before the data were collected. Modern statistical methods have, however, recently been shown to also depend on subjective elements. These debates bring into question the whole tradition of scientific objectivity and offer scientists a new way to take responsibility for their findings and conclusions.
Hansen, Werner
2016-12-01
The beginnings of modern Western medicine reach back to about 1800 when, under the liberating influence of the French Revolution, the observation of diseases began to follow more scientifically justified criteria. At that time speculative doctrines prevailed, e.g. those set up by the natural philosopher Schelling. In this context the internist Friedrich Theodor von Frerichs at Berlin Charité University Hospital gained great merit through his struggle for a scientifically based, experimental clinical medicine. This is demonstrated nicely in a recently found autograph document. © Georg Thieme Verlag KG Stuttgart · New York.
Application of Modern Fortran to Spacecraft Trajectory Design and Optimization
NASA Technical Reports Server (NTRS)
Williams, Jacob; Falck, Robert D.; Beekman, Izaak B.
2018-01-01
In this paper, applications of the modern Fortran programming language to the field of spacecraft trajectory optimization and design are examined. Modern object-oriented Fortran has many advantages for scientific programming, although many legacy Fortran aerospace codes have not been upgraded to use the newer standards (or have been rewritten in other languages perceived to be more modern). NASA's Copernicus spacecraft trajectory optimization program, originally a combination of Fortran 77 and Fortran 95, has attempted to keep up with modern standards and makes significant use of the new language features. Various algorithms and methods are presented from trajectory tools such as Copernicus, as well as modern Fortran open source libraries and other projects.
Mass Media and Global Warming: A Public Arenas Model of the Greenhouse Effect's Scientific Roots.
ERIC Educational Resources Information Center
Neuzil, Mark
1995-01-01
Uses the Public Arenas model to examine the historical roots of the greenhouse effect issue as communicated in scientific literature from the early 1800s to modern times. Utilizes a constructivist approach to discuss several possible explanations for the rise and fall of global warming as a social problem in the scientific arena. (PA)
ERIC Educational Resources Information Center
Kaomea, Julie
2013-01-01
Amidst late 19th-century efforts to emphasize modern medicine's transition to a more scientific approach, physicians seeking to represent themselves as scientists began wearing white laboratory coats. Today educational researchers are likewise urged to don metaphorical white coats as scientifically based research is held up as the cure-all for our…
Object-Oriented Scientific Programming with Fortran 90
NASA Technical Reports Server (NTRS)
Norton, C.
1998-01-01
Fortran 90 is a modern language that introduces many important new features beneficial for scientific programming. We discuss our experiences in plasma particle simulation and unstructured adaptive mesh refinement on supercomputers, illustrating the features of Fortran 90 that support the object-oriented methodology.
Emphasizing history in communicating scientific debates
NASA Astrophysics Data System (ADS)
Sherwood, S. C.
2010-12-01
Communication to the public of the reality of anthropogenic climate change has been less successful than many expect. The scientists themselves, the media, special interest groups, or the complexity of modern society are often blamed. However a look at past scientific paradigm shifts, in particular the Copernican revolution and the discovery of relativity, shows close parallels with the modern situation. Common aspects include the gradual formation of a scientific consensus in advance of the public; a politically partisan backlash against the new theory that, paradoxically, occurs after the arrival of conclusive supporting evidence; the prevalence of convincing but invalid pseudo-scientific counterarguments; the general failure of "debates" to increase public acceptance of the scientists' position; and, in the case of the heliocentric solar system, a very long time scale to final public acceptance (> 100 years). Greater emphasis on the lessons from such historical parallels, and on the success so far of consensus predictions of global warming made up to and including the first IPCC report in 1990, might be one useful way of enhancing the public's trust in science and scientists and thereby accelerate acceptance of uncomfortable scientific findings.
Diffusion of knowledge and globalization in the web of twentieth century science
NASA Astrophysics Data System (ADS)
Naumis, G. G.; Phillips, J. C.
2012-08-01
Scientific communication is an essential part of modern science: whereas Archimedes worked alone, Newton (correspondence with Hooke, 1676) acknowledged that “If I have seen a little further, it is by standing on the shoulders of Giants.” How is scientific communication reflected in the patterns of citations in scientific papers? How have these patterns changed in the 20th century, as both means of communication and individual transportation changed rapidly, compared to the earlier post-Newton 18th and 19th centuries? Here we discuss a diffusive model for scientific communications, based on a unique 2009 scientometric study of 25 million papers and 600 million citations that encapsulates the epistemology of modern science. The diffusive model predicts and explains, using no adjustable parameters, a surprisingly universal internal structure in the development of scientific research, which is essentially constant across the natural sciences, but which because of globalization changed qualitatively around 1960. Globalization corresponds physically to anomalous diffusion, which has been observed near the molecular glass transition, and can enhance molecular diffusion by factors as large as 100.
Archives and the Boundaries of Early Modern Science.
Popper, Nicholas
2016-03-01
This contribution argues that the study of early modern archives suggests a new agenda for historians of early modern science. While in recent years historians of science have begun to direct increased attention toward the collections amassed by figures and institutions traditionally portrayed as proto-scientific, archives proliferated across early modern Europe, emerging as powerful tools for creating knowledge in politics, history, and law as well as natural philosophy, botany, and more. The essay investigates the methods of production, collection, organization, and manipulation used by English statesmen and Crown officers such as Keeper of the State Papers Thomas Wilson and Secretary of State Joseph Williamson to govern their disorderly collections. Their methods, it is shown, were shared with contemporaries seeking to generate and manage other troves of evidence and in fact reflect a complex ecosystem of imitation and exchange across fields of inquiry. These commonalities suggest that historians of science should look beyond the ancestors of modern scientific disciplines to examine how practices of producing knowledge emerged and migrated throughout cultures of learning in Europe and beyond. Creating such a map of knowledge production and exchange, the essay concludes, would provide a renewed and expansive ambition for the field.
The art of seeing and painting.
Grossberg, Stephen
2008-01-01
The human urge to represent the three-dimensional world using two-dimensional pictorial representations dates back at least to Paleolithic times. Artists from ancient to modern times have struggled to understand how a few contours or color patches on a flat surface can induce mental representations of a three-dimensional scene. This article summarizes some of the recent breakthroughs in scientifically understanding how the brain sees that shed light on these struggles. These breakthroughs illustrate how various artists have intuitively understood paradoxical properties about how the brain sees, and have used that understanding to create great art. These paradoxical properties arise from how the brain forms the units of conscious visual perception; namely, representations of three-dimensional boundaries and surfaces. Boundaries and surfaces are computed in parallel cortical processing streams that obey computationally complementary properties. These streams interact at multiple levels to overcome their complementary weaknesses and to transform their complementary properties into consistent percepts. The article describes how properties of complementary consistency have guided the creation of many great works of art.
Advanced interdisciplinary undergraduate program: light engineering
NASA Astrophysics Data System (ADS)
Bakholdin, Alexey; Bougrov, Vladislav; Voznesenskaya, Anna; Ezhova, Kseniia
2016-09-01
The undergraduate educational program "Light Engineering" of an advanced level of studies is focused on the development of scientific learning outcomes and the training of professionals whose activities lie in the interdisciplinary fields of optical engineering and technical physics. The program gives practical experience in the transmission, reception, storage, processing, and display of information using opto-electronic devices, automation of optical systems design, computer image modeling, automated quality control, and characterization of optical devices. The program is implemented in accordance with the Educational standards of ITMO University. A specific feature of the Program is practice- and problem-based learning, implemented by engaging students in research and projects, and in internships at enterprises and in leading Russian and international research and educational centers. The modular structure of the Program and a significant proportion of variable disciplines provide the concept of individual learning for each student. Learning outcomes of the program's graduates include theoretical knowledge and skills in natural science and core professional disciplines, deep knowledge of modern computer technologies, research expertise, design skills, and optical and optoelectronic systems and devices.
Medical Information & Technology: Rapidly Expanding Vast Horizons
NASA Astrophysics Data System (ADS)
Sahni, Anil K.
2012-12-01
During The "Medical Council Of India" Platinum Jubilee Year (1933-2008) Celebrations, In Year 2008, Several Scientific Meetings/Seminars/Symposia On Various Topics Of Contemporary Importance And Relevance In The Field Of "Medical Education And Ethics" Were Organized By Different Medical Colleges At Various Local, State, National Levels. The Present Discussion Is A Comprehensive Summary Of Various Different Aspects Of "Medical Information Communication Technology", Especially Useful For The Audience Stratum Group Of Those Amateur Medical & Paramedical Staff With No Previous Work Experience Knowledge Of Computronics Applications. Outlining The, i. Administration Applications: Medical Records Etc, ii. Clinical Applications: Prospective Scope Of TeleMedicine Applicabilities Etc, iii. Other Applications: Efforts To Augment Improvement Of Medical Education, Medical Presentations, Medical Education And Research Etc. "Medical Transcription" & Related Recent Study Fields e.g. "Modern Pharmaceuticals", "Bio-Engineering", "Bio-Mechanics", "Bio-Technology" Etc., Along With Important Aspects Of Computers-General Considerations, Computer Ergonomics Assembled To Summarize The Awareness Regarding Basic Fundamentals Of Medical Computronics & Its Practically Successful Utilities.
Silva-Lopes, Victor W; Monteiro-Leal, Luiz H
2003-07-01
The development of new technology and the possibility of fast information delivery by either Internet or Intranet connections are changing education. Microanatomy education depends basically on the correct interpretation of microscopy images by students. Modern microscopes coupled to computers enable the presentation of these images in a digital form by creating image databases. However, the access to this new technology is restricted entirely to those living in cities and towns with an Information Technology (IT) infrastructure. This study describes the creation of a free Internet histology database composed by high-quality images and also presents an inexpensive way to supply it to a greater number of students through Internet/Intranet connections. By using state-of-the-art scientific instruments, we developed a Web page (http://www2.uerj.br/~micron/atlas/atlasenglish/index.htm) that, in association with a multimedia microscopy laboratory, intends to help in the reduction of the IT educational gap between developed and underdeveloped regions. Copyright 2003 Wiley-Liss, Inc.
Volume and Value of Big Healthcare Data
Dinov, Ivo D.
2016-01-01
Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. The Moore's and Kryder's laws of exponential increase of computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions like: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions. PMID:26998309
Solving lattice QCD systems of equations using mixed precision solvers on GPUs
NASA Astrophysics Data System (ADS)
Clark, M. A.; Babich, R.; Barros, K.; Brower, R. C.; Rebbi, C.
2010-09-01
Modern graphics hardware is designed for highly parallel numerical tasks and promises significant cost and performance benefits for many scientific applications. One such application is lattice quantum chromodynamics (lattice QCD), where the main computational challenge is to efficiently solve the discretized Dirac equation in the presence of an SU(3) gauge field. Using NVIDIA's CUDA platform we have implemented a Wilson-Dirac sparse matrix-vector product that performs at up to 40, 135 and 212 Gflops for double, single and half precision respectively on NVIDIA's GeForce GTX 280 GPU. We have developed a new mixed precision approach for Krylov solvers using reliable updates which allows for full double precision accuracy while using only single or half precision arithmetic for the bulk of the computation. The resulting BiCGstab and CG solvers run in excess of 100 Gflops and, in terms of iterations until convergence, perform better than the usual defect-correction approach for mixed precision.
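For readers unfamiliar with the general idea, the following is a minimal, hedged sketch of defect-correction-style mixed precision, kept in NumPy rather than CUDA and not reproducing the paper's reliable-updates Krylov solvers: the inner solve runs in single precision while residuals and the accumulated solution stay in double precision. All names here are illustrative.

```python
import numpy as np

def mixed_precision_refine(A, b, inner_solve, tol=1e-12, max_outer=50):
    """Defect correction: the bulk of the work is done in single precision by
    `inner_solve`, while residuals and the accumulated solution are kept in
    double precision."""
    x = np.zeros_like(b, dtype=np.float64)
    for _ in range(max_outer):
        r = b - A @ x                                   # double-precision residual
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        # Solve the correction equation approximately in float32.
        dx = inner_solve(A.astype(np.float32), r.astype(np.float32))
        x += dx.astype(np.float64)                      # accumulate in double precision
    return x

# Toy usage: a dense float32 solve stands in for a single-precision Krylov solver.
rng = np.random.default_rng(0)
A = np.diag(np.arange(1.0, 101.0)) + 0.01 * rng.standard_normal((100, 100))
b = rng.standard_normal(100)
x = mixed_precision_refine(A, b, lambda A32, r32: np.linalg.solve(A32, r32))
print("final residual norm:", np.linalg.norm(b - A @ x))
```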
Yaniv, Ziv; Lowekamp, Bradley C; Johnson, Hans J; Beare, Richard
2018-06-01
Modern scientific endeavors increasingly require team collaborations to construct and interpret complex computational workflows. This work describes an image-analysis environment that supports the use of computational tools that facilitate reproducible research and support scientists with varying levels of software development skills. The Jupyter notebook web application is the basis of an environment that enables flexible, well-documented, and reproducible workflows via literate programming. Image-analysis software development is made accessible to scientists with varying levels of programming experience via the use of the SimpleITK toolkit, a simplified interface to the Insight Segmentation and Registration Toolkit. Additional features of the development environment include user friendly data sharing using online data repositories and a testing framework that facilitates code maintenance. SimpleITK provides a large number of examples illustrating educational and research-oriented image analysis workflows for free download from GitHub under an Apache 2.0 license: github.com/InsightSoftwareConsortium/SimpleITK-Notebooks .
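As a flavor of the kind of notebook cell this environment supports, here is a minimal SimpleITK sketch; the file names are hypothetical placeholders and the filter choices are illustrative rather than taken from the notebooks themselves.

```python
import SimpleITK as sitk

# Read a volume, denoise it, and produce a simple Otsu-threshold segmentation.
image = sitk.ReadImage("input.nii.gz")                        # hypothetical input file
smoothed = sitk.SmoothingRecursiveGaussian(image, sigma=2.0)  # Gaussian denoising
# Otsu threshold: voxels below the threshold get label 0, voxels above get label 1.
segmentation = sitk.OtsuThreshold(smoothed, 0, 1)
sitk.WriteImage(segmentation, "segmentation.nii.gz")          # hypothetical output file
print(sitk.GetArrayViewFromImage(segmentation).sum(), "foreground voxels")
```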
The Moon in the Russian scientific-educational project: Kazan-GeoNa-2010
NASA Astrophysics Data System (ADS)
Gusev, A.; Kitiashvili, I.; Petrova, N.
Historically, the thousand-year-old city of Kazan and the two-hundred-year-old Kazan University have played the role of the scientific-organizational and cultural-educational center of the Volga region of Russia. For the further successful development of educational and scientific-educational activity in the Russian Federation and the Republic of Tatarstan, Kazan proposes a national project: the International Center of Science and Internet Technologies GeoNa (Geometry of Nature: wisdom, enthusiasm, pride, grandeur). GeoNa includes a modern complex of conference halls with up to 4 thousand seats, an Internet Technologies Center, a 3D planetarium devoted to the exploration of the Moon, PhysicsLand, an active museum of natural sciences, an oceanarium, a training complex "Spheres of Knowledge", and botanical and landscape oases. Center GeoNa will host conferences, congresses, fundamental scientific research on the Moon, scientific-educational events, presentations of international scientific programs on lunar research, modern lunar databases, exhibitions of high-tech equipment, and extensive cultural-educational, tourist, and cognitive programs. Center GeoNa will enable scientists and teachers of Russian universities to join the advanced achievements of science and information technologies and to establish scientific links with foreign colleagues in high-technology and educational projects with world space centers.
Advances in Modern Botnet Understanding and the Accurate Enumeration of Infected Hosts
ERIC Educational Resources Information Center
Nunnery, Christopher Edward
2011-01-01
Botnets remain a potent threat due to evolving modern architectures, inadequate remediation methods, and inaccurate measurement techniques. In response, this research exposes the architectures and operations of two advanced botnets, techniques to enumerate infected hosts, and pursues the scientific refinement of infected-host enumeration data by…
Load Balancing Scientific Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pearce, Olga Tkachyshyn
2014-12-01
The largest supercomputers have millions of independent processors, and concurrency levels are rapidly increasing. For ideal efficiency, developers of the simulations that run on these machines must ensure that computational work is evenly balanced among processors. Assigning work evenly is challenging because many large modern parallel codes simulate behavior of physical systems that evolve over time, and their workloads change over time. Furthermore, the cost of imbalanced load increases with scale because most large-scale scientific simulations today use a Single Program Multiple Data (SPMD) parallel programming model, and an increasing number of processors will wait for the slowest one at the synchronization points. To address load imbalance, many large-scale parallel applications use dynamic load balance algorithms to redistribute work evenly. The research objective of this dissertation is to develop methods to decide when and how to load balance the application, and to balance it effectively and affordably. We measure and evaluate the computational load of the application, and develop strategies to decide when and how to correct the imbalance. Depending on the simulation, a fast, local load balance algorithm may be suitable, or a more sophisticated and expensive algorithm may be required. We developed a model for comparison of load balance algorithms for a specific state of the simulation that enables the selection of a balancing algorithm that will minimize overall runtime.
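As a simple, hedged illustration of the kind of inexpensive rebalancing step such algorithms build on (not the dissertation's model itself), the sketch below greedily assigns measured task costs to the currently least-loaded processor.

```python
import heapq

def greedy_load_balance(task_costs, num_procs):
    """Longest-processing-time-first greedy assignment: repeatedly give the
    most expensive remaining task to the currently least-loaded processor."""
    heap = [(0.0, p) for p in range(num_procs)]   # (current load, processor id)
    heapq.heapify(heap)
    assignment = {p: [] for p in range(num_procs)}
    for task, cost in sorted(enumerate(task_costs), key=lambda t: -t[1]):
        load, p = heapq.heappop(heap)             # least-loaded processor so far
        assignment[p].append(task)
        heapq.heappush(heap, (load + cost, p))
    return assignment

# Toy usage: six measured task costs spread over three processors.
print(greedy_load_balance([5.0, 3.0, 2.0, 7.0, 1.0, 4.0], 3))
```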
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Mid-year report FY17 Q2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Pugmire, David; Rogers, David
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY17.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Pugmire, David; Rogers, David
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem. Mid-year report FY16 Q2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY15 Q4.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
Shaikhouni, Ammar; Elder, J Bradley
2012-11-01
At the turn of the twentieth century, the only computational device used in neurosurgical procedures was the brain of the surgeon. Today, most neurosurgical procedures rely at least in part on the use of a computer to help perform surgeries accurately and safely. The techniques that revolutionized neurosurgery were mostly developed after the 1950s. Just before that era, the transistor was invented in the late 1940s, and the integrated circuit was invented in the late 1950s. During this time, the first automated, programmable computational machines were introduced. The rapid progress in the field of neurosurgery not only occurred hand in hand with the development of modern computers, but one also can state that modern neurosurgery would not exist without computers. The focus of this article is the impact modern computers have had on the practice of neurosurgery. Neuroimaging, neuronavigation, and neuromodulation are examples of tools in the armamentarium of the modern neurosurgeon that owe each step in their evolution to progress made in computer technology. Advances in computer technology central to innovations in these fields are highlighted, with particular attention to neuroimaging. Developments over the last 10 years in areas of sensors and robotics that promise to transform the practice of neurosurgery further are discussed. Potential impacts of advances in computers related to neurosurgery in developing countries and underserved regions are also discussed. As this article illustrates, the computer, with its underlying and related technologies, is central to advances in neurosurgery over the last half century. Copyright © 2012 Elsevier Inc. All rights reserved.
Families made by science. Arnold Gesell and the technologies of modern child adoption.
Herman, E
2001-12-01
This essay considers the effort to transform child adoption into a modern scientific enterprise during the first half of the twentieth century via a case study of Arnold Gesell (1880-1961), a Yale developmentalist well known for his studies of child growth and the applied technologies that emerged from them: normative scales promising to measure and predict development. Scientific adoption was a central aspiration for many human scientists, helping professionals, and state regulators. They aimed to reduce the numerous hazards presumed to be inherent in adopting children, especially infants, who were not one's "own." By importing insights and techniques drawn from the world of science into the practical world of family formation, scientific adoption stood for kinship by design. This case study explores one point of intersection between the history of science and the history of social welfare and social policy, simultaneously illustrating the cultural progress and power of scientific authority and the numerous obstacles to its practical realization.
ITK: enabling reproducible research and open science
McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis
2014-01-01
Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46. PMID:24600387
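A minimal sketch of the regression-test style of reproducibility verification described above (illustrative only, not ITK's own test harness; the analysis step, random seed, and hashing convention are invented for the example):

```python
# Minimal sketch of a reproducibility regression test (hypothetical data and helpers).
import hashlib

import numpy as np


def analyze(image: np.ndarray) -> np.ndarray:
    """Stand-in for an image-analysis step (e.g., a smoothing filter)."""
    return np.clip(image, 0, 255).astype(np.uint8)


def digest(array: np.ndarray) -> str:
    """Hash the result so the baseline can be stored as a short string."""
    return hashlib.sha256(array.tobytes()).hexdigest()


def test_analysis_is_reproducible():
    rng = np.random.default_rng(seed=0)          # fixed seed -> deterministic input
    image = rng.integers(0, 256, size=(64, 64))  # synthetic test image
    baseline = digest(analyze(image))            # in practice, stored alongside the test
    assert digest(analyze(image)) == baseline    # rerunning must give identical output
```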
Presenting Numerical Modelling of Explosive Volcanic Eruption to a General Public
NASA Astrophysics Data System (ADS)
Demaria, C.; Todesco, M.; Neri, A.; Blasi, G.
2001-12-01
Numerical modeling of explosive volcanic eruptions has been widely applied over the last decades to study the dispersion of pyroclastic flows along a volcano's flanks and to evaluate their impact on urban areas. Results from these transient multi-phase and multi-component simulations are often reproduced in the form of computer animations, representing the spatial and temporal evolution of relevant flow variables (such as temperature or particle concentration). Despite being a sophisticated technical tool to analyze and share modeling results within the scientific community, these animations truly look like colorful cartoons showing an erupting volcano and are especially suited to being shown to a general public. Thanks to their particular appeal, and to the large interest usually aroused by exploding volcanoes, these animations have been presented several times on television and in magazines and are currently displayed in a permanent exhibition at the Vesuvius Observatory in Naples. This work represents an effort to produce an accompanying tool for these animations, capable of explaining to a large audience the scientific meaning of what can otherwise look like a graphical exercise. In research aimed at the study of dangerous, explosive volcanoes, improving the general understanding of these scientific results plays an important role as far as risk perception is concerned. An educated population has a better chance of adopting appropriate behavior, i.e. behavior that could lead, in the long run, to a reduction of the potential risk. In this sense, a correct dissemination of scientific results, while improving the confidence of the population in the scientific community, should belong to the strategies adopted to mitigate volcanic risk. Given the relevance of the long-term goal of such a dissemination experiment, this work represents an interdisciplinary effort, combining scientific expertise with specific competence from modern communication science and risk perception studies.
Combining Distributed and Shared Memory Models: Approach and Evolution of the Global Arrays Toolkit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nieplocha, Jarek; Harrison, Robert J.; Kumar, Mukul
2002-07-29
Both shared memory and distributed memory models have advantages and shortcomings. The shared memory model is much easier to use, but it ignores data locality/placement. Given the hierarchical nature of the memory subsystems in modern computers, this characteristic might have a negative impact on performance and scalability. Various techniques, such as code restructuring to increase data reuse and introducing blocking in data accesses, can address the problem and yield performance competitive with message passing [Singh], however at the cost of compromising ease of use. Distributed memory models such as message passing or one-sided communication offer performance and scalability but compromise ease of use. In this context, the message-passing model is sometimes referred to as "assembly programming for scientific computing". The Global Arrays toolkit [GA1, GA2] attempts to offer the best features of both models. It implements a shared-memory programming model in which data locality is managed explicitly by the programmer. This management is achieved by explicit calls to functions that transfer data between a global address space (a distributed array) and local storage. In this respect, the GA model has similarities to the distributed shared-memory models that provide an explicit acquire/release protocol. However, the GA model acknowledges that remote data is slower to access than local data and allows data locality to be explicitly specified and hence managed. The GA model exposes to the programmer the hierarchical memory of modern high-performance computer systems, and by recognizing the communication overhead for remote data transfer, it promotes data reuse and locality of reference. This paper describes the characteristics of the Global Arrays programming model and the capabilities of the toolkit, and discusses its evolution.
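The get/compute/put pattern at the heart of the GA model can be sketched as follows. This is a conceptual illustration only, not the actual Global Arrays API; the class and method names are invented, and the "global" array lives in a single process here rather than being distributed:

```python
# Conceptual sketch of the get/compute/put pattern described above.
# This is NOT the Global Arrays API; GlobalArray, get_block and put_block
# are invented names used only to illustrate explicit locality management.
import numpy as np


class GlobalArray:
    """Mock 'global address space' backed by one in-process NumPy array."""

    def __init__(self, shape):
        self._data = np.zeros(shape)

    def get_block(self, lo, hi):
        """Copy a patch of the global array into local storage (explicit transfer)."""
        return self._data[lo[0]:hi[0], lo[1]:hi[1]].copy()

    def put_block(self, lo, hi, block):
        """Write a locally computed patch back to the global array."""
        self._data[lo[0]:hi[0], lo[1]:hi[1]] = block


ga = GlobalArray((1000, 1000))
lo, hi = (0, 0), (100, 100)        # the patch this 'process' owns or reuses
local = ga.get_block(lo, hi)       # remote data is slower: fetch once...
local += 1.0                       # ...compute on the local copy...
ga.put_block(lo, hi, local)        # ...and write it back explicitly.
```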
NASA Astrophysics Data System (ADS)
Tucker, G. E.
1997-05-01
This NSF-supported program, emphasizing hands-on learning and observation with modern instruments, is described in its pilot phase, prior to being launched nationally. A group of 14-year-old students is using a small (21 cm) computer-controlled telescope and CCD camera to do: (1) a 'sky survey' of brighter celestial objects, finding, identifying, and learning about them, and accumulating a portfolio of images; (2) photometry of variable stars, reducing the data to get a light curve; and (3) a Web project in which they learn modern computer-based communication/dissemination skills by posting images and data to a Web site they are designing (http://www.javanet.com/ sky) and contributing data to archives (e.g. AAVSO) via the Internet. To attract more interest in astronomy and science in general and have a wider impact on the school and surrounding community, peer teaching is used as a pedagogical technique and families are encouraged to participate. Students teach topics such as astronomy, software and computers, the Internet, instrumentation, and observing to other students, parents and the community by means of daytime presentations of their results (images and data) and evening public viewing at the telescope, operating the equipment themselves. Students can contribute scientifically significant data and experience the 'discovery' aspect of science through observing projects where a measurement is made. Their 'informal education' activities also help improve the perception of science in general and astronomy in particular in society at large. This program could benefit from collaboration with astronomers wanting to organize geographically distributed observing campaigns coordinated over the Internet and willing to advise on promising observational programs for small telescopes in the context of current science.
[Hans Gross and the beginning of criminology on a scientific basis].
Bachhiesl, Christian
2007-01-01
Modern criminology--if one wants to consider it a separate scientific discipline at all--is usually perceived as being mainly influenced by the methods of the natural sciences, supplemented by components from the field of psychology, which, at least in some of its conceptions, tends to define itself as a natural science, too. If we take a look at the history of science, we will see that the development of criminology in this direction was not necessarily inevitable. The scientific work of the Austrian Hans Gross (1847-1915), one of the founding fathers of scientific criminology, serves as an example of the way in which the natural sciences and their exact methods became established in the methodological apparatus of modern criminology, although in practice his claim for the application of exact methods was all too often replaced by irrational and intuitive ways of working. Still, Hans Gross's fundamental decision in favor of the exact methods derived from the natural sciences is an important step towards a criminology that can be understood as a part of the natural sciences, largely superseding the methods of the cultural sciences and anthropological philosophy. This approach made the (criminal) human being an object of measurement and can result in the concept of man as a mere phenomenon of quantity. This is, on the one hand, ethically questionable; on the other hand, it made modern criminology more efficient and successful.
Elementary Cosmology: From Aristotle's Universe to the Big Bang and Beyond
NASA Astrophysics Data System (ADS)
Kolata, James J.
2015-11-01
Cosmology is the study of the origin, size, and evolution of the entire universe. Every culture has developed a cosmology, whether it be based on religious, philosophical, or scientific principles. In this book, the evolution of the scientific understanding of the Universe in Western tradition is traced from the early Greek philosophers to the most modern 21st century view. After a brief introduction to the concept of the scientific method, the first part of the book describes the way in which detailed observations of the Universe, first with the naked eye and later with increasingly complex modern instruments, ultimately led to the development of the ``Big Bang'' theory. The second part of the book traces the evolution of the Big Bang including the very recent observation that the expansion of the Universe is itself accelerating with time.
NASA Astrophysics Data System (ADS)
2015-10-01
Involving young researchers in the scientific process, and allowing them to gain scientific experience, are important issues for scientific development. The International Conference for Students and Young Scientists ''Modern Technique and Technologies'' is one of a number of scientific events held at National Research Tomsk Polytechnic University aimed at training and forming the scientific elite. In previous years the conference established itself as a serious scientific event at an international level, annually attracting about 400 students and young scientists from Russia and from countries near and far abroad. An important indicator of this scientific event is the large number of scientific areas covered, such as power engineering, heat power engineering, electronic devices for monitoring and diagnostics, instrumentation, materials and technologies of new generations, methods of research and diagnostics of materials, automatic control and system engineering, physical methods in science and engineering, design and artistic aspects of engineering, and social and humanitarian aspects of engineering. The main issues discussed at the conference by young researchers concern the analysis of contemporary problems, the application of new techniques and technologies, and the relationships between them. Over the years, the conference committee has gained a great deal of experience in organizing scientific meetings. All the necessary conditions are in place: the organizing staff includes employees of Tomsk Polytechnic University; the auditoriums are equipped with modern demonstration and office equipment; the lecturers include leading TPU professors; and the status of Tomsk Polytechnic University as a leading research university in Russia also plays an important role. All this allows collaboration with leading scientists from all around the world, who are invited annually to give lectures at the conference. The editorial board expresses gratitude to the Administration of Tomsk Polytechnic University (TPU Rector, Professor P.S. Chubik, and Vice Rector for Research and Innovation, Professor A.N. Dyachenko) for financial support of the conference. We also heartily thank the chairmen of the conference sections and the members of the organizing committee for their great, effective, creative work in organizing and developing the conference, as well as for their significant contribution to safeguarding and replenishing the intellectual potential of Russia.
Analyzing user-generated online content for drug discovery: development and use of MedCrawler.
Helfenstein, Andreas; Tammela, Päivi
2017-04-15
Ethnopharmacology, or the scientific validation of traditional medicine, is a respected starting point in drug discovery. Home remedies and traditional use of plants are still widespread, also in Western societies. Instead of perusing ancient pharmacopeias, we developed MedCrawler, which we used to analyze blog posts for mentions of home remedies and their applications. This method is free and accessible from the office computer. We developed MedCrawler, a data mining tool for analyzing user-generated blog posts, aiming to find modern 'traditional' medicine or home remedies. It searches user-generated blog posts and analyzes them for correlations between medically relevant terms. We also present examples and show that this method is capable of delivering both scientifically validated uses and less well documented applications, which might serve as a starting point for follow-up research. Source code is available on GitHub at https://github.com/a-hel/medcrawler. Contact: paivi.tammela@helsinki.fi. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
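The kind of term-correlation analysis described above can be illustrated with a toy co-occurrence count over blog posts. This is not MedCrawler's implementation (see the GitHub repository for that); the term lists and posts below are invented:

```python
# Toy co-occurrence count of medically relevant terms in blog posts
# (illustrative only; term lists and posts are invented, not MedCrawler's code).
from collections import Counter
from itertools import product

remedies = {"ginger", "honey", "chamomile"}
ailments = {"nausea", "cough", "insomnia"}

posts = [
    "Ginger tea really helped with my nausea during the trip.",
    "Honey and lemon for a stubborn cough, works every time.",
    "Chamomile before bed did nothing for my insomnia, sadly.",
]

pairs = Counter()
for post in posts:
    words = set(post.lower().replace(",", " ").replace(".", " ").split())
    for remedy, ailment in product(remedies & words, ailments & words):
        pairs[(remedy, ailment)] += 1           # remedy and ailment co-occur in this post

for (remedy, ailment), count in pairs.most_common():
    print(f"{remedy} ~ {ailment}: {count} post(s)")
```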
A Selective Overview of Variable Selection in High Dimensional Feature Space
Fan, Jianqing
2010-01-01
High dimensional statistical problems arise from diverse fields of scientific research and technological development. Variable selection plays a pivotal role in contemporary statistical learning and scientific discoveries. The traditional idea of best subset selection methods, which can be regarded as a specific form of penalized likelihood, is computationally too expensive for many modern statistical applications. Other forms of penalized likelihood methods have been successfully developed over the last decade to cope with high dimensionality. They have been widely applied for simultaneously selecting important variables and estimating their effects in high dimensional statistical inference. In this article, we present a brief account of the recent developments of theory, methods, and implementations for high dimensional variable selection. Questions of what limits of dimensionality such methods can handle, what the role of penalty functions is, and what the statistical properties are rapidly drive the advances of the field. The properties of non-concave penalized likelihood and its roles in high dimensional statistical modeling are emphasized. We also review some recent advances in ultra-high dimensional variable selection, with emphasis on independence screening and two-scale methods. PMID:21572976
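The penalized-likelihood formulation referred to above can be written in the standard form (generic notation, not reproduced from the article itself):

```latex
% Generic penalized likelihood for variable selection (standard notation, not
% taken from the article): negative log-likelihood plus a penalty p_\lambda
% applied to each coefficient, e.g. the L1 (lasso) penalty.
\hat{\beta} \;=\; \arg\min_{\beta \in \mathbb{R}^p}
  \left\{ -\,\ell_n(\beta) \;+\; \sum_{j=1}^{p} p_{\lambda}\!\left(|\beta_j|\right) \right\},
\qquad
p_{\lambda}(|\beta_j|) = \lambda\,|\beta_j| \ \text{(lasso)}.
```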
The challenge of cardiac modeling--interaction and integration.
Sideman, Samuel
2006-10-01
The goal of clinical cardiology is to obtain an integrated picture of the interacting parameters of muscle and vessel mechanics, blood circulation and myocardial perfusion, oxygen consumption and energy metabolism, and electrical activation and heart rate, thus relating to the true physiological and pathophysiological characteristics of the heart. Scientific insight into the cardiac physiology and performance is achieved by utilizing life sciences, for example, molecular biology, genetics and related intra- and intercellular phenomena, as well as the exact sciences, for example, mathematics, computer science, and related imaging and visualization techniques. The tools to achieve these goals are based on the intimate interactions between engineering science and medicine and the developments of modern, medically oriented technology. Most significant is the beneficiary effect of the globalization of science, the Internet, and the unprecedented international interaction and scientific cooperation in facing difficult multidisciplined challenges. This meeting aims to explore some important interactions in the cardiac system and relate to the integration of spatial and temporal interacting system parameters, so as to gain better insight into the structure and function of the cardiac system, thus leading to better therapeutic modalities.
Enabling a new Paradigm to Address Big Data and Open Science Challenges
NASA Astrophysics Data System (ADS)
Ramamurthy, Mohan; Fisher, Ward
2017-04-01
Data are not only the lifeblood of the geosciences but they have become the currency of the modern world in science and society. Rapid advances in computing, communications, and observational technologies — along with concomitant advances in high-resolution modeling, ensemble and coupled-systems predictions of the Earth system — are revolutionizing nearly every aspect of our field. Modern data volumes from high-resolution ensemble prediction/projection/simulation systems and next-generation remote-sensing systems like hyper-spectral satellite sensors and phased-array radars are staggering. For example, CMIP efforts alone will generate many petabytes of climate projection data for use in assessments of climate change. And NOAA's National Climatic Data Center projects that it will archive over 350 petabytes by 2030. For researchers and educators, this deluge and the increasing complexity of data bring challenges along with the opportunities for discovery and scientific breakthroughs. The potential for big data to transform the geosciences is enormous, but realizing the next frontier depends on effectively managing, analyzing, and exploiting these heterogeneous data sources, extracting knowledge and useful information from them in ways that were previously impossible, to enable discoveries and gain new insights. At the same time, there is a growing focus on the area of "Reproducibility or Replicability in Science" that has implications for Open Science. The advent of cloud computing has opened new avenues for addressing both big data and Open Science challenges and for accelerating scientific discoveries. However, successfully leveraging the enormous potential of cloud technologies will require data providers and the scientific communities to develop new paradigms to enable next-generation workflows and transform the conduct of science. Making data readily available is a necessary but not a sufficient condition. Data providers also need to give scientists an ecosystem that includes data, tools, workflows and other services needed to perform analytics, integration, interpretation, and synthesis - all in the same environment or platform. Instead of moving data to processing systems near users, as is the tradition, the cloud permits one to bring processing, computing, analysis and visualization to data - so-called data-proximate workbench capabilities, also known as server-side processing. In this talk, I will present the ongoing work at Unidata to facilitate a new paradigm for doing science by offering a suite of tools, resources, and platforms to leverage cloud technologies for addressing both big data and Open Science/reproducibility challenges. That work includes the development and deployment of new protocols for data access and server-side operations, Docker container images of key applications, JupyterHub Python notebook tools, and cloud-based analysis and visualization capability via the CloudIDV tool to enable reproducible workflows and effectively use the accessed data.
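One concrete flavor of the server-side, data-proximate access described above is remote subsetting over a protocol such as OPeNDAP. The sketch below uses xarray for that purpose; the catalog URL and variable name are hypothetical, and this is not Unidata's own workflow code:

```python
# Sketch of data-proximate access: only the requested subset crosses the network.
# The URL and variable name are hypothetical; this is not Unidata's own code.
import xarray as xr

URL = "https://example-tds.server/thredds/dodsC/grids/analysis.nc"  # hypothetical OPeNDAP endpoint

ds = xr.open_dataset(URL)                       # lazy open: no bulk download yet
subset = ds["air_temperature"].sel(             # subsetting happens on the remote side
    lat=slice(50, 20), lon=slice(230, 300)
).isel(time=-1)

print(subset.mean().values)                     # computation pulls only this slice
```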
On a Modern Philosophy of Evaluating Scientific Publications
NASA Astrophysics Data System (ADS)
Guz, A. N.; Rushchitsky, J. J.; Chernyshenko, I. S.
2005-10-01
Current approaches to the citation analysis of scientific publications are outlined. Science Citation Index, Impact Factor, Immediacy Index, and the selection procedure for Essential Science Indicators—a relatively new citation analysis tool—are described. The new citation evaluation tool has not yet been discussed adequately by mechanicians.
Generic and scientific constraints involving geoethics and geoeducation in planetary geosciences
NASA Astrophysics Data System (ADS)
Martínez-Frías, Jesús
2013-04-01
Geoscience education is a key factor in the academic, scientific and professional progress of any modern society. Geoethics is an interdisciplinary field, which involves Earth and Planetary Sciences as well as applied ethics, regarding the study of the abiotic world. These cross-cutting interactions, linking scientific, societal and cultural aspects, consider our planet, in its modern approach, as a system and as a model. This new perspective is extremely important in the context of geoeducation in planetary geosciences. In addition, Earth, our home planet, is the only planet in our solar system known to harbor life. This also makes it crucial to develop any scientific strategy and methodological technique (e.g. Raman spectroscopy) for searching for extraterrestrial life. In this context, it has recently been proposed [1-3] that the incorporation of geoethical and geodiversity issues into planetary geology and astrobiology studies would enrich their methodological and conceptual character (mainly but not only in relation to planetary protection). Modern geoscience education must take into account that, in order to understand the origin and evolution of our planet, we need to be aware that the Earth is open to space, and that the study of meteorites, asteroids, the Moon and Mars is also essential for this purpose (Earth analogs are also unique sites for defining planetary guidelines). Generic and scientific constraints involving geoethics and geoeducation should be incorporated into the teaching of all fundamental knowledge and skills for students and teachers. References: [1] Martinez-Frias, J. et al. (2009) 9th European Workshop on Astrobiology, EANA 09, 12-14 October 2009, Brussels, Belgium. [2] Martinez-Frias, J., et al. (2010) 38th COSPAR Scientific Assembly. Protecting the Lunar and Martian Environments for Scientific Research, Bremen, Germany, 18-25 July. [3] Walsh et al. (2012) 43rd Lunar and Planetary Science Conference, 1910.pdf
NASA Astrophysics Data System (ADS)
Gomez, R.; Gentle, J.
2015-12-01
Modern data pipelines and computational processes require that meticulous methodologies be applied in order to ensure that the source data, algorithms, and results are properly curated, managed and retained while remaining discoverable, accessible, and reproducible. Given the complexity of understanding the scientific problem domain being researched, combined with the overhead of learning to use advanced computing technologies, it becomes paramount that the next generation of scientists and researchers learn to embrace best practices. The Integrative Computational Education and Research Traineeship (ICERT) is a National Science Foundation (NSF) Research Experience for Undergraduates (REU) Site at the Texas Advanced Computing Center (TACC). During Summer 2015, two ICERT interns joined the 3DDY project. 3DDY converts geospatial datasets into file types that can take advantage of new formats, such as natural user interfaces, interactive visualization, and 3D printing. Mentored by TACC researchers for ten weeks, students with no previous background in computational science learned to use scripts to build the first prototype of the 3DDY application, and leveraged Wrangler, the newest high performance computing (HPC) resource at TACC. Test datasets for quadrangles in central Texas were used to assemble the 3DDY workflow and code. Test files were successfully converted into the stereolithography (STL) format, which is amenable to use with 3D printers. Test files and scripts were documented and shared using the Figshare site, while metadata for the 3DDY application was documented using OntoSoft. These efforts validated a straightforward set of workflows to transform geospatial data and established the first prototype version of 3DDY. Adding the data and software management procedures helped students realize a broader set of tangible results (e.g. Figshare entries), better document their progress and the final state of their work for the research group and community, and follow a clear set of formats while filling in details that might otherwise be lost; it also exposed the students to next-generation workflows and practices for digital scholarship and scientific inquiry in converting geospatial data into formats that are easy to reuse.
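The core conversion step in a workflow of this kind, turning a gridded elevation dataset into an STL surface, can be sketched in a few lines of plain Python. The toy grid, triangulation, and output file name are invented for illustration; this is not the 3DDY code itself:

```python
# Minimal heightmap-to-ASCII-STL writer (illustrative only; not the 3DDY code).
import numpy as np

z = np.array([[0.0, 1.0, 0.5],        # toy elevation grid (rows = y, cols = x)
              [0.2, 1.5, 0.7],
              [0.1, 0.9, 0.3]])


def write_ascii_stl(heights: np.ndarray, path: str) -> None:
    """Triangulate each grid cell into two facets and write an ASCII STL file."""
    rows, cols = heights.shape
    with open(path, "w") as f:
        f.write("solid heightmap\n")
        for i in range(rows - 1):
            for j in range(cols - 1):
                a = (j, i, heights[i, j])
                b = (j + 1, i, heights[i, j + 1])
                c = (j, i + 1, heights[i + 1, j])
                d = (j + 1, i + 1, heights[i + 1, j + 1])
                for tri in ((a, b, c), (b, d, c)):   # two triangles per cell
                    f.write("  facet normal 0 0 0\n    outer loop\n")
                    for x, y, height in tri:
                        f.write(f"      vertex {x} {y} {height}\n")
                    f.write("    endloop\n  endfacet\n")
        f.write("endsolid heightmap\n")


write_ascii_stl(z, "heightmap.stl")
```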
Trace: a high-throughput tomographic reconstruction engine for large-scale datasets.
Bicer, Tekin; Gürsoy, Doğa; Andrade, Vincent De; Kettimuthu, Rajkumar; Scullin, William; Carlo, Francesco De; Foster, Ian T
2017-01-01
Modern synchrotron light sources and detectors produce data at such scale and complexity that large-scale computation is required to unleash their full power. One of the widely used imaging techniques that generates data at tens of gigabytes per second is computed tomography (CT). Although CT experiments result in rapid data generation, the analysis and reconstruction of the collected data may require hours or even days of computation time with a medium-sized workstation, which hinders the scientific progress that relies on the results of analysis. We present Trace, a data-intensive computing engine that we have developed to enable high-performance implementation of iterative tomographic reconstruction algorithms for parallel computers. Trace provides fine-grained reconstruction of tomography datasets using both (thread-level) shared memory and (process-level) distributed memory parallelization. Trace utilizes a special data structure called replicated reconstruction object to maximize application performance. We also present the optimizations that we apply to the replicated reconstruction objects and evaluate them using tomography datasets collected at the Advanced Photon Source. Our experimental evaluations show that our optimizations and parallelization techniques can provide 158× speedup using 32 compute nodes (384 cores) over a single-core configuration and decrease the end-to-end processing time of a large sinogram (with 4501 × 1 × 22,400 dimensions) from 12.5 h to <5 min per iteration. The proposed tomographic reconstruction engine can efficiently process large-scale tomographic data using many compute nodes and minimize reconstruction times.
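The replicated-reconstruction-object idea, in which each worker accumulates into its own copy of the reconstruction and the copies are then reduced, can be sketched with Python's multiprocessing. The array sizes and the stand-in "back-projection" update are placeholders, not Trace's actual algorithm:

```python
# Sketch of the replicated-object pattern: each worker accumulates into its own
# copy of the reconstruction, and the copies are reduced (summed) afterwards.
# Sizes and the 'back-projection' update are placeholders, not Trace's algorithm.
import numpy as np
from multiprocessing import Pool

N = 64                                    # reconstruction is an N x N image
SINOGRAM = np.random.rand(360, N)         # toy sinogram: 360 projection rows


def partial_reconstruction(rows: np.ndarray) -> np.ndarray:
    """Accumulate this worker's share of projections into a private copy."""
    local = np.zeros((N, N))              # the replicated reconstruction object
    for row in rows:
        local += np.outer(row, row)       # stand-in for a back-projection update
    return local


if __name__ == "__main__":
    chunks = np.array_split(SINOGRAM, 4)  # process-level split of the projections
    with Pool(processes=4) as pool:
        replicas = pool.map(partial_reconstruction, chunks)
    reconstruction = np.sum(replicas, axis=0)   # reduce the replicas into one image
    print(reconstruction.shape)
```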
Bioethics: secular philosophy, Jewish law and modern medicine.
Steinberg, A
1989-07-01
The recent unprecedented expansion of scientific knowledge and the greater awareness and involvement of the public in medical matters, as well as additional causes described here, have impelled the development of a new form of bioethics over the past three decades. Jewish law and philosophy have always dealt with medical issues. In recent years, however, a voluminous body of literature devoted to Jewish medical ethics has developed. It covers all relevant issues and offers Jewish solutions to many complex problems arising from the recent scientific breakthroughs. This article analyzes the differences between Jewish and secular philosophies regarding fundamental moral theories relevant to modern medical ethics.
Scientific Services on the Cloud
NASA Astrophysics Data System (ADS)
Chapman, David; Joshi, Karuna P.; Yesha, Yelena; Halem, Milt; Yesha, Yaacov; Nguyen, Phuong
Scientific computing was one of the first-ever applications of parallel and distributed computation. To this day, scientific applications remain some of the most compute-intensive, and they have inspired the creation of petaflop compute infrastructure such as the Oak Ridge Jaguar and Los Alamos RoadRunner. Large dedicated hardware infrastructure has become both a blessing and a curse to the scientific community. Scientists are interested in cloud computing for much the same reasons as businesses and other professionals. The hardware is provided, maintained, and administered by a third party. Software abstraction and virtualization provide reliability and fault tolerance. Graduated fees allow for multi-scale prototyping and execution. Cloud computing resources are only a few clicks away, and they are by far the easiest high-performance distributed platform to gain access to. There may still be dedicated infrastructure for ultra-scale science, but the cloud can easily play a major part in the scientific computing initiative.
The Principles for Successful Scientific Data Management Revisited
NASA Astrophysics Data System (ADS)
Walker, R. J.; King, T. A.; Joy, S. P.
2005-12-01
It has been 23 years since the National Research Council's Committee on Data Management and Computation (CODMAC) published its famous list of principles for successful scientific data management that have provided the framework for modern space science data management. CODMAC outlined seven principles: 1. Scientific Involvement in all aspects of space science missions. 2. Scientific Oversight of all scientific data-management activities. 3. Data Availability - Validated data should be made available to the scientific community in a timely manner. They should include appropriate ancillary data, and complete documentation. 4. Facilities - A proper balance between cost and scientific productivity should be maintained. 5. Software - Transportable well documented software should be available to process and analyze the data. 6. Scientific Data Storage - The data should be preserved in retrievable form. 7. Data System Funding - Adequate data funding should be made available at the outset of missions and protected from overruns. In this paper we will review the lessons learned in trying to apply these principles to space derived data. The Planetary Data System created the concept of data curation to carry out the CODMAC principles. Data curators are scientists and technologists who work directly with the mission scientists to create data products. The efficient application of the CODMAC principles requires that data curators and the mission team start early in a mission to plan for data access and archiving. To build the data products the planetary discipline adopted data access and documentation standards and has adhered to them. The data curators and mission team work together to produce data products and make them available. However even with early planning and agreement on standards the needs of the science community frequently far exceed the available resources. This is especially true for smaller principal investigator run missions. We will argue that one way to make data systems for small missions more effective is for the data curators to provide software tools to help develop the mission data system.
Building bridges between Ayurveda and Modern Science
Rastogi, Sanjeev
2010-01-01
The recent decade has witnessed many landmark observations, which have added to the scientific credentials of Ayurveda. It is, however, believed that instead of a retrospective approach of looking into Ayurveda through scientific reappraisals, a prospective approach, through a primary understanding of Ayurveda followed by a search for scientific linkages, would be more appealing. This article presents a simplified yet scientific decoding of the core concepts of Ayurveda that form the framework of this ancient science of health. PMID:20532097
Center for Technology for Advanced Scientific Component Software (TASCS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kostadin, Damevski
A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.
NASA Astrophysics Data System (ADS)
Aiftimiei, D. C.; Antonacci, M.; Bagnasco, S.; Boccali, T.; Bucchi, R.; Caballer, M.; Costantini, A.; Donvito, G.; Gaido, L.; Italiano, A.; Michelotto, D.; Panella, M.; Salomoni, D.; Vallero, S.
2017-10-01
One of the challenges a scientific computing center has to face is to keep delivering well-consolidated computational frameworks (i.e. the batch computing farm), while conforming to modern computing paradigms. The aim is to ease system administration at all levels (from hardware to applications) and to provide a smooth end-user experience. Within the INDIGO-DataCloud project, we adopt two different approaches to implement a PaaS-level, on-demand Batch Farm Service based on HTCondor and Mesos. In the first approach, described in this paper, the various HTCondor daemons are packaged inside pre-configured Docker images and deployed as Long Running Services through Marathon, profiting from its health checks and failover capabilities. In the second approach, we are going to implement an ad-hoc HTCondor framework for Mesos. Container-to-container communication and isolation have been addressed by exploring a solution based on overlay networks (using the Calico Project). Finally, we have studied the possibility of deploying an HTCondor cluster that spans different sites, exploiting the Condor Connection Broker component, which allows communication across a private network boundary or firewall, as in the case of multi-site deployments. In this paper, we describe and motivate our implementation choices and show the results of the first tests performed.
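Deploying a pre-configured HTCondor daemon image as a Marathon Long Running Service amounts to posting an application definition to Marathon's REST API. The sketch below is hypothetical: the Marathon endpoint, image name, and health-check command are invented, and the field names follow the usual Marathon app-definition schema rather than the project's actual configuration:

```python
# Sketch: registering an HTCondor worker container with Marathon as a
# Long Running Service. The Marathon URL, image name and health check are
# hypothetical; field names follow the usual Marathon app-definition schema.
import requests

MARATHON = "http://marathon.example.org:8080"   # hypothetical Marathon endpoint

app = {
    "id": "/htcondor/startd",
    "instances": 10,
    "cpus": 2.0,
    "mem": 4096,
    "container": {
        "type": "DOCKER",
        "docker": {"image": "example/htcondor-startd:latest"},  # pre-configured daemon image
    },
    "healthChecks": [
        {"protocol": "COMMAND",
         "command": {"value": "condor_who > /dev/null"}}        # restart unhealthy workers
    ],
}

response = requests.post(f"{MARATHON}/v2/apps", json=app, timeout=30)
response.raise_for_status()
print(response.json().get("id"))
```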
Computational Science: A Research Methodology for the 21st Century
NASA Astrophysics Data System (ADS)
Orbach, Raymond L.
2004-03-01
Computational simulation - a means of scientific discovery that employs computer systems to simulate a physical system according to laws derived from theory and experiment - has attained peer status with theory and experiment. Important advances in basic science are accomplished by a new "sociology" for ultrascale scientific computing capability (USSCC), a fusion of sustained advances in scientific models, mathematical algorithms, computer architecture, and scientific software engineering. Expansion of current capabilities by factors of 100 - 1000 opens up new vistas for scientific discovery: long-term climatic variability and change, macroscopic material design from correlated behavior at the nanoscale, design and optimization of magnetic confinement fusion reactors, strong interactions on a computational lattice through quantum chromodynamics, and stellar explosions and element production. The "virtual prototype," made possible by this expansion, can markedly reduce time-to-market for industrial applications such as jet engines and safer, more fuel-efficient, cleaner cars. In order to develop USSCC, the National Energy Research Scientific Computing Center (NERSC) announced the competition "Innovative and Novel Computational Impact on Theory and Experiment" (INCITE), with no requirement for current DOE sponsorship. Fifty-nine proposals for grand challenge scientific problems were submitted for a small number of awards. The successful grants, and their preliminary progress, will be described.
Anger and its control in Graeco-Roman and modern psychology.
Schimmel, S
1979-11-01
Modern psychologists have studied the phenomena of anger and hostility with diverse methodologies and from a variety of theoretical orientations. The close relationships between anger and aggression, psychosomatic disorder and personal unhappiness, make the understanding and control of anger an important individual and social goal. For all of its sophistication and accomplishment, however, most of the modern research demonstrates, to its disadvantage, a lack of historical perspective with respect to the analysis and treatment of anger, whether normal or pathological. This attitude has deprived psychology of a rich source of empirical observations, intriguing, testable hypotheses, and ingenious techniques of treatment. Of the literature that has been neglected, the analyses of the emotion of anger in the writings of Greek and Roman moral philosophers, particularly Aristotle (4th century B.C.), Seneca (1st century A.D.) and Plutarch (early 2nd century A.D.) are of particular interest. Although modern analyses and methods of treatment are in some ways more refined and more quantitatively precise, and are often subjected to validation and modification by empirical-experimental tests, scientific psychology has, to date, contributed relatively little to the understanding and control of anger that is novel except for research on its physiological dimensions. We can still benefit from the insight, prescriptions and procedures of the classicists, who in some respects offer more powerful methods of control than the most recently published works. Naturally, the modern psychotherapist or behavior therapist can and must go beyond the ancients, as is inherent in all scientific and intellectual progress, but there are no scientific or rational grounds for ignoring them as has been done for 75 years.
Rübel, Oliver; Dougherty, Max; Prabhat; Denes, Peter; Conant, David; Chang, Edward F.; Bouchard, Kristofer
2016-01-01
Neuroscience continues to experience a tremendous growth in data; in terms of the volume and variety of data, the velocity at which data is acquired, and in turn the veracity of data. These challenges are a serious impediment to sharing of data, analyses, and tools within and across labs. Here, we introduce BRAINformat, a novel data standardization framework for the design and management of scientific data formats. The BRAINformat library defines application-independent design concepts and modules that together create a general framework for standardization of scientific data. We describe the formal specification of scientific data standards, which facilitates sharing and verification of data and formats. We introduce the concept of Managed Objects, enabling semantic components of data formats to be specified as self-contained units, supporting modular and reusable design of data format components and file storage. We also introduce the novel concept of Relationship Attributes for modeling and use of semantic relationships between data objects. Based on these concepts we demonstrate the application of our framework to design and implement a standard format for electrophysiology data and show how data standardization and relationship-modeling facilitate data analysis and sharing. The format uses HDF5, enabling portable, scalable, and self-describing data storage and integration with modern high-performance computing for data-driven discovery. The BRAINformat library is open source, easy-to-use, and provides detailed user and developer documentation and is freely available at: https://bitbucket.org/oruebel/brainformat. PMID:27867355
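The managed-object and relationship-attribute ideas can be illustrated with plain HDF5 via h5py. The group names and the attribute convention below are invented for the example; they are not the BRAINformat specification itself:

```python
# Illustration of self-describing HDF5 storage with an attribute that models a
# relationship between two objects. Names and conventions are invented here;
# they are not the BRAINformat specification itself.
import h5py
import numpy as np

with h5py.File("session.h5", "w") as f:
    raw = f.create_group("ephys_raw")
    raw.create_dataset("voltage", data=np.random.randn(4, 1000))   # 4 toy channels
    raw.attrs["sampling_rate_hz"] = 30000.0

    processed = f.create_group("ephys_filtered")
    processed.create_dataset("voltage", data=np.random.randn(4, 1000))
    # 'Relationship attribute': record that this object was derived from ephys_raw.
    processed.attrs["derived_from"] = raw.name

with h5py.File("session.h5", "r") as f:
    print(f["ephys_filtered"].attrs["derived_from"])   # -> /ephys_raw
```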
Creativity, Scientific Practice, and Knowledge Production
ERIC Educational Resources Information Center
Fryer, Marilyn
2010-01-01
In this interesting article, Hisham Ghassib (2010) describes the transformation of science from its craft status in a pre-modern era to the major knowledge industry it is today. He then compares the production of scientific knowledge with industrial production, but makes the important distinction between the process of developing scientific…
A Brief Comment on the Surge of Modern Scientific Knowledge
ERIC Educational Resources Information Center
Freeman, Joan
2010-01-01
This article presents the author's response to Hisham B. Ghassib's article entitled "Where Does Creativity Fit into a Productivist Industrial Model of Knowledge Production?" Ghassib (2010) presents three intriguing and novel ideas which are worth anyone's attention. Firstly, that the constantly increasing amount of scientific knowledge can be…
Metaphoric Images from Abstract Concepts.
ERIC Educational Resources Information Center
Vizmuller-Zocco, Jana
1992-01-01
Discusses children's use of metaphors to create meaning, using as an example the pragmatic and "scientific" ways in which preschool children explain thunder and lightning to themselves. Argues that children are being shortchanged by modern scientific notions of abstractness and that they should be encouraged to create their own explanations of…
NASA Technical Reports Server (NTRS)
Bridgman, William T.; Shirah, Greg W.; Mitchell, Horace G.
2008-01-01
Today, scientific data and models can combine with modern animation tools to produce compelling visualizations to inform and educate. The Scientific Visualization Studio at Goddard Space Flight Center merges these techniques from the very different worlds of entertainment and science to enable scientists and the general public to 'see the unseeable' in new ways.
The Galileo Legend as Scientific Folklore.
ERIC Educational Resources Information Center
Lessl, Thomas M.
1999-01-01
Examines the various ways in which the legend of Galileo's persecution by the Roman Catholic Church diverges from scholarly readings of the Galileo affair. Finds five distinct themes of scientific ideology in the 40 accounts examined. Assesses the part that folklore plays in building and sustaining a professional ideology for the modern scientific…
Lydia Shattuck: "A Streak of the Modern."
ERIC Educational Resources Information Center
Shmurak, Carole B.; Handler, Bonnie S.
1991-01-01
Lydia Shattuck was responsible for the excellence in science instruction at Mount Holyoke Seminary. Shattuck graduated in 1851 and remained there as a faculty member, specializing in chemistry and botany. One of the first women to join scientific societies, she helped enlarge the sphere of women engaged in scientific research. (SM)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Geveci, Berk; Maynard, Robert
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. The XVis project brought together collaborators from predominant DOE projects for visualization on accelerators and combined their respective features into a new visualization toolkit called VTK-m.
InteractiveROSETTA: a graphical user interface for the PyRosetta protein modeling suite.
Schenkelberg, Christian D; Bystroff, Christopher
2015-12-15
Modern biotechnical research is becoming increasingly reliant on computational structural modeling programs to develop novel solutions to scientific questions. Rosetta is one such protein modeling suite that has already demonstrated wide applicability to a number of diverse research projects. Unfortunately, Rosetta is largely a command-line-driven software package which restricts its use among non-computational researchers. Some graphical interfaces for Rosetta exist, but typically are not as sophisticated as commercial software. Here, we present InteractiveROSETTA, a graphical interface for the PyRosetta framework that presents easy-to-use controls for several of the most widely used Rosetta protocols alongside a sophisticated selection system utilizing PyMOL as a visualizer. InteractiveROSETTA is also capable of interacting with remote Rosetta servers, facilitating sophisticated protocols that are not accessible in PyRosetta or which require greater computational resources. InteractiveROSETTA is freely available at https://github.com/schenc3/InteractiveROSETTA/releases and relies upon a separate download of PyRosetta which is available at http://www.pyrosetta.org after obtaining a license (free for academic use). © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
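A minimal sketch of the kind of PyRosetta script that a graphical front end like InteractiveROSETTA wraps, assuming a licensed PyRosetta installation; the input PDB path is hypothetical, and exact function names may vary between PyRosetta releases:

```python
# Minimal sketch of the kind of PyRosetta workflow that InteractiveROSETTA wraps
# in a GUI: load a structure and score it. The PDB path is hypothetical, and
# function names may differ slightly between PyRosetta releases.
import pyrosetta

pyrosetta.init("-mute all")                         # start Rosetta quietly
pose = pyrosetta.pose_from_pdb("input.pdb")         # hypothetical input structure
scorefxn = pyrosetta.create_score_function("ref2015")
print("total score:", scorefxn(pose))               # full-atom energy of the pose
```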
Clinical chemistry through Clinical Chemistry: a journal timeline.
Rej, Robert
2004-12-01
The establishment of the modern discipline of clinical chemistry was concurrent with the foundation of the journal Clinical Chemistry and that of the American Association for Clinical Chemistry in the late 1940s and early 1950s. To mark the 50th volume of this Journal, I chronicle and highlight scientific milestones, and those within the discipline, as documented in the pages of Clinical Chemistry. Amazing progress has been made in the field of laboratory diagnostics over these five decades, in many cases paralleling, as well as being bolstered by, the rapid pace in the development of computer technologies. Specific areas of laboratory medicine particularly well represented in Clinical Chemistry include lipids, endocrinology, protein markers, quality of laboratory measurements, molecular diagnostics, and general advances in methodology and instrumentation.
Applied mediation analyses: a review and tutorial.
Lange, Theis; Hansen, Kim Wadt; Sørensen, Rikke; Galatius, Søren
2017-01-01
In recent years, mediation analysis has emerged as a powerful tool to disentangle causal pathways from an exposure/treatment to clinically relevant outcomes. Mediation analysis has been applied in scientific fields as diverse as labour market relations and randomized clinical trials of heart disease treatments. In parallel to these applications, the underlying mathematical theory and computer tools have been refined. This combined review and tutorial will introduce the reader to modern mediation analysis including: the mathematical framework; required assumptions; and software implementation in the R package medflex. All results are illustrated using a recent study on the causal pathways stemming from the early invasive treatment of acute coronary syndrome, for which the rich Danish population registers allow us to follow patients' medication use and more after being discharged from hospital.
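The effect decomposition underlying the counterfactual framework mentioned above can be stated compactly in standard natural-effects notation (generic notation, not specific to the cited study):

```latex
% Standard natural-effects decomposition (counterfactual notation), with A the
% exposure, M the mediator, and Y(a, M(a')) the outcome under exposure a and
% the mediator value it would take under exposure a':
\text{TE} = E\!\left[Y(1, M(1))\right] - E\!\left[Y(0, M(0))\right]
  = \underbrace{E\!\left[Y(1, M(0))\right] - E\!\left[Y(0, M(0))\right]}_{\text{natural direct effect}}
  + \underbrace{E\!\left[Y(1, M(1))\right] - E\!\left[Y(1, M(0))\right]}_{\text{natural indirect effect}}.
```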
[Visual hygiene in LED lighting. Modern scientific imaginations].
Deynego, V N; Kaptsov, V A
2014-01-01
Classic and modern paradigms of the perception of light and its impact on human health are considered. It is proposed to consider the perception of light as a complex, self-organizing, synergistic system that compresses information in the process of its sequencing. This made it possible to develop a set of interrelated measures, which may become the basis for modern hygiene, and to determine requirements for an LED lamp with a biologically adequate light spectrum, for which patents have been obtained in Russia, Europe and the USA.
Speculative Truth - Henry Cavendish, Natural Philosophy, and the Rise of Modern Theoretical Science
NASA Astrophysics Data System (ADS)
McCormmach, Russell
2004-03-01
With a never-before-published paper by Lord Henry Cavendish, as well as a biography of him, this book offers a fascinating discourse on the rise of scientific attitudes and ways of knowing. A pioneering British physicist in the late 18th and early 19th centuries, Cavendish was widely considered to be the first full-time scientist in the modern sense. Through the lens of this unique thinker and writer, this book is about the birth of modern science.
78 FR 41046 - Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-09
... Services Administration, notice is hereby given that the Advanced Scientific Computing Advisory Committee will be renewed for a two-year period beginning on July 1, 2013. The Committee will provide advice to the Director, Office of Science (DOE), on the Advanced Scientific Computing Research Program managed...
Science communication as political communication
Scheufele, Dietram A.
2014-01-01
Scientific debates in modern societies often blur the lines between the science that is being debated and the political, moral, and legal implications that come with its societal applications. This manuscript traces the origins of this phenomenon to professional norms within the scientific discipline and to the nature and complexities of modern science and offers an expanded model of science communication that takes into account the political contexts in which science communication takes place. In a second step, it explores what we know from empirical work in political communication, public opinion research, and communication research about the dynamics that determine how issues are debated and attitudes are formed in political environments. Finally, it discusses how and why it will be increasingly important for science communicators to draw from these different literatures to ensure that the voice of the scientific community is heard in the broader societal debates surrounding science. PMID:25225389
Knowledge in motion: The cultural politics of modern science translations in Arabic.
Elshakry, Marwa S
2008-12-01
This essay examines the problem of the global circulation of modern scientific knowledge through science translations in modern Arabic. In the commercial centers of the late Ottoman Empire, emerging transnational networks lay behind the development of new communities of knowledge, many of which sought to break with old linguistic and literary norms to redefine the basis of their authority. Far from acting as neutral purveyors of "universal truths," scientific translations thus served as key instruments in this ongoing process of sociopolitical and epistemological transformation and mediation. Fierce debates over translators' linguistic strategies and choices involved deliberations over the character of language and the nature of "science" itself. They were also crucially shaped by such geopolitical factors as the rise of European imperialism and anticolonial nationalism in the region. The essay concludes by arguing that greater attention is needed to the local factors involved in the translation of scientific concepts across borders.
Plagiarism Detection by Online Solutions.
Masic, Izet; Begic, Edin; Dobraca, Amra
2017-01-01
The problem of plagiarism is one of the burning issues of the modern scientific world, and its detection is a problem that editorial boards encounter in their daily work. Software solutions offer a good means of detecting plagiarism. The issue will only become more widely discussed, especially with the development of standard measures that rank the work of individual authors. Investment in education, and in teaching young research personnel the importance of scientific research with particular attention to ethical behavior, is becoming an imperative for academic staff. Editors must also invest additional effort in building and properly guiding their teams of reviewers, because, despite the software solutions, reviewers remain the best weapon in the fight against plagiarism. The peer-review process should be a key to the successful operation of every journal.
Prokop, O
1975-01-01
In a paper presented on the occasion of the 5th Congress of Gynecology in the GDR, the author discusses in detail the phenomenon of modern occultism and quackery. In particular, he examines so-called parapsychology and concludes that it is a field without any scientific value; the conduct of parapsychologists should therefore not be accepted without a resolute reply.
On Modern Cosmology and Its Place in Science Education
ERIC Educational Resources Information Center
Kragh, Helge
2011-01-01
Cosmology in its current meaning of the science of the universe is a topic that attracts as much popular as scientific interest. This paper argues that modern cosmology and its philosophical aspects should have a prominent place in science education. In the context of science teaching a partly historical approach is recommended, in particular an…
NASA Astrophysics Data System (ADS)
Tang, William M., Dr.
2006-01-01
The second annual Scientific Discovery through Advanced Computing (SciDAC) Conference was held from June 25-29, 2006 at the new Hyatt Regency Hotel in Denver, Colorado. This conference showcased outstanding SciDAC-sponsored computational science results achieved during the past year across many scientific domains, with an emphasis on science at scale. Exciting computational science that has been accomplished outside of the SciDAC program both nationally and internationally was also featured to help foster communication between SciDAC computational scientists and those funded by other agencies. This was illustrated by many compelling examples of how domain scientists collaborated productively with applied mathematicians and computer scientists to effectively take advantage of terascale computers (capable of performing trillions of calculations per second) not only to accelerate progress in scientific discovery in a variety of fields but also to show great promise for being able to utilize the exciting petascale capabilities in the near future. The SciDAC program was originally conceived as an interdisciplinary computational science program based on the guiding principle that strong collaborative alliances between domain scientists, applied mathematicians, and computer scientists are vital to accelerated progress and associated discovery on the world's most challenging scientific problems. Associated verification and validation are essential in this successful program, which was funded by the US Department of Energy Office of Science (DOE OS) five years ago. As is made clear in many of the papers in these proceedings, SciDAC has fundamentally changed the way that computational science is now carried out in response to the exciting challenge of making the best use of the rapid progress in the emergence of more and more powerful computational platforms. In this regard, Dr. Raymond Orbach, Energy Undersecretary for Science at the DOE and Director of the OS has stated: `SciDAC has strengthened the role of high-end computing in furthering science. It is defining whole new fields for discovery.' (SciDAC Review, Spring 2006, p8). Application domains within the SciDAC 2006 conference agenda encompassed a broad range of science including: (i) the DOE core mission of energy research involving combustion studies relevant to fuel efficiency and pollution issues faced today and magnetic fusion investigations impacting prospects for future energy sources; (ii) fundamental explorations into the building blocks of matter, ranging from quantum chromodynamics - the basic theory that describes how quarks make up the protons and neutrons of all matter - to the design of modern high-energy accelerators; (iii) the formidable challenges of predicting and controlling the behavior of molecules in quantum chemistry and the complex biomolecules determining the evolution of biological systems; (iv) studies of exploding stars for insights into the nature of the universe; and (v) integrated climate modeling to enable realistic analysis of earth's changing climate. Associated research has made it quite clear that advanced computation is often the only means by which timely progress is feasible when dealing with these complex, multi-component physical, chemical, and biological systems operating over huge ranges of temporal and spatial scales. 
Working with the domain scientists, applied mathematicians and computer scientists have continued to develop the discretizations of the underlying equations and the complementary algorithms to enable improvements in solutions on modern parallel computing platforms as they evolve from the terascale toward the petascale regime. Moreover, the associated tremendous growth of data generated from the terabyte to the petabyte range demands not only the advanced data analysis and visualization methods to harvest the scientific information but also the development of efficient workflow strategies which can deal with the data input/output, management, movement, and storage challenges. If scientific discovery is expected to keep apace with the continuing progression from tera- to petascale platforms, the vital alliance between domain scientists, applied mathematicians, and computer scientists will be even more crucial. During the SciDAC 2006 Conference, some of the future challenges and opportunities in interdisciplinary computational science were emphasized in the Advanced Architectures Panel and by Dr. Victor Reis, Senior Advisor to the Secretary of Energy, who gave a featured presentation on `Simulation, Computation, and the Global Nuclear Energy Partnership.' Overall, the conference provided an excellent opportunity to highlight the rising importance of computational science in the scientific enterprise and to motivate future investment in this area. As Michael Strayer, SciDAC Program Director, has noted: `While SciDAC may have started out as a specific program, Scientific Discovery through Advanced Computing has become a powerful concept for addressing some of the biggest challenges facing our nation and our world.' Looking forward to next year, the SciDAC 2007 Conference will be held from June 24-28 at the Westin Copley Plaza in Boston, Massachusetts. Chairman: David Keyes, Columbia University. The Organizing Committee for the SciDAC 2006 Conference would like to acknowledge the individuals whose talents and efforts were essential to the success of the meeting. Special thanks go to Betsy Riley for her leadership in building the infrastructure support for the conference, for identifying and then obtaining contributions from our corporate sponsors, for coordinating all media communications, and for her efforts in organizing and preparing the conference proceedings for publication; to Tim Jones for handling the hotel scouting, subcontracts, and exhibits and stage production; to Angela Harris for handling supplies, shipping, and tracking, poster sessions set-up, and for her efforts in coordinating and scheduling the promotional activities that took place during the conference; to John Bui and John Smith for their superb wireless networking and A/V set-up and support; to Cindy Latham for Web site design, graphic design, and quality control of proceedings submissions; and to Pamelia Nixon-Hartje of Ambassador for budget and quality control of catering. We are grateful for the highly professional dedicated efforts of all of these individuals, who were the cornerstones of the SciDAC 2006 Conference. Thanks also go to Angela Beach of the ORNL Conference Center for her efforts in executing the contracts with the hotel, Carolyn James of Colorado State for on-site registration supervision, Lora Wolfe and Brittany Hagen for administrative support at ORNL, and Dami Rich and Andrew Sproles for graphic design and production. 
We are also most grateful to the Oak Ridge National Laboratory, especially Jeff Nichols, and to our corporate sponsors, Data Direct Networks, Cray, IBM, SGI, and Institute of Physics Publishing for their support. We especially express our gratitude to the featured speakers, invited oral speakers, invited poster presenters, session chairs, and advanced architecture panelists and chair for their excellent contributions on behalf of SciDAC 2006. We would like to express our deep appreciation to Lali Chatterjee, Graham Douglas, Margaret Smith, and the production team of Institute of Physics Publishing, who worked tirelessly to publish the final conference proceedings in a timely manner. Finally, heartfelt thanks are extended to Michael Strayer, Associate Director for OASCR and SciDAC Director, and to the DOE program managers associated with SciDAC for their continuing enthusiasm and strong support for the annual SciDAC Conferences as a special venue to showcase the exciting scientific discovery achievements enabled by the interdisciplinary collaborations championed by the SciDAC program.
Theoretical and technological building blocks for an innovation accelerator
NASA Astrophysics Data System (ADS)
van Harmelen, F.; Kampis, G.; Börner, K.; van den Besselaar, P.; Schultes, E.; Goble, C.; Groth, P.; Mons, B.; Anderson, S.; Decker, S.; Hayes, C.; Buecheler, T.; Helbing, D.
2012-11-01
Modern science is a main driver of technological innovation. The efficiency of the scientific system is of key importance to ensure the competitiveness of a nation or region. However, the scientific system that we use today was devised centuries ago and is inadequate for our current ICT-based society: the peer review system encourages conservatism, journal publications are monolithic and slow, data is often not available to other scientists, and the independent validation of results is limited. The resulting scientific process is hence slow and sloppy. Building on the Innovation Accelerator paper by Helbing and Balietti [1], this paper takes the initial global vision and reviews the theoretical and technological building blocks that can be used for implementing an innovation (first and foremost, a science) accelerator platform driven by re-imagining the science system. The envisioned platform would rest on four pillars: (i) redesign the incentive scheme to reduce behavior such as conservatism, herding and hyping; (ii) advance scientific publications by breaking up the monolithic paper unit and introducing other building blocks such as data, tools, experiment workflows and resources; (iii) use machine-readable semantics for publications, debate structures, provenance etc. in order to include the computer as a partner in the scientific process; and (iv) build an online platform for collaboration, including a network of trust and reputation among the different types of stakeholders in the scientific system: scientists, educators, funding agencies, policy makers, students and industrial innovators, among others. Any such improvements to the scientific system must support the entire scientific process (unlike current tools that chop it up into disconnected pieces), must facilitate and encourage collaboration and interdisciplinarity (again unlike current tools), must facilitate the inclusion of intelligent computing in the scientific process, and must accommodate not only the core scientific process but also other stakeholders such as science policy makers, industrial innovators, and the general public. We first describe the current state of the scientific system together with up to a dozen new key initiatives, including an analysis of the role of science as an innovation accelerator. Our brief survey shows that many separate ideas, concepts and diverse stand-alone demonstrator systems exist for different components of the ecosystem, with many parts still unexplored and overall integration lacking. By analyzing a matrix of stakeholders vs. functionalities, we identify the required innovations. We (non-exhaustively) discuss a few of them: publications that are meaningful to machines, innovative reviewing processes, data publication, workflow archiving and reuse, alternative impact metrics, tools for the detection of trends, community formation and emergence, as well as modular publications, citation objects and debate graphs. To summarize, the core idea behind the Innovation Accelerator is to develop new incentive models, rules, and interaction mechanisms to stimulate true innovation, revolutionizing the way in which we create knowledge and disseminate information.
Whole earth modeling: developing and disseminating scientific software for computational geophysics.
NASA Astrophysics Data System (ADS)
Kellogg, L. H.
2016-12-01
Historically, a great deal of specialized scientific software for modeling and data analysis has been developed by individual researchers or small groups of scientists working on their own specific research problems. As the magnitude of available data and computer power has increased, so has the complexity of scientific problems addressed by computational methods, creating both a need to sustain existing scientific software and a need to expand its development to take advantage of new algorithms, new software approaches, and new computational hardware. To that end, communities like the Computational Infrastructure for Geodynamics (CIG) have been established to support the use of best practices in scientific computing for solid earth geophysics research and teaching. Working as a scientific community enables computational geophysicists to take advantage of technological developments, improve the accuracy and performance of software, build on prior software development, and collaborate more readily. The CIG community, and others, have adopted an open-source development model, in which code is developed and disseminated by the community in an open fashion, using version control and software repositories like Git. One emerging issue is how to adequately identify and credit the intellectual contributions involved in creating open-source scientific software. The traditional method of disseminating scientific ideas, peer-reviewed publication, was not designed for reviewing or crediting scientific software, although emerging publication strategies such as software journals are attempting to address the need. We are piloting an integrated approach in which authors are identified and credited as scientific software is developed and run. Successful software citation requires integration with the scholarly publication and indexing mechanisms as well, to assign credit, ensure discoverability, and provide provenance for software.
Kelly D. Brownell: Award for Distinguished Scientific Applications of Psychology
ERIC Educational Resources Information Center
American Psychologist, 2012
2012-01-01
Presents a short biography of Kelly D. Brownell, winner of the American Psychological Association's Award for Distinguished Scientific Applications of Psychology (2012). He won the award for outstanding contributions to our understanding of the etiology and management of obesity and the crisis it poses for the modern world. A seminal thinker in…
Science Education and Challenges of Globalization in Igbo Nation
ERIC Educational Resources Information Center
Ezeudu, F. O.; Nkokelonye, C. U.; Adigwe, J. C.
2013-01-01
This paper reviewed the scientific contents of Igbo culture. It describes the Igbo, an ethnic group occupying southeastern Nigeria, examines the pre-colonial, colonial, and post-colonial culture of the Igbo people, and identifies the scientific cultural activities that can be harnessed to meet the challenges of modern day…
The Influence of Positivism in the Nineteenth Century Astronomy in Argentina
NASA Astrophysics Data System (ADS)
Santilli, Haydée; Cornejo, Jorge Norberto
2013-06-01
In this paper we analyze the influence of positivism on Argentine astronomical culture in the nineteenth century. The analysis covers two dimensions: the development of scientific knowledge and science teaching. Because Argentina was a very young country at that time, not only the development of scientific knowledge itself but also the training of people able to transmit that knowledge was of singular importance. In this regard, the influence of astronomy, in its role as a modernizing discipline tied to the positivist ideal, was particularly noticeable in the training of primary school teachers. Domingo F. Sarmiento represents a turning point for the development of astronomy in Argentina; his thought was strongly influenced by Comtean positivism. Sarmiento believed that Copernican astronomy was one of the scientific disciplines critical to the formation of a "modern" citizen. Astronomy in Argentina was influenced by two epistemological streams, French and German positivism, the first of which was the more important. We show the strong influence of the socio-historical context on scientific development, and we also see that science was a fundamental social actor in Argentine history.
Onwards facing backwards: the rhetoric of science in nineteenth-century Greece.
Tampakis, Kostas
2014-06-01
The aim of this paper is to show how the Greek men of science negotiated a role for their enterprise within the Greek public sphere, from the institution of the modern Greek state in the early 1830s to the first decades of the twentieth century. By focusing on instances where they appeared in public in their official capacity as scientific experts, I describe the rhetorical schemata and the narrative strategies with which Greek science experts engaged the discourses prevalent in nineteenth- and early twentieth-century Greece. In the end, my goal is to show how they were neither zealots of modernization nor neutral actors struggling in isolated wastelands. Rather, they appear as energetic agents who used scientific expertise, national ideals and their privileged cultural positions to construct a rhetoric that would further all three. They engaged eagerly and consistently with emerging political views, scientific subjects and cultural and political events, without presenting themselves, or being seen, as doing anything qualitatively different from their peers abroad. Greek scientists cross-contextualized the scientific enterprise, situating it in the space in which they were active.
Integrating Data Base into the Elementary School Science Program.
ERIC Educational Resources Information Center
Schlenker, Richard M.
This document describes seven science activities that combine scientific principles and computers. The objectives for the activities are to show students how the computer can be used as a tool to store and arrange scientific data, provide students with experience using the computer as a tool to manage scientific data, and provide students with…
Astronomy in the Russian Scientific-Educational Project: "KAZAN-GEONA-2010"
NASA Astrophysics Data System (ADS)
Gusev, A.; Kitiashvili, I.
2006-08-01
The European Union promotes the Sixth Framework Programme, one of whose goals is to open up national research and training programs. A special role in the history of Kazan University was played by the great mathematician Nikolai Lobachevsky, the founder of non-Euclidean geometry (1826). Historically, the thousand-year-old city of Kazan and the two-hundred-year-old Kazan University have served as the scientific, organizational, and cultural-educational center of the Volga region. To continue the successful development of educational and scientific activity in the Russian Federation and the Republic of Tatarstan, Kazan has proposed a national project: the International Center of Sciences and Internet Technologies "GeoNa" (Geometry of Nature; GeoNa stands for wisdom, enthusiasm, pride, and grandeur). It is a modern complex of conference halls that includes a Center for Internet Technologies, a 3D planetarium devoted to the exploration of the Moon, PhysicsLand, an active museum of natural sciences, an oceanarium, and a training complex, "Spheres of Knowledge". Center GeoNa provides a direct and effective channel of cooperation with scientific centers around the world. GeoNa will host conferences, congresses, fundamental research sessions on the Moon and planets, and scientific-educational events, including presentations of international scientific programs on lunar research and modern lunar databases. A more intensive program of exchange between scientific centers and organizations is proposed, for better knowledge and planning of their astronomical curricula and for introducing the teaching of astronomy. Center GeoNa will enable scientists and teachers of Russian universities with advanced achievements in science and information technologies to join together and establish scientific ties with foreign colleagues in high-technology and educational projects with world scientific centers.
A high performance scientific cloud computing environment for materials simulations
NASA Astrophysics Data System (ADS)
Jorissen, K.; Vila, F. D.; Rehr, J. J.
2012-09-01
We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a JAVA-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.
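The automatic creation of virtual clusters described above can be illustrated with a short, hedged sketch. The snippet below is not part of the SCC toolset; it only shows, under assumed parameters (the AMI ID, instance type, key name, and security group are placeholders), how a small pool of EC2 instances might be requested programmatically with boto3 before a parallel job is launched on them.

```python
# Minimal sketch (not the SCC toolset): request a small virtual cluster on EC2.
# The AMI ID, instance type, and key/security-group names below are placeholders.
import boto3

def launch_virtual_cluster(n_nodes=4,
                           ami_id="ami-0123456789abcdef0",   # hypothetical machine image
                           instance_type="c5.xlarge",
                           key_name="my-keypair",
                           security_group="my-cluster-sg"):
    ec2 = boto3.resource("ec2")
    # Request n_nodes identical instances that will form the compute cluster.
    instances = ec2.create_instances(
        ImageId=ami_id,
        InstanceType=instance_type,
        MinCount=n_nodes,
        MaxCount=n_nodes,
        KeyName=key_name,
        SecurityGroups=[security_group],
    )
    # Wait until every node is running before handing the list to a job launcher.
    for inst in instances:
        inst.wait_until_running()
        inst.reload()
    return [inst.private_ip_address for inst in instances]

if __name__ == "__main__":
    ips = launch_virtual_cluster(n_nodes=2)
    print("cluster nodes:", ips)  # these IPs could feed an MPI hostfile
```

In a setup like this, the returned node addresses would typically be written to a hostfile so that an MPI launcher can treat the cloud instances as an ordinary compute cluster.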
Modeling the source of GW150914 with targeted numerical-relativity simulations
NASA Astrophysics Data System (ADS)
Lovelace, Geoffrey; Lousto, Carlos O.; Healy, James; Scheel, Mark A.; Garcia, Alyssa; O'Shaughnessy, Richard; Boyle, Michael; Campanelli, Manuela; Hemberger, Daniel A.; Kidder, Lawrence E.; Pfeiffer, Harald P.; Szilágyi, Béla; Teukolsky, Saul A.; Zlochower, Yosef
2016-12-01
In fall of 2015, the two LIGO detectors measured the gravitational wave signal GW150914, which originated from a pair of merging black holes (Abbott et al Virgo, LIGO Scientific 2016 Phys. Rev. Lett. 116 061102). In the final 0.2 s (about 8 gravitational-wave cycles) before the amplitude reached its maximum, the observed signal swept up in amplitude and frequency, from 35 Hz to 150 Hz. The theoretical gravitational-wave signal for merging black holes, as predicted by general relativity, can be computed only by full numerical relativity, because analytic approximations fail near the time of merger. Moreover, the nearly-equal masses, moderate spins, and small number of orbits of GW150914 are especially straightforward and efficient to simulate with modern numerical-relativity codes. In this paper, we report the modeling of GW150914 with numerical-relativity simulations, using black-hole masses and spins consistent with those inferred from LIGO’s measurement (Abbott et al LIGO Scientific Collaboration, Virgo Collaboration 2016 Phys. Rev. Lett. 116 241102). In particular, we employ two independent numerical-relativity codes that use completely different analytical and numerical methods to model the same merging black holes and to compute the emitted gravitational waveform; we find excellent agreement between the waveforms produced by the two independent codes. These results demonstrate the validity, impact, and potential of current and future studies using rapid-response, targeted numerical-relativity simulations for better understanding gravitational-wave observations.
On the Efficacy of Source Code Optimizations for Cache-Based Systems
NASA Technical Reports Server (NTRS)
VanderWijngaart, Rob F.; Saphir, William C.
1998-01-01
Obtaining high performance without machine-specific tuning is an important goal of scientific application programmers. Since most scientific processing is done on commodity microprocessors with hierarchical memory systems, this goal of "portable performance" can be achieved if a common set of optimization principles is effective for all such systems. It is widely believed, or at least hoped, that portable performance can be realized. The rule of thumb for optimization on hierarchical memory systems is to maximize temporal and spatial locality of memory references by reusing data and minimizing memory access stride. We investigate the effects of a number of optimizations on the performance of three related kernels taken from a computational fluid dynamics application. Timing the kernels on a range of processors, we observe an inconsistent and often counterintuitive impact of the optimizations on performance. In particular, code variations that have a positive impact on one architecture can have a negative impact on another, and variations expected to be unimportant can produce large effects. Moreover, we find that cache miss rates - as reported by a cache simulation tool, and confirmed by hardware counters - only partially explain the results. By contrast, the compiler-generated assembly code provides more insight by revealing the importance of processor-specific instructions and of compiler maturity, both of which strongly, and sometimes unexpectedly, influence performance. We conclude that it is difficult to obtain performance portability on modern cache-based computers, and comment on the implications of this result.
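The locality rule of thumb described above can be made concrete with a small, hedged illustration. The sketch below is not taken from the paper; it simply times unit-stride versus large-stride traversal of the same NumPy array. On most cache-based machines the strided sum is noticeably slower, although, as the abstract cautions, the size of the effect varies across architectures.

```python
# Sketch: spatial locality matters on cache-based systems.
# Summing a 2-D array along rows (unit stride in C order) vs. along columns
# (large stride) touches the same data but with very different cache behavior.
import time
import numpy as np

n = 4000
a = np.random.rand(n, n)   # C-ordered: elements of a row are contiguous

def row_major_sum(x):
    total = 0.0
    for i in range(x.shape[0]):
        total += x[i, :].sum()      # unit-stride access
    return total

def column_major_sum(x):
    total = 0.0
    for j in range(x.shape[1]):
        total += x[:, j].sum()      # stride of n elements between accesses
    return total

for f in (row_major_sum, column_major_sum):
    t0 = time.perf_counter()
    f(a)
    print(f.__name__, f"{time.perf_counter() - t0:.3f} s")
```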
The modern theory of biological evolution: an expanded synthesis.
Kutschera, Ulrich; Niklas, Karl J
2004-06-01
In 1858, two naturalists, Charles Darwin and Alfred Russel Wallace, independently proposed natural selection as the basic mechanism responsible for the origin of new phenotypic variants and, ultimately, new species. A large body of evidence for this hypothesis was published in Darwin's Origin of Species one year later, the appearance of which provoked other leading scientists like August Weismann to adopt and amplify Darwin's perspective. Weismann's neo-Darwinian theory of evolution was further elaborated, most notably in a series of books by Theodosius Dobzhansky, Ernst Mayr, Julian Huxley and others. In this article we first summarize the history of life on Earth and provide recent evidence demonstrating that Darwin's dilemma (the apparent missing Precambrian record of life) has been resolved. Next, the historical development and structure of the "modern synthesis" is described within the context of the following topics: paleobiology and rates of evolution, mass extinctions and species selection, macroevolution and punctuated equilibrium, sexual reproduction and recombination, sexual selection and altruism, endosymbiosis and eukaryotic cell evolution, evolutionary developmental biology, phenotypic plasticity, epigenetic inheritance and molecular evolution, experimental bacterial evolution, and computer simulations (in silico evolution of digital organisms). In addition, we discuss the expansion of the modern synthesis, embracing all branches of scientific disciplines. It is concluded that the basic tenets of the synthetic theory have survived, but in modified form. These sub-theories require continued elaboration, particularly in light of molecular biology, to answer open-ended questions concerning the mechanisms of evolution in all five kingdoms of life.
An Adaptable Seismic Data Format for Modern Scientific Workflows
NASA Astrophysics Data System (ADS)
Smith, J. A.; Bozdag, E.; Krischer, L.; Lefebvre, M.; Lei, W.; Podhorszki, N.; Tromp, J.
2013-12-01
Data storage, exchange, and access play a critical role in modern seismology. Current seismic data formats, such as SEED, SAC, and SEG-Y, were designed with specific applications in mind and are frequently a major bottleneck in implementing efficient workflows. We propose a new modern parallel format that can be adapted for a variety of seismic workflows. The Adaptable Seismic Data Format (ASDF) features high-performance parallel read and write support and the ability to store an arbitrary number of traces of varying sizes. Provenance information is stored inside the file so that users know the origin of the data as well as the precise operations that have been applied to the waveforms. The design of the new format is based on several real-world use cases, including earthquake seismology and seismic interferometry. The metadata is based on the proven XML schemas StationXML and QuakeML. Existing time-series analysis tool-kits are easily interfaced with this new format so that seismologists can use robust, previously developed software packages, such as ObsPy and the SAC library. ADIOS, netCDF4, and HDF5 can be used as the underlying container format. At Princeton University, we have chosen to use ADIOS as the container format because it has shown superior scalability for certain applications, such as dealing with big data on HPC systems. In the context of high-performance computing, we have implemented ASDF into the global adjoint tomography workflow on Oak Ridge National Laboratory's supercomputer Titan.
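As a rough, hedged illustration of the container idea, the sketch below stores a few waveform traces of different lengths in an HDF5 file with per-trace provenance attributes. It uses h5py rather than the ADIOS backend chosen at Princeton, and the group and attribute names are invented for the example; the real ASDF layout is defined by its own specification.

```python
# Hedged sketch of a container-style seismic store (names are illustrative,
# not the actual ASDF specification): traces of varying length plus provenance.
import numpy as np
import h5py

traces = {
    "IU.ANMO.00.BHZ": np.random.randn(36000),   # synthetic waveform samples
    "IU.KONO.00.BHZ": np.random.randn(72000),   # traces may have different lengths
}

with h5py.File("example_waveforms.h5", "w") as f:
    wf = f.create_group("Waveforms")
    for trace_id, data in traces.items():
        dset = wf.create_dataset(trace_id, data=data, compression="gzip")
        # Provenance stored next to the data keeps the file self-describing.
        dset.attrs["sampling_rate_hz"] = 20.0
        dset.attrs["processing"] = "demeaned; bandpass 0.01-1.0 Hz (example)"
    f.create_group("Provenance").attrs["created_by"] = "example script"

with h5py.File("example_waveforms.h5", "r") as f:
    for name, dset in f["Waveforms"].items():
        print(name, dset.shape, dict(dset.attrs))
```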
ERIC Educational Resources Information Center
Online Submission, 2010
2010-01-01
The 4th international conference "Nation and Language: Modern Aspects of Socio-Linguistic Development" continues an eight-year-old tradition. The conference is organized by Kaunas University of Technology Panevezys Institute and aims to bring scientists and researchers together for a general scientific discussion of new trends in…
The case for a modern multiwavelength, polarization-sensitive LIDAR in orbit around Mars
Brown, Adrian J.; Michaels, Timothy I.; Byrne, Shane; Sun, Wenbo; Titus, Timothy N.; Colaprete, Anthony; Wolff, Michael J.; Videen, Gorden; Grund, Christian J.
2014-01-01
We present the scientific case to build a multiple-wavelength, active, near-infrared (NIR) instrument to measure the reflected intensity and polarization characteristics of backscattered radiation from planetary surfaces and atmospheres. We focus on the ability of such an instrument to enhance, perhaps revolutionize, our understanding of climate, volatiles and astrobiological potential of modern-day Mars.
Analytics and Action in Afghanistan
2010-09-01
rests on rational technology, and ultimately on scientific knowledge. No country could be modern without being economically advanced or...backwardness to enlightened modernity. Underdeveloped countries had failed to progress to what Max Weber called rational legalism because of the grip...Douglas Pike, Viet Cong: The Organization and Techniques of the National Liberation Front of South Vietnam (Boston: Massachusetts Institute of Technology
Is homeopathy a science?--Continuity and clash of concepts of science within holistic medicine.
Schmidt, Josef M
2009-06-01
The question of whether homeopathy is a science is currently discussed almost exclusively against the background of the modern concept of natural science. This approach, however, fails to notice that homeopathy-in terms of history of science-rests on different roots that can essentially be traced back to two most influential traditions of science: on the one hand, principles and notions of Aristotelism which determined 2,000 years of Western history of science and, on the other hand, the modern concept of natural science that has been dominating the history of medicine for less than 200 years. While Aristotle's "science of the living" still included ontologic and teleologic dimensions for the sake of comprehending nature in a uniform way, the interest of modern natural science was reduced to functional and causal explanations of all phenomena for the purpose of commanding nature. In order to prevent further ecological catastrophes as well as to regain lost dimensions of our lives, the one-sidedness and theory-loadedness of our modern natural-scientific view of life should henceforth be counterbalanced by lifeworld-practical Aristotelic categories. In this way, the ground would be ready to conceive the scientific character of homeopathy-in a broader, Aristotelian sense.
Modern Computational Techniques for the HMMER Sequence Analysis
2013-01-01
This paper focuses on the latest research and critical reviews on modern computing architectures and software- and hardware-accelerated algorithms for bioinformatics data analysis, with an emphasis on one of the most important sequence analysis applications—hidden Markov models (HMM). We show a detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics community. The characteristics of sequence analysis, such as its data- and compute-intensive nature, make it very attractive to optimize and parallelize using both traditional software approaches and innovative hardware acceleration technologies. PMID:25937944
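A core computation in HMM-based sequence analysis is the forward recursion, which scores a sequence against a model. The sketch below is a generic, minimal forward algorithm over a toy two-state model with illustrative probabilities; it is not HMMER's heavily optimized profile-HMM implementation and is included only to make the underlying computation concrete.

```python
# Minimal HMM forward algorithm (toy example, not HMMER's profile HMMs).
import numpy as np

# Two hidden states, alphabet {A, C, G, T}; probabilities are illustrative.
start = np.array([0.6, 0.4])
trans = np.array([[0.7, 0.3],
                  [0.4, 0.6]])
emit = np.array([[0.4, 0.1, 0.1, 0.4],    # state 0 emission probabilities
                 [0.1, 0.4, 0.4, 0.1]])   # state 1 emission probabilities
index = {"A": 0, "C": 1, "G": 2, "T": 3}

def forward_log_likelihood(seq):
    """Return log P(seq | model) via the scaled forward recursion."""
    alpha = start * emit[:, index[seq[0]]]
    log_scale = 0.0
    for symbol in seq[1:]:
        alpha = (alpha @ trans) * emit[:, index[symbol]]
        s = alpha.sum()            # rescale to avoid underflow on long sequences
        log_scale += np.log(s)
        alpha /= s
    return log_scale + np.log(alpha.sum())

print(forward_log_likelihood("ACGTACGGTC"))
```

The accelerated implementations surveyed in the paper parallelize exactly this kind of recursion across sequences and model states.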
Constructing Scientific Arguments Using Evidence from Dynamic Computational Climate Models
ERIC Educational Resources Information Center
Pallant, Amy; Lee, Hee-Sun
2015-01-01
Modeling and argumentation are two important scientific practices students need to develop throughout school years. In this paper, we investigated how middle and high school students (N = 512) construct a scientific argument based on evidence from computational models with which they simulated climate change. We designed scientific argumentation…
Parallel processing for scientific computations
NASA Technical Reports Server (NTRS)
Alkhatib, Hasan S.
1995-01-01
The scope of this project dealt with the investigation of the requirements to support distributed computing of scientific computations over a cluster of cooperative workstations. Various experiments on computations for the solution of simultaneous linear equations were performed in the early phase of the project to gain experience in the general nature and requirements of scientific applications. A specification of a distributed integrated computing environment, DICE, based on a distributed shared memory communication paradigm has been developed and evaluated. The distributed shared memory model facilitates porting existing parallel algorithms that have been designed for shared memory multiprocessor systems to the new environment. The potential of this new environment is to provide supercomputing capability through the utilization of the aggregate power of workstations cooperating in a cluster interconnected via a local area network. Workstations, generally, do not have the computing power to tackle complex scientific applications, making them primarily useful for visualization, data reduction, and filtering as far as complex scientific applications are concerned. There is a tremendous amount of computing power that is left unused in a network of workstations. Very often a workstation is simply sitting idle on a desk. A set of tools can be developed to take advantage of this potential computing power to create a platform suitable for large scientific computations. The integration of several workstations into a logical cluster of distributed, cooperative, computing stations presents an alternative to shared memory multiprocessor systems. In this project we designed and evaluated such a system.
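A minimal sketch of the kind of computation this project targeted, solving simultaneous linear equations by distributing work across cooperating processes, is shown below. It uses Python's multiprocessing pool on a single machine as a stand-in for a workstation cluster and a simple row-partitioned Jacobi iteration; it illustrates the idea only and is not the DICE system itself.

```python
# Sketch only: row-partitioned Jacobi iteration for A x = b, with the row
# blocks updated by a pool of worker processes standing in for workstations.
import numpy as np
from multiprocessing import Pool

def jacobi_block(args):
    A_rows, b_rows, x, rows = args
    # Each worker updates its own block of unknowns from the previous iterate.
    out = np.empty(len(rows))
    for k, i in enumerate(rows):
        sigma = A_rows[k] @ x - A_rows[k, i] * x[i]
        out[k] = (b_rows[k] - sigma) / A_rows[k, i]
    return rows, out

def parallel_jacobi(A, b, n_workers=4, iters=200):
    n = len(b)
    x = np.zeros(n)
    blocks = np.array_split(np.arange(n), n_workers)
    with Pool(n_workers) as pool:
        for _ in range(iters):
            tasks = [(A[rows], b[rows], x, rows) for rows in blocks]
            x_new = np.empty(n)
            for rows, vals in pool.map(jacobi_block, tasks):
                x_new[rows] = vals
            x = x_new
    return x

if __name__ == "__main__":
    n = 200
    A = np.random.rand(n, n) + n * np.eye(n)   # diagonally dominant, so Jacobi converges
    b = np.random.rand(n)
    x = parallel_jacobi(A, b)
    print("residual:", np.linalg.norm(A @ x - b))
```

In a real distributed-shared-memory setting the iterate x would live in the shared address space rather than being shipped to each worker on every iteration.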
Brown, David K; Penkler, David L; Musyoka, Thommas M; Bishop, Özlem Tastan
2015-01-01
Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS.
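The abstract describes JMS as a bridge between a web interface and a cluster resource manager. The fragment below is not JMS code and does not use its API; it is only a generic, hedged sketch of the core idea of wrapping a batch scheduler (Slurm's sbatch, in this assumed setup) so that a web layer can submit a multi-stage workflow script and capture the job identifier.

```python
# Generic sketch of web-to-scheduler job submission (not JMS's actual API).
# Assumes a Slurm installation whose `sbatch` command is on the PATH.
import subprocess
import tempfile

def submit_workflow(stages, job_name="web_workflow"):
    """Write the workflow stages to a batch script and submit it with sbatch."""
    script = "#!/bin/bash\n#SBATCH --job-name={}\n".format(job_name)
    script += "\n".join(stages) + "\n"          # each stage is one shell command
    with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
        f.write(script)
        path = f.name
    result = subprocess.run(["sbatch", path], capture_output=True, text=True, check=True)
    # sbatch prints e.g. "Submitted batch job 12345"; return the numeric job id.
    return result.stdout.strip().split()[-1]

# Example (hypothetical two-stage pipeline submitted from a web handler):
# job_id = submit_workflow(["python preprocess.py input.fasta",
#                           "python analyze.py preprocessed.dat"])
```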
Collaborative Group Learning Approaches for Teaching Comparative Planetology
NASA Astrophysics Data System (ADS)
Slater, S. J.; Slater, T. F.
2013-12-01
Modern science education reform documents propose that the teaching of contemporary students should focus on doing science, rather than simply memorizing science. Duschl, Schweingruber, and Shouse (2007) eloquently argue for four science proficiencies for students. Students should: (i) know, use, and interpret scientific explanations of the natural world; (ii) generate and evaluate scientific evidence and explanations; (iii) understand the nature and development of scientific knowledge; and (iv) participate productively in scientific practices and discourse. In response, scholars with the CAPER Center for Astronomy & Physics Education Research have created and field-tested two separate instructional approaches. The first of these is a series of computer-mediated, inquiry learning experiences for non-science majoring undergraduates based upon an inquiry-oriented teaching approach framed by the notion of backwards faded-scaffolding as an overarching theme for instruction. Backwards faded-scaffolding is a strategy where the conventional and rigidly linear scientific method is turned on its head and students are first taught how to create conclusions based on evidence, then how experimental design creates evidence, and only at the end are introduced to the most challenging part of inquiry - inventing scientifically appropriate questions. Planetary science databases and virtual environments used by students to conduct scientific investigations include the NASA and JPL Solar System Simulator and Eyes on the Solar System as well as the USGS Moon and Mars Global GIS Viewers. The second of these is known widely as a Lecture-Tutorial approach. Lecture-Tutorials are self-contained, collaborative group activities. The materials are designed specifically to be easily integrated into the lecture course and directly address the needs of busy and heavily loaded teaching faculty for effective, student-centered, classroom-ready materials that do not require a drastic course revision for implementation. Students are asked to reason about difficult concepts, while working in pairs, and to discuss their ideas openly. Extensive evaluation results consistently suggest that both the backwards faded-scaffolding and the Lecture-Tutorial approaches are successful at engaging students in self-directed scientific discourse as measured by the Views on Scientific Inquiry (VOSI) instrument, as well as increasing their knowledge of science as measured by the Test Of Astronomy STandards (TOAST).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bethel, E. Wes; Frank, Randy; Fulcomer, Sam
Scientific visualization is the transformation of abstract information into images, and it plays an integral role in the scientific process by facilitating insight into observed or simulated phenomena. Visualization as a discipline spans many research areas, from computer science to cognitive psychology and even art. Yet the most successful visualization applications are created when close synergistic interactions with domain scientists are part of the algorithmic design and implementation process, leading to visual representations with clear scientific meaning. Visualization is used to explore, to debug, to gain understanding, and as an analysis tool. Visualization is literally everywhere--images are present in this report, on television, on the web, in books and magazines--the common theme is the ability to present information visually so that it is rapidly assimilated by human observers and transformed into understanding or insight. As an indispensable part of a modern science laboratory, visualization is akin to the biologist's microscope or the electrical engineer's oscilloscope. Whereas the microscope is limited to small specimens or the use of optics to focus light, the power of scientific visualization is virtually limitless: visualization provides the means to examine data at galactic or atomic scales, or at any size in between. Unlike the traditional scientific tools for visual inspection, visualization offers the means to "see the unseeable." Trends in demographics or changes in levels of atmospheric CO2 as a function of greenhouse gas emissions are familiar examples of such unseeable phenomena. Over time, visualization techniques evolve in response to scientific need. Each scientific discipline has its "own language," verbal and visual, used for communication. The visual language for depicting electrical circuits is much different than the visual language for depicting theoretical molecules or trends in the stock market. There is no "one visualization tool" that can serve as a panacea for all science disciplines. Instead, visualization researchers work hand in hand with domain scientists as part of the scientific research process to define, create, adapt and refine software that "speaks the visual language" of each scientific domain.
Enabling a high throughput real time data pipeline for a large radio telescope array with GPUs
NASA Astrophysics Data System (ADS)
Edgar, R. G.; Clark, M. A.; Dale, K.; Mitchell, D. A.; Ord, S. M.; Wayth, R. B.; Pfister, H.; Greenhill, L. J.
2010-10-01
The Murchison Widefield Array (MWA) is a next-generation radio telescope currently under construction in the remote Western Australia Outback. Raw data will be generated continuously at 5 GiB/s, grouped into 8 s cadences. This high throughput motivates the development of on-site, real time processing and reduction in preference to archiving, transport and off-line processing. Each batch of 8 s data must be completely reduced before the next batch arrives. Maintaining real time operation will require a sustained performance of around 2.5 TFLOP/s (including convolutions, FFTs, interpolations and matrix multiplications). We describe a scalable heterogeneous computing pipeline implementation, exploiting both the high computing density and FLOP-per-Watt ratio of modern GPUs. The architecture is highly parallel within and across nodes, with all major processing elements performed by GPUs. Necessary scatter-gather operations along the pipeline are loosely synchronized between the nodes hosting the GPUs. The MWA will be a frontier scientific instrument and a pathfinder for planned peta- and exa-scale facilities.
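The real-time budget quoted above follows from simple arithmetic: each 8 s cadence at 5 GiB/s yields a 40 GiB batch that must be fully reduced within 8 s, which at the stated 2.5 TFLOP/s sustained rate corresponds to roughly 20 TFLOP of work per batch. The short calculation below just restates that arithmetic.

```python
# Back-of-the-envelope check of the MWA real-time budget quoted in the abstract.
ingest_rate_gib_s = 5.0        # raw data rate
cadence_s = 8.0                # batch length that must be processed in real time
sustained_tflops = 2.5         # required sustained compute rate

data_per_batch_gib = ingest_rate_gib_s * cadence_s        # 40 GiB per 8 s batch
work_per_batch_tflop = sustained_tflops * cadence_s       # ~20 TFLOP per batch

print(f"data per batch:  {data_per_batch_gib:.0f} GiB")
print(f"work per batch:  {work_per_batch_tflop:.0f} TFLOP")
```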
Comprehensive efficiency analysis of supercomputer resource usage based on system monitoring data
NASA Astrophysics Data System (ADS)
Mamaeva, A. A.; Shaykhislamov, D. I.; Voevodin, Vad V.; Zhumatiy, S. A.
2018-03-01
One of the main problems of modern supercomputers is the low efficiency of their usage, which leads to the significant idle time of computational resources, and, in turn, to the decrease in speed of scientific research. This paper presents three approaches to study the efficiency of supercomputer resource usage based on monitoring data analysis. The first approach performs an analysis of computing resource utilization statistics, which allows to identify different typical classes of programs, to explore the structure of the supercomputer job flow and to track overall trends in the supercomputer behavior. The second approach is aimed specifically at analyzing off-the-shelf software packages and libraries installed on the supercomputer, since efficiency of their usage is becoming an increasingly important factor for the efficient functioning of the entire supercomputer. Within the third approach, abnormal jobs – jobs with abnormally inefficient behavior that differs significantly from the standard behavior of the overall supercomputer job flow – are being detected. For each approach, the results obtained in practice in the Supercomputer Center of Moscow State University are demonstrated.
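As a hedged illustration of the third approach, the sketch below flags jobs whose monitored utilization deviates strongly from the overall job flow using a simple z-score rule. The metric values are invented, and the actual detection methods used at the Moscow State University center are not reproduced here and are likely more sophisticated.

```python
# Toy sketch of abnormal-job detection from monitoring data (illustrative only).
import numpy as np

# Hypothetical per-job averages collected by the monitoring system:
# columns = CPU utilization (%), memory bandwidth (GB/s), network rate (MB/s).
jobs = np.array([
    [85.0, 40.0, 120.0],
    [78.0, 35.0, 100.0],
    [90.0, 42.0, 130.0],
    [ 3.0,  0.5,   1.0],   # suspiciously idle job
    [82.0, 38.0, 110.0],
])

mean = jobs.mean(axis=0)
std = jobs.std(axis=0)
z = np.abs((jobs - mean) / std)          # per-metric deviation from the job flow

threshold = 1.5
abnormal = np.where((z > threshold).any(axis=1))[0]
print("abnormal job indices:", abnormal)
```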
VenomKB, a new knowledge base for facilitating the validation of putative venom therapies
Romano, Joseph D.; Tatonetti, Nicholas P.
2015-01-01
Animal venoms have been used for therapeutic purposes since the dawn of recorded history. Only a small fraction, however, have been tested for pharmaceutical utility. Modern computational methods enable the systematic exploration of novel therapeutic uses for venom compounds. Unfortunately, there is currently no comprehensive resource describing the clinical effects of venoms to support this computational analysis. We present VenomKB, a new publicly accessible knowledge base and website that aims to act as a repository for emerging and putative venom therapies. Presently, it consists of three database tables: (1) Manually curated records of putative venom therapies supported by scientific literature, (2) automatically parsed MEDLINE articles describing compounds that may be venom derived, and their effects on the human body, and (3) automatically retrieved records from the new Semantic Medline resource that describe the effects of venom compounds on mammalian anatomy. Data from VenomKB may be selectively retrieved in a variety of popular data formats, are open-source, and will be continually updated as venom therapies become better understood. PMID:26601758
Computer image analysis in caryopses quality evaluation as exemplified by malting barley
NASA Astrophysics Data System (ADS)
Koszela, K.; Raba, B.; Zaborowicz, M.; Przybył, K.; Wojcieszak, D.; Czekała, W.; Ludwiczak, A.; Przybylak, A.; Boniecki, P.; Przybył, J.
2015-07-01
One of the purposes of employing modern technologies in the agricultural and food industry is to increase the efficiency and automation of production processes, which helps improve the productive effectiveness of business enterprises and thus makes them more competitive. Nowadays this branch of the economy faces the challenge of producing agricultural and food products with the best quality parameters while maintaining optimum costs of production and distribution of the processed biological material. Several scientific centers therefore seek to devise new and improved methods and technologies in this field that will make it possible to meet these expectations. A new solution, under constant development, is to employ so-called machine vision to replace human work in both quality and quantity evaluation processes. An indisputable advantage of this method is that it keeps the evaluation unbiased while improving its speed and, importantly, eliminating expert fatigue. This paper addresses quality evaluation by marking the contamination in malting barley grains using computer image analysis and selected methods of artificial intelligence [4-5].
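A hedged, minimal sketch of the machine-vision step described here is given below: it segments dark contaminant regions in a grayscale image of grain with a global threshold and reports their area fraction. The threshold value and file name are placeholders, and the published work additionally applies artificial-intelligence methods [4-5] that are not reproduced here.

```python
# Minimal machine-vision sketch (illustrative only): estimate the fraction of
# a grain image covered by dark contaminant regions via global thresholding.
import cv2
import numpy as np

image = cv2.imread("barley_sample.png", cv2.IMREAD_GRAYSCALE)  # placeholder file
if image is None:
    raise FileNotFoundError("provide a grayscale image of the grain sample")

# Pixels darker than the (assumed) threshold are treated as contamination.
_, mask = cv2.threshold(image, 60, 255, cv2.THRESH_BINARY_INV)

contaminated_fraction = np.count_nonzero(mask) / mask.size
print(f"contaminated area fraction: {contaminated_fraction:.2%}")
```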
Fundamentals--Rudolf Virchow and modern medicine.
Reese, D M
1998-01-01
The 19th century pathologist Rudolf Virchow was a physician, scientist, and revolutionary. The preeminent medical investigator of his day, Virchow remains best-known for his theory of cellular pathology, which laid the conceptual foundation for modern scientific medicine. Less appreciated are Virchow's numerous accomplishments in public health, anthropology, and European politics, including his quest for social justice and democracy in Imperial Germany. The study of Virchow's life and writings may provide contemporary physicians with a powerful role model as we grapple with the complexities of the modern medical enterprise. PMID:9735691
The need for data standards in zoomorphology.
Vogt, Lars; Nickel, Michael; Jenner, Ronald A; Deans, Andrew R
2013-07-01
eScience is a new approach to research that focuses on data mining and exploration rather than data generation or simulation. This new approach is arguably a driving force for scientific progress and requires data to be openly available, easily accessible via the Internet, and compatible with each other. eScience relies on modern standards for the reporting and documentation of data and metadata. Here, we suggest necessary components (i.e., content, concept, nomenclature, format) of such standards in the context of zoomorphology. We document the need for using data repositories to prevent data loss and how publication practice is currently changing, with the emergence of dynamic publications and the publication of digital datasets. Subsequently, we demonstrate that in zoomorphology the scientific record is still limited to published literature and that zoomorphological data are usually not accessible through data repositories. The underlying problem is that zoomorphology lacks the standards for data and metadata. As a consequence, zoomorphology cannot participate in eScience. We argue that the standardization of morphological data requires i) a standardized framework for terminologies for anatomy and ii) a formalized method of description that allows computer-parsable morphological data to be communicable, compatible, and comparable. The role of controlled vocabularies (e.g., ontologies) for developing respective terminologies and methods of description is discussed, especially in the context of data annotation and semantic enhancement of publications. Finally, we introduce the International Consortium for Zoomorphology Standards, a working group that is open to everyone and whose aim is to stimulate and synthesize dialog about standards. It is the Consortium's ultimate goal to assist the zoomorphology community in developing modern data and metadata standards, including anatomy ontologies, thereby facilitating the participation of zoomorphology in eScience. Copyright © 2013 Wiley Periodicals, Inc.
DCMS: A data analytics and management system for molecular simulation.
Kumar, Anand; Grupcev, Vladimir; Berrada, Meryem; Fogarty, Joseph C; Tu, Yi-Cheng; Zhu, Xingquan; Pandit, Sagar A; Xia, Yuni
Molecular Simulation (MS) is a powerful tool for studying physical/chemical features of large systems and has seen applications in many scientific and engineering domains. During the simulation process, the experiments generate a very large number of atoms, and the goal is to observe their spatial and temporal relationships for scientific analysis. The sheer data volumes and their intensive interactions impose significant challenges for data accessing, managing, and analysis. To date, existing MS software systems fall short on storage and handling of MS data, mainly because of the lack of a platform to support applications that involve intensive data access and analytical processing. In this paper, we present the database-centric molecular simulation (DCMS) system our team developed in the past few years. The main idea behind DCMS is to store MS data in a relational database management system (DBMS) to take advantage of the declarative query interface (i.e., SQL), data access methods, query processing, and optimization mechanisms of modern DBMSs. A unique challenge is to handle the analytical queries that are often compute-intensive. For that, we developed novel indexing and query processing strategies (including algorithms running on modern co-processors) as integrated components of the DBMS. As a result, researchers can upload and analyze their data using efficient functions implemented inside the DBMS. Index structures are generated to store analysis results that may be interesting to other users, so that the results are readily available without duplicating the analysis. We have developed a prototype of DCMS based on the PostgreSQL system, and experiments using real MS data and workloads show that DCMS significantly outperforms existing MS software systems. We have also used it as a platform to test other data management issues such as security and compression.
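A minimal sketch of the database-centric idea, assuming a toy schema (frame, atom, element, coordinates) that is not DCMS's actual interface: atom positions are stored in a relational table and an analysis step is expressed as a SQL query.
```python
# Minimal sketch of the database-centric idea behind DCMS: store per-frame atom
# positions in a relational table and express analyses as SQL. The schema and
# query below are illustrative assumptions, not DCMS's actual interface.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE atoms (
    frame INTEGER, atom_id INTEGER, element TEXT,
    x REAL, y REAL, z REAL)""")
con.executemany(
    "INSERT INTO atoms VALUES (?, ?, ?, ?, ?, ?)",
    [(0, 1, "O", 0.0, 0.0, 0.0),
     (0, 2, "H", 0.96, 0.0, 0.0),
     (1, 1, "O", 0.1, 0.0, 0.0),
     (1, 2, "H", 1.05, 0.0, 0.0)],
)

# Example analytical query: how many atoms lie inside a spatial box, per frame?
for frame, n in con.execute(
    "SELECT frame, COUNT(*) FROM atoms "
    "WHERE x BETWEEN 0 AND 0.5 AND y BETWEEN -1 AND 1 GROUP BY frame"):
    print(frame, n)
```
In DCMS the same pattern is reportedly backed by custom indexes and co-processor algorithms inside the DBMS so that such queries remain fast on full-scale trajectories.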
NASA Astrophysics Data System (ADS)
Mayorova, Vera
2011-09-01
National priorities, defined by the modern state of high-tech industries, demand adequate solutions to the problem of training professionals with the required modern qualifications. Modern tendencies in the development of aerospace technologies, harsh competition in the market of space services, and the expansion of international cooperation in the implementation of space projects demand a sharp increase in the scientific/technical level and competitiveness of the developed projects. Especially important is the ability to solve technological problems, which in turn define the cost and quality attributes of the designed item, as well as the ability to utilize the most modern design principles. Training highly efficient, creative professionals who are capable of generating and implementing new ideas is a very important factor driving not only the development of the national economy and industry, but also the enrichment of the human capital of the country. Moscow State Technical University named after N.E. Bauman developed and successfully implemented a project-oriented technology of professional training for the aerospace industry. It assumes a multitude of forms, methodologies and organizational events which, on the basis of integrating the scientific/technological and educational environments, allow the preparation of specialists adapted to the conditions of the intellectual market. The Youth Space Center of the University is the base where graduate and post-graduate students attend unique lectures as part of the facultative course "Applied Cosmonautics", participate in the annual International Youth Science School "Space Development: Theory and Practice" and develop innovative technical projects aimed at the creation of real-life space hardware. Microsatellite technologies are being developed at Bauman University through various projects, which are implemented in a coordinated manner by accomplishing the following steps: development of small-size satellites by universities, using them as test-beds for quick and affordable trial-and-test of new technologies and design solutions in aerospace, followed by implementation of selected efficiencies in the industry; development and improvement of university-based ground control infrastructure, which includes the Mission Control Center and the Earth Remote Sensing Center; and development of cooperative partnerships with international partners in the field of microsatellite technologies with the goal of sharing experience, uniting efforts in preparing and running scientific and educational experiments, and creating next-generation spacecraft by multi-national student groups. Such approaches allow the creation of a seamless environment that unites educational, scientific and innovative processes. This allows students to develop high professionalism, modern engineering thinking and stable engineering skills at an early stage of education at the university.
Hindu Responses to Darwinism: Assimilation and Rejection in a Colonial and Post-Colonial Context
NASA Astrophysics Data System (ADS)
MacKenzie Brown, C.
2010-06-01
Hindu responses to Darwinism, like Christian, have run the gamut from outright rejection to fairly robust but limited accommodations of the Darwinian perspective. Despite certain features of Hindu thought such as the enormous time-scales of traditional cosmogonies that may suggest considerable affinity with modern notions of organic evolution, more often than not traditional assumptions have worked against deep engagement with Darwinism, allowing only for superficial assimilation at best. Three fundamental factors have affected Hindu responses to Darwinism: the great diversity within the tradition spanning evolutionist and creationist perspectives, the encounter with Darwinism in the late nineteenth century as part of an alien culture, and the fact that this encounter occurred within a colonial context. This essay explores the complex interactions of these three factors, beginning with the diversity within the ancient and classical cosmological traditions, followed by consideration of colonial developments and the emergence of four representative Hindu approaches to Darwinism: Modern Vedic Evolutionism, Anthropic Vedic Evolutionism, Reactionary Vedic Evolutionism, and Modern Vedic Creationism. The essay concludes by discussing various epistemological issues in the attempts of modern Hindu apologists to legitimize Vedic world views. These issues include the appeal to modern science to confirm traditional ideals and values, while simultaneously subordinating scientific method to spiritual means of knowledge, or rejecting scientific methodology with its inbuilt skepticism entirely.
What Sorts of Worlds Do We Live in Nowadays? Teaching Biology in a Post-Modern Age.
ERIC Educational Resources Information Center
Reiss, Michael J.; Tunnicliffe, Sue Dale
2001-01-01
Explores implications of the view that there is no such thing as the scientific method for biology education. Suggests fresh approaches to the teaching of drawing in biology, the teaching of classification, and the teaching of human biology by illustrating opportunities for investigating and describing the world scientifically. (Contains 32…
ERIC Educational Resources Information Center
Appel, Stephen W.
1989-01-01
Examines the construction of racial scientific discourse within the milieu of an extremely racially segregated society. Traces the influence of capitalism, racism, Social Darwinism, eugenics, and "racial science" on the pedagogy of modern apartheid in South Africa. Finds evidence of pervasive effects of "scientific" ideas on…
ERIC Educational Resources Information Center
Hjerppe, Roland
Discussions between Portugal and Sweden regarding cooperation in the field of education have been going on since 1975. This report outlines short term and long range goals, conditions, and proposals of the Swedish mission to Portugal to implement modern information and documentation services in scientific and technical research and development.…
Contributions of Basic Sciences to Science of Education. Studies in Educational Administration.
ERIC Educational Resources Information Center
Lall, Bernard M.
The science of education has been influenced by the basic sciences to the extent that educational research now has been able to modernize its approach by accepting and using the basic scientific methodology and experimental techniques. Using primarily the same steps of scientific investigations, education today holds a place of much greater esteem…
ERIC Educational Resources Information Center
Patiño, José Fernando; Goulart, Daniel Magalhães
2016-01-01
This article contributes to the platform of thought proposed by González Rey in the development of qualitative epistemology and the theory of subjectivity. We discuss three core aspects: firstly, the general epistemological problems of modern science, with its non-critical, non-theoretical scientific ideals, and low reflexivity; secondly, we…
[The detection and cultivation of the scientific talent of young doctors].
Van Der Meer, J W M
2005-01-01
Although science is not a key issue for the general public in The Netherlands, and scouting talent is not a customary activity, it is of the utmost importance for scientific progress to detect gifted young people and to motivate them for a career in the medical sciences. The scouting of talent should start as early as possible. A working group of the Royal Netherlands Academy of Arts and Sciences has issued a report on secondary schooling in which the scouting of talent is a central issue. The modern medical curricula at the universities in The Netherlands all offer a substantial elective programme, and the modern teaching in small groups also offers opportunities for teachers to detect talent. Recognition of scientific talent is further possible during the research period that every medical student has to go through. In Nijmegen, the Department of Internal Medicine organises a yearly master class at the end of the summer for the best second-year medical students; in this course they are introduced to the scientific approach in medicine: from bedside to bench and vice versa. With this course we try to reinforce the motivation for medical research. A prime instrument for the development of scientific talent is the nationally funded PhD track for medical specialists in training.
NASA Astrophysics Data System (ADS)
Ramamurthy, M.
2005-12-01
A revolution is underway in the role played by cyberinfrastructure and data services in the conduct of research and education. We live in an era of unprecedented data volumes from diverse sources, multidisciplinary analysis and synthesis, and an emphasis on active, learner-centered education. For example, modern remote-sensing systems like hyperspectral satellite instruments generate terabytes of data each day. Environmental problems such as global change and the water cycle transcend disciplinary as well as geographic boundaries, and their solution requires integrated earth system science approaches. Contemporary education strategies recommend adopting an Earth system science approach for teaching the geosciences, employing new pedagogical techniques such as enquiry-based learning and hands-on activities. Needless to add, today's education and research enterprise depends heavily on robust, flexible and scalable cyberinfrastructure, especially on the ready availability of quality data and appropriate tools to manipulate and integrate those data. Fortuitously, rapid advances in computing and communication technologies have also revolutionized how data, tools and services are being incorporated into the teaching and scientific enterprise. The exponential growth in the use of the Internet in education and research, largely due to the advent of the World Wide Web, is by now well documented. On the other hand, how some of the other technological and community trends have shaped the use of cyberinfrastructure, especially data services, is less well understood. For example, the computing industry is converging on an approach called Web services that enables a standard and yet revolutionary way of building applications and methods to connect and exchange information over the Web. This new approach, based on XML - a widely accepted format for exchanging data and corresponding semantics over the Internet - enables applications, computer systems, and information processes to work together in a fundamentally different way. Likewise, the advent of digital libraries, grid computing platforms, interoperable frameworks, standards and protocols, open-source software, and community atmospheric models has been an important driver in shaping the use of a new generation of end-to-end cyberinfrastructure for solving some of the most challenging scientific and educational problems. In this talk, I will present an overview of the scientific, technological, and educational drivers and discuss recent developments in cyberinfrastructure and Unidata's role and directions in providing robust, end-to-end data services for solving geoscientific problems and advancing student learning.
Securing Secrets and Managing Trust in Modern Computing Applications
ERIC Educational Resources Information Center
Sayler, Andy
2016-01-01
The amount of digital data generated and stored by users increases every day. In order to protect this data, modern computing systems employ numerous cryptographic and access control solutions. Almost all of such solutions, however, require the keeping of certain secrets as the basis of their security models. How best to securely store and control…
ERIC Educational Resources Information Center
Zhamanov, Azamat; Yoo, Seong-Moo; Sakhiyeva, Zhulduz; Zhaparov, Meirambek
2018-01-01
Students nowadays are hard to be motivated to study lessons with traditional teaching methods. Computers, smartphones, tablets and other smart devices disturb students' attentions. Nevertheless, those smart devices can be used as auxiliary tools of modern teaching methods. In this article, the authors review two popular modern teaching methods:…
Hyperspectral processing in graphical processing units
NASA Astrophysics Data System (ADS)
Winter, Michael E.; Winter, Edwin M.
2011-06-01
With the advent of the commercial 3D video card in the mid 1990s, we have seen an order of magnitude performance increase with each generation of new video cards. While these cards were designed primarily for visualization and video games, it became apparent after a short while that they could be used for scientific purposes. These Graphical Processing Units (GPUs) are rapidly being incorporated into data processing tasks usually reserved for general purpose computers. It has been found that many image processing problems scale well to modern GPU systems. We have implemented four popular hyperspectral processing algorithms (N-FINDR, linear unmixing, Principal Components, and the RX anomaly detection algorithm). These algorithms show an across the board speedup of at least a factor of 10, with some special cases showing extreme speedups of a hundred times or more.
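For orientation, the snippet below is a plain NumPy reference version of one of the four algorithms mentioned, the RX anomaly detector (a per-pixel Mahalanobis distance against the global background statistics). It only illustrates the computation that the paper maps onto GPUs and is not their implementation.
```python
# A plain NumPy reference version of the RX anomaly detector. This CPU sketch
# only illustrates the computation that the paper offloads to GPUs.
import numpy as np

def rx_scores(cube):
    """cube: (rows, cols, bands) hyperspectral image; returns per-pixel RX score."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    cov_inv = np.linalg.pinv(cov)      # pseudo-inverse guards against a singular covariance
    D = X - mu
    scores = np.einsum("ij,jk,ik->i", D, cov_inv, D)  # Mahalanobis distance per pixel
    return scores.reshape(rows, cols)

# Synthetic cube with one injected anomalous pixel.
rng = np.random.default_rng(1)
cube = rng.normal(size=(64, 64, 10))
cube[32, 32] += 8.0
print(np.unravel_index(np.argmax(rx_scores(cube)), (64, 64)))  # -> (32, 32)
```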
An Automatic Image-Based Modelling Method Applied to Forensic Infography
Zancajo-Blazquez, Sandra; Gonzalez-Aguilera, Diego; Gonzalez-Jorge, Higinio; Hernandez-Lopez, David
2015-01-01
This paper presents a new method based on 3D reconstruction from images that demonstrates the utility and integration of close-range photogrammetry and computer vision as an efficient alternative to modelling complex objects and scenarios of forensic infography. The results obtained confirm the validity of the method compared to other existing alternatives as it guarantees the following: (i) flexibility, permitting work with any type of camera (calibrated and non-calibrated, smartphone or tablet) and image (visible, infrared, thermal, etc.); (ii) automation, allowing the reconstruction of three-dimensional scenarios in the absence of manual intervention, and (iii) high quality results, sometimes providing higher resolution than modern laser scanning systems. As a result, each ocular inspection of a crime scene with any camera performed by the scientific police can be transformed into a scaled 3D model. PMID:25793628
NHDPlusHR: A national geospatial framework for surface-water information
Viger, Roland; Rea, Alan H.; Simley, Jeffrey D.; Hanson, Karen M.
2016-01-01
The U.S. Geological Survey is developing a new geospatial hydrographic framework for the United States, called the National Hydrography Dataset Plus High Resolution (NHDPlusHR), that integrates a diversity of the best-available information, robustly supports ongoing dataset improvements, enables hydrographic generalization to derive alternate representations of the network while maintaining feature identity, and supports modern scientific computing and Internet accessibility needs. This framework is based on the High Resolution National Hydrography Dataset, the Watershed Boundaries Dataset, and elevation from the 3-D Elevation Program, and will provide an authoritative, high precision, and attribute-rich geospatial framework for surface-water information for the United States. Using this common geospatial framework will provide a consistent basis for indexing water information in the United States, eliminate redundancy, and harmonize access to, and exchange of water information.
Introduction to the LaRC central scientific computing complex
NASA Technical Reports Server (NTRS)
Shoosmith, John N.
1993-01-01
The computers and associated equipment that make up the Central Scientific Computing Complex of the Langley Research Center are briefly described. The electronic networks that provide access to the various components of the complex, and a number of areas that can be used by Langley and contractor staff for special applications (scientific visualization, image processing, software engineering, and grid generation), are also described. Flight simulation facilities that use the central computers are described. Management of the complex, procedures for its use, and available services and resources are discussed. This document is intended for new users of the complex, for current users who wish to keep apprised of changes, and for visitors who need to understand the role of central scientific computers at Langley.
OMPC: an Open-Source MATLAB®-to-Python Compiler
Jurica, Peter; van Leeuwen, Cees
2008-01-01
Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com. PMID:19225577
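The sketch below is not OMPC output; it simply hand-illustrates, for an assumed MATLAB® fragment, the kind of semantic mapping (1-based indexing, column-major flattening) that any MATLAB-to-Python translation layer has to perform on top of NumPy.
```python
# Not OMPC output -- a hand-written illustration of the mapping a MATLAB-to-Python
# translator must perform, e.g. for this MATLAB fragment:
#
#   A = zeros(3, 3);
#   A(2, 2) = 5;        % 1-based indexing
#   b = sum(A(:));      % column-major flattening
#
import numpy as np

A = np.zeros((3, 3))
A[1, 1] = 5                      # MATLAB's 1-based (2,2) becomes 0-based (1,1)
b = A.flatten(order="F").sum()   # A(:) flattens in column-major ('F') order
print(b)                         # -> 5.0
```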
NASA Technical Reports Server (NTRS)
VanZandt, John
1994-01-01
The usage model of supercomputers for scientific applications, such as computational fluid dynamics (CFD), has changed over the years. Scientific visualization has moved scientists away from looking at numbers to looking at three-dimensional images, which capture the meaning of the data. This change has impacted the system models for computing. This report details the model which is used by scientists at NASA's research centers.
ERIC Educational Resources Information Center
Adams, Stephen T.
2004-01-01
Although one role of computers in science education is to help students learn specific science concepts, computers are especially intriguing as a vehicle for fostering the development of epistemological knowledge about the nature of scientific knowledge--what it means to "know" in a scientific sense (diSessa, 1985). In this vein, the…
EPA uses high-end scientific computing, geospatial services and remote sensing/imagery analysis to support EPA's mission. The Center for Environmental Computing (CEC) assists the Agency's program offices and regions to meet staff needs in these areas.
Borresen, Jon; Lynch, Stephen
2012-01-01
In the 1940s, the first generation of modern computers used vacuum tube oscillators as their principal components; however, with the development of the transistor, such oscillator-based computers quickly became obsolete. As the demand for faster and lower-power computers continues, transistors are themselves approaching their theoretical limit, and emerging technologies must eventually supersede them. With the development of optical oscillators and Josephson junction technology, we are again presented with the possibility of using oscillators as the basic components of computers, and it is possible that the next generation of computers will be composed almost entirely of oscillatory devices. Here, we demonstrate how coupled threshold oscillators may be used to perform binary logic in a manner entirely consistent with modern computer architectures. We describe a variety of computational circuitry and demonstrate working oscillator models of both computation and memory.
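As a minimal, static illustration of the underlying idea (binary logic from units that fire when their summed inputs cross a threshold), the toy sketch below builds AND, OR and XOR gates from a single threshold unit. It is not the coupled-oscillator dynamical model developed in the paper.
```python
# Toy threshold-based binary logic: a unit "fires" (outputs 1) when its weighted
# inputs reach a threshold. This shows only the static idea that threshold
# devices can realize logic, not the paper's coupled-oscillator dynamics.
def threshold_unit(inputs, weights, threshold):
    return int(sum(i * w for i, w in zip(inputs, weights)) >= threshold)

def AND(a, b):
    return threshold_unit([a, b], [1, 1], threshold=2)

def OR(a, b):
    return threshold_unit([a, b], [1, 1], threshold=1)

def XOR(a, b):                     # built from the gates above plus negation
    return AND(OR(a, b), 1 - AND(a, b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b), OR(a, b), XOR(a, b))
```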
Noel, Jean-Paul; Blanke, Olaf; Serino, Andrea
2018-06-06
Integrating information across sensory systems is a critical step toward building a cohesive representation of the environment and one's body, and, as illustrated by numerous illusions, it scaffolds subjective experience of the world and self. In recent years, classic principles of multisensory integration elucidated in the subcortex have been translated into the language of statistical inference understood by the neocortical mantle. Most importantly, a mechanistic systems-level description of multisensory computations via probabilistic population coding and divisive normalization is actively being put forward. In parallel, by describing and understanding bodily illusions, researchers have suggested multisensory integration of bodily inputs within the peripersonal space as a key mechanism in bodily self-consciousness. Importantly, certain aspects of bodily self-consciousness, although still very much a minority, have recently been cast in the light of modern computational understandings of multisensory integration. In doing so, we argue, the field of bodily self-consciousness may borrow mechanistic descriptions regarding the neural implementation of inference computations outlined by the multisensory field. This computational approach, leveraged on the understanding of multisensory processes generally, promises to advance scientific comprehension regarding one of the most mysterious questions puzzling humankind, that is, how our brain creates the experience of a self in interaction with the environment. © 2018 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.
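For reference, divisive normalization is usually written in the following canonical form (the symbols follow the standard normalization-model literature and are not taken from this paper): a neuron's response is its driving input raised to a power, divided by a semi-saturation constant plus the summed, similarly raised inputs of the surrounding population.
```latex
% Canonical divisive-normalization equation (standard textbook form, not a
% formula quoted from the paper above):
%   R_i     response of neuron i
%   E_i     its driving (excitatory) input
%   n       exponent;  \sigma  semi-saturation constant;  \gamma  gain
R_i = \gamma \, \frac{E_i^{\,n}}{\sigma^{\,n} + \sum_{j} E_j^{\,n}}
```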
Evaluating Application Resilience with XRay
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Sui; Bronevetsky, Greg; Li, Bin
2015-05-07
The rising count and shrinking feature size of transistors within modern computers are making them increasingly vulnerable to various types of soft faults. This problem is especially acute in high-performance computing (HPC) systems used for scientific computing, because these systems include many thousands of compute cores and nodes, all of which may be utilized in a single large-scale run. The increasing vulnerability of HPC applications to errors induced by soft faults is motivating extensive work on techniques to make these applications more resilient to such faults, ranging from generic techniques such as replication or checkpoint/restart to algorithm-specific error detection and tolerance techniques. Effective use of such techniques requires a detailed understanding of how a given application is affected by soft faults to ensure that (i) efforts to improve application resilience are spent in the code regions most vulnerable to faults and (ii) the appropriate resilience technique is applied to each code region. This paper presents XRay, a tool for viewing an application's vulnerability to soft errors, and illustrates how XRay can be used in the context of a representative application. In addition to providing actionable insights into application behavior, XRay automatically selects the number of fault injection experiments required to provide an informative view of application behavior, ensuring that the information is statistically well-grounded without performing unnecessary experiments.
Constraints and Opportunities in GCM Model Development
NASA Technical Reports Server (NTRS)
Schmidt, Gavin; Clune, Thomas
2010-01-01
Over the past 30 years climate models have evolved from relatively simple representations of a few atmospheric processes to complex multi-disciplinary system models which incorporate physics from bottom of the ocean to the mesopause and are used for seasonal to multi-million year timescales. Computer infrastructure over that period has gone from punchcard mainframes to modern parallel clusters. Constraints of working within an ever evolving research code mean that most software changes must be incremental so as not to disrupt scientific throughput. Unfortunately, programming methodologies have generally not kept pace with these challenges, and existing implementations now present a heavy and growing burden on further model development as well as limiting flexibility and reliability. Opportunely, advances in software engineering from other disciplines (e.g. the commercial software industry) as well as new generations of powerful development tools can be incorporated by the model developers to incrementally and systematically improve underlying implementations and reverse the long term trend of increasing development overhead. However, these methodologies cannot be applied blindly, but rather must be carefully tailored to the unique characteristics of scientific software development. We will discuss the need for close integration of software engineers and climate scientists to find the optimal processes for climate modeling.
Scientists and artists: ""Hey! You got art in my science! You got science on my art
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elfman, Mary E; Hayes, Birchard P; Michel, Kelly D
The pairing of science and art has proven to be a powerful combination since the Renaissance. The combination of these two seemingly disparate disciplines ensured that even complex scientific theories could be explored and effectively communicated to both the subject matter expert and the layman. In modern times, science and art have frequently been considered disjoint, with objectives, philosophies, and perspectives often in direct opposition to each other. However, given the technological advances in computer science and high fidelity 3-D graphics development tools, this marriage of art and science is once again logically complementary. Art, in the form of computer graphics and animation created on supercomputers, has already proven to be a powerful tool for improving scientific research and providing insight into nuclear phenomena. This paper discusses the power of pairing artists with scientists and engineers in order to pursue the possibilities of a widely accessible, lightweight, interactive approach. We will use a discussion of photo-realism versus stylization to illuminate the expected beneficial outcome of such collaborations and the societal advantages gained by a non-traditional partnering of these two fields.
CRYSTMET—The NRCC Metals Crystallographic Data File
Wood, Gordon H.; Rodgers, John R.; Gough, S. Roger; Villars, Pierre
1996-01-01
CRYSTMET is a computer-readable database of critically evaluated crystallographic data for metals (including alloys, intermetallics and minerals) accompanied by pertinent chemical, physical and bibliographic information. It currently contains about 60 000 entries and covers the literature exhaustively from 1913. Scientific editing of the abstracted entries, consisting of numerous automated and manual checks, is done to ensure consistency with related, previously published studies, to assign structure types where necessary and to help guarantee the accuracy of the data and related information. Analyses of the entries and their distribution across key journals as a function of time show interesting trends in the complexity of the compounds studied as well as in the elements they contain. Two applications of CRYSTMET are the identification of unknowns and the prediction of properties of materials. CRYSTMET is available either online or via license of a private copy from the Canadian Scientific Numeric Database Service (CAN/SND). The indexed online search and analysis system is easy and economical to use yet fast and powerful. Development of a new system is under way combining the capabilities of ORACLE with the flexibility of a modern interface based on the Netscape browsing tool. PMID:27805157
OASYS (OrAnge SYnchrotron Suite): an open-source graphical environment for x-ray virtual experiments
NASA Astrophysics Data System (ADS)
Rebuffi, Luca; Sanchez del Rio, Manuel
2017-08-01
The evolution of hardware platforms, the modernization of software tools, the access of a large number of young people to the codes, and the popularization of open source software for scientific applications drove us to design OASYS (OrAnge SYnchrotron Suite), a completely new graphical environment for modelling X-ray experiments. The implemented software architecture not only provides an intuitive and very easy-to-use graphical interface, but also offers high flexibility and speed for interactive simulations, allowing configuration changes to be made quickly in order to compare multiple beamline configurations. Its purpose is to integrate in a synergetic way the most powerful calculation engines available. OASYS integrates different simulation strategies via the implementation of adequate simulation tools for X-ray optics (e.g. ray tracing and wave optics packages). It provides a language that makes them communicate by sending and receiving encapsulated data. Python has been chosen as the main programming language because of its universality and popularity in scientific computing. The software Orange, developed at the University of Ljubljana (SLO), is the high-level workflow engine that provides the interaction with the user and the communication mechanisms.
CHRONICLE: Third International Symposium on Modern Optics, Budapest, September 1988
NASA Astrophysics Data System (ADS)
Bukhenskiĭ, M. F.; Nikitin, P. I.; Semenov, A. S.
1989-07-01
The Third International Symposium on Modern Optics (Optics-88), held in Budapest on 13-16 September 1988, was organized by the Hungarian Optical, Acoustic, and Cinematographic Society with the support of the International Commission on Optics and various scientific and industrial organizations in Hungary. The International Symposium Committee was composed of leading specialists from 11 countries in Asia, America, and Europe with A. M. Prokhorov (USSR) and N. Kroo (Hungary) as Co-chairmen. The purpose of this regular symposium is to summarize the scientific and technical progress underlying the developments in optics itself, discuss the branches of science where progress depends on optical methods in devices, and draw the attention of specialists to the most promising trends which should yield results in the immediate future.
[The organization of scientific innovative laboratory complex of modern technologies].
Totskaia, E G; Rozhnova, O M; Mamonova, E V
2013-01-01
The article discusses topical issues of scientific innovative activity in the realization of the principles of private-public partnership. The experience of developing a model of a scientific innovative complex is presented. The possibilities of implementing research achievements and applying them in the areas of cell technologies, regenerative medicine technologies, and biochip technologies are demonstrated. The opportunities to provide a high level of diagnostics and treatment in practical health care, to increase the accessibility and quality of medical care, and to promote population health are discussed.
Moll, F H
2015-02-01
The use of artifacts and objects from scientific medical collections and museums for academic teaching purposes is one of the main tasks of those institutions. In recent years, this aspect of scientific collections has again come into focus within academia. The collections offer a unique chance for visual and haptic forms of teaching in many fields. Given the potential of scientific collections, educators in all branches of academic learning should be familiar with handling objects for such purposes.
Major Challenges for the Modern Chemistry in Particular and Science in General.
Uskoković, Vuk
2010-11-01
In the past few hundred years, science has exerted an enormous influence on the way the world appears to human observers. Despite phenomenal accomplishments of science, science nowadays faces numerous challenges that threaten its continued success. As scientific inventions become embedded within human societies, the challenges are further multiplied. In this critical review, some of the critical challenges for the field of modern chemistry are discussed, including: (a) interlinking theoretical knowledge and experimental approaches; (b) implementing the principles of sustainability at the roots of the chemical design; (c) defining science from a philosophical perspective that acknowledges both pragmatic and realistic aspects thereof; (d) instigating interdisciplinary research; (e) learning to recognize and appreciate the aesthetic aspects of scientific knowledge and methodology, and promote truly inspiring education in chemistry. In the conclusion, I recapitulate that the evolution of human knowledge inherently depends upon our ability to adopt creative problem-solving attitudes, and that challenges will always be present within the scope of scientific interests.
Parallel Index and Query for Large Scale Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chou, Jerry; Wu, Kesheng; Ruebel, Oliver
2011-07-18
Modern scientific datasets present numerous data management and analysis challenges. State-of-the-art index and query technologies are critical for facilitating interactive exploration of large datasets, but numerous challenges remain in terms of designing a system for processing general scientific datasets. The system needs to be able to run on distributed multi-core platforms, efficiently utilize underlying I/O infrastructure, and scale to massive datasets. We present FastQuery, a novel software framework that addresses these challenges. FastQuery utilizes a state-of-the-art index and query technology (FastBit) and is designed to process massive datasets on modern supercomputing platforms. We apply FastQuery to the processing of a massive 50TB dataset generated by a large scale accelerator modeling code. We demonstrate the scalability of the tool to 11,520 cores. Motivated by the scientific need to search for interesting particles in this dataset, we use our framework to reduce search time from hours to tens of seconds.
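The snippet below is not the FastQuery/FastBit API; it only illustrates, with plain NumPy boolean masks on synthetic data, the kind of "find the interesting particles" range query that such index-and-query frameworks accelerate on massive datasets.
```python
# Not the FastQuery/FastBit API -- a plain NumPy illustration of the kind of
# range query over particle attributes that index-and-query systems accelerate.
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000                     # small stand-in for billions of accelerator particles
energy = rng.exponential(scale=1.0, size=n)
x = rng.uniform(-1.0, 1.0, size=n)

# Selection predicate: high-energy particles inside a narrow spatial window.
mask = (energy > 8.0) & (np.abs(x) < 0.1)
selected = np.flatnonzero(mask)
print(selected.size, "particles matched the query")
```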
Observing the skies of Lisbon. Isaac de Sequeira Samuda, an estrangeirado in the Royal Society
Vieira, Carla Costa
2014-01-01
Elected in 1723, Isaac de Sequeira Samuda (1681–1729) was the first Jewish Fellow of the Royal Society. He had arrived in London just a few years earlier, escaping from the Portuguese Inquisition. Despite his past, he had no difficulty in establishing links with his country's diplomatic representatives in London. A physician and adviser on scientific subjects, he became a conduit between the emerging world of Portuguese astronomy and the British scientific community. He reported to the Royal Society on astronomical observations made in the new observatories in Lisbon and helped with the acquisition of scientific instruments and books destined for Portugal. These activities were facets of Samuda's unusual career and the diverse though often converging associations that he established until his death. As the member of a network active in the diffusion of new ideas and in the modernization of Portuguese science, Samuda can be regarded as an estrangeirado, as this term has come to be used in the modern literature. PMID:24921106
Modern affinity reagents: Recombinant antibodies and aptamers.
Groff, Katherine; Brown, Jeffrey; Clippinger, Amy J
2015-12-01
Affinity reagents are essential tools in both basic and applied research; however, there is a growing concern about the reproducibility of animal-derived monoclonal antibodies. The need for higher quality affinity reagents has prompted the development of methods that provide scientific, economic, and time-saving advantages and do not require the use of animals. This review describes two types of affinity reagents, recombinant antibodies and aptamers, which are non-animal technologies that can replace the use of animal-derived monoclonal antibodies. Recombinant antibodies are protein-based reagents, while aptamers are nucleic-acid-based. In light of the scientific advantages of these technologies, this review also discusses ways to gain momentum in the use of modern affinity reagents, including an update to the 1999 National Academy of Sciences monoclonal antibody production report and federal incentives for recombinant antibody and aptamer efforts. In the long-term, these efforts have the potential to improve the overall quality and decrease the cost of scientific research. Copyright © 2015 Elsevier Inc. All rights reserved.
Observing the skies of Lisbon. Isaac de Sequeira Samuda, an estrangeirado in the Royal Society.
Vieira, Carla Costa
2014-06-20
Elected in 1723, Isaac de Sequeira Samuda (1681-1729) was the first Jewish Fellow of the Royal Society. He had arrived in London just a few years earlier, escaping from the Portuguese Inquisition. Despite his past, he had no difficulty in establishing links with his country's diplomatic representatives in London. A physician and adviser on scientific subjects, he became a conduit between the emerging world of Portuguese astronomy and the British scientific community. He reported to the Royal Society on astronomical observations made in the new observatories in Lisbon and helped with the acquisition of scientific instruments and books destined for Portugal. These activities were facets of Samuda's unusual career and the diverse though often converging associations that he established until his death. As the member of a network active in the diffusion of new ideas and in the modernization of Portuguese science, Samuda can be regarded as an estrangeirado, as this term has come to be used in the modern literature.
Major Challenges for the Modern Chemistry in Particular and Science in General
Uskoković, Vuk
2013-01-01
In the past few hundred years, science has exerted an enormous influence on the way the world appears to human observers. Despite phenomenal accomplishments of science, science nowadays faces numerous challenges that threaten its continued success. As scientific inventions become embedded within human societies, the challenges are further multiplied. In this critical review, some of the critical challenges for the field of modern chemistry are discussed, including: (a) interlinking theoretical knowledge and experimental approaches; (b) implementing the principles of sustainability at the roots of the chemical design; (c) defining science from a philosophical perspective that acknowledges both pragmatic and realistic aspects thereof; (d) instigating interdisciplinary research; (e) learning to recognize and appreciate the aesthetic aspects of scientific knowledge and methodology, and promote truly inspiring education in chemistry. In the conclusion, I recapitulate that the evolution of human knowledge inherently depends upon our ability to adopt creative problem-solving attitudes, and that challenges will always be present within the scope of scientific interests. PMID:24465151
Using the High-Level Based Program Interface to Facilitate the Large Scale Scientific Computing
Shang, Yizi; Shang, Ling; Gao, Chuanchang; Lu, Guiming; Ye, Yuntao; Jia, Dongdong
2014-01-01
This paper presents further research on facilitating large-scale scientific computing on grid and desktop grid platforms. The related issues include the programming method, the overhead of the middleware based on a high-level program interface, and data anticipation migration. The block-based Gauss-Jordan algorithm, as a real example of large-scale scientific computing, is used to evaluate the issues presented above. The results show that the high-level program interface makes complex scientific applications on a large-scale scientific platform easier to develop, though a little overhead is unavoidable. Also, the data anticipation migration mechanism can improve the efficiency of the platform when it needs to process big-data-based scientific applications. PMID:24574931
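For context, the sketch below is a plain, single-process Gauss-Jordan inversion with partial pivoting in NumPy. The paper distributes a block variant of this computation through its high-level program interface; this illustration does not attempt to reproduce that distributed, block-based scheme.
```python
# A plain, single-process Gauss-Jordan inversion with partial pivoting. The
# paper distributes a *block* variant of this elimination over grid resources;
# this sketch only shows the underlying computation.
import numpy as np

def gauss_jordan_inverse(A):
    A = A.astype(float)
    n = A.shape[0]
    M = np.hstack([A, np.eye(n)])                        # augment with the identity
    for col in range(n):
        pivot = col + np.argmax(np.abs(M[col:, col]))    # partial pivoting
        M[[col, pivot]] = M[[pivot, col]]                # swap rows
        M[col] /= M[col, col]                            # scale pivot row to 1
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]           # eliminate column entry
    return M[:, n:]

A = np.array([[4.0, 7.0], [2.0, 6.0]])
print(np.allclose(gauss_jordan_inverse(A) @ A, np.eye(2)))  # -> True
```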
Defining Computational Thinking for Mathematics and Science Classrooms
ERIC Educational Resources Information Center
Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri
2016-01-01
Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new…
ERIC Educational Resources Information Center
Halbauer, Siegfried
1976-01-01
It was considered that students of intensive scientific Russian courses could learn vocabulary more efficiently if they were taught word stems and how to combine them with prefixes and suffixes to form scientific words. The computer programs developed to identify the most important stems are discussed. (Text is in German.) (FB)
Improving robustness and computational efficiency using modern C++
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paterno, M.; Kowalkowski, J.; Green, C.
2014-01-01
For nearly two decades, the C++ programming language has been the dominant programming language for experimental HEP. The publication of ISO/IEC 14882:2011, the current version of the international standard for the C++ programming language, makes available a variety of language and library facilities for improving the robustness, expressiveness, and computational efficiency of C++ code. However, much of the C++ written by the experimental HEP community does not take advantage of the features of the language to obtain these benefits, either due to lack of familiarity with these features or concern that these features must somehow be computationally inefficient. In this paper, we address some of the features of modern C++, and show how they can be used to make programs that are both robust and computationally efficient. We compare and contrast simple yet realistic examples of some common implementation patterns in C, currently-typical C++, and modern C++, and show (when necessary, down to the level of generated assembly language code) the quality of the executable code produced by recent C++ compilers, with the aim of allowing the HEP community to make informed decisions on the costs and benefits of the use of modern C++.
ERIC Educational Resources Information Center
Norris, Stephen P.; Macnab, John S.; Wonham, Marjorie; de Vries, Gerda
2009-01-01
This paper promotes the use of adapted primary literature as a curriculum and instruction innovation for use in high school. Adapted primary literature is useful for promoting an understanding of scientific and mathematical reasoning and argument and for introducing modern science into the schools. We describe a prototype adapted from a published…
From Generation to Generation: Oral Histories of Scientific Innovations from the 20th Century
ERIC Educational Resources Information Center
Bedrossian, Mindy J.
2010-01-01
The 20th century saw some of the most important technological and scientific discoveries in the history of humankind. The space shuttle, the internet, and other modern advances changed society forever, and yet many students cannot imagine what life was like before these technologies existed. In the project described here, students take a firsthand…
ERIC Educational Resources Information Center
Balashova, Yuliya B.
2016-01-01
This research reconstructs the traditions of scientific enlightenment in Russia. The turn of the nineteenth and twentieth centuries was chosen as the most representative period. The modern age saw the establishment of the optimal model for advancing science in the global context and its crucial segment--Russian science. This period was…
ERIC Educational Resources Information Center
Au, Wayne
2011-01-01
The application of the principles of scientific management within the structure, organization, and curriculum of public schools in the US became dominant during the early 1900s. Based upon research evidence from the modern day era of high-stakes testing in US public education, the fundamental logics guiding scientific management have resurfaced…
Problems of Scientific Research Activity in Institutions of Higher Learning
ERIC Educational Resources Information Center
Solodnikov, V. V.
2008-01-01
Under current conditions, the role played by scientific knowledge in all spheres of public life is rising substantially, and more and more attention is being paid to problems of the development and modernization of the Academy of Sciences. Not long ago, for example, there was wide response to the findings of a special study by S. Belanovskii on…
NASA Astrophysics Data System (ADS)
Sharkov, N. A.; Sharkova, O. A.
2018-05-01
The paper identifies the importance of Leonhard Euler's discoveries in the field of shipbuilding for the scientific evolution of academician A. N. Krylov and for modern knowledge of the survivability and safety of ships. The works by Leonhard Euler "Marine Science" and "The Moon Motion New Theory" are discussed.
ERIC Educational Resources Information Center
Steele, Erika M.
2013-01-01
The rapid advances in technology and scientific knowledge in modern society increase the need for a workforce with an understanding of technology and critical thinking skills. College graduates are entering the working world without the critical thinking skills and ability to apply the scientific knowledge gained during their undergraduate…
NASA Astrophysics Data System (ADS)
Zakharova, Natalia; Piskovatsky, Nicolay; Gusev, Anatoly
2014-05-01
Development of Informational-Computational Systems (ICS) for data assimilation procedures is a multidisciplinary problem. To study and solve such problems one needs to apply modern results from different disciplines and recent developments in: mathematical modeling; theory of adjoint equations and optimal control; inverse problems; numerical methods theory; numerical algebra and scientific computing. The above problems are studied in the Institute of Numerical Mathematics of the Russian Academy of Sciences (INM RAS) in ICS for personal computers. In this work the results of the special database development for the ICS "INM RAS - Black Sea" are presented. In the presentation the input information for the ICS is discussed and some special data processing procedures are described. In this work the results of forecasts using the ICS "INM RAS - Black Sea" with operational observation data assimilation are also presented. This study was supported by the Russian Foundation for Basic Research (project No 13-01-00753) and by the Presidium Program of the Russian Academy of Sciences (project P-23 "Black Sea as an imitational ocean model"). References 1. V.I. Agoshkov, M.V. Assovskii, S.A. Lebedev, Numerical simulation of Black Sea hydrothermodynamics taking into account tide-forming forces. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No.1, pp. 5-31. 2. E.I. Parmuzin, V.I. Agoshkov, Numerical solution of the variational assimilation problem for sea surface temperature in the model of the Black Sea dynamics. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No.1, pp. 69-94. 3. V.B. Zalesny, N.A. Diansky, V.V. Fomin, S.N. Moshonkin, S.G. Demyshev, Numerical model of the circulation of Black Sea and Sea of Azov. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No.1, pp. 95-111. 4. Agoshkov V.I., Assovsky M.B., Giniatulin S.V., Zakharova N.B., Kuimov G.V., Parmuzin E.I., Fomin V.V. Informational Computational system of variational assimilation of observation data "INM RAS - Black Sea" // Ecological safety of coastal and shelf zones and complex use of shelf resources: Collection of scientific works. Issue 26, Volume 2. National Academy of Sciences of Ukraine, Marine Hydrophysical Institute, Sebastopol, 2012. Pages 352-360. (In Russian)
Scientific Visualization, Seeing the Unseeable
LBNL
2017-12-09
June 24, 2008 Berkeley Lab lecture: Scientific visualization transforms abstract data into readily comprehensible images, provides a vehicle for "seeing the unseeable," and plays a central role in both experimental and computational sciences. Wes Bethel, who heads the Scientific Visualization Group in the Computational Research Division, presents an overview of visualization and computer graphics, current research challenges, and future directions for the field.
On the Emergence of Modern Humans
ERIC Educational Resources Information Center
Amati, Daniele; Shallice, Tim
2007-01-01
The emergence of modern humans with their extraordinary cognitive capacities is ascribed to a novel type of cognitive computational process (sustained non-routine multi-level operations) required for abstract projectuality, held to be the common denominator of the cognitive capacities specific to modern humans. A brain operation (latching) that…
NASA Astrophysics Data System (ADS)
Burke, Lydia E. Carol-Ann
An expanding body of research explores the social, political, cultural and personal challenges presented by the Western emphasis of curricula around the world. The aim of my study is to advance this field of inquiry by gaining insight into perceptions of Western modern science presented by students, teachers and administrators in a given Caribbean setting. Through this study I asked how my research participants described the nature of scientific knowledge, how they related scientific knowledge to other culturally-valued knowledges and the meanings they attached to the geographic origins of science teachers. Situating this work firmly within the practice of Foucauldian critical discourse analysis, I have utilised a conceptual framework defined by the power/knowledge and complicity/resistance themes of post-colonial theory to support my interpretation of participant commentary in an overall quest that is concerned about the ways in which Western modern science might be exerting a colonising influence. Fourteen students, nine teachers (both expatriate and local) and three administrators participated in the study. I combined a semi-structured question and answer interview format with a card sort activity. I used a procedure based on my own adaptation of Stephenson's Q methodology, where the respondents placed 24 statements hierarchically along a continuum of increasing strength of agreement, presenting their rationalisations, personal stories and illustrations as they sorted. I used an inverse factor analysis, in combination with the interview transcripts, to assist me in the identification of three discourse positions described by my research participants: The truth value of scientific knowledge, The pragmatic use of science to promote progress, and The priority of cultural preservation. The interview transcripts were also analysed for emergent themes, providing an additional layer of data interpretation. The research findings raise concerns regarding the hegemonic potency of certain scientific assumptions and assertions of participants, leading me to emphasise the importance of developing teachers' knowledge of the historical, philosophical and social background of Western modern science as well as focusing on developing the conceptual and intellectual engagement of students with Western modern science without demanding the kind of belief commitment that would insist that students replace alternative modes of meaning making.
OMPC: an Open-Source MATLAB-to-Python Compiler.
Jurica, Peter; van Leeuwen, Cees
2009-01-01
Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com.
Borresen, Jon; Lynch, Stephen
2012-01-01
In the 1940s, the first generation of modern computers used vacuum tube oscillators as their principal components; however, with the development of the transistor, such oscillator-based computers quickly became obsolete. As the demand for faster and lower-power computers continues, transistors are themselves approaching their theoretical limit, and emerging technologies must eventually supersede them. With the development of optical oscillators and Josephson junction technology, we are again presented with the possibility of using oscillators as the basic components of computers, and it is possible that the next generation of computers will be composed almost entirely of oscillatory devices. Here, we demonstrate how coupled threshold oscillators may be used to perform binary logic in a manner entirely consistent with modern computer architectures. We describe a variety of computational circuitry and demonstrate working oscillator models of both computation and memory. PMID:23173034
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
An account of the Caltech Concurrent Computation Program (C³P), a five year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.
NASA Astrophysics Data System (ADS)
Ford, Eric B.; Dindar, Saleh; Peters, Jorg
2015-08-01
The realism of astrophysical simulations and statistical analyses of astronomical data are set by the available computational resources. Thus, astronomers and astrophysicists are constantly pushing the limits of computational capabilities. For decades, astronomers benefited from massive improvements in computational power that were driven primarily by increasing clock speeds and required relatively little attention to details of the computational hardware. For nearly a decade, increases in computational capabilities have come primarily from increasing the degree of parallelism, rather than increasing clock speeds. Further increases in computational capabilities will likely be led by many-core architectures such as Graphical Processing Units (GPUs) and Intel Xeon Phi. Successfully harnessing these new architectures requires significantly more understanding of the hardware architecture, cache hierarchy, compiler capabilities and network characteristics. I will provide an astronomer's overview of the opportunities and challenges provided by modern many-core architectures and elastic cloud computing. The primary goal is to help an astronomical audience understand what types of problems are likely to yield more than order-of-magnitude speed-ups and which problems are unlikely to parallelize sufficiently efficiently to be worth the development time and/or costs. I will draw on my experience leading a team in developing the Swarm-NG library for parallel integration of large ensembles of small n-body systems on GPUs, as well as several smaller software projects. I will share lessons learned from collaborating with computer scientists, including both technical and soft skills. Finally, I will discuss the challenges of training the next generation of astronomers to be proficient in this new era of high-performance computing, drawing on experience teaching a graduate class on High-Performance Scientific Computing for Astrophysics and organizing a 2014 advanced summer school on Bayesian Computing for Astronomical Data Analysis with the support of the Penn State Center for Astrostatistics and Institute for CyberScience.
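A hedged sketch of the ensemble-parallel idea behind batching many small n-body systems: thousands of independent two-body (star-planet) systems are advanced in lock-step with vectorized arithmetic. Swarm-NG targets GPUs; this CPU-only NumPy sketch illustrates just the data layout and the batched update, not the library itself.

```python
# Batched leapfrog (kick-drift-kick) over many independent two-body systems.
import numpy as np

n_sys = 10000                                   # number of independent systems
pos = np.zeros((n_sys, 2)); pos[:, 0] = 1.0     # each planet starts at (1, 0)
vel = np.zeros((n_sys, 2))
vel[:, 1] = np.random.uniform(0.9, 1.1, n_sys)  # near-circular initial speeds
dt, GM = 1e-3, 1.0

for _ in range(1000):
    r3 = np.sum(pos**2, axis=1, keepdims=True) ** 1.5
    vel += -0.5 * dt * GM * pos / r3            # half kick
    pos += dt * vel                             # drift
    r3 = np.sum(pos**2, axis=1, keepdims=True) ** 1.5
    vel += -0.5 * dt * GM * pos / r3            # half kick

print(pos.shape, vel.shape)  # every system advanced with the same vector ops
```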
Exploring Cloud Computing for Large-scale Scientific Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Guang; Han, Binh; Yin, Jian
This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications that often just require a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high performance hardware with low latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a systems biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.
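A hedged sketch of the resource-selection step described above. The resource catalog, its fields, and the selection policy are hypothetical illustrations of "dynamically select high performance computing hardware across institutions", not the authors' infrastructure.

```python
# Hypothetical catalog of resources available across institutions.
resources = [
    {"site": "inst_A", "cores": 2048, "mem_tb": 8,  "latency_us": 2,  "busy": False},
    {"site": "inst_B", "cores": 512,  "mem_tb": 2,  "latency_us": 50, "busy": False},
    {"site": "inst_C", "cores": 4096, "mem_tb": 16, "latency_us": 5,  "busy": True},
]

def select_resource(min_cores, min_mem_tb, max_latency_us):
    """Pick an available resource meeting the job's needs; prefer low latency."""
    candidates = [r for r in resources
                  if not r["busy"]
                  and r["cores"] >= min_cores
                  and r["mem_tb"] >= min_mem_tb
                  and r["latency_us"] <= max_latency_us]
    return min(candidates, key=lambda r: r["latency_us"]) if candidates else None

print(select_resource(min_cores=1024, min_mem_tb=4, max_latency_us=10))  # inst_A
```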
Stellamor, K
1996-01-01
There is a disproportion between diagnostic and therapeutic medical achievements and the doctor/patient relationship. Are we allowed to do everything we are able to do in medicine? People are concerned and worried (genetic technology, invasive medicine, embryos in test tubes etc.). The crisis of ethics in medicine is evident. The analysis of the situation shows one of the causes in the shift of the paradigm from modern times to postmodern, following scientific positivism, but also a loss of ethics in medicine due to an extreme secularism and to modern philosophical trends (Hans Jonas and responsibility for the future on the one hand, and modern utilitarianism on the other).
Untangling the Gordian Knot The Socio-Cultural Challenge of Syria
2015-10-30
declining legitimacy through construction of traditions suggesting his leadership of modern Islam. 12 Ironically, Hamid II used Islamist language and...Nasserist leadership, and Damascus took up this role. 20 Ba'athist regimes, such as Syria, remained strong until the 1991-2003 time period. 21 The...Legion-style standards) with the most modern ideas of the time (e.g. mass politics, video technology and scientific propaganda). 23 In this method
Espahangizi, Kijan
2015-09-01
Glass vessels such as flasks and test tubes play an ambiguous role in the historiography of modern laboratory research. In spite of the strong focus on the role of materiality in the last decades, the scientific glass vessel - while being symbolically omnipresent - has remained curiously neglected in regard to its materiality. The popular image or topos of the transparent, neutral, and quasi-immaterial glass container obstructs the view of the physico-chemical functionality of this constitutive inner boundary in modern laboratory environments and its material historicity. In order to understand how glass vessels were able to provide a stable epistemic containment of spatially enclosed experimental phenomena in the new laboratory ecologies emerging in the nineteenth and early twentieth century, I will focus on the history of the material standardization of laboratory glassware. I will follow the rise of a new awareness for measurement errors due to the chemical agency of experimental glass vessels, then I will sketch the emergence of a whole techno-scientific infrastructure for the improvement of glass container quality in late nineteenth-century Germany. In the last part of my argument, I will return to the laboratory by looking at the implementation of this glass reform that created a new oikos for the inner experimental milieus of modern laboratory research.
Cerebral localization in the nineteenth century--the birth of a science and its modern consequences.
Steinberg, David A
2009-07-01
Although many individuals contributed to the development of the science of cerebral localization, its conceptual framework is the work of a single man--John Hughlings Jackson (1835-1911), a Victorian physician practicing in London. Hughlings Jackson's formulation of a neurological science consisted of an axiomatic basis, an experimental methodology, and a clinical neurophysiology. His axiom--that the brain is an exclusively sensorimotor machine--separated neurology from psychiatry and established a rigorous and sophisticated structure for the brain and mind. Hughlings Jackson's experimental method utilized the focal lesion as a probe of brain function and created an evolutionary structure of somatotopic representation to explain clinical neurophysiology. His scientific theory of cerebral localization can be described as a weighted ordinal representation. Hughlings Jackson's theory of weighted ordinal representation forms the scientific basis for modern neurology. Though this science is utilized daily by every neurologist and forms the basis of neuroscience, the consequences of Hughlings Jackson's ideas are still not generally appreciated. For example, they imply the intrinsic inconsistency of some modern fields of neuroscience and neurology. Thus, "cognitive imaging" and the "neurology of art"--two topics of modern interest--are fundamentally oxymoronic according to the science of cerebral localization. Neuroscientists, therefore, still have much to learn from John Hughlings Jackson.
Aerospace Toxicology and Microbiology
NASA Technical Reports Server (NTRS)
James, John T.; Parmet, A. J.; Pierson, Duane L.
2007-01-01
Toxicology dates to the very earliest history of humanity, with various poisons and venom being recognized as a method of hunting or waging war and the earliest documentation in the Ebers papyrus (circa 1500 BCE). The Greeks identified specific poisons such as hemlock, a method of state execution, and the Greek word toxos (arrow) became the root of our modern science. The first scientific approach to the understanding of poisons and toxicology was the work during the late middle ages of Paracelsus. He formulated what were then revolutionary views that a specific toxic agent or "toxicon" caused specific dose-related effects. His principles have established the basis of modern pharmacology and toxicology. In 1700, Bernardo Ramazzini published the book De Morbis Artificum Diatriba (The Diseases of Workers) describing specific illnesses associated with certain labor, particularly metal workers exposed to mercury, lead, arsenic, and rock dust. Modern toxicology dates from the development of modern industrial chemical processes, the earliest involving an analytical method for arsenic by Marsh in 1836. Industrial organic chemicals were synthesized in the late 1800s along with anesthetics and disinfectants. In 1908, Hamilton began the long study of occupational toxicology issues, and by WWI the scientific use of toxicants saw Haber creating war gases and defining time-dosage relationships that are used even today.
An Overview of the Computational Physics and Methods Group at Los Alamos National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Randal Scott
CCS Division was formed to strengthen the visibility and impact of computer science and computational physics research on strategic directions for the Laboratory. Both computer science and computational science are now central to scientific discovery and innovation. They have become indispensable tools for all other scientific missions at the Laboratory. CCS Division forms a bridge between external partners and Laboratory programs, bringing new ideas and technologies to bear on today’s important problems and attracting high-quality technical staff members to the Laboratory. The Computational Physics and Methods Group CCS-2 conducts methods research and develops scientific software aimed at the latest and emerging HPC systems.
[Earth Science Technology Office's Computational Technologies Project
NASA Technical Reports Server (NTRS)
Fischer, James (Technical Monitor); Merkey, Phillip
2005-01-01
This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies Project and to engage the Beowulf Cluster Computing Community as well as the High Performance Computing Research Community, so that we can predict the applicability of these technologies to the scientific community represented by the CT project and formulate long-term strategies to provide the computational resources necessary to attain the anticipated scientific objectives of the CT project. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements, and the capabilities of high-performance computers to satisfy this anticipated need.
Evaluating Modern Defenses Against Control Flow Hijacking
2015-09-01
unsound and could introduce false negatives (opening up another possible set of attacks). CFG Construction using DSA We next evaluate the precision of CFG...Evaluating Modern Defenses Against Control Flow Hijacking by Ulziibayar Otgonbaatar Submitted to the Department of Electrical Engineering and...Computer Science in partial fulfillment of the requirements for the degree of Master of Science in Computer Science and Engineering at the MASSACHUSETTS
Resiliency in Future Cyber Combat
2016-04-04
including the Internet, telecommunications networks, computer systems, and embedded processors and controllers.”6 One important point emerging from the...definition is that while the Internet is part of cyberspace, it is not all of cyberspace. Any computer processor capable of communicating with a...central processor on a modern car are all part of cyberspace, although only some of them are routinely connected to the Internet. Most modern
Situated phenomenology and biological systems: Eastern and Western synthesis.
Schroeder, Marcin J; Vallverdú, Jordi
2015-12-01
Phenomenology was born with the mission to give foundations for science of experience and to open consciousness to scientific study. The influence of phenomenology initiated in the works of Husserl and continued in a wide range of works of others was immense, but mainly within the confines of philosophy and the humanities. The actual attempts to develop a scientific discipline of the study of consciousness and to carry out research on cognition and consciousness were always based on the methods of traditional science in which elimination of the subjective has been always a primary tenet. Thus, focus was mainly on neurological correlates of conscious phenomena. The present paper is an attempt to initiate an extension and revision of phenomenological methodology with the use of philosophical and scientific experience and knowledge accumulated in a century of inquiry and research in relevant disciplines. The question which disciplines are relevant is crucial and our answer is innovative. The range of disciplines involved here is from information science and studies of computation, up to cultural psychology and the studies of philosophical traditions of the East. Concepts related to information and computation studies provide a general conceptual framework free from the limitations of particular languages and of linguistic analysis. This conceptual framework is extending the original perspective of phenomenology to issues of modern technology and science. Cultural psychology gives us tools to root out what in phenomenology was considered universal for humanity, but was a result of European ethnocentrism. Most important here is the contrast between individualistic and collectivistic cultural determinants of consciousness. Finally, philosophical tradition of the East gives alternatives in seeking solutions for fundamental problems. This general outline of the research methodology is illustrated by an example of its use when phenomenology is studied within the conceptual framework of information. Copyright © 2015. Published by Elsevier Ltd.
Computers and Computation. Readings from Scientific American.
ERIC Educational Resources Information Center
Fenichel, Robert R.; Weizenbaum, Joseph
A collection of articles from "Scientific American" magazine has been put together at this time because the current period in computer science is one of consolidation rather than innovation. A few years ago, computer science was moving so swiftly that even the professional journals were more archival than informative; but today it is…
Overcoming the momentum of anachronism: American geologic mapping in a twenty-first-century world
House, P. Kyle; Clark, Ryan; Kopera, Joe
2013-01-01
The practice of geologic mapping is undergoing conceptual and methodological transformation. Profound changes in digital technology in the past 10 yr have potential to impact all aspects of geologic mapping. The future of geologic mapping as a relevant scientific enterprise depends on widespread adoption of new technology and ideas about the collection, meaning, and utility of geologic map data. It is critical that the geologic community redefine the primary elements of the traditional paper geologic map and improve the integration of the practice of making maps in the field and office with the new ways to record, manage, share, and visualize their underlying data. A modern digital geologic mapping model will enhance scientific discovery, meet elevated expectations of modern geologic map users, and accommodate inevitable future changes in technology.
[The international network and Italian modernization. Ruggero Ceppellini, genetics, and HLA].
Capocci, Mauro
2014-01-01
The paper reconstructs the scientific career of Ruggero Ceppellini, focusing especially on his role in the discovery of the genetic system underlying the Human Leucocyte Antigen. From his earliest investigations in blood group genetics, Ceppellini quickly became an internationally acknowledged authority in the field of immunogenetics--the study of genetics by means of immunological tools--and participated in the endeavor that ultimately yielded a new meaning for the word: thanks to the pioneering research in the HLA field, immunogenetics became the study of the genetic control of the immune system. The paper will also place Ceppellini's scientific work against the backdrop of the modernization of Italian genetics after WWII, resulting from the efforts of a handful of scientists to connect to international networks and adopt new methodologies in the life sciences.
ESIF 2016: Modernizing Our Grid and Energy System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Becelaere, Kimberly
This 2016 annual report highlights work conducted at the Energy Systems Integration Facility (ESIF) in FY 2016, including grid modernization, high-performance computing and visualization, and INTEGRATE projects.
Computer Technology: State of the Art.
ERIC Educational Resources Information Center
Withington, Frederic G.
1981-01-01
Describes the nature of modern general-purpose computer systems, including hardware, semiconductor electronics, microprocessors, computer architecture, input/output technology, and system control programs. Seven suggested readings are cited. (FM)
NASA Astrophysics Data System (ADS)
Areeda, J. S.; Smith, J. R.; Lundgren, A. P.; Maros, E.; Macleod, D. M.; Zweizig, J.
2017-01-01
Gravitational-wave observatories around the world, including the Laser Interferometer Gravitational-Wave Observatory (LIGO), record a large volume of gravitational-wave output data and auxiliary data about the instruments and their environments. These data are stored at the observatory sites and distributed to computing clusters for data analysis. LigoDV-web is a web-based data viewer that provides access to data recorded at the LIGO Hanford, LIGO Livingston and GEO600 observatories, and the 40 m prototype interferometer at Caltech. The challenge addressed by this project is to provide meaningful visualizations of small data sets to anyone in the collaboration in a fast, secure and reliable manner with minimal software, hardware and training required of the end users. LigoDV-web is implemented as a Java Enterprise Application, with Shibboleth Single Sign On for authentication and authorization, and a proprietary network protocol used for data access on the back end. Collaboration members with proper credentials can request data be displayed in any of several general formats from any Internet appliance that supports a modern browser with Javascript and minimal HTML5 support, including personal computers, smartphones, and tablets. Since its inception in 2012, 634 unique users have visited the LigoDV-web website in a total of 33,861 sessions and generated a total of 139,875 plots. This infrastructure has been helpful in many analyses within the collaboration including follow-up of the data surrounding the first gravitational-wave events observed by LIGO in 2015.
ERIC Educational Resources Information Center
Hovardas, Tasos
2013-01-01
The aim of the paper is to make a critical reading of ecocentrism and its meta-scientific use of ecology. First, basic assumptions of ecocentrism will be examined, which involve nature's intrinsic value, postmodern and modern positions in ecocentrism, and the subject-object dichotomy under the lenses of ecocentrism. Then, we will discuss…
ERIC Educational Resources Information Center
Portnova, Tatiana V.
2016-01-01
The paper deals with various practices and methods for actualization of the scientific information in art excursions. The modern society is characterized by commitment to information richness. The range of cultural and historical materials used as the basis for art excursions is really immense. However if to consider the number of excursions with…
ERIC Educational Resources Information Center
Knyazkina, Evgeniya A.; Muravyeva, Elena V.; Biktemirova, Raisa G.; Zabirov, Dmitry D.; Gorbunova, Oksana A.; Biktemirova, Ella I.
2016-01-01
This article is devoted to the study of the attractiveness of the Republic of Tatarstan as a site for developing youth potential in a field of innovations. Modern approaches to the spread of scientific knowledge in the field of science and technology gave birth to synergies between the different structures in the development of scientific and…
Translations on USSR Military Affairs, Number 1319
1977-12-22
basis of military economics. As is known, the modern scientific and technological revolution has strengthened even more the dependence of war and...investment spheres of an academy's graduates must also be considered. The teaching of political and military economies would border on enlightenment...dynamics of its military, economic, scientific and technological potential without mastering the changes in the industrial structure of physical
Overview of machine vision methods in x-ray imaging and microtomography
NASA Astrophysics Data System (ADS)
Buzmakov, Alexey; Zolotov, Denis; Chukalina, Marina; Nikolaev, Dmitry; Gladkov, Andrey; Ingacheva, Anastasia; Yakimchuk, Ivan; Asadchikov, Victor
2018-04-01
Digital X-ray imaging has become widely used in science, medicine, and non-destructive testing, which allows modern digital image analysis to be applied for automatic information extraction and interpretation. We give a short review of applications of machine vision in scientific X-ray imaging and microtomography, including image processing, feature detection and extraction, image compression to increase camera throughput, tomographic reconstruction, visualization, and setup adjustment.
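A minimal sketch of one step mentioned above (feature detection): a gradient-magnitude edge map computed with NumPy on a synthetic "radiograph". This only illustrates the idea, not the authors' processing pipeline.

```python
# Gradient-magnitude edge detection on a synthetic image.
import numpy as np

image = np.zeros((64, 64))
image[16:48, 16:48] = 1.0            # synthetic bright square (an "inclusion")

gy, gx = np.gradient(image)          # finite-difference gradients along rows/cols
edges = np.hypot(gx, gy)             # gradient magnitude highlights boundaries

print("max gradient:", edges.max(), "| edge pixels:", int((edges > 0.25).sum()))
```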
Hippocrates' complaint and the scientific ethos in early modern England.
Yeo, Richard
2018-04-01
Among the elements of the modern scientific ethos, as identified by R.K. Merton and others, is the commitment of individual effort to a long-term inquiry that may not bring substantial results in a lifetime. The challenge this presents was encapsulated in the aphorism of the ancient Greek physician, Hippocrates of Kos: vita brevis, ars longa (life is short, art is long). This article explores how this complaint was answered in the early modern period by Francis Bacon's call for the inauguration of the sciences over several generations, thereby imagining a succession of lives added together over time. However, Bacon also explored another response to Hippocrates: the devotion of a 'whole life', whether brief or long, to science. The endorsement of long-term inquiry in combination with intensive lifetime involvement was embraced by some leading Fellows of the Royal Society, such as Robert Boyle and Robert Hooke. The problem for individuals, however, was to find satisfaction in science despite concerns, in some fields, that current observations and experiments would not yield material able to be extended by future investigations.
The comeback of hand drawing in modern life sciences.
Chabrier, Renaud; Janke, Carsten
2018-03-01
Scientific manuscripts are full of images. Since the birth of the life sciences, these images were in the form of hand drawings, with great examples from da Vinci, Hooke, van Leeuwenhoek, Remak, Buffon, Boveri, Darwin, Huxley, Haeckel and Gray's Anatomy to name a few. However, in the course of the past century, photographs and simplified schematics have gradually taken over as a way of illustrating scientific data and concepts, assuming that these are 'accurate' representations of the truth. Here, we argue for the importance of reviving the art of scientific drawings as a way of effectively communicating complex scientific ideas to both specialists and the general public.
The Helicopter Antenna Radiation Prediction Code (HARP)
NASA Technical Reports Server (NTRS)
Klevenow, F. T.; Lynch, B. G.; Newman, E. H.; Rojas, R. G.; Scheick, J. T.; Shamansky, H. T.; Sze, K. Y.
1990-01-01
The first nine months' effort in the development of a user-oriented computer code, referred to as the HARP code, for analyzing the radiation from helicopter antennas is described. The HARP code uses modern computer graphics to aid in the description and display of the helicopter geometry. At low frequencies the helicopter is modeled by polygonal plates, and the method of moments is used to compute the desired patterns. At high frequencies the helicopter is modeled by a composite ellipsoid and flat plates, and computations are made using the geometrical theory of diffraction. The HARP code will provide a user friendly interface, employing modern computer graphics, to aid the user to describe the helicopter geometry, select the method of computation, construct the desired high or low frequency model, and display the results.
International disaster research
NASA Technical Reports Server (NTRS)
Silverstein, Martin Elliot
1991-01-01
No existing telecommunications system can be expected to provide strategy and tactics appropriate to the complex, many-faceted problem of disaster. Despite the exciting capabilities of space, communications, remote sensing, and the miracles of modern medicine, complete turnkey transfers to the disaster problem do not make the fit, and cannot be expected to do so. In 1980, a Presidential team assigned the mission of exploring disaster response within the U.S. Federal Government encountered an unanticipated obstacle: disaster was essentially undefined. In the absence of a scientifically based paradigm of disaster, there can be no measure of cost effectiveness, optimum design of manpower structure, or precise application of any technology. These problems spawned a 10-year, multidisciplinary study designed to define the origins, anatomy, and necessary management techniques for catastrophes. The design of the study necessarily reflects interests and expertise in disaster medicine, emergency medicine, telecommunications, computer communications, and forensic sciences. This study is described.
Shrager, Jeff; Billman, Dorrit; Convertino, Gregorio; Massar, J P; Pirolli, Peter
2010-01-01
Science is a form of distributed analysis involving both individual work that produces new knowledge and collaborative work to exchange information with the larger community. There are many particular ways in which individual and community can interact in science, and it is difficult to assess how efficient these are, and what the best way might be to support them. This paper reports on a series of experiments in this area and a prototype implementation using a research platform called CACHE. CACHE both supports experimentation with different structures of interaction between individual and community cognition and serves as a prototype for computational support for those structures. We particularly focus on CACHE-BC, the Bayes community version of CACHE, within which the community can break up analytical tasks into "mind-sized" units and use provenance tracking to keep track of the relationship between these units. Copyright © 2009 Cognitive Science Society, Inc.
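A toy sketch (not the CACHE-BC implementation) of the idea of combining "mind-sized" analytical units: each unit contributes a likelihood for a hypothesis, updates are applied one at a time, and the provenance of each update is recorded alongside the evolving posterior. The unit names and numbers are hypothetical.

```python
# Sequential Bayesian updating with simple provenance tracking.
def bayes_update(prior, likelihood_h, likelihood_not_h):
    """Posterior P(H | evidence) from a prior and the two likelihoods."""
    num = prior * likelihood_h
    return num / (num + (1 - prior) * likelihood_not_h)

posterior, provenance = 0.5, []
units = [("unit_1", 0.8, 0.3), ("unit_2", 0.6, 0.5), ("unit_3", 0.9, 0.2)]
for name, lh, lnh in units:
    posterior = bayes_update(posterior, lh, lnh)
    provenance.append((name, round(posterior, 3)))  # which unit produced what

print(provenance)
```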
Applications of hypermedia systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lennon, J.; Maurer, H.
1995-05-01
In this paper, we consider several new aspects of modern hypermedia systems. The applications discussed include: (1) General Information and Communication Systems: Distributed information systems for businesses, schools and universities, museums, libraries, health systems, etc. (2) Electronic orientation and information displays: Electronic guided tours, public information kiosks, and publicity dissemination with archive facilities. (3) Lecturing: A system going beyond the traditional to empower both teachers and learners. (4) Libraries: A further step towards fully electronic library systems. (5) Directories of all kinds: Staff, telephone, and all sorts of generic directories. (6) Administration: A fully integrated system such as the one proposed will mean efficient data processing and valuable statistical data. (7) Research: Material can now be accessed from databases all around the world. The effects of networking and computer-supported collaborative work are discussed, and examples of new scientific visualization programs are quoted. The paper concludes with a section entitled "Future Directions".
RELAP-7 Software Verification and Validation Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling
This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.
Direct digital conversion detector technology
NASA Astrophysics Data System (ADS)
Mandl, William J.; Fedors, Richard
1995-06-01
Future imaging sensors for the aerospace and commercial video markets will depend on low cost, high speed analog-to-digital (A/D) conversion to efficiently process optical detector signals. Current A/D methods place a heavy burden on system resources, increase noise, and limit the throughput. This paper describes a unique method for incorporating A/D conversion right on the focal plane array. This concept is based on Sigma-Delta sampling, and makes optimum use of the active detector real estate. Combined with modern digital signal processors, such devices will significantly increase data rates off the focal plane. Early conversion to digital format will also decrease the signal susceptibility to noise, lowering the communications bit error rate. Computer modeling of this concept is described, along with results from several simulation runs. A potential application for direct digital conversion is also reviewed. Future uses for this technology could range from scientific instruments to remote sensors, telecommunications gear, medical diagnostic tools, and consumer products.
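A minimal first-order sigma-delta modulator sketch, illustrating the sampling principle mentioned above (integrate the difference between the input and the 1-bit feedback, then quantize); it is a generic numerical illustration, not the focal-plane circuit described in the paper.

```python
# First-order sigma-delta modulation and simple decimation by averaging.
import numpy as np

def sigma_delta(signal):
    """1-bit first-order sigma-delta modulation of a signal in [0, 1]."""
    integrator, bits = 0.0, []
    for x in signal:
        integrator += x - (bits[-1] if bits else 0)   # input minus 1-bit feedback
        bits.append(1 if integrator >= 0.5 else 0)    # comparator
    return np.array(bits)

x = 0.5 + 0.3 * np.sin(2 * np.pi * np.arange(4096) / 512)   # slow test tone
bits = sigma_delta(x)

# Decimation: a moving average of the bitstream recovers the input level.
recovered = bits.reshape(-1, 64).mean(axis=1)
error = np.abs(recovered - x.reshape(-1, 64).mean(axis=1)).max()
print("max reconstruction error over 64-sample windows:", float(error))
```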
NASA Astrophysics Data System (ADS)
Robbins, Dennis; Ford, K. E. Saavik
2018-01-01
The NSF-supported “AstroCom NYC” program, a collaboration of the City University of New York and the American Museum of Natural History (AMNH), has developed and offers hands-on workshops to undergraduate faculty on teaching science thought and practices. These professional development workshops emphasize a curriculum and pedagogical strategies that use computers and other digital devices in a laboratory environment to teach students fundamental topics, including: proportional reasoning, control of variables thinking, experimental design, hypothesis testing, reasoning with data, and drawing conclusions from graphical displays. Topics addressed here are rarely taught in-depth during the formal undergraduate years and are frequently learned only after several apprenticeship research experiences. The goal of these workshops is to provide working and future faculty with an interactive experience in science learning and teaching using modern technological tools.
The Historical 'Science Driver': Early Telescopes and Scientific Incentive.
NASA Astrophysics Data System (ADS)
Abrahams, Peter
2011-01-01
The term 'science driver' was first used in the 1980s. The modern meaning of 'science' is far removed from its meaning in the first centuries of the telescope. It is anachronistic to refer to the 'science driver' of a historic telescope. However, there were scientific motivations behind many early telescopes, large reflectors in particular. The chronology of larger and improved telescopes will be placed in the context of the rationale for their creation. The evolution of scientific purpose of these instruments will be extracted and examined for patterns and significance.
Are Life, Consciousness, and Intelligence Cosmic Phenomena?
NASA Astrophysics Data System (ADS)
Kostro, Ludwik
2013-09-01
First of all the scientific reasons of astrophysics, astronomy and modern astrobiology in favor of the existence of life, consciousness and intelligence in the Universe will be presented and estimated (e.g. the Nobel Laureate Christian de Duve's arguments). The part played in this kind of scientific debate by the Copernicus principle will be stressed from the scientific and philosophical point of view. Since there are also philosophers and theologians who argue in favor of the existence of life in the Universe, their arguments will be shortly presented and estimated as well.
[Trends of the scientific work development in central military-and-clinical hospitals].
Tregubov, V N; Baranov, V V
2006-04-01
Scientific work in central military-and-clinical hospitals (CMCH) is very important since it leads to creation and application of modern medical technologies in practice of military-and-medical service, professional growth of doctors and improves the status of hospitals among other medical organizations. The analysis of CMCH under the Russian Ministry of Defense shows that the main role in the development of scientific work in central hospitals belongs to management which is the activity to perform planning, organization, coordination, motivation and control functions.
Chemist and meteorologist - Antoine Lavoisier.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaffney, J. S.; Marley, N. A.; Environmental Research
2003-01-01
Antoine Lavoisier (1743-1794) is well known as the father of modern chemistry. His work on the chemistry of oxygen and the development of the concept of mass balance lighted the way for future chemists to apply rigorous scientific methods in their work. However, Lavoisier also made considerable contributions to other scientific disciplines, including meteorology and atmospheric science. This paper will survey the life of Antoine Lavoisier and his considerable scientific contributions. We will highlight his work on lightning and his attempts to develop a meteorological network for temperature and humidity measurements to support weather prediction.
NASA Astrophysics Data System (ADS)
Bergey, Bradley W.; Ketelhut, Diane Jass; Liang, Senfeng; Natarajan, Uma; Karakus, Melissa
2015-10-01
The primary aim of the study was to examine whether performance on a science assessment in an immersive virtual environment was associated with changes in scientific inquiry self-efficacy. A secondary aim of the study was to examine whether performance on the science assessment was equitable for students with different levels of computer game self-efficacy, including whether gender differences were observed. We examined 407 middle school students' scientific inquiry self-efficacy and computer game self-efficacy before and after completing a computer game-like assessment about a science mystery. Results from path analyses indicated that prior scientific inquiry self-efficacy predicted achievement on end-of-module questions, which in turn predicted change in scientific inquiry self-efficacy. By contrast, computer game self-efficacy was neither predictive of nor predicted by performance on the science assessment. While boys had higher computer game self-efficacy compared to girls, multi-group analyses suggested only minor gender differences in how efficacy beliefs related to performance. Implications for assessments with virtual environments and future design and research are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hules, John
This 1998 annual report from the National Energy Research Scientific Computing Center (NERSC) presents the year in review of the following categories: Computational Science; Computer Science and Applied Mathematics; and Systems and Services. Also presented are science highlights in the following categories: Basic Energy Sciences; Biological and Environmental Research; Fusion Energy Sciences; High Energy and Nuclear Physics; and Advanced Scientific Computing Research and Other Projects.
Hypergraph-Based Combinatorial Optimization of Matrix-Vector Multiplication
ERIC Educational Resources Information Center
Wolf, Michael Maclean
2009-01-01
Combinatorial scientific computing plays an important enabling role in computational science, particularly in high performance scientific computing. In this thesis, we will describe our work on optimizing matrix-vector multiplication using combinatorial techniques. Our research has focused on two different problems in combinatorial scientific…
ERIC Educational Resources Information Center
Evans, C. D.
This paper describes the experiences of the industrial research laboratory of Kodak Ltd. in finding and providing a computer terminal most suited to its very varied requirements. These requirements include bibliographic and scientific data searching and access to a number of worldwide computing services for scientific computing work. The provision…
Astro-WISE: Chaining to the Universe
NASA Astrophysics Data System (ADS)
Valentijn, E. A.; McFarland, J. P.; Snigula, J.; Begeman, K. G.; Boxhoorn, D. R.; Rengelink, R.; Helmich, E.; Heraudeau, P.; Verdoes Kleijn, G.; Vermeij, R.; Vriend, W.-J.; Tempelaar, M. J.; Deul, E.; Kuijken, K.; Capaccioli, M.; Silvotti, R.; Bender, R.; Neeser, M.; Saglia, R.; Bertin, E.; Mellier, Y.
2007-10-01
The recent explosion of recorded digital data and its processed derivatives threatens to overwhelm researchers when analysing their experimental data or looking up data items in archives and file systems. While current hardware developments allow the acquisition, processing and storage of hundreds of terabytes of data at the cost of a modern sports car, the software systems to handle these data are lagging behind. This problem is very general and is well recognized by various scientific communities; several large projects have been initiated, e.g., DATAGRID/EGEE {http://www.eu-egee.org/} federates compute and storage power over the high-energy physics community, while the international astronomical community is building an Internet-geared Virtual Observatory {http://www.euro-vo.org/pub/} (Padovani 2006) connecting archival data. These large projects either focus on a specific distribution aspect or aim to connect many sub-communities and have a relatively long trajectory for setting standards and a common layer. Here, we report first light of a very different solution (Valentijn & Kuijken 2004) to the problem, initiated by a smaller astronomical IT community. It provides an abstract scientific information layer which integrates distributed scientific analysis with distributed processing and federated archiving and publishing. By designing new abstractions and mixing in old ones, a Science Information System with fully scalable cornerstones has been achieved, transforming data systems into knowledge systems. This breakthrough is facilitated by the full end-to-end linking of all dependent data items, which allows full backward chaining from the observer/researcher to the experiment. Key is the notion that information is intrinsic in nature, and so is the data acquired by a scientific experiment. The new abstraction is that software systems guide the user to that intrinsic information by forcing full backward and forward chaining in the data modelling.
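A toy sketch of the backward-chaining idea described above (hypothetical classes, not the Astro-WISE API): every derived data item records its parents, so any result can be traced back to the original observation.

```python
# Minimal provenance chaining: each item knows what it was derived from.
class DataItem:
    def __init__(self, name, parents=()):
        self.name, self.parents = name, list(parents)

    def lineage(self):
        """Walk backwards through the dependency chain to the raw inputs."""
        chain = [self.name]
        for parent in self.parents:
            chain.extend(parent.lineage())
        return chain

raw = DataItem("raw_exposure_001")
flat = DataItem("flatfielded_001", parents=[raw])
catalog = DataItem("source_catalog_001", parents=[flat])
print(catalog.lineage())
# ['source_catalog_001', 'flatfielded_001', 'raw_exposure_001']
```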
Amplify scientific discovery with artificial intelligence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gil, Yolanda; Greaves, Mark T.; Hendler, James
Computing innovations have fundamentally changed many aspects of scientific inquiry. For example, advances in robotics, high-end computing, networking, and databases now underlie much of what we do in science such as gene sequencing, general number crunching, sharing information between scientists, and analyzing large amounts of data. As computing has evolved at a rapid pace, so too has its impact in science, with the most recent computing innovations repeatedly being brought to bear to facilitate new forms of inquiry. Recently, advances in Artificial Intelligence (AI) have deeply penetrated many consumer sectors, including for example Apple’s Siri™ speech recognition system, real-time automated language translation services, and a new generation of self-driving cars and self-navigating drones. However, AI has yet to achieve comparable levels of penetration in scientific inquiry, despite its tremendous potential in aiding computers to help scientists tackle tasks that require scientific reasoning. We contend that advances in AI will transform the practice of science as we are increasingly able to effectively and jointly harness human and machine intelligence in the pursuit of major scientific challenges.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahrens, J.P.; Shapiro, L.G.; Tanimoto, S.L.
1997-04-01
This paper describes a computing environment which supports computer-based scientific research work. Key features include support for automatic distributed scheduling and execution and computer-based scientific experimentation. A new flexible and extensible scheduling technique that is responsive to a user's scheduling constraints, such as the ordering of program results and the specification of task assignments and processor utilization levels, is presented. An easy-to-use constraint language for specifying scheduling constraints, based on the relational database query language SQL, is described along with a search-based algorithm for fulfilling these constraints. A set of performance studies show that the environment can schedule and execute program graphs on a network of workstations as the user requests. A method for automatically generating computer-based scientific experiments is described. Experiments provide a concise method of specifying a large collection of parameterized program executions. The environment achieved significant speedups when executing experiments; for a large collection of scientific experiments an average speedup of 3.4 on an average of 5.5 scheduled processors was obtained.
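A hedged sketch of the "experiment" idea above: a concise parameter specification expands into the full collection of program executions. The parameter names and command shown are hypothetical, not the environment's own language.

```python
# Expand a compact experiment specification into individual executions.
from itertools import product

experiment = {"resolution": [128, 256, 512], "solver": ["cg", "gmres"], "seed": [1, 2]}

runs = [dict(zip(experiment, values)) for values in product(*experiment.values())]
for run in runs[:3]:
    print("simulate.py", " ".join(f"--{k}={v}" for k, v in run.items()))
print(f"{len(runs)} executions generated")   # 3 * 2 * 2 = 12
```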
The Computational Ecologist’s Toolbox
Computational ecology, nestled in the broader field of data science, is an interdisciplinary field that attempts to improve our understanding of complex ecological systems through the use of modern computational methods. Computational ecology is based on a union of competence in...
Bruteau's philosophy of spiritual evolution and consciousness: foundation for a nursing cosmology.
McCarthy, M Patrice
2011-01-01
The ontological foundation of the modern world view based on irreconcilable dichotomies has held hegemonic status since the dawn of the scientific revolution. The post-modern critique has exposed the inadequacies of the modern perspective and challenged the potential for any narrative to adequately ground a vision for the future. This paper proposes that the philosophy of Beatrice Bruteau can support a foundation for a visionary world view consistent with nursing's respect for human dignity and societal health. The author discusses the key concepts of Bruteau's perspective on societal evolution based on an integrated study of science, mathematics, religion, and philosophy. This perspective is discussed as a foundation to move beyond the dichotomous influence of the modern world view and the deconstructive critique of the post-modern perspective. The author suggests spiritual evolution and a participatory consciousness as an ontological foundation for a cosmology congruent with nursing's social mandate. © 2010 Blackwell Publishing Ltd.
Distributed GPU Computing in GIScience
NASA Astrophysics Data System (ADS)
Jiang, Y.; Yang, C.; Huang, Q.; Li, J.; Sun, M.
2013-12-01
Geoscientists strive to discover potential principles and patterns hidden inside ever-growing Big Data for scientific discoveries. To better achieve this objective, more capable computing resources are required to process, analyze and visualize Big Data (Ferreira et al., 2003; Li et al., 2013). Current CPU-based computing techniques cannot promptly meet the computing challenges caused by the increasing amount of datasets from different domains, such as social media, earth observation, and environmental sensing (Li et al., 2013). Meanwhile, CPU-based computing resources structured as clusters or supercomputers are costly. In the past several years, with GPU-based technology maturing in both capability and performance, GPU-based computing has emerged as a new computing paradigm. Compared to the traditional microprocessor, the modern GPU, as a compelling alternative, has outstanding parallel processing capability with cost-effectiveness and efficiency (Owens et al., 2008), although it was initially designed for graphical rendering in the visualization pipeline. This presentation reports a distributed GPU computing framework for integrating GPU-based computing within a distributed environment. Within this framework, 1) for each single computer, both GPU-based and CPU-based computing resources can be fully utilized to improve the performance of visualizing and processing Big Data; 2) within a network environment, a variety of computers can be used to build up a virtual supercomputer to support CPU-based and GPU-based computing in a distributed computing environment; 3) GPUs, as specific graphics-targeted devices, are used to greatly improve the rendering efficiency in distributed geo-visualization, especially for 3D/4D visualization. Keywords: Geovisualization, GIScience, Spatiotemporal Studies. References: 1. Ferreira de Oliveira, M. C., & Levkowitz, H. (2003). From visual data exploration to visual data mining: A survey. IEEE Transactions on Visualization and Computer Graphics, 9(3), 378-394. 2. Li, J., Jiang, Y., Yang, C., Huang, Q., & Rice, M. (2013). Visualizing 3D/4D Environmental Data Using Many-core Graphics Processing Units (GPUs) and Multi-core Central Processing Units (CPUs). Computers & Geosciences, 59(9), 78-89. 3. Owens, J. D., Houston, M., Luebke, D., Green, S., Stone, J. E., & Phillips, J. C. (2008). GPU computing. Proceedings of the IEEE, 96(5), 879-899.
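A hedged sketch of combining CPU- and GPU-based computing in one workflow: use CuPy (a NumPy-compatible GPU array library) when a GPU is present and fall back to NumPy otherwise. This illustrates the general dispatch pattern only, not the framework described in the abstract.

```python
# Run the same array code on GPU (CuPy) if available, else on CPU (NumPy).
import numpy as np

try:
    import cupy as xp          # GPU path (requires a CUDA-capable device)
    on_gpu = True
except ImportError:
    xp = np                    # CPU fallback
    on_gpu = False

data = xp.asarray(np.random.rand(1_000_000))
result = float(xp.sqrt((data ** 2).sum()))   # identical code on either device
print("GPU" if on_gpu else "CPU", result)
```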
ERIC Educational Resources Information Center
Gegner, Julie A.; Mackay, Donald H. J.; Mayer, Richard E.
2009-01-01
High school students can access original scientific research articles on the Internet, but may have trouble understanding them. To address this problem of online literacy, the authors developed a computer-based prototype for guiding students' comprehension of scientific articles. High school students were asked to read an original scientific…
ERIC Educational Resources Information Center
Weiss, Charles J.
2017-01-01
The Scientific Computing for Chemists course taught at Wabash College teaches chemistry students to use the Python programming language, Jupyter notebooks, and a number of common Python scientific libraries to process, analyze, and visualize data. Assuming no prior programming experience, the course introduces students to basic programming and…
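A minimal example of the kind of workflow such a course covers, using hypothetical data: fit a Beer-Lambert calibration line with NumPy and plot it with Matplotlib, two of the common scientific libraries mentioned above.

```python
# Linear calibration fit and plot for hypothetical absorbance standards.
import numpy as np
import matplotlib.pyplot as plt

conc = np.array([0.0, 0.1, 0.2, 0.4, 0.8])          # mol/L (hypothetical standards)
absorbance = np.array([0.01, 0.12, 0.21, 0.42, 0.80])

slope, intercept = np.polyfit(conc, absorbance, 1)  # least-squares line
print(f"calibration slope ~ {slope:.3f} absorbance units per mol/L")

plt.scatter(conc, absorbance, label="standards")
plt.plot(conc, slope * conc + intercept, label="fit")
plt.xlabel("concentration (mol/L)"); plt.ylabel("absorbance"); plt.legend()
plt.savefig("calibration.png")
```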
Computational chemistry in pharmaceutical research: at the crossroads.
Bajorath, Jürgen
2012-01-01
Computational approaches are an integral part of pharmaceutical research. However, there are many unsolved key questions that limit scientific progress in the still evolving computational field and its impact on drug discovery. Importantly, a number of these questions are not new but date back many years. Hence, it might be difficult to conclusively answer them in the foreseeable future. Moreover, the computational field as a whole is characterized by a high degree of heterogeneity and so is, unfortunately, the quality of its scientific output. In light of this situation, it is proposed that changes in scientific standards and culture should be seriously considered now in order to lay a foundation for future progress in computational research.
[Earth and Space Sciences Project Services for NASA HPCC
NASA Technical Reports Server (NTRS)
Merkey, Phillip
2002-01-01
This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies Project and to engage the Beowulf Cluster Computing Community as well as the High Performance Computing Research Community, so that we can predict the applicability of these technologies to the scientific community represented by the CT project and formulate long-term strategies to provide the computational resources necessary to attain the anticipated scientific objectives of the CT project. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements, and the capabilities of high-performance computers to satisfy this anticipated need.
Scholarly literature and the press: scientific impact and social perception of physics computing
NASA Astrophysics Data System (ADS)
Pia, M. G.; Basaglia, T.; Bell, Z. W.; Dressendorfer, P. V.
2014-06-01
The broad coverage of the search for the Higgs boson in the mainstream media is a relative novelty for high energy physics (HEP) research, whose achievements have traditionally been limited to scholarly literature. This paper illustrates the results of a scientometric analysis of HEP computing in scientific literature, institutional media and the press, and a comparative overview of similar metrics concerning representative particle physics measurements. The picture emerging from these scientometric data documents the relationship between the scientific impact and the social perception of HEP physics research versus that of HEP computing. The results of this analysis suggest that improved communication of the scientific and social role of HEP computing via press releases from the major HEP laboratories would be beneficial to the high energy physics community.
Software Reuse Methods to Improve Technological Infrastructure for e-Science
NASA Technical Reports Server (NTRS)
Marshall, James J.; Downs, Robert R.; Mattmann, Chris A.
2011-01-01
Social computing has the potential to contribute to scientific research. Ongoing developments in information and communications technology improve capabilities for enabling scientific research, including research fostered by social computing capabilities. The recent emergence of e-Science practices has demonstrated the benefits from improvements in the technological infrastructure, or cyber-infrastructure, that has been developed to support science. Cloud computing is one example of this e-Science trend. Our own work in the area of software reuse offers methods that can be used to improve new technological development, including cloud computing capabilities, to support scientific research practices. In this paper, we focus on software reuse and its potential to contribute to the development and evaluation of information systems and related services designed to support new capabilities for conducting scientific research.
Popplow, Marcus
2015-12-01
Recent critical approaches to what has conventionally been described as "scientific" and "technical" knowledge in early modern Europe have provided a wealth of new insights. So far, the various analytical concepts suggested by these studies have not yet been comprehensively discussed. The present essay argues that such comprehensive approaches might prove of special value for long-term and cross-cultural reflections on technology-related knowledge. As heuristic tools, the notions of "formalization" and "interaction" are proposed as part of alternative narratives to those highlighting the emergence of "science" as the most relevant development for technology-related knowledge in early modern Europe.
Leisure riding horses: research topics versus the needs of stakeholders.
Janczarek, Iwona; Wilk, Izabela
2017-07-01
Horses intended for leisure riding do not undergo any selection and most often retired sports horses or defective horses are chosen, as a low selling price determines their purchase by a leisure riding center. Unfortunately, horses bought at low prices usually have low utility value, are difficult to handle, require a special or individual approach and do not provide satisfaction in riding. However, neither modern horse breeding nor scientific research address the need to breed horses for leisure activities. There is no clear definition of a model leisure horse and criteria or information for its selection are not readily available in scientific publications. A wide spectrum of research methods may be used to evaluate various performance traits in horses intended for leisure activities. The fact that the population of recreational horses and their riders outnumber sporting horses should attract the special attention of scientific research. Their utility traits need to be determined with modern technology and methods in the same way they are for sporting horses. Such a system of evaluation would be very helpful for riders. © 2017 Japanese Society of Animal Science.
Ghost and self: Jung's paradigm shift and a response to Zinkin.
Rowland, Susan
2009-11-01
Zinkin's lucid challenge to Jung makes perfect sense. Indeed, it is the implications of this 'making sense' that this paper addresses. For Zinkin's characterization of the 'self' takes it as a 'concept' requiring coherence; a variety of abstract non-contextual knowledge that itself has a mythical heritage. Moreover, Zinkin's refinement of Jung seeks to make his work fit for the scientific paradigm of modernity. In turn, modernity's paradigm owes much to Newton's notion of knowledge via reductionism. Here knowledge or investigation is divided up into the smallest possible units with the aim of eventually putting it all together into 'one' picture of scientific truth. Unfortunately, 'reductionism' does not do justice to the resonant possibilities of Jung's writing. These look forward to a new scientific paradigm of the twenty-first century, of the interactive 'field', emergence and complexity theory. The paper works paradoxically by discovering Zinkin's 'intersubjective self' after all, in two undervalued narratives by Jung, his doctoral thesis and a short late ghost story. However, in the ambivalences and radical fictional experimentation of these fascinating texts can be discerned an-Other self, one both created and found.
NASA Astrophysics Data System (ADS)
Soonthornthum, Boonrucksar; Orchiston, Wayne; Komonjinda, Siramas
2012-09-01
The first great Thai ruler to encourage the adoption of Western culture and technology was King Narai, and his enlightened attitude led to the rapid development of Thailand. King Narai also had a passion for astronomy, and he pursued this interest by allowing French Jesuit missionaries to set up a large modern well-equipped astronomical observatory in Lopburi Province between AD 1685 and 1687. This was known as the Wat San Paolo Observatory, and King Narai and the missionaries observed a total lunar eclipse on 10 December 1685 and a partial solar eclipse on 30 April 1688. These observations and others made at Wat San Paolo Observatory during the 1680s marked the start of modern scientific astronomy in Thailand. In this paper we discuss King Narai's scientific and other interests, the founding of the Wat San Paolo Observatory, the missionaries who conducted the astronomical programs, their instruments and their observations. We also describe the surviving ruins of the Observatory and their interpretation as a site of national scientific importance in Thailand.
Modern Scientific Metaphors of Warfare: Updating the Doctrinal Paradigm
1993-05-27
eventful but not exceptional. In common with thousands of officers in armies across Europe, he served from boyhood on, experienced defeat and captivity as...Scientific Revolutions, 19. 38. Alexander Woodcock and Monte Davis, Catastrophe Theory (New York: E.P. Dutton, 1978) 43-57. 39. Alexander Woodcock ...Application of Catastrophe Theory to the Analysis of Military Behavior (The Hague: SHAPE Technical Center, 1984) 5. 40. Woodcock and Davis, Catastrophe
People’s Republic of China Scientific Abstracts, Number 175
1977-09-08
personal names, title and series) are available through Bell & Howell, Old Mansfield Road, Wooster, Ohio, 44691. Correspondence pertaining to matters... explain all the observed meson states. Our theory can apply equally to the baryon states if the phenomenological potential Y is reduced by a... of modern discovery and scientific advances, the space-time concept, indivisibility of matter, and no sequence distinguishment are all out-of-date
Isotope archaeology: reading the past in metals, minerals, and bone.
Stos-Gale, Z A
1992-01-01
The latest edition of the Oxford Dictionary (1989) defines archaeology as '... the scientific study of the remains and monuments of the prehistoric period'. It is not surprising, therefore, that modern archaeology draws as much as possible on scientific methods of investigation developed in other fields. In the last ten years the powerful method of quantitative isotope analysis has brought a new dimension to the examination of archaeological finds.
Advances in medical image computing.
Tolxdorff, T; Deserno, T M; Handels, H; Meinzer, H-P
2009-01-01
Medical image computing has become a key technology in high-tech applications in medicine and a ubiquitous part of modern imaging systems and the related processes of clinical diagnosis and intervention. Over the past years significant progress has been made in the field, at both the methodological and the application level. Despite this progress there are still big challenges to meet in order to establish image processing routinely in health care. In this issue, selected contributions of the German Conference on Medical Image Processing (BVM) are assembled to present the latest advances in the field of medical image computing. The winners of scientific awards of the German Conference on Medical Image Processing (BVM) 2008 were invited to submit a manuscript on their latest developments and results for possible publication in Methods of Information in Medicine. Finally, seven excellent papers were selected to describe important aspects of recent advances in the field of medical image processing. The selected papers give an impression of the breadth and heterogeneity of new developments. New methods for improved image segmentation, non-linear image registration and modeling of organs are presented together with applications of image analysis methods in different medical disciplines. Furthermore, state-of-the-art tools and techniques to support the development and evaluation of medical image processing systems in practice are described. The selected articles describe different aspects of the intense development in medical image computing. The image processing methods presented enable new insights into the patient's image data and have the future potential to improve medical diagnostics and patient treatment.
Sorgner, Helene
2016-06-01
This paper compares Feyerabend's arguments in Science in a Free Society to the controversial theory of expertise proposed by Harry Collins and Robert Evans as a Third Wave of Science Studies. Is the legitimacy of democratic decisions threatened by the unquestioned authority of scientific advice? Or does, on the contrary, science need protection from too much democratic participation in technical decisions? Where Feyerabend's political relativism envisions democratic society as inherently pluralist and demands equal contribution of all traditions and worldviews to public decision-making, Collins and Evans hold a conception of elective modernism, defending the reality and value of technical expertise and arguing that science deserves a privileged status in modern democracies, because scientific values are also democratic values. I will argue that Feyerabend's political relativism provides a valuable framework for the evaluation of Collins' and Evans' theory of expertise. By constructing a dialog between Feyerabend and this more recent approach in Science and Technology Studies, the aim of this article is not only to show where the two positions differ and in what way they might be reconciled, but also how Feyerabend's philosophy provides substantial input to contemporary debate. Copyright © 2015 Elsevier Ltd. All rights reserved.
The ubiquitous reflex and its critics in post-revolutionary Russia.
Sirotkina, Irina
2009-03-01
In the last century, the reflex was more than a scientific concept: it was a cultural idiom that could be used for various aims--political, scholarly, and artistic. In Russia in the 1920s, the reflex became a ubiquitous notion and a current word, part of the revolutionary discourse and, finally, a password to modernity. Two major factors contributed to this: physiological theories of the reflex, widespread in Russia in the early twentieth century, and the materialist philosophy backed after the Revolution by the Communist party. Everybody who wished to be modern and materialist, in conformity with the official communist views, had to refer to reflexes. Yet, even in this period, the concept was not unproblematic and was criticized by some scientists, philosophers, artists and even Party members. In the paper, I describe both the array of uses of the term and the criticism it received in political, scientific and artistic discourses. It is not uncommon that, taking their origins in culture and common language, scientific concepts later return there in the form of metaphors. Similarly, the reflex was made into a rigorous scientific concept in the nineteenth century but, in the next century, it circulated as a cultural idiom penetrating various areas of political, artistic and academic life.
Multidimensional Environmental Data Resource Brokering on Computational Grids and Scientific Clouds
NASA Astrophysics Data System (ADS)
Montella, Raffaele; Giunta, Giulio; Laccetti, Giuliano
Grid computing has evolved widely over the past years, and its capabilities have found their way even into business products and are no longer relegated to scientific applications. Today, grid computing technology is not restricted to a set of specific grid open source or industrial products; rather, it comprises a set of capabilities virtually within any kind of software used to create shared and highly collaborative production environments. These environments are focused on computational (workload) capabilities and the integration of information (data) into those computational capabilities. An active grid computing application field is the full virtualization of scientific instruments in order to increase their availability and decrease operational and maintenance costs. Computational and information grids make it possible to manage real-world objects in a service-oriented way using widespread industrial standards.
78 FR 6087 - Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-29
... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building... Theory and Experiment (INCITE) Public Comment (10-minute rule) Public Participation: The meeting is open...
Computational Science in Armenia (Invited Talk)
NASA Astrophysics Data System (ADS)
Marandjian, H.; Shoukourian, Yu.
This survey is devoted to the development of informatics and computer science in Armenia. The results in theoretical computer science (algebraic models, solutions to systems of general-form recursive equations, the methods of coding theory, pattern recognition and image processing) constitute the theoretical basis for developing problem-solving-oriented environments. Examples include a synthesizer of optimized distributed recursive programs, software tools for cluster-oriented implementations of two-dimensional cellular automata, and a grid-aware web interface with advanced service trading for linear algebra calculations. In the direction of solving scientific problems that require high-performance computing resources, completed projects include examples from physics (parallel computing of complex quantum systems), astrophysics (the Armenian virtual laboratory), biology (a molecular dynamics study of the human red blood cell membrane), and meteorology (implementing and evaluating the Weather Research and Forecasting model for the territory of Armenia). The overview also notes that the Institute for Informatics and Automation Problems of the National Academy of Sciences of Armenia has established a scientific and educational infrastructure, uniting computing clusters of scientific and educational institutions of the country, and provides the scientific community with access to local and international computational resources, which is a strong support for computational science in Armenia.
Making medicine scientific: empiricism, rationality, and quackery in mid-Victorian Britain.
Weatherall, M W
1996-08-01
This paper discusses the strategies used to construct scientific medicine in mid-Victorian Britain. An opening section considers why it was thought desirable to create a properly scientific medicine, and outlines the empirical and rational bases of the medical establishment's projects for this. The bulk of the paper concerns an alternative approach to making medicine scientific--that put forward by certain advocates of homoeopathy--and how this approach was excluded from those arenas where scientific medicine was being created, and thereby made unscientific. This process is illustrated by the clash between homoeopathy and establishment medicine that occurred in mid-Victorian Cambridge. The final section briefly considers the complementary process of educating the public in what was properly scientific medicine, and what was not, and suggests that the processes of building boundaries to exclude competing practitioners, while keeping patients inside, created the space in which modern scientific medicine has flourished so successfully.
NASA Technical Reports Server (NTRS)
Denning, Peter J.; Tichy, Walter F.
1990-01-01
Highly parallel computing architectures are the only means to achieve the computation rates demanded by advanced scientific problems. A decade of research has demonstrated the feasibility of such machines, and current research focuses on which of the architectures designated multiple instruction multiple datastream (MIMD) and single instruction multiple datastream (SIMD) produces the best results; to date, neither shows a decisive advantage for most near-homogeneous scientific problems. For scientific problems with many dissimilar parts, more speculative architectures such as neural networks or data flow may be needed.
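The MIMD/SIMD contrast described above can be illustrated with a small conceptual sketch (not taken from the report): a vectorized array update stands in for a single instruction stream applied to many data elements at once, while a pool of independent worker processes stands in for multiple independent instruction streams, each with its own data. The stencil update and task function below are invented purely for illustration.

```python
# Conceptual sketch of SIMD-style data parallelism vs. MIMD-style task parallelism.
import numpy as np
from multiprocessing import Pool

def simd_style_update(u, dt=0.1):
    """SIMD-style: one vectorized operation applied to every cell at once."""
    return u + dt * (np.roll(u, 1) - 2.0 * u + np.roll(u, -1))

def mimd_style_task(seed):
    """MIMD-style: an independent instruction stream working on its own data."""
    rng = np.random.default_rng(seed)
    u = rng.random(1000)
    for _ in range(100):
        u = simd_style_update(u)
    return float(u.mean())

if __name__ == "__main__":
    # Four independent tasks run concurrently, each free to follow its own path.
    with Pool(processes=4) as pool:
        print(pool.map(mimd_style_task, range(4)))
```

The vectorized update is efficient when every data element follows the same instructions; the process pool tolerates dissimilar work per task, which mirrors the trade-off the abstract describes.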
And yet, we were modern. The paradoxes of Iberian science after the Grand Narratives.
Pimentel, Juan; Pardo-Tomás, José
2017-06-01
In this article, we try to explain the origin of a disagreement; the sort that often arises when the subject is the history of early modern Spanish science. In the decades between 1970 and 1990, while some historians were trying to include Spain in the grand narrative of the rise of modern science, the very historical category of the Scientific Revolution was beginning to be dismantled. It could be said that Spaniards were boarding the flagship of modern science right before it sank. To understand this décalage it would be helpful to recall the role of the history of science during the years after the Franco dictatorship and Spain's transition to democracy. It was a discipline useful for putting behind us the Black Legend and Spanish exceptionalism.
ASCR Cybersecurity for Scientific Computing Integrity - Research Pathways and Ideas Workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peisert, Sean; Potok, Thomas E.; Jones, Todd
At the request of the U.S. Department of Energy's (DOE) Office of Science (SC) Advanced Scientific Computing Research (ASCR) program office, a workshop was held June 2-3, 2015, in Gaithersburg, MD, to identify potential long term (10 to +20 year) cybersecurity fundamental basic research and development challenges, strategies and roadmap facing future high performance computing (HPC), networks, data centers, and extreme-scale scientific user facilities. This workshop was a follow-on to the workshop held January 7-9, 2015, in Rockville, MD, that examined higher level ideas about scientific computing integrity specific to the mission of the DOE Office of Science. Issues included research computation and simulation that takes place on ASCR computing facilities and networks, as well as network-connected scientific instruments, such as those run by various DOE Office of Science programs. Workshop participants included researchers and operational staff from DOE national laboratories, as well as academic researchers and industry experts. Participants were selected based on the submission of abstracts relating to the topics discussed in the previous workshop report [1] and also from other ASCR reports, including "Abstract Machine Models and Proxy Architectures for Exascale Computing" [27], the DOE "Preliminary Conceptual Design for an Exascale Computing Initiative" [28], and the January 2015 machine learning workshop [29]. The workshop was also attended by several observers from DOE and other government agencies. The workshop was divided into three topic areas: (1) Trustworthy Supercomputing, (2) Extreme-Scale Data, Knowledge, and Analytics for Understanding and Improving Cybersecurity, and (3) Trust within High-end Networking and Data Centers. Participants were divided into three corresponding teams based on the category of their abstracts. The workshop began with a series of talks from the program manager and workshop chair, followed by the leaders for each of the three topics and a representative of each of the four major DOE Office of Science Advanced Scientific Computing Research Facilities: the Argonne Leadership Computing Facility (ALCF), the Energy Sciences Network (ESnet), the National Energy Research Scientific Computing Center (NERSC), and the Oak Ridge Leadership Computing Facility (OLCF). The rest of the workshop consisted of topical breakout discussions and focused writing periods that produced much of this report.
Computer-aided modelling and analysis of PV systems: a comparative study.
Koukouvaos, Charalambos; Kandris, Dionisis; Samarakou, Maria
2014-01-01
Modern scientific advances have enabled remarkable efficacy for photovoltaic systems with regard to the exploitation of solar energy, boosting them into a rapidly growing position among the systems developed for the production of renewable energy. However, in many cases the design, analysis, and control of photovoltaic systems are tasks which are quite complex and thus difficult to carry out. In order to cope with these problems, appropriate software tools have been developed, either as standalone products or as parts of general-purpose software platforms used to model and simulate the generation, transmission, and distribution of solar energy. The utilization of this kind of software tool may be extremely helpful for the successful performance evaluation of energy systems with maximum accuracy and minimum cost in time and effort. The work presented in this paper aims, on a first level, at the performance analysis of various configurations of photovoltaic systems through computer-aided modelling. On a second level, it provides a comparative evaluation of the credibility of two of the most advanced graphical programming environments, namely Simulink and LabVIEW, with regard to their application in photovoltaic systems.
Building Cognition: The Construction of Computational Representations for Scientific Discovery
ERIC Educational Resources Information Center
Chandrasekharan, Sanjay; Nersessian, Nancy J.
2015-01-01
Novel computational representations, such as simulation models of complex systems and video games for scientific discovery (Foldit, EteRNA etc.), are dramatically changing the way discoveries emerge in science and engineering. The cognitive roles played by such computational representations in discovery are not well understood. We present a…
Conversion of HSPF Legacy Model to a Platform-Independent, Open-Source Language
NASA Astrophysics Data System (ADS)
Heaphy, R. T.; Burke, M. P.; Love, J. T.
2015-12-01
Since its initial development over 30 years ago, the Hydrologic Simulation Program - FORTRAN (HSPF) model has been used worldwide to support water quality planning and management. In the United States, HSPF receives widespread endorsement as a regulatory tool at all levels of government and is a core component of the EPA's Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) system, which was developed to support nationwide Total Maximum Daily Load (TMDL) analysis. However, the model's legacy code and data management systems have limited ability to integrate with modern software and hardware and to leverage parallel computing, which has left voids in optimization, pre-, and post-processing tools. Advances in technology and our scientific understanding of environmental processes that have occurred over the last 30 years mandate that upgrades be made to HSPF to allow it to evolve and continue to be a premier tool for water resource planners. This work aims to mitigate the challenges currently facing HSPF through two primary tasks: (1) convert the code to a modern, widely accepted, open-source, high-performance computing (HPC) language; and (2) convert model input and output files to a modern, widely accepted, open-source data model, library, and binary file format. Python was chosen as the new language for the code conversion. It is an interpreted, object-oriented language with dynamic semantics that has become one of the most popular open-source languages. While Python code execution can be slow compared to compiled, statically typed programming languages, such as C and FORTRAN, the integration of Numba (a just-in-time specializing compiler) has allowed this challenge to be overcome. For the legacy model data management conversion, HDF5 was chosen to store the model input and output. The code conversion for HSPF's hydrologic and hydraulic modules has been completed. The converted code has been tested against HSPF's suite of "test" runs and has shown good agreement and similar execution times while using the Numba compiler. Continued verification of the accuracy of the converted code against more complex legacy applications, and improvement of execution times by incorporating an intelligent network change detection tool, is currently underway, and preliminary results will be presented.
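As a hedged illustration of the two conversion ingredients named above (Numba JIT compilation of a time-stepping loop and HDF5 storage of model time series), the following Python sketch uses a toy water-balance routine and invented dataset names; it is not the converted HSPF code itself.

```python
# Minimal sketch: a Numba-compiled time-stepping loop plus HDF5 output.
# The water_balance routine and the dataset paths are hypothetical examples.
import numpy as np
import h5py
from numba import njit

@njit
def water_balance(precip, et, storage0):
    """Toy storage update: s[t] = max(s[t-1] + P[t] - ET[t], 0)."""
    storage = np.empty(precip.size)
    s = storage0
    for t in range(precip.size):
        s = max(s + precip[t] - et[t], 0.0)
        storage[t] = s
    return storage

precip = np.random.rand(8760)        # hypothetical hourly precipitation, one year
et = 0.5 * np.random.rand(8760)      # hypothetical hourly evapotranspiration
storage = water_balance(precip, et, 10.0)

# Store inputs and outputs in a single HDF5 file rather than legacy binary files.
with h5py.File("hspf_demo.h5", "w") as f:
    f.create_dataset("timeseries/precip", data=precip)
    f.create_dataset("timeseries/storage", data=storage)
```

The explicit loop would be slow in pure Python; the @njit decorator compiles it to machine code on first call, which is the mechanism the abstract credits for keeping execution times comparable to the FORTRAN original.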
ERIC Educational Resources Information Center
Prosise, Jeff
This document presents the principles behind modern computer graphics without straying into the arcane languages of mathematics and computer science. Illustrations accompany the clear, step-by-step explanations that describe how computers draw pictures. The 22 chapters of the book are organized into 5 sections. "Part 1: Computer Graphics in…
Synchronous international scientific mobility in the space of affiliations: evidence from Russia.
Markova, Yulia V; Shmatko, Natalia A; Katchanov, Yurij L
2016-01-01
The article presents a survey of Russian researchers' synchronous international scientific mobility as an element of the global scientific labor market. Synchronous international scientific mobility is the simultaneous holding of scientific positions in institutions located in different countries. The study explores bibliometric data from the Web of Science Core Collection and socio-economic indicators for 56 countries. In order to examine international scientific mobility, we use a method of affiliations. The paper introduces a model of synchronous international scientific mobility. It makes it possible to specify a country's involvement in the international division of scientific labor. Synchronous international scientific mobility is a modern form of the international division of labor in science. It encompasses various forms of part-time, temporary and remote employment of scientists. The analysis reveals the distribution of Russian authors in the space of affiliations, and the directions of upward/downward international scientific mobility. The bibliometric characteristics of mobile authors are isomorphic to those of receiver-country authors. The synchronous international scientific mobility of Russian authors is determined by differences in scientific impact between receiver and donor countries.
Listmania. How lists can open up fresh possibilities for research in the history of science.
Delbourgo, James; Müller-Wille, Staffan
2012-12-01
Anthropologists, linguists, cultural historians, and literary scholars have long emphasized the value of examining writing as a material practice and have often invoked the list as a paradigmatic example thereof. This Focus section explores how lists can open up fresh possibilities for research in the history of science. Drawing on examples from the early modern period, the contributors argue that attention to practices of list making reveals important relations between mercantile, administrative, and scientific attempts to organize the contents of the world. Early modern lists projected both spatial and temporal visions of nature: they inventoried objects in the process of exchange and collection; they projected possible trajectories for future endeavor; they publicized the social identities of scientific practitioners; and they became research tools that transformed understandings of the natural order.
Kostagiolas, Petros A; Aggelopoulou, Vasiliki A; Niakas, Dimitris
2011-12-01
Hospital pharmacists need access to high-quality information in order to constantly update their knowledge and improve their skills. In their modern role, they are expected to address three types of challenges: scientific, organizational and administrative, and thus have an increased need for adequate information and library services. This study investigates the information-seeking behaviour of public hospital pharmacists, providing evidence from Greece that could be used to encourage the development of effective hospital information services and to study the links between the information-seeking behaviour of hospital pharmacists and their modern scientific and professional role. An empirical study was conducted between January and February 2010 with the development and distribution of a structured questionnaire. The questionnaire was filled in and returned by 88 public hospital pharmacists from a total of 286 working in all Greek public hospitals, giving a response rate of 31%. The hospital pharmacists in Greece are in search of scientific information and, more particularly, pharmaceutical information (e.g., drug indications, storage, dosage and prices). The Internet and the National Organization of Medicines are their main information sources, while lack of time and of organized information are the main obstacles they face when seeking information. The modern professional role of hospital pharmacists as invaluable contributors to efficient and safer healthcare services may be further supported through the development of specialized libraries and information services within Greek public hospitals. © 2011 The authors. Health Information and Libraries Journal © 2011 Health Libraries Group.
Universal Cosmic Absolute and Modern Science
NASA Astrophysics Data System (ADS)
Kostro, Ludwik
The official sciences, especially the natural sciences, respect in their research the principle of methodic naturalism, i.e. they consider all phenomena as entirely natural and therefore in their scientific explanations never adduce or cite supernatural entities and forces. The purpose of this paper is to show that modern science has its own self-existent, self-acting, and self-sufficient Natural All-in Being or Omni-Being, i.e. the entire Nature as a Whole, that justifies scientific methodic naturalism. Since this Natural All-in Being is one and only, It should be considered the scientifically justified Natural Absolute of science and should be called, in my opinion, the Universal Cosmic Absolute of Modern Science. It will also be shown that the Universal Cosmic Absolute is ontologically enormously stratified and is, in its ultimate, i.e. most fundamental, stratum trans-reistic and trans-personal. This means that in its basic stratum It is neither a Thing nor a Person, although It contains in Itself all things and persons, along with all other sentient and conscious individuals. At the turn of the 20th century science began to look for a theory of everything, a final theory, a master theory. In my opinion the natural Universal Cosmic Absolute will constitute in such a theory the radical, all-penetrating Ultimate Basic Reality and will substitute, step by step, for the traditional supernatural personal Absolute.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hey, Tony; Agarwal, Deborah; Borgman, Christine
The Advanced Scientific Computing Advisory Committee (ASCAC) was charged to form a standing subcommittee to review the Department of Energy’s Office of Scientific and Technical Information (OSTI) and to begin by assessing the quality and effectiveness of OSTI’s recent and current products and services and to comment on its mission and future directions in the rapidly changing environment for scientific publication and data. The Committee met with OSTI staff and reviewed available products, services and other materials. This report summarizes their initial findings and recommendations.
Analysis of Defenses Against Code Reuse Attacks on Modern and New Architectures
2015-09-01
soundness or completeness. An incomplete analysis will produce extra edges in the CFG that might allow an attacker to slip through. An unsound analysis... Analysis of Defenses Against Code Reuse Attacks on Modern and New Architectures, by Isaac Noah Evans. Submitted to the Department of Electrical... Engineering and Computer Science in partial fulfillment of the requirements for the degree of Master of Engineering in Electrical Engineering and Computer
Body metaphors--reading the body in contemporary culture.
Skara, Danica
2004-01-01
This paper addresses the linguistic reframing of the human body in contemporary culture. Our aim is to provide a linguistic description of the ways in which the body is represented in the modern English language. First, we will try to focus on body metaphors in general. We have collected a sample of 300 words and phrases functioning as body metaphors in the modern English language. In reading the symbolism of the body, we are witnessing changes in the basic metaphorical structuring of the human body. The results show that new vocabulary binds different fields of knowledge associated with machines and human beings according to a shared textual frame: the human-as-computer and computer-as-human metaphor. Humans are almost blended with computers and vice versa. This metaphorical use of the human body and its parts reveals not only currents of unconscious thought but also the structures of modern society and culture.
Federal Aviation Administration : challenges in modernizing the agency
DOT National Transportation Integrated Search
2000-02-01
FAA's efforts to implement initiatives in five key areas-air traffic control modernization, procurement and personnel reform, aviation safety, aviation and computer security, and financial management-have met with limited success. For example, FAA ha...
ERIC Educational Resources Information Center
Sliva, William R.
1977-01-01
The size of modern dairy plant operations has led to extreme specialization in product manufacturing, milk processing, microbiological analysis, chemical and mathematical computations. Morrisville Agricultural and Technical College, New York, has modernized its curricula to meet these changes. (HD)
ERIC Educational Resources Information Center
Jacobson, Michael J.; Taylor, Charlotte E.; Richards, Deborah
2016-01-01
In this paper, we propose computational scientific inquiry (CSI) as an innovative model for learning important scientific knowledge and new practices for "doing" science. This approach involves the use of a "game-like" virtual world for students to experience virtual biological fieldwork in conjunction with using an agent-based…
ERIC Educational Resources Information Center
Hulshof, Casper D.; de Jong, Ton
2006-01-01
Students encounter many obstacles during scientific discovery learning with computer-based simulations. It is hypothesized that an effective type of support, that does not interfere with the scientific discovery learning process, should be delivered on a "just-in-time" base. This study explores the effect of facilitating access to…
Student Perceived Importance and Correlations of Selected Computer Literacy Course Topics
ERIC Educational Resources Information Center
Ciampa, Mark
2013-01-01
Traditional college-level courses designed to teach computer literacy are in a state of flux. Today's students have high rates of access to computing technology and computer ownership, leading many policy decision makers to conclude that students already are computer literate and thus computer literacy courses are dinosaurs in a modern digital…
NASA Astrophysics Data System (ADS)
Ding, Yea-Chung
2010-11-01
In recent years national parks worldwide have introduced online virtual tourism, through which potential visitors can search for tourist information. Most virtual tourism websites are a simulation of an existing location, usually composed of panoramic images, a sequence of hyperlinked still or video images, and/or virtual models of the actual location. As opposed to actual tourism, a virtual tour is typically accessed on a personal computer or an interactive kiosk. Using modern Digital Earth techniques such as high-resolution satellite images, precise GPS coordinates and powerful 3D WebGIS, however, it is possible to create more realistic scenic models that present natural terrain and man-made constructions in greater detail. This article explains how to create an online scientific reality tourist guide for the Jinguashi Gold Ecological Park at Jinguashi in northern Taiwan, China. The project uses high-resolution Formosat 2 satellite images and digital aerial images in conjunction with a DTM to create a highly realistic simulation of terrain, with 3DMAX used to add man-made constructions and vegetation. Using this 3D Geodatabase model in conjunction with the INET 3D WebGIS software, we have found that the Digital Earth concept can greatly improve and expand the presentation of traditional online virtual tours on such websites.
Design and implementation of land reservation system
NASA Astrophysics Data System (ADS)
Gao, Yurong; Gao, Qingqiang
2009-10-01
Land reservation is defined as a land management policy for ensuring that the government controls the primary land market. It requires the government to obtain land first, according to plan, by purchase, confiscation and exchange, and then to exploit and consolidate the land for reservation. Under this policy, it is possible for the government to satisfy and manage the needs of land for urban development. The author designs and develops the "Land Reservation System for Eastern Lake Development District" (LRSELDD), which deals with realistic land requirement problems in the Wuhan Eastern Lake Development District. The LRSELDD utilizes modern technologies and solutions from computer science and GIS to process multiple-source data related to land. Based on experiments with the system, this paper first analyzes the workflow of the land reservation system and designs the system structure based on its principles, then illustrates the approach to organizing and managing spatial data, and finally describes the system functions according to the characteristics of land reservation and consolidation. The system is in operation, serving current work in the Eastern Lake Development District. It is able to scientifically manage both current and planned land information, as well as information about land supply. We use the LRSELDD in our routine work, and with such information, decisions on land confiscation and allocation can be made wisely and scientifically.
Computational Ecology and Open Science: Tools to Help Manage Lakes for Cyanobacteria in Lakes
Computational ecology is an interdisciplinary field that takes advantage of modern computation abilities to expand our ecological understanding. As computational ecologists, we use large data sets, which often cover large spatial extents, and advanced statistical/mathematical co...
Modern Electronic Devices: An Increasingly Common Cause of Skin Disorders in Consumers.
Corazza, Monica; Minghetti, Sara; Bertoldi, Alberto Maria; Martina, Emanuela; Virgili, Annarosa; Borghi, Alessandro
2016-01-01
The modern conveniences and enjoyment brought about by electronic devices bring with them some health concerns. In particular, personal electronic devices are responsible for rising cases of several skin disorders, including pressure, friction, contact dermatitis, and other physical dermatitis. The universal use of such devices, either for work or recreational purposes, will probably increase the occurrence of polymorphous skin manifestations over time. It is important for clinicians to consider electronics as potential sources of dermatological ailments, for proper patient management. We performed a literature review on skin disorders associated with the personal use of modern technology, including personal computers and laptops, personal computer accessories, mobile phones, tablets, video games, and consoles.
[On the evolution of scientific thought].
de Micheli, Alfredo; Iturralde Torres, Pedro
2015-01-01
The Nominalists of the XIV century, precursors of modern science, thought that science's object was not the general, vague and indeterminate but the particular, which is real and can be known directly. About the middle of the XVII century the bases of modern science became established, thanks to a revolution fomented essentially by Galileo, Bacon and Descartes. During the XVIII century, parallel to the development of the great current of English Empiricism, a movement of scientific renewal also arose in continental Europe, following the discipline of the Dutch physicians and of Boerhaave. In the XIX century, Claude Bernard dominated scientific medicine, but his rigorous determinism prevented him from taking into account the immense and unforeseeable field of the random. Nowadays, we approach natural science and medicine from particular groups of facts, that is, from the responses of Nature to specific questions, not from general laws. Furthermore, in recent epistemology, the concept has been established that experimental data are not pure facts but rather facts interpreted within a hermeneutical context. Finally, a general tendency to retrieve philosophical questions concerning the understanding of essence and existence can frequently be seen in scientific inquiry. In the light of the evolution of medical thought, it is possible to establish the position of scientific medicine within the movement of ideas dominating our time. Copyright © 2014 Instituto Nacional de Cardiología Ignacio Chávez. Published by Masson Doyma México S.A. All rights reserved.
The Development of Genetics in the Light of Thomas Kuhn's Theory of Scientific Revolutions.
Portin, Petter
2015-01-01
The concept of a paradigm is in the key position in Thomas Kuhn's theory of scientific revolutions. A paradigm is the framework within which the results, concepts, hypotheses and theories of scientific research work are understood. According to Kuhn, a paradigm guides the working and efforts of scientists during the time period which he calls the period of normal science. Before long, however, normal science leads to unexplained matters, a situation that then leads the development of the scientific discipline in question to a paradigm shift--a scientific revolution. When a new theory is born, it has either gradually emerged as an extension of the past theory, or the old theory has become a borderline case in the new theory. In the former case, one can speak of a paradigm extension. According to the present author, the development of modern genetics has, until very recent years, been guided by a single paradigm, the Mendelian paradigm which Gregor Mendel launched 150 years ago, and under the guidance of this paradigm the development of genetics has proceeded in a normal fashion in the spirit of logical positivism. Modern discoveries in genetics have, however, created a situation which seems to be leading toward a paradigm shift. The most significant of these discoveries are the findings of adaptive mutations, the phenomenon of transgenerational epigenetic inheritance, and, above all, the present deeply critical state of the concept of the gene.
Phipps, Eric T.; D'Elia, Marta; Edwards, Harold C.; ...
2017-04-18
In this study, quantifying simulation uncertainties is a critical component of rigorous predictive simulation. A key component of this is forward propagation of uncertainties in simulation input data to output quantities of interest. Typical approaches involve repeated sampling of the simulation over the uncertain input data, and can require numerous samples when accurately propagating uncertainties from large numbers of sources. Often simulation processes from sample to sample are similar and much of the data generated from each sample evaluation could be reused. We explore a new method for implementing sampling methods that simultaneously propagates groups of samples together in an embedded fashion, which we call embedded ensemble propagation. We show how this approach takes advantage of properties of modern computer architectures to improve performance by enabling reuse between samples, reducing memory bandwidth requirements, improving memory access patterns, improving opportunities for fine-grained parallelization, and reducing communication costs. We describe a software technique for implementing embedded ensemble propagation based on the use of C++ templates and describe its integration with various scientific computing libraries within Trilinos. We demonstrate improved performance, portability and scalability for the approach applied to the simulation of partial differential equations on a variety of CPU, GPU, and accelerator architectures, including up to 131,072 cores on a Cray XK7 (Titan).
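The paper's technique is realized with C++ templates inside Trilinos; the sketch below is only a conceptual Python analogue of the core idea, namely an "ensemble scalar" that carries a block of samples through the same arithmetic so that one pass over the solver code evaluates all samples at once. The Ensemble class and the diffusion residual are invented for illustration and are not the authors' implementation.

```python
# Conceptual analogue of embedded ensemble propagation: arithmetic on an
# ensemble type propagates all samples together through unchanged model code.
import numpy as np

class Ensemble:
    """A scalar-like value that actually holds a block of samples."""
    def __init__(self, values):
        self.v = np.asarray(values, dtype=float)

    def _other(self, other):
        return other.v if isinstance(other, Ensemble) else other

    def __add__(self, other):
        return Ensemble(self.v + self._other(other))

    def __mul__(self, other):
        return Ensemble(self.v * self._other(other))

    __radd__ = __add__
    __rmul__ = __mul__

def diffusion_residual(u, kappa, dx=0.1):
    """Residual of a toy 1-D diffusion stencil; kappa may be an Ensemble."""
    return kappa * ((u[0] - 2.0 * u[1] + u[2]) / dx**2)

# Propagate 32 samples of an uncertain diffusivity in a single evaluation.
kappa_samples = Ensemble(np.random.lognormal(mean=0.0, sigma=0.3, size=32))
res = diffusion_residual([1.0, 1.2, 1.5], kappa_samples)
print(res.v[:4])
```

Because every sample reuses the same stencil coefficients and memory accesses, grouping samples this way captures, in miniature, the reuse and memory-access benefits the abstract attributes to the embedded approach.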
Semantic Web Compatible Names and Descriptions for Organisms
NASA Astrophysics Data System (ADS)
Wang, H.; Wilson, N.; McGuinness, D. L.
2012-12-01
Modern scientific names are critical for understanding the biological literature and provide a valuable way to understand evolutionary relationships. To validly publish a name, a description is required to separate the described group of organisms from those described by other names at the same level of the taxonomic hierarchy. The frequent revision of descriptions due to new evolutionary evidence has led to situations where a single given scientific name may over time have multiple descriptions associated with it and a given published description may apply to multiple scientific names. Because of these many-to-many relationships between scientific names and descriptions, the usage of scientific names as a proxy for descriptions is inevitably ambiguous. Another issue lies in the fact that the precise application of scientific names often requires careful microscopic work, or increasingly, genetic sequencing, as scientific names are focused on the evolutionary relatedness between and within named groups such as species, genera, families, etc. This is problematic for many audiences, especially field biologists, who often do not have access to the instruments and tools required to make identifications on a microscopic or genetic basis. To better connect scientific names to descriptions and find a more convenient way to support computer-assisted identification, we proposed the Semantic Vernacular System, a novel naming system that creates named, machine-interpretable descriptions for groups of organisms and is compatible with the Semantic Web. Unlike the evolutionary-relationship-based scientific naming system, it emphasizes the observable features of organisms. By independently naming the descriptions composed of sets of observational features, as well as maintaining connections to scientific names, it preserves the observational data used to identify organisms. The system is designed to support a peer-review mechanism for creating new names, and uses a controlled vocabulary encoded in the Web Ontology Language to represent the observational features. A prototype of the system is currently under development in collaboration with the Mushroom Observer website. It allows users to propose new names and descriptions for fungi, provide feedback on those proposals, and ultimately have them formally approved. It relies on SPARQL queries and semantic reasoning for data management. This effort will offer the mycology community a knowledge base of fungal observational features and a tool for identifying fungal observations. It will also serve as an operational specification of how the Semantic Vernacular System can be used in practice in one scientific community (in this case mycology).
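As a hedged sketch of the general pattern described above (a named description linked to observational features and queried with SPARQL), the following Python/rdflib example uses an invented example.org namespace and made-up feature terms; it does not reproduce the project's actual controlled vocabulary or data model.

```python
# Sketch: a named, machine-interpretable description built from observational
# features, then retrieved with a SPARQL query. The svs namespace and feature
# terms are hypothetical placeholders, not the Semantic Vernacular System's ontology.
from rdflib import Graph, Namespace, Literal, RDF, RDFS

SVS = Namespace("http://example.org/svs#")   # hypothetical vocabulary
g = Graph()
g.bind("svs", SVS)

desc = SVS.WhiteCapYellowPoresBolete
g.add((desc, RDF.type, SVS.VernacularDescription))
g.add((desc, RDFS.label, Literal("White-capped bolete with yellow pores")))
g.add((desc, SVS.hasFeature, SVS.WhiteCap))
g.add((desc, SVS.hasFeature, SVS.YellowPores))
g.add((desc, SVS.linkedScientificName, Literal("Boletus sp.")))

# Retrieve descriptions that include a given observable feature.
query = """
PREFIX svs: <http://example.org/svs#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?desc ?label WHERE {
    ?desc svs:hasFeature svs:YellowPores ;
          rdfs:label ?label .
}
"""
for row in g.query(query):
    print(row.desc, row.label)
```

Keying the query on observable features rather than on a scientific name mirrors the system's central design choice: field observations can be matched to named descriptions even when a microscopic or genetic identification is not possible.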
ERIC Educational Resources Information Center
Fox, Jeffrey L.
1986-01-01
Discusses various topics and issues related to the scientific enterprise in Cuba. Notes that Cuban science is emphasizing biotechnology and research on the island's chief crop (sugarcane), although hampered by limited personnel and lack of modern laboratory equipment. (JN)
ERIC Educational Resources Information Center
Chan, Kit Yu Karen; Yang, Sylvia; Maliska, Max E.; Grunbaum, Daniel
2012-01-01
The National Science Education Standards have highlighted the importance of active learning and reflection for contemporary scientific methods in K-12 classrooms, including the use of models. Computer modeling and visualization are tools that researchers employ in their scientific inquiry process, and often computer models are used in…
Architectural Principles and Experimentation of Distributed High Performance Virtual Clusters
ERIC Educational Resources Information Center
Younge, Andrew J.
2016-01-01
With the advent of virtualization and Infrastructure-as-a-Service (IaaS), the broader scientific computing community is considering the use of clouds for their scientific computing needs. This is due to the relative scalability, ease of use, advanced user environment customization abilities, and the many novel computing paradigms available for…
ERIC Educational Resources Information Center
Tuncer, Murat
2013-01-01
The present research investigates reciprocal relations among computer self-efficacy, scientific research and information literacy self-efficacy. Research findings have demonstrated that, according to standardized regression coefficients, computer self-efficacy has a positive effect on information literacy self-efficacy. Likewise it has been detected…
ERIC Educational Resources Information Center
Hansen, John; Barnett, Michael; MaKinster, James; Keating, Thomas
2004-01-01
The increased availability of computational modeling software has created opportunities for students to engage in scientific inquiry through constructing computer-based models of scientific phenomena. However, despite the growing trend of integrating technology into science curricula, educators need to understand what aspects of these technologies…