Sample records for central scientific computing

  1. Introduction to the LaRC central scientific computing complex

    NASA Technical Reports Server (NTRS)

    Shoosmith, John N.

    1993-01-01

    The computers and associated equipment that make up the Central Scientific Computing Complex of the Langley Research Center are briefly described. The electronic networks that provide access to the various components of the complex and a number of areas that can be used by Langley and contractor staff for special applications (scientific visualization, image processing, software engineering, and grid generation) are also described. Flight simulation facilities that use the central computers are described. Management of the complex, procedures for its use, and available services and resources are discussed. This document is intended for new users of the complex, for current users who wish to be kept apprised of changes, and for visitors who need to understand the role of central scientific computers at Langley.

  2. Scientific Visualization, Seeing the Unseeable

    ScienceCinema

    LBNL

    2017-12-09

    June 24, 2008 Berkeley Lab lecture: Scientific visualization transforms abstract data into readily comprehensible images, provides a vehicle for "seeing the unseeable," and plays a central role in both experimental and computational sciences. Wes Bethel, who heads the Scientific Visualization Group in the Computational Research Division, presents an overview of visualization and computer graphics, current research challenges, and future directions for the field.

  3. Scientific Visualization: The Modern Oscilloscope for "Seeing the Unseeable" (LBNL Summer Lecture Series)

    ScienceCinema

    Bethel, E. Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division and Scientific Visualization Group]

    2018-05-07

    Summer Lecture Series 2008: Scientific visualization transforms abstract data into readily comprehensible images, provides a vehicle for "seeing the unseeable," and plays a central role in both experimental and computational sciences. Wes Bethel, who heads the Scientific Visualization Group in the Computational Research Division, presents an overview of visualization and computer graphics, current research challenges, and future directions for the field.

  4. An Overview of the Computational Physics and Methods Group at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Randal Scott

    CCS Division was formed to strengthen the visibility and impact of computer science and computational physics research on strategic directions for the Laboratory. Both computer science and computational science are now central to scientific discovery and innovation. They have become indispensable tools for all other scientific missions at the Laboratory. CCS Division forms a bridge between external partners and Laboratory programs, bringing new ideas and technologies to bear on today’s important problems and attracting high-quality technical staff members to the Laboratory. The Computational Physics and Methods Group CCS-2 conducts methods research and develops scientific software aimed at the latest and emerging HPC systems.

  5. LaRC local area networks to support distributed computing

    NASA Technical Reports Server (NTRS)

    Riddle, E. P.

    1984-01-01

    The Langley Research Center's (LaRC) Local Area Network (LAN) effort is discussed. LaRC initiated the development of a LAN to support a growing distributed computing environment at the Center. The purpose of the network is to provide an improved capability (over interactive and RJE terminal access) for sharing multivendor computer resources. Specifically, the network will provide a data highway for the transfer of files between mainframe computers, minicomputers, work stations, and personal computers. An important influence on the overall network design was the vital need of LaRC researchers to efficiently utilize the large CDC mainframe computers in the central scientific computing facility. Although there has been a steady migration from a centralized to a distributed computing environment at LaRC in recent years, the workload on the central resources has increased. Major emphasis in the network design was on communication with the central resources within the distributed environment. The network to be implemented will allow researchers to utilize the central resources, distributed minicomputers, work stations, and personal computers to obtain the proper level of computing power to efficiently perform their jobs.

  6. Computer graphics and the graphic artist

    NASA Technical Reports Server (NTRS)

    Taylor, N. L.; Fedors, E. G.; Pinelli, T. E.

    1985-01-01

    A centralized computer graphics system is being developed at the NASA Langley Research Center. This system was required to satisfy multiuser needs, ranging from presentation quality graphics prepared by a graphic artist to 16-mm movie simulations generated by engineers and scientists. While the major thrust of the central graphics system was directed toward engineering and scientific applications, hardware and software capabilities to support the graphic artists were integrated into the design. This paper briefly discusses the importance of computer graphics in research; the central graphics system in terms of systems, software, and hardware requirements; the application of computer graphics to graphic arts, discussed in terms of the requirements for a graphic arts workstation; and the problems encountered in applying computer graphics to the graphic arts. The paper concludes by presenting the status of the central graphics system.

  7. Computer networks for remote laboratories in physics and engineering

    NASA Technical Reports Server (NTRS)

    Starks, Scott; Elizandro, David; Leiner, Barry M.; Wiskerchen, Michael

    1988-01-01

    This paper addresses a relatively new approach to scientific research, telescience, which is the conduct of scientific operations in locations remote from the site of central experimental activity. A testbed based on the concepts of telescience is being developed to ultimately enable scientific researchers on earth to conduct experiments onboard the Space Station. This system along with background materials are discussed.

  8. The philosophy of scientific experimentation: a review

    PubMed Central

    2009-01-01

    Practicing and studying automated experimentation may benefit from philosophical reflection on experimental science in general. This paper reviews the relevant literature and discusses central issues in the philosophy of scientific experimentation. The first two sections present brief accounts of the rise of experimental science and of its philosophical study. The next sections discuss three central issues of scientific experimentation: the scientific and philosophical significance of intervention and production, the relationship between experimental science and technology, and the interactions between experimental and theoretical work. The concluding section identifies three issues for further research: the role of computing and, more specifically, automating, in experimental research, the nature of experimentation in the social and human sciences, and the significance of normative, including ethical, problems in experimental science. PMID:20098589

  9. French Plans for Fifth Generation Computer Systems.

    DTIC Science & Technology

    1984-12-07

    ...centrally managed project in France that covers all the facets of the... Centre National de Recherche Scientifique (CNRS) Cooperative Research... systems, man-computer interaction, novel computer structures, knowledge-based computer systems... French industry in electronics, computers, software, and services and to make the... of Japan's Fifth Generation Project, the French scientific and industrial com-... The National Projects: The French Ministry of Research and...

  10. Massive Data, the Digitization of Science, and Reproducibility of Results

    ScienceCinema

    Stodden, Victoria

    2018-04-27

    As the scientific enterprise becomes increasingly computational and data-driven, the nature of the information communicated must change. Without inclusion of the code and data with published computational results, we are engendering a credibility crisis in science. Controversies such as ClimateGate, the microarray-based drug sensitivity clinical trials under investigation at Duke University, and retractions from prominent journals due to unverified code suggest the need for greater transparency in our computational science. In this talk I argue that the scientific method be restored to (1) a focus on error control as central to scientific communication and (2) complete communication of the underlying methodology producing the results, i.e., reproducibility. I outline barriers to these goals based on recent survey work (Stodden 2010), and suggest solutions such as the “Reproducible Research Standard” (Stodden 2009), giving open licensing options designed to create an intellectual property framework for scientists consonant with longstanding scientific norms.

  11. Bioinformatics and Astrophysics Cluster (BinAc)

    NASA Astrophysics Data System (ADS)

    Krüger, Jens; Lutz, Volker; Bartusch, Felix; Dilling, Werner; Gorska, Anna; Schäfer, Christoph; Walter, Thomas

    2017-09-01

    BinAC provides central high performance computing capacities for bioinformaticians and astrophysicists from the state of Baden-Württemberg. The bwForCluster BinAC is part of the implementation concept for scientific computing for the universities in Baden-Württemberg. Community specific support is offered through the bwHPC-C5 project.

  12. SNS programming environment user's guide

    NASA Technical Reports Server (NTRS)

    Tennille, Geoffrey M.; Howser, Lona M.; Humes, D. Creig; Cronin, Catherine K.; Bowen, John T.; Drozdowski, Joseph M.; Utley, Judith A.; Flynn, Theresa M.; Austin, Brenda A.

    1992-01-01

    The computing environment is briefly described for the Supercomputing Network Subsystem (SNS) of the Central Scientific Computing Complex of NASA Langley. The major SNS computers are a CRAY-2, a CRAY Y-MP, a CONVEX C-210, and a CONVEX C-220. The software is described that is common to all of these computers, including: the UNIX operating system, computer graphics, networking utilities, mass storage, and mathematical libraries. Also described are file management, validation, SNS configuration, documentation, and customer services.

  13. Protocols for Handling Messages Between Simulation Computers

    NASA Technical Reports Server (NTRS)

    Balcerowski, John P.; Dunnam, Milton

    2006-01-01

    Practical Simulator Network (PSimNet) is a set of data-communication protocols designed especially for use in handling messages between computers that are engaging cooperatively in real-time or nearly-real-time training simulations. In a typical application, computers that provide individualized training at widely dispersed locations would communicate, by use of PSimNet, with a central host computer that would provide a common computational- simulation environment and common data. Originally intended for use in supporting interfaces between training computers and computers that simulate the responses of spacecraft scientific payloads, PSimNet could be especially well suited for a variety of other applications -- for example, group automobile-driver training in a classroom. Another potential application might lie in networking of automobile-diagnostic computers at repair facilities to a central computer that would compile the expertise of numerous technicians and engineers and act as an expert consulting technician.

  14. PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deelman, Ewa; Carothers, Christopher; Mandal, Anirban

    Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.

  15. PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows

    DOE PAGES

    Deelman, Ewa; Carothers, Christopher; Mandal, Anirban; ...

    2015-07-14

    Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayer, Vidya M.; Miguez, Sheila; Toby, Brian H.

    Scientists have been central to the historical development of the computer industry, but the importance of software only continues to grow for all areas of scientific research and in particular for powder diffraction. Knowing how to program a computer is a basic and useful skill for scientists. The article introduces the three types of programming languages and why scripting languages are now preferred by scientists. Of them, the authors assert Python is the most useful and easiest to learn. Python is introduced. Also presented is an overview of a few of the many add-on packages available to extend the capabilities of Python, for example, for numerical computations, scientific graphics and graphical user interface programming.
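
    As a hedged illustration of the kind of add-on packages the article surveys, the short sketch below uses NumPy and matplotlib (assumed to be installed) to generate and plot a simple powder-diffraction-like pattern of Gaussian peaks; the peak positions, widths and intensities are invented for illustration and are not taken from the article.

      import numpy as np
      import matplotlib.pyplot as plt

      # Two-theta axis in degrees and a few hypothetical peak positions/intensities.
      two_theta = np.linspace(10, 80, 2000)
      peaks = [(21.5, 1.0), (30.2, 0.6), (44.7, 0.3)]   # (position, relative intensity)
      width = 0.25                                       # Gaussian width in degrees

      # Sum Gaussian peaks on a flat background to mimic a diffraction pattern.
      pattern = 0.02 + sum(h * np.exp(-0.5 * ((two_theta - c) / width) ** 2)
                           for c, h in peaks)

      plt.plot(two_theta, pattern)
      plt.xlabel("2-theta (degrees)")
      plt.ylabel("intensity (arb. units)")
      plt.show()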

  17. Exploiting graphics processing units for computational biology and bioinformatics.

    PubMed

    Payne, Joshua L; Sinnott-Armstrong, Nicholas A; Moore, Jason H

    2010-09-01

    Advances in the video gaming industry have led to the production of low-cost, high-performance graphics processing units (GPUs) that possess more memory bandwidth and computational capability than central processing units (CPUs), the standard workhorses of scientific computing. With the recent release of general-purpose GPUs and NVIDIA's GPU programming language, CUDA, graphics engines are being adopted widely in scientific computing applications, particularly in the fields of computational biology and bioinformatics. The goal of this article is to concisely present an introduction to GPU hardware and programming, aimed at the computational biologist or bioinformaticist. To this end, we discuss the primary differences between GPU and CPU architecture, introduce the basics of the CUDA programming language, and discuss important CUDA programming practices, such as the proper use of coalesced reads, data types, and memory hierarchies. We highlight each of these topics in the context of computing the all-pairs distance between instances in a dataset, a common procedure in numerous disciplines of scientific computing. We conclude with a runtime analysis of the GPU and CPU implementations of the all-pairs distance calculation. We show our final GPU implementation to outperform the CPU implementation by a factor of 1700.
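
    The all-pairs distance computation described above can be written compactly with array broadcasting. The NumPy sketch below is a plain CPU reference implementation; because GPU libraries such as CuPy mirror the NumPy API, swapping the import is one common way to move the same computation onto a GPU. The array sizes and the CuPy substitution are assumptions for illustration, not the authors' CUDA code.

      import numpy as np   # replace with: import cupy as np  (assumption: CuPy installed) to run on a GPU

      def all_pairs_distances(x):
          """Euclidean distance between every pair of rows in an (n, d) array."""
          # (n, 1, d) - (1, n, d) broadcasts to an (n, n, d) array of differences.
          diff = x[:, None, :] - x[None, :, :]
          return np.sqrt((diff ** 2).sum(axis=-1))

      data = np.random.rand(1000, 16)      # 1000 instances with 16 features each
      d = all_pairs_distances(data)
      print(d.shape)                       # (1000, 1000)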

  18. Embracing the quantum limit in silicon computing.

    PubMed

    Morton, John J L; McCamey, Dane R; Eriksson, Mark A; Lyon, Stephen A

    2011-11-16

    Quantum computers hold the promise of massive performance enhancements across a range of applications, from cryptography and databases to revolutionary scientific simulation tools. Such computers would make use of the same quantum mechanical phenomena that pose limitations on the continued shrinking of conventional information processing devices. Many of the key requirements for quantum computing differ markedly from those of conventional computers. However, silicon, which plays a central part in conventional information processing, has many properties that make it a superb platform around which to build a quantum computer.

  19. ANL statement of site strategy for computing workstations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fenske, K.R.; Boxberger, L.M.; Amiot, L.W.

    1991-11-01

    This Statement of Site Strategy describes the procedure at Argonne National Laboratory for defining, acquiring, using, and evaluating scientific and office workstations and related equipment and software in accord with DOE Order 1360.1A (5-30-85), and Laboratory policy. It is Laboratory policy to promote the installation and use of computing workstations to improve productivity and communications for both programmatic and support personnel, to ensure that computing workstation acquisitions meet the expressed need in a cost-effective manner, and to ensure that acquisitions of computing workstations are in accord with Laboratory and DOE policies. The overall computing site strategy at ANL is to develop a hierarchy of integrated computing system resources to address the current and future computing needs of the laboratory. The major system components of this hierarchical strategy are: Supercomputers, Parallel computers, Centralized general purpose computers, Distributed multipurpose minicomputers, and Computing workstations and office automation support systems. Computing workstations include personal computers, scientific and engineering workstations, computer terminals, microcomputers, word processing and office automation electronic workstations, and associated software and peripheral devices costing less than $25,000 per item.

  20. Nurturing a growing field: Computers & Geosciences

    NASA Astrophysics Data System (ADS)

    Mariethoz, Gregoire; Pebesma, Edzer

    2017-10-01

    Computational issues are becoming increasingly critical for virtually all fields of geoscience. This includes the development of improved algorithms and models, strategies for implementing high-performance computing, or the management and visualization of the large datasets provided by an ever-growing number of environmental sensors. Such issues are central to scientific fields as diverse as geological modeling, Earth observation, geophysics or climatology, to name just a few. Related computational advances, across a range of geoscience disciplines, are the core focus of Computers & Geosciences, which is thus a truly multidisciplinary journal.

  1. Profile of central research and application laboratory of Aǧrı İbrahim Çeçen University

    NASA Astrophysics Data System (ADS)

    Türkoǧlu, Emir Alper; Kurt, Murat; Tabay, Dilruba

    2016-04-01

    Aǧrı İbrahim Çeçen University built a central research and application laboratory (CRAL) in the east of Turkey. The CRAL possesses 7 research and analysis laboratories, 12 experts and researchers, 8 standard rooms for guest researchers, a restaurant, a conference hall, a meeting room, a prey room and a computer laboratory. The CRAL aims to foster collaborations among researchers, experts, clinicians and educators in the areas of biotechnology, bioimaging, food safety & quality, and omic sciences such as genomics, proteomics and metallomics. It also intends to develop sustainable solutions in agriculture and animal husbandry, promote public health quality, collect scientific knowledge and keep it for future generations, contribute to the scientific awareness of all strata of society, and provide consulting for small initiatives and industries. It has collaborated with several scientific foundations since 2011.

  2. Hyper-Spectral Synthesis of Active OB Stars Using GLaDoS

    NASA Astrophysics Data System (ADS)

    Hill, N. R.; Townsend, R. H. D.

    2016-11-01

    In recent years there has been considerable interest in using graphics processing units (GPUs) to perform scientific computations that have traditionally been handled by central processing units (CPUs). However, there is one area where the scientific potential of GPUs has been overlooked - computer graphics, the task they were originally designed for. Here we introduce GLaDoS, a hyper-spectral code which leverages the graphics capabilities of GPUs to synthesize spatially and spectrally resolved images of complex stellar systems. We demonstrate how GLaDoS can be applied to calculate observables for various classes of stars including systems with inhomogeneous surface temperatures and contact binaries.

  3. How to Excel at Research Computing in Times of Diminishing Resources, Growing Demand, and Expanding Possibilities. A Report on the ECAR Annual Meeting. Research Bulletin

    ERIC Educational Resources Information Center

    Grajek, Susan

    2014-01-01

    For three days in January 2014, more than a hundred thought leaders met in Tempe, Arizona, to discuss the present and future challenges and opportunities for IT's support of research. Recommendations to improve institutions' support of scientific and humanities research included shaping central IT's role as an aggregator; ensuring that central IT…

  4. I/O-Efficient Scientific Computation Using TPIE

    NASA Technical Reports Server (NTRS)

    Vengroff, Darren Erik; Vitter, Jeffrey Scott

    1996-01-01

    In recent years, input/output (I/O)-efficient algorithms for a wide variety of problems have appeared in the literature. However, systems specifically designed to assist programmers in implementing such algorithms have remained scarce. TPIE is a system designed to support I/O-efficient paradigms for problems from a variety of domains, including computational geometry, graph algorithms, and scientific computation. The TPIE interface frees programmers from having to deal not only with explicit read and write calls, but also the complex memory management that must be performed for I/O-efficient computation. In this paper we discuss applications of TPIE to problems in scientific computation. We discuss algorithmic issues underlying the design and implementation of the relevant components of TPIE and present performance results of programs written to solve a series of benchmark problems using our current TPIE prototype. Some of the benchmarks we present are based on the NAS parallel benchmarks while others are of our own creation. We demonstrate that the central processing unit (CPU) overhead required to manage I/O is small and that even with just a single disk, the I/O overhead of I/O-efficient computation ranges from negligible to the same order of magnitude as CPU time. We conjecture that if we use a number of disks in parallel this overhead can be all but eliminated.
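
    TPIE itself is a C++ system, but the underlying I/O-efficient idea, streaming data through memory in blocks rather than loading it whole, can be sketched in a few lines of Python. The block size and the running-sum computation below are illustrative assumptions, not part of the TPIE interface.

      import numpy as np

      def blocked_sum(path, block_elems=1_000_000):
          """Sum a large binary file of float64 values while holding only one block in memory."""
          total, count = 0.0, 0
          with open(path, "rb") as f:
              while True:
                  block = np.fromfile(f, dtype=np.float64, count=block_elems)
                  if block.size == 0:
                      break
                  total += block.sum()
                  count += block.size
          return total, count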

  5. Image processing mini manual

    NASA Technical Reports Server (NTRS)

    Matthews, Christine G.; Posenau, Mary-Anne; Leonard, Desiree M.; Avis, Elizabeth L.; Debure, Kelly R.; Stacy, Kathryn; Vonofenheim, Bill

    1992-01-01

    The intent is to provide an introduction to the image processing capabilities available at the Langley Research Center (LaRC) Central Scientific Computing Complex (CSCC). Various image processing software components are described. Information is given concerning the use of these components in the Data Visualization and Animation Laboratory at LaRC.

  6. Annual ADP planning document

    NASA Technical Reports Server (NTRS)

    Mogilevsky, M.

    1973-01-01

    The Category A computer systems at KSC (A1 and A2) which perform scientific and business/administrative operations are described. This data division is responsible for scientific requirements supporting Saturn, Atlas/Centaur, Titan/Centaur, Titan III, and Delta vehicles, and includes real-time functions, Apollo-Soyuz Test Project (ASTP), and the Space Shuttle. The work is performed chiefly on the GEL-635 (A1) system located in the Central Instrumentation Facility (CIF). The A1 system can perform computations and process data in three modes: (1) real-time critical mode; (2) real-time batch mode; and (3) batch mode. The Division's IBM-360/50 (A2) system, also at the CIF, performs business/administrative data processing such as personnel, procurement, reliability, financial management and payroll, real-time inventory management, GSE accounting, preventive maintenance, and integrated launch vehicle modification status.

  7. Learning to Think Spatially in an Undergraduate Interdisciplinary Computational Design Context: A Case Study

    ERIC Educational Resources Information Center

    Ben Youssef, Belgacem; Berry, Barbara

    2012-01-01

    Spatial thinking skills are vital for success in everyday living and work, not to mention the centrality of spatial reasoning in scientific discoveries, design-based disciplines, medicine, geosciences and mathematics to name a few. This case study describes a course in spatial thinking and communicating designed and delivered by an…

  8. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform.

    PubMed

    Moutsatsos, Ioannis K; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J; Jenkins, Jeremy L; Holway, Nicholas; Tallarico, John; Parker, Christian N

    2017-03-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an "off-the-shelf," open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community.

  9. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform

    PubMed Central

    Moutsatsos, Ioannis K.; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J.; Jenkins, Jeremy L.; Holway, Nicholas; Tallarico, John; Parker, Christian N.

    2016-01-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an “off-the-shelf,” open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community. PMID:27899692

  10. SciSpark's SRDD : A Scientific Resilient Distributed Dataset for Multidimensional Data

    NASA Astrophysics Data System (ADS)

    Palamuttam, R. S.; Wilson, B. D.; Mogrovejo, R. M.; Whitehall, K. D.; Mattmann, C. A.; McGibbney, L. J.; Ramirez, P.

    2015-12-01

    Remote sensing data and climate model output are multi-dimensional arrays of massive sizes locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF) making it difficult to perform multi-stage, iterative science processing since each stage requires writing and reading data to and from disk. We have developed SciSpark, a robust Big Data framework, that extends Apache™ Spark for scaling scientific computations. Apache Spark improves the map-reduce implementation in Apache™ Hadoop for parallel computing on a cluster, by emphasizing in-memory computation, "spilling" to disk only as needed, and relying on lazy evaluation. Central to Spark is the Resilient Distributed Dataset (RDD), an in-memory distributed data structure that extends the functional paradigm provided by the Scala programming language. However, RDDs are ideal for tabular or unstructured data, and not for highly dimensional data. The SciSpark project introduces the Scientific Resilient Distributed Dataset (sRDD), a distributed-computing array structure which supports iterative scientific algorithms for multidimensional data. SciSpark processes data stored in NetCDF and HDF files by partitioning them across time or space and distributing the partitions among a cluster of compute nodes. We show usability and extensibility of SciSpark by implementing distributed algorithms for geospatial operations on large collections of multi-dimensional grids. In particular we address the problem of scaling an automated method for finding Mesoscale Convective Complexes. SciSpark provides a tensor interface to support the pluggability of different matrix libraries. We evaluate performance of the various matrix libraries in distributed pipelines, such as Nd4j™ and Breeze™. We detail the architecture and design of SciSpark, our efforts to integrate climate science algorithms, parallel ingest and partitioning (sharding) of A-Train satellite observations from model grids. These solutions are encompassed in SciSpark, an open-source software framework for distributed computing on scientific data.
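
    The partitioning strategy described for sRDDs, splitting a multidimensional dataset along its time axis and handing each slice to a worker, can be sketched without Spark using Python's standard multiprocessing module. This is a plain-Python analogue under assumed array shapes, not the SciSpark API.

      import numpy as np
      from multiprocessing import Pool

      def per_timestep_stat(slice_2d):
          # Stand-in for a geospatial operation applied to one time step (e.g. a grid mean).
          return float(slice_2d.mean())

      if __name__ == "__main__":
          # Hypothetical (time, lat, lon) grid; real data would come from NetCDF/HDF files.
          data = np.random.rand(48, 180, 360)
          with Pool() as pool:
              results = pool.map(per_timestep_stat, [data[t] for t in range(data.shape[0])])
          print(len(results), "time steps processed")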

  11. Building A Community Focused Data and Modeling Collaborative platform with Hardware Virtualization Technology

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Wang, W.; Melton, F. S.; Votava, P.; Milesi, C.; Hashimoto, H.; Nemani, R. R.; Hiatt, S. H.

    2009-12-01

    As the length and diversity of the global earth observation data records grow, modeling and analyses of biospheric conditions increasingly requires multiple terabytes of data from a diversity of models and sensors. With network bandwidth beginning to flatten, transmission of these data from centralized data archives presents an increasing challenge, and costs associated with local storage and management of data and compute resources are often significant for individual research and application development efforts. Sharing community-valued intermediary data sets, results and codes from individual efforts with others that are not in direct funded collaboration can also be a challenge with respect to time, cost and expertise. We propose a modeling, data and knowledge center that houses NASA satellite data, climate data and ancillary data where a focused community may come together to share modeling and analysis codes, scientific results, knowledge and expertise on a centralized platform, named Ecosystem Modeling Center (EMC). With the recent development of new technologies for secure hardware virtualization, an opportunity exists to create specific modeling, analysis and compute environments that are customizable, “archivable” and transferable. Allowing users to instantiate such environments on large compute infrastructures that are directly connected to large data archives may significantly reduce costs and time associated with scientific efforts by alleviating users from redundantly retrieving and integrating data sets and building modeling analysis codes. The EMC platform also makes it possible for users to receive indirect assistance from experts through prefabricated compute environments, potentially reducing study “ramp up” times.

  12. CRYSNET manual. Informal report. [Hardware and software of crystallographic computing network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None,

    1976-07-01

    This manual describes the hardware and software which together make up the crystallographic computing network (CRYSNET). The manual is intended as a users' guide and also provides general information for persons without any experience with the system. CRYSNET is a network of intelligent remote graphics terminals that are used to communicate with the CDC Cyber 70/76 computing system at the Brookhaven National Laboratory (BNL) Central Scientific Computing Facility. Terminals are in active use by four research groups in the field of crystallography. A protein data bank has been established at BNL to store in machine-readable form atomic coordinates and other crystallographic data for macromolecules. The bank currently includes data for more than 20 proteins. This structural information can be accessed at BNL directly by the CRYSNET graphics terminals. More than two years of experience has been accumulated with CRYSNET. During this period, it has been demonstrated that the terminals, which provide access to a large, fast third-generation computer, plus stand-alone interactive graphics capability, are useful for computations in crystallography, and in a variety of other applications as well. The terminal hardware, the actual operations of the terminals, and the operations of the BNL Central Facility are described in some detail, and documentation of the terminal and central-site software is given. (RWR)

  13. CONVEX mini manual

    NASA Technical Reports Server (NTRS)

    Tennille, Geoffrey M.; Howser, Lona M.

    1993-01-01

    The use of the CONVEX computers that are an integral part of the Supercomputing Network Subsystems (SNS) of the Central Scientific Computing Complex of LaRC is briefly described. Features of the CONVEX computers that are significantly different than the CRAY supercomputers are covered, including: FORTRAN, C, architecture of the CONVEX computers, the CONVEX environment, batch job submittal, debugging, performance analysis, utilities unique to CONVEX, and documentation. This revision reflects the addition of the Applications Compiler and X-based debugger, CXdb. The document is intended for all CONVEX users as a ready reference to frequently asked questions and to more detailed information contained with the vendor manuals. It is appropriate for both the novice and the experienced user.

  14. Disciplines, models, and computers: the path to computational quantum chemistry.

    PubMed

    Lenhard, Johannes

    2014-12-01

    Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architecture. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion and this market is much bigger than the number of quantum theory experts. These claims will be substantiated by an investigation of the so-called density functional theory (DFT), the arguably pivotal theory in the turn to computational quantum chemistry around 1990.

  15. A simple centrality index for scientific social recognition

    NASA Astrophysics Data System (ADS)

    Kinouchi, Osame; Soares, Leonardo D. H.; Cardoso, George C.

    2018-02-01

    We introduce a new centrality index for bipartite networks of papers and authors that we call K-index. The K-index grows with the citation performance of the papers that cite a given researcher and can be seen as a measure of scientific social recognition. Indeed, the K-index measures the number of hubs, defined in a self-consistent way in the bipartite network, that cite a given author. We show that the K-index can be computed by simple inspection of the Web of Science platform and presents several advantages over other centrality indexes, in particular Hirsch h-index. The K-index is robust to self-citations, is not limited by the total number of papers published by a researcher as occurs for the h-index and can distinguish in a consistent way researchers that have the same h-index but very different scientific social recognition. The K-index easily detects a known case of a researcher with inflated number of papers, citations and h-index due to scientific misconduct. Finally, we show that, in a sample of twenty-eight physics Nobel laureates and twenty-eight highly cited non-Nobel-laureate physicists, the K-index correlates better to the achievement of the prize than the number of papers, citations, citations per paper, citing articles or the h-index. Clustering researchers in a K versus h plot reveals interesting outliers that suggest that these two indexes can present complementary independent information.
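
    Under one common reading of the definition above, a researcher's K-index is the largest K such that at least K of the articles citing the researcher have themselves received at least K citations, which mirrors how the h-index is computed but over the citing articles rather than the researcher's own papers. The sketch below follows that assumed reading; it is not code from the paper.

      def k_index(citing_article_citations):
          """citing_article_citations: citation counts of the articles that cite a given researcher."""
          counts = sorted(citing_article_citations, reverse=True)
          k = 0
          for i, c in enumerate(counts, start=1):
              if c >= i:
                  k = i          # at least i citing articles have >= i citations each
              else:
                  break
          return k

      print(k_index([120, 45, 30, 8, 2]))   # -> 4 with these hypothetical counts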

  16. Scientific Computing Strategic Plan for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiting, Eric Todd

    Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory’s (INL’s) challenge and charge, and is central to INL’s ongoing success. Computing is an essential part of INL’s future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities, will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.

  17. Job Superscheduler Architecture and Performance in Computational Grid Environments

    NASA Technical Reports Server (NTRS)

    Shan, Hongzhang; Oliker, Leonid; Biswas, Rupak

    2003-01-01

    Computational grids hold great promise in utilizing geographically separated heterogeneous resources to solve large-scale complex scientific problems. However, a number of major technical hurdles, including distributed resource management and effective job scheduling, stand in the way of realizing these gains. In this paper, we propose a novel grid superscheduler architecture and three distributed job migration algorithms. We also model the critical interaction between the superscheduler and autonomous local schedulers. Extensive performance comparisons with ideal, central, and local schemes using real workloads from leading computational centers are conducted in a simulation environment. Additionally, synthetic workloads are used to perform a detailed sensitivity analysis of our superscheduler. Several key metrics demonstrate that substantial performance gains can be achieved via smart superscheduling in distributed computational grids.

  18. A cross-disciplinary introduction to quantum annealing-based algorithms

    NASA Astrophysics Data System (ADS)

    Venegas-Andraca, Salvador E.; Cruz-Santos, William; McGeoch, Catherine; Lanzagorta, Marco

    2018-04-01

    A central goal in quantum computing is the development of quantum hardware and quantum algorithms in order to analyse challenging scientific and engineering problems. Research in quantum computation involves contributions from both physics and computer science; hence this article presents a concise introduction to basic concepts from both fields that are used in annealing-based quantum computation, an alternative to the more familiar quantum gate model. We introduce some concepts from computer science required to define difficult computational problems and to realise the potential relevance of quantum algorithms to find novel solutions to those problems. We introduce the structure of quantum annealing-based algorithms as well as two examples of this kind of algorithms for solving instances of the max-SAT and Minimum Multicut problems. An overview of the quantum annealing systems manufactured by D-Wave Systems is also presented.
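
    Annealing-based algorithms of the kind introduced in the article work by recasting a problem as the minimization of a quadratic energy function over binary variables (a QUBO). The tiny sketch below builds such an energy function for an illustrative two-variable instance and minimizes it by exhaustive search; the specific coefficients are invented for illustration, and a real annealer such as a D-Wave system would be sampled rather than enumerated.

      from itertools import product

      # Hypothetical QUBO: E(x) = sum over (i, j) of Q[(i, j)] * x_i * x_j, with x_i in {0, 1}.
      Q = {(0, 0): -1.0, (1, 1): -1.0, (0, 1): 2.0}   # penalizes setting both variables to 1

      def energy(x, Q):
          return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

      best = min(product([0, 1], repeat=2), key=lambda x: energy(x, Q))
      print(best, energy(best, Q))   # either (0, 1) or (1, 0), energy -1.0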

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Türkoğlu, Emir Alper, E-mail: eaturkoglu@yandex.com; Ağrı İbrahim Çeçen University, Central Research and Application Laboratory, Ağrı; Kurt, Murat, E-mail: muratkurt60@hotmail.com

    Ağrı İbrahim Çeçen University built a central research and application laboratory (CRAL) in the east of Turkey. The CRAL possesses 7 research and analysis laboratories, 12 experts and researchers, 8 standard rooms for guest researchers, a restaurant, a conference hall, a meeting room, a prey room and a computer laboratory. The CRAL aims to foster collaborations among researchers, experts, clinicians and educators in the areas of biotechnology, bioimaging, food safety & quality, and omic sciences such as genomics, proteomics and metallomics. It also intends to develop sustainable solutions in agriculture and animal husbandry, promote public health quality, collect scientific knowledge and keep it for future generations, contribute to the scientific awareness of all strata of society, and provide consulting for small initiatives and industries. It has collaborated with several scientific foundations since 2011.

  20. SKYSHINE-II procedure: calculation of the effects of structure design on neutron, primary gamma-ray and secondary gamma-ray dose rates in air. Supplement number 1. Technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lampley, C.M.

    1981-01-01

    This report describes many of the computational methods employed within the SKYSHINE-II program. A brief description of the new data base is included, as is a description of the input data requirements and formats needed to properly execute a SKYSHINE-II problem. Utilization instructions for the program are provided for operation of the SKYSHINE-II Code on the Brookhaven National Laboratory Central Scientific Computing Facility (See NUREG/CR-0781, RRA-T7901 for complete information).

  1. Molecular dynamics simulations through GPU video games technologies

    PubMed Central

    Loukatou, Styliani; Papageorgiou, Louis; Fakourelis, Paraskevas; Filntisi, Arianna; Polychronidou, Eleftheria; Bassis, Ioannis; Megalooikonomou, Vasileios; Makałowski, Wojciech; Vlachakis, Dimitrios; Kossida, Sophia

    2016-01-01

    Bioinformatics is the scientific field that focuses on the application of computer technology to the management of biological information. Over the years, bioinformatics applications have been used to store, process and integrate biological and genetic information, using a wide range of methodologies. One of the most de novo techniques used to understand the physical movements of atoms and molecules is molecular dynamics (MD). MD is an in silico method to simulate the physical motions of atoms and molecules under certain conditions. This has become a strategic technique and now plays a key role in many areas of exact sciences, such as chemistry, biology, physics and medicine. Due to their complexity, MD calculations could require enormous amounts of computer memory and time and therefore their execution has been a big problem. Despite the huge computational cost, molecular dynamics have been implemented using traditional computers with a central processing unit (CPU). A graphics processing unit (GPU) computing technology was first designed with the goal to improve video games, by rapidly creating and displaying images in a frame buffer such as screens. The hybrid GPU-CPU implementation, combined with parallel computing is a novel technology to perform a wide range of calculations. GPUs have been proposed and used to accelerate many scientific computations including MD simulations. Herein, we describe the new methodologies developed initially as video games and how they are now applied in MD simulations. PMID:27525251
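
    To make the kind of calculation being accelerated concrete, the sketch below advances a toy set of particles through a few velocity-Verlet steps under a simple harmonic force. It is a minimal CPU illustration of an MD integration loop with invented parameters, not one of the GPU codes discussed in the review.

      import numpy as np

      n, dt, k, steps = 64, 0.01, 1.0, 100          # particles, time step, spring constant, steps
      pos = np.random.rand(n, 3) - 0.5              # initial positions
      vel = np.zeros((n, 3))                        # initial velocities (unit masses assumed)

      def force(p):
          return -k * p                             # harmonic restoring force toward the origin

      f = force(pos)
      for _ in range(steps):                        # velocity-Verlet integration
          pos += vel * dt + 0.5 * f * dt ** 2
          f_new = force(pos)
          vel += 0.5 * (f + f_new) * dt
          f = f_new

      print("mean kinetic energy:", 0.5 * (vel ** 2).sum() / n)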

  2. JPRS Report, Science & Technology, USSR: Computers

    DTIC Science & Technology

    1987-09-29

    Reliability of Protected Systems (L.S. Stoykova, O.A. Yushchenko; KIBERNETIKA, No 5, Sep-Oct 86)... Decision Making Based on Analysis of a Decision... published by the Central Scientific Research Institute for Information and Technoeconomic Research on Material and Technical Supply (TsNIITEIMS) of the... was said becomes clear after a subconscious analysis of the context. We have built our device according to the same pattern. In contrast to its...

  3. Magnetic tape user guide

    NASA Technical Reports Server (NTRS)

    Evans, A. B.; Lee, L. L.

    1985-01-01

    This User Guide provides a general introduction to the structure, use, and handling of magnetic tapes at Langley Research Center (LaRC). The topics covered are tape terminology, physical characteristics, error prevention and detection, and creating, using, and maintaining tapes. Supplementary documentation is referenced where it might be helpful. The documentation is included for the tape utility programs, BLOCK, UNBLOCK, and TAPEDMP, which are available at the Central Scientific Computing Complex at LaRC.

  4. AstroGrid-D: Grid technology for astronomical science

    NASA Astrophysics Data System (ADS)

    Enke, Harry; Steinmetz, Matthias; Adorf, Hans-Martin; Beck-Ratzka, Alexander; Breitling, Frank; Brüsemeister, Thomas; Carlson, Arthur; Ensslin, Torsten; Högqvist, Mikael; Nickelt, Iliya; Radke, Thomas; Reinefeld, Alexander; Reiser, Angelika; Scholl, Tobias; Spurzem, Rainer; Steinacker, Jürgen; Voges, Wolfgang; Wambsganß, Joachim; White, Steve

    2011-02-01

    We present status and results of AstroGrid-D, a joint effort of astrophysicists and computer scientists to employ grid technology for scientific applications. AstroGrid-D provides access to a network of distributed machines with a set of commands as well as software interfaces. It allows simple use of compute and storage facilities and the scheduling and monitoring of compute tasks and data management. It is based on the Globus Toolkit middleware (GT4). Chapter 1 describes the context which led to the demand for advanced software solutions in Astrophysics, and we state the goals of the project. We then present characteristic astrophysical applications that have been implemented on AstroGrid-D in chapter 2. We describe simulations of different complexity, compute-intensive calculations running on multiple sites (Section 2.1), and advanced applications for specific scientific purposes (Section 2.2), such as a connection to robotic telescopes (Section 2.2.3). We can show from these examples how grid execution improves, e.g., the scientific workflow. Chapter 3 explains the software tools and services that we adapted or newly developed. Section 3.1 is focused on the administrative aspects of the infrastructure, to manage users and monitor activity. Section 3.2 characterises the central components of our architecture: The AstroGrid-D information service to collect and store metadata, a file management system, the data management system, and a job manager for automatic submission of compute tasks. We summarise the successfully established infrastructure in chapter 4, concluding with our future plans to establish AstroGrid-D as a platform of modern e-Astronomy.

  5. Mathematical and Computational Challenges in Population Biology and Ecosystems Science

    NASA Technical Reports Server (NTRS)

    Levin, Simon A.; Grenfell, Bryan; Hastings, Alan; Perelson, Alan S.

    1997-01-01

    Mathematical and computational approaches provide powerful tools in the study of problems in population biology and ecosystems science. The subject has a rich history intertwined with the development of statistics and dynamical systems theory, but recent analytical advances, coupled with the enhanced potential of high-speed computation, have opened up new vistas and presented new challenges. Key challenges involve ways to deal with the collective dynamics of heterogeneous ensembles of individuals, and to scale from small spatial regions to large ones. The central issues-understanding how detail at one scale makes its signature felt at other scales, and how to relate phenomena across scales-cut across scientific disciplines and go to the heart of algorithmic development of approaches to high-speed computation. Examples are given from ecology, genetics, epidemiology, and immunology.

  6. Significantly reducing the processing times of high-speed photometry data sets using a distributed computing model

    NASA Astrophysics Data System (ADS)

    Doyle, Paul; Mtenzi, Fred; Smith, Niall; Collins, Adrian; O'Shea, Brendan

    2012-09-01

    The scientific community is in the midst of a data analysis crisis. The increasing capacity of scientific CCD instrumentation and their falling costs is contributing to an explosive generation of raw photometric data. This data must go through a process of cleaning and reduction before it can be used for high precision photometric analysis. Many existing data processing pipelines either assume a relatively small dataset or are batch processed by a High Performance Computing centre. A radical overhaul of these processing pipelines is required to allow reduction and cleaning rates to process terabyte sized datasets at near capture rates using an elastic processing architecture. The ability to access computing resources and to allow them to grow and shrink as demand fluctuates is essential, as is exploiting the parallel nature of the datasets. A distributed data processing pipeline is required. It should incorporate lossless data compression, allow for data segmentation and support processing of data segments in parallel. Academic institutes can collaborate and provide an elastic computing model without the requirement for large centralized high performance computing data centers. This paper demonstrates how a base 10 order of magnitude improvement in overall processing time has been achieved using the "ACN pipeline", a distributed pipeline spanning multiple academic institutes.

  7. Use of a secure Internet Web site for collaborative medical research.

    PubMed

    Marshall, W W; Haley, R W

    2000-10-11

    Researchers who collaborate on clinical research studies from diffuse locations need a convenient, inexpensive, secure way to record and manage data. The Internet, with its World Wide Web, provides a vast network that enables researchers with diverse types of computers and operating systems anywhere in the world to log data through a common interface. Development of a Web site for scientific data collection can be organized into 10 steps, including planning the scientific database, choosing a database management software system, setting up database tables for each collaborator's variables, developing the Web site's screen layout, choosing a middleware software system to tie the database software to the Web site interface, embedding data editing and calculation routines, setting up the database on the central server computer, obtaining a unique Internet address and name for the Web site, applying security measures to the site, and training staff who enter data. Ensuring the security of an Internet database requires limiting the number of people who have access to the server, setting up the server on a stand-alone computer, requiring user-name and password authentication for server and Web site access, installing a firewall computer to prevent break-ins and block bogus information from reaching the server, verifying the identity of the server and client computers with certification from a certificate authority, encrypting information sent between server and client computers to avoid eavesdropping, establishing audit trails to record all accesses into the Web site, and educating Web site users about security techniques. When these measures are carefully undertaken, in our experience, information for scientific studies can be collected and maintained on Internet databases more efficiently and securely than through conventional systems of paper records protected by filing cabinets and locked doors. JAMA. 2000;284:1843-1849.
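
    Two of the security measures listed above, password authentication and an audit trail, can be illustrated with Python's standard library alone. The sketch below shows salted password hashing and a simple append-only access log; the iteration count, file name and log format are illustrative assumptions, not the system described in the article.

      import hashlib, hmac, secrets, time

      def hash_password(password, salt=None, iterations=200_000):
          salt = salt or secrets.token_bytes(16)
          digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
          return salt, digest

      def verify_password(password, salt, digest, iterations=200_000):
          candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
          return hmac.compare_digest(candidate, digest)

      def audit(user, action, logfile="access.log"):
          with open(logfile, "a") as f:                 # append-only audit trail
              f.write(f"{time.strftime('%Y-%m-%dT%H:%M:%S')}\t{user}\t{action}\n")

      salt, digest = hash_password("correct horse battery staple")
      print(verify_password("correct horse battery staple", salt, digest))   # True
      audit("collaborator1", "login")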

  8. Using text analysis to quantify the similarity and evolution of scientific disciplines

    PubMed Central

    Dias, Laércio; Scharloth, Joachim

    2018-01-01

    We use an information-theoretic measure of linguistic similarity to investigate the organization and evolution of scientific fields. An analysis of almost 20 M papers from the past three decades reveals that the linguistic similarity is related but different from experts and citation-based classifications, leading to an improved view on the organization of science. A temporal analysis of the similarity of fields shows that some fields (e.g. computer science) are becoming increasingly central, but that on average the similarity between pairs of disciplines has not changed in the last decades. This suggests that tendencies of convergence (e.g. multi-disciplinarity) and divergence (e.g. specialization) of disciplines are in balance. PMID:29410857
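
    A hedged sketch of the kind of information-theoretic comparison described above: the function below computes the Jensen-Shannon divergence between the word-frequency distributions of two bodies of text. The toy corpora and the plain (unweighted) JSD are assumptions for illustration; the paper uses a generalized variant on a much larger corpus.

      import math
      from collections import Counter

      def distribution(text):
          counts = Counter(text.lower().split())
          total = sum(counts.values())
          return {w: c / total for w, c in counts.items()}

      def jensen_shannon(p, q):
          words = set(p) | set(q)
          m = {w: 0.5 * (p.get(w, 0.0) + q.get(w, 0.0)) for w in words}
          def kl(a, b):
              return sum(a[w] * math.log2(a[w] / b[w]) for w in a if a[w] > 0)
          return 0.5 * kl(p, m) + 0.5 * kl(q, m)

      field_a = distribution("graph algorithm complexity proof algorithm")
      field_b = distribution("protein cell sequencing algorithm cell")
      print(round(jensen_shannon(field_a, field_b), 3))   # smaller values mean more similar vocabularies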

  9. Using text analysis to quantify the similarity and evolution of scientific disciplines.

    PubMed

    Dias, Laércio; Gerlach, Martin; Scharloth, Joachim; Altmann, Eduardo G

    2018-01-01

    We use an information-theoretic measure of linguistic similarity to investigate the organization and evolution of scientific fields. An analysis of almost 20 M papers from the past three decades reveals that linguistic similarity is related to, but distinct from, expert- and citation-based classifications, leading to an improved view on the organization of science. A temporal analysis of the similarity of fields shows that some fields (e.g. computer science) are becoming increasingly central, but that on average the similarity between pairs of disciplines has not changed in the last decades. This suggests that tendencies of convergence (e.g. multi-disciplinarity) and divergence (e.g. specialization) of disciplines are in balance.
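
    The abstract does not spell out the similarity measure; one standard information-theoretic choice is the Jensen-Shannon divergence between word-frequency distributions, sketched below on toy data.

```python
# Sketch: quantify linguistic similarity between two fields as 1 minus the
# Jensen-Shannon divergence of their word-frequency distributions. The
# specific measure used by the paper may differ; this is one standard choice.
from collections import Counter
from math import log2

def word_dist(texts):
    counts = Counter(w for t in texts for w in t.lower().split())
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def jsd(p, q):
    """Jensen-Shannon divergence (base 2), bounded in [0, 1]."""
    words = set(p) | set(q)
    m = {w: 0.5 * (p.get(w, 0) + q.get(w, 0)) for w in words}
    def kl(a, b):
        return sum(pa * log2(pa / b[w]) for w, pa in a.items() if pa > 0)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

field_a = word_dist(["parallel scientific computing on clusters"])
field_b = word_dist(["distributed computing for scientific data"])
print("similarity:", 1 - jsd(field_a, field_b))
```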

  10. Assessing Scientific Practices Using Machine-Learning Methods: How Closely Do They Match Clinical Interview Performance?

    NASA Astrophysics Data System (ADS)

    Beggrow, Elizabeth P.; Ha, Minsu; Nehm, Ross H.; Pearl, Dennis; Boone, William J.

    2014-02-01

    The landscape of science education is being transformed by the new Framework for Science Education (National Research Council, A framework for K-12 science education: practices, crosscutting concepts, and core ideas. The National Academies Press, Washington, DC, 2012), which emphasizes the centrality of scientific practices—such as explanation, argumentation, and communication—in science teaching, learning, and assessment. A major challenge facing the field of science education is developing assessment tools that are capable of validly and efficiently evaluating these practices. Our study examined the efficacy of a free, open-source machine-learning tool for evaluating the quality of students' written explanations of the causes of evolutionary change relative to three other approaches: (1) human-scored written explanations, (2) a multiple-choice test, and (3) clinical oral interviews. A large sample of undergraduates (n = 104) exposed to varying amounts of evolution content completed all three assessments: a clinical oral interview, a written open-response assessment, and a multiple-choice test. Rasch analysis was used to compute linear person measures and linear item measures on a single logit scale. We found that the multiple-choice test displayed poor person and item fit (mean square outfit >1.3), while both oral interview measures and computer-generated written response measures exhibited acceptable fit (average mean square outfit for interview: person 0.97, item 0.97; computer: person 1.03, item 1.06). Multiple-choice test measures were more weakly associated with interview measures (r = 0.35) than the computer-scored explanation measures (r = 0.63). Overall, Rasch analysis indicated that computer-scored written explanation measures (1) have the strongest correspondence to oral interview measures; (2) are capable of capturing students' normative scientific and naive ideas as accurately as human-scored explanations, and (3) more validly detect understanding than the multiple-choice assessment. These findings demonstrate the great potential of machine-learning tools for assessing key scientific practices highlighted in the new Framework for Science Education.

  11. 76 FR 41234 - Advanced Scientific Computing Advisory Committee Charter Renewal

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-13

    ... Secretariat, General Services Administration, notice is hereby given that the Advanced Scientific Computing... advice and recommendations concerning the Advanced Scientific Computing program in response only to... Advanced Scientific Computing Research program and recommendations based thereon; --Advice on the computing...

  12. Network effects on scientific collaborations.

    PubMed

    Uddin, Shahadat; Hossain, Liaquat; Rasmussen, Kim

    2013-01-01

    The analysis of co-authorship networks aims at exploring the impact of network structure on the outcome of scientific collaborations and research publications. However, little is known about what network properties are associated with authors who have an increased number of joint publications and are highly cited. Measures of social network analysis, for example network centrality and tie strength, have been utilized extensively in current co-authorship literature to explore different behavioural patterns of co-authorship networks. Using three SNA measures (i.e., degree centrality, closeness centrality and betweenness centrality), we explore scientific collaboration networks to understand factors influencing performance (i.e., citation count) and formation (tie strength between authors) of such networks. A citation count is the number of times an article is cited by other articles. We use a co-authorship dataset from the research field of 'steel structure' for the years 2005 to 2009. To measure the strength of scientific collaboration between two authors, we consider the number of articles co-authored by them. In this study, we examine how citation count of a scientific publication is influenced by different centrality measures of its co-author(s) in a co-authorship network. We further analyze the impact of the network positions of authors on the strength of their scientific collaborations. We use both correlation and regression methods for data analysis leading to statistical validation. We identify that citation count of a research article is positively correlated with the degree centrality and betweenness centrality values of its co-author(s). Also, we reveal that degree centrality and betweenness centrality values of authors in a co-authorship network are positively correlated with the strength of their scientific collaborations. Authors' network positions in co-authorship networks influence the performance (i.e., citation count) and formation (i.e., tie strength) of scientific collaborations.
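
    The kind of analysis described, computing centrality measures on a co-authorship graph and correlating them with citation counts, can be sketched with networkx; the graph, edge weights, and citation counts below are made up for illustration.

```python
# Sketch of the analysis described above, on a toy co-authorship graph:
# edge weights = number of co-authored articles, node attribute = citation
# count. Author names and counts are hypothetical.
import networkx as nx
import numpy as np

G = nx.Graph()
G.add_weighted_edges_from([("A", "B", 3), ("A", "C", 1),
                           ("B", "C", 2), ("C", "D", 4)])
citations = {"A": 120, "B": 45, "C": 210, "D": 30}

deg = nx.degree_centrality(G)
bet = nx.betweenness_centrality(G)
clo = nx.closeness_centrality(G)

authors = sorted(G.nodes)
cites = [citations[a] for a in authors]
for name, measure in [("degree", deg), ("betweenness", bet), ("closeness", clo)]:
    r = np.corrcoef([measure[a] for a in authors], cites)[0, 1]
    print(f"correlation(citations, {name} centrality) = {r:+.2f}")
```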

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nikolic, R J

    This month's issue has the following articles: (1) Dawn of a New Era of Scientific Discovery - Commentary by Edward I. Moses; (2) At the Frontiers of Fundamental Science Research - Collaborators from national laboratories, universities, and international organizations are using the National Ignition Facility to probe key fundamental science questions; (3) Livermore Responds to Crisis in Post-Earthquake Japan - More than 70 Laboratory scientists provided round-the-clock expertise in radionuclide analysis and atmospheric dispersion modeling as part of the nation's support to Japan following the March 2011 earthquake and nuclear accident; (4) A Comprehensive Resource for Modeling, Simulation, and Experiments - A new Web-based resource called MIDAS is a central repository for material properties, experimental data, and computer models; and (5) Finding Data Needles in Gigabit Haystacks - Livermore computer scientists have developed a novel computer architecture based on 'persistent' memory to ease data-intensive computations.

  14. A review of bioinformatics training applied to research in molecular medicine, agriculture and biodiversity in Costa Rica and Central America.

    PubMed

    Orozco, Allan; Morera, Jessica; Jiménez, Sergio; Boza, Ricardo

    2013-09-01

    Today, Bioinformatics has become a scientific discipline with great relevance for the Molecular Biosciences and for the Omics sciences in general. Although developed countries have progressed with large strides in Bioinformatics education and research, in other regions, such as Central America, the advances have occurred in a gradual way and with little support from the Academia, either at the undergraduate or graduate level. To address this problem, the University of Costa Rica's Medical School, a regional leader in Bioinformatics in Central America, has been conducting a series of Bioinformatics workshops, seminars and courses, leading to the creation of the region's first Bioinformatics Master's Degree. The recent creation of the Central American Bioinformatics Network (BioCANET), associated to the deployment of a supporting computational infrastructure (HPC Cluster) devoted to provide computing support for Molecular Biology in the region, is providing a foundational stone for the development of Bioinformatics in the area. Central American bioinformaticians have participated in the creation of as well as co-founded the Iberoamerican Bioinformatics Society (SOIBIO). In this article, we review the most recent activities in education and research in Bioinformatics from several regional institutions. These activities have resulted in further advances for Molecular Medicine, Agriculture and Biodiversity research in Costa Rica and the rest of the Central American countries. Finally, we provide summary information on the first Central America Bioinformatics International Congress, as well as the creation of the first Bioinformatics company (Indromics Bioinformatics), spin-off the Academy in Central America and the Caribbean.

  15. 76 FR 31945 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-02

    ... DEPARTMENT OF ENERGY Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy... teleconference meeting of the Advanced Scientific Computing Advisory Committee (ASCAC). The Federal [email protected] . FOR FURTHER INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing...

  16. 75 FR 9887 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-04

    ... DEPARTMENT OF ENERGY Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy... Advanced Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building...

  17. 76 FR 9765 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-22

    ... DEPARTMENT OF ENERGY Advanced Scientific Computing Advisory Committee AGENCY: Office of Science... Advanced Scientific Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub. L. 92... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research, SC-21/Germantown Building...

  18. 77 FR 45345 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-31

    ... Recompetition results for Scientific Discovery through Advanced Computing (SciDAC) applications Co-design Public... DEPARTMENT OF ENERGY DOE/Advanced Scientific Computing Advisory Committee AGENCY: Office of... the Advanced Scientific Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub...

  19. 75 FR 64720 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-20

    ... DEPARTMENT OF ENERGY DOE/Advanced Scientific Computing Advisory Committee AGENCY: Department of... the Advanced Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L.... FOR FURTHER INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21...

  20. Computing through Scientific Abstractions in SysBioPS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, George; Stephan, Eric G.; Gracio, Deborah K.

    2004-10-13

    Today, biologists and bioinformaticists have a tremendous amount of computational power at their disposal. With the availability of supercomputers, burgeoning scientific databases and digital libraries such as GenBank and PubMed, and pervasive computational environments such as the Grid, biologists have access to a wealth of computational capabilities and scientific data at hand. Yet, the rapid development of computational technologies has far exceeded the typical biologist’s ability to effectively apply the technology in their research. Computational sciences research and development efforts such as the Biology Workbench, BioSPICE (Biological Simulation Program for Intra-Cellular Evaluation), and BioCoRE (Biological Collaborative Research Environment) are important in connecting biologists and their scientific problems to computational infrastructures. On the Computational Cell Environment and Heuristic Entity-Relationship Building Environment projects at the Pacific Northwest National Laboratory, we are jointly developing a new breed of scientific problem solving environment called SysBioPSE that will allow biologists to access and apply computational resources in the scientific research context. In contrast to other computational science environments, SysBioPSE operates as an abstraction layer above a computational infrastructure. The goal of SysBioPSE is to allow biologists to apply computational resources in the context of the scientific problems they are addressing and the scientific perspectives from which they conduct their research. More specifically, SysBioPSE allows biologists to capture and represent scientific concepts and theories and experimental processes, and to link these views to scientific applications, data repositories, and computer systems.

  1. 75 FR 43518 - Advanced Scientific Computing Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-26

    ... DEPARTMENT OF ENERGY Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of... Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770...: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S...

  2. Network Effects on Scientific Collaborations

    PubMed Central

    Uddin, Shahadat; Hossain, Liaquat; Rasmussen, Kim

    2013-01-01

    Background The analysis of co-authorship networks aims at exploring the impact of network structure on the outcome of scientific collaborations and research publications. However, little is known about what network properties are associated with authors who have an increased number of joint publications and are highly cited. Methodology/Principal Findings Measures of social network analysis, for example network centrality and tie strength, have been utilized extensively in current co-authorship literature to explore different behavioural patterns of co-authorship networks. Using three SNA measures (i.e., degree centrality, closeness centrality and betweenness centrality), we explore scientific collaboration networks to understand factors influencing performance (i.e., citation count) and formation (tie strength between authors) of such networks. A citation count is the number of times an article is cited by other articles. We use a co-authorship dataset from the research field of ‘steel structure’ for the years 2005 to 2009. To measure the strength of scientific collaboration between two authors, we consider the number of articles co-authored by them. In this study, we examine how citation count of a scientific publication is influenced by different centrality measures of its co-author(s) in a co-authorship network. We further analyze the impact of the network positions of authors on the strength of their scientific collaborations. We use both correlation and regression methods for data analysis leading to statistical validation. We identify that citation count of a research article is positively correlated with the degree centrality and betweenness centrality values of its co-author(s). Also, we reveal that degree centrality and betweenness centrality values of authors in a co-authorship network are positively correlated with the strength of their scientific collaborations. Conclusions/Significance Authors’ network positions in co-authorship networks influence the performance (i.e., citation count) and formation (i.e., tie strength) of scientific collaborations. PMID:23469021

  3. Template Interfaces for Agile Parallel Data-Intensive Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramakrishnan, Lavanya; Gunter, Daniel; Pastorello, Gilerto Z.

    Tigres provides a programming library to compose and execute large-scale data-intensive scientific workflows from desktops to supercomputers. DOE User Facilities and large science collaborations are increasingly generating large enough data sets that it is no longer practical to download them to a desktop to operate on them. They are instead stored at centralized compute and storage resources such as high performance computing (HPC) centers. Analysis of this data requires an ability to run on these facilities, but with current technologies, scaling an analysis to an HPC center and to a large data set is difficult even for experts. Tigres is addressing the challenge of enabling collaborative analysis of DOE Science data through a new concept of reusable "templates" that enable scientists to easily compose, run and manage collaborative computational tasks. These templates define common computation patterns used in analyzing a data set.
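
    The template idea can be illustrated generically; the sketch below is not the Tigres API, just a minimal parallel and sequence pattern of the kind the abstract describes.

```python
# Generic illustration of the "template" idea (not the actual Tigres API):
# reusable composition patterns that a scientist fills in with task functions,
# so the same composition can run on a laptop or, with a different executor,
# on an HPC resource.
from concurrent.futures import ThreadPoolExecutor

def parallel_template(task, inputs, executor_cls=ThreadPoolExecutor, workers=4):
    """Apply `task` to every element of `inputs` concurrently."""
    with executor_cls(max_workers=workers) as pool:
        return list(pool.map(task, inputs))

def sequence_template(tasks, value):
    """Chain tasks so each consumes the previous result."""
    for task in tasks:
        value = task(value)
    return value

# Example: square eight inputs in parallel, then sum and format sequentially.
squares = parallel_template(lambda x: x * x, range(8))
print(sequence_template([sum, str], squares))
```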

  4. HEP - A semaphore-synchronized multiprocessor with central control. [Heterogeneous Element Processor

    NASA Technical Reports Server (NTRS)

    Gilliland, M. C.; Smith, B. J.; Calvert, W.

    1976-01-01

    The paper describes the design concept of the Heterogeneous Element Processor (HEP), a system tailored to the special needs of scientific simulation. In order to achieve high-speed computation required by simulation, HEP features a hierarchy of processes executing in parallel on a number of processors, with synchronization being largely accomplished by hardware. A full-empty-reserve scheme of synchronization is realized by zero-one-valued hardware semaphores. A typical system has, besides the control computer and the scheduler, an algebraic module, a memory module, a first-in first-out (FIFO) module, an integrator module, and an I/O module. The architecture of the scheduler and the algebraic module is examined in detail.
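
    The full/empty synchronization idea can be sketched in software; HEP realized it with zero-one-valued hardware semaphores, so the snippet below is only a conceptual analogue.

```python
# Conceptual sketch of full/empty synchronization: a reader blocks until the
# cell is "full", consumes the value, and marks the cell "empty" for the next
# writer. HEP implemented this with zero/one-valued hardware semaphores; this
# software analogue only illustrates the protocol.
import threading

class FullEmptyCell:
    def __init__(self):
        self._value = None
        self._full = False
        self._cond = threading.Condition()

    def write(self, value):            # blocks while the cell is full
        with self._cond:
            self._cond.wait_for(lambda: not self._full)
            self._value, self._full = value, True
            self._cond.notify_all()

    def read(self):                    # blocks until the cell is full
        with self._cond:
            self._cond.wait_for(lambda: self._full)
            value, self._full = self._value, False
            self._cond.notify_all()
            return value
```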

  5. Intelligent Network Management and Functional Cerebellum Synthesis

    NASA Technical Reports Server (NTRS)

    Loebner, Egon E.

    1989-01-01

    Transdisciplinary modeling of the cerebellum across histology, physiology, and network engineering provides preliminary results at three organization levels: input/output links to central nervous system networks; links between the six neuron populations in the cerebellum; and computation among the neurons of the populations. Older models probably underestimated the importance and role of climbing fiber input which seems to supply write as well as read signals, not just to Purkinje but also to basket and stellate neurons. The well-known mossy fiber-granule cell-Golgi cell system should also respond to inputs originating from climbing fibers. Corticonuclear microcomplexing might be aided by stellate and basket computation and associate processing. Technological and scientific implications of the proposed cerebellum model are discussed.

  6. Numerically stable, scalable formulas for parallel and online computation of higher-order multivariate central moments with arbitrary weights

    DOE PAGES

    Pebay, Philippe; Terriberry, Timothy B.; Kolla, Hemanth; ...

    2016-03-29

    Formulas for incremental or parallel computation of second order central moments have long been known, and recent extensions of these formulas to univariate and multivariate moments of arbitrary order have been developed. Such formulas are of key importance in scenarios where incremental results are required and in parallel and distributed systems where communication costs are high. We survey these recent results, and improve them with arbitrary-order, numerically stable one-pass formulas which we further extend with weighted and compound variants. We also develop a generalized correction factor for standard two-pass algorithms that enables the maintenance of accuracy over nearly the full representable range of the input, avoiding the need for extended-precision arithmetic. We then empirically examine algorithm correctness for pairwise update formulas up to order four as well as condition number and relative error bounds for eight different central moment formulas, each up to degree six, to address the trade-offs between numerical accuracy and speed of the various algorithms. Finally, we demonstrate the use of the most elaborate among the above mentioned formulas, with the utilization of the compound moments for a practical large-scale scientific application.
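
    The second-order special case of the formulas discussed above is the familiar one-pass (Welford-style) update together with a pairwise merge; a sketch of that base case is shown below, with the higher-order and weighted variants of the paper omitted.

```python
# Second-order special case of the incremental/pairwise formulas discussed
# above: a numerically stable one-pass update (Welford) and a merge step that
# combines partial results from two processors without revisiting the data.
class RunningMoments:
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0  # m2 = sum of squared deviations

    def push(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def merge(self, other):
        """Combine two partial results (the parallel/distributed case)."""
        n = self.n + other.n
        if n == 0:
            return self
        delta = other.mean - self.mean
        self.mean += delta * other.n / n
        self.m2 += other.m2 + delta * delta * self.n * other.n / n
        self.n = n
        return self

    @property
    def variance(self):
        return self.m2 / self.n if self.n else float("nan")
```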

  7. Architectural Strategies for Enabling Data-Driven Science at Scale

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Law, E. S.; Doyle, R. J.; Little, M. M.

    2017-12-01

    The analysis of large data collections from NASA or other agencies is often executed through traditional computational and data analysis approaches, which require users to bring data to their desktops and perform local data analysis. Alternatively, data are hauled to large computational environments that provide centralized data analysis via traditional High Performance Computing (HPC). Scientific data archives, however, are not only growing massive, but are also becoming highly distributed. Neither traditional approach provides a good solution for optimizing analysis into the future. Assumptions across the NASA mission and science data lifecycle, which historically assume that all data can be collected, transmitted, processed, and archived, will not scale as more capable instruments stress legacy-based systems. New paradigms are needed to increase the productivity and effectiveness of scientific data analysis. This paradigm must recognize that architectural and analytical choices are interrelated, and must be carefully coordinated in any system that aims to allow efficient, interactive scientific exploration and discovery to exploit massive data collections, from point of collection (e.g., onboard) to analysis and decision support. The most effective approach to analyzing a distributed set of massive data may involve some exploration and iteration, putting a premium on the flexibility afforded by the architectural framework. The framework should enable scientist users to assemble workflows efficiently, manage the uncertainties related to data analysis and inference, and optimize deep-dive analytics to enhance scalability. In many cases, this "data ecosystem" needs to be able to integrate multiple observing assets, ground environments, archives, and analytics, evolving from stewardship of measurements of data to using computational methodologies to better derive insight from the data that may be fused with other sets of data. This presentation will discuss architectural strategies, including a 2015-2016 NASA AIST Study on Big Data, for evolving scientific research towards massively distributed data-driven discovery. It will include example use cases across earth science, planetary science, and other disciplines.

  8. Using SIR (Scientific Information Retrieval System) for data management during a field program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tichler, J.L.

    As part of the US Department of Energy's program, PRocessing of Emissions by Clouds and Precipitation (PRECP), a team of scientists from four laboratories conducted a study in north central New York State, to characterize the chemical and physical processes occurring in winter storms. Sampling took place from three aircraft, two instrumented motor homes and a network of 26 surface precipitation sampling sites. Data management personnel were part of the field program, using a portable IBM PC-AT computer to enter information as it became available during the field study. Having the same database software on the field computer and on the cluster of VAX 11/785 computers in use aided database development and the transfer of data between machines. 2 refs., 3 figs., 5 tabs.

  9. ScienceCentral: open access full-text archive of scientific journals based on Journal Article Tag Suite regardless of their languages.

    PubMed

    Huh, Sun

    2013-01-01

    ScienceCentral, a free or open access, full-text archive of scientific journal literature at the Korean Federation of Science and Technology Societies, was under test in September 2013. Since it is a Journal Article Tag Suite-based full text database, extensible markup language files of all languages can be presented, according to Unicode Transformation Format 8-bit encoding. It is comparable to PubMed Central; however, there are two distinct differences. First, its scope comprises all science fields; second, it accepts all language journals. Launching ScienceCentral is the first step for free access or open access academic scientific journals of all languages to leap to the world, including scientific journals from Croatia.

  10. Crowd-Sourcing Seismic Data: Lessons Learned from the Quake-Catcher Network

    NASA Astrophysics Data System (ADS)

    Cochran, E. S.; Sumy, D. F.; DeGroot, R. M.; Clayton, R. W.

    2015-12-01

    The Quake Catcher Network (QCN; qcn.caltech.edu) uses low cost micro-electro-mechanical system (MEMS) sensors hosted by volunteers to collect seismic data. Volunteers use accelerometers internal to laptop computers, phones, tablets or small (the size of a matchbox) MEMS sensors plugged into desktop computers using a USB connector to collect scientifically useful data. Data are collected and sent to a central server using the Berkeley Open Infrastructure for Network Computing (BOINC) distributed computing software. Since 2008, when the first citizen scientists joined the QCN project, sensors installed in museums, schools, offices, and residences have collected thousands of earthquake records. We present and describe the rapid installations of very dense sensor networks that have been undertaken following several large earthquakes including the 2010 M8.8 Maule Chile, the 2010 M7.1 Darfield, New Zealand, and the 2015 M7.8 Gorkha, Nepal earthquake. These large data sets allowed seismologists to develop new rapid earthquake detection capabilities and closely examine source, path, and site properties that impact ground shaking at a site. We show how QCN has engaged a wide sector of the public in scientific data collection, providing the public with insights into how seismic data are collected and used. Furthermore, we describe how students use data recorded by QCN sensors installed in their classrooms to explore and investigate earthquakes that they felt, as part of 'teachable moment' exercises.

  11. Reflections on Peter Slezak and the 'Sociology of Scientific Knowledge`

    NASA Astrophysics Data System (ADS)

    Suchting, W. A.

    The paper examines central parts of the first of two papers in this journal by Peter Slezak criticising sociology of scientific knowledge and also considers, independently, some of the main philosophical issues raised by the sociologists of science, in particular David Bloor. The general conclusion is that each account alludes to different and crucial aspects of the nature of knowledge without, severally or jointly, being able to theorise them adequately. The appendix contains epistemological theses central to a more adequate theory of scientific knowledge.... our Histories of six Thousand Moons make no Mention of any other, than the two great Empires of Lilliput and Blefuscu. Which mighty Powers have ... been engaged in a most obstinate War for six and thirty Moons past. It began upon the following Occasion. It is allowed on all Hands, that the primitive Way of breaking Eggs before we eat them, was upon the larger End: But ... the Emperor [of Lilliput] ... published an Edict, commanding all his Subjects, upon great Penalties, to break the smaller End of their Eggs. The People so resented this Law, that ... there have been six Rebellions raised on that Account ... These civil Commotions were constantly fomented by the Monarchs of Blefuscu ... It is computed, that eleven Thousand have, at several Times, suffered Death, rather than break Eggs at the smaller End. Many hundred large Volumes have published upon this Controversy ...

  12. JACOB: an enterprise framework for computational chemistry.

    PubMed

    Waller, Mark P; Dresselhaus, Thomas; Yang, Jack

    2013-06-15

    Here, we present just a collection of beans (JACOB): an integrated batch-based framework designed for the rapid development of computational chemistry applications. The framework expedites developer productivity by handling the generic infrastructure tier, and can be easily extended by user-specific scientific code. Paradigms from enterprise software engineering were rigorously applied to create a scalable, testable, secure, and robust framework. A centralized web application is used to configure and control the operation of the framework. The application-programming interface provides a set of generic tools for processing large-scale noninteractive jobs (e.g., systematic studies), or for coordinating systems integration (e.g., complex workflows). The code for the JACOB framework is open sourced and is available at: www.wallerlab.org/jacob. Copyright © 2013 Wiley Periodicals, Inc.

  13. ScienceCentral: open access full-text archive of scientific journals based on Journal Article Tag Suite regardless of their languages

    PubMed Central

    Huh, Sun

    2013-01-01

    ScienceCentral, a free or open access, full-text archive of scientific journal literature at the Korean Federation of Science and Technology Societies, was under test in September 2013. Since it is a Journal Article Tag Suite-based full text database, extensible markup language files of all languages can be presented, according to Unicode Transformation Format 8-bit encoding. It is comparable to PubMed Central; however, there are two distinct differences. First, its scope comprises all science fields; second, it accepts all language journals. Launching ScienceCentral is the first step for free access or open access academic scientific journals of all languages to leap to the world, including scientific journals from Croatia. PMID:24266292

  14. Individual versus group decision making: Jurors’ reliance on central and peripheral information to evaluate expert testimony

    PubMed Central

    Bottoms, Bette L.; Peter-Hagene, Liana C.

    2017-01-01

    To investigate dual-process persuasion theories in the context of group decision making, we studied low and high need-for-cognition (NFC) participants within a mock trial study. Participants considered plaintiff and defense expert scientific testimony that varied in argument strength. All participants heard a cross-examination of the experts focusing on peripheral information (e.g., credentials) about the expert, but half were randomly assigned to also hear central information highlighting flaws in the expert’s message (e.g., quality of the research presented by the expert). Participants rendered pre- and post-group-deliberation verdicts, which were considered “scientifically accurate” if the verdicts reflected the strong (versus weak) expert message, and “scientifically inaccurate” if they reflected the weak (versus strong) expert message. For individual participants, we replicated studies testing classic persuasion theories: Factors promoting reliance on central information (i.e., central cross-examination, high NFC) improved verdict accuracy because they sensitized individual participants to the quality discrepancy between the experts’ messages. Interestingly, however, at the group level, the more that scientifically accurate mock jurors discussed peripheral (versus central) information about the experts, the more likely their group was to reach the scientifically accurate verdict. When participants were arguing for the scientifically accurate verdict consistent with the strong expert message, peripheral comments increased their persuasiveness, which made the group more likely to reach the more scientifically accurate verdict. PMID:28931011

  15. Individual versus group decision making: Jurors' reliance on central and peripheral information to evaluate expert testimony.

    PubMed

    Salerno, Jessica M; Bottoms, Bette L; Peter-Hagene, Liana C

    2017-01-01

    To investigate dual-process persuasion theories in the context of group decision making, we studied low and high need-for-cognition (NFC) participants within a mock trial study. Participants considered plaintiff and defense expert scientific testimony that varied in argument strength. All participants heard a cross-examination of the experts focusing on peripheral information (e.g., credentials) about the expert, but half were randomly assigned to also hear central information highlighting flaws in the expert's message (e.g., quality of the research presented by the expert). Participants rendered pre- and post-group-deliberation verdicts, which were considered "scientifically accurate" if the verdicts reflected the strong (versus weak) expert message, and "scientifically inaccurate" if they reflected the weak (versus strong) expert message. For individual participants, we replicated studies testing classic persuasion theories: Factors promoting reliance on central information (i.e., central cross-examination, high NFC) improved verdict accuracy because they sensitized individual participants to the quality discrepancy between the experts' messages. Interestingly, however, at the group level, the more that scientifically accurate mock jurors discussed peripheral (versus central) information about the experts, the more likely their group was to reach the scientifically accurate verdict. When participants were arguing for the scientifically accurate verdict consistent with the strong expert message, peripheral comments increased their persuasiveness, which made the group more likely to reach the more scientifically accurate verdict.

  16. Development of a Centralized Automated Scientific and Technical Information Service in the People's Republic of Bulgaria.

    ERIC Educational Resources Information Center

    Kiratsov, P.

    1983-01-01

    Discusses the design and organization of the Automated Information Centre, a centralized automated scientific and technical information service established within the main organ of Bulgaria's National System for Scientific and Technical Information, with UNESCO and United Nations Development Program assistance. Problems and perspectives for…

  17. Simulating and mapping spatial complexity using multi-scale techniques

    USGS Publications Warehouse

    De Cola, L.

    1994-01-01

    A central problem in spatial analysis is the mapping of data for complex spatial fields using relatively simple data structures, such as those of a conventional GIS. This complexity can be measured using such indices as multi-scale variance, which reflects spatial autocorrelation, and multi-fractal dimension, which characterizes the values of fields. These indices are computed for three spatial processes: Gaussian noise, a simple mathematical function, and data for a random walk. Fractal analysis is then used to produce a vegetation map of the central region of California based on a satellite image. This analysis suggests that real world data lie on a continuum between the simple and the random, and that a major GIS challenge is the scientific representation and understanding of rapidly changing multi-scale fields. -Author
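
    A simple relative of the indices mentioned above is the box-counting estimate of fractal dimension for a binary raster; the sketch below uses a random field purely for illustration and is not the multi-fractal computation of the paper.

```python
# Sketch of a box-counting estimate of fractal dimension for a binary raster
# (a simpler relative of the multi-fractal indices discussed above). The
# random field is purely illustrative.
import numpy as np

def box_count_dimension(grid, sizes=(1, 2, 4, 8, 16, 32)):
    counts = []
    for s in sizes:
        n = grid.shape[0] // s
        trimmed = grid[:n * s, :n * s]
        blocks = trimmed.reshape(n, s, n, s).any(axis=(1, 3))
        counts.append(blocks.sum())
    # slope of log(count) vs log(1/size) estimates the dimension
    coeffs = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return coeffs[0]

rng = np.random.default_rng(0)
field = rng.random((64, 64)) > 0.5
print("estimated dimension:", round(box_count_dimension(field), 2))
```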

  18. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Mid-year report FY17 Q2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Pugmire, David; Rogers, David

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  19. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY17.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Pugmire, David; Rogers, David

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  20. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem. Mid-year report FY16 Q2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  1. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY15 Q4.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  2. Grid site availability evaluation and monitoring at CMS

    DOE PAGES

    Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe; ...

    2017-10-01

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute with resources from hundred to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC the site evaluation and monitoring system is being overhauled to enable faster detection/reaction to failures and a more dynamic handling of computing resources. Furthermore, enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.

  3. Grid site availability evaluation and monitoring at CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute with resources from hundred to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC the site evaluation and monitoring system is being overhauled to enable faster detection/reaction to failures and a more dynamic handling of computing resources. Furthermore, enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.

  4. Grid site availability evaluation and monitoring at CMS

    NASA Astrophysics Data System (ADS)

    Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe; Lammel, Stephan; Sciabà, Andrea

    2017-10-01

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute with resources from hundred to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC the site evaluation and monitoring system is being overhauled to enable faster detection/reaction to failures and a more dynamic handling of computing resources. Enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.

  5. A computer-assisted data collection system for use in a multicenter study of American Indians and Alaska Natives: SCAPES.

    PubMed

    Edwards, Roger L; Edwards, Sandra L; Bryner, James; Cunningham, Kelly; Rogers, Amy; Slattery, Martha L

    2008-04-01

    We describe a computer-assisted data collection system developed for a multicenter cohort study of American Indian and Alaska Native people. The study computer-assisted participant evaluation system or SCAPES is built around a central database server that controls a small private network with touch screen workstations. SCAPES encompasses the self-administered questionnaires, the keyboard-based stations for interviewer-administered questionnaires, a system for inputting medical measurements, and administrative tasks such as data exporting, backup and management. Elements of SCAPES hardware/network design, data storage, programming language, software choices, questionnaire programming including the programming of questionnaires administered using audio computer-assisted self-interviewing (ACASI), and participant identification/data security system are presented. Unique features of SCAPES are that data are promptly made available to participants in the form of health feedback; data can be quickly summarized for tribes for health monitoring and planning at the community level; and data are available to study investigators for analyses and scientific evaluation.

  6. The changing features of the body-mind problem.

    PubMed

    Agassi, Joseph

    2007-01-01

    The body-mind problem invites scientific study, since mental events are repeated and repeatable and invite testable explanations. They seemed troublesome because of the classical theory of substance that failed to solve its own central problems. These are soluble with the aid of the theory of the laws of nature, particularly in its emergentist version [Bunge, M., 1980. The Body-mind Problem, Pergamon, Oxford] that invites refutable explanations [Popper, K.R., 1959. The Logic of Scientific Discovery, Hutchinson, London]. The view of mental properties as emergent is a modification of the two chief classical views, materialism and dualism. As this view invites testable explanations of events of the inner world, it is better than the quasi-behaviorist view of self-awareness as computer-style self-monitoring [Minsky, M., Laske, O., 1992. A conversation with Marvin Minsky. AI Magazine 13 (3), 31-45].

  7. Creating Web-Based Scientific Applications Using Java Servlets

    NASA Technical Reports Server (NTRS)

    Palmer, Grant; Arnold, James O. (Technical Monitor)

    2001-01-01

    There are many advantages to developing web-based scientific applications. Any number of people can access the application concurrently. The application can be accessed from a remote location. The application becomes essentially platform-independent because it can be run from any machine that has internet access and can run a web browser. Maintenance and upgrades to the application are simplified since only one copy of the application exists in a centralized location. This paper details the creation of web-based applications using Java servlets. Java is a powerful, versatile programming language that is well suited to developing web-based programs. A Java servlet provides the interface between the central server and the remote client machines. The servlet accepts input data from the client, runs the application on the server, and sends the output back to the client machine. The type of servlet that supports the HTTP protocol will be discussed in depth. Among the topics the paper will discuss are how to write an http servlet, how the servlet can run applications written in Java and other languages, and how to set up a Java web server. The entire process will be demonstrated by building a web-based application to compute stagnation point heat transfer.
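
    The request-compute-respond pattern the abstract describes can be sketched as follows; for consistency with the other sketches in this section the example uses Python's standard-library HTTP server rather than a Java servlet, and the computation is a placeholder rather than the stagnation-point heat transfer calculation of the paper.

```python
# Sketch of the servlet pattern described above: accept input over HTTP, run
# the computation on the server, and return the result to the client. Written
# with Python's standard-library HTTP server instead of a Java servlet; the
# compute() function is a placeholder for the real server-side calculation.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, urlparse

def compute(velocity, density):
    """Placeholder computation (dynamic pressure), purely illustrative."""
    return 0.5 * density * velocity ** 2

class ComputeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        params = parse_qs(urlparse(self.path).query)
        result = compute(float(params["v"][0]), float(params["rho"][0]))
        body = f"result={result}\n".encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), ComputeHandler).serve_forever()
```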

  8. The Ontology of Clinical Research (OCRe): An Informatics Foundation for the Science of Clinical Research

    PubMed Central

    Sim, Ida; Tu, Samson W.; Carini, Simona; Lehmann, Harold P.; Pollock, Brad H.; Peleg, Mor; Wittkowski, Knut M.

    2013-01-01

    To date, the scientific process for generating, interpreting, and applying knowledge has received less informatics attention than operational processes for conducting clinical studies. The activities of these scientific processes — the science of clinical research — are centered on the study protocol, which is the abstract representation of the scientific design of a clinical study. The Ontology of Clinical Research (OCRe) is an OWL 2 model of the entities and relationships of study design protocols for the purpose of computationally supporting the design and analysis of human studies. OCRe’s modeling is independent of any specific study design or clinical domain. It includes a study design typology and a specialized module called ERGO Annotation for capturing the meaning of eligibility criteria. In this paper, we describe the key informatics use cases of each phase of a study’s scientific lifecycle, present OCRe and the principles behind its modeling, and describe applications of OCRe and associated technologies to a range of clinical research use cases. OCRe captures the central semantics that underlies the scientific processes of clinical research and can serve as an informatics foundation for supporting the entire range of knowledge activities that constitute the science of clinical research. PMID:24239612

  9. [Trends of the scientific work development in central military-and-clinical hospitals].

    PubMed

    Tregubov, V N; Baranov, V V

    2006-04-01

    Scientific work in central military-and-clinical hospitals (CMCH) is very important since it leads to creation and application of modern medical technologies in practice of military-and-medical service, professional growth of doctors and improves the status of hospitals among other medical organizations. The analysis of CMCH under the Russian Ministry of Defense shows that the main role in the development of scientific work in central hospitals belongs to management which is the activity to perform planning, organization, coordination, motivation and control functions.

  10. Scientific Services on the Cloud

    NASA Astrophysics Data System (ADS)

    Chapman, David; Joshi, Karuna P.; Yesha, Yelena; Halem, Milt; Yesha, Yaacov; Nguyen, Phuong

    Scientific Computing was one of the first-ever applications of parallel and distributed computation. To this day, scientific applications remain some of the most compute-intensive, and have inspired creation of petaflop compute infrastructure such as the Oak Ridge Jaguar and Los Alamos RoadRunner. Large dedicated hardware infrastructure has become both a blessing and a curse to the scientific community. Scientists are interested in cloud computing for much the same reason as businesses and other professionals. The hardware is provided, maintained, and administered by a third party. Software abstraction and virtualization provide reliability and fault tolerance. Graduated fees allow for multi-scale prototyping and execution. Cloud computing resources are only a few clicks away, and by far the easiest high performance distributed platform to gain access to. There may still be dedicated infrastructure for ultra-scale science, but the cloud can easily play a major part of the scientific computing initiative.

  11. Center for Technology for Advanced Scientific Component Software (TASCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damevski, Kostadin

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  12. Computational Science: A Research Methodology for the 21st Century

    NASA Astrophysics Data System (ADS)

    Orbach, Raymond L.

    2004-03-01

    Computational simulation - a means of scientific discovery that employs computer systems to simulate a physical system according to laws derived from theory and experiment - has attained peer status with theory and experiment. Important advances in basic science are accomplished by a new "sociology" for ultrascale scientific computing capability (USSCC), a fusion of sustained advances in scientific models, mathematical algorithms, computer architecture, and scientific software engineering. Expansion of current capabilities by factors of 100 - 1000 opens up new vistas for scientific discovery: long term climatic variability and change, macroscopic material design from correlated behavior at the nanoscale, design and optimization of magnetic confinement fusion reactors, strong interactions on a computational lattice through quantum chromodynamics, and stellar explosions and element production. The "virtual prototype," made possible by this expansion, can markedly reduce time-to-market for industrial applications such as jet engines and safer, more fuel efficient, cleaner cars. In order to develop USSCC, the National Energy Research Scientific Computing Center (NERSC) announced the competition "Innovative and Novel Computational Impact on Theory and Experiment" (INCITE), with no requirement for current DOE sponsorship. Fifty-nine proposals for grand challenge scientific problems were submitted for a small number of awards. The successful grants, and their preliminary progress, will be described.

  13. Analysis of citation networks as a new tool for scientific research

    DOE PAGES

    Vasudevan, R. K.; Ziatdinov, M.; Chen, C.; ...

    2016-12-06

    The rapid growth of scientific publications necessitates new methods to understand the direction of scientific research within fields of study, ascertain the importance of particular groups, authors, or institutions, compute metrics that can determine the importance (centrality) of particular seminal papers, and provide insight into the social (collaboration) networks that are present. We present one such method based on analysis of citation networks, using the freely available CiteSpace Program. We use citation network analysis on three examples, including a single material that has been widely explored in the last decade (BiFeO3), two small subfields with a minimal number of authors (flexoelectricity and Kitaev physics), and a much wider field with thousands of publications pertaining to a single technique (scanning tunneling microscopy). Interpretation of the analysis and key insights into the fields, such as whether the fields are experiencing resurgence or stagnation, are discussed, and author or collaboration networks that are prominent are determined. Such methods represent a paradigm shift in our way of dealing with the large volume of scientific publications and could change the way literature searches and reviews are conducted, as well as how the impact of specific work is assessed.
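
    On a citation graph (edges from citing to cited papers), the centrality of seminal papers can be estimated with standard graph metrics; the toy example below uses PageRank and betweenness centrality, which may differ from CiteSpace's own measures.

```python
# Toy citation network (edges point from a citing paper to a cited paper).
# Ranking nodes by PageRank and betweenness centrality is one standard way to
# surface "central" seminal papers; CiteSpace's own metrics may differ.
import networkx as nx

G = nx.DiGraph([("P3", "P1"), ("P4", "P1"), ("P4", "P2"),
                ("P5", "P3"), ("P5", "P4"), ("P6", "P4")])

pagerank = nx.pagerank(G)
betweenness = nx.betweenness_centrality(G)
for paper in sorted(G, key=pagerank.get, reverse=True):
    print(f"{paper}: pagerank={pagerank[paper]:.3f} "
          f"betweenness={betweenness[paper]:.3f}")
```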

  14. Analysis of citation networks as a new tool for scientific research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasudevan, R. K.; Ziatdinov, M.; Chen, C.

    The rapid growth of scientific publications necessitates new methods to understand the direction of scientific research within fields of study, ascertain the importance of particular groups, authors, or institutions, compute metrics that can determine the importance (centrality) of particular seminal papers, and provide insight into the social (collaboration) networks that are present. We present one such method based on analysis of citation networks, using the freely available CiteSpace Program. We use citation network analysis on three examples, including a single material that has been widely explored in the last decade (BiFeO3), two small subfields with a minimal number of authors (flexoelectricity and Kitaev physics), and a much wider field with thousands of publications pertaining to a single technique (scanning tunneling microscopy). Interpretation of the analysis and key insights into the fields, such as whether the fields are experiencing resurgence or stagnation, are discussed, and author or collaboration networks that are prominent are determined. Such methods represent a paradigm shift in our way of dealing with the large volume of scientific publications and could change the way literature searches and reviews are conducted, as well as how the impact of specific work is assessed.

  15. Active Flash: Out-of-core Data Analytics on Flash Storage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boboila, Simona; Kim, Youngjae; Vazhkudai, Sudharshan S

    2012-01-01

    Next generation science will increasingly come to rely on the ability to perform efficient, on-the-fly analytics of data generated by high-performance computing (HPC) simulations, modeling complex physical phenomena. Scientific computing workflows are stymied by the traditional chaining of simulation and data analysis, creating multiple rounds of redundant reads and writes to the storage system, which grows in cost with the ever-increasing gap between compute and storage speeds in HPC clusters. Recent HPC acquisitions have introduced compute node-local flash storage as a means to alleviate this I/O bottleneck. We propose a novel approach, Active Flash, to expedite data analysis pipelines by migrating to the location of the data, the flash device itself. We argue that Active Flash has the potential to enable true out-of-core data analytics by freeing up both the compute core and the associated main memory. By performing analysis locally, dependence on limited bandwidth to a central storage system is reduced, while allowing this analysis to proceed in parallel with the main application. In addition, offloading work from the host to the more power-efficient controller reduces peak system power usage, which is already in the megawatt range and poses a major barrier to HPC system scalability. We propose an architecture for Active Flash, explore energy and performance trade-offs in moving computation from host to storage, demonstrate the ability of appropriate embedded controllers to perform data analysis and reduction tasks at speeds sufficient for this application, and present a simulation study of Active Flash scheduling policies. These results show the viability of the Active Flash model, and its capability to potentially have a transformative impact on scientific data analysis.

  16. iSPHERE - A New Approach to Collaborative Research and Cloud Computing

    NASA Astrophysics Data System (ADS)

    Al-Ubaidi, T.; Khodachenko, M. L.; Kallio, E. J.; Harry, A.; Alexeev, I. I.; Vázquez-Poletti, J. L.; Enke, H.; Magin, T.; Mair, M.; Scherf, M.; Poedts, S.; De Causmaecker, P.; Heynderickx, D.; Congedo, P.; Manolescu, I.; Esser, B.; Webb, S.; Ruja, C.

    2015-10-01

    The project iSPHERE (integrated Scientific Platform for HEterogeneous Research and Engineering) that has been proposed for Horizon 2020 (EINFRA-9-2015, [1]) aims at creating a next generation Virtual Research Environment (VRE) that embraces existing and emerging technologies and standards in order to provide a versatile platform for scientific investigations and collaboration. The presentation will introduce the large project consortium, provide a comprehensive overview of iSPHERE's basic concepts and approaches, and outline general user requirements that the VRE will strive to satisfy. An overview of the envisioned architecture will be given, focusing on the adapted Service Bus concept, i.e. the "Scientific Service Bus" as it is called in iSPHERE. The bus will act as a central hub for all communication and user access, and will be implemented in the course of the project. The agile approach [2] that has been chosen for detailed elaboration and documentation of user requirements, as well as for the actual implementation of the system, will be outlined and its motivation and basic structure will be discussed. The presentation will show which user communities will benefit and which concrete problems that scientific investigations face today will be tackled by the system. Another focus of the presentation is iSPHERE's seamless integration of cloud computing resources and how these will benefit scientific modeling teams by providing a reliable, web based environment for cloud based model execution, storage of results, and comparison with measurements, including fully web based tools for data mining, analysis and visualization. The envisioned creation of a dedicated data model for experimental plasma physics will also be discussed. It will be shown why the Scientific Service Bus provides an ideal basis to integrate a number of data models and communication protocols and to provide mechanisms for data exchange across multiple and even multidisciplinary platforms.

  17. Toward a Climate OSSE for NASA Earth Sciences

    NASA Astrophysics Data System (ADS)

    Leroy, S. S.; Collins, W. D.; Feldman, D.; Field, R. D.; Ming, Y.; Pawson, S.; Sanderson, B.; Schmidt, G. A.

    2016-12-01

    In the Continuity Study, the National Academy of Sciences advised that future space missions be rated according to five categories: the importance of a well-defined scientific objective, the utility of the observation in addressing the scientific objective, the quality with which the observation can be made, the probability of the mission's success, and the mission's affordability. The importance, probability, and affordability are evaluated subjectively by scientific consensus, by engineering review panels, and by cost models; however, the utility and quality can be evaluated objectively by a climate observation system simulation experiment (COSSE). A discussion of the philosophical underpinnings of a COSSE for NASA Earth Sciences will be presented. A COSSE is built upon a perturbed physics ensemble of a sophisticated climate model that can simulate a mission's prospective observations and its well-defined quantitative scientific objective and that can capture the uncertainty associated with each. A strong correlation between observation and scientific objective after consideration of physical uncertainty leads to a high quality. Persistence of a high correlation after inclusion of the proposed measurement error leads to a high utility. There are five criteria that govern the nature of a particular COSSE: (1) whether the mission's scientific objective is one of hypothesis testing or climate prediction, (2) whether the mission is empirical or inferential, (3) whether the core climate model captures essential physical uncertainties, (4) the level of detail of the simulated observations, and (5) whether complementarity or redundancy of information is to be valued. Computation of the quality and utility is done using Bayesian statistics, as has been done previously for multi-decadal climate prediction conditioned on existing data. We advocate for a new program within NASA Earth Sciences to establish a COSSE capability. Creation of a COSSE program within NASA Earth Sciences will require answers from the climate research community to basic questions, such as whether a COSSE capability should be centralized or de-centralized. Most importantly, the quantified scientific objective of a proposed mission must be defined with extreme specificity for a COSSE to be applied.
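
    The quality/utility distinction described above can be made concrete with a toy calculation: quality is the correlation between a simulated observable and the scientific objective across a perturbed ensemble, and utility is the same correlation after measurement error is added. The Python sketch below uses purely synthetic numbers and is not the actual COSSE methodology.

        # Toy illustration of quality vs. utility; all numbers are synthetic.
        import numpy as np

        rng = np.random.default_rng(0)
        n_members = 200  # hypothetical perturbed-physics ensemble size

        objective = rng.normal(size=n_members)               # e.g. a climate metric
        observable = 0.8 * objective + rng.normal(scale=0.3, size=n_members)

        # Quality: correlation assuming a perfect instrument.
        quality = np.corrcoef(observable, objective)[0, 1]

        # Utility: the same correlation after proposed measurement error is included.
        measurement_error = rng.normal(scale=0.5, size=n_members)
        utility = np.corrcoef(observable + measurement_error, objective)[0, 1]

        print(f"quality (perfect instrument): {quality:.2f}")
        print(f"utility (with measurement error): {utility:.2f}")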

  18. XVIS: Visualization for the Extreme-Scale Scientific-Computation Ecosystem Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geveci, Berk; Maynard, Robert

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. The XVis project brought together collaborators from predominant DOE projects for visualization on accelerators, combining their respective features into a new visualization toolkit called VTK-m.

  19. Graphics processing units in bioinformatics, computational biology and systems biology.

    PubMed

    Nobile, Marco S; Cazzaniga, Paolo; Tangherloni, Andrea; Besozzi, Daniela

    2017-09-01

    Several studies in Bioinformatics, Computational Biology and Systems Biology rely on the definition of physico-chemical or mathematical models of biological systems at different scales and levels of complexity, ranging from the interaction of atoms in single molecules up to genome-wide interaction networks. Traditional computational methods and software tools developed in these research fields share a common trait: they can be computationally demanding on Central Processing Units (CPUs), therefore limiting their applicability in many circumstances. To overcome this issue, general-purpose Graphics Processing Units (GPUs) are gaining increasing attention from the scientific community, as they can considerably reduce the running time required by standard CPU-based software, and allow more intensive investigations of biological systems. In this review, we present a collection of GPU tools recently developed to perform computational analyses in life science disciplines, emphasizing the advantages and the drawbacks in the use of these parallel architectures. The complete list of GPU-powered tools here reviewed is available at http://bit.ly/gputools. © The Author 2016. Published by Oxford University Press.

  20. Join the Center for Applied Scientific Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamblin, Todd; Bremer, Timo; Van Essen, Brian

    The Center for Applied Scientific Computing serves as Livermore Lab’s window to the broader computer science, computational physics, applied mathematics, and data science research communities. In collaboration with academic, industrial, and other government laboratory partners, we conduct world-class scientific research and development on problems critical to national security. CASC applies the power of high-performance computing and the efficiency of modern computational methods to the realms of stockpile stewardship, cyber and energy security, and knowledge discovery for intelligence applications.

  1. 78 FR 41046 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-09

    ... Services Administration, notice is hereby given that the Advanced Scientific Computing Advisory Committee will be renewed for a two-year period beginning on July 1, 2013. The Committee will provide advice to the Director, Office of Science (DOE), on the Advanced Scientific Computing Research Program managed...

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagberg, Aric; Swart, Pieter; Schult, Daniel

    NetworkX is a Python language package for exploration and analysis of networks and network algorithms. The core package provides data structures for representing many types of networks, or graphs, including simple graphs, directed graphs, and graphs with parallel edges and self loops. The nodes in NetworkX graphs can be any (hashable) Python object and edges can contain arbitrary data; this flexibility makes NetworkX ideal for representing networks found in many different scientific fields. In addition to the basic data structures, many graph algorithms are implemented for calculating network properties and structure measures: shortest paths, betweenness centrality, clustering, degree distribution, and many more. NetworkX can read and write various graph formats for easy exchange with existing data, and provides generators for many classic graphs and popular graph models, such as the Erdos-Renyi, small-world, and Barabasi-Albert models. The ease-of-use and flexibility of the Python programming language together with connection to the SciPy tools make NetworkX a powerful tool for scientific computations. We discuss some of our recent work studying synchronization of coupled oscillators to demonstrate how NetworkX enables research in the field of computational networks.
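
    The sketch below exercises the NetworkX features named in this record: classic graph generators, clustering, shortest paths, and degree. The model parameters are arbitrary and chosen only for illustration.

        # Generate the random-graph models mentioned above and compute basic measures.
        import networkx as nx

        er = nx.erdos_renyi_graph(n=100, p=0.05, seed=42)         # Erdos-Renyi
        ws = nx.watts_strogatz_graph(n=100, k=4, p=0.1, seed=42)  # small world
        ba = nx.barabasi_albert_graph(n=100, m=2, seed=42)        # Barabasi-Albert

        for name, g in [("Erdos-Renyi", er), ("small world", ws), ("Barabasi-Albert", ba)]:
            # Average shortest path length is only defined for connected graphs.
            apl = (nx.average_shortest_path_length(g)
                   if nx.is_connected(g) else float("nan"))
            print(name,
                  "clustering:", round(nx.average_clustering(g), 3),
                  "avg path length:", round(apl, 3),
                  "max degree:", max(d for _, d in g.degree()))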

  3. A Virtual Rock Physics Laboratory Through Visualized and Interactive Experiments

    NASA Astrophysics Data System (ADS)

    Vanorio, T.; Di Bonito, C.; Clark, A. C.

    2014-12-01

    As new scientific challenges demand more comprehensive and multidisciplinary investigations, laboratory experiments are not expected to become simpler and/or faster. Experimental investigation is an indispensable element of scientific inquiry and must play a central role in the way current and future generations of scientists make decisions. To turn the complexity of laboratory work (and that of rocks!) into dexterity, engagement, and expanded learning opportunities, we are building an interactive, virtual laboratory reproducing in form and function the Stanford Rock Physics Laboratory at Stanford University. The objective is to combine lectures on laboratory techniques and an online repository of visualized experiments consisting of interactive, 3-D renderings of equipment used to measure properties central to the study of rock physics (e.g., how to saturate rocks, how to measure porosity, permeability, and elastic wave velocity). We use a game creation system together with 3-D computer graphics, and a narrative voice to guide the user through the different phases of the experimental protocol. The main advantage gained in employing computer graphics over video footage is that students can virtually open the instrument, single out its components, and assemble it. Most importantly, it helps describe the processes occurring within the rock. The latter cannot be tracked by simply recording the physical experiment, but computer animation can efficiently illustrate what happens inside rock samples (e.g., describing acoustic waves, and/or fluid flow through a porous rock under pressure within an opaque core-holder - Figure 1). The repository of visualized experiments will complement lectures on laboratory techniques and constitute an on-line course offered through the EdX platform at Stanford. This will provide a virtual laboratory for anyone, anywhere, to facilitate teaching/learning of introductory laboratory classes in Geophysics and expand the number of courses that can be offered for curricula in Earth Sciences. The primary goal is to open up a research laboratory such as the one available at Stanford to promising students worldwide who are currently left out of such educational resources.

  4. CRAY mini manual. Revision D

    NASA Technical Reports Server (NTRS)

    Tennille, Geoffrey M.; Howser, Lona M.

    1993-01-01

    This document briefly describes the use of the CRAY supercomputers that are an integral part of the Supercomputing Network Subsystem of the Central Scientific Computing Complex at LaRC. Features of the CRAY supercomputers are covered, including: FORTRAN, C, PASCAL, architectures of the CRAY-2 and CRAY Y-MP, the CRAY UNICOS environment, batch job submittal, debugging, performance analysis, parallel processing, utilities unique to CRAY, and documentation. The document is intended for all CRAY users as a ready reference to frequently asked questions and to more detailed information contained in the vendor manuals. It is appropriate for both the novice and the experienced user.

  5. Whole earth modeling: developing and disseminating scientific software for computational geophysics.

    NASA Astrophysics Data System (ADS)

    Kellogg, L. H.

    2016-12-01

    Historically, a great deal of specialized scientific software for modeling and data analysis has been developed by individual researchers or small groups of scientists working on their own specific research problems. As the magnitude of available data and computer power has increased, so has the complexity of scientific problems addressed by computational methods, creating both a need to sustain existing scientific software and to expand its development to take advantage of new algorithms, new software approaches, and new computational hardware. To that end, communities like the Computational Infrastructure for Geodynamics (CIG) have been established to support the use of best practices in scientific computing for solid earth geophysics research and teaching. Working as a scientific community enables computational geophysicists to take advantage of technological developments, improve the accuracy and performance of software, build on prior software development, and collaborate more readily. The CIG community, and others, have adopted an open-source development model, in which code is developed and disseminated by the community in an open fashion, using version control and software repositories like Git. One emerging issue is how to adequately identify and credit the intellectual contributions involved in creating open source scientific software. The traditional method of disseminating scientific ideas, peer-reviewed publication, was not designed for reviewing or crediting scientific software, although emerging publication strategies such as software journals are attempting to address the need. We are piloting an integrated approach in which authors are identified and credited as scientific software is developed and run. Successful software citation requires integration with the scholarly publication and indexing mechanisms as well, to assign credit, ensure discoverability, and provide provenance for software.

  6. Managing data from multiple disciplines, scales, and sites to support synthesis and modeling

    USGS Publications Warehouse

    Olson, R. J.; Briggs, J. M.; Porter, J.H.; Mah, Grant R.; Stafford, S.G.

    1999-01-01

    The synthesis and modeling of ecological processes at multiple spatial and temporal scales involves bringing together and sharing data from numerous sources. This article describes a data and information system model that facilitates assembling, managing, and sharing diverse data from multiple disciplines, scales, and sites to support integrated ecological studies. Cross-site scientific-domain working groups coordinate the development of data associated with their particular scientific working group, including decisions about data requirements, data to be compiled, data formats, derived data products, and schedules across the sites. The Web-based data and information system consists of nodes for each working group plus a central node that provides data access, project information, data query, and other functionality. The approach incorporates scientists and computer experts in the working groups and provides incentives for individuals to submit documented data to the data and information system.

  7. Scientific visualization of volumetric radar cross section data

    NASA Astrophysics Data System (ADS)

    Wojszynski, Thomas G.

    1992-12-01

    For aircraft design and mission planning, designers, threat analysts, mission planners, and pilots require a Radar Cross Section (RCS) central tendency with its associated distribution about a specified aspect and its relation to a known threat. Historically, RCS data sets have been statistically analyzed to evaluate a d profile. However, Scientific Visualization, the application of computer graphics techniques to produce pictures of complex physical phenomena, appears to be a more promising tool to interpret this data. This work describes data reduction techniques and a surface rendering algorithm to construct and display a complex polyhedron from adjacent contours of RCS data. Data reduction is accomplished by sectorizing the data and characterizing the statistical properties of the data. Color, lighting, and orientation cues are added to complete the visualization system. The tool may be useful for synthesis, design, and analysis of complex, low observable air vehicles.

  8. Scale-Independent Bibliometric Indicators

    ERIC Educational Resources Information Center

    Katz, J. Sylvan

    2005-01-01

    This article presents the author's critique of Anthony F. J. van Raan's article titled, "Measurement of Central Aspects of Scientific Research: Performance, Interdisciplinarity, Structure." van Raan makes an excellent case for using bibliometric data to measure some central aspects of scientific research and to construct indicators of…

  9. Integrating Data Base into the Elementary School Science Program.

    ERIC Educational Resources Information Center

    Schlenker, Richard M.

    This document describes seven science activities that combine scientific principles and computers. The objectives for the activities are to show students how the computer can be used as a tool to store and arrange scientific data, provide students with experience using the computer as a tool to manage scientific data, and provide students with…

  10. A high performance scientific cloud computing environment for materials simulations

    NASA Astrophysics Data System (ADS)

    Jorissen, K.; Vila, F. D.; Rehr, J. J.

    2012-09-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a JAVA-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.
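
    Outside the SCC toolset itself, the on-demand creation of a virtual cluster can be illustrated with a minimal boto3 request for a group of EC2 instances. The AMI ID, instance type, and key pair below are placeholders, not values from the paper.

        # Minimal sketch of requesting an on-demand virtual cluster on EC2 with boto3.
        # The AMI ID, instance type, and key pair are placeholders to be replaced.
        import boto3

        ec2 = boto3.client("ec2", region_name="us-east-1")
        response = ec2.run_instances(
            ImageId="ami-0123456789abcdef0",  # placeholder scientific virtual machine image
            InstanceType="c5.xlarge",         # placeholder instance type
            KeyName="my-keypair",             # placeholder SSH key pair
            MinCount=4,                       # size of the virtual cluster
            MaxCount=4,
        )
        for instance in response["Instances"]:
            print(instance["InstanceId"], instance["State"]["Name"])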

  11. Software engineering and automatic continuous verification of scientific software

    NASA Astrophysics Data System (ADS)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

    Software engineering of scientific code is challenging for a number of reasons including pressure to publish and a lack of awareness of the pitfalls of software engineering by scientists. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employ best practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people for which we use bazaar for revision control, making good use of the strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features, and bug reporting, gives the group, partners and other Fluidity users an easy-to-use platform to collaborate and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. The testing of code in this manner leads to a continuous verification process, not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparisons to analytical solutions via the method of manufactured solutions. By developing and verifying code in tandem we avoid a number of pitfalls in scientific software development and advocate similar procedures for other scientific code applications.
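
    To make the "comparison against analytical solutions" style of automated verification concrete, the following self-contained sketch tests a trivial finite-difference solver against its exact solution and checks that the error shrinks under grid refinement. The solver is hypothetical and is not Fluidity code.

        # Sketch of a verification test comparing a numerical result to an analytical
        # solution; the toy solver stands in for a real CFD code such as Fluidity.
        import numpy as np

        def solve_poisson(n_points):
            """Solve -u'' = pi^2 sin(pi x) on [0, 1] with u(0) = u(1) = 0."""
            x = np.linspace(0.0, 1.0, n_points)
            h = x[1] - x[0]
            n = n_points - 2  # interior points
            A = (np.diag(np.full(n, 2.0))
                 - np.diag(np.ones(n - 1), 1)
                 - np.diag(np.ones(n - 1), -1)) / h**2
            f = np.pi**2 * np.sin(np.pi * x[1:-1])
            u = np.zeros(n_points)
            u[1:-1] = np.linalg.solve(A, f)
            return x, u

        def test_converges_to_analytical_solution():
            # The exact solution is sin(pi x); the error should decrease as the
            # grid is refined. A test like this runs on every commit.
            errors = []
            for n in (17, 33, 65):
                x, u = solve_poisson(n)
                errors.append(np.max(np.abs(u - np.sin(np.pi * x))))
            assert errors[0] > errors[1] > errors[2]

        test_converges_to_analytical_solution()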

  12. Constructing Scientific Arguments Using Evidence from Dynamic Computational Climate Models

    ERIC Educational Resources Information Center

    Pallant, Amy; Lee, Hee-Sun

    2015-01-01

    Modeling and argumentation are two important scientific practices students need to develop throughout school years. In this paper, we investigated how middle and high school students (N = 512) construct a scientific argument based on evidence from computational models with which they simulated climate change. We designed scientific argumentation…

  13. The rise of machine consciousness: studying consciousness with computational models.

    PubMed

    Reggia, James A

    2013-08-01

    Efforts to create computational models of consciousness have accelerated over the last two decades, creating a field that has become known as artificial consciousness. There have been two main motivations for this controversial work: to develop a better scientific understanding of the nature of human/animal consciousness and to produce machines that genuinely exhibit conscious awareness. This review begins by briefly explaining some of the concepts and terminology used by investigators working on machine consciousness, and summarizes key neurobiological correlates of human consciousness that are particularly relevant to past computational studies. Models of consciousness developed over the last twenty years are then surveyed. These models are largely found to fall into five categories based on the fundamental issue that their developers have selected as being most central to consciousness: a global workspace, information integration, an internal self-model, higher-level representations, or attention mechanisms. For each of these five categories, an overview of past work is given, a representative example is presented in some detail to illustrate the approach, and comments are provided on the contributions and limitations of the methodology. Three conclusions are offered about the state of the field based on this review: (1) computational modeling has become an effective and accepted methodology for the scientific study of consciousness, (2) existing computational models have successfully captured a number of neurobiological, cognitive, and behavioral correlates of conscious information processing as machine simulations, and (3) no existing approach to artificial consciousness has presented a compelling demonstration of phenomenal machine consciousness, or even clear evidence that artificial phenomenal consciousness will eventually be possible. The paper concludes by discussing the importance of continuing work in this area, considering the ethical issues it raises, and making predictions concerning future developments. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Parallel processing for scientific computations

    NASA Technical Reports Server (NTRS)

    Alkhatib, Hasan S.

    1995-01-01

    The scope of this project dealt with the investigation of the requirements to support distributed computing of scientific computations over a cluster of cooperative workstations. Various experiments on computations for the solution of simultaneous linear equations were performed in the early phase of the project to gain experience in the general nature and requirements of scientific applications. A specification of a distributed integrated computing environment, DICE, based on a distributed shared memory communication paradigm has been developed and evaluated. The distributed shared memory model facilitates porting existing parallel algorithms that have been designed for shared memory multiprocessor systems to the new environment. The potential of this new environment is to provide supercomputing capability through the utilization of the aggregate power of workstations cooperating in a cluster interconnected via a local area network. Workstations, generally, do not have the computing power to tackle complex scientific applications, making them primarily useful for visualization, data reduction, and filtering as far as complex scientific applications are concerned. There is a tremendous amount of computing power that is left unused in a network of workstations. Very often a workstation is simply sitting idle on a desk. A set of tools can be developed to take advantage of this potential computing power to create a platform suitable for large scientific computations. The integration of several workstations into a logical cluster of distributed, cooperative, computing stations presents an alternative to shared memory multiprocessor systems. In this project we designed and evaluated such a system.

  15. Computational biology and bioinformatics in Nigeria.

    PubMed

    Fatumo, Segun A; Adoga, Moses P; Ojo, Opeolu O; Oluwagbemi, Olugbenga; Adeoye, Tolulope; Ewejobi, Itunuoluwa; Adebiyi, Marion; Adebiyi, Ezekiel; Bewaji, Clement; Nashiru, Oyekanmi

    2014-04-01

    Over the past few decades, major advances in the field of molecular biology, coupled with advances in genomic technologies, have led to an explosive growth in the biological data generated by the scientific community. The critical need to process and analyze such a deluge of data and turn it into useful knowledge has caused bioinformatics to gain prominence and importance. Bioinformatics is an interdisciplinary research area that applies techniques, methodologies, and tools in computer and information science to solve biological problems. In Nigeria, bioinformatics has recently played a vital role in the advancement of biological sciences. As a developing country, the importance of bioinformatics is rapidly gaining acceptance, and bioinformatics groups comprised of biologists, computer scientists, and computer engineers are being constituted at Nigerian universities and research institutes. In this article, we present an overview of bioinformatics education and research in Nigeria. We also discuss professional societies and academic and research institutions that play central roles in advancing the discipline in Nigeria. Finally, we propose strategies that can bolster bioinformatics education and support from policy makers in Nigeria, with potential positive implications for other developing countries.

  16. Computational Biology and Bioinformatics in Nigeria

    PubMed Central

    Fatumo, Segun A.; Adoga, Moses P.; Ojo, Opeolu O.; Oluwagbemi, Olugbenga; Adeoye, Tolulope; Ewejobi, Itunuoluwa; Adebiyi, Marion; Adebiyi, Ezekiel; Bewaji, Clement; Nashiru, Oyekanmi

    2014-01-01

    Over the past few decades, major advances in the field of molecular biology, coupled with advances in genomic technologies, have led to an explosive growth in the biological data generated by the scientific community. The critical need to process and analyze such a deluge of data and turn it into useful knowledge has caused bioinformatics to gain prominence and importance. Bioinformatics is an interdisciplinary research area that applies techniques, methodologies, and tools in computer and information science to solve biological problems. In Nigeria, bioinformatics has recently played a vital role in the advancement of biological sciences. As a developing country, the importance of bioinformatics is rapidly gaining acceptance, and bioinformatics groups comprised of biologists, computer scientists, and computer engineers are being constituted at Nigerian universities and research institutes. In this article, we present an overview of bioinformatics education and research in Nigeria. We also discuss professional societies and academic and research institutions that play central roles in advancing the discipline in Nigeria. Finally, we propose strategies that can bolster bioinformatics education and support from policy makers in Nigeria, with potential positive implications for other developing countries. PMID:24763310

  17. Root resorption during orthodontic treatment.

    PubMed

    Walker, Sally

    2010-01-01

    Medline, Embase, LILACS, The Cochrane Library (Cochrane Database of Systematic Reviews, CENTRAL, and Cochrane Oral Health Group Trials Register) Web of Science, EBM Reviews, Computer Retrieval of Information on Scientific Project (CRISP, www.crisp.cit.nih.gov), On-Line Computer Library Center (www.oclc.org), Google Index to Scientific and Technical Proceedings, PAHO (www.paho.org), WHOLis (www.who.int/library/databases/en), BBO (Brazilian Bibliography of Dentistry), CEPS (Chinese Electronic Periodical Services), Conference materials (www.bl.uk/services/bsds/dsc/conference.html), ProQuest Dissertation Abstracts and Thesis database, TrialCentral (www.trialscentral.org), National Research Register (www.controlled-trials.com), www.Clinicaltrials.gov and SIGLE (System for Information on Grey Literature in Europe). Randomised controlled trials including split mouth design, recording the presence or absence of external apical root resorption (EARR) by treatment group at the end of the treatment period. Data were extracted independently by two reviewers using specially designed and piloted forms. Quality was also assessed independently by the same reviewers. After evaluating titles and abstracts, 144 full articles were obtained of which 13 articles, describing 11 trials, fulfilled the criteria for inclusion. Differences in the methodological approaches and reporting results made quantitative statistical comparisons impossible. Evidence suggests that comprehensive orthodontic treatment causes increased incidence and severity of root resorption, and heavy forces might be particularly harmful. Orthodontically induced inflammatory root resorption is unaffected by archwire sequencing, bracket prescription, and self-ligation. Previous trauma and tooth morphology are unlikely causative factors. There is some evidence that a two- to three-month pause in treatment decreases total root resorption. The results were inconclusive in the clinical management of root resorption, but there is evidence to support the use of light forces, especially with incisor intrusion.

  18. Tuning the cache memory usage in tomographic reconstruction on standard computers with Advanced Vector eXtensions (AVX)

    PubMed Central

    Agulleiro, Jose-Ignacio; Fernandez, Jose-Jesus

    2015-01-01

    Cache blocking is a technique widely used in scientific computing to minimize the exchange of information with main memory by reusing the data kept in cache memory. In tomographic reconstruction on standard computers using vector instructions, cache blocking turns out to be central to optimize performance. To this end, sinograms of the tilt-series and slices of the volumes to be reconstructed have to be divided into small blocks that fit into the different levels of cache memory. The code is then reorganized so as to operate with a block as much as possible before proceeding with another one. This data article is related to the research article titled Tomo3D 2.0 – Exploitation of Advanced Vector eXtensions (AVX) for 3D reconstruction (Agulleiro and Fernandez, 2015) [1]. Here we present data of a thorough study of the performance of tomographic reconstruction by varying cache block sizes, which allows derivation of expressions for their automatic quasi-optimal tuning. PMID:26217710
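
    The block decomposition described above can be sketched as follows; Python is used only to show the loop structure, not the AVX-vectorized C++ of Tomo3D itself, and the "backprojection-like" update is a stand-in for the real reconstruction kernel.

        # Sketch of cache blocking: split a slice into blocks small enough to stay
        # cache-resident and reuse each block across the whole tilt-series.
        import numpy as np

        def blocked_pass(sinogram, volume_slice, block_size=64):
            rows, cols = volume_slice.shape
            for r0 in range(0, rows, block_size):
                for c0 in range(0, cols, block_size):
                    block = volume_slice[r0:r0 + block_size, c0:c0 + block_size]
                    # Reuse the block for every projection before moving on;
                    # this reuse is the essence of cache blocking.
                    for projection in sinogram:
                        block += projection[c0:c0 + block.shape[1]]
            return volume_slice

        sino = np.random.rand(180, 512)  # hypothetical tilt-series sinogram
        recon = blocked_pass(sino, np.zeros((512, 512)))
        print(recon.shape)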

  19. Tuning the cache memory usage in tomographic reconstruction on standard computers with Advanced Vector eXtensions (AVX).

    PubMed

    Agulleiro, Jose-Ignacio; Fernandez, Jose-Jesus

    2015-06-01

    Cache blocking is a technique widely used in scientific computing to minimize the exchange of information with main memory by reusing the data kept in cache memory. In tomographic reconstruction on standard computers using vector instructions, cache blocking turns out to be central to optimize performance. To this end, sinograms of the tilt-series and slices of the volumes to be reconstructed have to be divided into small blocks that fit into the different levels of cache memory. The code is then reorganized so as to operate with a block as much as possible before proceeding with another one. This data article is related to the research article titled Tomo3D 2.0 - Exploitation of Advanced Vector eXtensions (AVX) for 3D reconstruction (Agulleiro and Fernandez, 2015) [1]. Here we present data of a thorough study of the performance of tomographic reconstruction by varying cache block sizes, which allows derivation of expressions for their automatic quasi-optimal tuning.

  20. Biotea: semantics for Pubmed Central.

    PubMed

    Garcia, Alexander; Lopez, Federico; Garcia, Leyla; Giraldo, Olga; Bucheli, Victor; Dumontier, Michel

    2018-01-01

    A significant portion of biomedical literature is represented in a manner that makes it difficult for consumers to find or aggregate content through a computational query. One approach to facilitate reuse of the scientific literature is to structure this information as linked data using standardized web technologies. In this paper we present the second version of Biotea, a semantic, linked data version of the open-access subset of PubMed Central that has been enhanced with specialized annotation pipelines that uses existing infrastructure from the National Center for Biomedical Ontology. We expose our models, services, software and datasets. Our infrastructure enables manual and semi-automatic annotation, resulting data are represented as RDF-based linked data and can be readily queried using the SPARQL query language. We illustrate the utility of our system with several use cases. Our datasets, methods and techniques are available at http://biotea.github.io.

  1. OMPC: an Open-Source MATLAB®-to-Python Compiler

    PubMed Central

    Jurica, Peter; van Leeuwen, Cees

    2008-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com. PMID:19225577
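
    The kind of syntax gap OMPC bridges can be illustrated with a small hand-translated fragment; the MATLAB® code appears in comments and the NumPy equivalent below it. This is an illustration only, not OMPC output.

        # Hand translation of a small MATLAB fragment into NumPy, illustrating the
        # 1-based vs. 0-based indexing and operator differences OMPC automates.
        import numpy as np

        # MATLAB:
        #   A = magic(4);
        #   v = A(2, :);        % 1-based indexing
        #   s = sum(v .^ 2);    % element-wise power
        A = np.array([[16, 2, 3, 13],
                      [5, 11, 10, 8],
                      [9, 7, 6, 12],
                      [4, 14, 15, 1]])   # the magic(4) square
        v = A[1, :]                      # 0-based indexing, second row
        s = np.sum(v ** 2)
        print(s)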

  2. Development of the Central Dogma Concept Inventory (CDCI) Assessment Tool

    ERIC Educational Resources Information Center

    Newman, Dina L.; Snyder, Christopher W.; Fisk, J. Nick; Wright, L. Kate

    2016-01-01

    Scientific teaching requires scientifically constructed, field-tested instruments to accurately evaluate student thinking and gauge teacher effectiveness. We have developed a 23-question, multiple-select format assessment of student understanding of the essential concepts of the central dogma of molecular biology that is appropriate for all…

  3. Scientific Computing Paradigm

    NASA Technical Reports Server (NTRS)

    VanZandt, John

    1994-01-01

    The usage model of supercomputers for scientific applications, such as computational fluid dynamics (CFD), has changed over the years. Scientific visualization has moved scientists away from looking at numbers to looking at three-dimensional images, which capture the meaning of the data. This change has impacted the system models for computing. This report details the model which is used by scientists at NASA's research centers.

  4. Commentary: Considerations in Pedagogy and Assessment in the Use of Computers to Promote Learning about Scientific Models

    ERIC Educational Resources Information Center

    Adams, Stephen T.

    2004-01-01

    Although one role of computers in science education is to help students learn specific science concepts, computers are especially intriguing as a vehicle for fostering the development of epistemological knowledge about the nature of scientific knowledge--what it means to "know" in a scientific sense (diSessa, 1985). In this vein, the…

  5. High-End Scientific Computing

    EPA Pesticide Factsheets

    EPA uses high-end scientific computing, geospatial services and remote sensing/imagery analysis to support EPA's mission. The Center for Environmental Computing (CEC) assists the Agency's program offices and regions to meet staff needs in these areas.

  6. Using the High-Level Based Program Interface to Facilitate the Large Scale Scientific Computing

    PubMed Central

    Shang, Yizi; Shang, Ling; Gao, Chuanchang; Lu, Guiming; Ye, Yuntao; Jia, Dongdong

    2014-01-01

    This paper presents further research on facilitating large-scale scientific computing on grid and desktop grid platforms. The related issues include the programming method, the overhead of middleware based on a high-level program interface, and data anticipation migration. The block-based Gauss-Jordan algorithm, as a real example of large-scale scientific computing, is used to evaluate the issues presented above. The results show that the high-level program interface makes complex scientific applications on a large-scale scientific platform easier to develop, though a little overhead is unavoidable. Also, the data anticipation migration mechanism can improve the efficiency of the platform when it needs to process big-data-based scientific applications. PMID:24574931
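
    For concreteness, the sketch below shows a plain, unblocked Gauss-Jordan inversion in Python; the paper's block-based variant applies the same elimination steps to sub-matrix blocks distributed over the grid, which is not reproduced here.

        # Unblocked Gauss-Jordan matrix inversion with partial pivoting.
        import numpy as np

        def gauss_jordan_inverse(A):
            n = A.shape[0]
            M = np.hstack([A.astype(float), np.eye(n)])  # augmented [A | I]
            for col in range(n):
                # Partial pivoting for numerical stability.
                pivot = col + np.argmax(np.abs(M[col:, col]))
                M[[col, pivot]] = M[[pivot, col]]
                M[col] /= M[col, col]
                for row in range(n):
                    if row != col:
                        M[row] -= M[row, col] * M[col]
            return M[:, n:]

        A = np.array([[4.0, 2.0], [7.0, 6.0]])
        print(np.allclose(gauss_jordan_inverse(A) @ A, np.eye(2)))  # True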

  7. Hot Spots and Hot Moments in Scientific Collaborations and Social Movements

    ERIC Educational Resources Information Center

    Parker, John N.; Hackett, Edward J.

    2012-01-01

    Emotions are essential but little understood components of research; they catalyze and sustain creative scientific work and fuel the scientific and intellectual social movements (SIMs) that propel scientific change. Adopting a micro-sociological focus, we examine how emotions shape two intellectual processes central to all scientific work:…

  8. Defining Computational Thinking for Mathematics and Science Classrooms

    ERIC Educational Resources Information Center

    Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri

    2016-01-01

    Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new…

  9. Ermittlung von Wortstaemmen in russischen wissenschaftlichen Fachsprachen mit Hilfe des Computers (Establishing Word Stems in Scientific Russian With the Aid of a Computer)

    ERIC Educational Resources Information Center

    Halbauer, Siegfried

    1976-01-01

    It was considered that students of intensive scientific Russian courses could learn vocabulary more efficiently if they were taught word stems and how to combine them with prefixes and suffixes to form scientific words. The computer programs developed to identify the most important stems are discussed. (Text is in German.) (FB)

  10. Centralized Planning for Multiple Exploratory Robots

    NASA Technical Reports Server (NTRS)

    Estlin, Tara; Rabideau, Gregg; Chien, Steve; Barrett, Anthony

    2005-01-01

    A computer program automatically generates plans for a group of robotic vehicles (rovers) engaged in geological exploration of terrain. The program rapidly generates multiple command sequences that can be executed simultaneously by the rovers. Starting from a set of high-level goals, the program creates a sequence of commands for each rover while respecting hardware constraints and limitations on resources of each rover and of hardware (e.g., a radio communication terminal) shared by all the rovers. First, a separate model of each rover is loaded into a centralized planning subprogram. The centralized planning software uses the models of the rovers plus an iterative repair algorithm to resolve conflicts posed by demands for resources and by constraints associated with all the rovers and the shared hardware. During repair, heuristics are used to make planning decisions that will result in solutions that will be better and will be found faster than would otherwise be possible. In particular, techniques from prior solutions of the multiple-traveling-salesmen problem are used as heuristics to generate plans in which the paths taken by the rovers to assigned scientific targets are shorter than they would otherwise be.
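
    A toy version of the multiple-traveling-salesmen flavor of heuristic mentioned above is sketched below: each rover greedily takes its nearest unvisited science target. The positions are hypothetical, and this illustrates only the heuristic idea, not the iterative-repair planner.

        # Greedy nearest-target assignment for several rovers (toy heuristic).
        import math

        def plan_tours(rover_positions, targets):
            remaining = list(targets)
            tours = {rover: [pos] for rover, pos in rover_positions.items()}
            while remaining:
                for rover, tour in tours.items():
                    if not remaining:
                        break
                    here = tour[-1]
                    nearest = min(remaining, key=lambda t: math.dist(here, t))
                    tour.append(nearest)
                    remaining.remove(nearest)
            return tours

        rovers = {"rover1": (0.0, 0.0), "rover2": (10.0, 10.0)}
        targets = [(1.0, 2.0), (9.0, 9.0), (3.0, 1.0), (8.0, 11.0)]
        for rover, tour in plan_tours(rovers, targets).items():
            print(rover, "->", tour[1:])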

  11. OMPC: an Open-Source MATLAB-to-Python Compiler.

    PubMed

    Jurica, Peter; van Leeuwen, Cees

    2009-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com.

  12. Parallel computing works

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    An account of the Caltech Concurrent Computation Program (C³P), a five year project that focused on answering the question: "Can parallel computers be used to do large-scale scientific computations?" As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  13. Exploring Cloud Computing for Large-scale Scientific Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Guang; Han, Binh; Yin, Jian

    This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on-demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications that often just require a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high performance hardware with low latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a system biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.

  14. Causal biological network database: a comprehensive platform of causal biological network models focused on the pulmonary and vascular systems

    PubMed Central

    Boué, Stéphanie; Talikka, Marja; Westra, Jurjen Willem; Hayes, William; Di Fabio, Anselmo; Park, Jennifer; Schlage, Walter K.; Sewer, Alain; Fields, Brett; Ansari, Sam; Martin, Florian; Veljkovic, Emilija; Kenney, Renee; Peitsch, Manuel C.; Hoeng, Julia

    2015-01-01

    With the wealth of publications and data available, powerful and transparent computational approaches are required to represent measured data and scientific knowledge in a computable and searchable format. We developed a set of biological network models, scripted in the Biological Expression Language, that reflect causal signaling pathways across a wide range of biological processes, including cell fate, cell stress, cell proliferation, inflammation, tissue repair and angiogenesis in the pulmonary and cardiovascular context. This comprehensive collection of networks is now freely available to the scientific community in a centralized web-based repository, the Causal Biological Network database, which is composed of over 120 manually curated and well annotated biological network models and can be accessed at http://causalbionet.com. The website accesses a MongoDB, which stores all versions of the networks as JSON objects and allows users to search for genes, proteins, biological processes, small molecules and keywords in the network descriptions to retrieve biological networks of interest. The content of the networks can be visualized and browsed. Nodes and edges can be filtered and all supporting evidence for the edges can be browsed and is linked to the original articles in PubMed. Moreover, networks may be downloaded for further visualization and evaluation. Database URL: http://causalbionet.com PMID:25887162

  15. NAS technical summaries: Numerical aerodynamic simulation program, March 1991 - February 1992

    NASA Technical Reports Server (NTRS)

    1992-01-01

    NASA created the Numerical Aerodynamic Simulation (NAS) Program in 1987 to focus resources on solving critical problems in aeroscience and related disciplines by utilizing the power of the most advanced supercomputers available. The NAS Program provides scientists with the necessary computing power to solve today's most demanding computational fluid dynamics problems and serves as a pathfinder in integrating leading-edge supercomputing technologies, thus benefiting other supercomputer centers in Government and industry. This report contains selected scientific results from the 1991-92 NAS Operational Year, March 4, 1991 to March 3, 1992, which is the fifth year of operation. During this year, the scientific community was given access to a Cray-2 and a Cray Y-MP. The Cray-2, the first generation supercomputer, has four processors, 256 megawords of central memory, and a total sustained speed of 250 million floating point operations per second. The Cray Y-MP, the second generation supercomputer, has eight processors and a total sustained speed of one billion floating point operations per second. Additional memory was installed this year, doubling capacity from 128 to 256 megawords of solid-state storage-device memory. Because of its higher performance, the Cray Y-MP delivered approximately 77 percent of the total number of supercomputer hours used during this year.

  16. [Earth Science Technology Office's Computational Technologies Project

    NASA Technical Reports Server (NTRS)

    Fischer, James (Technical Monitor); Merkey, Phillip

    2005-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies Project, to engage the Beowulf Cluster Computing Community as well as the High Performance Computing Research Community so that we can predict the applicability of said technologies to the scientific community represented by the CT project and formulate long term strategies to provide the computational resources necessary to attain the anticipated scientific objectives of the CT project. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements and capabilities of high-performance computers to satisfy this anticipated need.

  17. Nephele: a cloud platform for simplified, standardized and reproducible microbiome data analysis.

    PubMed

    Weber, Nick; Liou, David; Dommer, Jennifer; MacMenamin, Philip; Quiñones, Mariam; Misner, Ian; Oler, Andrew J; Wan, Joe; Kim, Lewis; Coakley McCarthy, Meghan; Ezeji, Samuel; Noble, Karlynn; Hurt, Darrell E

    2018-04-15

    Widespread interest in the study of the microbiome has resulted in data proliferation and the development of powerful computational tools. However, many scientific researchers lack the time, training, or infrastructure to work with large datasets or to install and use command line tools. The National Institute of Allergy and Infectious Diseases (NIAID) has created Nephele, a cloud-based microbiome data analysis platform with standardized pipelines and a simple web interface for transforming raw data into biological insights. Nephele integrates common microbiome analysis tools as well as valuable reference datasets like the healthy human subjects cohort of the Human Microbiome Project (HMP). Nephele is built on the Amazon Web Services cloud, which provides centralized and automated storage and compute capacity, thereby reducing the burden on researchers and their institutions. https://nephele.niaid.nih.gov and https://github.com/niaid/Nephele. darrell.hurt@nih.gov.

  18. POM.gpu-v1.0: a GPU-based Princeton Ocean Model

    NASA Astrophysics Data System (ADS)

    Xu, S.; Huang, X.; Oey, L.-Y.; Xu, F.; Fu, H.; Zhang, Y.; Yang, G.

    2015-09-01

    Graphics processing units (GPUs) are an attractive solution in many scientific applications due to their high performance. However, most existing GPU conversions of climate models use GPUs for only a few computationally intensive regions. In the present study, we redesign the mpiPOM (a parallel version of the Princeton Ocean Model) with GPUs. Specifically, we first convert the model from its original Fortran form to a new Compute Unified Device Architecture C (CUDA-C) code, then we optimize the code on each of the GPUs, the communications between the GPUs, and the I/O between the GPUs and the central processing units (CPUs). We show that the performance of the new model on a workstation containing four GPUs is comparable to that on a powerful cluster with 408 standard CPU cores, and it reduces the energy consumption by a factor of 6.8.

  19. Nephele: a cloud platform for simplified, standardized and reproducible microbiome data analysis

    PubMed Central

    Weber, Nick; Liou, David; Dommer, Jennifer; MacMenamin, Philip; Quiñones, Mariam; Misner, Ian; Oler, Andrew J; Wan, Joe; Kim, Lewis; Coakley McCarthy, Meghan; Ezeji, Samuel; Noble, Karlynn; Hurt, Darrell E

    2018-01-01

    Motivation: Widespread interest in the study of the microbiome has resulted in data proliferation and the development of powerful computational tools. However, many scientific researchers lack the time, training, or infrastructure to work with large datasets or to install and use command line tools. Results: The National Institute of Allergy and Infectious Diseases (NIAID) has created Nephele, a cloud-based microbiome data analysis platform with standardized pipelines and a simple web interface for transforming raw data into biological insights. Nephele integrates common microbiome analysis tools as well as valuable reference datasets like the healthy human subjects cohort of the Human Microbiome Project (HMP). Nephele is built on the Amazon Web Services cloud, which provides centralized and automated storage and compute capacity, thereby reducing the burden on researchers and their institutions. Availability and implementation: https://nephele.niaid.nih.gov and https://github.com/niaid/Nephele. Contact: darrell.hurt@nih.gov. PMID: 29028892

  20. Geoinformatics in the public service: building a cyberinfrastructure across the geological surveys

    USGS Publications Warehouse

    Allison, M. Lee; Gundersen, Linda C.; Richard, Stephen M.; Keller, G. Randy; Baru, Chaitanya

    2011-01-01

    Advanced information technology infrastructure is increasingly being employed in the Earth sciences to provide researchers with efficient access to massive central databases and to integrate diversely formatted information from a variety of sources. These geoinformatics initiatives enable manipulation, modeling and visualization of data in a consistent way, and are helping to develop integrated Earth models at various scales, and from the near surface to the deep interior. This book uses a series of case studies to demonstrate computer and database use across the geosciences. Chapters are thematically grouped into sections that cover data collection and management; modeling and community computational codes; visualization and data representation; knowledge management and data integration; and web services and scientific workflows. Geoinformatics is a fascinating and accessible introduction to this emerging field for readers across the solid Earth sciences and an invaluable reference for researchers interested in initiating new cyberinfrastructure projects of their own.

  1. Closely Spaced Independent Parallel Runway Simulation.

    DTIC Science & Technology

    1984-10-01

    facility consists of the Central Computer Facility, the Controller Laboratory, and the Simulator Pilot Complex. CENTRAL COMPUTER FACILITY. The Central... Computer Facility consists of a group of mainframes, minicomputers, and associated peripherals which host the operational and data acquisition...in the Controller Laboratory and convert their verbal directives into a keyboard entry which is transmitted to the Central Computer Complex, where

  2. Computers and Computation. Readings from Scientific American.

    ERIC Educational Resources Information Center

    Fenichel, Robert R.; Weizenbaum, Joseph

    A collection of articles from "Scientific American" magazine has been put together at this time because the current period in computer science is one of consolidation rather than innovation. A few years ago, computer science was moving so swiftly that even the professional journals were more archival than informative; but today it is…

  3. Data Processing Center of Radioastron Project: 3 years of operation.

    NASA Astrophysics Data System (ADS)

    Shatskaya, Marina

    The ASC Data Processing Center (DPC) of the Radioastron Project is a fail-safe, centralized complex of interconnected software and hardware components together with organizational procedures. The tasks facing the center are the organization of service-information exchange, the collection and storage of all scientific data, and science-oriented data processing. The DPC takes part in informational exchange with two tracking stations in Pushchino (Russia) and Green Bank (USA), about 30 ground telescopes, the ballistic center, tracking headquarters, and the session scheduling center. Enormous flows of information arrive at the Astro Space Center, and to handle these data volumes we have developed specialized network infrastructure, Internet channels, and storage. The computer complex, designed at the Astro Space Center (ASC) of the Lebedev Physical Institute, includes: 800 TB of on-line storage; a 2000 TB hard-drive archive; a backup system on magnetic tapes (2000 TB); 24 TB of redundant storage at the Pushchino Radio Astronomy Observatory; Web and FTP servers; and the DPC management and data-transmission networks. The structure and functions of the ASC Data Processing Center fully meet the data processing requirements of the Radioastron mission, as confirmed during the Fringe Search, the Early Science Program, and the first year of the Key Science Program.

  4. Crowd-Sourcing Seismic Data for Education and Research Opportunities with the Quake-Catcher Network

    NASA Astrophysics Data System (ADS)

    Sumy, D. F.; DeGroot, R. M.; Benthien, M. L.; Cochran, E. S.; Taber, J. J.

    2016-12-01

    The Quake Catcher Network (QCN; quakecatcher.net) uses low cost micro-electro-mechanical system (MEMS) sensors hosted by volunteers to collect seismic data. Volunteers use accelerometers internal to laptop computers, phones, tablets or small (the size of a matchbox) MEMS sensors plugged into desktop computers using a USB connector to collect scientifically useful data. Data are collected and sent to a central server using the Berkeley Open Infrastructure for Network Computing (BOINC) distributed computing software. Since 2008, sensors installed in museums, schools, offices, and residences have collected thousands of earthquake records, including the 2010 M8.8 Maule, Chile, the 2010 M7.1 Darfield, New Zealand, and 2015 M7.8 Gorkha, Nepal earthquakes. In 2016, the QCN in the United States transitioned to the Incorporated Research Institutions for Seismology (IRIS) Consortium and the Southern California Earthquake Center (SCEC), which are facilities funded through the National Science Foundation and the United States Geological Survey, respectively. The transition has allowed for an influx of new ideas and new education related efforts, which include focused installations in several school districts in southern California, on Native American reservations in North Dakota, and in the most seismically active state in the contiguous U.S. - Oklahoma. We present and describe these recent educational opportunities, and highlight how QCN has engaged a wide sector of the public in scientific data collection, particularly through the QCN-EPIcenter Network and NASA Mars InSight teacher programs. QCN provides the public with information and insight into how seismic data are collected, and how researchers use these data to better understand and characterize seismic activity. Lastly, we describe how students use data recorded by QCN sensors installed in their classrooms to explore and investigate felt earthquakes, and look towards the bright future of the network.
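    Stations like these typically run a simple onset detector locally and report data when it fires. The Python snippet below is a hedged sketch of a classic short-term/long-term average (STA/LTA) trigger of the kind such low-cost MEMS stations can apply before sending records to a central server; the window lengths, threshold, and synthetic data are illustrative and are not QCN's actual parameters.

      import numpy as np

      def sta_lta_trigger(acc, sta_len=50, lta_len=500, threshold=4.0):
          """Return sample indices where the STA/LTA ratio exceeds the threshold.
          The ratio of a short-term to a long-term average of |acceleration|
          rises sharply at an earthquake onset."""
          x = np.abs(acc)
          sta = np.convolve(x, np.ones(sta_len) / sta_len, mode="same")
          lta = np.convolve(x, np.ones(lta_len) / lta_len, mode="same")
          ratio = sta / np.maximum(lta, 1e-12)
          return np.where(ratio > threshold)[0]

      # Synthetic record: background noise with a burst starting at sample 3000
      acc = np.random.normal(0.0, 0.01, 6000)
      acc[3000:3200] += np.random.normal(0.0, 0.5, 200)
      print(sta_lta_trigger(acc)[:5])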

  5. Learning genetic inquiry through the use, revision, and justification of explanatory models

    NASA Astrophysics Data System (ADS)

    Cartier, Jennifer Lorraine

    Central to the process of inquiry in science is the construction and assessment of models that can be used to explain (and in some cases, predict) natural phenomena. This dissertation is a qualitative study of student learning in a high school biology course that was designed to give students opportunities to learn about genetic inquiry in part by providing them with authentic experiences doing inquiry in the discipline. With the aid of a computer program that generates populations of "fruit flies", the students in this class worked in groups structured like scientific communities to build, revise, and defend explanatory models for various inheritance phenomena. Analysis of the ways in which the first cohort of students assessed their inheritance models revealed that all students assessed models based upon empirical fit (data/model match). However, in contrast to the practice of scientists and despite explicit instruction, students did not consistently apply conceptual assessment criteria to their models. That is, they didn't seek consistency between underlying concepts or processes in their models and those of other important genetic models, such as meiosis. This is perhaps in part because they lacked an understanding of models as conceptual rather than physical entities. Subsequently, the genetics curriculum was altered in order to create more opportunities for students to address epistemological issues associated with model assessment throughout the course. The second cohort of students' understanding of models changed over the nine-week period: initially the majority of students equated scientific models with "proof" (generally physical) of "theories"; at the end of the course, most students demonstrated understanding of the conceptual nature of scientific models and the need to justify such knowledge according to both its empirical utility and conceptual consistency. Through model construction and assessment (i.e. scientific inquiry), students were able to come to a rich understanding of both the central concepts of transmission genetics and important epistemological aspects of genetic practice.

  6. Applied Mathematics at the U.S. Department of Energy: Past, Present and a View to the Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, D L; Bell, J; Estep, D

    2008-02-15

    Over the past half-century, the Applied Mathematics program in the U.S. Department of Energy's Office of Advanced Scientific Computing Research has made significant, enduring advances in applied mathematics that have been essential enablers of modern computational science. Motivated by the scientific needs of the Department of Energy and its predecessors, advances have been made in mathematical modeling, numerical analysis of differential equations, optimization theory, mesh generation for complex geometries, adaptive algorithms and other important mathematical areas. High-performance mathematical software libraries developed through this program have contributed as much to the performance of modern scientific computer codes as the high-performance computers on which these codes run, if not more. The combination of these mathematical advances and the resulting software has enabled high-performance computers to be used for scientific discovery in ways that could only be imagined at the program's inception. Our nation, and indeed our world, face great challenges that must be addressed in coming years, and many of these will be addressed through the development of scientific understanding and engineering advances yet to be discovered. The U.S. Department of Energy (DOE) will play an essential role in providing science-based solutions to many of these problems, particularly those that involve the energy, environmental and national security needs of the country. As the capability of high-performance computers continues to increase, the types of questions that can be answered by applying this huge computational power become more varied and more complex. It will be essential that we find new ways to develop and apply the mathematics necessary to enable the new scientific and engineering discoveries that are needed. In August 2007, a panel of experts in applied, computational and statistical mathematics met for a day and a half in Berkeley, California to understand the mathematical developments required to meet the future science and engineering needs of the DOE. It is important to emphasize that the panelists were not asked to speculate only on advances that might be made in their own research specialties. Instead, the guidance this panel was given was to consider the broad science and engineering challenges that the DOE faces and identify the corresponding advances that must occur across the field of mathematics for these challenges to be successfully addressed. As preparation for the meeting, each panelist was asked to review strategic planning and other informational documents available for one or more of the DOE Program Offices, including the Offices of Science, Nuclear Energy, Fossil Energy, Environmental Management, Legacy Management, Energy Efficiency & Renewable Energy, Electricity Delivery & Energy Reliability and Civilian Radioactive Waste Management, as well as the National Nuclear Security Administration. The panelists reported on science and engineering needs for each of these offices, and then discussed and identified mathematical advances that will be required if these challenges are to be met. A review of DOE challenges in energy, the environment and national security brings to light a broad and varied array of questions that the DOE must answer in the coming years.
    A representative subset of such questions includes: (1) Can we predict the operating characteristics of a clean coal power plant? (2) How stable is the plasma containment in a tokamak? (3) How quickly is climate change occurring and what are the uncertainties in the predicted time scales? (4) How quickly can an introduced bio-weapon contaminate the agricultural environment in the US? (5) How do we modify models of the atmosphere and clouds to incorporate newly collected data, possibly of new types? (6) How quickly can the United States recover if part of the power grid becomes inoperable? (7) What are optimal locations and communication protocols for sensing devices in a remote-sensing network? (8) How can new materials be designed with a specified desirable set of properties? In comparing and contrasting these and other questions of importance to DOE, the panel found that while the scientific breadth of the requirements is enormous, a central theme emerges: scientists are being asked to identify or provide technology, or to give expert analysis to inform policy-makers, and this requires the scientific understanding of increasingly complex physical and engineered systems. In addition, as the complexity of the systems of interest increases, neither experimental observation nor mathematical and computational modeling alone can access all components of the system over the entire range of scales or conditions needed to provide the required scientific understanding.

  7. Comparing the Consumption of CPU Hours with Scientific Output for the Extreme Science and Engineering Discovery Environment (XSEDE).

    PubMed

    Knepper, Richard; Börner, Katy

    2016-01-01

    This paper presents the results of a study that compares resource usage with publication output using data about the consumption of CPU cycles from the Extreme Science and Engineering Discovery Environment (XSEDE) and resulting scientific publications for 2,691 institutions/teams. Specifically, the datasets comprise a total of 5,374,032,696 central processing unit (CPU) hours run in XSEDE during July 1, 2011 to August 18, 2015 and 2,882 publications that cite the XSEDE resource. Three types of studies were conducted: a geospatial analysis of XSEDE providers and consumers, co-authorship network analysis of XSEDE publications, and bi-modal network analysis of how XSEDE resources are used by different research fields. Resulting visualizations show that a diverse set of consumers make use of XSEDE resources, that users of XSEDE publish together frequently, and that the users of XSEDE with the highest resource usage tend to be "traditional" high-performance computing (HPC) community members from astronomy, atmospheric science, physics, chemistry, and biology.
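    A co-authorship network of the kind analyzed here can be assembled directly from the author lists of XSEDE-citing publications. The sketch below is a hypothetical Python example using networkx; the publications and author names are invented, and only the basic construction (authors as nodes, shared papers as weighted edges) is shown.

      from itertools import combinations
      import networkx as nx

      # Invented XSEDE-citing publications, each given by its author list
      publications = [
          ["Smith", "Chen", "Garcia"],
          ["Chen", "Garcia"],
          ["Smith", "Okafor"],
      ]

      G = nx.Graph()
      for authors in publications:
          for a, b in combinations(sorted(set(authors)), 2):
              if G.has_edge(a, b):
                  G[a][b]["weight"] += 1  # one more shared paper
              else:
                  G.add_edge(a, b, weight=1)

      print(G.number_of_nodes(), "authors;", G.number_of_edges(), "co-author pairs")
      print(nx.degree_centrality(G))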

  8. Love-hate for man-machine metaphors in Soviet physiology: from Pavlov to "physiological cybernetics".

    PubMed

    Gerovitch, Slava

    2002-06-01

    This article reinterprets the debate between orthodox followers of the Pavlovian reflex theory and Soviet "cybernetic physiologists" in the 1950s and 60s as a clash of opposing man-machine metaphors. While both sides accused each other of "mechanistic," reductionist methodology, they did not see anything "mechanistic" about their own central metaphors: the telephone switchboard metaphor for nervous activity (the Pavlovians), and the analogies between the human brain and a computer (the cyberneticians). I argue that the scientific utility of machine analogies was closely intertwined with their philosophical and political meanings and that new interpretations of these metaphors emerged as a result of political conflicts and a realignment of forces within the scientific community and in society at large. I suggest that the constant travel of man-machine analogies, back and forth between physiology and technology has blurred the traditional categories of the "mechanistic" and the "organic" in Soviet neurophysiology, as perhaps in the history of physiology in general.

  9. Scientific Inquiry Self-Efficacy and Computer Game Self-Efficacy as Predictors and Outcomes of Middle School Boys' and Girls' Performance in a Science Assessment in a Virtual Environment

    NASA Astrophysics Data System (ADS)

    Bergey, Bradley W.; Ketelhut, Diane Jass; Liang, Senfeng; Natarajan, Uma; Karakus, Melissa

    2015-10-01

    The primary aim of the study was to examine whether performance on a science assessment in an immersive virtual environment was associated with changes in scientific inquiry self-efficacy. A secondary aim of the study was to examine whether performance on the science assessment was equitable for students with different levels of computer game self-efficacy, including whether gender differences were observed. We examined 407 middle school students' scientific inquiry self-efficacy and computer game self-efficacy before and after completing a computer game-like assessment about a science mystery. Results from path analyses indicated that prior scientific inquiry self-efficacy predicted achievement on end-of-module questions, which in turn predicted change in scientific inquiry self-efficacy. By contrast, computer game self-efficacy was neither predictive of nor predicted by performance on the science assessment. While boys had higher computer game self-efficacy compared to girls, multi-group analyses suggested only minor gender differences in how efficacy beliefs related to performance. Implications for assessments with virtual environments and future design and research are discussed.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hules, John

    This 1998 annual report from the National Energy Research Scientific Computing Center (NERSC) reviews the year in the following categories: Computational Science; Computer Science and Applied Mathematics; and Systems and Services. Also presented are science highlights in the following categories: Basic Energy Sciences; Biological and Environmental Research; Fusion Energy Sciences; High Energy and Nuclear Physics; and Advanced Scientific Computing Research and Other Projects.

  11. Data Mining and Machine Learning in Time-Domain Discovery and Classification

    NASA Astrophysics Data System (ADS)

    Bloom, Joshua S.; Richards, Joseph W.

    2012-03-01

    The changing heavens have played a central role in the scientific effort of astronomers for centuries. Galileo's synoptic observations of the moons of Jupiter and the phases of Venus, starting in 1610, provided strong refutation of Ptolemaic cosmology. These observations came soon after the discovery of Kepler's supernova had challenged the notion of an unchanging firmament. In more modern times, the discovery of a relationship between period and luminosity in some pulsational variable stars [41] led to the inference of the size of the Milky Way, the distance scale to the nearest galaxies, and the expansion of the Universe (see Ref. [30] for review). Distant explosions of supernovae were used to uncover the existence of dark energy and provide a precise numerical account of dark matter (e.g., [3]). Repeat observations of pulsars [71] and nearby main-sequence stars revealed the presence of the first extrasolar planets [17,35,44,45]. Indeed, time-domain observations of transient events and variable stars, as a technique, influence a broad diversity of pursuits in the entire astronomy endeavor [68]. While, at a fundamental level, the nature of the scientific pursuit remains unchanged, the advent of astronomy as a data-driven discipline presents fundamental challenges to the way in which the scientific process must now be conducted. Digital images (and data cubes) are not only getting larger; there are also more of them. On logistical grounds, this taxes storage and transport systems. But it also implies that the intimate connection that astronomers have always enjoyed with their data - from collection to processing to analysis to inference - necessarily must evolve. Figure 6.1 highlights some of the ways that the pathway to scientific inference is now influenced by (if not driven by) modern automation processes, computing, data-mining, and machine-learning (ML). The emerging reliance on computation and ML is a general one - a central theme of this book - but the time-domain aspect of the data and the objects of interest presents some unique challenges. First, any collection, storage, transport, and computational framework for processing the streaming data must be able to keep up with the dataflow. This is not necessarily true, for instance, with static sky science, where metrics of interest can be computed off-line and on a timescale much longer than the time required to obtain the data. Second, many types of transient (one-off) events evolve quickly in time and require more observations to fully understand the nature of the events. This demands that time-changing events be quickly discovered, classified, and broadcast to other follow-up facilities. All of this must happen robustly with, in some cases, very limited data. Last, the process of discovery and classification must be calibrated to the available resources for computation and follow-up. That is, the precision of classification must be weighed against the computational cost of producing that level of precision. Likewise, the cost of being wrong about the classification of some sorts of sources must be balanced against the scientific gains from being right about the classification of other types of sources. Quantifying these trade-offs, especially in the presence of a limited amount of follow-up resources (such as the availability of larger telescope observations), is not straightforward and involves domain-specific imperatives that will, in general, differ from astronomer to astronomer.
    This chapter presents an overview of the current directions in ML and data-mining techniques in the context of time-domain astronomy. Ultimately the goal - if not just the necessity given the data rates and the diversity of questions to be answered - is to abstract the traditional role of the astronomer in the entire scientific process. In some sense, this takes us full circle from the pre-modern view of the scientific pursuit presented in Vermeer's "The Astronomer" (Figure 6.2): in broad daylight, he contemplates the nighttime heavens from depictions presented to him on a globe, based on observations that others have made. He is an abstract thinker, far removed from data collection and processing; his most visceral connection to the skies is just the feel of the orb under his fingers. Replace the globe with a plot on a screen generated from a structured query language (SQL) query to a massive public database in the cloud, and we have a picture of the modern astronomer benefiting from the ML and data-mining tools operating on an almost unfathomable amount of raw data.
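    The trade-off described above, weighing classification confidence against the cost of spending limited follow-up resources on the wrong kind of source, can be made concrete as a small expected-utility calculation. The Python sketch below is purely illustrative: the class probabilities, scientific payoffs, and follow-up cost are invented numbers, not values from the chapter.

      # Hypothetical classifier output for one newly detected transient
      probs = {"supernova": 0.55, "variable_star": 0.35, "artifact": 0.10}

      # Invented scientific payoff of obtaining a follow-up spectrum, per true class
      payoff = {"supernova": 10.0, "variable_star": 2.0, "artifact": 0.0}

      follow_up_cost = 3.0  # one slot of large-telescope time, in arbitrary units

      expected_gain = sum(probs[c] * payoff[c] for c in probs) - follow_up_cost
      print("Expected net gain of follow-up: %.2f" % expected_gain)
      print("Trigger follow-up" if expected_gain > 0 else "Skip follow-up")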

  12. Basic Inferences of Scientific Reasoning, Argumentation, and Discovery

    ERIC Educational Resources Information Center

    Lawson, Anton E.

    2010-01-01

    Helping students better understand how scientists reason and argue to draw scientific conclusions has long been viewed as a critical component of scientific literacy and thus remains a central goal of science instruction. However, differences of opinion persist regarding the nature of scientific reasoning, argumentation, and discovery. Accordingly,…

  13. Hypergraph-Based Combinatorial Optimization of Matrix-Vector Multiplication

    ERIC Educational Resources Information Center

    Wolf, Michael Maclean

    2009-01-01

    Combinatorial scientific computing plays an important enabling role in computational science, particularly in high performance scientific computing. In this thesis, we will describe our work on optimizing matrix-vector multiplication using combinatorial techniques. Our research has focused on two different problems in combinatorial scientific…
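    The kernel at the heart of this optimization work is the sparse matrix-vector product, whose distribution of nonzeros across processors determines communication volume. Purely to make the object of the optimization concrete, the Python sketch below implements the sequential kernel for a matrix stored in compressed sparse row (CSR) form; the small matrix is an invented example, and no hypergraph partitioning is shown.

      import numpy as np

      def csr_matvec(values, col_idx, row_ptr, x):
          """y = A @ x for A stored in CSR (values, column indices, row pointers)."""
          y = np.zeros(len(row_ptr) - 1)
          for i in range(len(y)):
              for k in range(row_ptr[i], row_ptr[i + 1]):
                  y[i] += values[k] * x[col_idx[k]]
          return y

      # Invented 3x3 sparse matrix:
      # [[2, 0, 1],
      #  [0, 3, 0],
      #  [4, 0, 5]]
      values = np.array([2.0, 1.0, 3.0, 4.0, 5.0])
      col_idx = np.array([0, 2, 1, 0, 2])
      row_ptr = np.array([0, 2, 3, 5])
      x = np.array([1.0, 1.0, 1.0])
      print(csr_matvec(values, col_idx, row_ptr, x))  # -> [3. 3. 9.]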

  14. The Versatile Terminal.

    ERIC Educational Resources Information Center

    Evans, C. D.

    This paper describes the experiences of the industrial research laboratory of Kodak Ltd. in finding and providing a computer terminal most suited to its very varied requirements. These requirements include bibliographic and scientific data searching and access to a number of worldwide computing services for scientific computing work. The provision…

  15. Central mechanisms for force and motion--towards computational synthesis of human movement.

    PubMed

    Hemami, Hooshang; Dariush, Behzad

    2012-12-01

    Anatomical, physiological and experimental research on the human body can be supplemented by computational synthesis of the human body for all movement: routine daily activities, sports, dancing, and artistic and exploratory involvements. The synthesis requires thorough knowledge about all subsystems of the human body and their interactions, and allows for integration of known knowledge in working modules. It also affords confirmation and/or verification of scientific hypotheses about workings of the central nervous system (CNS). A simple step in this direction is explored here for controlling the forces of constraint. It requires co-activation of agonist-antagonist musculature. The desired trajectories of motion and the force of contact have to be provided by the CNS. The spinal control involves projection onto a muscular subset that induces the force of contact. The projection of force in the sensory motor cortex is implemented via a well-defined neural population unit, and is executed in the spinal cord by a standard integral controller requiring input from tendon organs. The sensory motor cortex structure is extended to the case for directing motion via two neural population units with vision input and spindle efferents. Digital computer simulations show the feasibility of the system. The formulation is modular and can be extended to multi-link limbs, robot and humanoid systems with many pairs of actuators or muscles. It can be expanded to include reticular activating structures and learning. Copyright © 2012 Elsevier Ltd. All rights reserved.
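    The integral control law mentioned for regulating contact force can be illustrated with a very small simulation. The Python sketch below is a hypothetical one-degree-of-freedom example, not the authors' model: a lumped muscle command is driven by the integral of the error between a desired contact force (the CNS reference) and a measured force (standing in for the tendon-organ signal), with crude first-order contact dynamics; the gains and dynamics are invented.

      import numpy as np

      def integral_force_control(f_desired=5.0, ki=2.0, dt=0.01, steps=2000):
          """Drive a measured contact force toward f_desired with an integral law."""
          f_measured = 0.0  # tendon-organ reading (arbitrary units)
          integral = 0.0
          history = []
          for _ in range(steps):
              error = f_desired - f_measured
              integral += error * dt
              command = ki * integral            # integral controller output
              # crude first-order muscle/contact dynamics (illustrative only)
              f_measured += dt * (command - f_measured)
              history.append(f_measured)
          return np.array(history)

      forces = integral_force_control()
      print("final force: %.3f (target 5.0)" % forces[-1])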

  16. To What Extent Does Current Scientific Research and Textbook Content Align? A Methodology and Case Study

    ERIC Educational Resources Information Center

    Bierema, Andrea M.-K.; Schwartz, Renee S.; Gill, Sharon A.

    2017-01-01

    Recent calls for reform in education recommend science curricula to be based on central ideas instead of a larger number of topics and for alignment between current scientific research and curricula. Because alignment is rarely studied, especially for central ideas, we developed a methodology to discover the extent of alignment between primary…

  17. Amplify scientific discovery with artificial intelligence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gil, Yolanda; Greaves, Mark T.; Hendler, James

    Computing innovations have fundamentally changed many aspects of scientific inquiry. For example, advances in robotics, high-end computing, networking, and databases now underlie much of what we do in science, such as gene sequencing, general number crunching, sharing information between scientists, and analyzing large amounts of data. As computing has evolved at a rapid pace, so too has its impact in science, with the most recent computing innovations repeatedly being brought to bear to facilitate new forms of inquiry. Recently, advances in Artificial Intelligence (AI) have deeply penetrated many consumer sectors, including, for example, Apple’s Siri™ speech recognition system, real-time automated language translation services, and a new generation of self-driving cars and self-navigating drones. However, AI has yet to achieve comparable levels of penetration in scientific inquiry, despite its tremendous potential in aiding computers to help scientists tackle tasks that require scientific reasoning. We contend that advances in AI will transform the practice of science as we are increasingly able to effectively and jointly harness human and machine intelligence in the pursuit of major scientific challenges.

  18. A distributed computing environment with support for constraint-based task scheduling and scientific experimentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahrens, J.P.; Shapiro, L.G.; Tanimoto, S.L.

    1997-04-01

    This paper describes a computing environment which supports computer-based scientific research work. Key features include support for automatic distributed scheduling and execution and for computer-based scientific experimentation. A new flexible and extensible scheduling technique that is responsive to a user's scheduling constraints, such as the ordering of program results and the specification of task assignments and processor utilization levels, is presented. An easy-to-use constraint language for specifying scheduling constraints, based on the relational database query language SQL, is described along with a search-based algorithm for fulfilling these constraints. A set of performance studies shows that the environment can schedule and execute program graphs on a network of workstations as the user requests. A method for automatically generating computer-based scientific experiments is described. Experiments provide a concise method of specifying a large collection of parameterized program executions. The environment achieved significant speedups when executing experiments; for a large collection of scientific experiments an average speedup of 3.4 on an average of 5.5 scheduled processors was obtained.
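    As a rough, hypothetical illustration of scheduling under ordering constraints (the environment's own constraint language is SQL-based and is not reproduced here), the Python sketch below performs a simple greedy list schedule that respects precedence constraints among tasks; the task names, durations, and two-machine setup are invented.

      def list_schedule(tasks, deps, machines=2):
          """Greedy schedule honoring (before, after) precedence constraints.
          tasks: {name: duration}; deps: list of (before, after) pairs.
          Returns {name: (machine, start_time)}."""
          finish = {}                  # task -> finish time
          free_at = [0.0] * machines   # next free time per machine
          schedule = {}
          remaining = dict(tasks)
          while remaining:
              # tasks whose predecessors have all finished
              ready = [t for t in remaining
                       if all(b in finish for b, a in deps if a == t)]
              t = min(ready, key=remaining.get)          # shortest ready task first
              m = min(range(machines), key=lambda i: free_at[i])
              start = max([free_at[m]] + [finish[b] for b, a in deps if a == t])
              finish[t] = start + remaining.pop(t)
              free_at[m] = finish[t]
              schedule[t] = (m, start)
          return schedule

      tasks = {"prep": 2.0, "simA": 5.0, "simB": 4.0, "plot": 1.0}
      deps = [("prep", "simA"), ("prep", "simB"), ("simA", "plot"), ("simB", "plot")]
      print(list_schedule(tasks, deps))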

  19. Computer-Supported Aids to Making Sense of Scientific Articles: Cognitive, Motivational, and Attitudinal Effects

    ERIC Educational Resources Information Center

    Gegner, Julie A.; Mackay, Donald H. J.; Mayer, Richard E.

    2009-01-01

    High school students can access original scientific research articles on the Internet, but may have trouble understanding them. To address this problem of online literacy, the authors developed a computer-based prototype for guiding students' comprehension of scientific articles. High school students were asked to read an original scientific…

  20. Scientific Computing for Chemists: An Undergraduate Course in Simulations, Data Processing, and Visualization

    ERIC Educational Resources Information Center

    Weiss, Charles J.

    2017-01-01

    The Scientific Computing for Chemists course taught at Wabash College teaches chemistry students to use the Python programming language, Jupyter notebooks, and a number of common Python scientific libraries to process, analyze, and visualize data. Assuming no prior programming experience, the course introduces students to basic programming and…

  1. Computational chemistry in pharmaceutical research: at the crossroads.

    PubMed

    Bajorath, Jürgen

    2012-01-01

    Computational approaches are an integral part of pharmaceutical research. However, there are many unsolved key questions that limit scientific progress in the still-evolving computational field and its impact on drug discovery. Importantly, a number of these questions are not new but date back many years. Hence, it might be difficult to conclusively answer them in the foreseeable future. Moreover, the computational field as a whole is characterized by a high degree of heterogeneity, and so, unfortunately, is the quality of its scientific output. In light of this situation, it is proposed that changes in scientific standards and culture should be seriously considered now in order to lay a foundation for future progress in computational research.

  2. [Earth and Space Sciences Project Services for NASA HPCC

    NASA Technical Reports Server (NTRS)

    Merkey, Phillip

    2002-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies Project, to engage the Beowulf Cluster Computing Community as well as the High Performance Computing Research Community so that we can predict the applicability of said technologies to the scientific community represented by the CT project and formulate long term strategies to provide the computational resources necessary to attain the anticipated scientific objectives of the CT project. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements and capabilities of high-performance computers to satisfy this anticipated need.

  3. Scholarly literature and the press: scientific impact and social perception of physics computing

    NASA Astrophysics Data System (ADS)

    Pia, M. G.; Basaglia, T.; Bell, Z. W.; Dressendorfer, P. V.

    2014-06-01

    The broad coverage of the search for the Higgs boson in the mainstream media is a relative novelty for high energy physics (HEP) research, whose achievements have traditionally been reported mainly in the scholarly literature. This paper illustrates the results of a scientometric analysis of HEP computing in scientific literature, institutional media and the press, and a comparative overview of similar metrics concerning representative particle physics measurements. The picture emerging from these scientometric data documents the relationship between the scientific impact and the social perception of HEP research versus that of HEP computing. The results of this analysis suggest that improved communication of the scientific and social role of HEP computing via press releases from the major HEP laboratories would be beneficial to the high energy physics community.

  4. Software Reuse Methods to Improve Technological Infrastructure for e-Science

    NASA Technical Reports Server (NTRS)

    Marshall, James J.; Downs, Robert R.; Mattmann, Chris A.

    2011-01-01

    Social computing has the potential to contribute to scientific research. Ongoing developments in information and communications technology improve capabilities for enabling scientific research, including research fostered by social computing capabilities. The recent emergence of e-Science practices has demonstrated the benefits from improvements in the technological infrastructure, or cyber-infrastructure, that has been developed to support science. Cloud computing is one example of this e-Science trend. Our own work in the area of software reuse offers methods that can be used to improve new technological development, including cloud computing capabilities, to support scientific research practices. In this paper, we focus on software reuse and its potential to contribute to the development and evaluation of information systems and related services designed to support new capabilities for conducting scientific research.

  5. [History of scientific and popular educational films in neurology and psychiatry in Germany 1895-1929].

    PubMed

    Podoll, K

    2000-11-01

    Based on a survey of a variety of sources from medical and film history, an account is given of the history of scientific and popular educational films in neurology and psychiatry in Germany in the silent-film era, 1895-1929. A central event for the centralization of the production and distribution of medical scientific educational films was the foundation, in 1918, of the 'cultural department' of the Ufa film company, which established, under the direction of the neurologist Curt Thomalla, a large medical film archive. Curt Thomalla was also the first to develop a dramatic type of popular educational film amalgamating medical and melodramatic features, thereby greatly increasing its mass impact but also anticipating central elements of its later misuse by Nazi film propaganda.

  6. Multidimensional Environmental Data Resource Brokering on Computational Grids and Scientific Clouds

    NASA Astrophysics Data System (ADS)

    Montella, Raffaele; Giunta, Giulio; Laccetti, Giuliano

    Grid computing has evolved considerably over the past years, and its capabilities have found their way even into business products and are no longer confined to scientific applications. Today, grid computing technology is not restricted to a set of specific open-source or industrial grid products; rather, it comprises a set of capabilities that can be embedded in virtually any kind of software to create shared and highly collaborative production environments. These environments are focused on computational (workload) capabilities and the integration of information (data) into those computational capabilities. An active application field for grid computing is the full virtualization of scientific instruments, which increases their availability and decreases operational and maintenance costs. Computational and information grids make it possible to manage real-world objects in a service-oriented way using widespread industrial standards.

  7. 78 FR 6087 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-29

    ... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building... Theory and Experiment (INCITE) Public Comment (10-minute rule) Public Participation: The meeting is open...

  8. Computational Science in Armenia (Invited Talk)

    NASA Astrophysics Data System (ADS)

    Marandjian, H.; Shoukourian, Yu.

    This survey is devoted to the development of informatics and computer science in Armenia. The results in theoretical computer science (algebraic models, solutions to systems of general-form recursive equations, methods of coding theory, pattern recognition and image processing) constitute the theoretical basis for developing problem-solving-oriented environments. Examples include a synthesizer of optimized distributed recursive programs, software tools for cluster-oriented implementations of two-dimensional cellular automata, and a grid-aware web interface with advanced service trading for linear algebra calculations. In the direction of solving scientific problems that require high-performance computing resources, completed projects include examples in physics (parallel computing of complex quantum systems), astrophysics (the Armenian virtual laboratory), biology (a molecular dynamics study of the human red blood cell membrane), and meteorology (implementing and evaluating the Weather Research and Forecast Model for the territory of Armenia). The overview also notes that the Institute for Informatics and Automation Problems of the National Academy of Sciences of Armenia has established a scientific and educational infrastructure uniting the computing clusters of the country's scientific and educational institutions; it provides the scientific community with access to local and international computational resources and is a strong support for computational science in Armenia.

  9. Highly parallel computation

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.; Tichy, Walter F.

    1990-01-01

    Highly parallel computing architectures are the only means to achieve the computation rates demanded by advanced scientific problems. A decade of research has demonstrated the feasibility of such machines, and current research focuses on which architectures are best suited to different classes of scientific problems. The architectures designated as multiple instruction multiple datastream (MIMD) and single instruction multiple datastream (SIMD) have produced the best results to date; neither shows a decisive advantage for most near-homogeneous scientific problems. For scientific problems with many dissimilar parts, more speculative architectures such as neural networks or data flow may be needed.

  10. ASCR Cybersecurity for Scientific Computing Integrity - Research Pathways and Ideas Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peisert, Sean; Potok, Thomas E.; Jones, Todd

    At the request of the U.S. Department of Energy's (DOE) Office of Science (SC) Advanced Scientific Computing Research (ASCR) program office, a workshop was held June 2-3, 2015, in Gaithersburg, MD, to identify potential long-term (10 to 20+ year) fundamental cybersecurity research and development challenges, strategies, and a roadmap for future high performance computing (HPC), networks, data centers, and extreme-scale scientific user facilities. This workshop was a follow-on to the workshop held January 7-9, 2015, in Rockville, MD, that examined higher-level ideas about scientific computing integrity specific to the mission of the DOE Office of Science. Issues included research computation and simulation that takes place on ASCR computing facilities and networks, as well as network-connected scientific instruments, such as those run by various DOE Office of Science programs. Workshop participants included researchers and operational staff from DOE national laboratories, as well as academic researchers and industry experts. Participants were selected based on the submission of abstracts relating to the topics discussed in the previous workshop report [1] and also from other ASCR reports, including "Abstract Machine Models and Proxy Architectures for Exascale Computing" [27], the DOE "Preliminary Conceptual Design for an Exascale Computing Initiative" [28], and the January 2015 machine learning workshop [29]. The workshop was also attended by several observers from DOE and other government agencies. The workshop was divided into three topic areas: (1) Trustworthy Supercomputing, (2) Extreme-Scale Data, Knowledge, and Analytics for Understanding and Improving Cybersecurity, and (3) Trust within High-end Networking and Data Centers. Participants were divided into three corresponding teams based on the category of their abstracts. The workshop began with a series of talks from the program manager and workshop chair, followed by the leaders for each of the three topics and a representative of each of the four major DOE Office of Science Advanced Scientific Computing Research facilities: the Argonne Leadership Computing Facility (ALCF), the Energy Sciences Network (ESnet), the National Energy Research Scientific Computing Center (NERSC), and the Oak Ridge Leadership Computing Facility (OLCF). The rest of the workshop consisted of topical breakout discussions and focused writing periods that produced much of this report.

  11. Building Cognition: The Construction of Computational Representations for Scientific Discovery

    ERIC Educational Resources Information Center

    Chandrasekharan, Sanjay; Nersessian, Nancy J.

    2015-01-01

    Novel computational representations, such as simulation models of complex systems and video games for scientific discovery (Foldit, EteRNA etc.), are dramatically changing the way discoveries emerge in science and engineering. The cognitive roles played by such computational representations in discovery are not well understood. We present a…

  12. Big Data Processing for a Central Texas Groundwater Case Study

    NASA Astrophysics Data System (ADS)

    Cantu, A.; Rivera, O.; Martínez, A.; Lewis, D. H.; Gentle, J. N., Jr.; Fuentes, G.; Pierce, S. A.

    2016-12-01

    As computational methods improve, scientists are able to expand the level and scale of experimental simulation and testing that is completed for case studies. This study presents a comparative analysis of multiple models for the Barton Springs segment of the Edwards aquifer. Several numerical simulations using state-mandated MODFLOW models were run on Stampede, a high performance computing system housed at the Texas Advanced Computing Center, for multiple-scenario testing. One goal of this multidisciplinary project is to visualize and compare the output data of the groundwater model using the statistical programming language R to find revealing data patterns produced by different pumping scenarios. Presenting the data in a friendly post-processing format is also covered in this paper. Visualization of the data and creation of workflows applicable to the management of the data are tasks performed after data extraction. The resulting analyses provide an example of how supercomputing can be used to accelerate evaluation of scientific uncertainty and geological knowledge in relation to policy and management decisions. Understanding the aquifer's behavior helps policy makers avoid negative impacts on endangered species and environmental services and aids in maximizing the aquifer yield.
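    At its core, comparing pumping scenarios reduces to differencing simulated head arrays and summarizing the result. The study did this post-processing in R; the sketch below is a hypothetical Python/NumPy equivalent, with the scenario head fields invented rather than read from actual MODFLOW output.

      import numpy as np

      # Hypothetical simulated head fields (rows x columns) for two pumping scenarios
      heads_baseline = np.random.uniform(430.0, 460.0, size=(50, 80))
      heads_drought = heads_baseline - np.random.uniform(0.0, 5.0, size=(50, 80))

      drawdown = heads_baseline - heads_drought

      # Simple summaries a water manager might ask for
      print("mean drawdown (ft): %.2f" % drawdown.mean())
      print("max drawdown (ft):  %.2f" % drawdown.max())
      print("cells with > 3 ft of drawdown: %d" % int((drawdown > 3.0).sum()))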

  13. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Subcommittee Report on Scientific and Technical Information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hey, Tony; Agarwal, Deborah; Borgman, Christine

    The Advanced Scientific Computing Advisory Committee (ASCAC) was charged to form a standing subcommittee to review the Department of Energy’s Office of Scientific and Technical Information (OSTI), beginning by assessing the quality and effectiveness of OSTI’s recent and current products and services and commenting on its mission and future directions in the rapidly changing environment for scientific publication and data. The Committee met with OSTI staff and reviewed available products, services and other materials. This report summarizes their initial findings and recommendations.

  14. Common Graphics Library (CGL). Volume 1: LEZ user's guide

    NASA Technical Reports Server (NTRS)

    Taylor, Nancy L.; Hammond, Dana P.; Hofler, Alicia S.; Miner, David L.

    1988-01-01

    Users are introduced to and instructed in the use of the Langley Easy (LEZ) routines of the Common Graphics Library (CGL). The LEZ routines form an application independent graphics package which enables the user community to view data quickly and easily, while providing a means of generating scientific charts conforming to the publication and/or viewgraph process. A distinct advantage for using the LEZ routines is that the underlying graphics package may be replaced or modified without requiring the users to change their application programs. The library is written in ANSI FORTRAN 77, and currently uses a CORE-based underlying graphics package, and is therefore machine independent, providing support for centralized and/or distributed computer systems.

  15. Computational Scientific Inquiry with Virtual Worlds and Agent-Based Models: New Ways of Doing Science to Learn Science

    ERIC Educational Resources Information Center

    Jacobson, Michael J.; Taylor, Charlotte E.; Richards, Deborah

    2016-01-01

    In this paper, we propose computational scientific inquiry (CSI) as an innovative model for learning important scientific knowledge and new practices for "doing" science. This approach involves the use of a "game-like" virtual world for students to experience virtual biological fieldwork in conjunction with using an agent-based…

  16. Using Just-in-Time Information to Support Scientific Discovery Learning in a Computer-Based Simulation

    ERIC Educational Resources Information Center

    Hulshof, Casper D.; de Jong, Ton

    2006-01-01

    Students encounter many obstacles during scientific discovery learning with computer-based simulations. It is hypothesized that an effective type of support, that does not interfere with the scientific discovery learning process, should be delivered on a "just-in-time" base. This study explores the effect of facilitating access to…

  17. Pervasive healthcare as a scientific discipline.

    PubMed

    Bardram, J E

    2008-01-01

    The OECD countries are facing a set of core challenges: an increasing elderly population; an increasing number of chronic and lifestyle-related diseases; an expanding scope of what medicine can do; and an increasing shortage of medical professionals. Pervasive healthcare asks how pervasive computing technology can be designed to meet these challenges. The objective of this paper is to discuss 'pervasive healthcare' as a research field and to establish how novel and distinct it is compared to related work within biomedical engineering, medical informatics, and ubiquitous computing. The paper presents the research questions, approach, technologies, and methods of pervasive healthcare and discusses these in comparison to those of other related scientific disciplines. A set of central research themes is presented: monitoring and body sensor networks; pervasive assistive technologies; pervasive computing for hospitals; and preventive and persuasive technologies. Two projects illustrate the kind of research being done in pervasive healthcare. The first project is targeted at home-based monitoring of hypertension; the second designs context-aware technologies for hospitals. Both projects approach the healthcare challenges in a new way, apply a new type of research method, and come up with new kinds of technological solutions. 'Clinical proof-of-concept' is recommended as a new method for pervasive healthcare research; it helps in designing and testing pervasive healthcare technologies and in ascertaining their clinical potential before large-scale clinical tests are needed. The paper concludes that pervasive healthcare as a research field and agenda is novel; it addresses new emerging research questions, represents a novel approach, designs new types of technologies, and applies a new kind of research method.

  18. Science Gateways, Scientific Workflows and Open Community Software

    NASA Astrophysics Data System (ADS)

    Pierce, M. E.; Marru, S.

    2014-12-01

    Science gateways and scientific workflows occupy different ends of the spectrum of user-focused cyberinfrastructure. Gateways, sometimes called science portals, provide a way for enabling large numbers of users to take advantage of advanced computing resources (supercomputers, advanced storage systems, science clouds) by providing Web and desktop interfaces and supporting services. Scientific workflows, at the other end of the spectrum, support advanced usage of cyberinfrastructure that enables "power users" to undertake computational experiments that are not easily done through the usual mechanisms (managing simulations across multiple sites, for example). Despite these different target communities, gateways and workflows share many similarities and can potentially be accommodated by the same software system. For example, pipelines to process InSAR imagery sets or to datamine GPS time series data are workflows. The results and the ability to make downstream products may be made available through a gateway, and power users may want to provide their own custom pipelines. In this abstract, we discuss our efforts to build an open source software system, Apache Airavata, that can accommodate both gateway and workflow use cases. Our approach is general, and we have applied the software to problems in a number of scientific domains. In this talk, we discuss our applications to usage scenarios specific to earth science, focusing on earthquake physics examples drawn from the QuakSim.org and GeoGateway.org efforts. We also examine the role of the Apache Software Foundation's open community model as a way to build up common community codes that do not depend upon a single "owner" to sustain them. Pushing beyond open source software, we also see the need to provide gateways and workflow systems as cloud services. These services centralize operations, provide well-defined programming interfaces, scale elastically, and have global-scale fault tolerance. We discuss our work providing Apache Airavata as a hosted service to provide these features.

  19. Building Capacity Through Hands-on Computational Internships to Assure Reproducible Results and Implementation of Digital Documentation in the ICERT REU Program

    NASA Astrophysics Data System (ADS)

    Gomez, R.; Gentle, J.

    2015-12-01

    Modern data pipelines and computational processes require that meticulous methodologies be applied in order to ensure that the source data, algorithms, and results are properly curated, managed, and retained while remaining discoverable, accessible, and reproducible. Given the complexity of understanding the scientific problem domain being researched, combined with the overhead of learning to use advanced computing technologies, it becomes paramount that the next generation of scientists and researchers learn to embrace best practices. The Integrative Computational Education and Research Traineeship (ICERT) is a National Science Foundation (NSF) Research Experience for Undergraduates (REU) Site at the Texas Advanced Computing Center (TACC). During Summer 2015, two ICERT interns joined the 3DDY project. 3DDY converts geospatial datasets into file types that can take advantage of new formats, such as natural user interfaces, interactive visualization, and 3D printing. Mentored by TACC researchers for ten weeks, students with no previous background in computational science learned to use scripts to build the first prototype of the 3DDY application, and leveraged Wrangler, the newest high performance computing (HPC) resource at TACC. Test datasets for quadrangles in central Texas were used to assemble the 3DDY workflow and code. Test files were successfully converted into a stereolithographic (STL) format, which is amenable for use with 3D printers. Test files and the scripts were documented and shared using the Figshare site, while metadata was documented for the 3DDY application using OntoSoft. These efforts validated a straightforward set of workflows for transforming geospatial data and established the first prototype version of 3DDY. Adding the data and software management procedures helped students realize a broader set of tangible results (e.g., Figshare entries), better document their progress and the final state of their work for the research group and community, and follow a clear set of formats while filling in details that might otherwise be lost; it also exposed the students to next-generation workflows and practices for digital scholarship and scientific inquiry in converting geospatial data into formats that are easy to reuse.
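    The core conversion step, turning a gridded elevation model into a printable STL surface, amounts to triangulating the grid cells and writing out the facets. The Python sketch below is a much-simplified, hypothetical illustration of that step rather than the 3DDY code itself; the tiny elevation patch is invented and only an ASCII STL is written.

      import numpy as np

      def grid_to_ascii_stl(z, path, cell=1.0):
          """Write a small ASCII STL surface for a 2-D elevation array z."""
          rows, cols = z.shape
          with open(path, "w") as f:
              f.write("solid terrain\n")
              for i in range(rows - 1):
                  for j in range(cols - 1):
                      # two triangles per grid cell; facet normals omitted for brevity
                      a = (j * cell, i * cell, z[i, j])
                      b = ((j + 1) * cell, i * cell, z[i, j + 1])
                      c = (j * cell, (i + 1) * cell, z[i + 1, j])
                      d = ((j + 1) * cell, (i + 1) * cell, z[i + 1, j + 1])
                      for tri in ((a, b, c), (b, d, c)):
                          f.write("  facet normal 0 0 0\n    outer loop\n")
                          for x, y, zz in tri:
                              f.write("      vertex %f %f %f\n" % (x, y, zz))
                          f.write("    endloop\n  endfacet\n")
              f.write("endsolid terrain\n")

      # Invented 4x4 elevation patch written to a local file
      grid_to_ascii_stl(np.random.uniform(200.0, 300.0, (4, 4)), "patch.stl")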

  20. An Interdisciplinary Guided Inquiry on Estuarine Transport Using a Computer Model in High School Classrooms

    ERIC Educational Resources Information Center

    Chan, Kit Yu Karen; Yang, Sylvia; Maliska, Max E.; Grunbaum, Daniel

    2012-01-01

    The National Science Education Standards have highlighted the importance of active learning and reflection for contemporary scientific methods in K-12 classrooms, including the use of models. Computer modeling and visualization are tools that researchers employ in their scientific inquiry process, and often computer models are used in…

  1. Architectural Principles and Experimentation of Distributed High Performance Virtual Clusters

    ERIC Educational Resources Information Center

    Younge, Andrew J.

    2016-01-01

    With the advent of virtualization and Infrastructure-as-a-Service (IaaS), the broader scientific computing community is considering the use of clouds for their scientific computing needs. This is due to the relative scalability, ease of use, advanced user environment customization abilities, and the many novel computing paradigms available for…

  2. An Analysis on the Effect of Computer Self-Efficacy over Scientific Research Self-Efficacy and Information Literacy Self-Efficacy

    ERIC Educational Resources Information Center

    Tuncer, Murat

    2013-01-01

    Present research investigates reciprocal relations amidst computer self-efficacy, scientific research and information literacy self-efficacy. Research findings have demonstrated that according to standardized regression coefficients, computer self-efficacy has a positive effect on information literacy self-efficacy. Likewise it has been detected…

  3. The Impact of Three-Dimensional Computational Modeling on Student Understanding of Astronomical Concepts: A Quantitative Analysis

    ERIC Educational Resources Information Center

    Hansen, John; Barnett, Michael; MaKinster, James; Keating, Thomas

    2004-01-01

    The increased availability of computational modeling software has created opportunities for students to engage in scientific inquiry through constructing computer-based models of scientific phenomena. However, despite the growing trend of integrating technology into science curricula, educators need to understand what aspects of these technologies…

  4. Evaluation of Cache-based Superscalar and Cacheless Vector Architectures for Scientific Computations

    NASA Technical Reports Server (NTRS)

    Oliker, Leonid; Carter, Jonathan; Shalf, John; Skinner, David; Ethier, Stephane; Biswas, Rupak; Djomehri, Jahed; VanderWijngaart, Rob

    2003-01-01

    The growing gap between sustained and peak performance for scientific applications has become a well-known problem in high performance computing. The recent development of parallel vector systems offers the potential to bridge this gap for a significant number of computational science codes and deliver a substantial increase in computing capabilities. This paper examines the intranode performance of the NEC SX6 vector processor and the cache-based IBM Power3/4 superscalar architectures across a number of key scientific computing areas. First, we present the performance of a microbenchmark suite that examines a full spectrum of low-level machine characteristics. Next, we study the behavior of the NAS Parallel Benchmarks using some simple optimizations. Finally, we evaluate the performance of several numerical codes from key scientific computing domains. Overall results demonstrate that the SX6 achieves high performance on a large fraction of our application suite and in many cases significantly outperforms the RISC-based architectures. However, certain classes of applications are not easily amenable to vectorization and would likely require extensive reengineering of both algorithm and implementation to utilize the SX6 effectively.
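
    As a loose, language-level analogy to the vectorization issue discussed above (not SX6 or Power code), the sketch below contrasts a scalar-style Python loop with an equivalent array expression for a simple triad kernel; regular, independent element-wise work of this kind is what maps well onto vector hardware.

    ```python
    # Loose illustration only: kernels with regular, independent element-wise
    # work map naturally onto vector units, which the array form below mimics,
    # while irregular or data-dependent loops resist vectorization.
    import time
    import numpy as np

    n = 1_000_000
    a, b = np.random.rand(n), np.random.rand(n)

    t0 = time.perf_counter()
    c_loop = [a[i] + 2.5 * b[i] for i in range(n)]   # scalar-style loop
    t1 = time.perf_counter()
    c_vec = a + 2.5 * b                              # "vectorized" triad
    t2 = time.perf_counter()

    print(f"loop: {t1 - t0:.3f}s  vectorized: {t2 - t1:.3f}s")
    ```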

  5. Identifying Strategic Scientific Opportunities

    Cancer.gov

    As NCI's central scientific strategy office, CRS collaborates with the institute's divisions, offices, and centers to identify research opportunities to advance NCI's vision for the future of cancer research.

  6. ITK: enabling reproducible research and open science

    PubMed Central

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46. PMID:24600387
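
    The "median normalized closeness centrality" quoted above can be reproduced in spirit with a small graph library. The sketch below uses an invented toy review graph rather than ITK's actual code-review data, and shows how such a figure is typically computed.

    ```python
    # Hedged sketch: how a median normalized closeness centrality figure like
    # the 0.46 quoted above could be computed from a code-review interaction
    # graph. The toy edge list is an assumption, not ITK's actual data.
    import statistics
    import networkx as nx

    # Each edge: a reviewer commented on a contributor's patch.
    review_edges = [
        ("alice", "bob"), ("alice", "carol"), ("bob", "carol"),
        ("carol", "dave"), ("dave", "erin"), ("erin", "alice"),
    ]
    g = nx.Graph(review_edges)

    # networkx normalizes closeness per node by default.
    closeness = nx.closeness_centrality(g)
    print("median normalized closeness:",
          round(statistics.median(closeness.values()), 2))
    ```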

  7. ITK: enabling reproducible research and open science.

    PubMed

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46.

  8. USRA/RIACS

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1992-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on 6 June 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under a cooperative agreement with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post doctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Parallel Computing; Advanced Methods for Scientific Computing; Learning Systems; High Performance Networks and Technology; Graphics, Visualization, and Virtual Environments.

  9. Using Computer Simulations for Promoting Model-based Reasoning. Epistemological and Educational Dimensions

    NASA Astrophysics Data System (ADS)

    Develaki, Maria

    2017-11-01

    Scientific reasoning is particularly pertinent to science education since it is closely related to the content and methodologies of science and contributes to scientific literacy. Much of the research in science education investigates the appropriate framework and teaching methods and tools needed to promote students' ability to reason and evaluate in a scientific way. This paper aims (a) to contribute to an extended understanding of the nature and pedagogical importance of model-based reasoning and (b) to exemplify how using computer simulations can support students' model-based reasoning. We provide first a background for both scientific reasoning and computer simulations, based on the relevant philosophical views and the related educational discussion. This background suggests that the model-based framework provides an epistemologically valid and pedagogically appropriate basis for teaching scientific reasoning and for helping students develop sounder reasoning and decision-taking abilities and explains how using computer simulations can foster these abilities. We then provide some examples illustrating the use of computer simulations to support model-based reasoning and evaluation activities in the classroom. The examples reflect the procedure and criteria for evaluating models in science and demonstrate the educational advantages of their application in classroom reasoning activities.

  10. Scientific Research: Commodities or Commons?

    NASA Astrophysics Data System (ADS)

    Vermeir, Koen

    2013-10-01

    Truth is for sale today, some critics claim. The increased commodification of science corrupts it, scientific fraud is rampant and the age-old trust in science is shattered. This cynical view, although gaining in prominence, does not explain very well the surprising motivation and integrity that is still central to the scientific life. Although scientific knowledge becomes more and more treated as a commodity or as a product that is for sale, a central part of academic scientific practice is still organized according to different principles. In this paper, I critically analyze alternative models for understanding the organization of knowledge, such as the idea of the scientific commons and the gift economy of science. After weighing the diverse positive and negative aspects of free market economies of science and gift economies of science, a commons structured as a gift economy seems best suited to preserve and take advantage of the specific character of scientific knowledge. Furthermore, commons and gift economies promote the rich social texture that is important for supporting central norms of science. Some of these basic norms might break down if the gift character of science is lost. To conclude, I consider the possibility and desirability of hybrid economies of academic science, which combine aspects of gift economies and free market economies. The aim of this paper is to gain a better understanding of these deeper structural challenges faced by science policy. Such theoretical reflections should eventually assist us in formulating new policy guidelines.

  11. A communication efficient and scalable distributed data mining for the astronomical data

    NASA Astrophysics Data System (ADS)

    Govada, A.; Sahay, S. K.

    2016-07-01

    In 2020, ∼60 PB of archived data will be accessible to astronomers, but analyzing such a vast volume of data will be a challenging task. This is largely due to the computational model in which data are downloaded from complex, geographically distributed archives to a central site and then analyzed on local systems. Because the data must be downloaded to the central site, network bandwidth limitations become a hindrance to scientific discovery, and analyzing PB-scale data on local machines in a centralized manner is itself challenging. The virtual observatory is a step towards addressing this problem; however, it does not provide a data mining model (Zhang et al., 2004). Adding a distributed data mining layer to the VO can be a solution in which astronomers download knowledge instead of raw data, and thereafter either reconstruct the data from the downloaded knowledge or use the knowledge directly for further analysis. Therefore, in this paper, we present Distributed Load Balancing Principal Component Analysis for optimally distributing the computation among the available nodes to minimize the transmission cost and the downloading cost for the end user. The experimental analysis is done with Fundamental Plane (FP) data, Gadotti data and the complex Mfeat data. In terms of transmission cost, our approach performs better than Qi et al. and Yue et al. The analysis shows that with the complex Mfeat data, ∼90% of the downloading cost can be reduced for the end user with negligible loss in accuracy.
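
    As a generic illustration of shipping knowledge rather than raw data (not the authors' exact Distributed Load Balancing PCA algorithm), the sketch below has each node send only its sample count, mean, and scatter matrix; the central site pools these small summaries and extracts the principal components, so no raw rows cross the network.

    ```python
    # Generic illustration of distributed PCA via pooled summaries; each node
    # ships (n, mean, scatter) instead of its raw data block.
    import numpy as np

    def local_summary(x):
        """Per-node summary for a local data block x of shape (n, d)."""
        n = x.shape[0]
        mu = x.mean(axis=0)
        scatter = (x - mu).T @ (x - mu)
        return n, mu, scatter

    def pooled_pca(summaries, k=2):
        """Combine node summaries into the top-k global principal components."""
        n_tot = sum(n for n, _, _ in summaries)
        mu_tot = sum(n * mu for n, mu, _ in summaries) / n_tot
        # Global scatter = sum of local scatters + between-node correction.
        cov = sum(s + n * np.outer(mu - mu_tot, mu - mu_tot)
                  for n, mu, s in summaries) / (n_tot - 1)
        vals, vecs = np.linalg.eigh(cov)
        return vecs[:, np.argsort(vals)[::-1][:k]]

    rng = np.random.default_rng(1)
    nodes = [rng.normal(size=(500, 5)) for _ in range(3)]   # 3 archive sites
    components = pooled_pca([local_summary(x) for x in nodes], k=2)
    print(components.shape)   # (5, 2)
    ```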

  12. A toolbox and a record for scientific model development

    NASA Technical Reports Server (NTRS)

    Ellman, Thomas

    1994-01-01

    Scientific computation can benefit from software tools that facilitate construction of computational models, control the application of models, and aid in revising models to handle new situations. Existing environments for scientific programming provide only limited means of handling these tasks. This paper describes a two-pronged approach for handling these tasks: (1) designing a 'Model Development Toolbox' that includes a basic set of model constructing operations; and (2) designing a 'Model Development Record' that is automatically generated during model construction. The record is subsequently exploited by tools that control the application of scientific models and revise models to handle new situations. Our two-pronged approach is motivated by our belief that the model development toolbox and record should be highly interdependent. In particular, a suitable model development record can be constructed only when models are developed using a well-defined set of operations. We expect this research to facilitate rapid development of new scientific computational models, to help ensure appropriate use of such models and to facilitate sharing of such models among working computational scientists. We are testing this approach by extending SIGMA, an existing knowledge-based scientific software design tool.

  13. Big Data Smart Socket (BDSS): a system that abstracts data transfer habits from end users.

    PubMed

    Watts, Nicholas A; Feltus, Frank A

    2017-02-15

    The ability to centralize and store data for long periods on an end user's computational resources is increasingly difficult for many scientific disciplines. For example, genomics data is increasingly large and distributed, and the data needs to be moved into workflow execution sites ranging from lab workstations to the cloud. However, the typical user is not always informed on emerging network technology or the most efficient methods to move and share data. Thus, the user defaults to using inefficient methods for transfer across the commercial internet. To accelerate large data transfer, we created a tool called the Big Data Smart Socket (BDSS) that abstracts data transfer methodology from the user. The user provides BDSS with a manifest of datasets stored in a remote storage repository. BDSS then queries a metadata repository for curated data transfer mechanisms and optimal path to move each of the files in the manifest to the site of workflow execution. BDSS functions as a standalone tool or can be directly integrated into a computational workflow such as provided by the Galaxy Project. To demonstrate applicability, we use BDSS within a biological context, although it is applicable to any scientific domain. BDSS is available under version 2 of the GNU General Public License at https://github.com/feltus/BDSS . ffeltus@clemson.edu. © The Author 2016. Published by Oxford University Press.
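
    The sketch below illustrates the general pattern the abstract describes: read a manifest of remote datasets, consult a metadata service for a curated transfer mechanism, and dispatch the transfer. All names, endpoints, and the JSON shape are invented for illustration; this is not the real BDSS interface.

    ```python
    # Hypothetical sketch of the workflow described above, NOT the real BDSS
    # API: look up a transfer plan per manifest entry, then dispatch it.
    import json
    import subprocess
    import urllib.request

    def best_mechanism(url, metadata_service="https://example.org/bdss-like/query"):
        """Ask a (hypothetical) metadata repository how to fetch this URL."""
        req = urllib.request.Request(f"{metadata_service}?source={url}")
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)      # e.g. {"tool": "curl", "mirror": "..."}

    def fetch(manifest_path):
        with open(manifest_path) as fh:
            for url in (line.strip() for line in fh if line.strip()):
                plan = best_mechanism(url)
                # Fall back to a plain HTTP copy if no curated mirror is known.
                target = plan.get("mirror", url)
                subprocess.run(["curl", "-O", target], check=True)

    # fetch("datasets.manifest")   # commented out: the service above is fictional
    ```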

  14. Big Data Smart Socket (BDSS): a system that abstracts data transfer habits from end users

    PubMed Central

    Watts, Nicholas A.

    2017-01-01

    Motivation: The ability to centralize and store data for long periods on an end user’s computational resources is increasingly difficult for many scientific disciplines. For example, genomics data is increasingly large and distributed, and the data needs to be moved into workflow execution sites ranging from lab workstations to the cloud. However, the typical user is not always informed on emerging network technology or the most efficient methods to move and share data. Thus, the user defaults to using inefficient methods for transfer across the commercial internet. Results: To accelerate large data transfer, we created a tool called the Big Data Smart Socket (BDSS) that abstracts data transfer methodology from the user. The user provides BDSS with a manifest of datasets stored in a remote storage repository. BDSS then queries a metadata repository for curated data transfer mechanisms and optimal path to move each of the files in the manifest to the site of workflow execution. BDSS functions as a standalone tool or can be directly integrated into a computational workflow such as provided by the Galaxy Project. To demonstrate applicability, we use BDSS within a biological context, although it is applicable to any scientific domain. Availability and Implementation: BDSS is available under version 2 of the GNU General Public License at https://github.com/feltus/BDSS. Contact: ffeltus@clemson.edu PMID:27797780

  15. Causal biological network database: a comprehensive platform of causal biological network models focused on the pulmonary and vascular systems.

    PubMed

    Boué, Stéphanie; Talikka, Marja; Westra, Jurjen Willem; Hayes, William; Di Fabio, Anselmo; Park, Jennifer; Schlage, Walter K; Sewer, Alain; Fields, Brett; Ansari, Sam; Martin, Florian; Veljkovic, Emilija; Kenney, Renee; Peitsch, Manuel C; Hoeng, Julia

    2015-01-01

    With the wealth of publications and data available, powerful and transparent computational approaches are required to represent measured data and scientific knowledge in a computable and searchable format. We developed a set of biological network models, scripted in the Biological Expression Language, that reflect causal signaling pathways across a wide range of biological processes, including cell fate, cell stress, cell proliferation, inflammation, tissue repair and angiogenesis in the pulmonary and cardiovascular context. This comprehensive collection of networks is now freely available to the scientific community in a centralized web-based repository, the Causal Biological Network database, which is composed of over 120 manually curated and well annotated biological network models and can be accessed at http://causalbionet.com. The website accesses a MongoDB, which stores all versions of the networks as JSON objects and allows users to search for genes, proteins, biological processes, small molecules and keywords in the network descriptions to retrieve biological networks of interest. The content of the networks can be visualized and browsed. Nodes and edges can be filtered and all supporting evidence for the edges can be browsed and is linked to the original articles in PubMed. Moreover, networks may be downloaded for further visualization and evaluation. Database URL: http://causalbionet.com © The Author(s) 2015. Published by Oxford University Press.
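
    A minimal sketch of the kind of gene-based lookup the abstract describes, against a MongoDB of network models stored as JSON documents; the database, collection, and field names here are assumptions for illustration, not the Causal Biological Network database's documented schema.

    ```python
    # Hedged sketch: query network-model documents by a gene of interest.
    # Database, collection, and field names are illustrative assumptions.
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")
    networks = client["causalbionet_demo"]["networks"]

    # Find networks whose node list mentions the gene, returning name/version.
    for doc in networks.find({"nodes.label": "TNF"}, {"name": 1, "version": 1}):
        print(doc.get("name"), doc.get("version"))
    ```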

  16. The Petascale Data Storage Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gibson, Garth; Long, Darrell; Honeyman, Peter

    2013-07-01

    Petascale computing infrastructures for scientific discovery make petascale demands on information storage capacity, performance, concurrency, reliability, availability, and manageability. The Petascale Data Storage Institute focuses on the data storage problems found in petascale scientific computing environments, with special attention to community issues such as interoperability, community buy-in, and shared tools. The Petascale Data Storage Institute is a collaboration between researchers at Carnegie Mellon University, National Energy Research Scientific Computing Center, Pacific Northwest National Laboratory, Oak Ridge National Laboratory, Sandia National Laboratory, Los Alamos National Laboratory, University of Michigan, and the University of California at Santa Cruz.

  17. Proceedings: Fourth Workshop on Mining Scientific Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamath, C

    Commercial applications of data mining in areas such as e-commerce, market-basket analysis, text-mining, and web-mining have taken on a central focus in the KDD community. However, there is a significant amount of innovative data mining work taking place in the context of scientific and engineering applications that is not well represented in the mainstream KDD conferences. For example, scientific data mining techniques are being developed and applied to diverse fields such as remote sensing, physics, chemistry, biology, astronomy, structural mechanics, computational fluid dynamics, etc. In these areas, data mining frequently complements and enhances existing analysis methods based on statistics, exploratory data analysis, and domain-specific approaches. On the surface, it may appear that data from one scientific field, say genomics, is very different from another field, such as physics. However, despite their diversity, there is much that is common across the mining of scientific and engineering data. For example, techniques used to identify objects in images are very similar, regardless of whether the images came from a remote sensing application, a physics experiment, an astronomy observation, or a medical study. Further, with data mining being applied to new types of data, such as mesh data from scientific simulations, there is the opportunity to apply and extend data mining to new scientific domains. This one-day workshop brings together data miners analyzing science data and scientists from diverse fields to share their experiences, learn how techniques developed in one field can be applied in another, and better understand some of the newer techniques being developed in the KDD community. This is the fourth workshop on the topic of Mining Scientific Datasets; for information on earlier workshops, see http://www.ahpcrc.org/conferences/. This workshop continues the tradition of addressing challenging problems in a field where the diversity of applications is matched only by the opportunities that await a practitioner.

  18. The need for scientific software engineering in the pharmaceutical industry

    NASA Astrophysics Data System (ADS)

    Luty, Brock; Rose, Peter W.

    2017-03-01

    Scientific software engineering is a distinct discipline from both computational chemistry project support and research informatics. A scientific software engineer not only has a deep understanding of the science of drug discovery but also the desire, skills and time to apply good software engineering practices. A good team of scientific software engineers can create a software foundation that is maintainable, validated and robust. If done correctly, this foundation enables the organization to investigate new and novel computational ideas with a very high level of efficiency.

  19. The need for scientific software engineering in the pharmaceutical industry.

    PubMed

    Luty, Brock; Rose, Peter W

    2017-03-01

    Scientific software engineering is a distinct discipline from both computational chemistry project support and research informatics. A scientific software engineer not only has a deep understanding of the science of drug discovery but also the desire, skills and time to apply good software engineering practices. A good team of scientific software engineers can create a software foundation that is maintainable, validated and robust. If done correctly, this foundation enables the organization to investigate new and novel computational ideas with a very high level of efficiency.

  20. Understanding the Performance and Potential of Cloud Computing for Scientific Applications

    DOE PAGES

    Sadooghi, Iman; Martin, Jesus Hernandez; Li, Tonglin; ...

    2015-02-19

    Commercial clouds bring a great opportunity to the scientific computing area. Scientific applications usually require significant resources; however, not all scientists have access to sufficient high-end computing systems, many of which can be found in the Top500 list. Cloud computing has gained the attention of scientists as a competitive resource to run HPC applications at a potentially lower cost. But as a different infrastructure, it is unclear whether clouds are capable of running scientific applications with a reasonable performance per money spent. This work studies the performance of public clouds and places this performance in context to price. We evaluate the raw performance of different services of the AWS cloud in terms of the basic resources, such as compute, memory, network and I/O. We also evaluate the performance of scientific applications running in the cloud. This paper aims to assess the ability of the cloud to perform well, as well as to evaluate the cost of the cloud running scientific applications. We developed a full set of metrics and conducted a comprehensive performance evaluation over the Amazon cloud. We evaluated EC2, S3, EBS and DynamoDB among the many Amazon AWS services. We evaluated the memory sub-system performance with CacheBench, the network performance with iperf, processor and network performance with the HPL benchmark application, and shared storage with NFS and PVFS in addition to S3. We also evaluated a real scientific computing application through the Swift parallel scripting system at scale. Armed with both detailed benchmarks to gauge expected performance and a detailed monetary cost analysis, we expect this paper to serve as a recipe cookbook for scientists, helping them decide where to deploy and run their scientific applications among public, private, or hybrid clouds.

  1. Understanding the Performance and Potential of Cloud Computing for Scientific Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadooghi, Iman; Martin, Jesus Hernandez; Li, Tonglin

    Commercial clouds bring a great opportunity to the scientific computing area. Scientific applications usually require significant resources; however, not all scientists have access to sufficient high-end computing systems, many of which can be found in the Top500 list. Cloud computing has gained the attention of scientists as a competitive resource to run HPC applications at a potentially lower cost. But as a different infrastructure, it is unclear whether clouds are capable of running scientific applications with a reasonable performance per money spent. This work studies the performance of public clouds and places this performance in context to price. We evaluate the raw performance of different services of the AWS cloud in terms of the basic resources, such as compute, memory, network and I/O. We also evaluate the performance of scientific applications running in the cloud. This paper aims to assess the ability of the cloud to perform well, as well as to evaluate the cost of the cloud running scientific applications. We developed a full set of metrics and conducted a comprehensive performance evaluation over the Amazon cloud. We evaluated EC2, S3, EBS and DynamoDB among the many Amazon AWS services. We evaluated the memory sub-system performance with CacheBench, the network performance with iperf, processor and network performance with the HPL benchmark application, and shared storage with NFS and PVFS in addition to S3. We also evaluated a real scientific computing application through the Swift parallel scripting system at scale. Armed with both detailed benchmarks to gauge expected performance and a detailed monetary cost analysis, we expect this paper to serve as a recipe cookbook for scientists, helping them decide where to deploy and run their scientific applications among public, private, or hybrid clouds.

  2. Social network analysis of international scientific collaboration on psychiatry research.

    PubMed

    Wu, Ying; Duan, Zhiguang

    2015-01-01

    Mental disorders are harmful to human health, seriously affect social life, and still impose a heavy burden on countries all over the world. Scientific collaboration has become the indispensable choice for progress in the field of biomedicine. However, there have been few scientific publications on scientific collaboration in psychiatry research so far. The aim of this study was to measure the activities of scientific collaboration in psychiatry research at the level of authors, institutions and countries. We retrieved 36557 papers about psychiatry from the Science Citation Index Expanded (SCI-Expanded) in Web of Science. Additionally, methods such as social network analysis (SNA), K-plex analysis and Core-Periphery analysis were used in this study. Collaboration has been increasing at the level of authors, institutions and countries in psychiatry in the last ten years. We selected the top 100 prolific authors, institutions and 30 countries to construct collaboration maps, respectively. Freedman, R and Seidman, LJ were the central authors, Harvard University was the central institution and the USA was the central country of the whole network. Notably, the rate of economic development of countries affected collaborative behavior. The results show that we should encourage multiple collaboration types in psychiatry research, as they not only help researchers to master the current research hotspots but also provide a scientific basis for clinical research on psychiatry and suggest policies to promote the development of this area.
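
    The sketch below illustrates the basic co-authorship analysis described above: build a collaboration graph from per-paper author lists and report the most central node by degree centrality. The toy records are invented; the study's own data came from SCI-Expanded.

    ```python
    # Hedged sketch of a co-authorship network analysis with invented records.
    from itertools import combinations
    import networkx as nx

    papers = [
        ["Freedman R", "Seidman LJ", "Author C"],
        ["Freedman R", "Author D"],
        ["Seidman LJ", "Author C", "Author E"],
    ]

    g = nx.Graph()
    for authors in papers:
        g.add_edges_from(combinations(authors, 2))   # co-authorship ties

    centrality = nx.degree_centrality(g)
    print(max(centrality, key=centrality.get))       # most central author
    ```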

  3. 75 FR 65639 - Center for Scientific Review; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-26

    ...: Computational Biology Special Emphasis Panel A. Date: October 29, 2010. Time: 2 p.m. to 3:30 p.m. Agenda: To.... Name of Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict: Computational...

  4. The Effects of Inquiry-Based Computer Simulation with Cooperative Learning on Scientific Thinking and Conceptual Understanding of Gas Laws

    ERIC Educational Resources Information Center

    Abdullah, Sopiah; Shariff, Adilah

    2008-01-01

    The purpose of the study was to investigate the effects of inquiry-based computer simulation with heterogeneous-ability cooperative learning (HACL) and inquiry-based computer simulation with friendship cooperative learning (FCL) on (a) scientific reasoning (SR) and (b) conceptual understanding (CU) among Form Four students in Malaysian Smart…

  5. Customizable scientific web-portal for DIII-D nuclear fusion experiment

    NASA Astrophysics Data System (ADS)

    Abla, G.; Kim, E. N.; Schissel, D. P.

    2010-04-01

    Increasing utilization of the Internet and convenient web technologies has made the web-portal a major application interface for remote participation in and control of scientific instruments. While web-portals have provided a centralized gateway for multiple computational services, the amount of visual output is often overwhelming due to the high volume of data generated by complex scientific instruments and experiments. Since each scientist may have different priorities and areas of interest in the experiment, filtering and organizing information based on the individual user's needs can increase the usability and efficiency of a web-portal. DIII-D is the largest magnetic nuclear fusion device in the US. A web-portal has been designed to support the experimental activities of DIII-D researchers worldwide. It offers a customizable interface with personalized page layouts and a list of services for users to select. Each individual user can create a unique working environment to fit his or her own needs and interests. Customizable services are: real-time experiment status monitoring, diagnostic data access, and interactive data analysis and visualization. The web-portal also supports interactive collaboration by providing a collaborative logbook and online instant announcement services. The DIII-D web-portal development utilizes a multi-tier software architecture and Web 2.0 technologies and tools, such as AJAX and Django, to develop a highly interactive and customizable user interface.

  6. 21 CFR 1305.24 - Central processing of orders.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... or more registered locations and maintains a central processing computer system in which orders are... order with all linked records on the central computer system. (b) A company that has central processing... the company owns and operates. ...

  7. 21 CFR 1305.24 - Central processing of orders.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... or more registered locations and maintains a central processing computer system in which orders are... order with all linked records on the central computer system. (b) A company that has central processing... the company owns and operates. ...

  8. 21 CFR 1305.24 - Central processing of orders.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... or more registered locations and maintains a central processing computer system in which orders are... order with all linked records on the central computer system. (b) A company that has central processing... the company owns and operates. ...

  9. 21 CFR 1305.24 - Central processing of orders.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... or more registered locations and maintains a central processing computer system in which orders are... order with all linked records on the central computer system. (b) A company that has central processing... the company owns and operates. ...

  10. 21 CFR 1305.24 - Central processing of orders.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... or more registered locations and maintains a central processing computer system in which orders are... order with all linked records on the central computer system. (b) A company that has central processing... the company owns and operates. ...

  11. RIACS/USRA

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1993-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on 6 June 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post doctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Parallel Computing, Advanced Methods for Scientific Computing, High Performance Networks and Technology, and Learning Systems. Parallel compiler techniques, adaptive numerical methods for flows in complicated geometries, and optimization were identified as important problems to investigate for ARC's involvement in the Computational Grand Challenges of the next decade.

  12. The emergence of spatial cyberinfrastructure.

    PubMed

    Wright, Dawn J; Wang, Shaowen

    2011-04-05

    Cyberinfrastructure integrates advanced computer, information, and communication technologies to empower computation-based and data-driven scientific practice and improve the synthesis and analysis of scientific data in a collaborative and shared fashion. As such, it now represents a paradigm shift in scientific research that has facilitated easy access to computational utilities and streamlined collaboration across distance and disciplines, thereby enabling scientific breakthroughs to be reached more quickly and efficiently. Spatial cyberinfrastructure seeks to resolve longstanding complex problems of handling and analyzing massive and heterogeneous spatial datasets as well as the necessity and benefits of sharing spatial data flexibly and securely. This article provides an overview and potential future directions of spatial cyberinfrastructure. The remaining four articles of the special feature are introduced and situated in the context of providing empirical examples of how spatial cyberinfrastructure is extending and enhancing scientific practice for improved synthesis and analysis of both physical and social science data. The primary focus of the articles is spatial analyses using distributed and high-performance computing, sensor networks, and other advanced information technology capabilities to transform massive spatial datasets into insights and knowledge.

  13. The emergence of spatial cyberinfrastructure

    PubMed Central

    Wright, Dawn J.; Wang, Shaowen

    2011-01-01

    Cyberinfrastructure integrates advanced computer, information, and communication technologies to empower computation-based and data-driven scientific practice and improve the synthesis and analysis of scientific data in a collaborative and shared fashion. As such, it now represents a paradigm shift in scientific research that has facilitated easy access to computational utilities and streamlined collaboration across distance and disciplines, thereby enabling scientific breakthroughs to be reached more quickly and efficiently. Spatial cyberinfrastructure seeks to resolve longstanding complex problems of handling and analyzing massive and heterogeneous spatial datasets as well as the necessity and benefits of sharing spatial data flexibly and securely. This article provides an overview and potential future directions of spatial cyberinfrastructure. The remaining four articles of the special feature are introduced and situated in the context of providing empirical examples of how spatial cyberinfrastructure is extending and enhancing scientific practice for improved synthesis and analysis of both physical and social science data. The primary focus of the articles is spatial analyses using distributed and high-performance computing, sensor networks, and other advanced information technology capabilities to transform massive spatial datasets into insights and knowledge. PMID:21467227

  14. Developing an EarthCube Governance Structure for Big Data Preservation and Access

    NASA Astrophysics Data System (ADS)

    Leetaru, H. E.; Leetaru, K. H.

    2012-12-01

    The underlying vision of the NSF EarthCube initiative is of an enduring resource serving the needs of the earth sciences for today and the future. We must therefore view this effort through the lens of what the earth sciences will need tomorrow and of how the underlying processes of data compilation, preservation, and access interplay with the scientific processes within the communities EarthCube will serve. Key issues that must be incorporated into the EarthCube governance structure include authentication, retrieval, and unintended use cases; the emerging role of whole-corpus data mining; and how inventory, citation, and archive practices will impact the ability of scientists to use EarthCube's collections into the future. According to the National Academies, the US federal government spends over $140 billion a year in support of the nation's research base. Yet a critical issue confronting all of the major scientific disciplines in building upon this investment is the lack of processes that guide how data are preserved for the long term, ensuring that studies can be replicated and that experimental data remain accessible as new analytic methods become available or theories evolve. As datasets are used years or even decades after their creation, far richer metadata is needed to describe the underlying simulation, smoothing algorithms, or bounding parameters of the data collection process. This is even truer as data are increasingly used outside their intended disciplines, as geoscience researchers apply algorithms from one discipline to datasets from another, where their analytical techniques may make extensive assumptions about the data. As science becomes increasingly interdisciplinary and emerging computational approaches begin applying whole-corpus methodologies and blending multiple archives together, we are facing new data access modalities distinct from the needs of the past, drawing into focus the question of centralized versus distributed architectures. In the past, geoscience data have been distributed, with each site maintaining its own collections and centralized inventory metadata supporting discovery. This was based on the historical search-browse-download modality, where access was primarily to download a copy to a researcher's own machine and datasets were measured in gigabytes. New "big data" approaches to the geosciences are already demonstrating the need to analyze the entirety of multiple collections from multiple sites totaling hundreds of terabytes in size. Yet datasets are outpacing the ability of networks to share them, forcing a new paradigm in high-performance computing where computation must migrate to centralized data stores. The next generation of geoscientists is going to need a system designed for exploring and understanding data from multiple scientific domains and vantages, where data are preserved for decades. We are not alone in this endeavor, and there are many lessons we can learn from similar initiatives, such as more than 40 years of governance policies for data warehouses and 15 years of open web archives, all of which face the same challenges. The entire EarthCube project will fail if the new governance structure does not account for the needs of an integrated cyberinfrastructure that allows big data to be stored, archived, analyzed, and made accessible to large numbers of scientists.

  15. Evolution of the Virtualized HPC Infrastructure of Novosibirsk Scientific Center

    NASA Astrophysics Data System (ADS)

    Adakin, A.; Anisenkov, A.; Belov, S.; Chubarov, D.; Kalyuzhny, V.; Kaplin, V.; Korol, A.; Kuchin, N.; Lomakin, S.; Nikultsev, V.; Skovpen, K.; Sukharev, A.; Zaytsev, A.

    2012-12-01

    Novosibirsk Scientific Center (NSC), also known worldwide as Akademgorodok, is one of the largest Russian scientific centers hosting Novosibirsk State University (NSU) and more than 35 research organizations of the Siberian Branch of Russian Academy of Sciences including Budker Institute of Nuclear Physics (BINP), Institute of Computational Technologies, and Institute of Computational Mathematics and Mathematical Geophysics (ICM&MG). Since each institute has specific requirements on the architecture of computing farms involved in its research field, currently we've got several computing facilities hosted by NSC institutes, each optimized for a particular set of tasks, of which the largest are the NSU Supercomputer Center, Siberian Supercomputer Center (ICM&MG), and a Grid Computing Facility of BINP. A dedicated optical network with the initial bandwidth of 10 Gb/s connecting these three facilities was built in order to make it possible to share the computing resources among the research communities, thus increasing the efficiency of operating the existing computing facilities and offering a common platform for building the computing infrastructure for future scientific projects. Unification of the computing infrastructure is achieved by extensive use of virtualization technology based on XEN and KVM platforms. This contribution gives a thorough review of the present status and future development prospects for the NSC virtualized computing infrastructure and the experience gained while using it for running production data analysis jobs related to HEP experiments being carried out at BINP, especially the KEDR detector experiment at the VEPP-4M electron-positron collider.

  16. Scientific Research: Commodities or Commons?

    ERIC Educational Resources Information Center

    Vermeir, Koen

    2013-01-01

    Truth is for sale today, some critics claim. The increased commodification of science corrupts it, scientific fraud is rampant and the age-old trust in science is shattered. This cynical view, although gaining in prominence, does not explain very well the surprising motivation and integrity that is still central to the scientific life. Although…

  17. Working toward a Stronger Conceptualization of Scientific Explanation for Science Education

    ERIC Educational Resources Information Center

    Braaten, Melissa; Windschitl, Mark

    2011-01-01

    Scientific explanation plays a central role in science education reform documents, including the "Benchmarks for Science Literacy," the "National Science Education Standards", and the recent research report, "Taking Science to School." While scientific explanation receives significant emphases in these documents, there is little discussion or…

  18. Sexual Consent as a Scientific Subject: A Literature Review

    ERIC Educational Resources Information Center

    Fenner, Lydia

    2017-01-01

    Despite the presumed centrality of sexual consent to definitions of sexual violence, it remains an ambiguous and often unexamined concept both in lay and professional/scientific discourses. The following literature review of peer-reviewed research studying sexual consent as a scientific object will thematically present major findings from said…

  19. When ethics constrains clinical research: trial design of control arms in "greater than minimal risk" pediatric trials.

    PubMed

    de Melo-Martín, Inmaculada; Sondhi, Dolan; Crystal, Ronald G

    2011-09-01

    For more than three decades clinical research in the United States has been explicitly guided by the idea that ethical considerations must be central to research design and practice. In spite of the centrality of this idea, attempting to balance the sometimes conflicting values of advancing scientific knowledge and protecting human subjects continues to pose challenges. Possible conflicts between the standards of scientific research and those of ethics are particularly salient in relation to trial design. Specifically, the choice of a control arm is an aspect of trial design in which ethical and scientific issues are deeply entwined. Although ethical quandaries related to the choice of control arms may arise when conducting any type of clinical trials, they are conspicuous in early phase gene transfer trials that involve highly novel approaches and surgical procedures and have children as the research subjects. Because of children's and their parents' vulnerabilities, in trials that investigate therapies for fatal, rare diseases affecting minors, the scientific and ethical concerns related to choosing appropriate controls are particularly significant. In this paper we use direct gene transfer to the central nervous system to treat late infantile neuronal ceroid lipofuscinosis to illustrate some of these ethical issues and explore possible solutions to real and apparent conflicts between scientific and ethical considerations.

  20. Opus: A Coordination Language for Multidisciplinary Applications

    NASA Technical Reports Server (NTRS)

    Chapman, Barbara; Haines, Matthew; Mehrotra, Piyush; Zima, Hans; vanRosendale, John

    1997-01-01

    Data parallel languages, such as High Performance Fortran, can be successfully applied to a wide range of numerical applications. However, many advanced scientific and engineering applications are multidisciplinary and heterogeneous in nature, and thus do not fit well into the data parallel paradigm. In this paper we present Opus, a language designed to fill this gap. The central concept of Opus is a mechanism called ShareD Abstractions (SDA). An SDA can be used as a computation server, i.e., a locus of computational activity, or as a data repository for sharing data between asynchronous tasks. SDAs can be internally data parallel, providing support for the integration of data and task parallelism as well as nested task parallelism. They can thus be used to express multidisciplinary applications in a natural and efficient way. In this paper we describe the features of the language through a series of examples and give an overview of the runtime support required to implement these concepts in parallel and distributed environments.

  1. Exascale Storage Systems the SIRIUS Way

    NASA Astrophysics Data System (ADS)

    Klasky, S. A.; Abbasi, H.; Ainsworth, M.; Choi, J.; Curry, M.; Kurc, T.; Liu, Q.; Lofstead, J.; Maltzahn, C.; Parashar, M.; Podhorszki, N.; Suchyta, E.; Wang, F.; Wolf, M.; Chang, C. S.; Churchill, M.; Ethier, S.

    2016-10-01

    As the exascale computing age emerges, data related issues are becoming critical factors that determine how and where we do computing. Popular approaches used by traditional I/O solutions and storage libraries become increasingly bottlenecked due to their assumptions about data movement, re-organization, and storage. While new technologies, such as “burst buffers”, can help address some of the short-term performance issues, it is essential that we reexamine the underlying storage and I/O infrastructure to effectively support requirements and challenges at exascale and beyond. In this paper we present a new approach to the exascale Storage System and I/O (SSIO), which is based on allowing users to inject application knowledge into the system and leverage this knowledge to better manage, store, and access large data volumes so as to minimize the time to scientific insights. Central to our approach is the distinction between the data, metadata, and the knowledge contained therein, transferred from the user to the system by describing the “utility” of data as it ages.

  2. Web-based analysis and publication of flow cytometry experiments.

    PubMed

    Kotecha, Nikesh; Krutzik, Peter O; Irish, Jonathan M

    2010-07-01

    Cytobank is a Web-based application for storage, analysis, and sharing of flow cytometry experiments. Researchers use a Web browser to log in and use a wide range of tools developed for basic and advanced flow cytometry. In addition to providing access to standard cytometry tools from any computer, Cytobank creates a platform and community for developing new analysis and publication tools. Figure layouts created on Cytobank are designed to allow transparent access to the underlying experiment annotation and data processing steps. Since all flow cytometry files and analysis data are stored on a central server, experiments and figures can be viewed or edited by anyone with the proper permission, from any computer with Internet access. Once a primary researcher has performed the initial analysis of the data, collaborators can engage in experiment analysis and make their own figure layouts using the gated, compensated experiment files. Cytobank is available to the scientific community at http://www.cytobank.org. (c) 2010 by John Wiley & Sons, Inc.

  3. Web-Based Analysis and Publication of Flow Cytometry Experiments

    PubMed Central

    Kotecha, Nikesh; Krutzik, Peter O.; Irish, Jonathan M.

    2014-01-01

    Cytobank is a web-based application for storage, analysis, and sharing of flow cytometry experiments. Researchers use a web browser to log in and use a wide range of tools developed for basic and advanced flow cytometry. In addition to providing access to standard cytometry tools from any computer, Cytobank creates a platform and community for developing new analysis and publication tools. Figure layouts created on Cytobank are designed to allow transparent access to the underlying experiment annotation and data processing steps. Since all flow cytometry files and analysis data are stored on a central server, experiments and figures can be viewed or edited by anyone with the proper permissions from any computer with Internet access. Once a primary researcher has performed the initial analysis of the data, collaborators can engage in experiment analysis and make their own figure layouts using the gated, compensated experiment files. Cytobank is available to the scientific community at www.cytobank.org PMID:20578106

  4. Position Paper: Applying Machine Learning to Software Analysis to Achieve Trusted, Repeatable Scientific Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prowell, Stacy J; Symons, Christopher T

    2015-01-01

    Producing trusted results from high-performance codes is essential for policy and has significant economic impact. We propose combining rigorous analytical methods with machine learning techniques to achieve the goal of repeatable, trustworthy scientific computing.

  5. Identifying the Factors Leading to Success: How an Innovative Science Curriculum Cultivates Student Motivation

    NASA Astrophysics Data System (ADS)

    Scogin, Stephen C.

    2016-06-01

    PlantingScience is an award-winning program recognized for its innovation and use of computer-supported scientist mentoring. Science learners work on inquiry-based experiments in their classrooms and communicate asynchronously with practicing plant scientist-mentors about the projects. The purpose of this study was to identify specific factors contributing to the program's effectiveness in engaging students. Using multiple data sources, grounded theory (Strauss and Corbin in Basics of qualitative research. Sage, Newbury Park, 1990) was used to develop a conceptual model identifying the central phenomenon, causal conditions, intervening conditions, strategies, contexts, and student outcomes of the project. Student motivation was determined to be the central phenomenon explaining the success of the program, with student empowerment, online mentor interaction, and authenticity of the scientific experiences serving as causal conditions. Teachers contributed to student motivation by giving students more freedom, challenging students to take projects deeper, encouraging, and scaffolding. Scientists contributed to student motivation by providing explanations, asking questions, encouraging, and offering themselves as partners in the inquiry process. Several positive student outcomes of the program were uncovered and included increased positivity, greater willingness to take projects deeper, better understanding of scientific concepts, and greater commitments to collaboration. The findings of this study provide relevant information on how to develop curriculum, use technology, and train practitioners and mentors to utilize strategies and actions that improve learners' motivation to engage in authentic science in the classroom.

  6. Advanced Scientific Computing Research Exascale Requirements Review. An Office of Science review sponsored by Advanced Scientific Computing Research, September 27-29, 2016, Rockville, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Almgren, Ann; DeMar, Phil; Vetter, Jeffrey

    The widespread use of computing in the American economy would not be possible without a thoughtful, exploratory research and development (R&D) community pushing the performance edge of operating systems, computer languages, and software libraries. These are the tools and building blocks — the hammers, chisels, bricks, and mortar — of the smartphone, the cloud, and the computing services on which we rely. Engineers and scientists need ever-more specialized computing tools to discover new material properties for manufacturing, make energy generation safer and more efficient, and provide insight into the fundamentals of the universe, for example. The research division of the U.S. Department of Energy’s (DOE’s) Office of Advanced Scientific Computing and Research (ASCR Research) ensures that these tools and building blocks are being developed and honed to meet the extreme needs of modern science. See also http://exascaleage.org/ascr/ for additional information.

  7. Quantum Testbeds Stakeholder Workshop (QTSW) Report meeting purpose and agenda.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hebner, Gregory A.

    Quantum computing (QC) is a promising early-stage technology with the potential to provide scientific computing capabilities far beyond what is possible with even an Exascale computer in specific problems of relevance to the Office of Science. These include (but are not limited to) materials modeling, molecular dynamics, and quantum chromodynamics. However, commercial QC systems are not yet available and the technical maturity of current QC hardware, software, algorithms, and systems integration is woefully incomplete. Thus, there is a significant opportunity for DOE to define the technology building blocks, and solve the system integration issues to enable a revolutionary tool. Once realized, QC will have world changing impact on economic competitiveness, the scientific enterprise, and citizen well-being. Prior to this workshop, DOE / Office of Advanced Scientific Computing Research (ASCR) hosted a workshop in 2015 to explore QC scientific applications. The goal of that workshop was to assess the viability of QC technologies to meet the computational requirements in support of DOE’s science and energy mission and to identify the potential impact of these technologies.

  8. Investigating power capping toward energy-efficient scientific applications

    DOE PAGES

    Haidar, Azzam; Jagode, Heike; Vaccaro, Phil; ...

    2018-03-22

    The emergence of power efficiency as a primary constraint in processor and system design poses new challenges concerning power and energy awareness for numerical libraries and scientific applications. Power consumption also plays a major role in the design of data centers, which may house petascale or exascale-level computing systems. At these extreme scales, understanding and improving the energy efficiency of numerical libraries and their related applications becomes a crucial part of the successful implementation and operation of the computing system. In this paper, we study and investigate the practice of controlling a compute system's power usage, and we explore how different power caps affect the performance of numerical algorithms with different computational intensities. Further, we determine the impact, in terms of performance and energy usage, that these caps have on a system running scientific applications. This analysis will enable us to characterize the types of algorithms that benefit most from these power management schemes. Our experiments are performed using a set of representative kernels and several popular scientific benchmarks. Lastly, we quantify a number of power and performance measurements and draw observations and conclusions that can be viewed as a roadmap to achieving energy efficiency in the design and execution of scientific algorithms.
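
    As a rough illustration of what "different computational intensities" means in practice, the sketch below (not the paper's benchmark suite; the kernels and sizes are arbitrary) times a compute-bound matrix multiply against a memory-bound vector update. Re-running such kernels under different power caps is the kind of comparison the study describes.

      import time
      import numpy as np

      def time_kernel(fn, repeats=5):
          # Return the best wall-clock time over several repeats.
          best = float("inf")
          for _ in range(repeats):
              t0 = time.perf_counter()
              fn()
              best = min(best, time.perf_counter() - t0)
          return best

      n = 2048
      a = np.random.rand(n, n)
      b = np.random.rand(n, n)
      x = np.random.rand(20_000_000)
      y = np.random.rand(20_000_000)

      t_gemm = time_kernel(lambda: a @ b)         # ~2*n**3 flops on O(n**2) data: compute-bound
      t_triad = time_kernel(lambda: x + 2.0 * y)  # ~2 flops per 24 bytes moved: memory-bound

      print(f"matrix multiply: {2 * n**3 / t_gemm / 1e9:7.1f} GFLOP/s")
      print(f"vector update  : {2 * x.size / t_triad / 1e9:7.1f} GFLOP/s")
      # Repeating these measurements under different power caps (for example via the
      # Linux powercap/RAPL interface, where available) reveals which class of kernel
      # loses more performance when the cap is lowered.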

  9. Investigating power capping toward energy-efficient scientific applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haidar, Azzam; Jagode, Heike; Vaccaro, Phil

    The emergence of power efficiency as a primary constraint in processor and system design poses new challenges concerning power and energy awareness for numerical libraries and scientific applications. Power consumption also plays a major role in the design of data centers, which may house petascale or exascale-level computing systems. At these extreme scales, understanding and improving the energy efficiency of numerical libraries and their related applications becomes a crucial part of the successful implementation and operation of the computing system. In this paper, we study and investigate the practice of controlling a compute system's power usage, and we explore how different power caps affect the performance of numerical algorithms with different computational intensities. Further, we determine the impact, in terms of performance and energy usage, that these caps have on a system running scientific applications. This analysis will enable us to characterize the types of algorithms that benefit most from these power management schemes. Our experiments are performed using a set of representative kernels and several popular scientific benchmarks. Lastly, we quantify a number of power and performance measurements and draw observations and conclusions that can be viewed as a roadmap to achieving energy efficiency in the design and execution of scientific algorithms.

  10. A Computing Environment to Support Repeatable Scientific Big Data Experimentation of World-Wide Scientific Literature

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schlicher, Bob G; Kulesz, James J; Abercrombie, Robert K

    A principal tenet of the scientific method is that experiments must be repeatable and rely on ceteris paribus (i.e., all other things being equal). As a scientific community involved in data sciences, we must investigate ways to establish an environment where experiments can be repeated. We can no longer allude to where the data comes from; we must add rigor to the data collection and management process from which our analysis is conducted. This paper describes a computing environment to support repeatable scientific big data experimentation of world-wide scientific literature, and recommends a system that is housed at the Oak Ridge National Laboratory in order to provide value to investigators from government agencies, academic institutions, and industry entities. The described computing environment also adheres to the recently instituted digital data management plan mandated by multiple US government agencies, which involves all stages of the digital data life cycle including capture, analysis, sharing, and preservation. It particularly focuses on the sharing and preservation of digital research data. The details of this computing environment are explained within the context of cloud services by the three-layer classification of Software as a Service, Platform as a Service, and Infrastructure as a Service.

  11. The Central Asian Journal of Global Health to Increase Scientific Productivity.

    PubMed

    Freese, Kyle; Shubnikov, Eugene; LaPorte, Ron; Adambekov, Shalkar; Askarova, Sholpan; Zhumadilov, Zhaxybay; Linkov, Faina

    2013-01-01

    The WHO Collaborating Center at the University of Pittsburgh, USA, partnering with Nazarbayev University, developed the Central Asian Journal of Global Health (CAJGH, cajgh.pitt.edu) in order to increase scientific productivity in Kazakhstan and Central Asia. Scientists in this region often have difficulty publishing in upper-tier English-language scientific journals due to language barriers, high publication fees, and a lack of access to mentoring services. CAJGH seeks to help scientists overcome these challenges by providing peer-reviewed publication free of charge, with English-language and research mentoring services available to selected authors. CAJGH began as a way to expand the Supercourse scientific network (www.pitt.edu/~super1) in the Central Asian region in order to rapidly disseminate educational materials. The network began with approximately 60 individuals in five Central Asian countries and has grown to over 1,300 in a few short years. The CAJGH website receives nearly 900 visits per month. The University of Pittsburgh's "open access publishing system" was utilized to create CAJGH in 2012. There are two branches of the CAJGH editorial board: Astana (at the Center for Life Sciences, Nazarbayev University) and Pittsburgh (WHO Collaborating Center). Both are composed of leading scientists and expert staff who work together throughout the review and publication process. Two complete issues have been published since 2012 and a third is now underway. Even though CAJGH is a new journal, the editorial board uses a rigorous review process; fewer than 50% of all submitted articles are forwarded to peer review or accepted for publication. Furthermore, in 2014, CAJGH will apply to be cross-referenced in PubMed and Scopus. CAJGH is one of the first English-language journals in the Central Asian region that reaches a large number of scientists. This journal fills a unique niche that will assist scientists in Kazakhstan and Central Asia in publishing their research findings and sharing their knowledge with others around the region and the world.

  12. A general purpose subroutine for fast fourier transform on a distributed memory parallel machine

    NASA Technical Reports Server (NTRS)

    Dubey, A.; Zubair, M.; Grosch, C. E.

    1992-01-01

    One issue which is central in developing a general purpose Fast Fourier Transform (FFT) subroutine on a distributed memory parallel machine is the data distribution. It is possible that different users would like to use the FFT routine with different data distributions. Thus, there is a need to design FFT schemes on distributed memory parallel machines which can support a variety of data distributions. An FFT implementation on a distributed memory parallel machine which works for a number of data distributions commonly encountered in scientific applications is presented. The problem of rearranging the data after computing the FFT is also addressed. The performance of the implementation on a distributed memory parallel machine Intel iPSC/860 is evaluated.
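
    The transpose-based (four-step) factorization is one common way to organize such an FFT so that each stage operates on locally held data; the single-process NumPy sketch below illustrates the idea only and is not the iPSC/860 implementation described above.

      import numpy as np

      def four_step_fft(x, n1, n2):
          # 1-D FFT of length n1*n2 via row FFTs, twiddle factors, and column FFTs.
          # The explicit data rearrangement (transpose) step is where a distributed
          # implementation would exchange data between processes, which is why the
          # data distribution matters.
          assert x.size == n1 * n2
          # Step 1: view x as an (n1, n2) array with a[j1, j2] = x[j1 + n1*j2].
          a = x.reshape(n2, n1).T
          # Step 2: length-n2 FFTs along the rows.
          b = np.fft.fft(a, axis=1)
          # Step 3: multiply by twiddle factors exp(-2*pi*i*j1*k2/(n1*n2)).
          j1 = np.arange(n1)[:, None]
          k2 = np.arange(n2)[None, :]
          c = b * np.exp(-2j * np.pi * j1 * k2 / (n1 * n2))
          # Step 4: length-n1 FFTs along the columns.
          d = np.fft.fft(c, axis=0)
          # Step 5: read out X[k2 + n2*k1] = d[k1, k2].
          return d.reshape(-1)

      x = np.random.rand(12) + 1j * np.random.rand(12)
      print(np.allclose(four_step_fft(x, 3, 4), np.fft.fft(x)))  # True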

  13. Common Graphics Library (CGL). Volume 2: Low-level user's guide

    NASA Technical Reports Server (NTRS)

    Taylor, Nancy L.; Hammond, Dana P.; Theophilos, Pauline M.

    1989-01-01

    The intent is to instruct the users of the Low-Level routines of the Common Graphics Library (CGL). The Low-Level routines form an application-independent graphics package enabling the user community to construct and design scientific charts conforming to the publication and/or viewgraph process. The Low-Level routines allow the user to design unique or unusual report-quality charts from a set of graphics utilities. The features of these routines can be used stand-alone or in conjunction with other packages to enhance or augment their capabilities. This library is written in ANSI FORTRAN 77, and currently uses a CORE-based underlying graphics package, and is therefore machine-independent, providing support for centralized and/or distributed computer systems.

  14. Genome annotation in a community college cell biology lab.

    PubMed

    Beagley, C Timothy

    2013-01-01

    The Biology Department at Salt Lake Community College has used the IMG-ACT toolbox to introduce a genome mapping and annotation exercise into the laboratory portion of its Cell Biology course. This project provides students with an authentic inquiry-based learning experience while introducing them to computational biology and contemporary learning skills. Additionally, the project strengthens student understanding of the scientific method and contributes to student learning gains in curricular objectives centered around basic molecular biology, specifically, the Central Dogma. Importantly, inclusion of this project in the laboratory course provides students with a positive learning environment and allows for the use of cooperative learning strategies to increase overall student success. Copyright © 2012 International Union of Biochemistry and Molecular Biology, Inc.

  15. An evaluation of the impact of biomass burning smoke aerosol particles on near surface temperature forecasts

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Reid, J. S.; Benedetti, A.; Christensen, M.; Marquis, J. W.

    2016-12-01

    Currently, with the improvements in aerosol forecast accuracies through aerosol data assimilation, the community is unavoidably facing a scientific question: is it worth the computational time to insert real-time aerosol analyses into numerical models for weather forecasts? In this study, by analyzing a significant biomass burning aerosol event that occurred in 2015 over the Northern part of the Central US, the impact of aerosol particles on near-surface temperature forecasts is evaluated. The aerosol direct surface cooling efficiency, which links surface temperature changes to aerosol loading, is derived from observational-based data for the first time. The potential of including real-time aerosol analyses into weather forecasting models for near surface temperature forecasts is also investigated.

  16. A Blended Professional Development Program to Help a Teacher Learn to Provide One-to-One Scaffolding

    NASA Astrophysics Data System (ADS)

    Belland, Brian R.; Burdo, Ryan; Gu, Jiangyue

    2015-04-01

    Argumentation is central to instruction centered on socio-scientific issues (Sadler & Donnelly in International Journal of Science Education, 28(12), 1463-1488, 2006. doi: 10.1080/09500690600708717). Teachers can play a big role in helping students engage in argumentation and solve authentic scientific problems. To do so, they need to learn one-to-one scaffolding—dynamic support to help students accomplish tasks that they could not complete unaided. This study explores a middle school science teacher's provision of one-to-one scaffolding during a problem-based learning unit, in which students argued about how to optimize the water quality of their local river. The blended professional development program incorporated three 1.5-h seminars, one 8-h workshop, and 4 weeks of online education activities. Data sources were video of three small groups per period, and what students typed in response to prompts from computer-based argumentation scaffolds. Results indicated that the teacher provided one-to-one scaffolding on a par with inquiry-oriented teachers described in the literature.

  17. Comparing the Consumption of CPU Hours with Scientific Output for the Extreme Science and Engineering Discovery Environment (XSEDE)

    PubMed Central

    Börner, Katy

    2016-01-01

    This paper presents the results of a study that compares resource usage with publication output using data about the consumption of CPU cycles from the Extreme Science and Engineering Discovery Environment (XSEDE) and resulting scientific publications for 2,691 institutions/teams. Specifically, the datasets comprise a total of 5,374,032,696 central processing unit (CPU) hours run in XSEDE during July 1, 2011 to August 18, 2015 and 2,882 publications that cite the XSEDE resource. Three types of studies were conducted: a geospatial analysis of XSEDE providers and consumers, co-authorship network analysis of XSEDE publications, and bi-modal network analysis of how XSEDE resources are used by different research fields. Resulting visualizations show that a diverse set of consumers make use of XSEDE resources, that users of XSEDE publish together frequently, and that the users of XSEDE with the highest resource usage tend to be “traditional” high-performance computing (HPC) community members from astronomy, atmospheric science, physics, chemistry, and biology. PMID:27310174
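
    A minimal sketch of the bi-modal (institution-by-field) usage analysis described above; the institution names, field names, and CPU-hour counts are made up for illustration and are not XSEDE data.

      from collections import defaultdict

      # Toy two-mode usage table: (institution, field) -> CPU hours.
      usage_hours = {
          ("Univ A", "astronomy"): 1_200_000,
          ("Univ A", "chemistry"):   300_000,
          ("Univ B", "physics"):     900_000,
          ("Univ C", "biology"):     150_000,
          ("Univ C", "physics"):     450_000,
      }

      # Project the bipartite network onto each mode by summing edge weights.
      hours_by_institution = defaultdict(int)
      hours_by_field = defaultdict(int)
      for (institution, field), hours in usage_hours.items():
          hours_by_institution[institution] += hours
          hours_by_field[field] += hours

      for field, hours in sorted(hours_by_field.items(), key=lambda kv: -kv[1]):
          print(f"{field:10s} {hours:>10,d} CPU hours")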

  18. Idle waves in high-performance computing

    NASA Astrophysics Data System (ADS)

    Markidis, Stefano; Vencels, Juris; Peng, Ivy Bo; Akhmetova, Dana; Laure, Erwin; Henri, Pierre

    2015-01-01

    The vast majority of parallel scientific applications distributes computation among processes that are in a busy state when computing and in an idle state when waiting for information from other processes. We identify the propagation of idle waves through processes in scientific applications with a local information exchange between the two processes. Idle waves are nondispersive and have a phase velocity inversely proportional to the average busy time. The physical mechanism enabling the propagation of idle waves is the local synchronization between two processes due to remote data dependency. This study provides a description of the large number of processes in parallel scientific applications as a continuous medium. This work also is a step towards an understanding of how localized idle periods can affect remote processes, leading to the degradation of global performance in parallel scientific applications.
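
    The following toy simulation (an assumption-laden sketch, not the authors' model) shows the claimed behavior: when each rank must wait for its nearest neighbors' previous step, a one-off delay injected on rank 0 spreads outward roughly one rank per busy period, i.e. with a phase velocity proportional to 1/t_busy.

      # P ranks iterate in lock-step with their nearest neighbors; a single extra
      # delay on rank 0 at step 1 propagates outward as an idle wave.
      P, steps, t_busy = 16, 20, 1.0
      extra_delay = 5.0

      finish = [[0.0] * P for _ in range(steps + 1)]
      for s in range(1, steps + 1):
          for r in range(P):
              neighbours = [finish[s - 1][r]]
              if r > 0:
                  neighbours.append(finish[s - 1][r - 1])
              if r < P - 1:
                  neighbours.append(finish[s - 1][r + 1])
              start = max(neighbours)  # local synchronization with neighbors
              work = t_busy + (extra_delay if (s == 1 and r == 0) else 0.0)
              finish[s][r] = start + work

      for s in range(1, steps + 1):
          idle_ranks = [r for r in range(P) if finish[s][r] > s * t_busy + 1e-9]
          print(f"step {s:2d}: delayed ranks up to {max(idle_ranks) if idle_ranks else '-'}")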

  19. Scientific progress without increasing verisimilitude: In response to Niiniluoto.

    PubMed

    Rowbottom, Darrell P

    2015-06-01

    First, I argue that scientific progress is possible in the absence of increasing verisimilitude in science's theories. Second, I argue that increasing theoretical verisimilitude is not the central, or primary, dimension of scientific progress. Third, I defend my previous argument that unjustified changes in scientific belief may be progressive. Fourth, I illustrate how false beliefs can promote scientific progress in ways that cannot be explicated by appeal to verisimilitude. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. A 3-D Approach for Teaching and Learning about Surface Water Systems through Computational Thinking, Data Visualization and Physical Models

    NASA Astrophysics Data System (ADS)

    Caplan, B.; Morrison, A.; Moore, J. C.; Berkowitz, A. R.

    2017-12-01

    Understanding water is central to understanding environmental challenges. Scientists use `big data' and computational models to develop knowledge about the structure and function of complex systems, and to make predictions about changes in climate, weather, hydrology, and ecology. Large environmental systems-related data sets and simulation models are difficult for high school teachers and students to access and make sense of. Comp Hydro, a collaboration across four states and multiple school districts, integrates computational thinking and data-related science practices into water systems instruction to enhance development of scientific model-based reasoning, through curriculum, assessment and teacher professional development. Comp Hydro addresses the need for 1) teaching materials for using data and physical models of hydrological phenomena, 2) building teachers' and students' comfort or familiarity with data analysis and modeling, and 3) infusing the computational knowledge and practices necessary to model and visualize hydrologic processes into instruction. Comp Hydro teams in Baltimore, MD and Fort Collins, CO are integrating teaching about surface water systems into high school courses focusing on flooding (MD) and surface water reservoirs (CO). This interactive session will highlight the successes and challenges of our physical and simulation models in helping teachers and students develop proficiency with computational thinking about surface water. We also will share insights from comparing teacher-led vs. project-led development of curriculum and our simulations.

  1. Comparisons of some large scientific computers

    NASA Technical Reports Server (NTRS)

    Credeur, K. R.

    1981-01-01

    In 1975, the National Aeronautics and Space Administration (NASA) began studies to assess the technical and economic feasibility of developing a computer having sustained computational speed of one billion floating point operations per second and a working memory of at least 240 million words. Such a powerful computer would allow computational aerodynamics to play a major role in aeronautical design and advanced fluid dynamics research. Based on favorable results from these studies, NASA proceeded with developmental plans. The computer was named the Numerical Aerodynamic Simulator (NAS). To help insure that the estimated cost, schedule, and technical scope were realistic, a brief study was made of past large scientific computers. Large discrepancies between inception and operation in scope, cost, or schedule were studied so that they could be minimized with NASA's proposed new computer. The main computers studied were the ILLIAC IV, STAR 100, Parallel Element Processor Ensemble (PEPE), and Shuttle Mission Simulator (SMS) computer. Comparison data on memory and speed were also obtained on the IBM 650, 704, 7090, 360-50, 360-67, 360-91, and 370-195; the CDC 6400, 6600, 7600, CYBER 203, and CYBER 205; CRAY 1; and the Advanced Scientific Computer (ASC). A few lessons learned conclude the report.

  2. Public accessibility of biomedical articles from PubMed Central reduces journal readership--retrospective cohort analysis.

    PubMed

    Davis, Philip M

    2013-07-01

    Does PubMed Central--a government-run digital archive of biomedical articles--compete with scientific society journals? A longitudinal, retrospective cohort analysis of 13,223 articles (5999 treatment, 7224 control) published in 14 society-run biomedical research journals in nutrition, experimental biology, physiology, and radiology between February 2008 and January 2011 reveals a 21.4% reduction in full-text hypertext markup language (HTML) article downloads and a 13.8% reduction in portable document format (PDF) article downloads from the journals' websites when U.S. National Institutes of Health-sponsored articles (treatment) become freely available from the PubMed Central repository. In addition, the effect of PubMed Central on reducing PDF article downloads is increasing over time, growing at a rate of 1.6% per year. There was no longitudinal effect for full-text HTML downloads. While PubMed Central may be providing complementary access to readers traditionally underserved by scientific journals, the loss of article readership from the journal website may weaken the ability of the journal to build communities of interest around research papers, impede the communication of news and events to scientific society members and journal readers, and reduce the perceived value of the journal to institutional subscribers.

  3. USSR Report: Cybernetics, Computers and Automation Technology. No. 69.

    DTIC Science & Technology

    1983-05-06

    computers in multiprocessor and multistation design, control, and scientific research automation systems. The results of comparing the efficiency of ... [Podvizhnaya, Scientific Research Institute of Control Computers, Severodonetsk] [Text] The most significant change in the design of the SM-2M compared to ... (UPRAVLYAYUSHCHIYE SISTEMY I MASHINY, Nov-Dec 82). APPLICATIONS: Kiev Automated Control System, Design Features and Prospects for Development (V. A...

  4. The Nature and Development of Scientific Reasoning: A Synthetic View

    ERIC Educational Resources Information Center

    Lawson, Antone E.

    2004-01-01

    This paper presents a synthesis of what is currently known about the nature and development of scientific reasoning and why it plays a central role in acquiring scientific literacy. Science is viewed as a hypothetico-deductive (HD) enterprise engaging in the generation and test of alternative explanations. Explanation generation and test requires…

  5. Scientific Knowledge, Popularisation, and the Use of Metaphors: Modern Genetics in Popular Science Magazines

    ERIC Educational Resources Information Center

    Pramling, Niklas; Saljo, Roger

    2007-01-01

    The article reports an empirical study of how authors in popular science magazines attempt to render scientific knowledge intelligible to wide audiences. In bridging the two domains of "popular" and "scientific" knowledge, respectively, metaphor becomes central. We ask the empirical question of what metaphors are used when communicating about…

  6. Mastery of Scientific Argumentation on the Concept of Neutralization in Chemistry: A Malaysian Perspective

    ERIC Educational Resources Information Center

    Heng, Lee Ling; Surif, Johari; Seng, Cher Hau; Ibrahim, Nor Hasniza

    2015-01-01

    Purpose: Argumentative practices are central to science education, and have recently been emphasised to promote students' reasoning skills and to develop student's understanding of scientific concepts. This study examines the mastery of scientific argumentation, based on the concept of neutralisation, among secondary level science students, when…

  7. SCIENTIFIC AND TECHNICAL MANPOWER RESOURCES, SUMMARY INFORMATION ON EMPLOYMENT, CHARACTERISTICS, SUPPLY, AND TRAINING.

    ERIC Educational Resources Information Center

    National Science Foundation, Washington, DC.

    THE LEGISLATION ESTABLISHING THE NATIONAL SCIENCE FOUNDATION STIPULATED THAT IT MAINTAIN A REGISTER OF SCIENTIFIC AND TECHNICAL PERSONNEL AND IN OTHER WAYS PROVIDE A CENTRAL CLEARINGHOUSE FOR INFORMATION COVERING ALL SCIENTIFIC AND TECHNICAL PERSONNEL IN THE UNITED STATES. THIS PUBLICATION BRINGS TOGETHER INFORMATION FROM MANY SOURCES ON THE…

  8. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    NASA Astrophysics Data System (ADS)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for larger problems. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM’s BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.
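
    For readers unfamiliar with this class of solver, the fragment below is a minimal 1-D acoustic finite-difference time-stepping loop; it illustrates only the stencil-update structure that such codes parallelize by partitioning the grid, and is not the 3-D anelastic TS-AWP algorithm.

      import numpy as np

      # Second-order centered differences in space and time for u_tt = c^2 * u_xx.
      nx, nt = 400, 800
      dx, c = 1.0, 1.0
      dt = 0.5 * dx / c            # satisfies the CFL stability condition
      r2 = (c * dt / dx) ** 2

      u_prev = np.zeros(nx)
      u = np.zeros(nx)
      u[nx // 2] = 1.0             # initial pulse at the domain center

      for _ in range(nt):
          u_next = np.zeros(nx)
          u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                          + r2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
          u_prev, u = u, u_next    # fixed (zero) boundaries at both ends

      print("peak amplitude after propagation:", float(np.abs(u).max()))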

  9. How music training enhances working memory: a cerebrocerebellar blending mechanism that can lead equally to scientific discovery and therapeutic efficacy in neurological disorders.

    PubMed

    Vandervert, Larry

    2015-01-01

    Following in the vein of studies that concluded that music training resulted in plastic changes in Einstein's cerebral cortex, controlled research has shown that music training (1) enhances central executive attentional processes in working memory, and (2) has also been shown to be of significant therapeutic value in neurological disorders. Within this framework of music training-induced enhancement of central executive attentional processes, the purpose of this article is to argue that: (1) The foundational basis of the central executive begins in infancy as attentional control during the establishment of working memory, (2) In accordance with Akshoomoff, Courchesne and Townsend's and Leggio and Molinari's cerebellar sequence detection and prediction models, the rigorous volitional-control demands of music training can enhance voluntary manipulation of information in thought and movement, (3) The music training-enhanced blending of cerebellar internal models in working memory can be experienced as intuition in scientific discovery (as Einstein often indicated) or, equally, as moments of therapeutic advancement toward goals in the development of voluntary control in neurological disorders, and (4) The blending of internal models as in (3) thus provides a mechanism by which music training enhances central executive processes in working memory that can lead to scientific discovery and improved therapeutic outcomes in neurological disorders. Within the framework of Leggio and Molinari's cerebellar sequence detection model, it is determined that intuitive steps forward that occur in both scientific discovery and during therapy in those with neurological disorders operate according to the same mechanism of adaptive error-driven blending of cerebellar internal models. It is concluded that the entire framework of the central executive structure of working memory is a product of the cerebrocerebellar system which can, through the learning of internal models, incorporate the multi-dimensional rigor and volitional-control demands of music training and, thereby, enhance voluntary control. It is further concluded that this cerebrocerebellar view of the music training-induced enhancement of central executive control in working memory provides a needed mechanism to explain both the highest level of scientific discovery and the efficacy of music training in the remediation of neurological impairments.

  10. Accelerating scientific discovery : 2007 annual report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckman, P.; Dave, P.; Drugan, C.

    2008-11-14

    As a gateway for scientific discovery, the Argonne Leadership Computing Facility (ALCF) works hand in hand with the world's best computational scientists to advance research in a diverse span of scientific domains, ranging from chemistry, applied mathematics, and materials science to engineering physics and life sciences. Sponsored by the U.S. Department of Energy's (DOE) Office of Science, researchers are using the IBM Blue Gene/L supercomputer at the ALCF to study and explore key scientific problems that underlie important challenges facing our society. For instance, a research team at the University of California-San Diego/SDSC is studying the molecular basis of Parkinson's disease. The researchers plan to use the knowledge they gain to discover new drugs to treat the disease and to identify risk factors for other diseases that are equally prevalent. Likewise, scientists from Pratt & Whitney are using the Blue Gene to understand the complex processes within aircraft engines. Expanding our understanding of jet engine combustors is the secret to improved fuel efficiency and reduced emissions. Lessons learned from the scientific simulations of jet engine combustors have already led Pratt & Whitney to newer designs with unprecedented reductions in emissions, noise, and cost of ownership. ALCF staff members provide in-depth expertise and assistance to those using the Blue Gene/L and optimizing user applications. Both the Catalyst and Applications Performance Engineering and Data Analytics (APEDA) teams support the users' projects. In addition to working with scientists running experiments on the Blue Gene/L, we have become a nexus for the broader global community. In partnership with the Mathematics and Computer Science Division at Argonne National Laboratory, we have created an environment where the world's most challenging computational science problems can be addressed. Our expertise in high-end scientific computing enables us to provide guidance for applications that are transitioning to petascale as well as to produce software that facilitates their development, such as the MPICH library, which provides a portable and efficient implementation of the MPI standard--the prevalent programming model for large-scale scientific applications--and the PETSc toolkit that provides a programming paradigm that eases the development of many scientific applications on high-end computers.

  11. Brain-computer interface devices for patients with paralysis and amputation: a meeting report

    NASA Astrophysics Data System (ADS)

    Bowsher, K.; Civillico, E. F.; Coburn, J.; Collinger, J.; Contreras-Vidal, J. L.; Denison, T.; Donoghue, J.; French, J.; Getzoff, N.; Hochberg, L. R.; Hoffmann, M.; Judy, J.; Kleitman, N.; Knaack, G.; Krauthamer, V.; Ludwig, K.; Moynahan, M.; Pancrazio, J. J.; Peckham, P. H.; Pena, C.; Pinto, V.; Ryan, T.; Saha, D.; Scharen, H.; Shermer, S.; Skodacek, K.; Takmakov, P.; Tyler, D.; Vasudevan, S.; Wachrathit, K.; Weber, D.; Welle, C. G.; Ye, M.

    2016-04-01

    Objective. The Food and Drug Administration’s (FDA) Center for Devices and Radiological Health (CDRH) believes it is important to help stakeholders (e.g., manufacturers, health-care professionals, patients, patient advocates, academia, and other government agencies) navigate the regulatory landscape for medical devices. For innovative devices involving brain-computer interfaces, this is particularly important. Approach. Towards this goal, on 21 November, 2014, CDRH held an open public workshop on its White Oak, MD campus with the aim of fostering an open discussion on the scientific and clinical considerations associated with the development of brain-computer interface (BCI) devices, defined for the purposes of this workshop as neuroprostheses that interface with the central or peripheral nervous system to restore lost motor or sensory capabilities. Main results. This paper summarizes the presentations and discussions from that workshop. Significance. CDRH plans to use this information to develop regulatory considerations that will promote innovation while maintaining appropriate patient protections. FDA plans to build on advances in regulatory science and input provided in this workshop to develop guidance that provides recommendations for premarket submissions for BCI devices. These proceedings will be a resource for the BCI community during the development of medical devices for consumers.

  12. Brain-computer interface devices for patients with paralysis and amputation: a meeting report.

    PubMed

    Bowsher, K; Civillico, E F; Coburn, J; Collinger, J; Contreras-Vidal, J L; Denison, T; Donoghue, J; French, J; Getzoff, N; Hochberg, L R; Hoffmann, M; Judy, J; Kleitman, N; Knaack, G; Krauthamer, V; Ludwig, K; Moynahan, M; Pancrazio, J J; Peckham, P H; Pena, C; Pinto, V; Ryan, T; Saha, D; Scharen, H; Shermer, S; Skodacek, K; Takmakov, P; Tyler, D; Vasudevan, S; Wachrathit, K; Weber, D; Welle, C G; Ye, M

    2016-04-01

    The Food and Drug Administration's (FDA) Center for Devices and Radiological Health (CDRH) believes it is important to help stakeholders (e.g., manufacturers, health-care professionals, patients, patient advocates, academia, and other government agencies) navigate the regulatory landscape for medical devices. For innovative devices involving brain-computer interfaces, this is particularly important. Towards this goal, on 21 November, 2014, CDRH held an open public workshop on its White Oak, MD campus with the aim of fostering an open discussion on the scientific and clinical considerations associated with the development of brain-computer interface (BCI) devices, defined for the purposes of this workshop as neuroprostheses that interface with the central or peripheral nervous system to restore lost motor or sensory capabilities. This paper summarizes the presentations and discussions from that workshop. CDRH plans to use this information to develop regulatory considerations that will promote innovation while maintaining appropriate patient protections. FDA plans to build on advances in regulatory science and input provided in this workshop to develop guidance that provides recommendations for premarket submissions for BCI devices. These proceedings will be a resource for the BCI community during the development of medical devices for consumers.

  13. RAPPORT: running scientific high-performance computing applications on the cloud.

    PubMed

    Cohen, Jeremy; Filippis, Ioannis; Woodbridge, Mark; Bauer, Daniela; Hong, Neil Chue; Jackson, Mike; Butcher, Sarah; Colling, David; Darlington, John; Fuchs, Brian; Harvey, Matt

    2013-01-28

    Cloud computing infrastructure is now widely used in many domains, but one area where there has been more limited adoption is research computing, in particular for running scientific high-performance computing (HPC) software. The Robust Application Porting for HPC in the Cloud (RAPPORT) project took advantage of existing links between computing researchers and application scientists in the fields of bioinformatics, high-energy physics (HEP) and digital humanities, to investigate running a set of scientific HPC applications from these domains on cloud infrastructure. In this paper, we focus on the bioinformatics and HEP domains, describing the applications and target cloud platforms. We conclude that, while there are many factors that need consideration, there is no fundamental impediment to the use of cloud infrastructure for running many types of HPC applications and, in some cases, there is potential for researchers to benefit significantly from the flexibility offered by cloud platforms.

  14. High-performance scientific computing in the cloud

    NASA Astrophysics Data System (ADS)

    Jorissen, Kevin; Vila, Fernando; Rehr, John

    2011-03-01

    Cloud computing has the potential to open up high-performance computational science to a much broader class of researchers, owing to its ability to provide on-demand, virtualized computational resources. However, before such approaches can become commonplace, user-friendly tools must be developed that hide the unfamiliar cloud environment and streamline the management of cloud resources for many scientific applications. We have recently shown that high-performance cloud computing is feasible for parallelized x-ray spectroscopy calculations. We now present benchmark results for a wider selection of scientific applications focusing on electronic structure and spectroscopic simulation software in condensed matter physics. These applications are driven by an improved portable interface that can manage virtual clusters and run various applications in the cloud. We also describe a next generation of cluster tools, aimed at improved performance and a more robust cluster deployment. Supported by NSF grant OCI-1048052.

  15. Management of scientific information with Google Drive.

    PubMed

    Kubaszewski, Łukasz; Kaczmarczyk, Jacek; Nowakowski, Andrzej

    2013-09-20

    The amount and diversity of scientific publications require a modern management system. By "management" we mean the process of gathering interesting information for the purpose of reading and archiving for quick access in future clinical practice and research activity. In the past, such a system required the physical existence of a library, either institutional or private. Nowadays, in an era dominated by electronic information, it is natural to migrate entire systems to a digital form. In the following paper we describe the structure and functions of an individual electronic library system (IELiS) for the management of scientific publications based on the Google Drive service. Architecture of the system: the system consists of a central element and peripheral devices. The central element of the system is the virtual Google Drive provided by Google Inc. Physical elements of the system include a tablet with the Android operating system and a personal computer, both with internet access. Required software includes a program to view and edit files in PDF format for mobile devices and another to synchronize the files. Functioning of the system: the first step in creating the system is the collection of scientific papers in PDF format and their analysis. This step is performed most frequently on a tablet. At this stage, after being read, the papers are cataloged in a system of folders and subfolders, according to individual demands. During this stage, but not exclusively, the PDF files are annotated by the reader. This allows the user to quickly track down interesting information in the review or research process. Modification of the document title is performed at this stage, as well. The second element of the system is the creation of a mirror database in the Google Drive virtual memory. Modified and cataloged papers are synchronized with Google Drive. At this stage, a fully functional scientific information electronic library becomes available online. The third element of the system is a periodic two-way synchronization of data between Google Drive and the tablet, as occasional modification of the files with annotation or recataloging may be performed at both locations. The system architecture is designed to gather, catalog, and analyze scientific publications. All steps are electronic, eliminating paper forms. Indexed files are available for re-reading and modification. The system allows for fast access to full-text search with additional features making research easier. Team collaboration is also possible with full control of user privileges. Particularly important is the safety of collected data. In our opinion, the system exceeds many commercially available applications in terms of functionality and versatility.

  16. A computational framework for the study of confidence in humans and animals

    PubMed Central

    Kepecs, Adam; Mainen, Zachary F.

    2012-01-01

    Confidence judgements, self-assessments about the quality of a subject's knowledge, are considered a central example of metacognition. Prima facie, introspection and self-report appear the only way to access the subjective sense of confidence or uncertainty. Contrary to this notion, overt behavioural measures can be used to study confidence judgements by animals trained in decision-making tasks with perceptual or mnemonic uncertainty. Here, we suggest that a computational approach can clarify the issues involved in interpreting these tasks and provide a much needed springboard for advancing the scientific understanding of confidence. We first review relevant theories of probabilistic inference and decision-making. We then critically discuss behavioural tasks employed to measure confidence in animals and show how quantitative models can help to constrain the computational strategies underlying confidence-reporting behaviours. In our view, post-decision wagering tasks with continuous measures of confidence appear to offer the best available metrics of confidence. Since behavioural reports alone provide a limited window into mechanism, we argue that progress calls for measuring the neural representations and identifying the computations underlying confidence reports. We present a case study using such a computational approach to study the neural correlates of decision confidence in rats. This work shows that confidence assessments may be considered higher order, but can be generated using elementary neural computations that are available to a wide range of species. Finally, we discuss the relationship of confidence judgements to the wider behavioural uses of confidence and uncertainty. PMID:22492750
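
    One elementary computation of the kind the review surveys is reading confidence off a posterior probability in a two-alternative Gaussian evidence model; the sketch below uses assumed parameters and is illustrative only, not the authors' rat decision-confidence model.

      import math

      def choice_and_confidence(evidence, mu=1.0, sigma=1.0):
          # The stimulus is drawn from N(+mu, sigma) or N(-mu, sigma) with equal prior
          # probability; confidence is the posterior probability of the chosen option.
          llr = 2 * mu * evidence / sigma**2        # log-likelihood ratio, "+" vs "-"
          p_positive = 1.0 / (1.0 + math.exp(-llr))
          if p_positive >= 0.5:
              return "+", p_positive
          return "-", 1.0 - p_positive

      for e in (0.1, 0.8, -2.0):
          choice, conf = choice_and_confidence(e)
          print(f"evidence {e:+.1f} -> choice {choice}, confidence {conf:.2f}")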

  17. Fluid Centrality: A Social Network Analysis of Social-Technical Relations in Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Enriquez, Judith Guevarra

    2010-01-01

    In this article, centrality is explored as a measure of computer-mediated communication (CMC) in networked learning. Centrality measure is quite common in performing social network analysis (SNA) and in analysing social cohesion, strength of ties and influence in CMC, and computer-supported collaborative learning research. It argues that measuring…

  18. Activities of the Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1994-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.

  19. The future of scientific workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deelman, Ewa; Peterka, Tom; Altintas, Ilkay

    Today’s computational, experimental, and observational sciences rely on computations that involve many related tasks. The success of a scientific mission often hinges on the computer automation of these workflows. In April 2015, the US Department of Energy (DOE) invited a diverse group of domain and computer scientists from national laboratories supported by the Office of Science, the National Nuclear Security Administration, from industry, and from academia to review the workflow requirements of DOE’s science and national security missions, to assess the current state of the art in science workflows, to understand the impact of emerging extreme-scale computing systems on those workflows, and to develop requirements for automated workflow management in future and existing environments. This article is a summary of the opinions of over 50 leading researchers attending this workshop. We highlight use cases, computing systems, workflow needs and conclude by summarizing the remaining challenges this community sees that inhibit large-scale scientific workflows from becoming a mainstream tool for extreme-scale science.
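
    The core automation idea, executing many related tasks in dependency order, can be illustrated with a few lines of Python; the task names below are invented and the example is far simpler than the workflow systems discussed in the report.

      from graphlib import TopologicalSorter   # Python 3.9+

      # Toy workflow: each task maps to the set of tasks it depends on.
      dependencies = {
          "preprocess": set(),
          "simulate":   {"preprocess"},
          "analyze":    {"simulate"},
          "visualize":  {"analyze"},
          "archive":    {"simulate", "analyze"},
      }

      def run(task):
          print(f"running {task}")

      # Execute every task once all of its dependencies have completed.
      for task in TopologicalSorter(dependencies).static_order():
          run(task)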

  20. Neuromorphic Computing, Architectures, Models, and Applications. A Beyond-CMOS Approach to Future Computing, June 29-July 1, 2016, Oak Ridge, TN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Potok, Thomas; Schuman, Catherine; Patton, Robert

    The White House and Department of Energy have been instrumental in driving the development of a neuromorphic computing program to help the United States continue its lead in basic research into (1) Beyond Exascale—high performance computing beyond Moore’s Law and von Neumann architectures, (2) Scientific Discovery—new paradigms for understanding increasingly large and complex scientific data, and (3) Emerging Architectures—assessing the potential of neuromorphic and quantum architectures. Neuromorphic computing spans a broad range of scientific disciplines from materials science to devices, to computer science, to neuroscience, all of which are required to solve the neuromorphic computing grand challenge. In our workshop we focus on the computer science aspects, specifically from a neuromorphic device through an application. Neuromorphic devices present a very different paradigm to the computer science community from traditional von Neumann architectures, which raises six major questions about building a neuromorphic application from the device level. We used these fundamental questions to organize the workshop program and to direct the workshop panels and discussions. From the white papers, presentations, panels, and discussions, there emerged several recommendations on how to proceed.

  1. Computational science: shifting the focus from tools to models

    PubMed Central

    Hinsen, Konrad

    2014-01-01

    Computational techniques have revolutionized many aspects of scientific research over the last few decades. Experimentalists use computation for data analysis, processing ever bigger data sets. Theoreticians compute predictions from ever more complex models. However, traditional articles do not permit the publication of big data sets or complex models. As a consequence, these crucial pieces of information no longer enter the scientific record. Moreover, they have become prisoners of scientific software: many models exist only as software implementations, and the data are often stored in proprietary formats defined by the software. In this article, I argue that this emphasis on software tools over models and data is detrimental to science in the long term, and I propose a means by which this can be reversed. PMID:25309728

  2. InSAR Scientific Computing Environment

    NASA Astrophysics Data System (ADS)

    Gurrola, E. M.; Rosen, P. A.; Sacco, G.; Zebker, H. A.; Simons, M.; Sandwell, D. T.

    2010-12-01

    The InSAR Scientific Computing Environment (ISCE) is a software development effort in its second year within the NASA Advanced Information Systems and Technology program. The ISCE will provide a new computing environment for geodetic image processing for InSAR sensors that will enable scientists to reduce measurements directly from radar satellites and aircraft to new geophysical products without first requiring them to develop detailed expertise in radar processing methods. The environment can serve as the core of a centralized processing center to bring Level-0 raw radar data up to Level-3 data products, but is adaptable to alternative processing approaches for science users interested in new and different ways to exploit mission data. The NRC Decadal Survey-recommended DESDynI mission will deliver data of unprecedented quantity and quality, making possible global-scale studies in climate research, natural hazards, and Earth's ecosystem. The InSAR Scientific Computing Environment is planned to become a key element in processing DESDynI data into higher level data products and it is expected to enable a new class of analyses that take greater advantage of the long time and large spatial scales of these new data, than current approaches. At the core of ISCE is both legacy processing software from the JPL/Caltech ROI_PAC repeat-pass interferometry package as well as a new InSAR processing package containing more efficient and more accurate processing algorithms being developed at Stanford for this project that is based on experience gained in developing processors for missions such as SRTM and UAVSAR. Around the core InSAR processing programs we are building object-oriented wrappers to enable their incorporation into a more modern, flexible, extensible software package that is informed by modern programming methods, including rigorous componentization of processing codes, abstraction and generalization of data models, and a robust, intuitive user interface with graduated exposure to the levels of sophistication, allowing novices to apply it readily for common tasks and experienced users to mine data with great facility and flexibility. The environment is designed to easily allow user contributions, enabling an open source community to extend the framework into the indefinite future. In this paper we briefly describe both the legacy and the new core processing algorithms and their integration into the new computing environment. We describe the ISCE component and application architecture and the features that permit the desired flexibility, extensibility and ease-of-use. We summarize the state of progress of the environment and the plans for completion of the environment and for its future introduction into the radar processing community.

  3. InSAR Scientific Computing Environment - The Home Stretch

    NASA Astrophysics Data System (ADS)

    Rosen, P. A.; Gurrola, E. M.; Sacco, G.; Zebker, H. A.

    2011-12-01

    The Interferometric Synthetic Aperture Radar (InSAR) Scientific Computing Environment (ISCE) is a software development effort in its third and final year within the NASA Advanced Information Systems and Technology program. The ISCE is a new computing environment for geodetic image processing for InSAR sensors enabling scientists to reduce measurements directly from radar satellites to new geophysical products with relative ease. The environment can serve as the core of a centralized processing center to bring Level-0 raw radar data up to Level-3 data products, but is adaptable to alternative processing approaches for science users interested in new and different ways to exploit mission data. Upcoming international SAR missions will deliver data of unprecedented quantity and quality, making possible global-scale studies in climate research, natural hazards, and Earth's ecosystem. The InSAR Scientific Computing Environment has the functionality to become a key element in processing data from NASA's proposed DESDynI mission into higher level data products, supporting a new class of analyses that take advantage of the long time and large spatial scales of these new data. At the core of ISCE is a new set of efficient and accurate InSAR algorithms. These algorithms are placed into an object-oriented, flexible, extensible software package that is informed by modern programming methods, including rigorous componentization of processing codes, abstraction and generalization of data models. The environment is designed to easily allow user contributions, enabling an open source community to extend the framework into the indefinite future. ISCE supports data from nearly all of the available satellite platforms, including ERS, EnviSAT, Radarsat-1, Radarsat-2, ALOS, TerraSAR-X, and Cosmo-SkyMed. The code applies a number of parallelization techniques and sensible approximations for speed. It is configured to work on modern linux-based computers with gcc compilers and python. ISCE is now a complete, functional package, under configuration management, and with extensive documentation and tested use cases appropriate to geodetic imaging applications. The software has been tested with canonical simulated radar data ("point targets") as well as with a variety of existing satellite data, cross-compared with other software packages. Its extensibility has already been proven by the straightforward addition of polarimetric processing and calibration, and derived filtering and estimation routines associated with polarimetry that supplement the original InSAR geodetic functionality. As of October 2011, the software is available for non-commercial use through UNAVCO's WinSAR consortium.

  4. Crosscut report: Exascale Requirements Reviews, March 9–10, 2017 – Tysons Corner, Virginia. An Office of Science review sponsored by: Advanced Scientific Computing Research, Basic Energy Sciences, Biological and Environmental Research, Fusion Energy Sciences, High Energy Physics, Nuclear Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard; Hack, James; Riley, Katherine

    The mission of the U.S. Department of Energy Office of Science (DOE SC) is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security missions of the United States. To achieve these goals in today’s world requires investments in not only the traditional scientific endeavors of theory and experiment, but also in computational science and the facilities that support large-scale simulation and data analysis. The Advanced Scientific Computing Research (ASCR) program addresses these challenges in the Office of Science. ASCR’s mission is to discover, develop, and deploy computational and networking capabilities to analyze, model, simulate, and predict complex phenomena important to DOE. ASCR supports research in computational science, three high-performance computing (HPC) facilities — the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory and Leadership Computing Facilities at Argonne (ALCF) and Oak Ridge (OLCF) National Laboratories — and the Energy Sciences Network (ESnet) at Berkeley Lab. ASCR is guided by science needs as it develops research programs, computers, and networks at the leading edge of technologies. As we approach the era of exascale computing, technology changes are creating challenges for science programs in SC for those who need to use high performance computing and data systems effectively. Numerous significant modifications to today’s tools and techniques will be needed to realize the full potential of emerging computing systems and other novel computing architectures. To assess these needs and challenges, ASCR held a series of Exascale Requirements Reviews in 2015–2017, one with each of the six SC program offices, and a subsequent Crosscut Review that sought to integrate the findings from each. Participants at the reviews were drawn from the communities of leading domain scientists, experts in computer science and applied mathematics, ASCR facility staff, and DOE program managers in ASCR and the respective program offices. The purpose of these reviews was to identify mission-critical scientific problems within the DOE Office of Science (including experimental facilities) and determine the requirements for the exascale ecosystem that would be needed to address those challenges. The exascale ecosystem includes exascale computing systems, high-end data capabilities, efficient software at scale, libraries, tools, and other capabilities. This effort will contribute to the development of a strategic roadmap for ASCR compute and data facility investments and will help the ASCR Facility Division establish partnerships with Office of Science stakeholders. It will also inform the Office of Science research needs and agenda. The results of the six reviews have been published in reports available on the web at http://exascaleage.org/. This report presents a summary of the individual reports and of common and crosscutting findings, and it identifies opportunities for productive collaborations among the DOE SC program offices.

  5. The International Conference on Vector and Parallel Computing (2nd)

    DTIC Science & Technology

    1989-01-17

    The indexed excerpt consists of fragments from the proceedings' table of contents and abstracts; the recoverable content refers to papers on the computation of the SVD of bidiagonal matrices and on Lattice QCD as a large-scale scientific computation vectorized for the IBM 3090 Vector Facility, with reduced elapsed times and benchmarks on a large number of computers, including the Cray X-MP and Cray 2.

  6. Multi-threading: A new dimension to massively parallel scientific computation

    NASA Astrophysics Data System (ADS)

    Nielsen, Ida M. B.; Janssen, Curtis L.

    2000-06-01

    Multi-threading is becoming widely available for Unix-like operating systems, and the application of multi-threading opens new ways for performing parallel computations with greater efficiency. We here briefly discuss the principles of multi-threading and illustrate the application of multi-threading for a massively parallel direct four-index transformation of electron repulsion integrals. Finally, other potential applications of multi-threading in scientific computing are outlined.
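
    The abstract above outlines how threads can carry out independent pieces of a large computation in parallel. The sketch below is a minimal, generic Python illustration of that idea using the standard-library thread pool; it is not the authors' four-index transformation code, and the block function and sizes are invented placeholders. In CPython, threads chiefly pay off when the per-block work releases the GIL, as native numerical kernels and I/O do.

        # Minimal illustration of multi-threaded work sharing: independent blocks of
        # a computation are handed to a pool of threads and the partial results are
        # combined at the end. The kernel below is a stand-in, not the four-index
        # integral transformation discussed in the abstract above.

        from concurrent.futures import ThreadPoolExecutor

        def process_block(block_id, size=100_000):
            # Stand-in for a numerical kernel applied to one block of data.
            return sum((block_id + i) % 7 for i in range(size))

        with ThreadPoolExecutor(max_workers=4) as pool:
            partial_results = list(pool.map(process_block, range(8)))

        print(sum(partial_results))  # combine the per-block partial results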

  7. Comment on "Most computational hydrology is not reproducible, so is it really science?" by Christopher Hutton et al.

    NASA Astrophysics Data System (ADS)

    Añel, Juan A.

    2017-03-01

    Nowadays, the majority of the scientific community is not aware of the risks and problems associated with an inadequate use of computer systems for research, mostly regarding the reproducibility of scientific results. Such reproducibility can be compromised by the lack of clear standards and by insufficient methodological description of the computational details involved in an experiment. In addition, the inappropriate application or ignorance of copyright laws can have undesirable effects on access to aspects of great importance in the design of experiments, and therefore to the interpretation of results. Plain Language Summary: This article highlights several important issues for ensuring the scientific reproducibility of results within the current scientific framework, going beyond simple documentation. Several specific examples are discussed in the field of hydrological modeling.

  8. When Ethics Constrains Clinical Research: Trial Design of Control Arms in "Greater Than Minimal Risk" Pediatric Trials

    PubMed Central

    de Melo-Martín, Inmaculada; Sondhi, Dolan

    2011-01-01

    For more than three decades clinical research in the United States has been explicitly guided by the idea that ethical considerations must be central to research design and practice. In spite of the centrality of this idea, attempting to balance the sometimes conflicting values of advancing scientific knowledge and protecting human subjects continues to pose challenges. Possible conflicts between the standards of scientific research and those of ethics are particularly salient in relation to trial design. Specifically, the choice of a control arm is an aspect of trial design in which ethical and scientific issues are deeply entwined. Although ethical quandaries related to the choice of control arms may arise when conducting any type of clinical trial, they are conspicuous in early-phase gene transfer trials that involve highly novel approaches and surgical procedures and have children as the research subjects. Because of children's and their parents' vulnerabilities, in trials that investigate therapies for fatal, rare diseases affecting minors, the scientific and ethical concerns related to choosing appropriate controls are particularly significant. In this paper we use direct gene transfer to the central nervous system to treat late infantile neuronal ceroid lipofuscinosis to illustrate some of these ethical issues and explore possible solutions to real and apparent conflicts between scientific and ethical considerations. PMID:21446781

  9. A decade of proteomics accomplished! Central and Eastern European Proteomic Conference (CEEPC) celebrates its 10th Anniversary in Budapest, Hungary.

    PubMed

    Gadher, Suresh Jivan; Drahos, László; Vékey, Károly; Kovarova, Hana

    2017-07-01

    The Central and Eastern European Proteomic Conference (CEEPC) proudly celebrated its 10th Anniversary with an exciting scientific program inclusive of proteome, proteomics and systems biology in Budapest, Hungary. Since 2007, CEEPC has represented state-of-the-art proteomics in and around Central and Eastern Europe, and this series of conferences has become a well-recognized event in the proteomic calendar. Fresher challenges and global healthcare issues such as ageing and chronic diseases are driving clinical and scientific research towards regenerative, reparative and personalized medicine. To this end, proteomics may enable diverse intertwining research fields to reach their end goals. CEEPC will endeavor to facilitate these goals.

  10. Science and Creationism: A View from the National Academy of Sciences.

    ERIC Educational Resources Information Center

    National Academy of Sciences, Washington, DC.

    Five central scientific issues are critical to consideration of the treatment in school curricula of the origin and evolution of the universe and of life on earth. These issues are: (1) the nature of science; (2) scientific evidence on the origin of the universe and the earth; (3) the consistent and validated scientific evidence for biological…

  11. The Relationship in Biology between the Nature of Science and Scientific Inquiry

    ERIC Educational Resources Information Center

    Kremer, Kerstin; Specht, Christiane; Urhahne, Detlef; Mayer, Jürgen

    2014-01-01

    Informed understandings of nature of science and scientific inquiry are generally accepted goals of biology education. This article points out central features of scientific inquiry with relation to biology and the nature of science in general terms and focuses on the relationship of students' inquiry skills in biology and their beliefs on the…

  12. Collaborative WorkBench (cwb): Enabling Experiment Execution, Analysis and Visualization with Increased Scientific Productivity

    NASA Astrophysics Data System (ADS)

    Maskey, Manil; Ramachandran, Rahul; Kuo, Kwo-Sen

    2015-04-01

    The Collaborative WorkBench (CWB) has been successfully developed to support collaborative science algorithm development. It incorporates many features that enable and enhance science collaboration, including support for both asynchronous and synchronous modes of interaction. With the former, members of a team can share a full range of research artifacts, e.g. data, code, visualizations, and even virtual machine images. With the latter, they can engage in dynamic interactions such as notification, instant messaging, file exchange, and, most notably, collaborative programming. CWB also implements behind-the-scenes provenance capture as well as version control to relieve scientists of these chores. Furthermore, it has achieved a seamless integration between researchers' local compute environments and those of the Cloud. CWB has also been successfully extended to support instrument verification and validation. The current practice, adopted by almost every researcher, of downloading data to local compute resources for analysis results in much duplication and inefficiency. CWB leverages Cloud infrastructure to provide a central location for data used by an entire science team, thereby eliminating much of this duplication and waste. Furthermore, use of CWB in concert with this same Cloud infrastructure enables co-located analysis with data, where opportunities for data parallelism can be better exploited, further improving efficiency. With its collaboration-enabling features apposite to steps throughout the scientific process, we expect CWB to fundamentally transform research collaboration and realize maximum science productivity.

  13. Neurological Effects of Blast Injury

    PubMed Central

    Hicks, Ramona R.; Fertig, Stephanie J.; Desrocher, Rebecca E.; Koroshetz, Walter J.; Pancrazio, Joseph J.

    2010-01-01

    Over the last few years, thousands of soldiers and an even greater number of civilians have suffered traumatic injuries due to blast exposure, largely attributed to improvised explosive devices in terrorist and insurgent activities. The use of body armor is allowing soldiers to survive blasts that would otherwise be fatal due to systemic damage. Emerging evidence suggests that exposure to a blast can produce neurological consequences in the brain, but much remains unknown. To elucidate the current scientific basis for understanding blast-induced traumatic brain injury (bTBI), the NIH convened a workshop in April 2008. A multidisciplinary group of neuroscientists, engineers, and clinicians was invited to share insights on bTBI, specifically pertaining to: physics of blast explosions, acute clinical observations and treatments, preclinical and computational models, and lessons from the international community on civilian exposures. This report provides an overview of the state of scientific knowledge of bTBI, drawing from the published literature as well as presentations, discussions, and recommendations from the workshop. One of the major recommendations from the workshop was the need to characterize the effects of blast exposure on clinical neuropathology. A clearer understanding of the human neuropathology would enable validation of preclinical and computational models, which are attempting to simulate blast wave interactions with the central nervous system. Furthermore, the civilian experience with bTBI suggests that polytrauma models incorporating both brain and lung injuries may be more relevant to the study of civilian countermeasures than models with a neurological focus alone. PMID:20453776

  14. Handling the Diversity in the Coming Flood of InSAR Data with the InSAR Scientific Computing Environment

    NASA Astrophysics Data System (ADS)

    Rosen, P. A.; Gurrola, E. M.; Sacco, G. F.; Agram, P. S.; Lavalle, M.; Zebker, H. A.

    2014-12-01

    The NASA ESTO-developed InSAR Scientific Computing Environment (ISCE) provides a computing framework for geodetic image processing for InSAR sensors that is modular, flexible, and extensible, enabling scientists to reduce measurements directly from a diverse array of radar satellites and aircraft to new geophysical products. ISCE can serve as the core of a centralized processing center to bring Level-0 raw radar data up to Level-3 data products, but is adaptable to alternative processing approaches for science users interested in new and different ways to exploit mission data. This is accomplished through rigorous componentization of processing codes, abstraction and generalization of data models, and an XML-based input interface with multi-level, prioritized control of the component configurations depending on the science processing context. The proposed NASA-ISRO SAR (NISAR) Mission would deliver data of unprecedented quantity and quality, making possible global-scale studies in climate research, natural hazards, and Earth's ecosystems. ISCE is planned to become a key element in processing projected NISAR data into higher-level data products, enabling a new class of analyses that take greater advantage of the long time and large spatial scales of these new data than current approaches. NISAR would be but one mission in a constellation of radar satellites in the future delivering such data. ISCE has been incorporated into two prototype cloud-based systems that have demonstrated its elasticity in addressing larger data processing problems in a "production" context and its ability to be controlled by individual science users on the cloud for large data problems.
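
    The ISCE abstract above mentions an XML-based input interface with multi-level, prioritized control of component configurations. The short Python sketch below shows one common way such layered settings can be merged, with higher-priority levels overriding lower ones; the function name and the example keys are hypothetical illustrations, not ISCE's actual interface.

        # Generic sketch of priority-ordered configuration merging: defaults, a
        # profile, and user input are combined, with later (higher-priority) levels
        # overriding earlier ones. Names and keys are invented for illustration.

        def merge_configs(*levels):
            """Merge configuration dicts given from lowest to highest priority."""
            merged = {}
            for level in levels:                      # later levels win
                for key, value in level.items():
                    if isinstance(value, dict) and isinstance(merged.get(key), dict):
                        merged[key] = merge_configs(merged[key], value)
                    else:
                        merged[key] = value
            return merged

        defaults = {"filter": {"strength": 0.5}, "looks": {"range": 4, "azimuth": 2}}
        profile = {"looks": {"range": 8}}
        user_input = {"filter": {"strength": 0.7}}

        print(merge_configs(defaults, profile, user_input))
        # {'filter': {'strength': 0.7}, 'looks': {'range': 8, 'azimuth': 2}}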
  15. 77 FR 11139 - Center for Scientific Review; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-24

    ...: Center for Scientific Review Special Emphasis Panel; "Genetics and Epigenetics of Disease." Date: March... Scientific Review Special Emphasis Panel; Small Business: Cell, Computational, and Molecular Biology. Date...

  16. Establishing the Norms of Scientific Argumentation in Classrooms.

    ERIC Educational Resources Information Center

    Driver, Rosalind; Newton, Paul; Osborne, Jonathan

    2000-01-01

    Develops the case for the inclusion and central role of argument in science education. Discusses the function and purpose of dialogical argument in the social construction of scientific knowledge and interpretation of empirical data. (Author/CCM)

  17. Program Supports Scientific Visualization

    NASA Technical Reports Server (NTRS)

    Keith, Stephan

    1994-01-01

    Primary purpose of the General Visualization System (GVS) computer program is to support scientific visualization of data generated by the panel-method computer program PMARC_12 (inventory number ARC-13362) on a Silicon Graphics Iris workstation. It enables the user to view PMARC geometries and wakes as wire frames or as light-shaded objects. GVS is written in the C language.

  18. Using POGIL to Help Students Learn to Program

    ERIC Educational Resources Information Center

    Hu, Helen H.; Shepherd, Tricia D.

    2013-01-01

    POGIL has been successfully implemented in a scientific computing course to teach science students how to program in Python. Following POGIL guidelines, the authors have developed guided inquiry activities that lead student teams to discover and understand programming concepts. With each iteration of the scientific computing course, the authors…

  19. Ontology-Driven Discovery of Scientific Computational Entities

    ERIC Educational Resources Information Center

    Brazier, Pearl W.

    2010-01-01

    Many geoscientists use modern computational resources, such as software applications, Web services, scientific workflows and datasets that are readily available on the Internet, to support their research and many common tasks. These resources are often shared via human contact and sometimes stored in data portals; however, they are not necessarily…

  20. Scientific Discovery through Advanced Computing in Plasma Science

    NASA Astrophysics Data System (ADS)

    Tang, William

    2005-03-01

    Advanced computing is generally recognized to be an increasingly vital tool for accelerating progress in scientific research during the 21st century. For example, the Department of Energy's "Scientific Discovery through Advanced Computing" (SciDAC) Program was motivated in large measure by the fact that formidable scientific challenges in its research portfolio could best be addressed by utilizing the combination of the rapid advances in supercomputing technology together with the emergence of effective new algorithms and computational methodologies. The imperative is to translate such progress into corresponding increases in the performance of the scientific codes used to model complex physical systems such as those encountered in high-temperature plasma research. If properly validated against experimental measurements and analytic benchmarks, these codes can provide reliable predictive capability for the behavior of a broad range of complex natural and engineered systems. This talk reviews recent progress and future directions for advanced simulations, with some illustrative examples taken from the plasma science applications area. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by the combination of access to powerful new computational resources together with innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning a huge range in time and space scales. In particular, the plasma science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPPs). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating-point computations per second) MPPs to produce three-dimensional, general-geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of plasma turbulence in magnetically confined high-temperature plasmas. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present-generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to the computational science area.

  21. Computer network access to scientific information systems for minority universities

    NASA Astrophysics Data System (ADS)

    Thomas, Valerie L.; Wakim, Nagi T.

    1993-08-01

    The evolution of computer networking technology has led to the establishment of a massive networking infrastructure which interconnects various types of computing resources at many government, academic, and corporate institutions. A large segment of this infrastructure has been developed to facilitate information exchange and resource sharing within the scientific community. The National Aeronautics and Space Administration (NASA) supports both the development and the application of computer networks which provide its community with access to many valuable multi-disciplinary scientific information systems and on-line databases. Recognizing the need to extend the benefits of this advanced networking technology to the under-represented community, the National Space Science Data Center (NSSDC) in the Space Data and Computing Division at the Goddard Space Flight Center has developed the Minority University-Space Interdisciplinary Network (MU-SPIN) Program: a major networking and education initiative for Historically Black Colleges and Universities (HBCUs) and Minority Universities (MUs). In this paper, we briefly explain the various components of the MU-SPIN Program while highlighting how, by providing access to scientific information systems and on-line data, it promotes a higher level of collaboration among faculty, students, and NASA scientists.

  22. [Untitled record; title not given in the source]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pebay, Philippe; Terriberry, Timothy B.; Kolla, Hemanth

    Formulas for incremental or parallel computation of second-order central moments have long been known, and recent extensions of these formulas to univariate and multivariate moments of arbitrary order have been developed. Formulas such as these are of key importance in scenarios where incremental results are required and in parallel and distributed systems where communication costs are high. We survey these recent results, and improve them with arbitrary-order, numerically stable one-pass formulas which we further extend with weighted and compound variants. We also develop a generalized correction factor for standard two-pass algorithms that enables the maintenance of accuracy over nearly the full representable range of the input, avoiding the need for extended-precision arithmetic. We then empirically examine algorithm correctness for pairwise update formulas up to order four, as well as condition number and relative error bounds for eight different central moment formulas, each up to degree six, to address the trade-offs between numerical accuracy and speed of the various algorithms. Finally, we demonstrate the use of the most elaborate of the above-mentioned formulas, with the utilization of the compound moments, for a practical large-scale scientific application.
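
    The record above concerns numerically stable one-pass and pairwise-update formulas for central moments. As a concrete illustration of the general idea, the sketch below implements the classic single-pass update for the mean and second central moment together with the standard pairwise merge of two partial results; it is a textbook second-order example, not the arbitrary-order weighted and compound formulas developed in the paper.

        # Illustrative sketch of one-pass (Welford-style) updates for the mean and
        # the second central moment M2 = sum((x - mean)^2), plus the pairwise merge
        # used to combine partial results from two workers. This shows the general
        # idea behind incremental/parallel central-moment formulas, not the
        # arbitrary-order weighted variants developed in the record above.

        def update(state, x):
            """Fold one new sample x into (count, mean, M2)."""
            n, mean, m2 = state
            n += 1
            delta = x - mean
            mean += delta / n
            m2 += delta * (x - mean)   # uses the updated mean for stability
            return n, mean, m2

        def merge(a, b):
            """Combine the partial (count, mean, M2) states of two data partitions."""
            n_a, mean_a, m2_a = a
            n_b, mean_b, m2_b = b
            n = n_a + n_b
            delta = mean_b - mean_a
            mean = mean_a + delta * n_b / n
            m2 = m2_a + m2_b + delta * delta * n_a * n_b / n
            return n, mean, m2

        data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
        left = right = (0, 0.0, 0.0)
        for x in data[:4]:
            left = update(left, x)
        for x in data[4:]:
            right = update(right, x)

        n, mean, m2 = merge(left, right)
        print(mean, m2 / n)   # population mean and variance: 5.0 4.0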
  23. ISCR Annual Report: Fiscal Year 2004

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGraw, J R

    2005-03-03

    Large-scale scientific computation and all of the disciplines that support and help to validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of computational simulation as a tool of scientific and engineering research is underscored in the November 2004 statement of the Secretary of Energy that "high performance computing is the backbone of the nation's science and technology enterprise". LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use efficiently. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to LLNL's core missions than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In Fiscal Year 2004, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for short- and long-term visits with the aim of encouraging long-term academic research agendas that address LLNL's research priorities. Through such collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's "eyes and ears" in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the "feet and hands" that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing and Applied Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other five institutes of the URP, it navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.

  24. A characterization of workflow management systems for extreme-scale applications

    DOE Office of Scientific and Technical Information (OSTI.GOV); also available through DOE PAGES

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia

    2017-02-16

    The automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures the data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today's computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.
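
    The workflow characterization above rests on the common abstraction of a scientific workflow as tasks plus dependencies, i.e. a directed acyclic graph executed in dependency order. The following minimal Python sketch illustrates that abstraction with the standard library; the task names are invented, and this is not the API of any of the surveyed workflow management systems.

        # A minimal, generic sketch of the core abstraction behind workflow management
        # systems: tasks with dependencies form a directed acyclic graph (DAG), and
        # the engine runs each task only after its predecessors finish. This is an
        # illustration only, not the API of any of the 15 systems surveyed above.

        from graphlib import TopologicalSorter

        def fetch():      print("fetch raw data")
        def preprocess(): print("clean and reformat")
        def simulate():   print("run simulation")
        def analyze():    print("analyze outputs")

        tasks = {"fetch": fetch, "preprocess": preprocess,
                 "simulate": simulate, "analyze": analyze}

        # Each task maps to the set of tasks it depends on.
        dependencies = {
            "preprocess": {"fetch"},
            "simulate":   {"preprocess"},
            "analyze":    {"simulate", "preprocess"},
        }

        for name in TopologicalSorter(dependencies).static_order():
            tasks[name]()   # runs fetch, preprocess, simulate, analyze in a valid order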
  25. A Scientific Analysis of Galaxy Tangential Speed of Revolution Curves III

    NASA Astrophysics Data System (ADS)

    Taff, Laurence

    2015-04-01

    I last reported on my preliminary analysis of 350+ spiral, lenticular, irregular, polar ring, ring, and dwarf elliptical galaxies' tangential speed of revolution curves [TSRCs; and not rotation (sic) curves]. I now know that the consensus opinion in the literature--for which I can find no geometrical, numerical, statistical, nor scientific testing in 2,500+ publications--that the TSRC, v_B(r), in the central bulges of these galaxies is a linear function of the radial distance from the minor axis of symmetry r, is false. For the majority (>98%), v_B(r) is rarely well represented by v_B(r) = ω_B r (for which the unique material model is a homogeneous, oblate spheroid). Discovered via a scientific analysis of the gravitational potential energy computed directly from the observational data, v_B(r) is almost exactly given by v_B^2(r) = (ω_B r)^2 (1 + η r^2) with |η| < 10^-2, and frequently orders of magnitude less. The corresponding mass model is the simplest generalization: a two-component homoeoid. The set of possible periodic orbits, based on circular trigonometric functions, becomes a set of periodic orbits based on the Jacobian elliptic functions. Once again it is possible to prove that the mass-to-light ratio can neither be a constant nor follow the de Vaucouleurs R^(1/4) rule.

  26. 76 FR 28441 - Center for Scientific Review; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-17

    ...: Vision, Cognition and Pain. Date: June 29-30, 2011. Time: 8 a.m. to 6 p.m. Agenda: To review and evaluate...: Center for Scientific Review Special Emphasis Panel; Member Conflict: Cognition and Central Visual...

  27. A Numerical Method for Computing the Transonic Fan Duct Flow over a Centerbody into an Exterior Free Stream - Program Tea-343

    DTIC Science & Technology

    1974-09-24

    The indexed excerpt consists of reference-list fragments citing Boeing Scientific Research Laboratories documents on transonic flows with imbedded shock waves, transonic flow past thin lifting airfoils, and the documentation for program TSONIC, among others.

  28. ISCR FY2005 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keyes, D E; McGraw, J R

    2006-02-02

    Large-scale scientific computation and all of the disciplines that support and help validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of simulation as a fundamental tool of scientific and engineering research is underscored in the President's Information Technology Advisory Committee (PITAC) June 2005 finding that "computational science has become critical to scientific leadership, economic competitiveness, and national security". LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed, most notably the molecular dynamics simulation that sustained more than 100 Teraflop/s and won the 2005 Gordon Bell Prize. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use in an efficient manner. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to the core missions of LLNL than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In FY 2005, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for both brief and extended visits with the aim of encouraging long-term academic research agendas that address LLNL research priorities. Through these collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's "eyes and ears" in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the "hands and feet" that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing Applications and Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other four institutes of the URP, the ISCR navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort. The pages of this annual report summarize the activities of the faculty members, postdoctoral researchers, students, and guests from industry and other laboratories who participated in LLNL's computational mission under the auspices of the ISCR during FY 2005.

  29. Environmentalists and the Computer.

    ERIC Educational Resources Information Center

    Baron, Robert C.

    1982-01-01

    Reviews characteristics, applications, and limitations of computers, including word processing, data/record keeping, scientific and industrial, and educational applications. Discusses misuse of computers and the role of computers in environmental management. (JN)

  30. Programmers, professors, and parasites: credit and co-authorship in computer science.

    PubMed

    Solomon, Justin

    2009-12-01

    This article presents an in-depth analysis of past and present publishing practices in academic computer science to suggest the establishment of a more consistent publishing standard. Historical precedent for academic publishing in computer science is established through the study of anecdotes as well as statistics collected from databases of published computer science papers. After examining these facts alongside information about analogous publishing situations and standards in other scientific fields, the article concludes with a list of basic principles that should be adopted in any computer science publishing standard. These principles would contribute to the reliability and scientific nature of academic publications in computer science and would allow for more straightforward discourse in future publications.

  31. Scaffolding Argumentation about Water Quality: A Mixed-Method Study in a Rural Middle School

    ERIC Educational Resources Information Center

    Belland, Brian R.; Gu, Jiangyue; Armbrust, Sara; Cook, Brant

    2015-01-01

    A common way for students to develop scientific argumentation abilities is through argumentation about socioscientific issues, defined as scientific problems with social, ethical, and moral aspects. Computer-based scaffolding can support students in this process. In this mixed-method study, we examined the use and impact of computer-based…

  32. 48 CFR 9904.410-60 - Illustrations.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... budgets for the other segment should be removed from B's G&A expense pool and transferred to the other...; all home office expenses allocated to Segment H are included in Segment H's G&A expense pool. (2) This... cost of scientific computer operations in its G&A expense pool. The scientific computer is used...

  33. 48 CFR 9904.410-60 - Illustrations.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... budgets for the other segment should be removed from B's G&A expense pool and transferred to the other...; all home office expenses allocated to Segment H are included in Segment H's G&A expense pool. (2) This... cost of scientific computer operations in its G&A expense pool. The scientific computer is used...

  34. America COMPETES Act and the FY2010 Budget

    DTIC Science & Technology

    2009-06-29

    Outstanding Junior Investigator, Fusion Energy Sciences Plasma Physics Junior Faculty Development; Advanced Scientific Computing Research Early Career...the Fusion Energy Sciences Graduate Fellowships. If members of Congress agree with this contention, these America COMPETES Act programs were...

  35. Analytical Performance Modeling and Validation of Intel's Xeon Phi Architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chunduri, Sudheer; Balaprakash, Prasanna; Morozov, Vitali

    Modeling the performance of scientific applications on emerging hardware plays a central role in achieving extreme-scale computing goals. Analytical models that capture the interaction between applications and hardware characteristics are attractive because even a reasonably accurate model can be useful for performance tuning before the hardware is made available. In this paper, we develop a hardware model for Intel's second-generation Xeon Phi architecture, code-named Knights Landing (KNL), for the SKOPE framework. We validate the KNL hardware model by projecting the performance of mini-benchmarks and application kernels. The results show that our KNL model can project the performance with prediction errors of 10% to 20%. The hardware model also provides informative recommendations for code transformations and tuning.
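
    The Xeon Phi record above relies on analytical performance models that combine application characteristics with hardware parameters. A widely used simple example of such a model is the roofline bound, sketched below in Python; the peak rate, bandwidth, and kernel figures are placeholder assumptions, and the sketch is not the SKOPE framework's model.

        # A minimal roofline-style analytical estimate: a kernel's attainable
        # performance is bounded either by the peak floating-point rate or by
        # memory bandwidth times its arithmetic intensity. This is a generic,
        # illustrative model; the hardware numbers below are placeholders, not
        # measured Knights Landing parameters, and this is not the SKOPE framework.

        def attainable_gflops(peak_gflops, bandwidth_gbs, flops, bytes_moved):
            intensity = flops / bytes_moved            # FLOPs per byte of DRAM traffic
            return min(peak_gflops, bandwidth_gbs * intensity)

        # Hypothetical machine: 3000 GFLOP/s peak, 400 GB/s memory bandwidth.
        # Hypothetical kernel: a stream-like triad doing 2 FLOPs per 24 bytes moved.
        print(attainable_gflops(3000.0, 400.0, flops=2, bytes_moved=24))  # ~33 GFLOP/s, bandwidth-bound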
The eight major blocks of the FDS are: power converter, timing and control, engineering data, memory, memory input/output and control, nonimaging data, imaging data, and data output. The FDS incorporates some 4000 components, weighs 17 kg, and uses 35 W of power. General data on the mission and spacecraft are given.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/pages/biblio/1361292-exascale-computing-big-data','SCIGOV-DOEP'); return false;" href="https://www.osti.gov/pages/biblio/1361292-exascale-computing-big-data"><span>Exascale computing and big data</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/pages">DOE PAGES</a></p> <p>Reed, Daniel A.; Dongarra, Jack</p> <p>2015-06-25</p> <p>Scientific discovery and engineering innovation requires unifying traditionally separated high-performance computing and big data analytics. The tools and cultures of high-performance computing and big data analytics have diverged, to the detriment of both; unification is essential to address a spectrum of major research domains. The challenges of scale tax our ability to transmit data, compute complicated functions on that data, or store a substantial part of it; new approaches are required to meet these challenges. Finally, the international nature of science demands further development of advanced computer architectures and global standards for processing data, even as international competition complicates themore » openness of the scientific process.« less</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1361292','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1361292"><span>Exascale computing and big data</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Reed, Daniel A.; Dongarra, Jack</p> <p></p> <p>Scientific discovery and engineering innovation requires unifying traditionally separated high-performance computing and big data analytics. The tools and cultures of high-performance computing and big data analytics have diverged, to the detriment of both; unification is essential to address a spectrum of major research domains. The challenges of scale tax our ability to transmit data, compute complicated functions on that data, or store a substantial part of it; new approaches are required to meet these challenges. 
Finally, the international nature of science demands further development of advanced computer architectures and global standards for processing data, even as international competition complicates themore » openness of the scientific process.« less</p> </li> </ol> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_13");'>13</a></li> <li><a href="#" onclick='return showDiv("page_14");'>14</a></li> <li class="active"><span>15</span></li> <li><a href="#" onclick='return showDiv("page_16");'>16</a></li> <li><a href="#" onclick='return showDiv("page_17");'>17</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_15 --> <div id="page_16" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_14");'>14</a></li> <li><a href="#" onclick='return showDiv("page_15");'>15</a></li> <li class="active"><span>16</span></li> <li><a href="#" onclick='return showDiv("page_17");'>17</a></li> <li><a href="#" onclick='return showDiv("page_18");'>18</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="301"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.fs.usda.gov/treesearch/pubs/1823','TREESEARCH'); return false;" href="https://www.fs.usda.gov/treesearch/pubs/1823"><span>Foreword</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.fs.usda.gov/treesearch/">Treesearch</a></p> <p>H. Michael Rauscher; Richard E. Plant; Alan J. Thomson; Mark J. Twery</p> <p>2000-01-01</p> <p>This article includes the central themes of the keynote speakers for the scientific conference "The Application of Scientific Knowledge to Decisionmaking in Managing Forest Ecosystems." This International Union of Forestry Research Organizations (IUFRO) conference presented the latest developments concerning the entire range of topics dealing with ecosystem...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1076794','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1076794"><span>The Magellan Final Report on Cloud Computing</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>,; Coghlan, Susan; Yelick, Katherine</p> <p></p> <p>The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR), was to investigate the potential role of cloud computing in addressing the computing needs for the DOE Office of Science (SC), particularly related to serving the needs of mid- range computing and future data-intensive computing workloads. A set of research questions was formed to probe various aspects of cloud computing from performance, usability, and cost. To address these questions, a distributed testbed infrastructure was deployed at the Argonne Leadership Computing Facility (ALCF) and the National Energy Research Scientific Computingmore » Center (NERSC). 

  302. The Magellan Final Report on Cloud Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coghlan, Susan; Yelick, Katherine

    The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR), was to investigate the potential role of cloud computing in addressing the computing needs for the DOE Office of Science (SC), particularly related to serving the needs of mid-range computing and future data-intensive computing workloads. A set of research questions was formed to probe various aspects of cloud computing, including performance, usability, and cost. To address these questions, a distributed testbed infrastructure was deployed at the Argonne Leadership Computing Facility (ALCF) and the National Energy Research Scientific Computing Center (NERSC). The testbed was designed to be flexible and capable enough to explore a variety of computing models and hardware design points in order to understand the impact for various scientific applications. During the project, the testbed also served as a valuable resource to application scientists. Applications from a diverse set of projects such as MG-RAST (a metagenomics analysis server), the Joint Genome Institute, the STAR experiment at the Relativistic Heavy Ion Collider, and the Laser Interferometer Gravitational Wave Observatory (LIGO) were used by the Magellan project for benchmarking within the cloud, but the project teams were also able to accomplish important production science utilizing the Magellan cloud resources.

  303. Evolution and Natural Selection: Learning by Playing and Reflecting

    ERIC Educational Resources Information Center

    Herrero, David; del Castillo, Héctor; Monjelat, Natalia; García-Varela, Ana Belén; Checa, Mirian; Gómez, Patricia

    2014-01-01

    Scientific literacy is more than the simple reproduction of traditional school science knowledge and requires a set of skills, among them identifying scientific issues, explaining phenomena scientifically and using scientific evidence. Several studies have indicated that playing computer games in the classroom can support the development of…

  304. An adaptable XML based approach for scientific data management and integration

    NASA Astrophysics Data System (ADS)

    Wang, Fusheng; Thiel, Florian; Furrer, Daniel; Vergara-Niedermayr, Cristobal; Qin, Chen; Hackenberg, Georg; Bourgue, Pierre-Emmanuel; Kaltschmidt, David; Wang, Mo

    2008-03-01

    Increased complexity of scientific research poses new challenges to scientific data management. Meanwhile, scientific collaboration is becoming increasingly important, and it relies on integrating and sharing data from distributed institutions. We develop SciPort, a Web-based platform for scientific data management and integration built on a central-server-based distributed architecture, where researchers can easily collect, publish, and share their complex scientific data across multiple institutions. SciPort provides an XML-based general approach to model complex scientific data by representing them as XML documents. The documents capture not only hierarchical structured data, but also images and raw data through references. In addition, SciPort provides an XML-based hierarchical organization of the overall data space to make it convenient for quick browsing. To provide generalization, schemas and hierarchies are customizable with XML-based definitions, so it is possible to quickly adapt the system to different applications. While each institution can manage documents on a Local SciPort Server independently, selected documents can be published to a Central Server to form a global view of shared data across all sites. By storing documents in a native XML database, SciPort provides high schema extensibility and supports comprehensive queries through XQuery. By providing a unified and effective means for data modeling, data access and customization with XML, SciPort provides a flexible and powerful platform for sharing scientific data for scientific research communities, and has been successfully used in both biomedical research and clinical trials.

  305. An Adaptable XML Based Approach for Scientific Data Management and Integration

    PubMed

    Wang, Fusheng; Thiel, Florian; Furrer, Daniel; Vergara-Niedermayr, Cristobal; Qin, Chen; Hackenberg, Georg; Bourgue, Pierre-Emmanuel; Kaltschmidt, David; Wang, Mo

    2008-02-20

    Increased complexity of scientific research poses new challenges to scientific data management. Meanwhile, scientific collaboration is becoming increasingly important, and it relies on integrating and sharing data from distributed institutions. We develop SciPort, a Web-based platform for scientific data management and integration built on a central-server-based distributed architecture, where researchers can easily collect, publish, and share their complex scientific data across multiple institutions. SciPort provides an XML-based general approach to model complex scientific data by representing them as XML documents. The documents capture not only hierarchical structured data, but also images and raw data through references. In addition, SciPort provides an XML-based hierarchical organization of the overall data space to make it convenient for quick browsing. To provide generalization, schemas and hierarchies are customizable with XML-based definitions, so it is possible to quickly adapt the system to different applications. While each institution can manage documents on a Local SciPort Server independently, selected documents can be published to a Central Server to form a global view of shared data across all sites. By storing documents in a native XML database, SciPort provides high schema extensibility and supports comprehensive queries through XQuery. By providing a unified and effective means for data modeling, data access and customization with XML, SciPort provides a flexible and powerful platform for sharing scientific data for scientific research communities, and has been successfully used in both biomedical research and clinical trials.
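
    The two SciPort records above describe representing hierarchical scientific records as XML documents and querying them in a native XML database through XQuery. A small Python sketch of the same idea, using an invented record schema (not SciPort's) and ElementTree's XPath subset in place of XQuery:

        # Sketch only: a hypothetical hierarchical record, queried with ElementTree.
        import xml.etree.ElementTree as ET

        record_xml = """
        <document id="exp-001">
          <metadata>
            <title>Imaging experiment</title>
            <site>Local Server A</site>
          </metadata>
          <data>
            <image href="scan_0001.tif"/>
            <measurement name="dose" unit="mGy">1.2</measurement>
          </data>
        </document>
        """

        doc = ET.fromstring(record_xml)
        # Find every measurement named "dose" anywhere under the document root.
        for m in doc.findall(".//measurement[@name='dose']"):
            print(m.get("unit"), m.text)   # prints: mGy 1.2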

  306. Cloud Bursting with GlideinWMS: Means to satisfy ever increasing computing needs for Scientific Workflows

    NASA Astrophysics Data System (ADS)

    Mhashilkar, Parag; Tiradani, Anthony; Holzman, Burt; Larson, Krista; Sfiligoi, Igor; Rynge, Mats

    2014-06-01

    Scientific communities have been at the forefront of adopting new technologies and methodologies in computing. Scientific computing has influenced how science is done today, achieving breakthroughs that were impossible to achieve several decades ago. For the past decade several such communities in the Open Science Grid (OSG) and the European Grid Infrastructure (EGI) have been using GlideinWMS to run complex application workflows to effectively share computational resources over the grid. GlideinWMS is a pilot-based workload management system (WMS) that creates, on demand, a dynamically sized overlay HTCondor batch system on grid resources. At present, the computational resources shared over the grid are just adequate to sustain the computing needs. We envision that the complexity of the science driven by "Big Data" will further push the need for computational resources. To fulfill their increasing demands and/or to run specialized workflows, some of the big communities like CMS are investigating the use of cloud computing as Infrastructure-as-a-Service (IaaS) with GlideinWMS as a potential alternative to fill the void. Similarly, communities with no previous access to computing resources can use GlideinWMS to set up a batch system on the cloud infrastructure. To enable this, the architecture of GlideinWMS has been extended to enable support for interfacing GlideinWMS with different scientific and commercial cloud providers like HLT, FutureGrid, FermiCloud and Amazon EC2. In this paper, we describe a solution for cloud bursting with GlideinWMS. The paper describes the approach, architectural changes and lessons learned while enabling support for cloud infrastructures in GlideinWMS.

  307. Cloud Bursting with GlideinWMS: Means to satisfy ever increasing computing needs for Scientific Workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mhashilkar, Parag; Tiradani, Anthony; Holzman, Burt

    Scientific communities have been at the forefront of adopting new technologies and methodologies in computing. Scientific computing has influenced how science is done today, achieving breakthroughs that were impossible to achieve several decades ago. For the past decade several such communities in the Open Science Grid (OSG) and the European Grid Infrastructure (EGI) have been using GlideinWMS to run complex application workflows to effectively share computational resources over the grid. GlideinWMS is a pilot-based workload management system (WMS) that creates, on demand, a dynamically sized overlay HTCondor batch system on grid resources. At present, the computational resources shared over the grid are just adequate to sustain the computing needs. We envision that the complexity of the science driven by "Big Data" will further push the need for computational resources. To fulfill their increasing demands and/or to run specialized workflows, some of the big communities like CMS are investigating the use of cloud computing as Infrastructure-as-a-Service (IaaS) with GlideinWMS as a potential alternative to fill the void. Similarly, communities with no previous access to computing resources can use GlideinWMS to set up a batch system on the cloud infrastructure. To enable this, the architecture of GlideinWMS has been extended to enable support for interfacing GlideinWMS with different scientific and commercial cloud providers like HLT, FutureGrid, FermiCloud and Amazon EC2. In this paper, we describe a solution for cloud bursting with GlideinWMS. The paper describes the approach, architectural changes and lessons learned while enabling support for cloud infrastructures in GlideinWMS.
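
    The cloud-bursting idea in the two GlideinWMS records above is, at its core, an elasticity decision: when demand exceeds what the grid can absorb, provision extra capacity from a cloud provider and run pilots there. A conceptual Python sketch with made-up numbers and stubbed provisioning (this is not the GlideinWMS code or API):

        # Conceptual sketch of a cloud-bursting decision; all values are invented.
        def plan_cloud_burst(idle_jobs, grid_slots_free, max_cloud_instances, jobs_per_instance=8):
            """Return how many cloud instances to request for demand the grid cannot absorb."""
            unmet_jobs = max(0, idle_jobs - grid_slots_free)
            needed = -(-unmet_jobs // jobs_per_instance)      # ceiling division
            return min(needed, max_cloud_instances)

        def request_instances(n):
            # Stub: a real system would call a cloud API and start pilot jobs there.
            print(f"requesting {n} cloud instances")

        if __name__ == "__main__":
            burst = plan_cloud_burst(idle_jobs=500, grid_slots_free=120, max_cloud_instances=40)
            request_instances(burst)   # prints: requesting 40 cloud instances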

  308. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lucas, Robert; Ang, James; Bergman, Keren

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and to analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable exascale system.

  309. Idea Paper: The Lifecycle of Software for Scientific Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dubey, Anshu; McInnes, Lois C.

    The software lifecycle is a well-researched topic that has produced many models to meet the needs of different types of software projects. However, one class of projects, software development for scientific computing, has received relatively little attention from lifecycle researchers. In particular, software for end-to-end computations for obtaining scientific results has received few lifecycle proposals and no formalization of a development model. An examination of development approaches employed by the teams implementing large multicomponent codes reveals a great deal of similarity in their strategies. This idea paper formalizes these related approaches into a lifecycle model for end-to-end scientific application software, featuring loose coupling between submodels for development of infrastructure and scientific capability. We also invite input from stakeholders to converge on a model that captures the complexity of this development process and provides needed lifecycle guidance to the scientific software community.

  310. SKYSHINE-II procedure: calculation of the effects of structure design on neutron, primary gamma-ray and secondary gamma-ray dose rates in air

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lampley, C.M.

    1979-01-01

    An updated version of the SKYSHINE Monte Carlo procedure has been developed. The new computer code, SKYSHINE-II, provides a substantial increase in versatility in that the program possesses the ability to address three types of point-isotropic radiation sources: (1) primary gamma rays, (2) neutrons, and (3) secondary gamma rays. In addition, the emitted radiation may now be characterized by an energy emission spectrum, a product of a new energy-dependent atmospheric transmission data base developed by Radiation Research Associates, Inc. for each of the three source types described above. Most of the computational options present in the original program have been retained in the new version. Hence, the SKYSHINE-II computer code provides a versatile and viable tool for the analysis of the radiation environment in the vicinity of a building structure containing radiation sources, situated within the confines of a nuclear power plant. This report describes many of the calculational methods employed within the SKYSHINE-II program. A brief description of the new data base is included. Utilization instructions for the program are provided for operation of the SKYSHINE-II code on the Brookhaven National Laboratory Central Scientific Computing Facility. A listing of the source decks, block data routines, and the new atmospheric transmission data base are provided in the appendices of the report.

  311. Mean PB To Failure - Initial results from a long-term study of disk storage patterns at the RACF

    NASA Astrophysics Data System (ADS)

    Caramarcu, C.; Hollowell, C.; Rao, T.; Strecker-Kellogg, W.; Wong, A.; Zaytsev, S. A.

    2015-12-01

    The RACF (RHIC-ATLAS Computing Facility) has operated a large, multi-purpose dedicated computing facility since the mid-1990s, serving a worldwide, geographically diverse scientific community that is a major contributor to various HEPN projects. A central component of the RACF is the Linux-based worker node cluster that is used for both computing and data storage purposes. It currently has nearly 50,000 computing cores and over 23 PB of storage capacity distributed over 12,000+ (non-SSD) disk drives. The majority of the 12,000+ disk drives provide a cost-effective solution for dCache/XRootD-managed storage, and a key concern is the reliability of this solution over the lifetime of the hardware, particularly as the number of disk drives and the storage capacity of individual drives grow. We report initial results of a long-term study to measure lifetime PB read/written to disk drives in the worker node cluster. We discuss the historical disk drive mortality rate, disk drive manufacturers' published MPTF (Mean PB to Failure) data and how they are correlated to our results. The results help the RACF understand the productivity and reliability of its storage solutions and have implications for other highly-available storage systems (NFS, GPFS, CVMFS, etc.) with large I/O requirements.

  312. The Research and Implementation of MUSER CLEAN Algorithm Based on OpenCL

    NASA Astrophysics Data System (ADS)

    Feng, Y.; Chen, K.; Deng, H.; Wang, F.; Mei, Y.; Wei, S. L.; Dai, W.; Yang, Q. P.; Liu, Y. B.; Wu, J. P.

    2017-03-01

    There is an urgent need to carry out high-performance data processing on a single machine in the development of astronomical software. However, due to differing machine configurations, traditional programming techniques such as multi-threading and CUDA (Compute Unified Device Architecture)+GPU (Graphic Processing Unit) have obvious limitations in portability and seamlessness between different operating systems. The OpenCL (Open Computing Language) used in the development of the MUSER (MingantU SpEctral Radioheliograph) data processing system is introduced, and the Högbom CLEAN algorithm is re-implemented as a parallel CLEAN algorithm using the Python language and the PyOpenCL extension package. The experimental results show that the CLEAN algorithm based on OpenCL has approximately equal operating efficiency compared with the former CLEAN algorithm based on CUDA. More importantly, data processing in a CPU (Central Processing Unit)-only environment can also achieve high performance, which solves the problem of environmental dependence on CUDA+GPU. Overall, the research improves the adaptability of the system with emphasis on the performance of MUSER image CLEAN computing. In the meanwhile, the realization of OpenCL in MUSER proves its availability in scientific data processing. In view of the high-performance computing features of OpenCL in heterogeneous environments, it will probably become a preferred technology in future high-performance astronomical software development.
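
    The record above describes re-implementing the Högbom CLEAN loop in Python with PyOpenCL so the same code runs on GPUs or on CPU-only machines. A minimal, illustrative PyOpenCL sketch (not the MUSER code) of one device-side step, subtracting a scaled point-source response from a flattened residual image; the array sizes and gain are made up:

        # Sketch only: one CLEAN-style subtraction step on an OpenCL device.
        import numpy as np
        import pyopencl as cl

        ctx = cl.create_some_context()
        queue = cl.CommandQueue(ctx)

        kernel_src = """
        __kernel void subtract_component(__global float *residual,
                                         __global const float *psf,
                                         const float scale)
        {
            int i = get_global_id(0);
            residual[i] -= scale * psf[i];
        }
        """
        prg = cl.Program(ctx, kernel_src).build()

        n = 1024
        residual = np.random.rand(n).astype(np.float32)   # dirty-image residual (flattened)
        psf = np.random.rand(n).astype(np.float32)        # point-spread function, peak-aligned

        mf = cl.mem_flags
        res_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=residual)
        psf_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=psf)

        gain = np.float32(0.1 * residual.max())           # loop gain times current peak
        prg.subtract_component(queue, (n,), None, res_buf, psf_buf, gain)
        cl.enqueue_copy(queue, residual, res_buf)
        print(residual[:4])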

  313. Neuroimaging of Central Sensitivity Syndromes: Key Insights from the Scientific Literature

    PubMed Central

    Walitt, Brian; Čeko, Marta; Gracely, John L.; Gracely, Richard H.

    2016-01-01

    Central sensitivity syndromes are characterized by distressing symptoms, such as pain and fatigue, in the absence of clinically obvious pathology. The scientific underpinnings of these disorders are not currently known. Modern neuroimaging techniques promise new insights into mechanisms mediating these postulated syndromes. We review the results of neuroimaging applied to five central sensitivity syndromes: fibromyalgia, chronic fatigue syndrome, irritable bowel syndrome, temporomandibular joint disorder, and vulvodynia syndrome. Neuroimaging studies of basal metabolism, anatomic constitution, molecular constituents, evoked neural activity, and treatment effect are compared across all of these syndromes. Evoked sensory paradigms reveal sensory augmentation to both painful and non-painful stimulation. This is a transformative observation for these syndromes, which were historically considered to be completely hysterical or feigned in origin. However, whether sensory augmentation represents the cause of these syndromes, a predisposing factor, an endophenotype, or an epiphenomenon cannot be discerned from the current literature. Further, the results from cross-sectional neuroimaging studies of basal activity, anatomy, and molecular constituency are extremely heterogeneous within and between the syndromes. A defining neuroimaging "signature" cannot be discerned for any of the particular syndromes or for an over-arching central sensitization mechanism common to all of the syndromes. Several issues confound initial attempts to meaningfully measure treatment effects in these syndromes. At this time, the existence of "central sensitivity syndromes" is based more soundly on clinical and epidemiological evidence. A coherent picture of a "central sensitization" mechanism that bridges across all of these syndromes does not emerge from the existing scientific evidence. PMID:26717948

  314. Introduction to computers: Reference guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ligon, F.V.

    1995-04-01

    The "Introduction to Computers" program establishes formal partnerships with local school districts and community-based organizations, introduces computer literacy to precollege students and their parents, and encourages students to pursue Scientific, Mathematical, Engineering, and Technical careers (SET). Hands-on assignments are given in each class, reinforcing the lesson taught. In addition, the program is designed to broaden the knowledge base of teachers in scientific/technical concepts, and Brookhaven National Laboratory continues to act as a liaison, offering educational outreach to diverse community organizations and groups. This manual contains the teacher's lesson plans and the student documentation for this introductory computer course.

  315. The Effects of Computer-Supported Inquiry-Based Learning Methods and Peer Interaction on Learning Stellar Parallax

    ERIC Educational Resources Information Center

    Ruzhitskaya, Lanika

    2011-01-01

    The presented research study investigated the effects of computer-supported inquiry-based learning and peer interaction methods on the effectiveness of learning a scientific concept. The stellar parallax concept was selected as a basic, and yet important in astronomy, scientific construct, which is based on a straightforward relationship of several…

  316. Using Cloud-Computing Applications to Support Collaborative Scientific Inquiry: Examining Pre-Service Teachers' Perceived Barriers to Integration

    ERIC Educational Resources Information Center

    Donna, Joel D.; Miller, Brant G.

    2013-01-01

    Technology plays a crucial role in facilitating collaboration within the scientific community. Cloud-computing applications, such as Google Drive, can be used to model such collaboration and support inquiry within the secondary science classroom. Little is known about pre-service teachers' beliefs related to the envisioned use of collaborative,…

  317. The Goal Specificity Effect on Strategy Use and Instructional Efficiency during Computer-Based Scientific Discovery Learning

    ERIC Educational Resources Information Center

    Kunsting, Josef; Wirth, Joachim; Paas, Fred

    2011-01-01

    Using a computer-based scientific discovery learning environment on buoyancy in fluids, we investigated the "effects of goal specificity" (nonspecific goals vs. specific goals) for two goal types (problem solving goals vs. learning goals) on "strategy use" and "instructional efficiency". Our empirical findings close an important research gap,…

  318. Instrumentation for Scientific Computing in Neural Networks, Information Science, Artificial Intelligence, and Applied Mathematics

    DTIC Science & Technology

    1987-10-01

    Instrumentation for scientific computing in neural networks, information science, artificial intelligence, and... instrumentation grant to purchase equipment for support of research in neural networks, information science, artificial intelligence, and applied mathematics... in Neural Networks, Information Science, Artificial Intelligence, and Applied Mathematics. Contract AFOSR 86-0282. Principal Investigator: Stephen

  319. Translations on Eastern Europe, Scientific Affairs, Number 546

    DTIC Science & Technology

    1977-05-27

    Tsurvenkov, Prof. Tsvyatko Petkov, Eng. Kiril Dochev, Eng. Georgi Dimitrov, Eng. Dimitur Petrov ... Central Council of the Scientific and Technical... first deputy chairman of the Council of Ministers; Kiril Zarev, deputy chairman of the Council of Ministers and the chairman of the State Committee... deputy chairman: Engineer Georgi Dimitrov, deputy director for technical affairs and chairman of the scientific and technical society at the Vazovski

  320. The Observation of Bahasa Indonesia Official Computer Terms Implementation in Scientific Publication

    NASA Astrophysics Data System (ADS)

    Gunawan, D.; Amalia, A.; Lydia, M. S.; Muthaqin, M. I.

    2018-03-01

    The government of the Republic of Indonesia issued a regulation to replace the foreign-language computer terms that had been used earlier with official computer terms in Bahasa Indonesia. This regulation was stipulated in Presidential Decree No. 2 of 2001 concerning the introduction of official computer terms in Bahasa Indonesia (known as Senarai Padanan Istilah/SPI). After sixteen years, the people of Indonesia, particularly academics, should have implemented the official computer terms in their official publications. This observation is conducted to discover the implementation of official computer term usage in scientific publications written in Bahasa Indonesia. The data source used in this observation is publications by academics, particularly in the computer science field. The method used in the observation is divided into four stages. The first stage is metadata harvesting by using the Open Archive Initiative - Protocol for Metadata Harvesting (OAI-PMH). The second is converting the harvested documents (in PDF format) to plain text. The third stage is text preprocessing in preparation for string matching. The final stage is searching for the official computer terms, based on 629 SPI terms, using the Boyer-Moore algorithm. We observed that there are 240,781 foreign computer terms in 1,156 scientific publications from six universities. This result shows that the foreign computer terms are still widely used by academics.
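
    The final stage described above is exact string matching of SPI terms against the extracted plain text. A minimal sketch of that stage using the Boyer-Moore-Horspool variant, with a hypothetical three-term list standing in for the 629 SPI entries:

        # Sketch only: Horspool variant of Boyer-Moore matching over plain text.
        def horspool_search(text, pattern):
            """Return the index of the first occurrence of pattern in text, or -1."""
            m, n = len(pattern), len(text)
            if m == 0 or m > n:
                return -1 if m > n else 0
            # Bad-character table: distance from last occurrence to the pattern end.
            shift = {c: m - i - 1 for i, c in enumerate(pattern[:-1])}
            i = m - 1
            while i < n:
                k = 0
                while k < m and pattern[m - 1 - k] == text[i - k]:
                    k += 1
                if k == m:
                    return i - m + 1
                i += shift.get(text[i], m)
            return -1

        # Hypothetical term list standing in for the 629 SPI entries.
        spi_terms = ["tetikus", "unduh", "unggah"]
        document_text = "Pengguna dapat unduh berkas melalui aplikasi."
        found = [t for t in spi_terms if horspool_search(document_text, t) != -1]
        print(found)   # prints: ['unduh']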

  321. Evaluating the Efficacy of the Cloud for Cluster Computation

    NASA Technical Reports Server (NTRS)

    Knight, David; Shams, Khawaja; Chang, George; Soderstrom, Tom

    2012-01-01

    Computing requirements vary by industry, and it follows that NASA and other research organizations have computing demands that fall outside the mainstream. While cloud computing made rapid inroads for tasks such as powering web applications, performance issues on highly distributed tasks hindered early adoption for scientific computation. One venture to address this problem is Nebula, NASA's homegrown cloud project tasked with delivering science-quality cloud computing resources. However, another industry development is Amazon's high-performance computing (HPC) instances on Elastic Compute Cloud (EC2) that promise improved performance for cluster computation. This paper presents results from a series of benchmarks run on Amazon EC2 and discusses the efficacy of current commercial cloud technology for running scientific applications across a cluster. In particular, a 240-core cluster of cloud instances achieved 2 TFLOPS on High-Performance Linpack (HPL) at 70% of theoretical computational performance. The cluster's local network also demonstrated sub-100 μs inter-process latency with sustained inter-node throughput in excess of 8 Gbps. Beyond HPL, a real-world Hadoop image processing task from NASA's Lunar Mapping and Modeling Project (LMMP) was run on a 29-instance cluster to process lunar and Martian surface images with sizes on the order of tens of gigapixels. These results demonstrate that while not a rival of dedicated supercomputing clusters, commercial cloud technology is now a feasible option for moderately demanding scientific workloads.
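
    A quick back-of-the-envelope check of the HPL figures quoted above (not taken from the paper): 2 TFLOPS at 70% efficiency implies a theoretical peak near 2.9 TFLOPS, or roughly 12 GFLOPS per core for the 240-core cluster.

        # Sketch only: implied peak from the reported HPL result and efficiency.
        measured_tflops = 2.0        # reported HPL result
        efficiency = 0.70            # reported fraction of theoretical peak
        cores = 240

        theoretical_tflops = measured_tflops / efficiency
        gflops_per_core = theoretical_tflops * 1000 / cores
        print(f"implied peak: {theoretical_tflops:.2f} TFLOPS, "
              f"{gflops_per_core:.1f} GFLOPS per core")
        # prints: implied peak: 2.86 TFLOPS, 11.9 GFLOPS per core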

  322. Crystallographic Lattice Boltzmann Method

    PubMed Central

    Namburi, Manjusha; Krithivasan, Siddharth; Ansumali, Santosh

    2016-01-01

    Current approaches to Direct Numerical Simulation (DNS) are computationally quite expensive for most realistic scientific and engineering applications of Fluid Dynamics such as automobiles or atmospheric flows. The Lattice Boltzmann Method (LBM), with its simplified kinetic descriptions, has emerged as an important tool for simulating hydrodynamics. In a heterogeneous computing environment, it is often preferred due to its flexibility and better parallel scaling. However, direct simulation of realistic applications, without the use of turbulence models, remains a distant dream even with highly efficient methods such as LBM. In LBM, a fictitious lattice with suitable isotropy in the velocity space is considered to recover Navier-Stokes hydrodynamics in the macroscopic limit. The same lattice is mapped onto a Cartesian grid for spatial discretization of the kinetic equation. In this paper, we present an inverted argument of the LBM, by making spatial discretization the central theme. We argue that the optimal spatial discretization for LBM is a Body Centered Cubic (BCC) arrangement of grid points. We illustrate an order-of-magnitude gain in efficiency for LBM and thus a significant progress towards feasibility of DNS for realistic flows. PMID:27251098
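
    The BCC arrangement argued for above amounts to two interleaved simple-cubic grids, the second shifted by half the lattice spacing along each axis. A small NumPy sketch (not the authors' code) that generates such a point set:

        # Sketch only: BCC lattice sites as two offset simple-cubic grids.
        import numpy as np

        def bcc_points(n, spacing=1.0):
            """Return BCC lattice points filling an n x n x n box."""
            idx = np.arange(n)
            corner = np.array(np.meshgrid(idx, idx, idx, indexing="ij")).reshape(3, -1).T * spacing
            center = corner + 0.5 * spacing          # body-centred copy of the grid
            return np.vstack([corner, center])

        pts = bcc_points(4)
        print(pts.shape)   # (128, 3): twice as many sites as the 4**3 cubic grid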

  323. Tool for Analysis and Reduction of Scientific Data

    NASA Technical Reports Server (NTRS)

    James, Mark

    2006-01-01

    The Automated Scheduling and Planning Environment (ASPEN) computer program has been updated to version 3.0. ASPEN as a whole (up to version 2.0) has been summarized, and selected aspects of ASPEN have been discussed in several previous NASA Tech Briefs articles. Restated briefly, ASPEN is a modular, reconfigurable, application software framework for solving batch problems that involve reasoning about time, activities, states, and resources. Applications of ASPEN can include planning spacecraft missions, scheduling of personnel, and managing supply chains, inventories, and production lines. ASPEN 3.0 can be customized for a wide range of applications and for a variety of computing environments that include various central processing units and random-access memories. Domain-specific reasoning modules (e.g., modules for determining orbits for spacecraft) can easily be plugged into ASPEN 3.0. Improvements over other, similar software that have been incorporated into ASPEN 3.0 include a provision for more expressive time-line values, new parsing capabilities afforded by an ASPEN language based on Extensible Markup Language, improved search capabilities, and improved interfaces to other, utility-type software (notably including MATLAB).

  324. Kepler Science Operations Center Architecture

    NASA Technical Reports Server (NTRS)

    Middour, Christopher; Klaus, Todd; Jenkins, Jon; Pletcher, David; Cote, Miles; Chandrasekaran, Hema; Wohler, Bill; Girouard, Forrest; Gunter, Jay P.; Uddin, Kamal

    2010-01-01

    We give an overview of the operational concepts and architecture of the Kepler Science Data Pipeline. Designed, developed, operated, and maintained by the Science Operations Center (SOC) at NASA Ames Research Center, the Kepler Science Data Pipeline is a central element of the Kepler Ground Data System. The SOC charter is to analyze stellar photometric data from the Kepler spacecraft and report results to the Kepler Science Office for further analysis. We describe how this is accomplished via the Kepler Science Data Pipeline, including the hardware infrastructure, scientific algorithms, and operational procedures. The SOC consists of an office at Ames Research Center, software development and operations departments, and a data center that hosts the computers required to perform data analysis. We discuss the high-performance, parallel computing software modules of the Kepler Science Data Pipeline that perform transit photometry, pixel-level calibration, systematic error-correction, attitude determination, stellar target management, and instrument characterization. We explain how data processing environments are divided to support operational processing and test needs. We explain the operational timelines for data processing and the data constructs that flow into the Kepler Science Data Pipeline.

  325. 31 CFR 285.7 - Salary offset

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Secretary, has waived certain requirements of the Computer Matching and Privacy Protection Act of 1988, 5 U... process known as centralized salary offset computer matching, identify Federal employees who owe delinquent nontax debt to the United States. Centralized salary offset computer matching is the computerized...

  326. Heterogeneous High Throughput Scientific Computing with APM X-Gene and Intel Xeon Phi

    NASA Astrophysics Data System (ADS)

    Abdurachmanov, David; Bockelman, Brian; Elmer, Peter; Eulisse, Giulio; Knight, Robert; Muzaffar, Shahzad

    2015-05-01

    Electrical power requirements will be a constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics. Performance-per-watt is a critical metric for the evaluation of computer architectures for cost-efficient computing. Additionally, future performance growth will come from heterogeneous, many-core, and high computing density platforms with specialized processors. In this paper, we examine the Intel Xeon Phi Many Integrated Cores (MIC) co-processor and Applied Micro X-Gene ARMv8 64-bit low-power server system-on-a-chip (SoC) solutions for scientific computing applications. We report our experience on software porting, performance and energy efficiency and evaluate the potential for use of such technologies in the context of distributed computing systems such as the Worldwide LHC Computing Grid (WLCG).

  327. Computer applications. Annual report, October 1, 1977-September 30, 1978. [LASL data base management activities regarding agricultural phenomena in southwestern US]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanders, W.M.; Campbell, C.L.; Lester, J.V.

    1979-09-01

    The Los Alamos Scientific Laboratory is funded by the US Department of Agriculture to apply scientific and computer technology to solve agricultural problems. This report summarizes work during the period October 1, 1977, through September 30, 1978, on the application of computer technology to three areas: (1) surveillance of slaughterplants in Texas; (2) a pilot study of the New Mexico Brucellosis Eradication Program; and (3) the Market Cattle Identification program in Texas.

  328. Applications of Multiconductor Transmission Line Theory to the Prediction of Cable Coupling. Volume 7. Digital Computer Programs for the Analysis of Multiconductor Transmission Lines

    DTIC Science & Technology

    1977-07-01

    on an IBM 370/165 computer at The University of Kentucky using the Fortran IV, G level compiler and should be easily implemented on other computers... order as the columns of T. 3.5.3 Subroutines NROOT and EIGEN: Subroutines NROOT and EIGEN are a set of subroutines from the IBM Scientific Subroutine... November 1975). [10] System/360 Scientific Subroutine Package, Version III, Fifth Edition (August 1970), IBM Corporation, Technical Publications

  329. USSR and Eastern Europe Scientific Abstracts, Cybernetics, Computers, and Automation Technology, Number 27

    DTIC Science & Technology

    1977-05-10

    apply this method of forecasting in the solution of all major scientific-technical problems of the national economy. Citing the slow... the future, however, computers will "mature" and learn to recognize patterns in what amounts to a much more complex language, the language of visual... images. Photoelectronic tracking devices or "eyes" will allow the computer to take in information in a much more complex form and to perform opera...

  330. Comments on the Development of Computational Mathematics in Czechoslovakia and in the USSR

    DTIC Science & Technology

    1987-03-01

    The talk is an invited lecture at the ACM Conference on the History of Scientific and Numeric Computations, May 13-15, 1987, Princeton, New Jersey. It presents some basic subjective observations about the history of numerical methods in...

  331. New project to support scientific collaboration electronically

    NASA Astrophysics Data System (ADS)

    Clauer, C. R.; Rasmussen, C. E.; Niciejewski, R. J.; Killeen, T. L.; Kelly, J. D.; Zambre, Y.; Rosenberg, T. J.; Stauning, P.; Friis-Christensen, E.; Mende, S. B.; Weymouth, T. E.; Prakash, A.; McDaniel, S. E.; Olson, G. M.; Finholt, T. A.; Atkins, D. E.

    A new multidisciplinary effort is linking research in the upper atmospheric and space, computer, and behavioral sciences to develop a prototype electronic environment for conducting team science worldwide. A real-world electronic collaboration testbed has been established to support scientific work centered around the experimental operations being conducted with instruments from the Sondrestrom Upper Atmospheric Research Facility in Kangerlussuaq, Greenland. Such group computing environments will become an important component of the National Information Infrastructure initiative, which is envisioned as the high-performance communications infrastructure to support national scientific research.

  332. Social Networking Adapted for Distributed Scientific Collaboration

    NASA Technical Reports Server (NTRS)

    Karimabadi, Homa

    2012-01-01

    Sci-Share is a social networking site with novel, specially designed feature sets to enable simultaneous remote collaboration and sharing of large data sets among scientists. The site will include not only the standard features found on popular consumer-oriented social networking sites such as Facebook and Myspace, but also a number of powerful tools to extend its functionality to a science collaboration site. A Virtual Observatory is a promising technology for making data accessible from various missions and instruments through a Web browser. Sci-Share augments services provided by Virtual Observatories by enabling distributed collaboration and sharing of downloaded and/or processed data among scientists. This will, in turn, increase science returns from NASA missions. Sci-Share also enables better utilization of NASA's high-performance computing resources by providing an easy and central mechanism to access and share large files on users' space or those saved on mass storage. The most common means of remote scientific collaboration today remains the trio of e-mail for electronic communication, FTP for file sharing, and personalized Web sites for dissemination of papers and research results. Each of these tools has well-known limitations. Sci-Share transforms the social networking paradigm into a scientific collaboration environment by offering powerful tools for cooperative discourse and digital content sharing. Sci-Share differentiates itself by serving as an online repository for users' digital content with the following unique features: a) Sharing of any file type, any size, from anywhere; b) Creation of projects and groups for controlled sharing; c) Module for sharing files on HPC (High Performance Computing) sites; d) Universal accessibility of staged files as embedded links on other sites (e.g. Facebook) and tools (e.g. e-mail); e) Drag-and-drop transfer of large files, replacing awkward e-mail attachments (and file size limitations); f) Enterprise-level data and messaging encryption; and g) Easy-to-use intuitive workflow.

  333. Metadata Management on the SCEC PetaSHA Project: Helping Users Describe, Discover, Understand, and Use Simulation Data in a Large-Scale Scientific Collaboration

    NASA Astrophysics Data System (ADS)

    Okaya, D.; Deelman, E.; Maechling, P.; Wong-Barnum, M.; Jordan, T. H.; Meyers, D.

    2007-12-01

    Large scientific collaborations, such as the SCEC Petascale Cyberfacility for Physics-based Seismic Hazard Analysis (PetaSHA) Project, involve interactions between many scientists who exchange ideas and research results. These groups must organize, manage, and make accessible their community materials of observational data, derivative (research) results, computational products, and community software. The integration of scientific workflows as a paradigm to solve complex computations provides advantages of efficiency, reliability, repeatability, choices, and ease of use. The underlying resource needed for a scientific workflow to function and create discoverable and exchangeable products is the construction, tracking, and preservation of metadata. In the scientific workflow environment there is a two-tier structure of metadata. Workflow-level metadata and provenance describe operational steps, identity of resources, execution status, and product locations and names. Domain-level metadata essentially define the scientific meaning of data, codes and products. To a large degree the metadata at these two levels are separate. However, between these two levels is a subset of metadata produced at one level but needed by the other. This crossover metadata suggests that some commonality in metadata handling is needed. SCEC researchers are collaborating with computer scientists at SDSC, the USC Information Sciences Institute, and Carnegie Mellon Univ. in order to perform earthquake science using high-performance computational resources. A primary objective of the "PetaSHA" collaboration is to perform physics-based estimations of strong ground motion associated with real and hypothetical earthquakes located within Southern California. Construction of 3D earth models, earthquake representations, and numerical simulation of seismic waves are key components of these estimations. Scientific workflows are used to orchestrate the sequences of scientific tasks and to access distributed computational facilities such as the NSF TeraGrid. Different types of metadata are produced and captured within the scientific workflows. One workflow within PetaSHA ("Earthworks") performs a linear sequence of tasks with workflow and seismological metadata preserved. Downstream scientific codes ingest these metadata produced by upstream codes. The seismological metadata uses attribute-value pairing in plain text; an identified need is to use more advanced handling methods. Another workflow system within PetaSHA ("Cybershake") involves several complex workflows in order to perform statistical analysis of ground shaking due to thousands of hypothetical but plausible earthquakes. Metadata management has been challenging due to its construction around a number of legacy scientific codes. We describe difficulties arising in the scientific workflow due to the lack of this metadata and suggest corrective steps, which in some cases include the cultural shift of domain science programmers coding for metadata.
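
    The record above notes that the seismological (domain-level) metadata are stored as attribute-value pairs in plain text. A minimal sketch of reading such a file in Python; the field names here are invented for illustration:

        # Sketch only: parse "key = value" metadata lines into a dictionary.
        def parse_attribute_value(text):
            """Parse key = value lines into a dict, skipping blanks and '#' comments."""
            meta = {}
            for line in text.splitlines():
                line = line.strip()
                if not line or line.startswith("#"):
                    continue
                key, _, value = line.partition("=")
                meta[key.strip()] = value.strip()
            return meta

        sample = """
        # hypothetical seismological metadata record
        source_model = example_fault_rupture
        grid_spacing_m = 200
        wave_propagation_code = example_fd_solver
        """
        print(parse_attribute_value(sample)["grid_spacing_m"])   # prints: 200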

  334. Institute for scientific computing research; fiscal year 1999 annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keyes, D

    2000-03-28

    Large-scale scientific computation, and all of the disciplines that support it and help to validate it, have been placed at the focus of Lawrence Livermore National Laboratory by the Accelerated Strategic Computing Initiative (ASCI). The Laboratory operates the computer with the highest peak performance in the world and has undertaken some of the largest and most compute-intensive simulations ever performed. Computers at the architectural extremes, however, are notoriously difficult to use efficiently. Even such successes as the Laboratory's two Bell Prizes awarded in November 1999 only emphasize the need for much better ways of interacting with the results of large-scale simulations. Advances in scientific computing research have, therefore, never been more vital to the core missions of the Laboratory than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, the Laboratory must engage researchers at many academic centers of excellence. In FY 1999, the Institute for Scientific Computing Research (ISCR) expanded the Laboratory's bridge to the academic community in the form of collaborative subcontracts, visiting faculty, student internships, a workshop, and a very active seminar series. ISCR research participants are integrated almost seamlessly with the Laboratory's Center for Applied Scientific Computing (CASC), which, in turn, addresses computational challenges arising throughout the Laboratory. Administratively, the ISCR flourishes under the Laboratory's University Relations Program (URP). Together with the other four Institutes of the URP, it must navigate a course that allows the Laboratory to benefit from academic exchanges while preserving national security. Although FY 1999 brought more than its share of challenges to the operation of an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and well worth the continued effort. A change of administration for the ISCR occurred during FY 1999. Acting Director John Fitzgerald retired from LLNL in August after 35 years of service, including the last two at the helm of the ISCR. David Keyes, who has been a regular visitor in conjunction with ASCI scalable algorithms research since October 1997, overlapped with John for three months and serves half-time as the new Acting Director.

  335. Accuracy of the lattice-Boltzmann method using the Cell processor

    NASA Astrophysics Data System (ADS)

    Harvey, M. J.; de Fabritiis, G.; Giupponi, G.

    2008-11-01

    Accelerator processors like the new Cell processor are extending the traditional platforms for scientific computation, allowing orders of magnitude more floating-point operations per second (flops) compared to standard central processing units. However, they currently lack double-precision support and support for some IEEE 754 capabilities. In this work, we develop a lattice-Boltzmann (LB) code to run on the Cell processor and test the accuracy of this lattice method on this platform. We run tests for different flow topologies, boundary conditions, and Reynolds numbers in the range Re = 6-350. In one case, simulation results show a reduced mass and momentum conservation compared to an equivalent double-precision LB implementation. All other cases demonstrate the utility of the Cell processor for fluid dynamics simulations. Benchmarks on two Cell-based platforms are performed, the Sony Playstation3 and the QS20/QS21 IBM blade, obtaining speed-up factors of 7 and 21, respectively, compared to the original PC version of the code, and a conservative sustained performance of 28 gigaflops per single Cell processor. Our results suggest that choice of IEEE 754 rounding mode is possibly as important as double-precision support for this specific scientific application.
  336. 31 CFR 285.7 - Salary offset.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... requirements of the Computer Matching and Privacy Protection Act of 1988, 5 U.S.C. 552a, as amended, for... known as centralized salary offset computer matching, identify Federal employees who owe delinquent nontax debt to the United States. Centralized salary offset computer matching is the computerized...

  337. 31 CFR 285.7 - Salary offset.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... requirements of the Computer Matching and Privacy Protection Act of 1988, 5 U.S.C. 552a, as amended, for... known as centralized salary offset computer matching, identify Federal employees who owe delinquent nontax debt to the United States. Centralized salary offset computer matching is the computerized...

  338. 31 CFR 285.7 - Salary offset.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... requirements of the Computer Matching and Privacy Protection Act of 1988, 5 U.S.C. 552a, as amended, for... known as centralized salary offset computer matching, identify Federal employees who owe delinquent nontax debt to the United States. Centralized salary offset computer matching is the computerized...

  339. 31 CFR 285.7 - Salary offset.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... requirements of the Computer Matching and Privacy Protection Act of 1988, 5 U.S.C. 552a, as amended, for... known as centralized salary offset computer matching, identify Federal employees who owe delinquent nontax debt to the United States. Centralized salary offset computer matching is the computerized...
  340. Radar Detection Models in Computer Supported Naval War Games

    DTIC Science & Technology

    1979-06-08

    revealed a requirement for the effective centralized management of computer supported war game development and employment in the U.S. Navy. A...considerations and supports the requirement for centralized management of computerized war game development. Therefore it is recommended that a central...managerial and fiscal authority be established for computerized tactical war game development. This central authority should ensure that new games

  341. En Garde: Fencing at Kansas City's Central Computers Unlimited/Classical Greek Magnet High School, 1991-1995

    ERIC Educational Resources Information Center

    Poos, Bradley W.

    2015-01-01

    Central High School in Kansas City, Missouri is one of the oldest schools west of the Mississippi and the first public high school built in Kansas City. Kansas City's magnet plan resulted in Central High School being rebuilt as the Central Computers Unlimited/Classical Greek Magnet High School, a school that was designed to offer students an…
  342. 78 FR 9727 - Endangered Species Recovery Permit Applications

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-11

    ... California red-legged frog (Rana draytonii) (R. aurora d.) and the California tiger salamander (central DPS... salamander (central DPS) (Ambystoma californiense) in conjunction with survey and scientific research...) the California tiger salamander (Santa Barbara County DPS and Sonoma County DPS) (Ambystoma...

  343. Key Lessons in Building "Data Commons": The Open Science Data Cloud Ecosystem

    NASA Astrophysics Data System (ADS)

    Patterson, M.; Grossman, R.; Heath, A.; Murphy, M.; Wells, W.

    2015-12-01

    Cloud computing technology has created a shift around data and data analysis by allowing researchers to push computation to data as opposed to having to pull data to an individual researcher's computer. Subsequently, cloud-based resources can provide unique opportunities to capture computing environments used both to access raw data in its original form and also to create analysis products which may be the source of data for tables and figures presented in research publications. Since 2008, the Open Cloud Consortium (OCC) has operated the Open Science Data Cloud (OSDC), which provides scientific researchers with computational resources for storing, sharing, and analyzing large (terabyte and petabyte-scale) scientific datasets. OSDC has provided compute and storage services to over 750 researchers in a wide variety of data intensive disciplines. Recently, internal users have logged about 2 million core hours each month. The OSDC also serves the research community by colocating these resources with access to nearly a petabyte of public scientific datasets in a variety of fields, also accessible for download externally by the public. In our experience operating these resources, researchers are well served by "data commons," meaning cyberinfrastructure that colocates data archives, computing, and storage infrastructure and supports essential tools and services for working with scientific data. In addition to the OSDC public data commons, the OCC operates a data commons in collaboration with NASA and is developing a data commons for NOAA datasets. As cloud-based infrastructures for distributing and computing over data become more pervasive, we ask, "What does it mean to publish data in a data commons?" Here we present the OSDC perspective and discuss several services that are key in architecting data commons, including digital identifier services.
  344. Enabling a Scientific Cloud Marketplace: VGL (Invited)

    NASA Astrophysics Data System (ADS)

    Fraser, R.; Woodcock, R.; Wyborn, L. A.; Vote, J.; Rankine, T.; Cox, S. J.

    2013-12-01

    The Virtual Geophysics Laboratory (VGL) provides a flexible, web based environment where researchers can browse data and use a variety of scientific software packaged into tool kits that run in the Cloud. Both data and tool kits are published by multiple researchers and registered with the VGL infrastructure, forming a data and application marketplace. The VGL provides the basic workflow of Discovery and Access to the disparate data sources and a Library for tool kits and scripting to drive the scientific codes. Computation is then performed on the Research or Commercial Clouds. Provenance information is collected throughout the workflow and can be published alongside the results, allowing for experiment comparison and sharing with other researchers. VGL's "mix and match" approach to data, computational resources and scientific codes enables a dynamic approach to scientific collaboration. VGL allows scientists to publish their specific contribution, be it data, code, compute or workflow, knowing the VGL framework will provide the other components needed for a complete application. Other scientists can choose the pieces that suit them best to assemble an experiment. The coarse grain workflow of the VGL framework combined with the flexibility of the scripting library and computational toolkits allows for significant customisation and sharing amongst the community. The VGL utilises the cloud computational and storage resources from the Australian academic research cloud provided by the NeCTAR initiative and a large variety of data accessible from national and state agencies via the Spatial Information Services Stack (SISS - http://siss.auscope.org). VGL v1.2 screenshot - http://vgl.auscope.org

  345. U.S. Army Research Laboratory (ARL) multimodal signatures database

    NASA Astrophysics Data System (ADS)

    Bennett, Kelly

    2008-04-01

    The U.S. Army Research Laboratory (ARL) Multimodal Signatures Database (MMSDB) is a centralized collection of sensor data of various modalities that are co-located and co-registered. The signatures include ground and air vehicles, personnel, mortar, artillery, small arms gunfire from potential sniper weapons, explosives, and many other high value targets. This data is made available to Department of Defense (DoD) and DoD contractors, Intel agencies, other government agencies (OGA), and academia for use in developing target detection, tracking, and classification algorithms and systems to protect our Soldiers. A platform independent Web interface disseminates the signatures to researchers and engineers within the scientific community. Hierarchical Data Format 5 (HDF5) signature models provide an excellent solution for the sharing of complex multimodal signature data for algorithmic development and database requirements. Many open source tools for viewing and plotting HDF5 signatures are available over the Web. Seamless integration of HDF5 signatures is possible in both proprietary computational environments, such as MATLAB, and Free and Open Source Software (FOSS) computational environments, such as Octave and Python, for performing signal processing, analysis, and algorithm development. Future developments include extending the Web interface into a portal system for accessing ARL algorithms and signatures, High Performance Computing (HPC) resources, and integrating existing database and signature architectures into sensor networking environments.
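    Reading HDF5 signature data from Python, as suggested above, can be sketched with the widely used h5py package. The file name, dataset path, and attribute name below are hypothetical placeholders rather than the actual MMSDB layout.

        import h5py
        import numpy as np

        # Minimal sketch: open an HDF5 signature file and pull out one channel.
        # "signature.h5", "acoustic/channel_0", and "sample_rate_hz" are
        # hypothetical names; consult the actual file for its real layout.
        with h5py.File("signature.h5", "r") as f:
            data = f["acoustic/channel_0"][...]                      # read the full dataset into memory
            rate = f["acoustic/channel_0"].attrs.get("sample_rate_hz", None)

        print(data.shape, data.dtype, rate)
        print("peak amplitude:", np.abs(data).max())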
  346. Heterogeneous scalable framework for multiphase flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, Karla Vanessa

    2013-09-01

    Two categories of challenges confront the developer of computational spray models: those related to the computation and those related to the physics. Regarding the computation, the trend towards heterogeneous, multi- and many-core platforms will require considerable re-engineering of codes written for the current supercomputing platforms. Regarding the physics, accurate methods for transferring mass, momentum and energy from the dispersed phase onto the carrier fluid grid have so far eluded modelers. Significant challenges also lie at the intersection between these two categories. To be competitive, any physics model must be expressible in a parallel algorithm that performs well on evolving computer platforms. This work created an application based on a software architecture where the physics and software concerns are separated in a way that adds flexibility to both. The developed spray-tracking package includes an application programming interface (API) that abstracts away the platform-dependent parallelization concerns, enabling the scientific programmer to write serial code that the API resolves into parallel processes and threads of execution. The project also developed the infrastructure required to provide similar APIs to other applications. The API allows object-oriented Fortran applications direct interaction with Trilinos to support memory management of distributed objects on central processing unit (CPU) and graphics processing unit (GPU) nodes for applications using C++.
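    The pattern of an API that lets a scientific programmer write serial-looking code which the library resolves into parallel execution can be illustrated in Python. This is only an analogy to the Fortran/Trilinos interface described above; the function names and the per-item workload are invented for the example.

        from multiprocessing import Pool

        # Illustration of the pattern only: the caller writes an ordinary function
        # and an ordinary-looking map call; the API decides how to parallelize it.
        def parallel_map(func, items, processes=4):
            """Apply func to every item, distributing work across worker processes."""
            with Pool(processes=processes) as pool:
                return pool.map(func, items)

        def advect_parcel(parcel_id):
            # Stand-in for a per-parcel physics update (hypothetical workload).
            return parcel_id, parcel_id ** 0.5

        if __name__ == "__main__":
            results = parallel_map(advect_parcel, range(8))
            print(results)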
  347. The inventory as a core element in the further development of the science curriculum in the Mannheim Reformed Curriculum of Medicine.

    PubMed

    Eckel, Julia; Schüttpelz-Brauns, Katrin; Miethke, Thomas; Rolletschek, Alexandra; Fritz, Harald M

    2017-01-01

    Introduction: The German Council of Science and Humanities as well as a number of medical professional associations support the strengthening of scientific competences by developing longitudinal curricula for teaching scientific competences in undergraduate medical education. The National Competence Based Catalogue of Learning Objectives for Undergraduate Medical Education (NKLM) has also defined medical scientific skills as learning objectives in addition to the role of the scholar. The development of the Mannheim science curriculum started with a systematic inventory of the teaching of scientific competences in the Mannheim Reformed Curriculum of Medicine (MaReCuM). Methods: The inventory is based on the analysis of module profiles, teaching materials, surveys among experts, and written recollections. Furthermore, science learning objectives were defined and prioritized, thus enabling the contents of the various courses to be assigned to the top three learning objectives. Results: The learning objectives (systematic collection of information regarding the current state of research, critical assessment of scientific information and data sources, and presentation and discussion of the results of scientific studies) are facilitated by various teaching courses from the first to the fifth year of undergraduate training. The review reveals a longitudinal science curriculum that has emerged implicitly. Future efforts must aim at eliminating redundancies and closing gaps; in addition, courses must be more closely aligned with each other, regarding both their contents and their timing, by means of a central coordination unit. Conclusion: The teaching of scientific thinking and working is a central component in the MaReCuM. The inventory and prioritization of science learning objectives form the basis for a structured ongoing development of the curriculum. An essential aspect here is the establishment of a central project team responsible for the planning, coordination, and review of these measures.

  348. Long-term Stable Conservative Multiscale Methods for Vortex Flows

    DTIC Science & Technology

    2017-10-31

    Computational and Applied Mathematics and Engineering, Eccomas 2016 (Crete, June, 2016) - M. A. Olshanskii, Scientific computing seminar of Math ...UMass Dartmouth (October 2015) - L. Rebholz, Applied Math Seminar Talk, University of Alberta (October 2015) - L. Rebholz, Colloquium Talk, Scientific...Colloquium, (November 2016) - L. Rebholz, Joint Math Meetings 2017, Special session on recent advances in numerical analysis of PDEs, Atlanta GA
  349. The Operation of a Specialized Scientific Information and Data Analysis Center With Computer Base and Associated Communications Network.

    ERIC Educational Resources Information Center

    Cottrell, William B.; And Others

    The Nuclear Safety Information Center (NSIC) is a highly sophisticated scientific information center operated at Oak Ridge National Laboratory (ORNL) for the U.S. Atomic Energy Commission. Its information file, which consists of both data and bibliographic information, is computer stored and numerous programs have been developed to facilitate the…
  350. EUROPLANET-RI modelling service for the planetary science community: European Modelling and Data Analysis Facility (EMDAF)

    NASA Astrophysics Data System (ADS)

    Khodachenko, Maxim; Miller, Steven; Stoeckler, Robert; Topf, Florian

    2010-05-01

    Computational modeling and observational data analysis are two major aspects of modern scientific research, and both are under extensive development and application. Many of the scientific goals of planetary space missions require robust models of planetary objects and environments as well as efficient data analysis algorithms, to predict conditions for mission planning and to interpret the experimental data. Europe has great strength in these areas, but it is insufficiently coordinated; individual groups, models, techniques and algorithms need to be coupled and integrated. The existing level of scientific cooperation and the technical capabilities for operative communication allow considerable progress in the development of a distributed international Research Infrastructure (RI), based on the computational modelling and data analysis centers existing in Europe, providing the scientific community with dedicated services in the fields of their computational and data analysis expertise. These services will appear as a product of the collaborative communication and joint research efforts of the numerical and data analysis experts together with planetary scientists. The major goal of the EUROPLANET-RI / EMDAF is to make computational models and data analysis algorithms associated with particular national RIs and teams, as well as their outputs, more readily available to their potential user community and more tailored to scientific user requirements, without compromising front-line specialized research on model and data analysis algorithm development and software implementation. This objective will be met through four key subdivisions/tasks of EMDAF: 1) an Interactive Catalogue of Planetary Models; 2) a Distributed Planetary Modelling Laboratory; 3) a Distributed Data Analysis Laboratory; and 4) enabling Models and Routines for High Performance Computing Grids. Using the advantages of coordinated operation and efficient communication between the involved computational modelling, research and data analysis expert teams and their related research infrastructures, EMDAF will provide a 1) flexible, 2) scientific user oriented, 3) continuously developing and fast upgrading computational and data analysis service to support and intensify European planetary scientific research. At the beginning EMDAF will create a set of demonstrators and operational tests of this service in key areas of European planetary science. This work will aim at the following objectives: (a) development and implementation of tools for distant interactive communication between the planetary scientists and computing experts (including related RIs); (b) development of standard routine packages and user-friendly interfaces for operation of the existing numerical codes and data analysis algorithms by the specialized planetary scientists; (c) development of a prototype of numerical modelling services "on demand" for space missions and planetary researchers; (d) development of a prototype of data analysis services "on demand" for space missions and planetary researchers; (e) development of a prototype of coordinated interconnected simulations of planetary phenomena and objects (global multi-model simulators); (f) providing demonstrators of a coordinated use of high performance computing facilities (super-computer networks), done in cooperation with the European HPC Grid DEISA.

  351. Distributed GPU Computing in GIScience

    NASA Astrophysics Data System (ADS)

    Jiang, Y.; Yang, C.; Huang, Q.; Li, J.; Sun, M.

    2013-12-01

    Geoscientists strive to discover principles and patterns hidden inside ever-growing Big Data for scientific discoveries. To better achieve this objective, more capable computing resources are required to process, analyze and visualize Big Data (Ferreira et al., 2003; Li et al., 2013). Current CPU-based computing techniques cannot promptly meet the computing challenges caused by the increasing amount of datasets from different domains, such as social media, earth observation, and environmental sensing (Li et al., 2013). Meanwhile, CPU-based computing resources structured as clusters or supercomputers are costly. In the past several years, as GPU-based technology has matured in both capability and performance, GPU-based computing has emerged as a new computing paradigm. Compared to the traditional microprocessor, the modern GPU, as a compelling alternative, offers outstanding parallel processing capability with cost-effectiveness and efficiency (Owens et al., 2008), although it was initially designed for graphical rendering in the visualization pipeline. This presentation reports a distributed GPU computing framework for integrating GPU-based computing within a distributed environment. Within this framework, 1) for each single computer, computing resources of both the GPU and the CPU can be fully utilized to improve the performance of visualizing and processing Big Data; 2) within a network environment, a variety of computers can be used to build up a virtual super computer to support CPU-based and GPU-based computing in a distributed computing environment; 3) GPUs, as specific graphics-targeted devices, are used to greatly improve the rendering efficiency in distributed geo-visualization, especially for 3D/4D visualization. Key words: Geovisualization, GIScience, Spatiotemporal Studies. References: 1. Ferreira de Oliveira, M. C., & Levkowitz, H. (2003). From visual data exploration to visual data mining: A survey. IEEE Transactions on Visualization and Computer Graphics, 9(3), 378-394. 2. Li, J., Jiang, Y., Yang, C., Huang, Q., & Rice, M. (2013). Visualizing 3D/4D Environmental Data Using Many-core Graphics Processing Units (GPUs) and Multi-core Central Processing Units (CPUs). Computers & Geosciences, 59(9), 78-89. 3. Owens, J. D., Houston, M., Luebke, D., Green, S., Stone, J. E., & Phillips, J. C. (2008). GPU computing. Proceedings of the IEEE, 96(5), 879-899.
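    A minimal sketch of the CPU/GPU split described above, assuming the optional CuPy package and a CUDA-capable device are available (neither is mentioned in the record): the same NumPy-style reduction can be executed on the CPU or pushed to the GPU.

        import numpy as np

        # CPU version: mean of a synthetic 3D field (arbitrary data).
        field_cpu = np.random.rand(256, 256, 256).astype(np.float32)
        print("CPU mean:", field_cpu.mean())

        try:
            import cupy as cp  # optional GPU array library; an assumption, not from the record
            field_gpu = cp.asarray(field_cpu)            # copy the array to GPU memory
            print("GPU mean:", float(field_gpu.mean()))  # same NumPy-style call, runs on the GPU
        except ImportError:
            print("CuPy not installed; GPU path skipped.")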
  352. [Patents and scientific research: an ethical-legal approach].

    PubMed

    Darío Bergel, Salvador

    2014-01-01

    This article aims to review the relationship between patents and scientific research from an ethical point of view. The recent developments in the law of industrial property led in many cases to patent discoveries, contributions of basic science, and laws of nature. This trend, which denies the central principles of the discipline, creates disturbances in scientific activity, which requires the free movement of knowledge in order to develop their potentialities.
  353. Laying the foundation to use Raspberry Pi 3 V2 camera module imagery for scientific and engineering purposes

    NASA Astrophysics Data System (ADS)

    Pagnutti, Mary; Ryan, Robert E.; Cazenavette, George; Gold, Maxwell; Harlan, Ryan; Leggett, Edward; Pagnutti, James

    2017-01-01

    A comprehensive radiometric characterization of raw-data format imagery acquired with the Raspberry Pi 3 and V2.1 camera module is presented. The Raspberry Pi is a high-performance single-board computer designed to educate and solve real-world problems. This small computer supports a camera module that uses a Sony IMX219 8 megapixel CMOS sensor. This paper shows that scientific and engineering-grade imagery can be produced with the Raspberry Pi 3 and its V2.1 camera module. Raw imagery is shown to be linear with exposure and gain (ISO), which is essential for scientific and engineering applications. Dark frame, noise, and exposure stability assessments along with flat fielding results, spectral response measurements, and absolute radiometric calibration results are described. This low-cost imaging sensor, when calibrated to produce scientific quality data, can be used in computer vision, biophotonics, remote sensing, astronomy, high dynamic range imaging, and security applications, to name a few.
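    The linearity-with-exposure property reported above is straightforward to check with a least-squares fit. The exposure times and mean raw signal values below are made-up placeholders, not measurements from the paper.

        import numpy as np

        # Hypothetical calibration points: exposure time (ms) vs. mean raw signal (DN).
        exposure_ms = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
        mean_dn     = np.array([52.0, 101.0, 205.0, 398.0, 810.0])

        slope, offset = np.polyfit(exposure_ms, mean_dn, 1)   # first-order (linear) fit
        predicted = slope * exposure_ms + offset
        r2 = 1.0 - np.sum((mean_dn - predicted) ** 2) / np.sum((mean_dn - mean_dn.mean()) ** 2)

        print(f"slope = {slope:.2f} DN/ms, offset = {offset:.2f} DN, R^2 = {r2:.4f}")

    An R^2 close to 1 over the usable exposure range is the kind of evidence that supports treating the raw imagery as radiometrically linear.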
  354. Plants used in the traditional medicine of Mesoamerica (Mexico and Central America) and the Caribbean for the treatment of obesity.

    PubMed

    Alonso-Castro, Angel Josabad; Domínguez, Fabiola; Zapata-Morales, Juan Ramón; Carranza-Álvarez, Candy

    2015-12-04

    Obesity is a worldwide medical concern. New ethnobotanical information regarding the antiobesity effect of medicinal plants has been obtained in the last 30 years, in response to socio-demographic changes and as high-fat diets became common. This review provides a summary of medicinal plants used in Mexico, Central America and the Caribbean for the empirical treatment of obesity in terms of ethnobotany, toxicity, pharmacology, conservation status, trade and chemistry. Bibliographic investigation was performed by analyzing recognized books, undergraduate and postgraduate theses and peer-reviewed scientific articles, consulting worldwide accepted scientific databases from the last four decades. Medicinal plants used for the treatment of obesity were classified in two categories: (1) plants with pharmacological evidence and (2) plants without pharmacological evidence. A total of 139 plant species, belonging to 61 families, native to Mexico, Central America and the Caribbean that are used for the empirical treatment of obesity were recorded. From these plants, 33 were investigated in scientific studies, and 106 plants lacked scientific investigation. Medicinal plants were experimentally studied in vitro (21 plants) and in vivo (16 plants). A total of 4 compounds isolated from medicinal plants used for the empirical treatment of obesity have been tested in in vitro (2 compounds) and in vivo (4 compounds) studies. No clinical trials on obese subjects (BMI > 30 kg/m²) have been performed using the medicinal plants cited in this review. There are no herbal-based products approved in Mexico for the treatment of obesity. There are a limited number of scientific studies published on medicinal plants from Mexico, Central America and the Caribbean used for the treatment of obesity. This review highlights the need to perform pharmacological, phytochemical, toxicological and ethnobotanical studies with medicinal flora to obtain new antiobesity agents. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  355. The role of dedicated data computing centers in the age of cloud computing

    NASA Astrophysics Data System (ADS)

    Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr

    2017-10-01

    Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL's plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.

  356. SCIENTIFIC AND RESEARCH INSTITUTIONS IN HUNGARY: I. NUCLEAR SCIENCE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bacha, E.

    1959-05-22

    Scientific and research institutions in Hungary engaged in research in the field of nuclear science are discussed. Brief descriptions are included of the Central Research Institute of Physics, the Institute of Nuclear Research, the Joliot-Curie Central Research Institute of Radiobiology, and the Physics Laboratory of the Otvos Lorand Radium and X-Ray Institute. The recently completed experimental reactor at Budapest and isotope research laboratories are described. Plans for an atomic power plant are discussed. Uranium deposits in Hungary are also discussed. A list of recent publications in the field of nuclear science is included. (C.W)
  357. Opinion: Why we need a centralized repository for isotopic data

    USGS Publications Warehouse

    Pauli, Jonathan N.; Newsome, Seth D.; Cook, Joseph A.; Harrod, Chris; Steffan, Shawn A.; Baker, Christopher J. O.; Ben-David, Merav; Bloom, David; Bowen, Gabriel J.; Cerling, Thure E.; Cicero, Carla; Cook, Craig; Dohm, Michelle; Dharampal, Prarthana S.; Graves, Gary; Gropp, Robert; Hobson, Keith A.; Jordan, Chris; MacFadden, Bruce; Pilaar Birch, Suzanne; Poelen, Jorrit; Ratnasingham, Sujeevan; Russell, Laura; Stricker, Craig A.; Uhen, Mark D.; Yarnes, Christopher T.; Hayden, Brian

    2017-01-01

    Stable isotopes encode and integrate the origin of matter; thus, their analysis offers tremendous potential to address questions across diverse scientific disciplines (1, 2). Indeed, the broad applicability of stable isotopes, coupled with advancements in high-throughput analysis, has created a scientific field that is growing exponentially and generating data at a rate paralleling the explosive rise of DNA sequencing and genomics (3). Centralized data repositories, such as GenBank, have become increasingly important as a means for archiving information, and "Big Data" analytics of these resources are revolutionizing science and everyday life.

  358. Heterogeneous high throughput scientific computing with APM X-Gene and Intel Xeon Phi

    DOE PAGES

    Abdurachmanov, David; Bockelman, Brian; Elmer, Peter; ...

    2015-05-22

    Electrical power requirements will be a constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics. Performance-per-watt is a critical metric for the evaluation of computer architectures for cost-efficient computing. Additionally, future performance growth will come from heterogeneous, many-core, and high computing density platforms with specialized processors. In this paper, we examine the Intel Xeon Phi Many Integrated Cores (MIC) co-processor and Applied Micro X-Gene ARMv8 64-bit low-power server system-on-a-chip (SoC) solutions for scientific computing applications. As a result, we report our experience on software porting, performance and energy efficiency and evaluate the potential for use of such technologies in the context of distributed computing systems such as the Worldwide LHC Computing Grid (WLCG).
  359. Topographic Map of the Ophir and Central Candor Chasmata Region of Mars MTM 500k -05/287E OMKT

    USGS Publications Warehouse

    2004-01-01

    This map, compiled photogrammetrically from Viking Orbiter stereo image pairs, is part of a series of topographic maps of areas of special scientific interest on Mars. The figure of Mars used for the computation of the map projection is an oblate spheroid (flattening of 1/176.875) with an equatorial radius of 3396.0 km and a polar radius of 3376.8 km. The datum (the 0-km contour line) for elevations is defined as the equipotential surface (gravitational plus rotational) whose average value at the equator is equal to the mean radius as determined by Mars Orbiter Laser Altimeter. The projection is part of a Mars Transverse Mercator (MTM) system with 20° wide zones. For the area covered by this map sheet the central meridian is at 290° E. (70° W.). The scale factor at the central meridian of the zone containing this quadrangle is 0.9960 relative to a nominal scale of 1:500,000. Longitude increases to the east and latitude is planetocentric as allowed by IAU/IAG standards and in accordance with current NASA and USGS standards. A secondary grid (printed in red) has been added to the map as a reference to the west longitude/planetographic latitude system that is also allowed by IAU/IAG standards and has been used for previous Mars maps.
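    The planetocentric latitude used on the map can be related to planetographic latitude directly from the spheroid radii quoted above (equatorial 3396.0 km, polar 3376.8 km), using tan(lat_graphic) = (a/b)^2 * tan(lat_centric) for an oblate spheroid. A short worked conversion, with an example latitude chosen only for illustration:

        import math

        A_EQ = 3396.0    # equatorial radius, km (from the map description)
        B_POL = 3376.8   # polar radius, km (from the map description)

        def planetocentric_to_planetographic(lat_centric_deg):
            """Convert planetocentric latitude to planetographic latitude (degrees).

            Uses tan(lat_graphic) = (a/b)**2 * tan(lat_centric) for an oblate spheroid.
            """
            lat_c = math.radians(lat_centric_deg)
            lat_g = math.atan((A_EQ / B_POL) ** 2 * math.tan(lat_c))
            return math.degrees(lat_g)

        # Example: a point near -5 degrees planetocentric latitude.
        print(planetocentric_to_planetographic(-5.0))   # roughly -5.06 degrees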
  360. The PLATO IV Architecture.

    ERIC Educational Resources Information Center

    Stifle, Jack

    The PLATO IV computer-based instructional system consists of a large scale centrally located CDC 6400 computer and a large number of remote student terminals. This is a brief and general description of the proposed input/output hardware necessary to interface the student terminals with the computer's central processing unit (CPU) using available…

  361. Central Computational Facility CCF communications subsystem options

    NASA Technical Reports Server (NTRS)

    Hennigan, K. B.

    1979-01-01

    A MITRE study which investigated the communication options available to support both the remaining Central Computational Facility (CCF) computer systems and the proposed U1108 replacements is presented. The facilities utilized to link the remote user terminals with the CCF were analyzed and guidelines to provide more efficient communications were established.

  362. Space and Earth Sciences, Computer Systems, and Scientific Data Analysis Support, Volume 1

    NASA Technical Reports Server (NTRS)

    Estes, Ronald H. (Editor)

    1993-01-01

    This Final Progress Report covers the specific technical activities of Hughes STX Corporation for the last contract triannual period of 1 June through 30 Sep. 1993, in support of assigned task activities at Goddard Space Flight Center (GSFC). It also provides a brief summary of work throughout the contract period of performance on each active task. Technical activity is presented in Volume 1, while financial and level-of-effort data is presented in Volume 2. Technical support was provided to all Divisions and Laboratories of Goddard's Space Sciences and Earth Sciences Directorates. Types of support include: scientific programming, systems programming, computer management, mission planning, scientific investigation, data analysis, data processing, data base creation and maintenance, instrumentation development, and management services. Missions and instruments supported include: ROSAT, Astro-D, BBXRT, XTE, AXAF, GRO, COBE, WIND, UIT, SMM, STIS, HEIDI, DE, URAP, CRRES, Voyagers, ISEE, San Marco, LAGEOS, TOPEX/Poseidon, Pioneer-Venus, Galileo, Cassini, Nimbus-7/TOMS, Meteor-3/TOMS, FIFE, BOREAS, TRMM, AVHRR, and Landsat. Accomplishments include: development of computing programs for mission science and data analysis, supercomputer applications support, computer network support, computational upgrades for data archival and analysis centers, end-to-end management for mission data flow, scientific modeling and results in the fields of space and Earth physics, planning and design of the GSFC VO DAAC and VO IMS, fabrication, assembly, and testing of mission instrumentation, and design of the mission operations center.
  363. Medicinal plants used to treat snakebite in Central America: Review and assessment of scientific evidence.

    PubMed

    Giovannini, Peter; Howes, Melanie-Jayne R

    2017-03-06

    Every year between 1.2 and 5.5 million people worldwide are victims of snakebites, with about 400,000 left permanently injured. In Central America an estimated 5500 snakebite cases are reported by health centres, but this is likely to be an underestimate due to unreported cases in rural regions. The aim of this study is to review the medicinal plants used traditionally to treat snakebites in seven Central American countries: Belize, Costa Rica, El Salvador, Guatemala, Honduras, Nicaragua and Panama. A literature search was performed on published primary data on medicinal plants of Central America and those specifically pertaining to use against snakebites. Plant use reports for traditional snakebite remedies identified in primary sources were extracted and entered in a database, with data analysed in terms of the most frequent numbers of use reports. The scientific evidence that might support the local uses of the most frequently reported species was also examined. A total of 260 independent plant use reports were recorded in the 34 sources included in this review, encompassing 208 species used to treat snakebite in Central America. Only nine species were reported in at least three studies: Cissampelos pareira L., Piper amalago L., Aristolochia trilobata L., Sansevieria hyacinthoides (L.) Druce, Strychnos panamensis Seem., Dorstenia contrajerva L., Scoparia dulcis L., Hamelia patens Jacq., and Simaba cedron Planch. Genera with the highest number of species used to treat snakebite were Piper, Aristolochia, Hamelia, Ipomoea, Passiflora and Peperomia. The extent of the scientific evidence available to understand any pharmacological basis for their use against snakebites varied between different plant species. At least 208 plant species are traditionally used to treat snakebite in Central America but there is a lack of clinical research to evaluate their efficacy and safety. Available pharmacological data suggest different plant species may target different symptoms of snakebites, such as pain or anxiety, although more studies are needed to further evaluate the scientific basis for their use. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  364. Model-Based Knowing: How Do Students Ground Their Understanding About Climate Systems in Agent-Based Computer Models?

    NASA Astrophysics Data System (ADS)

    Markauskaite, Lina; Kelly, Nick; Jacobson, Michael J.

    2017-12-01

    This paper gives a grounded cognition account of model-based learning of complex scientific knowledge related to socio-scientific issues, such as climate change. It draws on the results from a study of high school students learning about the carbon cycle through computational agent-based models and investigates two questions: First, how do students ground their understanding about the phenomenon when they learn and solve problems with computer models? Second, what are common sources of mistakes in students' reasoning with computer models? Results show that students ground their understanding in computer models in five ways: direct observation, straight abstraction, generalisation, conceptualisation, and extension. Students also incorporate into their reasoning their knowledge and experiences that extend beyond phenomena represented in the models, such as attitudes about unsustainable carbon emission rates, human agency, external events, and the nature of computational models. The most common difficulties of the students relate to seeing the modelled scientific phenomenon and connecting results from the observations with other experiences and understandings about the phenomenon in the outside world. An important contribution of this study is the constructed coding scheme for establishing different ways of grounding, which helps to understand some challenges that students encounter when they learn about complex phenomena with agent-based computer models.
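    The kind of agent-based computational model referred to above can be sketched very compactly: each agent is an individual unit of carbon that moves between reservoirs according to stochastic rules. The reservoir names and transition probabilities below are illustrative only and are not taken from the study.

        import random

        # Minimal agent-based sketch of a carbon-cycle model: each agent is one unit
        # of carbon that moves between reservoirs with fixed transition probabilities.
        MOVES = {
            "atmosphere": [("ocean", 0.03), ("biosphere", 0.05)],
            "ocean":      [("atmosphere", 0.02)],
            "biosphere":  [("atmosphere", 0.04)],
        }

        def step(agents):
            for agent in agents:
                for target, prob in MOVES[agent["reservoir"]]:
                    if random.random() < prob:
                        agent["reservoir"] = target
                        break

        agents = [{"reservoir": "atmosphere"} for _ in range(1000)]
        for _ in range(100):                      # run 100 time steps
            step(agents)

        counts = {}
        for agent in agents:
            counts[agent["reservoir"]] = counts.get(agent["reservoir"], 0) + 1
        print(counts)                             # distribution of carbon across reservoirs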
  365. Data Transparency in Privately Funded Scientific Research

    NASA Astrophysics Data System (ADS)

    Brewer, P. G.

    2016-12-01

    Research investigations funded by the Gulf of Mexico Research Initiative (GoMRI) have resulted in a large pulse of scientific data produced by studies ranging across the research goals of the program. These studies have produced datasets from laboratory, field, and modeling activities describing phenomena ranging from microscopic fluid dynamics to large-scale ocean currents, bacteria to marine mammals, and detailed field observations to synoptic mapping. One of GoMRI's central tenets is to ensure that all data are preserved and made publicly available. Thus, GoMRI formed the Gulf of Mexico Research Initiative Data and Information Cooperative (GRIIDC) with the mission to ensure a data and information legacy that promotes continual scientific discovery and public awareness of the Gulf of Mexico ecosystem. The GoMRI Research Board's commitment to open data is exemplified in GoMRI's data program policies and management. The Research Board established a policy that research data must be publicly available as soon as possible and no later than one year following collection or at the time of publication. GRIIDC's data specialists and computer system experts, along with a team of researchers funded by GoMRI and GoMRI Research Board members, developed a data management system and process for storing and distributing all of the scientific data generated by the GoMRI researchers. Researcher compliance with the data policy is a requirement for annual funding increments, No Cost Extensions, and eligibility for future funding. Since data compliance is an important element of grant performance, compliance with GoMRI data policies is actively tracked and reported to the Board. This initiative comprises an essential component of GoMRI's research independence and legacy.
  366. LASL/USDA computer applications annual progress report, October 1, 1978-September 30, 1979. [Data Base Management activities regarding agricultural problems in southwestern USA]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanders, W.M.; Campbell, C.L.; Pickerill, P.A.

    1980-10-01

    The Los Alamos Scientific Laboratory is funded by the US Department of Agriculture to apply scientific and computer technology to solve agricultural problems. This report summarizes work during the period October 1, 1978 through September 30, 1979 on the application of computer technology to four areas: (1) Texas brucellosis calfhood-vaccination studies, (2) a brucellosis data-entry system in New Mexico, (3) the Idaho adult vaccination data base, and (4) surveillance of slaughter plants in Texas.

  367. Application of SLURM, BOINC, and GlusterFS as Software System for Sustainable Modeling and Data Analytics

    NASA Astrophysics Data System (ADS)

    Kashansky, Vladislav V.; Kaftannikov, Igor L.

    2018-02-01

    Modern numerical modeling experiments and data analytics problems in various fields of science and technology reveal a wide variety of serious requirements for distributed computing systems. Many scientific computing projects sometimes exceed the available resource pool limits, requiring extra scalability and sustainability. In this paper we share our experience and findings on combining the power of SLURM, BOINC, and GlusterFS as a software system for scientific computing. In particular, we suggest a complete architecture and highlight important aspects of systems integration.
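    How a modeling job might be handed to SLURM from Python can be sketched as below; the resource requests, file names, and executable are hypothetical, and the record does not prescribe any particular submission interface.

        import subprocess
        from pathlib import Path

        # Hypothetical batch script: resource requests and the executable name are
        # placeholders, not values taken from the record.
        job_script = """#!/bin/bash
        #SBATCH --job-name=model-run
        #SBATCH --ntasks=32
        #SBATCH --time=02:00:00
        srun ./run_model --input config.yaml
        """

        script_path = Path("model_run.sbatch")
        script_path.write_text(job_script)

        # Submit the job with SLURM's sbatch command and report the scheduler's reply.
        result = subprocess.run(["sbatch", str(script_path)], capture_output=True, text=True)
        print(result.stdout.strip() or result.stderr.strip())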
  368. Architecture and Initial Development of a Digital Library Platform for Computable Knowledge Objects for Health.

    PubMed

    Flynn, Allen J; Bahulekar, Namita; Boisvert, Peter; Lagoze, Carl; Meng, George; Rampton, James; Friedman, Charles P

    2017-01-01

    Throughout the world, biomedical knowledge is routinely generated and shared through primary and secondary scientific publications. However, there is too much latency between publication of knowledge and its routine use in practice. To address this latency, what is actionable in scientific publications can be encoded to make it computable. We have created a purpose-built digital library platform to hold, manage, and share actionable, computable knowledge for health called the Knowledge Grid Library. Here we present it with its system architecture.

  369. Biological Evolution and the History of the Earth Are Foundations of Science

    NASA Astrophysics Data System (ADS)

    2008-01-01

    AGU affirms the central importance of including scientific theories of Earth history and biological evolution in science education. Within the scientific community, the theory of biological evolution is not controversial, nor have "alternative explanations" been found. This is why no competing theories are required by the U.S. National Science Education Standards. Explanations of natural phenomena that appeal to the supernatural or are based on religious doctrine, and therefore cannot be tested through scientific inquiry, are not scientific and have no place in the science classroom.

  370. Using Relational Reasoning to Learn about Scientific Phenomena at Unfamiliar Scales

    ERIC Educational Resources Information Center

    Resnick, Ilyse; Davatzes, Alexandra; Newcombe, Nora S.; Shipley, Thomas F.

    2016-01-01

    Many scientific theories and discoveries involve reasoning about extreme scales, removed from human experience, such as time in geology, size in nanoscience. Thus, understanding scale is central to science, technology, engineering, and mathematics. Unfortunately, novices have trouble understanding and comparing sizes of unfamiliar large and small…
    We offer a set of activities, implemented successfully at both the…

The Semiotic Work of the Hands in Scientific Enquiry

    ERIC Educational Resources Information Center

    Sakr, Mona; Jewitt, Carey; Price, Sara

    2014-01-01

    This paper takes a multimodal approach to analysing embodied interaction and discourses of scientific investigation using an interactive tangible tabletop. It argues that embodied forms of interaction are central to science inquiry. More specifically, the paper examines the role of hand actions in the development of descriptions and explanations…

Biomedicine's Failure to Achieve Flexnerian Standards of Education

    ERIC Educational Resources Information Center

    Engel, George L.

    1978-01-01

    The central principle of education advocated by Flexner was mastery of the scientific method and its application to all dimensions of medicine. The biomedical model has been responsible for curricular designs that have understressed the application of the scientific method to clinical data and the more person-oriented, psychosocial…

Designing Epistemologically Correct Science Narratives

    ERIC Educational Resources Information Center

    Sachin, Datt; Poovaiah, Ravi

    2012-01-01

    In recent years, the use of narratives for teaching science at the secondary school level has gained impetus. This paper deals with the problem of designing narratives for teaching a scientific concept. The central issue in designing narratives that carry scientific information is that science belongs to the domain of objective observation…

VizioMetrics: Mining the Scientific Visual Literature

    ERIC Educational Resources Information Center

    Lee, Po-Shen

    2017-01-01

    Scientific results are communicated visually in the literature through diagrams, visualizations, and photographs.
    In this thesis, we developed a figure processing pipeline to classify more than 8 million figures from PubMed Central into different figure types and study the resulting patterns of visual information as they relate to scholarly…

Using Relational Reasoning to Learn about Scientific Phenomena at Unfamiliar Scales

    ERIC Educational Resources Information Center

    Resnick, Ilyse; Davatzes, Alexandra; Newcombe, Nora S.; Shipley, Thomas F.

    2017-01-01

    Many scientific theories and discoveries involve reasoning about extreme scales, removed from human experience, such as time in geology and size in nanoscience. Thus, understanding scale is central to science, technology, engineering, and mathematics. Unfortunately, novices have trouble understanding and comparing sizes of unfamiliar large and…

Software Engineering for Scientific Computer Simulations

    NASA Astrophysics Data System (ADS)

    Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.

    2004-11-01

    Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management, including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation, and selecting the optimum computational mathematics approaches.

Singularity: Scientific containers for mobility of compute

    PubMed

    Kurtzer, Gregory M; Sochat, Vanessa; Bauer, Michael W

    2017-01-01

    Here we present Singularity, software developed to bring containers and reproducibility to scientific computing. Using Singularity containers, developers can work in reproducible environments of their choosing and design, and these complete environments can easily be copied and executed on other platforms.
    Singularity is an open source initiative that harnesses the expertise of system and software engineers and researchers alike, and integrates seamlessly into common workflows for both of these groups. As its primary use case, Singularity brings mobility of computing to both users and HPC centers, providing a secure means to capture and distribute software and compute environments. This ability to create and deploy reproducible environments across these centers, a previously unmet need, makes Singularity a game-changing development for computational science.

Singularity: Scientific containers for mobility of compute

    PubMed Central

    Kurtzer, Gregory M.; Bauer, Michael W.

    2017-01-01

    Here we present Singularity, software developed to bring containers and reproducibility to scientific computing. Using Singularity containers, developers can work in reproducible environments of their choosing and design, and these complete environments can easily be copied and executed on other platforms. Singularity is an open source initiative that harnesses the expertise of system and software engineers and researchers alike, and integrates seamlessly into common workflows for both of these groups. As its primary use case, Singularity brings mobility of computing to both users and HPC centers, providing a secure means to capture and distribute software and compute environments. This ability to create and deploy reproducible environments across these centers, a previously unmet need, makes Singularity a game-changing development for computational science. PMID: 28494014
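    A minimal sketch of the usage pattern described above, assuming a pre-built image file: the same analysis step is launched inside a Singularity container from Python, so the software environment travels with the image. The image name and the wrapped command are hypothetical.

        # Illustrative sketch: run one analysis step inside a Singularity container so
        # the environment is identical on a laptop and on an HPC node. The image file
        # and the wrapped command below are hypothetical.
        import subprocess

        def run_in_container(image: str, command: list[str]) -> int:
            """Execute `command` inside the container image and return its exit code."""
            full_cmd = ["singularity", "exec", image] + command
            return subprocess.run(full_cmd).returncode

        if __name__ == "__main__":
            code = run_in_container("analysis.sif",
                                    ["python", "reduce_data.py", "--input", "run42.h5"])
            print("container step finished with exit code", code)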
Analysis of Scientific Attitude, Computer Anxiety, Educational Internet Use, Problematic Internet Use, and Academic Achievement of Middle School Students According to Demographic Variables

    ERIC Educational Resources Information Center

    Bekmezci, Mehmet; Celik, Ismail; Sahin, Ismail; Kiray, Ahmet; Akturk, Ahmet Oguz

    2015-01-01

    In this research, students' scientific attitude, computer anxiety, educational use of the Internet, academic achievement, and problematic use of the Internet are analyzed based on different variables (gender, parents' educational level and daily access to the Internet). The research group involves 361 students from two middle schools which are…

Numerical 'health check' for scientific codes: the CADNA approach

    NASA Astrophysics Data System (ADS)

    Scott, N. S.; Jézéquel, F.; Denis, C.; Chesneaux, J.-M.

    2007-04-01

    Scientific computation has unavoidable approximations built into its very fabric. One important source of error that is difficult to detect and control is round-off error propagation, which originates from the use of finite precision arithmetic. We propose that there is a need to perform regular numerical 'health checks' on scientific codes in order to detect the cancerous effect of round-off error propagation. This is particularly important in scientific codes that are built on legacy software. We advocate the use of the CADNA library as a suitable numerical screening tool. We present a case study to illustrate the practical use of CADNA in scientific codes that are of interest to the Computer Physics Communications readership. In doing so we hope to stimulate a greater awareness of round-off error propagation and present a practical means by which it can be analyzed and managed.
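    CADNA itself is a library based on Discrete Stochastic Arithmetic; as a language-neutral stand-in for the kind of screening it performs, the hedged sketch below estimates how many decimal digits of a naive summation survive round-off by comparing single- and double-precision results on an ill-conditioned sum. It is an illustration of the problem, not the CADNA method.

        # Illustrative stand-in for a numerical "health check" (this is NOT CADNA):
        # estimate how many decimal digits of a naive summation survive round-off
        # by comparing float32 and float64 results.
        import math
        import numpy as np

        def significant_digits(values: np.ndarray) -> float:
            """Rough count of agreeing decimal digits between 32- and 64-bit sums."""
            s32 = np.sum(values.astype(np.float32), dtype=np.float32)
            s64 = np.sum(values.astype(np.float64), dtype=np.float64)
            if s64 == 0:
                return 0.0
            rel_err = abs(float(s32) - float(s64)) / abs(float(s64))
            if rel_err == 0:
                return 7.0                     # full float32 precision retained
            return max(0.0, -math.log10(rel_err))

        if __name__ == "__main__":
            # Ill-conditioned sum: large values of opposite sign cancel almost exactly.
            data = np.array([1.0e8, 1.0, -1.0e8, 1.0])
            print(f"~{significant_digits(data):.1f} significant digits in the result")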
Computing at DESY — current setup, trends and strategic directions

    NASA Astrophysics Data System (ADS)

    Ernst, Michael

    1998-05-01

    Since the HERA experiments H1 and ZEUS started data taking in '92, the computing environment at DESY has changed dramatically. Having run mainframe-centred computing for more than 20 years, DESY switched within only about two years to a heterogeneous, fully distributed computing environment in almost every corner where computing has its applications. The computing strategy was highly influenced by the needs of the user community. The collaborations are usually limited by current technology, and their ever-increasing demands are the driving force for central computing to always move close to the technology edge. While DESY's central computing has multidecade experience in running Central Data Recording/Central Data Processing for HEP experiments, the most challenging task today is to provide clear and homogeneous concepts in the desktop area. Given that lowest-level commodity hardware draws more and more attention, combined with the financial constraints we are already facing today, we quickly need concepts for integrated support of a versatile device which has the potential to move into basically any computing area in HEP. Though commercial solutions, especially those addressing PC management and support issues, are expected to come to market in the next 2-3 years, we need to provide suitable solutions now. Buying PCs at DESY, currently at a rate of about 30 per month, will otherwise absorb any available manpower in central computing and still leave hundreds of unhappy people alone. Though certainly not the only region, the desktop issue is one of the most important ones where we need HEP-wide collaboration to a large extent, and right now. Taking into account that there is traditionally no room for R&D at DESY, collaboration, meaning sharing experience and development resources within the HEP community, is a predominant factor for us.

Towards a Unified Architecture for Data-Intensive Seismology in VERCE

    NASA Astrophysics Data System (ADS)

    Klampanos, I.; Spinuso, A.; Trani, L.; Krause, A.; Garcia, C. R.; Atkinson, M.

    2013-12-01

    Modern seismology involves managing, storing and processing large datasets, typically geographically distributed across organisations. Performing computational experiments using these data generates more data, which in turn have to be managed, further analysed and frequently made available within or outside the scientific community. As part of the EU-funded project VERCE (http://verce.eu), we research and develop a number of use-cases, interfacing technologies to satisfy the data-intensive requirements of modern seismology. Our solution seeks to support: (1) familiar programming environments to develop and execute experiments, in particular via Python/ObsPy, (2) a unified view of heterogeneous computing resources, public or private, through the adoption of workflows, (3) monitoring the experiments and validating the data products at varying granularities, via a comprehensive provenance system, (4) reproducibility of experiments and consistency in collaboration, via a shared registry of processing units and contextual metadata (computing resources, data, etc.). Here, we provide a brief account of these components and their roles in the proposed architecture.
    Our design integrates heterogeneous distributed systems, while allowing researchers to retain current practices and control data handling and execution via higher-level abstractions. At the core of our solution lies the workflow language Dispel. While Dispel can be used to express workflows in fine detail, it may also be used as part of meta- or job-submission workflows. User interaction can be provided through a visual editor or through custom applications on top of parameterisable workflows, which is the approach VERCE follows. According to our design, the scientist may use versions of the Dispel/workflow processing elements offered by the VERCE library or override them by introducing custom scientific code, using ObsPy. This approach has the advantage that, while the scientist uses a familiar tool, the resulting workflow can be executed transparently on a number of underlying stream-processing engines, such as STORM or OGSA-DAI. While making efficient use of arbitrarily distributed resources and large datasets is a priority, such processing requires adequate provenance tracking and monitoring. Hiding computation and orchestration details behind a workflow system allows us to embed provenance harvesting where appropriate without impeding the user's regular working patterns. Our provenance model is based on the W3C PROV standard and can provide information of varying granularity regarding execution, systems and data consumption/production. A video demonstrating a prototype provenance exploration tool can be found at http://bit.ly/15t0Fz0. Keeping experimental methodology and results open and accessible, as well as encouraging reproducibility and collaboration, is of central importance to modern science. As our users are expected to be based at different geographical locations, to have access to different computing resources and to employ customised scientific codes, the use of a shared registry of workflow components, implementations, data and computing resources is critical.
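    A hedged sketch of the kind of ObsPy-based processing step a scientist might wrap as a workflow processing element in such a system: it uses ObsPy's bundled example waveform so it runs without external data, and the band-pass corner frequencies chosen here are arbitrary, not values from the VERCE library.

        # Sketch of a small ObsPy processing step of the kind that could be wrapped as
        # a workflow processing element; uses ObsPy's bundled example waveform, and
        # the filter band chosen here is arbitrary.
        from obspy import read

        def preprocess(stream):
            """Basic preprocessing: remove mean and trend, then band-pass filter."""
            stream = stream.copy()             # keep the original traces untouched
            stream.detrend("demean")
            stream.detrend("linear")
            stream.filter("bandpass", freqmin=1.0, freqmax=10.0)
            return stream

        if __name__ == "__main__":
            st = read()                        # no argument: ObsPy's example seismogram
            processed = preprocess(st)
            for trace in processed:
                print(trace.id, "samples:", trace.stats.npts)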
PREFACE: 16th International workshop on Advanced Computing and Analysis Techniques in physics research (ACAT2014)

    NASA Astrophysics Data System (ADS)

    Fiala, L.; Lokajicek, M.; Tumova, N.

    2015-05-01

    This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014); this year the motto was "bridging disciplines". The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research; Data Analysis - Algorithms and Tools; and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciation goes to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work on making sure the contributions are published in the proceedings. Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret, who worked with the local contacts and made this conference possible, as well as to the program coordinator Federico Carminati and the conference chair Denis Perret-Gallix for their global supervision. Further information on ACAT 2014 can be found at http://www.particle.cz/acat2014

Integrating Micro-computers with a Centralized DBMS: ORACLE, SEED AND INGRES

    NASA Technical Reports Server (NTRS)

    Hoerger, J.

    1984-01-01

    Users of ADABAS, a relational-like data base management system with its data base programming language (NATURAL), are acquiring microcomputers with hopes of solving their individual word processing, office automation, decision support, and simple data processing problems. As processor speeds, memory sizes, and disk storage capacities increase, individual departments begin to maintain "their own" data base on "their own" micro-computer. This situation can adversely affect several of the primary goals set for implementing a centralized DBMS. In order to avoid this potential problem, these micro-computers must be integrated with the centralized DBMS. An easy to use and flexible means for transferring logical data base files between the central data base machine and micro-computers must be provided. Some of the problems encountered in an effort to accomplish this integration and possible solutions are discussed.

Hand-held computer operating system program for collection of resident experience data

    PubMed

    Malan, T K; Haffner, W H; Armstrong, A Y; Satin, A J

    2000-11-01

    To describe a system for recording resident experience involving hand-held computers with the Palm Operating System (3 Com, Inc., Santa Clara, CA).
    Hand-held personal computers (PCs) are popular, easy to use, inexpensive, portable, and can share data with other operating systems. Residents in our program carry individual hand-held database computers to record Residency Review Committee (RRC) reportable patient encounters. Each resident's data is transferred to a single central relational database compatible with Microsoft Access (Microsoft Corporation, Redmond, WA). Patient data entry and subsequent transfer to a central database are accomplished with commercially available software that requires minimal computer expertise to implement and maintain. The central database can then be used for statistical analysis or to create required RRC resident experience reports. As a result, the data collection and transfer process takes less time for residents and program director alike than paper-based or central computer-based systems. The system of collecting resident encounter data using hand-held computers with the Palm Operating System is easy to use, relatively inexpensive, accurate, and secure. The user-friendly system provides prompt, complete, and accurate data, enhancing the education of residents while facilitating the job of the program director.
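    A minimal sketch of the central-merge step described above: per-resident encounter records collected on a device are appended into one relational table. Here sqlite3 stands in for the Access-compatible database used in the report, and the table and field names are hypothetical.

        # Sketch of the central-merge step: append per-resident encounter records into
        # one relational table. sqlite3 stands in for the Access-compatible database,
        # and the field names are hypothetical.
        import sqlite3

        def merge_records(db_path: str, resident: str,
                          encounters: list[tuple[str, str]]) -> None:
            """Append (date, procedure) encounter rows for one resident."""
            with sqlite3.connect(db_path) as conn:
                conn.execute(
                    "CREATE TABLE IF NOT EXISTS encounters "
                    "(resident TEXT, encounter_date TEXT, procedure_name TEXT)"
                )
                conn.executemany(
                    "INSERT INTO encounters VALUES (?, ?, ?)",
                    [(resident, date, proc) for date, proc in encounters],
                )

        if __name__ == "__main__":
            merge_records("central.db", "resident_01",
                          [("2000-10-03", "vaginal delivery"),
                           ("2000-10-05", "cesarean section")])
            with sqlite3.connect("central.db") as conn:
                print(conn.execute("SELECT COUNT(*) FROM encounters").fetchone()[0], "rows")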
The nature of the (visualization) game: Challenges and opportunities from computational geophysics

    NASA Astrophysics Data System (ADS)

    Kellogg, L. H.

    2016-12-01

    As the geosciences enter the era of big data, modeling and visualization become increasingly vital tools for discovery, understanding, education, and communication. Here, we focus on modeling and visualization of the structure and dynamics of the Earth's surface and interior. The past decade has seen accelerated data acquisition, including higher resolution imaging and modeling of Earth's deep interior, complex models of geodynamics, and high resolution topographic imaging of the changing surface, with an associated acceleration of computational modeling through better scientific software, increased computing capability, and the use of innovative methods of scientific visualization. The role of modeling is to describe a system, answer scientific questions, and test hypotheses; the term "model" encompasses mathematical models, computational models, physical models, conceptual models, statistical models, and visual models of a structure or process. These different uses of the term require thoughtful communication to avoid confusion. Scientific visualization is integral to every aspect of modeling. Not merely a means of communicating results, the best uses of visualization enable scientists to interact with their data, revealing the characteristics of the data and models to enable better interpretation and inform the direction of future investigation. Innovative immersive technologies like virtual reality, augmented reality, and remote collaboration techniques are being adapted more widely and are a magnet for students. Time-varying or transient phenomena are especially challenging to model and to visualize; researchers and students may need to investigate the role of initial conditions in driving phenomena, while nonlinearities in the governing equations of many Earth systems make the computations and resulting visualization especially challenging. Training students how to use, design, build, and interpret scientific modeling and visualization tools prepares them to better understand the nature of complex, multiscale geoscience data.

Large-scale extraction of brain connectivity from the neuroscientific literature

    PubMed Central

    Richardet, Renaud; Chappelier, Jean-Cédric; Telefont, Martin; Hill, Sean

    2015-01-01

    Motivation: In neuroscience, as in many other scientific domains, the primary form of knowledge dissemination is through published articles. One challenge for modern neuroinformatics is finding methods to make the knowledge from the tremendous backlog of publications accessible for search, analysis and the integration of such data into computational models. A key example of this is metascale brain connectivity, where results are not reported in a normalized repository. Instead, these experimental results are published in natural language, scattered among individual scientific publications. This lack of normalization and centralization hinders the large-scale integration of brain connectivity results. In this article, we present text-mining models to extract and aggregate brain connectivity results from 13.2 million PubMed abstracts and 630,216 full-text publications related to neuroscience. The brain regions are identified with three different named entity recognizers (NERs) and then normalized against two atlases: the Allen Brain Atlas (ABA) and the atlas from the Brain Architecture Management System (BAMS). We then use three different extractors to assess inter-region connectivity. Results: NERs and connectivity extractors are evaluated against a manually annotated corpus. The complete in litero extraction models are also evaluated against in vivo connectivity data from ABA, with an estimated precision of 78%. The resulting database contains over 4 million brain region mentions and over 100,000 (ABA) and 122,000 (BAMS) potential brain region connections. This database drastically accelerates connectivity literature review by providing a centralized repository of connectivity data to neuroscientists. Availability and implementation: The resulting models are publicly available at github.com/BlueBrain/bluima. Contact: renaud.richardet@epfl.ch Supplementary information: Supplementary data are available at Bioinformatics online. PMID: 25609795
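    As a toy illustration of the aggregation idea (not the published pipeline, which uses trained NERs, atlas normalization and dedicated relation extractors), the sketch below matches brain-region names from a small hypothetical lexicon in sentences and counts region pairs as candidate connections.

        # Toy sketch of the aggregation idea: count sentence-level co-mentions of
        # brain regions as candidate connections. The mini-lexicon is hypothetical;
        # the real system normalizes regions against the ABA and BAMS atlases.
        from collections import Counter
        from itertools import combinations

        REGIONS = {"hippocampus", "amygdala", "thalamus", "prefrontal cortex"}

        def candidate_connections(sentences: list[str]) -> Counter:
            pairs = Counter()
            for sentence in sentences:
                text = sentence.lower()
                found = sorted(r for r in REGIONS if r in text)
                for a, b in combinations(found, 2):   # unordered region pairs
                    pairs[(a, b)] += 1
            return pairs

        if __name__ == "__main__":
            corpus = [
                "Projections from the amygdala to the prefrontal cortex were labeled.",
                "The hippocampus receives input from the thalamus and the amygdala.",
            ]
            for (a, b), n in candidate_connections(corpus).most_common():
                print(f"{a} -- {b}: {n} co-mention(s)")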
Scaling to diversity: The DERECHOS distributed infrastructure for analyzing and sharing data

    NASA Astrophysics Data System (ADS)

    Rilee, M. L.; Kuo, K. S.; Clune, T.; Oloso, A.; Brown, P. G.

    2016-12-01

    Integrating Earth Science data from diverse sources such as satellite imagery and simulation output can be expensive and time-consuming, limiting scientific inquiry and the quality of our analyses. Reducing these costs will improve innovation and quality in science. The current Earth Science data infrastructure focuses on downloading data based on requests formed from the search and analysis of associated metadata. And while the data products provided by archives may use the best available data sharing technologies, scientist end-users generally do not have such resources (including staff) available to them. Furthermore, only once an end-user has received the data from multiple diverse sources and has integrated them can the actual analysis and synthesis begin. The cost of getting from idea to where synthesis can start dramatically slows progress. In this presentation we discuss a distributed computational and data storage framework that eliminates much of the aforementioned cost. The SciDB distributed array database is central, as it is optimized for scientific computing involving very large arrays, performing better than less specialized frameworks like Spark. Adding spatiotemporal functions to SciDB creates a powerful platform for analyzing and integrating massive, distributed datasets. SciDB allows Big Earth Data analysis to be performed "in place" without the need for expensive downloads and end-user resources. Spatiotemporal indexing technologies such as the hierarchical triangular mesh enable the compute and storage affinity needed to efficiently perform co-located and conditional analyses, minimizing data transfers. These technologies automate the integration of diverse data sources using the framework, a critical step beyond current metadata search and analysis. Instead of downloading data into their idiosyncratic local environments, end-users can generate and share data products integrated from diverse multiple sources using a common shared environment, turning distributed active archive centers (DAACs) from warehouses into distributed active analysis centers.

The application of cloud computing to scientific workflows: a study of cost and performance

    PubMed

    Berriman, G Bruce; Deelman, Ewa; Juve, Gideon; Rynge, Mats; Vöckler, Jens-S

    2013-01-28

    The current model of transferring data from data centres to desktops for analysis will soon be rendered impractical by the accelerating growth in the volume of science datasets. Processing will instead often take place on high-performance servers co-located with data.
    Evaluations of how new technologies such as cloud computing would support such a new distributed computing model are urgently needed. Cloud computing is a new way of purchasing computing and storage resources on demand through virtualization technologies. We report here the results of investigations of the applicability of commercial cloud computing to scientific computing, with an emphasis on astronomy, including investigations of what types of applications can be run cheaply and efficiently on the cloud, and an example of an application well suited to the cloud: processing a large dataset to create a new science product.
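    A back-of-envelope sketch of the kind of cost/performance comparison such a study reports; the hourly prices, node counts, runtimes and egress rate below are made-up placeholders, not figures from the paper.

        # Back-of-envelope sketch of a cloud cost comparison for one workflow run.
        # All prices, node counts and runtimes are made-up placeholders.
        def run_cost(nodes: int, hours_per_node: float, price_per_hour: float,
                     data_out_gb: float = 0.0, egress_per_gb: float = 0.09) -> float:
            """Total cost of one workflow execution: compute plus data egress."""
            compute = nodes * hours_per_node * price_per_hour
            egress = data_out_gb * egress_per_gb
            return compute + egress

        if __name__ == "__main__":
            scenarios = {
                "small VMs, longer runtime": run_cost(nodes=16, hours_per_node=6.0,
                                                      price_per_hour=0.10, data_out_gb=50),
                "large VMs, shorter runtime": run_cost(nodes=4, hours_per_node=3.0,
                                                       price_per_hour=0.60, data_out_gb=50),
            }
            for name, cost in scenarios.items():
                print(f"{name}: ${cost:.2f} per run")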
File-System Workload on a Scientific Multiprocessor

    NASA Technical Reports Server (NTRS)

    Kotz, David; Nieuwejaar, Nils

    1995-01-01

    Many scientific applications have intense computational and I/O requirements. Although multiprocessors have permitted astounding increases in computational performance, the formidable I/O needs of these applications cannot be met by current multiprocessors and their I/O subsystems. To prevent I/O subsystems from forever bottlenecking multiprocessors and limiting the range of feasible applications, new I/O subsystems must be designed. The successful design of computer systems (both hardware and software) depends on a thorough understanding of their intended use. A system designer optimizes the policies and mechanisms for the cases expected to be most common in the user's workload. In the case of multiprocessor file systems, however, designers have been forced to build file systems based only on speculation about how they would be used, extrapolating from file-system characterizations of general-purpose workloads on uniprocessor and distributed systems or scientific workloads on vector supercomputers (see sidebar on related work). To help these system designers, in June 1993 we began the Charisma Project, so named because the project sought to characterize I/O in scientific multiprocessor applications from a variety of production parallel computing platforms and sites. The Charisma project is unique in recording individual read and write requests in live, multiprogramming, parallel workloads (rather than from selected or nonparallel applications). In this article, we present the first results from the project: a characterization of the file-system workload on an iPSC/860 multiprocessor running production, parallel scientific applications at NASA's Ames Research Center.

GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    NASA Astrophysics Data System (ADS)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

    Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for storing, computing and analyzing data. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark, a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user hosting cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools that address other domains with spatial properties. We tested the performance of the platform based on taxi trajectory analysis. Results suggested that GISpark achieves excellent run-time performance in spatiotemporal big data applications.
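    A hedged PySpark sketch of the kind of spatiotemporal aggregation described for the taxi-trajectory test (not GISpark itself): GPS points are binned into 0.01-degree grid cells and counted per cell. The input path and column names are hypothetical.

        # Hedged PySpark sketch (not GISpark): bin taxi GPS points into 0.01-degree
        # grid cells and count points per cell. Path and column names are hypothetical.
        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("taxi-grid-counts").getOrCreate()

        points = spark.read.csv("hdfs:///data/taxi_points.csv",
                                header=True, inferSchema=True)

        counts = (
            points
            .withColumn("cell_x", F.floor(F.col("lon") / 0.01))   # 0.01-degree grid
            .withColumn("cell_y", F.floor(F.col("lat") / 0.01))
            .groupBy("cell_x", "cell_y")
            .count()
            .orderBy(F.desc("count"))
        )

        counts.show(10)    # the ten busiest grid cells
        spark.stop()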
    However,…

The Ever-Est Virtual Research Environment Infrastructure for Marine - the Sea Monitoring Virtual Research Community (VRC) Use Case

    NASA Astrophysics Data System (ADS)

    Foglini, F.

    2016-12-01

    The EVER-EST project aims to develop a generic Virtual Research Environment (VRE) tailored to the needs of, and validated by, the Earth Science domain. To achieve this, the EVER-EST VRE provides earth scientists with the means to seamlessly manage both the data involved in their computationally intensive disciplines and the scientific methods applied in their observations and modellings, which lead to the specific results that need to be attributable, validated and shared within the community, e.g. in the form of scholarly communications. Central to this approach is the concept of Research Objects (ROs) as semantically rich aggregations of resources that bring together data, methods and people in scientific investigations. ROs enable the creation of digital artifacts that can encapsulate scientific knowledge and provide a mechanism for sharing and discovering reusable research and scientific assets as first-class citizens. The EVER-EST VRE is the first RO-centric native infrastructure leveraging the notion of ROs and their application in observational rather than experimental disciplines, and particularly in Earth Science. The Institute of Marine Science (ISMAR-CNR) is a scientific partner of the EVER-EST project, providing useful and applicable contributions to the identification and definition of variables indicated by the European Commission in the Marine Strategy Framework Directive (MSFD) to achieve Good Environmental Status (GES). The VRC aims to deliver practical methods, procedures and protocols to support coherent and widely accepted interpretation of the MSFD. The use cases deal with (1) the Posidonia meadows along the Apulian coast, (2) the deep-sea corals along the Apulian continental slope and (3) jellyfish abundance in Italian waters. The Sea Monitoring VRC created specific ROs for assessing deep-sea coral habitat suitability, for Posidonia meadow occurrences, and for detecting jellyfish density along the Italian coast. The VRC also developed specific ROs for bathymetric data, implementing a data preservation plan and a specific vocabulary for metadata.

EVER-EST: a virtual research environment for Earth Sciences

    NASA Astrophysics Data System (ADS)

    Marelli, Fulvio; Albani, Mirko; Glaves, Helen

    2016-04-01

    There is an increasing requirement for researchers to work collaboratively using common resources whilst being geographically dispersed.
    By creating a virtual research environment (VRE) using a service oriented architecture (SOA) tailored to the needs of Earth Science (ES) communities, the EVER-EST project will provide a range of both generic and domain-specific data management services to support a dynamic approach to collaborative research. EVER-EST will provide the means to overcome existing barriers to the sharing of Earth Science data and information, allowing research teams to discover, access, share and process heterogeneous data, algorithms, results and experiences within and across their communities, including domains beyond Earth Science. Researchers will be able to seamlessly manage both the data involved in their computationally intensive disciplines and the scientific methods applied in their observations and modelling, which lead to the specific results that need to be attributable, validated and shared both within the community and more widely, e.g. in the form of scholarly communications. Central to the EVER-EST approach is the concept of the Research Object (RO), which provides a semantically rich mechanism to aggregate related resources about a scientific investigation so that they can be shared together using a single unique identifier. Although several e-laboratories are incorporating the research object concept in their infrastructure, the EVER-EST VRE will be the first infrastructure to leverage the concept of Research Objects and their application in observational rather than experimental disciplines. Development of the EVER-EST VRE will leverage the results of several previous projects which have produced state-of-the-art technologies for scientific data management and curation, as well as those which have developed models, techniques and tools for the preservation of scientific methods and their implementation in computational forms such as scientific workflows. The EVER-EST data processing infrastructure will be based on a cloud computing approach, in which new applications can be integrated using "virtual machines" that have their own specifications (disk size, processor speed, operating system, etc.) and run on shared private (physical deployment over local hardware) or commercial cloud infrastructures. The EVER-EST e-infrastructure will be validated by four virtual research communities (VRCs) covering different multidisciplinary Earth Science domains, including ocean monitoring, natural hazards, land monitoring and risk management (volcanoes and seismicity). Each VRC will use the virtual research environment according to its own specific requirements for data, software, best practice and community engagement. This user-centric approach will allow an assessment to be made of the capability of the proposed solution to satisfy the heterogeneous needs of a variety of Earth Science communities for more effective collaboration, and higher efficiency and creativity in research. EVER-EST is funded by the European Commission's H2020 programme for three years starting in October 2015. The project is led by the European Space Agency (ESA) and involves some of the major European Earth Science data providers/users, including NERC, DLR, INGV, CNR and SatCEN.
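    As a loose illustration of what an RO-style aggregation might look like, the sketch below builds a simple JSON manifest: one identifier bundling data references, a workflow and a provenance link. The field names are made up for illustration and do not follow the formal Research Object specification used by EVER-EST.

        # Illustrative sketch of a Research-Object-style manifest: one identifier that
        # aggregates data, method and provenance links. Field names are made up and do
        # not follow the formal RO specification.
        import json
        import uuid
        from datetime import datetime, timezone

        def make_research_object(title: str, datasets: list[str], workflow: str) -> dict:
            return {
                "id": f"urn:uuid:{uuid.uuid4()}",           # single citable identifier
                "title": title,
                "created": datetime.now(timezone.utc).isoformat(),
                "aggregates": {
                    "datasets": datasets,                    # input data references
                    "workflow": workflow,                    # the computational method
                    "provenance": "prov/run-trace.jsonld",   # hypothetical PROV trace
                },
            }

        if __name__ == "__main__":
            ro = make_research_object("Sea-surface anomaly experiment",
                                      ["data/sst_2015.nc"], "workflows/anomaly.cwl")
            print(json.dumps(ro, indent=2))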
Onward to Petaflops Computing

    NASA Technical Reports Server (NTRS)

    Bailey, David H.; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    With programs such as the US High Performance Computing and Communications Program (HPCCP), the attention of scientists and engineers worldwide has been focused on the potential of very high performance scientific computing, namely systems that are hundreds or thousands of times more powerful than those typically available in desktop systems at any given point in time. Extending the frontiers of computing in this manner has resulted in remarkable advances, both in computing technology itself and also in the various scientific and engineering disciplines that utilize these systems. Within a month or two, a sustained rate of 1 Tflop/s (also written 1 teraflops, or 10^12 floating-point operations per second) is likely to be achieved by the 'ASCI Red' system at Sandia National Laboratory in New Mexico. With this objective in sight, it is reasonable to ask what lies ahead for high-end computing.

A future for systems and computational neuroscience in France?

    PubMed

    Faugeras, Olivier; Frégnac, Yves; Samuelides, Manuel

    2007-01-01

    This special issue of the Journal of Physiology, Paris, is an outcome of NeuroComp'06, the first French conference in Computational Neuroscience. The preparation for this conference, held at Pont-à-Mousson in October 2006, was accompanied by a survey which has resulted in an up-to-date inventory of human resources and labs in France concerned with this emerging new field of research (see team directory at http://neurocomp.risc.cnrs.fr/). This thematic JPP issue gathers some of the key scientific presentations made on the occasion of this first interdisciplinary meeting, which should soon become recognized as a yearly national conference representative of a new scientific community. The present introductory paper presents the general scientific context of the conference and reviews some of the historical and conceptual foundations of Systems and Computational Neuroscience in France.

Parallel processing for scientific computations

    NASA Technical Reports Server (NTRS)

    Alkhatib, Hasan S.

    1991-01-01

    The main contribution of the effort over the last two years is the introduction of the MOPPS system.
    After an extensive literature search, we introduced the system, which is described next. MOPPS employs a new solution to the problem of managing programs which solve scientific and engineering applications in a distributed processing environment. Autonomous computers cooperate efficiently in solving large scientific problems with this solution. MOPPS has the advantage of not assuming the presence of any particular network topology or configuration, computer architecture, or operating system. It imposes little overhead on network and processor resources while efficiently managing programs concurrently. The core of MOPPS is an intelligent program manager that builds a knowledge base of the execution performance of the parallel programs it is managing under various conditions. The manager applies this knowledge to improve the performance of future runs. The program manager learns from experience.

Co-Authorship and Bibliographic Coupling Network Effects on Citations

    PubMed Central

    Biscaro, Claudio; Giupponi, Carlo

    2014-01-01

    This paper analyzes the effects of the co-authorship and bibliographic coupling networks on the citations received by scientific articles. It expands prior research that limited its focus to the position of co-authors, and incorporates the effects of the use of knowledge sources within articles: references. By creating a network on the basis of shared references, we propose a way to understand whether an article bridges among extant strands of literature and to infer the size of its research community and its embeddedness. Thus, we map onto the article, our unit of analysis, the metrics of authors' position in the co-authorship network and of the use of knowledge on which the scientific article is grounded. Specifically, we adopt centrality measures (degree, betweenness, and closeness centrality) in the co-authorship network, and degree, betweenness centrality and clustering coefficient in the bibliographic coupling network, and show their influence on the citations received in the first two years after the year of publication. Findings show that authors' degree positively impacts citations. Closeness centrality also has a positive effect, manifested only when the giant component is relevant. An author's betweenness centrality has instead a negative effect that persists until the giant component (the largest component of the network, in which all nodes can be linked by a path) is relevant. Moreover, articles that draw on fragmented strands of literature tend to be cited more, whereas the size of the scientific research community and the embeddedness of the article in a cohesive cluster of literature have no effect. PMID: 24911416
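    The network metrics named in the abstract above can be illustrated on a toy co-authorship graph with networkx; the node names and edges below are invented for the example and have nothing to do with the study's data.

        # Sketch of the network metrics used in the study, computed on a toy
        # co-authorship graph; the nodes and edges are made up for illustration.
        import networkx as nx

        G = nx.Graph()
        G.add_edges_from([
            ("A", "B"), ("A", "C"), ("B", "C"),   # a closed triad of co-authors
            ("C", "D"), ("D", "E"),               # a chain reaching the rest of the graph
        ])

        degree = nx.degree_centrality(G)            # share of possible co-authors reached
        betweenness = nx.betweenness_centrality(G)  # brokerage between parts of the graph
        closeness = nx.closeness_centrality(G)      # inverse average distance to all others
        clustering = nx.clustering(G)               # cohesion of each node's neighbourhood

        for node in G.nodes:
            print(f"{node}: degree={degree[node]:.2f} betweenness={betweenness[node]:.2f} "
                  f"closeness={closeness[node]:.2f} clustering={clustering[node]:.2f}")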
SOIL RADON POTENTIAL MAPPING OF TWELVE COUNTIES IN NORTH-CENTRAL FLORIDA

    EPA Science Inventory

    The report describes the approach, methods, and detailed data used to prepare soil radon potential maps of 12 counties in North-Central Florida. The maps were developed under the Florida Radon Research Program to provide a scientific basis for implementing radon-protective buildin...

USRA/RIACS

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1992-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under a cooperative agreement with NASA.
    The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post-doctoral program, and a student visitor program. Not only does this provide appropriate expertise, but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and the ARC technical monitor. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) learning systems; (4) high performance networks and technology; and (5) graphics, visualization, and virtual environments. In the past year, parallel compiler techniques and adaptive numerical methods for flows in complicated geometries were identified as important problems to investigate for ARC's involvement in the Computational Grand Challenges of the next decade. We concluded a summer student visitors program during this six-month period. We had six visiting graduate students who worked on projects over the summer and presented seminars on their work at the conclusion of their visits. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period July 1, 1992 through December 31, 1992 is provided.

Computer Assisted Instructional Design for Computer-Based Instruction. Final Report. Working Papers.

    ERIC Educational Resources Information Center

    Russell, Daniel M.; Pirolli, Peter

    Recent advances in artificial intelligence and the cognitive sciences have made it possible to develop successful intelligent computer-aided instructional systems for technical and scientific training. In addition, computer-aided design (CAD) environments that support the rapid development of such computer-based instruction have also been recently…

An Imagination Effect in Learning from Scientific Text

    ERIC Educational Resources Information Center

    Leopold, Claudia; Mayer, Richard E.

    2015-01-01

    Asking students to imagine the spatial arrangement of the elements in a scientific text constitutes a learning strategy intended to foster deep processing of the instructional material.
Two experiments investigated the effects of mental imagery prompts on learning from scientific text. Students read a computer-based text on the human respiratory…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1222712','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1222712"><span>DOE Advanced Scientific Computing Advisory Committee (ASCAC) Report: Exascale Computing Initiative Review</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Reed, Daniel; Berzins, Martin; Pennington, Robert</p> <p></p> <p>On November 19, 2014, the Advanced Scientific Computing Advisory Committee (ASCAC) was charged with reviewing the Department of Energy’s conceptual design for the Exascale Computing Initiative (ECI). In particular, this included assessing whether there are significant gaps in the ECI plan or areas that need to be given priority or extra management attention. Given the breadth and depth of previous reviews of the technical challenges inherent in exascale system design and deployment, the subcommittee focused its assessment on organizational and management issues, considering technical issues only as they informed organizational or management priorities and structures. This report presents the observationsmore » and recommendations of the subcommittee.« less</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4934704','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4934704"><span>Secure Scientific Applications Scheduling Technique for Cloud Computing Environment Using Global League Championship Algorithm</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Abdulhamid, Shafi’i Muhammad; Abd Latiff, Muhammad Shafie; Abdul-Salaam, Gaddafi; Hussain Madni, Syed Hamid</p> <p>2016-01-01</p> <p>Cloud computing system is a huge cluster of interconnected servers residing in a datacenter and dynamically provisioned to clients on-demand via a front-end interface. Scientific applications scheduling in the cloud computing environment is identified as NP-hard problem due to the dynamic nature of heterogeneous resources. Recently, a number of metaheuristics optimization schemes have been applied to address the challenges of applications scheduling in the cloud system, without much emphasis on the issue of secure global scheduling. In this paper, scientific applications scheduling techniques using the Global League Championship Algorithm (GBLCA) optimization technique is first presented for global task scheduling in the cloud environment. The experiment is carried out using CloudSim simulator. The experimental results show that, the proposed GBLCA technique produced remarkable performance improvement rate on the makespan that ranges between 14.44% to 46.41%. It also shows significant reduction in the time taken to securely schedule applications as parametrically measured in terms of the response time. 
In view of the experimental results, the proposed technique provides better-quality scheduling solution that is suitable for scientific applications task execution in the Cloud Computing environment than the MinMin, MaxMin, Genetic Algorithm (GA) and Ant Colony Optimization (ACO) scheduling techniques. PMID:27384239</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/27384239','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/27384239"><span>Secure Scientific Applications Scheduling Technique for Cloud Computing Environment Using Global League Championship Algorithm.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Abdulhamid, Shafi'i Muhammad; Abd Latiff, Muhammad Shafie; Abdul-Salaam, Gaddafi; Hussain Madni, Syed Hamid</p> <p>2016-01-01</p> <p>Cloud computing system is a huge cluster of interconnected servers residing in a datacenter and dynamically provisioned to clients on-demand via a front-end interface. Scientific applications scheduling in the cloud computing environment is identified as NP-hard problem due to the dynamic nature of heterogeneous resources. Recently, a number of metaheuristics optimization schemes have been applied to address the challenges of applications scheduling in the cloud system, without much emphasis on the issue of secure global scheduling. In this paper, scientific applications scheduling techniques using the Global League Championship Algorithm (GBLCA) optimization technique is first presented for global task scheduling in the cloud environment. The experiment is carried out using CloudSim simulator. The experimental results show that, the proposed GBLCA technique produced remarkable performance improvement rate on the makespan that ranges between 14.44% to 46.41%. It also shows significant reduction in the time taken to securely schedule applications as parametrically measured in terms of the response time. In view of the experimental results, the proposed technique provides better-quality scheduling solution that is suitable for scientific applications task execution in the Cloud Computing environment than the MinMin, MaxMin, Genetic Algorithm (GA) and Ant Colony Optimization (ACO) scheduling techniques.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014AGUFMIN21B3713L','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014AGUFMIN21B3713L"><span>Enabling Data Intensive Science through Service Oriented Science: Virtual Laboratories and Science Gateways</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Lescinsky, D. T.; Wyborn, L. A.; Evans, B. J. K.; Allen, C.; Fraser, R.; Rankine, T.</p> <p>2014-12-01</p> <p>We present collaborative work on a generic, modular infrastructure for virtual laboratories (VLs, similar to science gateways) that combine online access to data, scientific code, and computing resources as services that support multiple data intensive scientific computing needs across a wide range of science disciplines. 
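    The makespan figures quoted above compare schedulers by the completion time of the busiest virtual machine. The following is only a minimal sketch of that metric, not code from the cited study; the task lengths, VM speeds, and assignment below are invented for illustration.

        import numpy as np

        def makespan(task_lengths, vm_speeds, assignment):
            """Finish time of the busiest VM for a given task-to-VM mapping."""
            finish = np.zeros(len(vm_speeds))
            for length, vm in zip(task_lengths, assignment):
                finish[vm] += length / vm_speeds[vm]   # execution time of this task on its VM
            return finish.max()

        # A scheduler (GBLCA, GA, ACO, MinMin, ...) searches over `assignment` to minimize this value.
        tasks = np.array([400.0, 250.0, 900.0, 120.0])   # hypothetical task lengths (e.g., MI)
        vms = np.array([1000.0, 500.0])                  # hypothetical VM speeds (e.g., MIPS)
        print(makespan(tasks, vms, assignment=[0, 1, 0, 1]))

    Response time can be accumulated per task in the same loop; the quoted percentages compare such totals across schedulers.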
  406. Secure Scientific Applications Scheduling Technique for Cloud Computing Environment Using Global League Championship Algorithm.

    PubMed

    Abdulhamid, Shafi'i Muhammad; Abd Latiff, Muhammad Shafie; Abdul-Salaam, Gaddafi; Hussain Madni, Syed Hamid

    2016-01-01

    A cloud computing system is a huge cluster of interconnected servers residing in a datacenter and dynamically provisioned to clients on demand via a front-end interface. Scientific applications scheduling in the cloud computing environment is identified as an NP-hard problem due to the dynamic nature of heterogeneous resources. Recently, a number of metaheuristic optimization schemes have been applied to address the challenges of applications scheduling in the cloud system, without much emphasis on the issue of secure global scheduling. In this paper, a scientific applications scheduling technique using the Global League Championship Algorithm (GBLCA) is first presented for global task scheduling in the cloud environment. The experiment is carried out using the CloudSim simulator. The experimental results show that the proposed GBLCA technique produces a remarkable improvement in makespan, ranging between 14.44% and 46.41%. It also shows a significant reduction in the time taken to securely schedule applications, as parametrically measured in terms of the response time. In view of the experimental results, the proposed technique provides a better-quality scheduling solution, suitable for scientific applications task execution in the cloud computing environment, than the MinMin, MaxMin, Genetic Algorithm (GA) and Ant Colony Optimization (ACO) scheduling techniques.

  407. Enabling Data Intensive Science through Service Oriented Science: Virtual Laboratories and Science Gateways

    NASA Astrophysics Data System (ADS)

    Lescinsky, D. T.; Wyborn, L. A.; Evans, B. J. K.; Allen, C.; Fraser, R.; Rankine, T.

    2014-12-01

    We present collaborative work on a generic, modular infrastructure for virtual laboratories (VLs, similar to science gateways) that combine online access to data, scientific code, and computing resources as services that support multiple data intensive scientific computing needs across a wide range of science disciplines. We are leveraging access to 10+ PB of earth science data on Lustre filesystems at Australia's National Computational Infrastructure (NCI) Research Data Storage Infrastructure (RDSI) node, co-located with NCI's 1.2 PFlop Raijin supercomputer and a 3000 CPU core research cloud. The development, maintenance and sustainability of VLs are best accomplished through modularisation and standardisation of interfaces between components. Our approach has been to break up tightly-coupled, specialised application packages into modules, with identified best techniques and algorithms repackaged either as data services or as scientific tools that are accessible across domains. The data services can be used to manipulate, visualise and transform multiple data types, whilst the scientific tools can be used in concert with multiple scientific codes. We are currently designing a scalable generic infrastructure that will handle scientific code as modularised services and thereby enable the rapid and easy deployment of new codes or versions of codes. The goal is to build open source libraries and collections of scientific tools, scripts and modelling codes that can be combined in specially designed deployments. Additional services in development include provenance, publication of results, monitoring, and workflow tools. The generic VL infrastructure will be hosted at NCI, but can access alternative computing infrastructures (i.e., public/private cloud, HPC). The Virtual Geophysics Laboratory (VGL) was developed as a pilot project to demonstrate the underlying technology. This base is now being redesigned and generalised to develop a Virtual Hazards Impact and Risk Laboratory (VHIRL); any enhancements and new capabilities will be incorporated into a generic VL infrastructure. At the same time, we are scoping seven new VLs and, in the process, identifying other common components to prioritise and focus development.

  408. From the desktop to the grid: scalable bioinformatics via workflow conversion.

    PubMed

    de la Garza, Luis; Veit, Johannes; Szolek, Andras; Röttig, Marc; Aiche, Stephan; Gesing, Sandra; Reinert, Knut; Kohlbacher, Oliver

    2016-03-12

    Reproducibility is one of the tenets of the scientific method. Scientific experiments often comprise complex data flows, selection of adequate parameters, and analysis and visualization of intermediate and end results. Breaking down the complexity of such experiments into the joint collaboration of small, repeatable, well defined tasks, each with well defined inputs, parameters, and outputs, offers the immediate benefit of identifying bottlenecks and pinpointing sections which could benefit from parallelization, among others. Workflows rest upon the notion of splitting complex work into the joint effort of several manageable tasks. There are several engines that give users the ability to design and execute workflows. Each engine was created to address certain problems of a specific community, therefore each one has its advantages and shortcomings. Furthermore, not all features of all workflow engines are royalty-free, an aspect that could potentially drive away members of the scientific community. We have developed a set of tools that enables the scientific community to benefit from workflow interoperability. We developed a platform-free structured representation of parameters, inputs, and outputs of command-line tools in so-called Common Tool Descriptor documents. We have also overcome the shortcomings and combined the features of two royalty-free workflow engines with a substantial user community: the Konstanz Information Miner, an engine which we see as a formidable workflow editor, and the Grid and User Support Environment, a web-based framework able to interact with several high-performance computing resources. We have thus created a free and highly accessible way to design workflows on a desktop computer and execute them on high-performance computing resources. Our work will not only reduce time spent on designing scientific workflows, but also make executing workflows on remote high-performance computing resources more accessible to technically inexperienced users. We strongly believe that our efforts not only decrease the turnaround time to obtain scientific results but also have a positive impact on reproducibility, thus elevating the quality of obtained scientific results.
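    The "platform-free structured representation" of command-line tools described above is, in essence, a small data model of a tool's inputs, outputs, and parameters that any workflow engine can read. The sketch below only illustrates that idea; the real Common Tool Descriptor format is XML-based and richer, and every name, field, and the toy tool here are invented.

        from dataclasses import dataclass, field
        from typing import List, Optional

        @dataclass
        class ToolParameter:
            name: str
            kind: str                      # e.g. "int", "string", "input-file", "output-file"
            default: Optional[object] = None
            description: str = ""

        @dataclass
        class ToolDescriptor:
            """Engine-neutral description of a command-line tool (illustrative only)."""
            name: str
            executable: str
            version: str
            parameters: List[ToolParameter] = field(default_factory=list)

            def to_command_line(self, values: dict) -> List[str]:
                """Render a concrete invocation from parameter values."""
                args = [self.executable]
                for p in self.parameters:
                    args += [f"--{p.name}", str(values.get(p.name, p.default))]
                return args

        aligner = ToolDescriptor(
            name="toy-aligner", executable="toy-aligner", version="0.1",
            parameters=[ToolParameter("reads", "input-file"),
                        ToolParameter("out", "output-file"),
                        ToolParameter("threads", "int", default=4)])
        print(aligner.to_command_line({"reads": "sample.fastq", "out": "hits.sam"}))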
  409. Neural control of computer cursor velocity by decoding motor cortical spiking activity in humans with tetraplegia

    NASA Astrophysics Data System (ADS)

    Kim, Sung-Phil; Simeral, John D.; Hochberg, Leigh R.; Donoghue, John P.; Black, Michael J.

    2008-12-01

    Computer-mediated connections between human motor cortical neurons and assistive devices promise to improve or restore lost function in people with paralysis. Recently, a pilot clinical study of an intracortical neural interface system demonstrated that a tetraplegic human was able to obtain continuous two-dimensional control of a computer cursor using neural activity recorded from his motor cortex. This control, however, was not sufficiently accurate for reliable use in many common computer control tasks. Here, we studied several central design choices for such a system, including the kinematic representation for cursor movement, the decoding method that translates neuronal ensemble spiking activity into a control signal, and the cursor control task used during training for optimizing the parameters of the decoding method. In two tetraplegic participants, we found that controlling a cursor's velocity resulted in more accurate closed-loop control than controlling its position directly, and that cursor velocity control was achieved more rapidly than position control. Control quality was further improved over conventional linear filters by using a probabilistic method, the Kalman filter, to decode human motor cortical activity. Performance assessment based on standard metrics used for the evaluation of a wide range of pointing devices demonstrated significantly improved cursor control with velocity rather than position decoding. Disclosure: JPD is the Chief Scientific Officer and a director of Cyberkinetics Neurotechnology Systems (CYKN); he holds stock and receives compensation. JDS has been a consultant for CYKN. LRH receives clinical trial support from CYKN.
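    The Kalman filter named in the preceding entry decodes a kinematic state (here, cursor velocity) from binned neural observations with the standard predict/update recursion. The minimal sketch below is not the study's decoder; the model matrices A, W, H, and Q are assumed to have been fit from training data and are placeholders.

        import numpy as np

        def kalman_decode(Z, A, W, H, Q, x0, P0):
            """Decode a state sequence (e.g., 2-D cursor velocity) from binned firing rates Z.

            A, W: state transition matrix and its noise covariance.
            H, Q: observation (tuning) matrix and its noise covariance.
            All four are assumed to have been estimated from training data.
            """
            x, P = x0.copy(), P0.copy()
            states = []
            for z in Z:                                   # one time bin at a time
                x = A @ x                                 # predict
                P = A @ P @ A.T + W
                S = H @ P @ H.T + Q                       # innovation covariance
                K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
                x = x + K @ (z - H @ x)                   # update with observed firing rates
                P = (np.eye(len(x)) - K @ H) @ P
                states.append(x.copy())
            return np.array(states)

    Decoded velocities would then be integrated over time to drive the cursor position on screen.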
  410. Educational NASA Computational and Scientific Studies (enCOMPASS)

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess

    2013-01-01

    Educational NASA Computational and Scientific Studies (enCOMPASS) is an educational project of NASA Goddard Space Flight Center aimed at bridging the gap between the computational objectives and needs of NASA's scientific research, missions, and projects, and academia's latest advances in applied mathematics and computer science. enCOMPASS achieves this goal via bidirectional collaboration and communication between NASA and academia. Using developed NASA Computational Case Studies in university computer science/engineering and applied mathematics classes is a way of addressing NASA's goals of contributing to the Science, Technology, Engineering, and Math (STEM) National Objective. The enCOMPASS Web site at http://encompass.gsfc.nasa.gov provides additional information. There are currently nine enCOMPASS case studies developed in areas of earth sciences, planetary sciences, and astrophysics. Some of these case studies have been published in AIP and IEEE's Computing in Science and Engineering magazines. A few university professors have used enCOMPASS case studies in their computational classes and contributed their findings to NASA scientists. In these case studies, after introducing the science area, the specific problem, and related NASA missions, students are first asked to solve a known problem using NASA data and past approaches, often published in a scientific or research paper. Then, after learning about the NASA application and related computational tools and approaches for solving the proposed problem, students are given a harder problem as a challenge for them to research and develop solutions for. This project provides a model for NASA scientists and engineers on one side, and university students, faculty, and researchers in computer science and applied mathematics on the other side, to learn from each other's areas of work, computational needs and solutions, and the latest advances in research and development. This innovation takes NASA science and engineering applications to computer science and applied mathematics university classes, and makes NASA objectives part of the university curricula. There is great potential for growth and return on investment of this program, to the point where every major university in the U.S. would use at least one of these case studies in one of their computational courses, and where every NASA scientist and engineer facing a computational challenge (without having the resources or expertise to solve it) would use enCOMPASS to formulate the problem as a case study, provide it to a university, and get back their solutions and ideas.

  411. Students' Development in Theory and Practice: The Doubtful Role of Research

    ERIC Educational Resources Information Center

    Egan, Kieran

    2005-01-01

    In this article, Kieran Egan contests the scientific foundations of Piaget's developmental theories and the scientific basis of much educational research. In so doing, he pushes researchers and practitioners alike to rethink the centrality of Piaget's tenets to teaching and learning. Egan traces the history of the developmental literature that…

  412. Vitalism and the Darwin Debate

    ERIC Educational Resources Information Center

    Henderson, James

    2012-01-01

    There are currently both scientific and public debates surrounding Darwinism. In the scientific debate, the details of evolution are in dispute, but not the central thesis of Darwin's theory; in the public debate, Darwinism itself is questioned. I concentrate on the public debate because of its direct impact on education in the United States. Some…

  413. Distance Education, Efficiency and Scientific Management: Some Doubts. ESTR Occasional Paper Number 12.

    ERIC Educational Resources Information Center

    Campion, Mick

    The theme of the 1983 Australian and South Pacific External Studies Association conference concerned developing efficient teaching-learning systems and efficient management systems. Such an emphasis on efficiency was symptomatic, in the United States' educational arena, of a commitment to the practices of scientific management. The central role accorded to…

  414. English Secondary Students' Thinking about the Status of Scientific Theories: Consistent, Comprehensive, Coherent and Extensively Evidenced Explanations of Aspects of the Natural World--Or Just "An Idea Someone Has"

    ERIC Educational Resources Information Center

    Taber, Keith S.; Billingsley, Berry; Riga, Fran; Newdick, Helen

    2015-01-01

    Teaching about the nature of science (NOS) is seen as a priority for science education in many national contexts. The present paper focuses on one central issue in learning about NOS: understanding the nature and status of scientific theories. A key challenge in teaching about NOS is to persuade students that scientific knowledge is generally…

  415. Architectural Aspects of Grid Computing and its Global Prospects for E-Science Community

    NASA Astrophysics Data System (ADS)

    Ahmad, Mushtaq

    2008-05-01

    The paper reviews the imminent architectural aspects of Grid Computing for the e-Science community, for scientific research and business/commercial collaboration beyond physical boundaries. Grid Computing provides all the needed facilities: hardware, software, communication interfaces, high speed internet, safe authentication and a secure environment for collaboration of research projects around the globe. It provides a highly fast compute engine for those scientific and engineering research projects and business/commercial applications which are heavily compute intensive and/or require humongous amounts of data. It also makes possible the use of very advanced methodologies, simulation models, expert systems and the treasure of knowledge available around the globe under the umbrella of knowledge sharing. Thus it makes possible one of the dreams of a global village for the benefit of the e-Science community across the globe.

  416. 2011 floods of the central United States

    USGS Publications Warehouse

    2013-01-01

    * Do floods contribute to the transport and fate of contaminants that affect human and ecosystem health? In an effort to help address these and other questions, USGS Professional Paper 1798 consists of independent but complementary chapters dealing with various scientific aspects of the 2011 floods in the central United States.

  417. Development of the Central Dogma Concept Inventory (CDCI) Assessment Tool

    PubMed Central

    Newman, Dina L.; Snyder, Christopher W.; Fisk, J. Nick; Wright, L. Kate

    2016-01-01

    Scientific teaching requires scientifically constructed, field-tested instruments to accurately evaluate student thinking and gauge teacher effectiveness. We have developed a 23-question, multiple select–format assessment of student understanding of the essential concepts of the central dogma of molecular biology that is appropriate for all levels of undergraduate biology. Questions for the Central Dogma Concept Inventory (CDCI) tool were developed and iteratively revised based on student language and review by experts. The ability of the CDCI to discriminate between levels of understanding of the central dogma is supported by field testing (N = 54), and large-scale beta testing (N = 1733). Performance on the assessment increased with experience in biology; scores covered a broad range and showed no ceiling effect, even with senior biology majors, and pre/posttesting of a single class focused on the central dogma showed significant improvement. The multiple-select format reduces the chances of correct answers by random guessing, allows students at different levels to exhibit the extent of their knowledge, and provides deeper insight into the complexity of student thinking on each theme. To date, the CDCI is the first tool dedicated to measuring student thinking about the central dogma of molecular biology, and version 5 is ready to use. PMID:27055775

  418. Equation-free and variable free modeling for complex/multiscale systems. Coarse-grained computation in science and engineering using fine-grained models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kevrekidis, Ioannis G.

    The work explored the linking of modern, developing machine learning techniques (manifold learning and in particular diffusion maps) with traditional PDE modeling, discretization, and scientific computation techniques via the equation-free methodology developed by the PI. The result (in addition to several PhD degrees, two of them by CSGF Fellows) was a sequence of strong developments, in part on the algorithmic side, linking data mining with scientific computing, and in part on applications, ranging from PDE discretizations to molecular dynamics and complex network dynamics.
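    Diffusion maps, named in the preceding entry, embed high-dimensional samples using the leading eigenvectors of a Markov matrix built from a Gaussian affinity kernel. The sketch below shows one common variant only; the cited project's actual formulation and normalization may differ, and the bandwidth `eps` is a free parameter.

        import numpy as np

        def diffusion_map(X, eps, n_coords=2):
            """Embed rows of X with a basic diffusion map (Gaussian kernel, row normalization)."""
            d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)   # pairwise squared distances
            K = np.exp(-d2 / eps)                                      # affinity kernel
            P = K / K.sum(axis=1, keepdims=True)                       # row-stochastic Markov matrix
            evals, evecs = np.linalg.eig(P)
            order = np.argsort(-evals.real)                            # leading eigenvalue is 1 (trivial)
            keep = order[1:n_coords + 1]                               # skip the constant eigenvector
            return evecs.real[:, keep] * evals.real[keep]              # diffusion coordinates

    The resulting coordinates serve as the "coarse variables" on which equation-free time-stepping can then operate.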
  419. Reply to comment by Añel on "Most computational hydrology is not reproducible, so is it really science?"

    NASA Astrophysics Data System (ADS)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-03-01

    In this article, we reply to a comment made on our previous commentary regarding reproducibility in computational hydrology. Software licensing and version control of code are important technical aspects of making code and workflows of scientific experiments open and reproducible. However, in our view, it is the cultural change that is the greatest challenge to overcome to achieve reproducible scientific research in computational hydrology. We believe that from changing the culture and attitude among hydrological scientists, details will evolve to cover more (technical) aspects over time.

  420. Evaluation of Emerging Energy-Efficient Heterogeneous Computing Platforms for Biomolecular and Cellular Simulation Workloads.

    PubMed

    Stone, John E; Hallock, Michael J; Phillips, James C; Peterson, Joseph R; Luthey-Schulten, Zaida; Schulten, Klaus

    2016-05-01

    Many of the continuing scientific advances achieved through computational biology are predicated on the availability of ongoing increases in computational power required for detailed simulation and analysis of cellular processes on biologically relevant timescales. A critical challenge facing the development of future exascale supercomputer systems is the development of new computing hardware and associated scientific applications that dramatically improve upon the energy efficiency of existing solutions, while providing increased simulation, analysis, and visualization performance. Mobile computing platforms have recently become powerful enough to support interactive molecular visualization tasks that were previously only possible on laptops and workstations, creating future opportunities for their convenient use for meetings, remote collaboration, and as head mounted displays for immersive stereoscopic viewing. We describe early experiences adapting several biomolecular simulation and analysis applications for emerging heterogeneous computing platforms that combine power-efficient system-on-chip multi-core CPUs with high-performance massively parallel GPUs. We present low-cost power monitoring instrumentation that provides sufficient temporal resolution to evaluate the power consumption of individual CPU algorithms and GPU kernels. We compare the performance and energy efficiency of scientific applications running on emerging platforms with results obtained on traditional platforms, identify hardware and algorithmic performance bottlenecks that affect the usability of these platforms, and describe avenues for improving both the hardware and applications in pursuit of the needs of molecular modeling tasks on mobile devices and future exascale computers.
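    The energy-efficiency comparison described above amounts to integrating sampled power over a kernel's execution window and relating the resulting energy to the work completed. A hedged sketch of that bookkeeping follows; the sampling rate, power levels, and work figure are invented and are not the paper's instrumentation or data.

        import numpy as np

        def energy_joules(t_s, power_w, t_start, t_end):
            """Integrate sampled power (W) over a kernel's execution window to get energy (J)."""
            m = (t_s >= t_start) & (t_s <= t_end)
            return np.trapz(power_w[m], t_s[m])

        # Hypothetical trace: 1 kHz power samples around a kernel running from t = 0.10 s to t = 0.35 s.
        t = np.arange(0.0, 0.5, 0.001)
        p = np.where((t >= 0.10) & (t <= 0.35), 18.0, 6.0)   # watts (made-up numbers)
        e = energy_joules(t, p, 0.10, 0.35)
        work = 2.0e9                                          # e.g., interactions evaluated (made up)
        print(e, work / e)                                    # joules, and work per joule

    Comparing "work per joule" (or throughput per average watt) across platforms is what distinguishes an energy-efficient system from one that is merely fast.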
  421. 76 FR 7868 - Center for Scientific Review; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-11

    … Special Emphasis Panel, Small Business: Computational Biology, Image Processing and Data Mining. Date… for Scientific Review Special Emphasis Panel, Quick Trial on Imaging and Image-Guided Intervention…

  422. Computer Synthesis Approaches of Hyperboloid Gear Drives with Linear Contact

    NASA Astrophysics Data System (ADS)

    Abadjiev, Valentin; Kawasaki, Haruhisa

    2014-09-01

    Computer-aided design has produced different types of software for scientific research in the field of gearing theory, as well as adequate scientific support for gear drive manufacture. Computer programs based on mathematical models resulting from this research are presented here. Modern gear transmissions require the construction of new mathematical approaches to their geometric, technological and strength analysis. The process of optimization, synthesis and design is based on adequate iteration procedures to find an optimal solution by varying definite parameters. The study is dedicated to the accepted methodology in the creation of software for the synthesis of a class of high-reduction hyperboloid gears: Spiroid and Helicon ones (Spiroid and Helicon are trademarks registered by the Illinois Tool Works, Chicago, Ill). The developed basic computer products belong to software based on original mathematical models. They are based on the two mathematical models for the synthesis: "upon a pitch contact point" and "upon a mesh region". Computer programs are worked out on the basis of the described mathematical models, and the relations between them are shown. The application of the shown approaches to the synthesis of the discussed gear drives is illustrated.

  423. Application of Metamorphic Testing to Supervised Classifiers

    PubMed Central

    Xie, Xiaoyuan; Ho, Joshua; Kaiser, Gail; Xu, Baowen; Chen, Tsong Yueh

    2010-01-01

    Many applications in the field of scientific computing - such as computational biology, computational linguistics, and others - depend on Machine Learning algorithms to provide important core functionality to support solutions in the particular problem domains. However, it is difficult to test such applications because often there is no "test oracle" to indicate what the correct output should be for arbitrary input. To help address the quality of such software, in this paper we present a technique for testing the implementations of supervised machine learning classification algorithms on which such scientific computing software depends. Our technique is based on an approach called "metamorphic testing", which has been shown to be effective in such cases. More importantly, we demonstrate that our technique not only serves the purpose of verification, but also can be applied in validation. In addition to presenting our technique, we describe a case study we performed on a real-world machine learning application framework, and discuss how programmers implementing machine learning algorithms can avoid the common pitfalls discovered in our study. We also discuss how our findings can be of use to other areas outside scientific computing, as well. PMID:21243103
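    Metamorphic testing, as used in the preceding entry, checks that a known relation between transformed inputs and their outputs holds even when no oracle can say what the "correct" output is. The paper's specific metamorphic relations are not listed in the abstract; the sketch below illustrates the general idea with one relation that provably holds for a Euclidean-distance kNN classifier: consistently reordering the feature columns must not change its predictions.

        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier

        rng = np.random.default_rng(0)
        X_train = rng.normal(size=(100, 4))
        y_train = rng.integers(0, 2, size=100)
        X_test = rng.normal(size=(20, 4))

        perm = rng.permutation(X_train.shape[1])          # one consistent reordering of the features

        source = KNeighborsClassifier().fit(X_train, y_train).predict(X_test)
        follow_up = KNeighborsClassifier().fit(X_train[:, perm], y_train).predict(X_test[:, perm])

        # Metamorphic relation: Euclidean distances are unchanged by a consistent column
        # permutation, so the predictions must be identical; any mismatch signals a bug.
        assert (source == follow_up).all()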
  424. 78 FR 68462 - Center for Scientific Review; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-14

    … personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Brain Injury and… Methodologies Integrated Review Group; Biomedical Computing and Health Informatics Study Section. Date: December…

  425. Parallel, distributed and GPU computing technologies in single-particle electron microscopy

    PubMed Central

    Schmeisser, Martin; Heisen, Burkhard C.; Luettich, Mario; Busche, Boris; Hauer, Florian; Koske, Tobias; Knauber, Karl-Heinz; Stark, Holger

    2009-01-01

    Most known methods for the determination of the structure of macromolecular complexes are limited or at least restricted at some point by their computational demands. Recent developments in information technology such as multicore, parallel and GPU processing can be used to overcome these limitations. In particular, graphics processing units (GPUs), which were originally developed for rendering real-time effects in computer games, are now ubiquitous and provide unprecedented computational power for scientific applications. Each parallel-processing paradigm alone can improve overall performance; the increased computational performance obtained by combining all paradigms, unleashing the full power of today's technology, makes certain applications feasible that were previously virtually impossible. In this article, state-of-the-art paradigms are introduced, the tools and infrastructure needed to apply these paradigms are presented, and a state-of-the-art infrastructure and solution strategy for moving scientific applications to the next generation of computer hardware is outlined. PMID:19564686

  426. [Untitled record]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lingerfelt, Eric J; Endeve, Eirik; Hui, Yawei

    Improvements in scientific instrumentation allow imaging at mesoscopic to atomic length scales, many spectroscopic modes, and now--with the rise of multimodal acquisition systems and the associated processing capability--the era of multidimensional, informationally dense data sets has arrived. Technical issues in these combinatorial scientific fields are exacerbated by computational challenges best summarized as a necessity for drastic improvement in the capability to transfer, store, and analyze large volumes of data. The Bellerophon Environment for Analysis of Materials (BEAM) platform provides material scientists the capability to directly leverage the integrated computational and analytical power of High Performance Computing (HPC) to perform scalable data analysis and simulation and manage uploaded data files via an intuitive, cross-platform client user interface. This framework delivers authenticated, "push-button" execution of complex user workflows that deploy data analysis algorithms and computational simulations utilizing compute-and-data cloud infrastructures and HPC environments like Titan at the Oak Ridge Leadership Computing Facility (OLCF).

  427. Parallel, distributed and GPU computing technologies in single-particle electron microscopy.

    PubMed

    Schmeisser, Martin; Heisen, Burkhard C; Luettich, Mario; Busche, Boris; Hauer, Florian; Koske, Tobias; Knauber, Karl-Heinz; Stark, Holger

    2009-07-01

    Most known methods for the determination of the structure of macromolecular complexes are limited or at least restricted at some point by their computational demands. Recent developments in information technology such as multicore, parallel and GPU processing can be used to overcome these limitations. In particular, graphics processing units (GPUs), which were originally developed for rendering real-time effects in computer games, are now ubiquitous and provide unprecedented computational power for scientific applications. Each parallel-processing paradigm alone can improve overall performance; the increased computational performance obtained by combining all paradigms, unleashing the full power of today's technology, makes certain applications feasible that were previously virtually impossible. In this article, state-of-the-art paradigms are introduced, the tools and infrastructure needed to apply these paradigms are presented, and a state-of-the-art infrastructure and solution strategy for moving scientific applications to the next generation of computer hardware is outlined.

  428. Methods for structuring scientific knowledge from many areas related to aging research.

    PubMed

    Zhavoronkov, Alex; Cantor, Charles R

    2011-01-01

    Aging and age-related disease represent a substantial quantity of current natural, social and behavioral science research efforts. Presently, no centralized system exists for tracking aging research projects across numerous research disciplines. The multidisciplinary nature of this research complicates the understanding of underlying project categories, the establishment of project relations, and the development of a unified project classification scheme. We have developed a highly visual database, the International Aging Research Portfolio (IARP), available at AgingPortfolio.org, to address this issue. The database integrates information on research grants, peer-reviewed publications, and issued patent applications from multiple sources. Additionally, the database uses flexible project classification mechanisms and tools for analyzing project associations and trends. This system enables scientists to search the centralized project database, to classify and categorize aging projects, and to analyze the funding aspects across multiple research disciplines. The IARP is designed to provide improved allocation and prioritization of scarce research funding, to reduce project overlap and improve scientific collaboration, thereby accelerating scientific and medical progress in a rapidly growing area of research. Grant applications often precede publications and some grants do not result in publications; thus, this system provides utility to investigate an earlier and broader view of research activity in many research disciplines. This project is a first attempt to provide a centralized database system for research grants and to categorize aging research projects into multiple subcategories utilizing both advanced machine algorithms and a hierarchical environment for scientific collaboration.

  429. [Organization of scientific-methodological work in Central Military Clinical Hospital named after A.A. Vishnevskiĭ].

    PubMed

    Beliakin, S A; Fokin, Iu N; Kokhan, E P; Frolkin, M N

    2009-09-01

    A wide experience in the organization and management of scientific work has been accumulated at the 3rd Central Military Clinical Hospital named after A.A. Vishnevsky over a term of more than 40 years. This experience is generalized and analyzed in order to determine priority directions for improvement. Scientific-methodological work in the hospital is treated as a complex of organizationally planned and purpose-coordinated measures that reinforce the scientific schools of the hospital as a basis for the effective delivery of specialized medical aid. The vector of scientific research is directed, in general, toward solving questions of military and field medicine.

  430. Sign use and cognition in automated scientific discovery: are computers only special kinds of signs?

    NASA Astrophysics Data System (ADS)

    Giza, Piotr

    2018-04-01

    James Fetzer criticizes the computational paradigm, prevailing in cognitive science, by questioning what he takes to be its most elementary ingredient: that cognition is computation across representations. He argues that if cognition is taken to be a purposive, meaningful, algorithmic problem solving activity, then computers are incapable of cognition. Instead, they appear to be signs of a special kind that can facilitate computation. He proposes the conception of minds as semiotic systems as an alternative paradigm for understanding mental phenomena, one that seems to overcome the difficulties of computationalism. Now, I argue that with computer systems dealing with scientific discovery, the matter is not so simple as that. The alleged superiority of humans using signs to stand for something other over computers being merely "physical symbol systems" or "automatic formal systems" is only easy to establish in everyday life, but becomes far from obvious when scientific discovery is at stake. In science, as opposed to everyday life, the meaning of symbols is, apart from very low-level experimental investigations, defined implicitly by the way the symbols are used in explanatory theories or experimental laws relevant to the field, and in consequence, human and machine discoverers are much more on a par. Moreover, the great practical success of the genetic programming method and recent attempts to apply it to automatic generation of cognitive theories seem to show that computer systems are capable of very efficient problem solving activity in science, which is neither purposive nor meaningful, nor algorithmic. This, I think, undermines Fetzer's argument that computer systems are incapable of cognition because computation across representations is bound to be a purposive, meaningful, algorithmic problem solving activity.

  431. Ames Research Center publications: A continuing bibliography, 1980

    NASA Technical Reports Server (NTRS)

    1981-01-01

    This bibliography lists formal NASA publications, journal articles, books, chapters of books, patents, contractor reports, and computer programs that were issued by Ames Research Center and indexed by Scientific and Technical Aerospace Reports, Limited Scientific and Technical Aerospace Reports, International Aerospace Abstracts, and Computer Program Abstracts in 1980. Citations are arranged by directorate, type of publication, and NASA accession numbers. Subject, personal author, corporate source, contract number, and report/accession number indexes are provided.

  432. [Text mining, a method for computer-assisted analysis of scientific texts, demonstrated by an analysis of author networks].

    PubMed

    Hahn, P; Dullweber, F; Unglaub, F; Spies, C K

    2014-06-01

    Searching for relevant publications is becoming more difficult with the increasing number of scientific articles. Text mining, as a specific form of computer-based data analysis, may be helpful in this context. Highlighting relations between authors and finding relevant publications concerning a specific subject using text analysis programs are illustrated graphically by two worked examples. © Georg Thieme Verlag KG Stuttgart · New York.
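    Author-network analyses like the one described in the preceding entry are typically built on a co-authorship graph in which edge weights count joint publications. The minimal illustration below uses three invented toy records, not the data of the cited study.

        import itertools
        import networkx as nx

        # Toy records: each entry is the author list of one publication.
        papers = [
            ["Author A", "Author B", "Author C"],
            ["Author A", "Author C"],
            ["Author B", "Author D"],
        ]

        G = nx.Graph()
        for authors in papers:
            for a, b in itertools.combinations(authors, 2):
                weight = G.get_edge_data(a, b, default={"weight": 0})["weight"]
                G.add_edge(a, b, weight=weight + 1)   # edge weight = number of joint papers

        # Rank authors by how strongly they are embedded in the co-authorship network.
        print(sorted(G.degree(weight="weight"), key=lambda nd: -nd[1]))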
The development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application to interdisciplinary areas of scientific investigation.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Saffer, Shelley</p> <p>2014-12-01</p> <p>This is a final report of the DOE award DE-SC0001132, Advanced Artificial Science. The development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application to interdisciplinary areas of scientific investigation. This document describes the achievements of the goals, and resulting research made possible by this award.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://eric.ed.gov/?q=computer+AND+games&pg=2&id=EJ1074441','ERIC'); return false;" href="https://eric.ed.gov/?q=computer+AND+games&pg=2&id=EJ1074441"><span>Scientific Inquiry Self-Efficacy and Computer Game Self-Efficacy as Predictors and Outcomes of Middle School Boys' and Girls' Performance in a Science Assessment in a Virtual Environment</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Bergey, Bradley W.; Ketelhut, Diane Jass; Liang, Senfeng; Natarajan, Uma; Karakus, Melissa</p> <p>2015-01-01</p> <p>The primary aim of the study was to examine whether performance on a science assessment in an immersive virtual environment was associated with changes in scientific inquiry self-efficacy. A secondary aim of the study was to examine whether performance on the science assessment was equitable for students with different levels of computer game…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://eric.ed.gov/?q=reasoning+AND+psychology&pg=5&id=EJ967365','ERIC'); return false;" href="https://eric.ed.gov/?q=reasoning+AND+psychology&pg=5&id=EJ967365"><span>Operation ARA: A Computerized Learning Game that Teaches Critical Thinking and Scientific Reasoning</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Halpern, Diane F.; Millis, Keith; Graesser, Arthur C.; Butler, Heather; Forsyth, Carol; Cai, Zhiqiang</p> <p>2012-01-01</p> <p>Operation ARA (Acquiring Research Acumen) is a computerized learning game that teaches critical thinking and scientific reasoning. It is a valuable learning tool that utilizes principles from the science of learning and serious computer games. Students learn the skills of scientific reasoning by engaging in interactive dialogs with avatars. 
They…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://eric.ed.gov/?q=formative+AND+assessment+AND+students&id=EJ1173768','ERIC'); return false;" href="https://eric.ed.gov/?q=formative+AND+assessment+AND+students&id=EJ1173768"><span>Validation of Automated Scoring for a Formative Assessment That Employs Scientific Argumentation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Mao, Liyang; Liu, Ou Lydia; Roohr, Katrina; Belur, Vinetha; Mulholland, Matthew; Lee, Hee-Sun; Pallant, Amy</p> <p>2018-01-01</p> <p>Scientific argumentation is one of the core practices for teachers to implement in science classrooms. We developed a computer-based formative assessment to support students' construction and revision of scientific arguments. The assessment is built upon automated scoring of students' arguments and provides feedback to students and teachers.…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://eric.ed.gov/?q=Kim+AND+Young-Duk&id=EJ656427','ERIC'); return false;" href="https://eric.ed.gov/?q=Kim+AND+Young-Duk&id=EJ656427"><span>Effects of Students' Prior Knowledge on Scientific Reasoning in Density.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Yang, Il-Ho; Kwon, Yong-Ju; Kim, Young-Shin; Jang, Myoung-Duk; Jeong, Jin-Woo; Park, Kuk-Tae</p> <p>2002-01-01</p> <p>Investigates the effects of students' prior knowledge on the scientific reasoning processes of performing the task of controlling variables with computer simulation and identifies a number of problems that students encounter in scientific discovery. Involves (n=27) 5th grade students and (n=33) 7th grade students. Indicates that students' prior…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA115382','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA115382"><span>Soviet Computers and Cybernetics: Shortcomings and Military Applications.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>1980-06-01</p> <p>FOOTNOTES.......................................24 BIBLIOGRAPHY......................................28 INTRODUCTION Military scientific technological...exploration which have alarmed some Western analysts. America’s scientific and technological advantages are integral elements in the delicate world balance...inferior quantity only up to a point, where superior numbers take over. 
A major element in the military scientific technological competition between</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014IJMPC..2530001P','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014IJMPC..2530001P"><span>Technologies for Large Data Management in Scientific Computing</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Pace, Alberto</p> <p>2014-01-01</p> <p>In recent years, intense usage of computing has been the main strategy of investigations in several scientific research projects. The progress in computing technology has opened unprecedented opportunities for systematic collection of experimental data and the associated analysis that were considered impossible only few years ago. This paper focuses on the strategies in use: it reviews the various components that are necessary for an effective solution that ensures the storage, the long term preservation, and the worldwide distribution of large quantities of data that are necessary in a large scientific research project. The paper also mentions several examples of data management solutions used in High Energy Physics for the CERN Large Hadron Collider (LHC) experiments in Geneva, Switzerland which generate more than 30,000 terabytes of data every year that need to be preserved, analyzed, and made available to a community of several tenth of thousands scientists worldwide.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://images.nasa.gov/#/details-GSFC_20171208_Archive_e001293.html','SCIGOVIMAGE-NASA'); return false;" href="https://images.nasa.gov/#/details-GSFC_20171208_Archive_e001293.html"><span>Winter Storm Continues Across Central U.S.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://images.nasa.gov/">NASA Image and Video Library</a></p> <p></p> <p>2013-12-06</p> <p>The powerful winter storm that has been affecting much of the central and western U.S. continues to intensify as it moves into Canada. Snow is tapering off across the Upper Midwest, but heavy snow is possible on Thursday from the Ohio Valley to the mid-Mississippi Valley, with heavy rain possible from the central Appalachians to the lower Mississippi Valley. Freezing rain is possible from Texas to the Ohio Valley. This image was taken by GOES East at 1745Z on December 5, 2013. Credit: NOAA/NASA GOES Project Caption: NOAA NASA image use policy. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission. 
  441. Rethinking the Role of Information Technology-Based Research Tools in Students' Development of Scientific Literacy

    NASA Astrophysics Data System (ADS)

    van Eijck, Michiel; Roth, Wolff-Michael

    2007-06-01

    Given the central place IT-based research tools take in scientific research, the marginal role such tools currently play in science curricula is dissatisfying from the perspective of making students scientifically literate. To appropriately frame the role of IT-based research tools in science curricula, we propose a framework that is developed to understand the use of tools in human activity, namely cultural-historical activity theory (CHAT). Accordingly, IT-based research tools constitute central moments of scientific research activity and neither can be seen apart from its objectives, nor can it be considered apart from the cultural-historical determined forms of activity (praxis) in which human subjects participate. Based on empirical data involving students participating in research activity, we point out how an appropriate account of IT-based research tools involves subjects' use of tools with respect to the objectives of research activity and the contribution to the praxis of research. We propose to reconceptualize the role of IT-based research tools as contributing to scientific literacy if students apply these tools with respect to the objectives of the research activity and contribute to praxis of research by evaluating and modifying the application of these tools.
    We conclude this paper by sketching the educational implications of this reconceptualized role of IT-based research tools.

  442. Data Intensive Scientific Workflows on a Federated Cloud: CRADA Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garzoglio, Gabriele

    The Fermilab Scientific Computing Division and the KISTI Global Science Experimental Data Hub Center have built a prototypical large-scale infrastructure to handle scientific workflows of stakeholders to run on multiple cloud resources. The demonstrations have been in the areas of (a) Data-Intensive Scientific Workflows on Federated Clouds, (b) Interoperability and Federation of Cloud Resources, and (c) Virtual Infrastructure Automation to enable On-Demand Services.

  443. Percolation Centrality: Quantifying Graph-Theoretic Impact of Nodes during Percolation in Networks

    PubMed Central

    Piraveenan, Mahendra; Prokopenko, Mikhail; Hossain, Liaquat

    2013-01-01

    A number of centrality measures are available to determine the relative importance of a node in a complex network, and betweenness is prominent among them. However, the existing centrality measures are not adequate in network percolation scenarios (such as during infection transmission in a social network of individuals, spreading of computer viruses on computer networks, or transmission of disease over a network of towns) because they do not account for the changing percolation states of individual nodes. We propose a new measure, percolation centrality, that quantifies relative impact of nodes based on their topological connectivity, as well as their percolation states. The measure can be extended to include random walk based definitions, and its computational complexity is shown to be of the same order as that of betweenness centrality. We demonstrate the usage of percolation centrality by applying it to a canonical network as well as simulated and real world scale-free and random networks. PMID:23349699
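Because the record above defines percolation centrality only in words, a small hedged illustration may help. The sketch below implements what is commonly given as the measure's definition: a betweenness-style sum of shortest-path fractions in which each source node is weighted by its percolation state. The toy graph, the states, and the function name are invented for the example; this is not the authors' reference code.

```python
# Minimal brute-force sketch of percolation centrality (illustrative only).
# Assumed definition: for each node v,
#   PC(v) = 1/(N-2) * sum over s != v != r of (sigma_sr(v)/sigma_sr) * x_s / (sum_i x_i - x_v)
# where x_s in [0, 1] is the percolation state of source node s.
import itertools
import networkx as nx

def percolation_centrality(G, states):
    """Brute-force percolation centrality for a small graph.

    G      : networkx graph
    states : dict mapping node -> percolation state in [0, 1]
    """
    n = G.number_of_nodes()
    total_state = sum(states.values())
    pc = dict.fromkeys(G, 0.0)
    for s, r in itertools.permutations(G, 2):
        if not nx.has_path(G, s, r):
            continue
        paths = list(nx.all_shortest_paths(G, s, r))
        sigma_sr = len(paths)
        for v in G:
            if v in (s, r):
                continue
            sigma_sr_v = sum(1 for p in paths if v in p[1:-1])
            weight = states[s] / (total_state - states[v])
            pc[v] += (sigma_sr_v / sigma_sr) * weight
    return {v: val / (n - 2) for v, val in pc.items()}

if __name__ == "__main__":
    # Toy example: a path graph where one end node is fully "percolated" (infected).
    G = nx.path_graph(5)
    states = {0: 1.0, 1: 0.2, 2: 0.0, 3: 0.0, 4: 0.0}
    print(percolation_centrality(G, states))
```

The brute-force enumeration of shortest paths is only suitable for small graphs; the record notes that the measure can be accumulated with the same asymptotic cost as betweenness centrality.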
  444. Approaches to Classroom-Based Computational Science.

    ERIC Educational Resources Information Center

    Guzdial, Mark

    Computational science includes the use of computer-based modeling and simulation to define and test theories about scientific phenomena. The challenge for educators is to develop techniques for implementing computational science in the classroom. This paper reviews some previous work on the use of simulation alone (without modeling), modeling…

  445. Frames of scientific evidence: How journalists represent the (un)certainty of molecular medicine in science television programs.

    PubMed

    Ruhrmann, Georg; Guenther, Lars; Kessler, Sabrina Heike; Milde, Jutta

    2015-08-01

    For laypeople, media coverage of science on television is a gateway to scientific issues. Defining scientific evidence is central to the field of science, but there are still questions if news coverage of science represents scientific research findings as certain or uncertain. The framing approach is a suitable framework to classify different media representations; it is applied here to investigate the frames of scientific evidence in film clips (n=207) taken from science television programs. Molecular medicine is the domain of interest for this analysis, due to its high proportion of uncertain and conflicting research findings and risks. The results indicate that television clips vary in their coverage of scientific evidence of molecular medicine. Four frames were found: Scientific Uncertainty and Controversy, Scientifically Certain Data, Everyday Medical Risks, and Conflicting Scientific Evidence. They differ in their way of framing scientific evidence and risks of molecular medicine. © The Author(s) 2013.

  446. Physics thematic paths: laboratorial activities and historical scientific instruments

    NASA Astrophysics Data System (ADS)

    Pantano, O.; Talas, S.

    2010-03-01

    The Physics Department of Padua University keeps an important collection of historical physics instruments which alludes to the fruitful scientific activity of Padua through the centuries. This heritage led to the suggestion of setting up laboratory activities connected to the Museum collection for secondary school students. This article shows how different thematic paths have been developed, reflecting on the importance of historical perspectives in science teaching. We also show how a scientific historical museum can play a central role in improving the learning of physics concepts.

  447. USGS Scientific Visualization Laboratory

    USGS Publications Warehouse

    1995-01-01

    The U.S.
    Geological Survey's (USGS) Scientific Visualization Laboratory at the National Center in Reston, Va., provides a central facility where USGS employees can use state-of-the-art equipment for projects ranging from presentation graphics preparation to complex visual representations of scientific data. Equipment including color printers, black-and-white and color scanners, film recorders, video equipment, and DOS, Apple Macintosh, and UNIX platforms with software are available for both technical and nontechnical users. The laboratory staff provides assistance and demonstrations in the use of the hardware and software products.

  448. 76 FR 24036 - Center for Scientific Review; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-29

    ... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel, Brain Disorders... Integrated Review Group, Biomedical Computing and Health Informatics Study Section. Date: June 7-8, 2011...

  449. The Space Telescope SI C&DH system. [Scientific Instrument Control and Data Handling Subsystem]

    NASA Technical Reports Server (NTRS)

    Gadwal, Govind R.; Barasch, Ronald S.

    1990-01-01

    The Hubble Space Telescope Scientific Instrument Control and Data Handling Subsystem (SI C&DH) is designed to interface with five scientific instruments of the Space Telescope to provide ground and autonomous control and collect health and status information using the Standard Telemetry and Command Components (STACC) multiplex data bus. It also formats high throughput science data into packets. The packetized data is interleaved and Reed-Solomon encoded for error correction and Pseudo Random encoded. An inner convolutional coding with the outer Reed-Solomon coding provides excellent error correction capability. The subsystem is designed with the capacity for orbital replacement in order to meet a mission life of fifteen years. The spacecraft computer and the SI C&DH computer coordinate the activities of the spacecraft and the scientific instruments to achieve the mission objectives.
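The SI C&DH record above pairs interleaving with Reed-Solomon coding. As a rough illustration of why the two go together, the sketch below shows plain block interleaving: codewords are written as rows and transmitted column by column, so a burst of channel errors is spread thinly across many codewords, each of which an outer Reed-Solomon code can then repair. The block sizes, data, and function names are invented for the example; this is not the flight system's algorithm.

```python
# Illustrative block interleaver (not the SI C&DH flight implementation).
# Rows are codewords; transmitting column-by-column spreads a burst error
# over many codewords, keeping the per-codeword error count within what an
# outer Reed-Solomon code can correct.

def interleave(codewords):
    """Write fixed-length codewords as rows, read them out by columns."""
    length = len(codewords[0])
    assert all(len(cw) == length for cw in codewords)
    return [cw[i] for i in range(length) for cw in codewords]

def deinterleave(stream, n_codewords):
    """Invert interleave(): rebuild the original row-major codewords."""
    length = len(stream) // n_codewords
    return [[stream[i * n_codewords + row] for i in range(length)]
            for row in range(n_codewords)]

if __name__ == "__main__":
    codewords = [[f"cw{r}-{i}" for i in range(6)] for r in range(4)]
    sent = interleave(codewords)
    # A burst hitting 4 consecutive transmitted symbols touches each codeword once.
    print(sent[8:12])
    assert deinterleave(sent, len(codewords)) == codewords
```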
  450. Cultural and Technological Issues and Solutions for Geodynamics Software Citation

    NASA Astrophysics Data System (ADS)

    Heien, E. M.; Hwang, L.; Fish, A. E.; Smith, M.; Dumit, J.; Kellogg, L. H.

    2014-12-01

    Computational software and custom-written codes play a key role in scientific research and teaching, providing tools to perform data analysis and forward modeling through numerical computation. However, development of these codes is often hampered by the fact that there is no well-defined way for the authors to receive credit or professional recognition for their work through the standard methods of scientific publication and subsequent citation of the work. This in turn may discourage researchers from publishing their codes or making them easier for other scientists to use. We investigate the issues involved in citing software in a scientific context, and introduce features that should be components of a citation infrastructure, particularly oriented towards the codes and scientific culture in the area of geodynamics research. The codes used in geodynamics are primarily specialized numerical modeling codes for continuum mechanics problems; they may be developed by individual researchers, teams of researchers, geophysicists in collaboration with computational scientists and applied mathematicians, or by coordinated community efforts such as the Computational Infrastructure for Geodynamics. Some but not all geodynamics codes are open-source. These characteristics are common to many areas of geophysical software development and use. We provide background on the problem of software citation and discuss some of the barriers preventing adoption of such citations, including social/cultural barriers, insufficient technological support infrastructure, and an overall lack of agreement about what a software citation should consist of. We suggest solutions in an initial effort to create a system to support citation of software and promotion of scientific software development.

  451.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bailey, David H.

    The NAS Parallel Benchmarks (NPB) are a suite of parallel computer performance benchmarks. They were originally developed at the NASA Ames Research Center in 1991 to assess high-end parallel supercomputers. Although they are no longer used as widely as they once were for comparing high-end system performance, they continue to be studied and analyzed a great deal in the high-performance computing community. The acronym 'NAS' originally stood for the Numerical Aeronautical Simulation Program at NASA Ames. The name of this organization was subsequently changed to the Numerical Aerospace Simulation Program, and more recently to the NASA Advanced Supercomputing Center, although the acronym remains 'NAS.' The developers of the original NPB suite were David H. Bailey, Eric Barszcz, John Barton, David Browning, Russell Carter, Leo Dagum, Rod Fatoohi, Samuel Fineberg, Paul Frederickson, Thomas Lasinski, Rob Schreiber, Horst Simon, V. Venkatakrishnan and Sisira Weeratunga. The original NAS Parallel Benchmarks consisted of eight individual benchmark problems, each of which focused on some aspect of scientific computing.
    The principal focus was in computational aerophysics, although most of these benchmarks have much broader relevance, since in a much larger sense they are typical of many real-world scientific computing applications. The NPB suite grew out of the need for a more rational procedure to select new supercomputers for acquisition by NASA. The emergence of commercially available highly parallel computer systems in the late 1980s offered an attractive alternative to parallel vector supercomputers that had been the mainstay of high-end scientific computing. However, the introduction of highly parallel systems was accompanied by a regrettable level of hype, not only on the part of the commercial vendors but even, in some cases, by scientists using the systems. As a result, it was difficult to discern whether the new systems offered any fundamental performance advantage over vector supercomputers, and, if so, which of the parallel offerings would be most useful in real-world scientific computation. In part to draw attention to some of the performance reporting abuses prevalent at the time, the present author wrote a humorous essay 'Twelve Ways to Fool the Masses,' which described in a light-hearted way a number of the questionable ways in which both vendor marketing people and scientists were inflating and distorting their performance results. All of this underscored the need for an objective and scientifically defensible measure to compare performance on these systems.

  452. NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 32: A new era in international technical communication: American-Russian collaboration

    NASA Technical Reports Server (NTRS)

    Flammia, Madelyn; Barclay, Rebecca O.; Pinelli, Thomas E.; Keene, Michael L.; Burger, Robert H.; Kennedy, John M.

    1993-01-01

    Until the recent dissolution of the Soviet Union, the Communist Party exerted a strict control of access to and dissemination of scientific and technical information (STI). This article presents models of the Soviet-style information society and the Western-style information society and discusses the effects of centralized governmental control of information on Russian technical communication practices. The effects of political control on technical communication are then used to interpret the results of a survey of Russian and U.S. aerospace engineers and scientists concerning the time devoted to technical communication, their collaborative writing practices and their attitudes toward collaboration, the kinds of technical documents they produce and use, their views regarding the appropriate content for an undergraduate technical communication course, and their use of computer technology. Finally, the implications of these findings for future collaboration between Russian and U.S.
    engineers and scientists are examined.

  453. NASA/DoD Aerospace Knowledge Diffusion Research Project. XXXII - A new era in international technical communication: American-Russian collaboration

    NASA Technical Reports Server (NTRS)

    Flammia, Madelyn; Barclay, Rebecca O.; Pinelli, Thomas E.; Keene, Michael L.; Burger, Robert H.; Kennedy, John M.

    1993-01-01

    Until the recent dissolution of the Soviet Union, the Communist Party exerted a strict control of access to and dissemination of scientific and technical information. This article presents models of the Soviet-style information society and the Western-style information society and discusses the effects of centralized governmental control of information on Russian technical communication practices. The effects of political control on technical communication are then used to interpret the results of a survey of Russian and U.S. aerospace engineers and scientists concerning the time devoted to technical communication, their collaborative writing practices and their attitudes toward collaboration, the kinds of technical documents they produce and use, their views regarding the appropriate content for an undergraduate technical communication course, and their use of computer technology. Finally, the implications of these findings for future collaboration between Russian and U.S. engineers and scientists are examined.

  454. Seeking Synthesis: The Integrative Problem in Understanding Language and Its Evolution.

    PubMed

    Dale, Rick; Kello, Christopher T; Schoenemann, P Thomas

    2016-04-01

    We discuss two problems for a general scientific understanding of language, sequences and synergies: how language is an intricately sequenced behavior and how language is manifested as a multidimensionally structured behavior. Though both are central in our understanding, we observe that the former tends to be studied more than the latter. We consider very general conditions that hold in human brain evolution and its computational implications, and identify multimodal and multiscale organization as two key characteristics of emerging cognitive function in our species. This suggests that human brains, and cognitive function specifically, became more adept at integrating diverse information sources and operating at multiple levels for linguistic performance. We argue that framing language evolution, learning, and use in terms of synergies suggests new research questions, and it may be a fruitful direction for new developments in theory and modeling of language as an integrated system.
    Copyright © 2016 Cognitive Science Society, Inc.

  455. A Multi-Discipline, Multi-Genre Digital Library for Research and Education

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Maly, Kurt; Shen, Stewart N. T.

    2004-01-01

    We describe NCSTRL+, a unified, canonical digital library for educational and scientific and technical information (STI). NCSTRL+ is based on the Networked Computer Science Technical Report Library (NCSTRL), a World Wide Web (WWW) accessible digital library (DL) that provides access to over 100 university departments and laboratories. NCSTRL+ implements two new technologies: cluster functionality and publishing "buckets". We have extended the Dienst protocol, the protocol underlying NCSTRL, to provide the ability to "cluster" independent collections into a logically centralized digital library based upon subject category classification, type of organization, and genres of material. The concept of "buckets" provides a mechanism for publishing and managing logically linked entities with multiple data formats. The NCSTRL+ prototype DL contains the holdings of NCSTRL and the NASA Technical Report Server (NTRS). The prototype demonstrates the feasibility of publishing into a multi-cluster DL, searching across clusters, and storing and presenting buckets of information.
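The "bucket" idea in the record above, an aggregate object that keeps every rendition of a report together behind one identifier, can be sketched as a small data structure. The class below is purely illustrative: the field and method names are invented here and are not the Dienst protocol's or NCSTRL+'s actual API.

```python
# Illustrative sketch of a publishing "bucket": one logical report that
# aggregates several data formats behind a single identifier.
# Field and method names are invented for this example, not the NCSTRL+/Dienst API.
from dataclasses import dataclass, field

@dataclass
class Bucket:
    identifier: str                      # e.g. a report number
    metadata: dict                       # title, authors, subject category, genre
    renditions: dict = field(default_factory=dict)  # format name -> location

    def add_rendition(self, fmt, location):
        """Attach another format (PDF, PostScript, scanned TIFF, ...) to the bucket."""
        self.renditions[fmt] = location

    def get(self, fmt):
        """Return the location of one rendition, if the bucket holds it."""
        return self.renditions.get(fmt)

if __name__ == "__main__":
    b = Bucket("NASA-TM-1998-0001",
               {"title": "Example report", "genre": "technical memorandum"})
    b.add_rendition("pdf", "reports/tm-1998-0001.pdf")
    b.add_rendition("ps", "reports/tm-1998-0001.ps")
    print(b.get("pdf"), sorted(b.renditions))
```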
  456. NCSTRL+: Adding Multi-Discipline and Multi-Genre Support to the Dienst Protocol Using Clusters and Buckets

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Maly, Kurt; Shen, Stewart N. T.; Zubair, Mohammad

    1998-01-01

    We describe NCSTRL+, a unified, canonical digital library for scientific and technical information (STI). NCSTRL+ is based on the Networked Computer Science Technical Report Library (NCSTRL), a World Wide Web (WWW) accessible digital library (DL) that provides access to over 100 university departments and laboratories. NCSTRL+ implements two new technologies: cluster functionality and publishing buckets. We have extended Dienst, the protocol underlying NCSTRL, to provide the ability to cluster independent collections into a logically centralized digital library based upon subject category classification, type of organization, and genres of material. The bucket construct provides a mechanism for publishing and managing logically linked entities with multiple data forms as a single object. The NCSTRL+ prototype DL contains the holdings of NCSTRL and the NASA Technical Report Server (NTRS). The prototype demonstrates the feasibility of publishing into a multi-cluster DL, searching across clusters, and storing and presenting buckets of information.

  457. The Evolution of Your Success Lies at the Centre of Your Co-Authorship Network

    PubMed Central

    Servia-Rodríguez, Sandra; Noulas, Anastasios; Mascolo, Cecilia; Fernández-Vilas, Ana; Díaz-Redondo, Rebeca P.

    2015-01-01

    Collaboration among scholars and institutions is progressively becoming essential to the success of research grant procurement and to allow the emergence and evolution of scientific disciplines. Our work focuses on analysing if the volume of collaborations of one author together with the relevance of his collaborators is somewhat related to his research performance over time. In order to prove this relation we collected the temporal distributions of scholars' publications and citations from the Google Scholar platform and the co-authorship network (of Computer Scientists) underlying the well-known DBLP bibliographic database. By the application of time series clustering, social network analysis and non-parametric statistics, we observe that scholars with similar publications (citations) patterns also tend to have a similar centrality in the co-authorship network. To our knowledge, this is the first work that considers success evolution with respect to co-authorship. PMID:25760732
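A much-simplified sketch of the kind of analysis the co-authorship record above describes is given below: build a co-authorship graph, compute a standard centrality for each author, and test the association with a crude productivity count using a non-parametric correlation. The toy data, the choice of betweenness centrality, and the use of a single citation count in place of the paper's time-series clustering are all simplifications made for the example.

```python
# Much-simplified sketch of relating co-authorship centrality to research output.
# Toy data and the specific measures are illustrative choices, not the paper's pipeline.
import networkx as nx
from scipy.stats import spearmanr

# Each paper is a list of authors; co-authors on a paper get pairwise edges.
papers = [["ana", "bo"], ["ana", "bo", "chen"], ["chen", "dee"],
          ["ana", "dee"], ["bo", "eve"], ["ana", "eve"]]
citations = {"ana": 120, "bo": 60, "chen": 45, "dee": 20, "eve": 10}

G = nx.Graph()
for authors in papers:
    for i, a in enumerate(authors):
        for b in authors[i + 1:]:
            G.add_edge(a, b)

centrality = nx.betweenness_centrality(G)
names = sorted(G)
rho, p = spearmanr([centrality[a] for a in names],
                   [citations[a] for a in names])
print(f"Spearman rho={rho:.2f}, p={p:.3f}")
```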
  458. Detection strategies for extreme mass ratio inspirals

    NASA Astrophysics Data System (ADS)

    Cornish, Neil J.

    2011-05-01

    The capture of compact stellar remnants by galactic black holes provides a unique laboratory for exploring the near-horizon geometry of the Kerr spacetime, or possible departures from general relativity if the central cores prove not to be black holes. The gravitational radiation produced by these extreme mass ratio inspirals (EMRIs) encodes a detailed map of the black hole geometry, and the detection and characterization of these signals is a major scientific goal for the LISA mission. The waveforms produced are very complex, and the signals need to be coherently tracked for tens of thousands of cycles to produce a detection, making EMRI signals one of the most challenging data analysis problems in all of gravitational wave astronomy. Estimates for the number of templates required to perform an exhaustive grid-based matched-filter search for these signals are astronomically large, and far out of reach of current computational resources. Here I describe an alternative approach that employs a hybrid between genetic algorithms and Markov chain Monte Carlo techniques, along with several time-saving techniques for computing the likelihood function. This approach has proven effective at the blind extraction of relatively weak EMRI signals from simulated LISA data sets.
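The EMRI record above contrasts an unaffordable template grid with a stochastic search. The sketch below shows only the bare idea on a toy one-parameter problem: a Metropolis walk over a template parameter that mixes occasional global jumps with small refining steps, accepted according to a Gaussian likelihood of the residual. The signal model, noise level, and tuning numbers are invented for illustration; this is not the hybrid genetic/MCMC pipeline of the record.

```python
# Toy stochastic template search for a signal of unknown frequency.
# Illustrates replacing an exhaustive template grid with a Metropolis-style
# sampler that mixes large exploratory jumps with small refining steps.
# The signal model, noise level, and tuning numbers are invented for this sketch.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 2000)
f_true, sigma = 3.7, 0.5
data = np.sin(2 * np.pi * f_true * t) + rng.normal(0.0, sigma, t.size)

def log_likelihood(f):
    """Gaussian log-likelihood of the data given a sinusoidal template."""
    residual = data - np.sin(2 * np.pi * f * t)
    return -0.5 * np.sum(residual ** 2) / sigma ** 2

f, logl = 1.0, log_likelihood(1.0)
best = (f, logl)
for _ in range(20000):
    if rng.uniform() < 0.1:                 # occasional global jump
        f_new = rng.uniform(0.0, 10.0)
    else:                                   # local random-walk refinement
        f_new = f + rng.normal(0.0, 0.02)
    logl_new = log_likelihood(f_new)
    if np.log(rng.uniform()) < logl_new - logl:   # Metropolis acceptance
        f, logl = f_new, logl_new
        if logl > best[1]:
            best = (f, logl)

print(f"best template frequency {best[0]:.3f} (true value {f_true})")
```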
  459. Performance Assessment Institute-NV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lombardo, Joesph

    2012-12-31

    The National Supercomputing Center for Energy and the Environment's intention is to purchase a multi-purpose computer cluster in support of the Performance Assessment Institute (PA Institute). The PA Institute will serve as a research consortium located in Las Vegas Nevada with membership that includes: national laboratories, universities, industry partners, and domestic and international governments. This center will provide a one-of-a-kind centralized facility for the accumulation of information for use by Institutions of Higher Learning, the U.S. Government, and Regulatory Agencies and approved users. This initiative will enhance and extend High Performance Computing (HPC) resources in Nevada to support critical national and international needs in "scientific confirmation". The PA Institute will be promoted as the leading Modeling, Learning and Research Center worldwide. The program proposes to utilize the existing supercomputing capabilities and alliances of the University of Nevada Las Vegas as a base, and to extend these resources and capabilities through a collaborative relationship with its membership. The PA Institute will provide an academic setting for interactive sharing, learning, mentoring and monitoring of multi-disciplinary performance assessment and performance confirmation information. The role of the PA Institute is to facilitate research, knowledge-increase, and knowledge-sharing among users.

  460. The Computational Infrastructure for Geodynamics as a Community of Practice

    NASA Astrophysics Data System (ADS)

    Hwang, L.; Kellogg, L. H.

    2016-12-01

    Computational Infrastructure for Geodynamics (CIG), geodynamics.org, originated in 2005 out of community recognition that the efforts of individual or small groups of researchers to develop scientifically-sound software is impossible to sustain, duplicates effort, and makes it difficult for scientists to adopt state-of-the-art computational methods that promote new discovery. As a community of practice, participants in CIG share an interest in computational modeling in geodynamics and work together on open source software to build the capacity to support complex, extensible, scalable, interoperable, reliable, and reusable software in an effort to increase the return on investment in scientific software development and increase the quality of the resulting software. The group interacts regularly to learn from each other and better their practices formally through webinar series, workshops, and tutorials and informally through listservs and hackathons. Over the past decade, we have learned that successful scientific software development requires at a minimum: collaboration between domain-expert researchers, software developers and computational scientists; clearly identified and committed lead developer(s); well-defined scientific and computational goals that are regularly evaluated and updated; well-defined benchmarks and testing throughout development; attention throughout development to usability and extensibility; understanding and evaluation of the complexity of dependent libraries; and managed user expectations through education, training, and support. CIG's code donation standards provide the basis for recently formalized best practices in software development (geodynamics.org/cig/dev/best-practices/). Best practices include use of version control; widely used, open source software libraries; extensive test suites; portable configuration and build systems; extensive documentation internal and external to the code; and structured, human readable input formats.

  461. Cloud Computing Applications in Support of Earth Science Activities at Marshall Space Flight Center

    NASA Astrophysics Data System (ADS)

    Molthan, A.; Limaye, A. S.

    2011-12-01

    Currently, the NASA Nebula Cloud Computing Platform is available to Agency personnel in a pre-release status as the system undergoes a formal operational readiness review. Over the past year, two projects within the Earth Science Office at NASA Marshall Space Flight Center have been investigating the performance and value of Nebula's "Infrastructure as a Service", or "IaaS" concept and applying cloud computing concepts to advance their respective mission goals.
    The Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique NASA satellite observations and weather forecasting capabilities for use within the operational forecasting community through partnerships with NOAA's National Weather Service (NWS). SPoRT has evaluated the performance of the Weather Research and Forecasting (WRF) model on virtual machines deployed within Nebula and used Nebula instances to simulate local forecasts in support of regional forecast studies of interest to select NWS forecast offices. In addition to weather forecasting applications, rapidly deployable Nebula virtual machines have supported the processing of high resolution NASA satellite imagery to support disaster assessment following the historic severe weather and tornado outbreak of April 27, 2011. Other modeling and satellite analysis activities are underway in support of NASA's SERVIR program, which integrates satellite observations, ground-based data and forecast models to monitor environmental change and improve disaster response in Central America, the Caribbean, Africa, and the Himalayas. Leveraging SPoRT's experience, SERVIR is working to establish a real-time weather forecasting model for Central America. Other modeling efforts include hydrologic forecasts for Kenya, driven by NASA satellite observations and reanalysis data sets provided by the broader meteorological community. Forecast modeling efforts are supplemented by short-term forecasts of convective initiation, determined by geostationary satellite observations processed on virtual machines powered by Nebula. This presentation will provide an overview of these activities from a scientific and cloud computing applications perspective, identifying the strengths and weaknesses of deploying each project within an IaaS environment, and ways to collaborate with the Nebula or other cloud-user communities on projects as they go forward.

  462. A midas plugin to enable construction of reproducible web-based image processing pipelines

    PubMed Central

    Grauer, Michael; Reynolds, Patrick; Hoogstoel, Marion; Budin, Francois; Styner, Martin A.; Oguz, Ipek

    2013-01-01

    Image processing is an important quantitative technique for neuroscience researchers, but difficult for those who lack experience in the field. In this paper we present a web-based platform that allows an expert to create a brain image processing pipeline, enabling execution of that pipeline even by those biomedical researchers with limited image processing knowledge. These tools are implemented as a plugin for Midas, an open-source toolkit for creating web based scientific data storage and processing platforms. Using this plugin, an image processing expert can construct a pipeline, create a web-based User Interface, manage jobs, and visualize intermediate results. Pipelines are executed on a grid computing platform using BatchMake and HTCondor. This represents a new capability for biomedical researchers and offers an innovative platform for scientific collaboration.
    Current tools work well, but can be inaccessible for those lacking image processing expertise. Using this plugin, researchers in collaboration with image processing experts can create workflows with reasonable default settings and streamlined user interfaces, and data can be processed easily from a lab environment without the need for a powerful desktop computer. This platform allows simplified troubleshooting, centralized maintenance, and easy data sharing with collaborators. These capabilities enable reproducible science by sharing datasets and processing pipelines between collaborators. In this paper, we present a description of this innovative Midas plugin, along with results obtained from building and executing several ITK based image processing workflows for diffusion weighted MRI (DW MRI) of rodent brain images, as well as recommendations for building automated image processing pipelines. Although the particular image processing pipelines developed were focused on rodent brain MRI, the presented plugin can be used to support any executable or script-based pipeline. PMID:24416016

  463. A midas plugin to enable construction of reproducible web-based image processing pipelines.

    PubMed

    Grauer, Michael; Reynolds, Patrick; Hoogstoel, Marion; Budin, Francois; Styner, Martin A; Oguz, Ipek

    2013-01-01

    Image processing is an important quantitative technique for neuroscience researchers, but difficult for those who lack experience in the field. In this paper we present a web-based platform that allows an expert to create a brain image processing pipeline, enabling execution of that pipeline even by those biomedical researchers with limited image processing knowledge. These tools are implemented as a plugin for Midas, an open-source toolkit for creating web based scientific data storage and processing platforms. Using this plugin, an image processing expert can construct a pipeline, create a web-based User Interface, manage jobs, and visualize intermediate results. Pipelines are executed on a grid computing platform using BatchMake and HTCondor. This represents a new capability for biomedical researchers and offers an innovative platform for scientific collaboration. Current tools work well, but can be inaccessible for those lacking image processing expertise. Using this plugin, researchers in collaboration with image processing experts can create workflows with reasonable default settings and streamlined user interfaces, and data can be processed easily from a lab environment without the need for a powerful desktop computer. This platform allows simplified troubleshooting, centralized maintenance, and easy data sharing with collaborators. These capabilities enable reproducible science by sharing datasets and processing pipelines between collaborators. In this paper, we present a description of this innovative Midas plugin, along with results obtained from building and executing several ITK based image processing workflows for diffusion weighted MRI (DW MRI) of rodent brain images, as well as recommendations for building automated image processing pipelines.
    Although the particular image processing pipelines developed were focused on rodent brain MRI, the presented plugin can be used to support any executable or script-based pipeline.

  464. GeoDataspaces: Simplifying Data Management Tasks with Globus

    NASA Astrophysics Data System (ADS)

    Malik, T.; Chard, K.; Tchoua, R. B.; Foster, I.

    2014-12-01

    Data and its management are central to modern scientific enterprise. Typically, geoscientists rely on observations and model output data from several disparate sources (file systems, RDBMS, spreadsheets, remote data sources). Integrated data management solutions that provide intuitive semantics and uniform interfaces, irrespective of the kind of data source, are, however, lacking. Consequently, geoscientists are left to conduct low-level and time-consuming data management tasks, individually, and repeatedly for discovering each data source, often resulting in errors in handling. In this talk we will describe how the EarthCube GeoDataspace project is improving this situation for seismologists, hydrologists, and space scientists by simplifying some of the existing data management tasks that arise when developing computational models. We will demonstrate a GeoDataspace, bootstrapped with "geounits", which are self-contained metadata packages that provide a complete description of all data elements associated with a model run, including input/output and parameter files, the model executable and any associated libraries. Geounits link raw and derived data as well as associating provenance information describing how data was derived. We will discuss challenges in establishing geounits and describe machine learning and human annotation approaches that can be used for extracting and associating ad hoc and unstructured scientific metadata hidden in binary formats with data resources and models. We will show how geounits can improve search and discoverability of data associated with model runs. To support this model, we will describe efforts towards creating a scalable metadata catalog that helps to maintain, search and discover geounits within the Globus network of accessible endpoints. This talk will focus on the issue of creating comprehensive personal inventories of data assets for computational geoscientists, and describe a publishing mechanism which can be used to feed into national, international, or thematic discovery portals.
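As a concrete, if hypothetical, picture of what a "geounit" from the record above might contain, the snippet below writes one out as a plain metadata document: inputs, parameters, the executable, outputs, and a provenance link from each derived file back to the activity and sources that produced it. The schema and every field name are invented for this sketch; they are not the EarthCube GeoDataspace format.

```python
# Illustrative "geounit": a self-contained metadata package describing one model run.
# The schema and field names are invented for this sketch, not the GeoDataspace format.
import json

geounit = {
    "geounit_id": "example-run-0001",
    "model": {"executable": "bin/wave_solver", "libraries": ["libfftw3", "libhdf5"]},
    "inputs": ["runs/0001/mesh.h5", "runs/0001/params.cfg"],
    "parameters": {"dt": 0.01, "n_steps": 5000},
    "outputs": ["runs/0001/wavefield.h5"],
    "provenance": [
        # derived file <- activity that produced it <- source files it used
        {"derived": "runs/0001/wavefield.h5",
         "activity": "bin/wave_solver --config runs/0001/params.cfg",
         "used": ["runs/0001/mesh.h5", "runs/0001/params.cfg"]},
    ],
}

print(json.dumps(geounit, indent=2))
```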
  465. MODIS Information, Data, and Control System (MIDACS) system specifications and conceptual design

    NASA Technical Reports Server (NTRS)

    Han, D.; Salomonson, V.; Ormsby, J.; Ardanuy, P.; Mckay, A.; Hoyt, D.; Jaffin, S.; Vallette, B.; Sharts, B.; Folta, D.

    1988-01-01

    The MODIS Information, Data, and Control System (MIDACS) Specifications and Conceptual Design Document discusses system level requirements, the overall operating environment in which requirements must be met, and a breakdown of MIDACS into component subsystems, which include the Instrument Support Terminal, the Instrument Control Center, the Team Member Computing Facility, the Central Data Handling Facility, and the Data Archive and Distribution System. The specifications include sizing estimates for the processing and storage capacities of each data system element, as well as traffic analyses of data flows between the elements internally, and also externally across the data system interfaces. The specifications for the data system, as well as for the individual planning and scheduling, control and monitoring, data acquisition and processing, calibration and validation, and data archive and distribution components, do not yet fully specify the data system in the complete manner needed to achieve the scientific objectives of the MODIS instruments and science teams. The teams have not yet been formed; however, it was possible to develop the specifications and conceptual design based on the present concept of EosDIS, the Level-1 and Level-2 Functional Requirements Documents, the Operations Concept, and through interviews and meetings with key members of the scientific community.

  466. A Blended Professional Development Program to Help a Teacher Learn to Provide One-to-One Scaffolding

    ERIC Educational Resources Information Center

    Belland, Brian R.; Burdo, Ryan; Gu, Jiangyue

    2015-01-01

    Argumentation is central to instruction centered on socio-scientific issues (Sadler & Donnelly in "International Journal of Science Education," 28(12), 1463-1488, 2006. doi: 10.1080/09500690600708717). Teachers can play a big role in helping students engage in argumentation and solve authentic scientific problems.
    To do so, they need…

  467. Using a Discussion about Scientific Controversy to Teach Central Concepts in Experimental Design

    ERIC Educational Resources Information Center

    Bennett, Kimberley Ann

    2015-01-01

    Students may need explicit training in informal statistical reasoning in order to design experiments or use formal statistical tests effectively. By using scientific scandals and media misinterpretation, we can explore the need for good experimental design in an informal way. This article describes the use of a paper that reviews the measles mumps…

  468. A Learning Progression for Scientific Argumentation: Understanding Student Work and Designing Supportive Instructional Contexts

    ERIC Educational Resources Information Center

    Berland, Leema K.; McNeill, Katherine L.

    2010-01-01

    Argumentation is a central goal of science education because it engages students in a complex scientific practice in which they construct and justify knowledge claims. Although there is a growing body of research around argumentation, there has been little focus on developing a learning progression for this practice. We describe a learning…

  469. Unweaving Time and Food Chains: Two Classroom Exercises in Scientific and Emotional Literacy.

    ERIC Educational Resources Information Center

    Alsop, Steve; Watts, Mike

    2002-01-01

    Discusses affective dimensions in school science. Uses data from two case studies and explores ways in which science has the potential to stimulate and challenge emotions.
    Discusses the importance of affect in learning, how emotions might feature more centrally in science classrooms, and how definitions of scientific literacy might more explicitly…

  470. Automatic Mexican sign language and digits recognition using normalized central moments

    NASA Astrophysics Data System (ADS)

    Solís, Francisco; Martínez, David; Espinosa, Oscar; Toxqui, Carina

    2016-09-01

    This work presents a framework for automatic Mexican sign language and digits recognition based on a computer vision system using normalized central moments and artificial neural networks. Images are captured by a digital IP camera, four LED reflectors and a green background in order to reduce computational costs and prevent the use of special gloves. 42 normalized central moments are computed per frame and used in a Multi-Layer Perceptron to recognize each database. Four versions per sign and digit were used in the training phase. 93% and 95% recognition rates were achieved for Mexican sign language and digits respectively.
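The feature-extraction step in the record above, normalized central moments of a segmented hand image, can be written down compactly. The function below is a generic sketch of the standard definition (central moments mu_pq divided by mu_00 raised to the power 1 + (p+q)/2), not the authors' 42-feature pipeline; the toy image and the chosen orders are placeholders.

```python
# Sketch of normalized central moments for a grayscale/binary image.
# Standard definition: eta_pq = mu_pq / mu_00 ** (1 + (p + q) / 2),
# which makes the features invariant to translation and scale.
# The toy image and the set of (p, q) orders are placeholders, not the
# 42-feature configuration used in the record above.
import numpy as np

def normalized_central_moments(image, orders):
    """Return {(p, q): eta_pq} for the given moment orders."""
    image = np.asarray(image, dtype=float)
    y, x = np.mgrid[:image.shape[0], :image.shape[1]]
    m00 = image.sum()
    cx = (x * image).sum() / m00          # centroid
    cy = (y * image).sum() / m00
    eta = {}
    for p, q in orders:
        mu_pq = (((x - cx) ** p) * ((y - cy) ** q) * image).sum()
        eta[(p, q)] = mu_pq / m00 ** (1 + (p + q) / 2.0)
    return eta

if __name__ == "__main__":
    img = np.zeros((64, 64))
    img[16:48, 24:40] = 1.0                # a simple bright blob
    feats = normalized_central_moments(img, [(2, 0), (1, 1), (0, 2), (3, 0), (0, 3)])
    print(feats)
```

A feature vector of such moments, computed per frame, is the kind of fixed-length input a multi-layer perceptron can classify.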
  471. What makes computational open source software libraries successful?

    NASA Astrophysics Data System (ADS)

    Bangerth, Wolfgang; Heister, Timo

    2013-01-01

    Software is the backbone of scientific computing. Yet, while we regularly publish detailed accounts about the results of scientific software, and while there is a general sense of which numerical methods work well, our community is largely unaware of best practices in writing the large-scale, open source scientific software upon which our discipline rests. This is particularly apparent in the commonly held view that writing successful software packages is largely the result of simply ‘being a good programmer’ when in fact there are many other factors involved, for example the social skill of community building. In this paper, we consider what we have found to be the necessary ingredients for successful scientific software projects and, in particular, for software libraries upon which the vast majority of scientific codes are built today. In particular, we discuss the roles of code, documentation, communities, project management and licenses. We also briefly comment on the impact on academic careers of engaging in software projects.

  472. Common Readout Unit (CRU) - A new readout architecture for the ALICE experiment

    NASA Astrophysics Data System (ADS)

    Mitra, J.; Khan, S. A.; Mukherjee, S.; Paul, R.

    2016-03-01

    The ALICE experiment at the CERN Large Hadron Collider (LHC) is presently going for a major upgrade in order to fully exploit the scientific potential of the upcoming high luminosity run, scheduled to start in the year 2021. The high interaction rate and the large event size will result in an experimental data flow of about 1 TB/s from the detectors, which needs to be processed before being sent to the online computing system and data storage. This processing is done in a dedicated Common Readout Unit (CRU), proposed for data aggregation, trigger and timing distribution, and control moderation. It acts as a common interface between sub-detector electronic systems, the computing system and trigger processors. The interface links include GBT, TTC-PON and PCIe. GBT (Gigabit transceiver) is used for detector data payload transmission and as a fixed-latency path for trigger distribution between the CRU and detector readout electronics. TTC-PON (Timing, Trigger and Control via Passive Optical Network) is employed for time-multiplexed trigger distribution between the CRU and the Central Trigger Processor (CTP). PCIe (Peripheral Component Interconnect Express) is the high-speed serial computer expansion bus standard used for bulk data transport between CRU boards and processors. In this article, we give an overview of the CRU architecture in ALICE and discuss the different interfaces, along with the firmware design and implementation of the CRU on the LHCb PCIe40 board.

  473. Theoretical Comparison Between Candidates for Dark Matter

    NASA Astrophysics Data System (ADS)

    McKeough, James; Hira, Ajit; Valdez, Alexandra

    2017-01-01

    Since the generally-accepted view among astrophysicists is that the matter component of the universe is mostly dark matter, the search for dark matter particles continues unabated. The Large Underground Xenon (LUX) improvements, aided by advanced computer simulations at the U.S. Department of Energy's Lawrence Berkeley National Laboratory's (Berkeley Lab) National Energy Research Scientific Computing Center (NERSC) and Brown University's Center for Computation and Visualization (CCV), can potentially eliminate some particle models of dark matter. Generally, the proposed candidates can be put in three categories: baryonic dark matter, hot dark matter, and cold dark matter. The Lightest Supersymmetric Particle (LSP) of supersymmetric models is a dark matter candidate, and is classified as a Weakly Interacting Massive Particle (WIMP). Similar to the cosmic microwave background radiation left over from the Big Bang, there is a background of low-energy neutrinos in our Universe. According to some researchers, these may be the explanation for the dark matter. One advantage of the Neutrino Model is that neutrinos are known to exist. Dark matter made from neutrinos is termed "hot dark matter". We formulate a novel empirical function for the average density profile of cosmic voids, identified via the watershed technique in ΛCDM N-body simulations. This function adequately treats both void size and redshift, and describes the scale radius and the central density of voids. We started with a five-parameter model.
  474. Implementing Molecular Dynamics for Hybrid High Performance Computers - 1. Short Range Forces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, W Michael; Wang, Peng; Plimpton, Steven J

    The use of accelerators such as general-purpose graphics processing units (GPGPUs) has become popular in scientific computing applications due to their low cost, impressive floating-point capabilities, high memory bandwidth, and low electrical power requirements. Hybrid high performance computers, machines with more than one type of floating-point processor, are now becoming more prevalent due to these advantages. In this work, we discuss several important issues in porting a large molecular dynamics code for use on parallel hybrid machines: 1) choosing a hybrid parallel decomposition that works on central processing units (CPUs) with distributed memory and accelerator cores with shared memory, 2) minimizing the amount of code that must be ported for efficient acceleration, 3) utilizing the available processing power from both many-core CPUs and accelerators, and 4) choosing a programming model for acceleration. We present our solution to each of these issues for short-range force calculation in the molecular dynamics package LAMMPS. We describe algorithms for efficient short-range force calculation on hybrid high performance machines. We describe a new approach for dynamic load balancing of work between CPU and accelerator cores. We describe the Geryon library that allows a single code to compile with both CUDA and OpenCL for use on a variety of accelerators. Finally, we present results on a parallel test cluster containing 32 Fermi GPGPUs and 180 CPU cores.
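    The abstract mentions dynamic load balancing of work between CPU and accelerator cores but gives no algorithmic detail. The sketch below illustrates one generic feedback rule for such balancing (adjusting the accelerator's share of the particles from measured throughput); it is an assumption-laden illustration, not the LAMMPS implementation, and run_on_gpu / run_on_cpu are placeholder callables.

        import time

        def balanced_step(particles, gpu_fraction, run_on_gpu, run_on_cpu):
            """One timestep: split the particles between accelerator and host,
            then re-estimate the split from the measured execution times."""
            n_gpu = int(len(particles) * gpu_fraction)
            gpu_part, cpu_part = particles[:n_gpu], particles[n_gpu:]

            t0 = time.perf_counter()
            run_on_gpu(gpu_part)      # placeholder for the accelerated force kernel
            t_gpu = time.perf_counter() - t0

            t0 = time.perf_counter()
            run_on_cpu(cpu_part)      # placeholder for the host-side force loop
            t_cpu = time.perf_counter() - t0

            # Throughput-proportional update of the split (particles per second).
            r_gpu = len(gpu_part) / max(t_gpu, 1e-9)
            r_cpu = len(cpu_part) / max(t_cpu, 1e-9)
            target = r_gpu / (r_gpu + r_cpu)
            # Damp the update so one noisy timing does not cause oscillation.
            return 0.5 * gpu_fraction + 0.5 * target

    A production code would launch the two partitions concurrently and overlap data transfer with computation; the snippet only shows the feedback rule for the split.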
  475. Mapping the dengue scientific landscape worldwide: a bibliometric and network analysis

    PubMed

    Mota, Fabio Batista; Fonseca, Bruna de Paula Fonseca E; Galina, Andréia Cristina; Silva, Roseli Monteiro da

    2017-05-01

    Despite the current global trend of reduction in the morbidity and mortality of neglected diseases, dengue's incidence has increased and its occurrence areas have expanded. Dengue also persists as a scientific and technological challenge, since there is no effective treatment, vaccine, vector control or public health intervention. Combining bibliometrics and social network analysis methods can support the mapping of dengue research and development (R&D) activities worldwide. The aim of this paper is to map the scientific scenario related to dengue research worldwide. We use scientific publication data from the Web of Science Core Collection - articles indexed in the Science Citation Index Expanded (SCI-EXPANDED) - and combine bibliometrics and social network analysis techniques to identify the most relevant journals, scientific references, research areas, countries and research organisations in the dengue scientific landscape. Our results show a significant increase of dengue publications over time; tropical medicine and virology as the most frequent research areas, and biochemistry and molecular biology as the most central area in the network; the USA and Brazil as the most productive countries; and Mahidol University and Fundação Oswaldo Cruz as the main research organisations, with the Centres for Disease Control and Prevention as the most central organisation in the collaboration network. Our findings can be used to strengthen a global knowledge platform guiding policy, planning and funding decisions, as well as to provide direction to researchers and institutions. By offering the scientific community, policy makers and public health practitioners a mapping of the dengue scientific landscape, this paper aims to contribute to upcoming debates, decision-making and planning on dengue R&D and public health strategies worldwide.
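    The abstract reports which organisation is "most central" in the collaboration network. A minimal sketch of that kind of measurement on a toy co-authorship graph is shown below; the organisation names and edge weights are placeholders, and the abstract does not state which centrality measure the authors used.

        import networkx as nx

        # Toy collaboration network: nodes are organisations, edges are co-authored papers.
        G = nx.Graph()
        G.add_weighted_edges_from([
            ("Org A", "Org B", 12),   # 12 co-authored publications (illustrative numbers)
            ("Org A", "Org C", 5),
            ("Org B", "Org C", 7),
            ("Org C", "Org D", 2),
        ])

        # Degree centrality counts direct collaborators; betweenness captures "bridging" roles.
        degree = nx.degree_centrality(G)
        betweenness = nx.betweenness_centrality(G)

        most_central = max(betweenness, key=betweenness.get)
        print(most_central, degree[most_central], betweenness[most_central])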
  476. Some Thoughts Regarding Practical Quantum Computing

    NASA Astrophysics Data System (ADS)

    Ghoshal, Debabrata; Gomez, Richard; Lanzagorta, Marco; Uhlmann, Jeffrey

    2006-03-01

    Quantum computing has become an important area of research in computer science because of its potential to provide more efficient algorithmic solutions to certain problems than are possible with classical computing. The ability to perform parallel operations over an exponentially large computational space has proved to be the main advantage of the quantum computing model. In this regard, we are particularly interested in the potential applications of quantum computers to enhance real software systems of interest to the defense, industrial, scientific and financial communities. However, while much has been written in the popular and scientific literature about the benefits of the quantum computational model, several of the problems associated with the practical implementation of real-life complex software systems on quantum computers are often ignored. In this presentation we will argue that practical quantum computation is not as straightforward as commonly advertised, even if the technological problems associated with the manufacturing and engineering of large-scale quantum registers were solved overnight. We will discuss some of the frequently overlooked difficulties that plague quantum computing in the areas of memories, I/O, addressing schemes, compilers, oracles, approximate information copying, logical debugging, error correction and fault-tolerant computing protocols.

  477. Comparison of Scientific Calipers and Computer-Enabled CT Review for the Measurement of Skull Base and Craniomaxillofacial Dimensions

    PubMed Central

    Citardi, Martin J.; Herrmann, Brian; Hollenbeak, Chris S.; Stack, Brendan C.; Cooper, Margaret; Bucholz, Richard D.

    2001-01-01

    Traditionally, cadaveric studies and plain-film cephalometrics provided information about craniomaxillofacial proportions and measurements; however, advances in computer technology now permit software-based review of computed tomography (CT)-based models. Distances between standardized anatomic points were measured on five dried human skulls with standard scientific calipers (Geneva Gauge, Albany, NY) and through computer workstation (StealthStation 2.6.4, Medtronic Surgical Navigation Technology, Louisville, CO) review of corresponding CT scans. Differences in measurements between the caliper and CT model were not statistically significant for each parameter. Measurements obtained by computer workstation CT review of the cranial skull base are an accurate representation of actual bony anatomy. Such information has important implications for surgical planning and clinical research. Figures 1-3. PMID:17167599

  478. Seventy Years of Computing in the Nuclear Weapons Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Archer, Billy Joe

    Los Alamos has continuously been at the forefront of scientific computing since it helped found the field. This talk will explore the rich history of computing in the Los Alamos weapons program. The current status of computing will be discussed, as will the expectations for the near future.
  479. Teaching Reductive Thinking

    ERIC Educational Resources Information Center

    Armoni, Michal; Gal-Ezer, Judith

    2005-01-01

    When dealing with a complex problem, solving it by reduction to simpler problems, or problems for which the solution is already known, is a common method in mathematics and other scientific disciplines, as in computer science and, specifically, in the field of computability. However, when teaching computational models (as part of computability)…

  480. PubMed Central Canada: Beyond an Open Access Repository?

    ERIC Educational Resources Information Center

    Nariani, Rajiv

    2013-01-01

    PubMed Central Canada (PMC Canada) represents a partnership between the Canadian Institutes of Health Research (CIHR), the National Research Council's Canada Institute for Scientific and Technical Information (NRC-CISTI), and the National Library of Medicine of the US. The present study was done to gauge faculty awareness about the CIHR Policy on…
  481. [Contribution of the Vishnevsky Central Military Clinical Hospital N 3 to the history of combat casualty care and delivery of care to the injured soldiers]

    PubMed

    Beliakin, S A; Dolgikh, R N; Fokin, Iu N

    2013-05-01

    The article is dedicated to the 45-year history of combat casualty care in the Vishnevsky Central Military Clinical Hospital N 3. In the echelon system of medical care the hospital occupies echelon (level) N 3. Specialists of the hospital, alongside medical and preventive work, carry out methodological, educational and innovative activities and participate in different scientific forums. Temporary duty assignment to combat, human-made disaster and natural disaster areas is a real functional test; 64 physicians have experience of such extreme situations. The Vishnevsky Central Military Clinical Hospital N 3 is a clinical base of the department of surgery, the advanced physician training department and the combat casualty care department of the Institute for Advanced Physician Training of the Mandryka scientific and educational clinical center. For the purpose of reducing treatment times and improving the quality of medical care, it was suggested to establish integration links with leading hospitals of the Ministry of Defense of the Russian Federation.

  482. 2014 Annual Report - Argonne Leadership Computing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, James R.; Papka, Michael E.; Cerny, Beth A.

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  483. 2015 Annual Report - Argonne Leadership Computing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, James R.; Papka, Michael E.; Cerny, Beth A.

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.
  484. Constructing Scientific Arguments Using Evidence from Dynamic Computational Climate Models

    NASA Astrophysics Data System (ADS)

    Pallant, Amy; Lee, Hee-Sun

    2015-04-01

    Modeling and argumentation are two important scientific practices students need to develop throughout their school years. In this paper, we investigated how middle and high school students (N = 512) construct a scientific argument based on evidence from computational models with which they simulated climate change. We designed scientific argumentation tasks with three increasingly complex dynamic climate models. Each scientific argumentation task consisted of four parts: multiple-choice claim, open-ended explanation, five-point Likert scale uncertainty rating, and open-ended uncertainty rationale. We coded 1,294 scientific arguments in terms of a claim's consistency with current scientific consensus, whether explanations were model-based or knowledge-based, and categorized the sources of uncertainty (personal vs. scientific). We used chi-square and ANOVA tests to identify significant patterns. Results indicate that (1) a majority of students incorporated models as evidence to support their claims, (2) most students used model output results shown on graphs to confirm their claim rather than to explain simulated molecular processes, (3) students' dependence on model results and their uncertainty rating diminished as the dynamic climate models became more and more complex, (4) some students' misconceptions interfered with observing and interpreting model results or simulated processes, and (5) students' uncertainty sources reflected more frequently on their assessment of personal knowledge or abilities related to the tasks than on their critical examination of scientific evidence resulting from models. These findings have implications for teaching and research related to the integration of scientific argumentation and modeling practices to address complex Earth systems.

  485. A View of the Science Education Research Literature: Scientific Discovery Learning with Computer Simulations

    ERIC Educational Resources Information Center

    Robinson, William R.

    2000-01-01

    Describes a review of research that addresses the effectiveness of simulations in promoting scientific discovery learning and the problems that learners may encounter when using discovery learning. (WRM)

  486. Haze in central China

    NASA Image and Video Library

    2014-01-24

    The Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA's Terra satellite acquired this natural-color image of central China on January 23, 2013 at 04:05 UTC. The image shows extensive haze over the region. In areas where the ground is visible, some of the landscape is covered with lingering snow. Credit: NASA/GSFC/Jeff Schmaltz/MODIS Land Rapid Response Team
  487. Four Frames Suffice. A Provisionary Model of Vision and Space

    DTIC Science & Technology

    1982-09-01

    Introduction: This paper is an attempt to specify a computationally and scientifically plausible model of how… abstract neural computing unit and a variety of constructions built of these units and their properties. All of this is part of the connectionist… chosen are intended to elucidate the major scientific problems in intermediate-level vision and would not be the best choice for a practical computer…

  488. Applications of artificial intelligence to scientific research

    NASA Technical Reports Server (NTRS)

    Prince, Mary Ellen

    1986-01-01

    Artificial intelligence (AI) is a growing field which is just beginning to make an impact on disciplines other than computer science. While a number of military and commercial applications have been undertaken in recent years, few attempts have been made to apply AI techniques to basic scientific research. There is no inherent reason for the discrepancy. The characteristics of the problem, rather than its domain, determine whether or not it is suitable for an AI approach. Expert systems, intelligent tutoring systems, and learning programs are examples of theoretical topics which can be applied to certain areas of scientific research. Further research and experimentation should eventually make it possible for computers to act as intelligent assistants to scientists.
  489. Analysis of the flight dynamics of the Solar Maximum Mission (SMM) off-sun scientific pointing

    NASA Technical Reports Server (NTRS)

    Pitone, D. S.; Klein, J. R.

    1989-01-01

    Algorithms are presented which were created and implemented by the Goddard Space Flight Center's (GSFC's) Solar Maximum Mission (SMM) attitude operations team to support large-angle spacecraft pointing at scientific objectives. The mission objective of the post-repair SMM satellite was to study solar phenomena. However, because the scientific instruments, such as the Coronagraph/Polarimeter (CP) and the Hard X-ray Burst Spectrometer (HXRBS), were able to view objects other than the Sun, attitude operations support for attitude pointing at large angles from the nominal solar-pointing attitudes was required. Subsequently, attitude support for SMM was provided for scientific objectives such as Comet Halley, Supernova 1987A, Cygnus X-1, and the Crab Nebula. In addition, the analysis was extended to include the reverse problem, computing the right ascension and declination of a body given the off-Sun angles. This analysis led to the computation of the orbits of seven new solar comets seen in the field-of-view (FOV) of the CP. The activities necessary to meet these large-angle attitude-pointing sequences, such as slew sequence planning, viewing-period prediction, and tracking-bias computation, are described. Analysis is presented for the computation of maneuvers and pointing parameters relative to the SMM-unique, Sun-centered reference frame. Finally, science data and independent attitude solutions are used to evaluate the large-angle pointing performance.

  490. Analysis of the flight dynamics of the Solar Maximum Mission (SMM) off-sun scientific pointing

    NASA Technical Reports Server (NTRS)

    Pitone, D. S.; Klein, J. R.; Twambly, B. J.

    1990-01-01

    Algorithms are presented which were created and implemented by the Goddard Space Flight Center's (GSFC's) Solar Maximum Mission (SMM) attitude operations team to support large-angle spacecraft pointing at scientific objectives. The mission objective of the post-repair SMM satellite was to study solar phenomena. However, because the scientific instruments, such as the Coronagraph/Polarimeter (CP) and the Hard X-ray Burst Spectrometer (HXRBS), were able to view objects other than the Sun, attitude operations support for attitude pointing at large angles from the nominal solar-pointing attitudes was required. Subsequently, attitude support for SMM was provided for scientific objectives such as Comet Halley, Supernova 1987A, Cygnus X-1, and the Crab Nebula. In addition, the analysis was extended to include the reverse problem, computing the right ascension and declination of a body given the off-Sun angles. This analysis led to the computation of the orbits of seven new solar comets seen in the field-of-view (FOV) of the CP. The activities necessary to meet these large-angle attitude-pointing sequences, such as slew sequence planning, viewing-period prediction, and tracking-bias computation, are described. Analysis is presented for the computation of maneuvers and pointing parameters relative to the SMM-unique, Sun-centered reference frame. Finally, science data and independent attitude solutions are used to evaluate the large-angle pointing performance.
  491. Scientific Library Offers New Training Options | Poster

    Cancer.gov

    The Scientific Library is expanding its current training opportunities by offering webinars, allowing employees to take advantage of trainings from the comfort of their own offices. Due to the nature of their work, some employees find it inconvenient to attend in-person training classes; others simply prefer to use their own computers. The Scientific Library has been…

  492. Games as a Platform for Student Participation in Authentic Scientific Research

    ERIC Educational Resources Information Center

    Magnussen, Rikke; Hansen, Sidse Damgaard; Planke, Tilo; Sherson, Jacob Friis

    2014-01-01

    This paper presents results from the design and testing of an educational version of Quantum Moves, a Scientific Discovery Game that allows players to help solve authentic scientific challenges in the effort to develop a quantum computer. The primary aim of developing a game-based platform for student-research collaboration is to investigate if…

  493. DB90: A Fortran Callable Relational Database Routine for Scientific and Engineering Computer Programs

    NASA Technical Reports Server (NTRS)

    Wrenn, Gregory A.

    2005-01-01

    This report describes a database routine called DB90 which is intended for use with scientific and engineering computer programs. The software is written in the Fortran 90/95 programming language standard with file input and output routines written in the C programming language. These routines should be completely portable to any computing platform and operating system that has Fortran 90/95 and C compilers. DB90 allows a program to supply relation names and up to 5 integer key values to uniquely identify each record of each relation. This permits the user to select records or retrieve data in any desired order.
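    DB90 itself is Fortran 90/95 with C file I/O, and its actual interface is not shown in the abstract. As a language-neutral sketch of the keying scheme the abstract describes (a relation name plus up to five integer keys identifying each record), the toy Python class below is illustrative only and does not reflect DB90's real API.

        class ToyRelationalStore:
            """Toy analogue of a DB90-style store: records are addressed by a
            relation name plus up to five integer key values."""

            MAX_KEYS = 5

            def __init__(self):
                self._tables = {}      # relation name -> {key tuple: record}

            def put(self, relation, keys, record):
                if not 1 <= len(keys) <= self.MAX_KEYS:
                    raise ValueError("between 1 and 5 integer keys are required")
                self._tables.setdefault(relation, {})[tuple(int(k) for k in keys)] = record

            def get(self, relation, keys):
                return self._tables[relation][tuple(int(k) for k in keys)]

            def records_sorted(self, relation):
                """Return the records of a relation ordered by their key tuples."""
                table = self._tables.get(relation, {})
                return [table[k] for k in sorted(table)]

        # Example: store a load case keyed by (component id, load case, iteration).
        store = ToyRelationalStore()
        store.put("loads", (3, 1, 0), {"fx": 120.0, "fy": -45.0})
        print(store.get("loads", (3, 1, 0)))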
  494. An Analysis of the Abstracts Presented at the Annual Meetings of the Society for Neuroscience from 2001 to 2006

    PubMed Central

    Lin, John M.; Bohland, Jason W.; Andrews, Peter; Burns, Gully A. P. C.; Allen, Cara B.; Mitra, Partha P.

    2008-01-01

    Annual meeting abstracts published by scientific societies often contain rich arrays of information that can be computationally mined and distilled to elucidate the state and dynamics of the subject field. We extracted and processed abstract data from the Society for Neuroscience (SFN) annual meeting abstracts during the period 2001–2006 in order to gain an objective view of contemporary neuroscience. An important first step in the process was the application of data cleaning and disambiguation methods to construct a unified database, since the data were too noisy to be of full utility in the raw form initially available. Using natural language processing, text mining, and other data analysis techniques, we then examined the demographics and structure of the scientific collaboration network, the dynamics of the field over time, major research trends, and the structure of the sources of research funding. Some interesting findings include a high geographical concentration of neuroscience research in the northeastern United States, a surprisingly large transient population (66% of the authors appear in only one out of the six studied years), the central role played by the study of neurodegenerative disorders in the neuroscience community, and an apparent growth of behavioral/systems neuroscience with a corresponding shrinkage of cellular/molecular neuroscience over the six-year period. The results from this work will prove useful for scientists, policy makers, and funding agencies seeking to gain a complete and unbiased picture of the community structure and body of knowledge encapsulated by a specific scientific domain. PMID:18446237
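    The abstract singles out data cleaning and disambiguation as the essential first step before any analysis. Purely as a minimal illustration of that step (the pipeline used in the paper is far more involved and is not described in the abstract), one might collapse author strings to a crude key before matching records:

        import re
        import unicodedata

        def author_key(raw_name):
            """Collapse an author string such as 'Lin, John M.' or 'J. M. Lin'
            into a crude 'lastname initials' key for coarse duplicate matching."""
            # Strip accents and punctuation, lower-case everything.
            text = unicodedata.normalize("NFKD", raw_name).encode("ascii", "ignore").decode()
            text = re.sub(r"[^A-Za-z,\s]", " ", text).lower().strip()
            if "," in text:                      # "last, first middle"
                last, rest = [part.strip() for part in text.split(",", 1)]
            else:                                # "first middle last"
                parts = text.split()
                last, rest = parts[-1], " ".join(parts[:-1])
            initials = "".join(word[0] for word in rest.split())
            return f"{last} {initials}"

        print(author_key("Lin, John M."))   # -> "lin jm"
        print(author_key("J. M. Lin"))      # -> "lin jm"

    Real disambiguation also has to separate different people who share a key, typically by comparing affiliations and co-author overlap.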
  495. Adverse Drug Event Discovery Using Biomedical Literature: A Big Data Neural Network Adventure

    PubMed Central

    Badger, Jonathan; LaRose, Eric; Shirzadi, Ehsan; Mahnke, Andrea; Mayer, John; Ye, Zhan; Page, David; Peissig, Peggy

    2017-01-01

    Background: The study of adverse drug events (ADEs) is a long-established topic in the medical literature. In recent years, increasing numbers of scientific articles and health-related social media posts have been generated and shared daily, albeit with very limited use for ADE study and with little known about their content with respect to ADEs. Objective: The aim of this study was to develop a big data analytics strategy that mines the content of scientific articles and health-related Web-based social media to detect and identify ADEs. Methods: We analyzed two data sources: (1) biomedical articles and (2) health-related social media blog posts. We developed an intelligent and scalable text mining solution on big data infrastructures composed of Apache Spark, natural language processing, and machine learning, combined with an Elasticsearch NoSQL distributed database to explore and visualize ADEs. Results: The accuracy, precision, recall, and area under the receiver operating characteristic curve of the system were 92.7%, 93.6%, 93.0%, and 0.905, respectively, showing better results than traditional approaches in the literature. This work not only detected and classified ADE sentences from big data biomedical literature but also scientifically visualized ADE interactions. Conclusions: To the best of our knowledge, this work is the first to investigate a big data machine learning strategy for ADE discovery on massive datasets downloaded from PubMed Central and social media. This contribution illustrates possible capacities in big data biomedical text analysis using advanced computational methods with real-time updates from new data published on a daily basis. PMID:29222076
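    The figures reported above (accuracy 92.7%, precision 93.6%, recall 93.0%, AUROC 0.905) are standard classification metrics, and computing them for any sentence-level ADE classifier is routine, e.g. with scikit-learn. The labels and scores below are placeholders, not data from the paper.

        from sklearn.metrics import accuracy_score, precision_score, recall_score, roc_auc_score

        # y_true: 1 = sentence describes an adverse drug event, 0 = it does not.
        y_true  = [1, 0, 1, 1, 0, 0, 1, 0]                     # placeholder gold labels
        y_score = [0.9, 0.2, 0.7, 0.6, 0.4, 0.1, 0.8, 0.3]     # classifier probabilities
        y_pred  = [1 if s >= 0.5 else 0 for s in y_score]      # hard decisions at a 0.5 threshold

        print("accuracy ", accuracy_score(y_true, y_pred))
        print("precision", precision_score(y_true, y_pred))
        print("recall   ", recall_score(y_true, y_pred))
        print("AUROC    ", roc_auc_score(y_true, y_score))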
  496. Visualization and Interaction in Research, Teaching, and Scientific Communication

    NASA Astrophysics Data System (ADS)

    Ammon, C. J.

    2017-12-01

    Modern computing provides many tools for exploring observations, numerical calculations, and theoretical relationships. The number of options is, in fact, almost overwhelming. But the choices provide those with modest programming skills opportunities to create unique views of scientific information and to develop deeper insights into their data, their computations, and the underlying theoretical data-model relationships. I present simple examples of using animation and human-computer interaction to explore scientific data and scientific-analysis approaches, and I illustrate how a little programming ability can free scientists from the constraints of existing tools and facilitate a deeper appreciation of data and models. I present examples from a suite of programming languages ranging from C to JavaScript, including the Wolfram Language. JavaScript is valuable for sharing tools and insight (hopefully) with others because it is integrated into one of the most powerful communication tools in human history, the web browser. Although too much of that power is often spent on distracting advertisements, the underlying computation and graphics engines are efficient, flexible, and almost universally available in desktop and mobile computing platforms. Many are working to fulfill the browser's potential to become the most effective tool for interactive study. Open-source frameworks for visualizing everything from algorithms to data are available, but advance rapidly. One strategy for dealing with swiftly changing tools is to adopt common, open data formats that are easily adapted (often by framework or tool developers). I illustrate the use of animation and interaction in research and teaching with examples from earthquake seismology.

  497. Evaluation of Emerging Energy-Efficient Heterogeneous Computing Platforms for Biomolecular and Cellular Simulation Workloads

    PubMed Central

    Stone, John E.; Hallock, Michael J.; Phillips, James C.; Peterson, Joseph R.; Luthey-Schulten, Zaida; Schulten, Klaus

    2016-01-01

    Many of the continuing scientific advances achieved through computational biology are predicated on the availability of ongoing increases in computational power required for detailed simulation and analysis of cellular processes on biologically relevant timescales. A critical challenge facing the development of future exascale supercomputer systems is the development of new computing hardware and associated scientific applications that dramatically improve upon the energy efficiency of existing solutions, while providing increased simulation, analysis, and visualization performance. Mobile computing platforms have recently become powerful enough to support interactive molecular visualization tasks that were previously only possible on laptops and workstations, creating future opportunities for their convenient use for meetings, remote collaboration, and as head-mounted displays for immersive stereoscopic viewing. We describe early experiences adapting several biomolecular simulation and analysis applications for emerging heterogeneous computing platforms that combine power-efficient system-on-chip multi-core CPUs with high-performance massively parallel GPUs. We present low-cost power monitoring instrumentation that provides sufficient temporal resolution to evaluate the power consumption of individual CPU algorithms and GPU kernels. We compare the performance and energy efficiency of scientific applications running on emerging platforms with results obtained on traditional platforms, identify hardware and algorithmic performance bottlenecks that affect the usability of these platforms, and describe avenues for improving both the hardware and applications in pursuit of the needs of molecular modeling tasks on mobile devices and future exascale computers. PMID:27516922
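    The per-kernel energy evaluation mentioned above boils down to integrating sampled power over a kernel's runtime. The sketch below assumes a hypothetical read_power_watts() callback standing in for whatever power meter is available; it is a generic illustration, not the instrumentation described in the paper.

        import threading
        import time

        def measure_energy(kernel, read_power_watts, sample_period=0.01):
            """Run `kernel` while sampling power; return (runtime_s, energy_J)
            from trapezoidal integration of the power samples."""
            stamps, samples = [], []
            done = threading.Event()

            def sampler():
                while not done.is_set():
                    stamps.append(time.perf_counter())
                    samples.append(read_power_watts())   # hypothetical meter read
                    time.sleep(sample_period)

            thread = threading.Thread(target=sampler)
            thread.start()
            t0 = time.perf_counter()
            kernel()                                      # the workload being measured
            runtime = time.perf_counter() - t0
            done.set()
            thread.join()

            energy = sum(0.5 * (samples[i] + samples[i + 1]) * (stamps[i + 1] - stamps[i])
                         for i in range(len(samples) - 1))
            return runtime, energy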
  498. The StratusLab cloud distribution: Use-cases and support for scientific applications

    NASA Astrophysics Data System (ADS)

    Floros, E.

    2012-04-01

    The StratusLab project is integrating an open cloud software distribution that enables organizations to set up and provide their own private or public IaaS (Infrastructure as a Service) computing clouds. The StratusLab distribution capitalizes on popular infrastructure virtualization solutions like KVM, the OpenNebula virtual machine manager, the Claudia service manager and the SlipStream deployment platform, which are further enhanced and expanded with additional components developed within the project. The StratusLab distribution covers the core aspects of a cloud IaaS architecture, namely computing (life-cycle management of virtual machines), storage, appliance management and networking. The resulting software stack provides a packaged turn-key solution for deploying cloud computing services. The cloud computing infrastructures deployed using StratusLab can support a wide range of scientific and business use cases. Grid computing has been the primary use case pursued by the project, and for this reason the initial priority has been support for the deployment and operation of fully virtualized production-level grid sites; this goal has already been achieved by operating such a site as part of EGI's (European Grid Initiative) pan-European grid infrastructure. In this area the project is currently working to provide non-trivial capabilities like elastic and autonomic management of grid site resources. Although grid computing has been the motivating paradigm, StratusLab's cloud distribution can support a wider range of use cases. In this direction, we have developed and currently provide support for setting up general-purpose computing solutions like Hadoop, MPI and Torque clusters. Regarding scientific applications, the project is collaborating closely with the bioinformatics community in order to prepare VM appliances and deploy optimized services for bioinformatics applications. In a similar manner, additional scientific disciplines like Earth science can take advantage of StratusLab cloud solutions. Interested users are welcome to join StratusLab's user community by getting access to the reference cloud services deployed by the project and offered to the public.

  499. Ethical muscle and scientific interests: a role for philosophy in scientific research

    PubMed

    Kaposy, Chris

    2008-03-01

    Ethics, a branch of philosophy, has a place in the regulatory framework of human subjects research. Sometimes, however, ethical concepts and arguments play a more central role in scientific activity. This can happen, for example, when violations of research norms are also ethical violations. In such a situation, ethical arguments can be marshaled to improve the quality of the scientific research. I explore two different examples in which philosophers and scientists have used ethical arguments to plead for epistemological improvements in the conduct of research. The first example deals with research dishonesty in pharmaceutical development. The second example is concerned with neuropsychological research using fMRI technology.

  500. The Organization and Evaluation of a Computer-Assisted, Centralized Immunization Registry

    ERIC Educational Resources Information Center

    Loeser, Helen; And Others

    1983-01-01

    Evaluation of a computer-assisted, centralized immunization registry after one year shows that 93 percent of eligible health practitioners initially agreed to provide data and that 73 percent continue to do so. Immunization rates in audited groups have improved significantly.
    (GC)