Status of GDL - GNU Data Language
NASA Astrophysics Data System (ADS)
Coulais, A.; Schellens, M.; Gales, J.; Arabas, S.; Boquien, M.; Chanial, P.; Messmer, P.; Fillmore, D.; Poplawski, O.; Maret, S.; Marchal, G.; Galmiche, N.; Mermet, T.
2010-12-01
GNU Data Language (GDL) is an open-source interpreted language aimed at numerical data analysis and visualisation. It is a free implementation of the Interactive Data Language (IDL) widely used in astronomy. GDL has full syntax compatibility with IDL and includes a large set of library routines targeting advanced matrix manipulation, plotting, time-series and image analysis, mapping, and data input/output, including numerous scientific data formats. We will present the current status of the project, the key accomplishments, and the weaknesses - areas where contributions are welcome!
NASA Astrophysics Data System (ADS)
Noreen, Amna; Olaussen, Kåre
2012-10-01
A subroutine for a very-high-precision numerical solution of a class of ordinary differential equations is provided. For a given evaluation point and equation parameters the memory requirement scales linearly with precision P, and the number of algebraic operations scales roughly linearly with P when P becomes sufficiently large. We discuss results from extensive tests of the code, and how one, for a given evaluation point and equation parameters, may estimate precision loss and computing time in advance. Program summary Program title: seriesSolveOde1 Catalogue identifier: AEMW_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMW_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 991 No. of bytes in distributed program, including test data, etc.: 488116 Distribution format: tar.gz Programming language: C++ Computer: PC's or higher performance computers. Operating system: Linux and MacOS RAM: Few to many megabytes (problem dependent). Classification: 2.7, 4.3 External routines: CLN — Class Library for Numbers [1] built with the GNU MP library [2], and GSL — GNU Scientific Library [3] (only for time measurements). Nature of problem: The differential equation $-s^2\left(\frac{d^2}{dz^2}+\frac{1-\nu_+-\nu_-}{z}\frac{d}{dz}+\frac{\nu_+\nu_-}{z^2}\right)\psi(z)+\frac{1}{z}\sum_{n=0}^{N} v_n z^n\,\psi(z)=0$ is solved numerically to very high precision. The evaluation point z and some or all of the equation parameters may be complex numbers; some or all of them may be represented exactly in terms of rational numbers. Solution method: The solution $\psi(z)$, and optionally $\psi'(z)$, is evaluated at the point z by executing the recursion $A_{m+1}(z)=\frac{s^{-2}}{(m+1+\nu-\nu_+)(m+1+\nu-\nu_-)}\sum_{n=0}^{N} V_n(z)\,A_{m-n}(z)$, $\psi_{m+1}(z)=\psi_m(z)+A_{m+1}(z)$, to sufficiently large m. Here $\nu$ is either $\nu_+$ or $\nu_-$, and $V_n(z)=v_n z^{n+1}$. The recursion is initialized by $A_{-n}(z)=\delta_{n0}\,z^{\nu}$ for $n=0,1,\ldots,N$ and $\psi_0(z)=A_0(z)$. Restrictions: No solution is computed if $z=0$, or $s=0$, or if $\nu=\nu_-$ (assuming $\mathrm{Re}\,\nu_+\geq\mathrm{Re}\,\nu_-$) with $\nu_+-\nu_-$ an integer, except when $\nu_+-\nu_-=1$ and $v_0=0$ (i.e. when $z=0$ is an ordinary point for $z\,\psi(z)$). Additional comments: The code of the main algorithm is in the file seriesSolveOde1.cc, which "#include"s the file checkForBreakOde1.cc. These routines, and the programs using them, must "#include" the file seriesSolveOde1.cc. Running time: On a Linux PC that is a few years old, at $y=\sqrt{10}$ to an accuracy of P=200 decimal digits, evaluating the ground-state wavefunction of the anharmonic oscillator (with the eigenvalue known in advance; cf. Eq. (6)) takes about 2 ms, and about 40 min at an accuracy of P=100000 decimal digits. References: [1] B. Haible and R.B. Kreckel, CLN — Class Library for Numbers, http://www.ginac.de/CLN/ [2] T. Granlund and collaborators, GMP — The GNU Multiple Precision Arithmetic Library, http://gmplib.org/ [3] M. Galassi et al., GNU Scientific Library Reference Manual (3rd Ed.), ISBN 0954612078, http://www.gnu.org/software/gsl/
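To make the structure of the recursion concrete, the following is a minimal double-precision C++ sketch of the series summation described above. It is illustrative only: the distributed subroutine works with CLN arbitrary-precision types and adds the precision-loss monitoring of checkForBreakOde1.cc, and the function and variable names below are ours, not the package's API.

    #include <complex>
    #include <cstdio>
    #include <vector>

    // psi(z) = sum_m A_m(z), with A_0(z) = z^nu, A_m(z) = 0 for m < 0, and
    // A_{m+1}(z) = s^{-2}/((m+1+nu-nu_p)(m+1+nu-nu_m)) * sum_n v_n z^{n+1} A_{m-n}(z).
    using cplx = std::complex<double>;

    cplx series_psi(cplx z, cplx s, cplx nu, cplx nu_p, cplx nu_m,
                    const std::vector<cplx>& v, int mmax)
    {
        const std::size_t N = v.size() - 1;          // degree of the potential polynomial
        std::vector<cplx> A(mmax + 1);
        A[0] = std::pow(z, nu);                      // A_0(z) = z^nu
        cplx psi = A[0];
        for (int m = 0; m < mmax; ++m) {
            cplx sum = 0.0;
            for (std::size_t n = 0; n <= N && (int)n <= m; ++n)
                sum += v[n] * std::pow(z, double(n) + 1.0) * A[m - n];
            A[m + 1] = sum / (s * s * (double(m) + 1.0 + nu - nu_p)
                                    * (double(m) + 1.0 + nu - nu_m));
            psi += A[m + 1];
        }
        return psi;                                  // truncated series for psi(z)
    }

    int main()
    {
        // Toy case: s = 1, nu_+ = 1, nu_- = 0, potential v(z) = z (v_0 = 0, v_1 = 1),
        // for which the equation reduces to psi'' = psi and the series gives sinh(z).
        std::vector<cplx> v = {0.0, 1.0};
        cplx psi = series_psi(0.5, 1.0, 1.0, 1.0, 0.0, v, 200);
        std::printf("psi(0.5) ~ %.12g + %.12gi\n", psi.real(), psi.imag());
        return 0;
    }

In double precision the truncation order mmax can be fixed by hand; the published code instead monitors the terms to reach a requested number of decimal digits P.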
Gnuastro: GNU Astronomy Utilities
NASA Astrophysics Data System (ADS)
Akhlaghi, Mohammad
2018-01-01
Gnuastro (GNU Astronomy Utilities) manipulates and analyzes astronomical data. It is an official GNU package comprising a large collection of programs and C/C++ library functions. Command-line programs perform arithmetic operations on images, convert FITS images to common types like JPG or PDF, convolve an image with a given kernel or match kernels, perform cosmological calculations, crop parts of large images (possibly in multiple files), manipulate FITS extensions and keywords, and perform statistical operations. In addition, it contains programs to make catalogs from detection maps, add noise, make mock profiles with a variety of radial functions using Monte Carlo integration for their centers, match catalogs, and detect objects in an image, among many other operations. The command-line programs share the same basic command-line user interface for the comfort of both users and developers. Gnuastro is written to comply fully with the GNU coding standards and integrates well with all Unix-like operating systems. This gives astronomers a fully familiar experience in the source code, building, installation and command-line interaction that they know from all the other GNU software that they use. Gnuastro's extensive library is included for users who want to build their own unique programs.
GNU Data Language (GDL) - a free and open-source implementation of IDL
NASA Astrophysics Data System (ADS)
Arabas, Sylwester; Schellens, Marc; Coulais, Alain; Gales, Joel; Messmer, Peter
2010-05-01
GNU Data Language (GDL) is developed with the aim of providing an open-source drop-in replacement for ITTVIS's Interactive Data Language (IDL). It is free software developed by an international team of volunteers led by Marc Schellens - the project's founder (a list of contributors is available on the project's website). The development is hosted on SourceForge where GDL continuously ranks in the 99th percentile of most active projects. GDL with its library routines is designed as a tool for numerical data analysis and visualisation. Like its proprietary counterparts (IDL and PV-WAVE), GDL is used particularly in geosciences and astronomy. GDL is dynamically-typed, vectorized and has object-oriented programming capabilities. The library routines handle numerical calculations, data visualisation, signal/image processing, interaction with the host OS and data input/output. GDL supports several data formats such as netCDF, HDF4, HDF5, GRIB, PNG, TIFF, DICOM, etc. Graphical output is handled by X11, PostScript, SVG or z-buffer terminals, the last one allowing output to be saved in a variety of raster graphics formats. GDL is an incremental compiler with integrated debugging facilities. It is written in C++ using the ANTLR language-recognition framework. Most of the library routines are implemented as interfaces to open-source packages such as the GNU Scientific Library, PLplot, FFTW, ImageMagick, and others. GDL features a Python bridge (Python code can be called from GDL; GDL can be compiled as a Python module). Extensions to GDL can be written in C++, GDL, and Python. A number of open-source software libraries written in IDL, such as the NASA Astronomy Library, MPFIT, CMSVLIB and TeXtoIDL, are fully or partially functional under GDL. Packaged versions of GDL are available for several Linux distributions and Mac OS X. The source code compiles on some other UNIX systems, including BSD and OpenSolaris. The presentation will cover the current status of the project, the key accomplishments, and the weaknesses - areas where contributions and users' feedback are welcome! While still in the beta stage of development, GDL has proved to be a useful tool for classroom work on data analysis. Its use for teaching meteorological data processing at the University of Warsaw will serve as an example.
XMDS2: Fast, scalable simulation of coupled stochastic partial differential equations
NASA Astrophysics Data System (ADS)
Dennis, Graham R.; Hope, Joseph J.; Johnsson, Mattias T.
2013-01-01
XMDS2 is a cross-platform, GPL-licensed, open source package for numerically integrating initial value problems that range from a single ordinary differential equation up to systems of coupled stochastic partial differential equations. The equations are described in a high-level XML-based script, and the package generates low-level optionally parallelised C++ code for the efficient solution of those equations. It combines the advantages of high-level simulations, namely fast and low-error development, with the speed, portability and scalability of hand-written code. XMDS2 is a complete redesign of the XMDS package, and features support for a much wider problem space while also producing faster code. Program summaryProgram title: XMDS2 Catalogue identifier: AENK_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AENK_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License, version 2 No. of lines in distributed program, including test data, etc.: 872490 No. of bytes in distributed program, including test data, etc.: 45522370 Distribution format: tar.gz Programming language: Python and C++. Computer: Any computer with a Unix-like system, a C++ compiler and Python. Operating system: Any Unix-like system; developed under Mac OS X and GNU/Linux. RAM: Problem dependent (roughly 50 bytes per grid point) Classification: 4.3, 6.5. External routines: The external libraries required are problem-dependent. Uses FFTW3 Fourier transforms (used only for FFT-based spectral methods), dSFMT random number generation (used only for stochastic problems), MPI message-passing interface (used only for distributed problems), HDF5, GNU Scientific Library (used only for Bessel-based spectral methods) and a BLAS implementation (used only for non-FFT-based spectral methods). Nature of problem: General coupled initial-value stochastic partial differential equations. Solution method: Spectral method with method-of-lines integration Running time: Determined by the size of the problem
PAL: Positional Astronomy Library
NASA Astrophysics Data System (ADS)
Jenness, T.; Berry, D. S.
2016-06-01
The PAL library is a partial re-implementation of Pat Wallace's popular SLALIB library, written in C under the GNU GPL license and layered on top of the IAU's SOFA library (or the BSD-licensed ERFA) where appropriate. PAL attempts to stick to the SLA C API where possible.
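To give a feel for the kind of routine PAL re-implements, the self-contained C++ sketch below computes an angular separation on the sky, the functionality SLALIB exposes as slaDsep; PAL is expected to provide the same call under a pal prefix, but that name is not verified here and the code below does not depend on the library.

    #include <cmath>
    #include <cstdio>

    // Angular separation between two (RA, Dec) directions, all angles in radians.
    // Independent sketch of slaDsep-style functionality, not PAL source code.
    double angular_separation(double ra1, double dec1, double ra2, double dec2)
    {
        // Vincenty formula: stable for both very small and near-antipodal separations.
        const double dra = ra2 - ra1;
        const double y = std::hypot(std::cos(dec2) * std::sin(dra),
                                    std::cos(dec1) * std::sin(dec2) -
                                    std::sin(dec1) * std::cos(dec2) * std::cos(dra));
        const double x = std::sin(dec1) * std::sin(dec2) +
                         std::cos(dec1) * std::cos(dec2) * std::cos(dra);
        return std::atan2(y, x);
    }

    int main()
    {
        const double deg = M_PI / 180.0;
        double sep = angular_separation(10.0 * deg, 20.0 * deg, 11.0 * deg, 20.5 * deg);
        std::printf("separation = %.6f deg\n", sep / deg);
        return 0;
    }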
NEBULAR: Spectrum synthesis for mixed hydrogen-helium gas in ionization equilibrium
NASA Astrophysics Data System (ADS)
Schirmer, Mischa
2016-08-01
NEBULAR synthesizes the spectrum of a mixed hydrogen-helium gas in collisional ionization equilibrium. It is not a spectral fitting code, but it can be used to resample a model spectrum onto the wavelength grid of a real observation. It supports a wide range of temperatures and densities. NEBULAR includes free-free, free-bound, two-photon and line emission from HI, HeI and HeII. The code will either return the composite model spectrum or, if desired, the unrescaled atomic emission coefficients. It is written in C++ and depends on the GNU Scientific Library (GSL).
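The resampling task mentioned above (mapping a model spectrum onto the wavelength grid of an observation) is the kind of step that can be delegated to the GNU Scientific Library. The snippet below is not NEBULAR code, only a minimal sketch of such a resampling with GSL cubic-spline interpolation and toy numbers.

    #include <gsl/gsl_errno.h>
    #include <gsl/gsl_spline.h>
    #include <cstdio>
    #include <vector>

    int main()
    {
        // Coarse "model" wavelength grid (Angstrom) and fluxes (toy values).
        std::vector<double> wl   = {4000, 4500, 5000, 5500, 6000, 6500, 7000};
        std::vector<double> flux = {1.0, 1.2, 0.9, 1.4, 1.1, 0.8, 0.95};

        gsl_interp_accel *acc = gsl_interp_accel_alloc();
        gsl_spline *spline = gsl_spline_alloc(gsl_interp_cspline, wl.size());
        gsl_spline_init(spline, wl.data(), flux.data(), wl.size());

        // Resample onto a finer "observed" grid.
        for (double w = 4000.0; w <= 7000.0; w += 250.0)
            std::printf("%8.1f  %.4f\n", w, gsl_spline_eval(spline, w, acc));

        gsl_spline_free(spline);
        gsl_interp_accel_free(acc);
        return 0;
    }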
Marchetti, Luca; Manca, Vincenzo
2015-04-15
MpTheory Java library is an open-source project collecting a set of objects and algorithms for modeling observed dynamics by means of the Metabolic P (MP) theory, a mathematical theory introduced in 2004 for modeling biological dynamics. By means of the library, it is possible to model biological systems both in continuous and in discrete time. Moreover, the library comprises a set of regression algorithms for inferring MP models starting from time series of observations. To enhance the modeling experience, besides pure Java usage, the library can be used directly within the most popular computing environments, such as MATLAB, GNU Octave, Mathematica and R. The library is open-source and licensed under the GNU Lesser General Public License (LGPL) Version 3.0. Source code, binaries and complete documentation are available at http://mptheory.scienze.univr.it. Contact: luca.marchetti@univr.it, marchetti@cosbi.eu. Supplementary data are available at Bioinformatics online.
Resolution of singularities for multi-loop integrals
NASA Astrophysics Data System (ADS)
Bogner, Christian; Weinzierl, Stefan
2008-04-01
We report on a program for the numerical evaluation of divergent multi-loop integrals. The program is based on iterated sector decomposition. We improve the original algorithm of Binoth and Heinrich such that the program is guaranteed to terminate. The program can be used to compute numerically the Laurent expansion of divergent multi-loop integrals regulated by dimensional regularisation. The symbolic and the numerical steps of the algorithm are combined into one program. Program summary Program title: sector_decomposition Catalogue identifier: AEAG_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAG_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 47 506 No. of bytes in distributed program, including test data, etc.: 328 485 Distribution format: tar.gz Programming language: C++ Computer: all Operating system: Unix RAM: Depending on the complexity of the problem Classification: 4.4 External routines: GiNaC, available from http://www.ginac.de; GNU Scientific Library, available from http://www.gnu.org/software/gsl Nature of problem: Computation of divergent multi-loop integrals. Solution method: Sector decomposition. Restrictions: Only limited by the available memory and CPU time. Running time: Depending on the complexity of the problem.
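After the symbolic sector decomposition and subtraction, the coefficients of the Laurent expansion in the dimensional regulator reduce to finite parameter integrals over the unit hypercube, which are then evaluated numerically (the GNU Scientific Library is listed among the external routines). The toy example below only sketches that last numerical step - a VEGAS Monte Carlo integration of a finite integrand over the unit cube - and is not code from the package itself.

    #include <gsl/gsl_monte_vegas.h>
    #include <gsl/gsl_rng.h>
    #include <cmath>
    #include <cstdio>

    // Toy finite integrand on the unit cube, standing in for a subtracted
    // sector integrand; the real program constructs these symbolically with GiNaC.
    double integrand(double *x, size_t /*dim*/, void * /*params*/)
    {
        return 1.0 / (1.0 + x[0] * x[1] + x[2]);
    }

    int main()
    {
        const size_t dim = 3;
        double xl[dim] = {0.0, 0.0, 0.0};
        double xu[dim] = {1.0, 1.0, 1.0};

        gsl_monte_function f = {&integrand, dim, nullptr};
        gsl_rng *rng = gsl_rng_alloc(gsl_rng_mt19937);
        gsl_monte_vegas_state *state = gsl_monte_vegas_alloc(dim);

        double result = 0.0, error = 0.0;
        gsl_monte_vegas_integrate(&f, xl, xu, dim, 10000, rng, state, &result, &error);
        // Refine until the per-iteration estimates are mutually consistent (chi^2/dof ~ 1).
        do {
            gsl_monte_vegas_integrate(&f, xl, xu, dim, 100000, rng, state, &result, &error);
        } while (std::fabs(gsl_monte_vegas_chisq(state) - 1.0) > 0.5);

        std::printf("I = %.6f +/- %.6f\n", result, error);

        gsl_monte_vegas_free(state);
        gsl_rng_free(rng);
        return 0;
    }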
2001-09-01
Readily Available Linux has been copyrighted under the terms of the GNU General Public License (GPL). This is a license written by the Free...GNOME and KDE. d. Portability Linux is highly compatible with many common operating systems. For...using suitable libraries, Linux is able to run programs written for other operating systems. [Ref. 8] The GNU Project is coordinated by the
Moessfit. A free Mössbauer fitting program
NASA Astrophysics Data System (ADS)
Kamusella, Sirko; Klauss, Hans-Henning
2016-12-01
A free data analysis program for Mössbauer spectroscopy was developed to address commonly faced problems such as the simultaneous fitting of multiple data sets, the Maximum Entropy Method and proper error estimation. The program is written in C++ using the Qt application framework and the GNU Scientific Library. Moessfit makes use of multithreading to exploit the multi-core CPU capacity of modern PCs. The whole fit is specified in a text input file, which simplifies the workflow and gives beginners an easy start in Mössbauer data analysis. At the same time, the possibility to define arbitrary parameter dependencies and distributions as well as relaxation spectra makes Moessfit interesting for advanced users as well.
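To give a flavour of the underlying fitting problem, here is a hedged, stand-alone sketch (not Moessfit code): a single Lorentzian absorption line fitted to a toy velocity spectrum by minimising chi-square with the GNU Scientific Library's Nelder-Mead simplex minimiser. The real program adds simultaneous data sets, parameter dependencies, distributions, relaxation spectra and error estimation.

    #include <gsl/gsl_errno.h>
    #include <gsl/gsl_multimin.h>
    #include <gsl/gsl_vector.h>
    #include <cmath>
    #include <cstdio>
    #include <vector>

    // One data set: velocity (mm/s), counts, and count errors.
    struct Spectrum { std::vector<double> v, y, sigma; };

    // chi^2 for a single Lorentzian dip below a flat baseline.
    double chi2(const gsl_vector *p, void *data)
    {
        const Spectrum *s = static_cast<const Spectrum *>(data);
        const double baseline = gsl_vector_get(p, 0);
        const double area     = gsl_vector_get(p, 1);
        const double v0       = gsl_vector_get(p, 2);
        const double gamma    = gsl_vector_get(p, 3);   // full width at half maximum
        double sum = 0.0;
        for (std::size_t i = 0; i < s->v.size(); ++i) {
            const double L = (gamma / (2.0 * M_PI)) /
                             ((s->v[i] - v0) * (s->v[i] - v0) + 0.25 * gamma * gamma);
            const double r = (s->y[i] - (baseline - area * L)) / s->sigma[i];
            sum += r * r;
        }
        return sum;
    }

    int main()
    {
        // Tiny synthetic spectrum (in a real fit this comes from the input file).
        Spectrum s;
        for (int i = -20; i <= 20; ++i) {
            const double v = 0.1 * i;
            s.v.push_back(v);
            s.y.push_back(1000.0 - 150.0 * (0.3 / (2.0 * M_PI)) / (v * v + 0.25 * 0.3 * 0.3));
            s.sigma.push_back(10.0);
        }

        gsl_multimin_function f = {&chi2, 4, &s};
        gsl_vector *x = gsl_vector_alloc(4);             // start values
        gsl_vector_set(x, 0, 950.0);  gsl_vector_set(x, 1, 100.0);
        gsl_vector_set(x, 2, 0.05);   gsl_vector_set(x, 3, 0.5);
        gsl_vector *step = gsl_vector_alloc(4);
        gsl_vector_set_all(step, 0.1);

        gsl_multimin_fminimizer *m =
            gsl_multimin_fminimizer_alloc(gsl_multimin_fminimizer_nmsimplex2, 4);
        gsl_multimin_fminimizer_set(m, &f, x, step);

        int status = GSL_CONTINUE;
        for (int iter = 0; iter < 500 && status == GSL_CONTINUE; ++iter) {
            if (gsl_multimin_fminimizer_iterate(m)) break;
            status = gsl_multimin_test_size(gsl_multimin_fminimizer_size(m), 1e-6);
        }
        std::printf("chi^2 = %.2f at v0 = %.4f mm/s, FWHM = %.4f mm/s\n",
                    gsl_multimin_fminimizer_minimum(m),
                    gsl_vector_get(gsl_multimin_fminimizer_x(m), 2),
                    gsl_vector_get(gsl_multimin_fminimizer_x(m), 3));

        gsl_multimin_fminimizer_free(m);
        gsl_vector_free(x);
        gsl_vector_free(step);
        return 0;
    }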
cit: hypothesis testing software for mediation analysis in genomic applications.
Millstein, Joshua; Chen, Gary K; Breton, Carrie V
2016-08-01
The challenges of successfully applying causal inference methods include: (i) satisfying underlying assumptions, (ii) limitations in the data/models accommodated by the software and (iii) low power of common multiple testing approaches. The causal inference test (CIT) is based on hypothesis testing rather than estimation, allowing the testable assumptions to be evaluated in the determination of statistical significance. A user-friendly software package provides P-values and, optionally, permutation-based FDR estimates (q-values) for potential mediators. It can handle single and multiple binary and continuous instrumental variables, binary or continuous outcome variables and adjustment covariates. Also, the permutation-based FDR option provides a non-parametric implementation. Simulation studies demonstrate the validity of the cit package and show a substantial advantage of permutation-based FDR over other common multiple testing strategies. The cit open-source R package is freely available from the CRAN website (https://cran.r-project.org/web/packages/cit/index.html) with embedded C++ code that utilizes the GNU Scientific Library, also freely available (http://www.gnu.org/software/gsl/). Contact: joshua.millstein@usc.edu. Supplementary data are available at Bioinformatics online.
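As a sketch of the permutation machinery behind such FDR estimates (not the cit package's code), the C++ fragment below uses the GNU Scientific Library to shuffle group labels and build a permutation null distribution for a simple difference-of-means statistic.

    #include <gsl/gsl_randist.h>
    #include <gsl/gsl_rng.h>
    #include <cmath>
    #include <cstdio>
    #include <vector>

    // Difference of group means as a toy test statistic.
    double stat(const std::vector<double>& y, const std::vector<int>& g)
    {
        double s1 = 0.0, s0 = 0.0;
        int n1 = 0, n0 = 0;
        for (std::size_t i = 0; i < y.size(); ++i) {
            if (g[i]) { s1 += y[i]; ++n1; } else { s0 += y[i]; ++n0; }
        }
        return s1 / n1 - s0 / n0;
    }

    int main()
    {
        std::vector<double> y = {2.1, 1.9, 2.4, 3.5, 3.9, 3.2, 2.2, 3.8};
        std::vector<int>    g = {0, 0, 0, 1, 1, 1, 0, 1};

        gsl_rng *rng = gsl_rng_alloc(gsl_rng_mt19937);
        const double observed = std::fabs(stat(y, g));

        const int nperm = 10000;
        int exceed = 0;
        std::vector<int> gp = g;
        for (int b = 0; b < nperm; ++b) {
            // Permute group labels; under the null the labelling is exchangeable.
            gsl_ran_shuffle(rng, gp.data(), gp.size(), sizeof(int));
            if (std::fabs(stat(y, gp)) >= observed) ++exceed;
        }
        std::printf("permutation p-value = %.4f\n", (exceed + 1.0) / (nperm + 1.0));

        gsl_rng_free(rng);
        return 0;
    }

In the package, per-mediator p-values obtained this way feed a q-value (FDR) calculation across many candidate mediators.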
Streamlining the Process of Acquiring Secure Open Architecture Software Systems
2013-10-08
Microsoft.NET, Enterprise Java Beans, GNU Lesser General Public License (LGPL) libraries, and data communication protocols like the Hypertext Transfer...NetBeans development environments), customer relationship management (SugarCRM), database management systems (PostgreSQL, MySQL ), operating
NASA Astrophysics Data System (ADS)
Powell, Keith B.; Vaitheeswaran, Vidhya
2010-07-01
The MMT observatory has recently implemented and tested an optimal wavefront controller for the NGS adaptive optics system. Open-loop atmospheric data collected at the telescope are used as the input to a MATLAB-based analytical model. The model uses nonlinear constrained minimization to determine controller gains and optimize the system performance. The real-time controller performing the adaptive optics closed-loop operation is implemented on a dedicated high-performance PC-based quad-core server. The controller algorithm is written in C and uses the GNU Scientific Library for linear algebra. Tests at the MMT confirmed that the optimal controller significantly reduced the residual RMS wavefront error compared with the previous controller. Significant reductions in image FWHM and increased peak intensities were obtained in J, H and K bands. The optimal PID controller is now operating as the baseline wavefront controller for the MMT NGS-AO system.
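The real-time computation described here is essentially a matrix-vector wavefront reconstruction followed by a gain update. The snippet below is a hedged, stand-alone C++ sketch of that pattern using the GNU Scientific Library's BLAS interface; it is not the MMT controller code, and the simple integrator gain stands in for the optimized gains discussed above.

    #include <gsl/gsl_blas.h>
    #include <gsl/gsl_matrix.h>
    #include <gsl/gsl_vector.h>
    #include <cstdio>

    int main()
    {
        const std::size_t n_slopes = 6, n_modes = 3;

        // Reconstructor matrix R (n_modes x n_slopes), slope vector s, command vector.
        gsl_matrix *R   = gsl_matrix_calloc(n_modes, n_slopes);
        gsl_vector *s   = gsl_vector_alloc(n_slopes);
        gsl_vector *cmd = gsl_vector_calloc(n_modes);    // accumulated DM commands

        for (std::size_t i = 0; i < n_modes; ++i)
            gsl_matrix_set(R, i, i, 1.0);                // toy reconstructor
        for (std::size_t j = 0; j < n_slopes; ++j)
            gsl_vector_set(s, j, 0.1 * (j + 1));         // toy wavefront slopes

        // One loop iteration: cmd <- cmd - gain * R * s (simple integrator form).
        const double gain = 0.4;
        gsl_blas_dgemv(CblasNoTrans, -gain, R, s, 1.0, cmd);

        for (std::size_t i = 0; i < n_modes; ++i)
            std::printf("mode %zu command: %.4f\n", i, gsl_vector_get(cmd, i));

        gsl_matrix_free(R);
        gsl_vector_free(s);
        gsl_vector_free(cmd);
        return 0;
    }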
libdrdc: software standards library
NASA Astrophysics Data System (ADS)
Erickson, David; Peng, Tie
2008-04-01
This paper presents the libdrdc software standards library, including internal nomenclature, definitions, units of measure, coordinate reference frames, and representations for use in autonomous systems research. The library is a configurable, portable, C-function-wrapped C++/Object-Oriented C library developed to be independent of software middleware, system architecture, processor, or operating system. It is designed to use the Automatically Tuned Linear Algebra Software (ATLAS) and Basic Linear Algebra Subprograms (BLAS) and to port to firmware and software. The library goal is to unify data collection and representation for various microcontrollers and Central Processing Unit (CPU) cores and to provide a common Application Binary Interface (ABI) for research projects at all scales. The library supports multi-platform development and currently works on Windows, Unix, GNU/Linux, and the Real-Time Executive for Multiprocessor Systems (RTEMS). This library is made available under the LGPL version 2.1 license.
PTools: an opensource molecular docking library
Saladin, Adrien; Fiorucci, Sébastien; Poulain, Pierre; Prévost, Chantal; Zacharias, Martin
2009-01-01
Background Macromolecular docking is a challenging field of bioinformatics. Developing new algorithms is a slow process generally involving routine tasks that should be found in a robust library and not programmed from scratch for every new software application. Results We present an object-oriented Python/C++ library to help the development of new docking methods. This library contains low-level routines like PDB-format manipulation functions as well as high-level tools for docking and analyzing results. We also illustrate the ease of use of this library with the detailed implementation of a 3-body docking procedure. Conclusion The PTools library can handle molecules at coarse-grained or atomic resolution and allows users to rapidly develop new software. The library is already in use for protein-protein and protein-DNA docking with the ATTRACT program and for simulation analysis. This library is freely available under the GNU GPL license, together with detailed documentation. PMID:19409097
PTools: an opensource molecular docking library.
Saladin, Adrien; Fiorucci, Sébastien; Poulain, Pierre; Prévost, Chantal; Zacharias, Martin
2009-05-01
Macromolecular docking is a challenging field of bioinformatics. Developing new algorithms is a slow process generally involving routine tasks that should be found in a robust library and not programmed from scratch for every new software application. We present an object-oriented Python/C++ library to help the development of new docking methods. This library contains low-level routines like PDB-format manipulation functions as well as high-level tools for docking and analyzing results. We also illustrate the ease of use of this library with the detailed implementation of a 3-body docking procedure. The PTools library can handle molecules at coarse-grained or atomic resolution and allows users to rapidly develop new software. The library is already in use for protein-protein and protein-DNA docking with the ATTRACT program and for simulation analysis. This library is freely available under the GNU GPL license, together with detailed documentation.
ERIC Educational Resources Information Center
Coombs, Karen
2009-01-01
Drupal is a PHP- and MySQL-based system for managing web sites, developed in 2000 and released in 2001 under the open GNU General Public License (GPL). It is modular, extensible, and scalable. In recent years, Drupal has gained a huge following within libraries as a content management system (CMS). Probably the best-known extension of Drupal in the…
DSPSR: Digital Signal Processing Software for Pulsar Astronomy
NASA Astrophysics Data System (ADS)
van Straten, W.; Bailes, M.
2010-10-01
DSPSR, written primarily in C++, is an open-source, object-oriented, digital signal processing software library and application suite for use in radio pulsar astronomy. The library implements an extensive range of modular algorithms for use in coherent dedispersion, filterbank formation, pulse folding, and other tasks. The software is installed and compiled using the standard GNU configure and make system, and is able to read astronomical data in 18 different file formats, including FITS, S2, CPSR, CPSR2, PuMa, PuMa2, WAPP, ASP, and Mark5.
TRIQS: A toolbox for research on interacting quantum systems
NASA Astrophysics Data System (ADS)
Parcollet, Olivier; Ferrero, Michel; Ayral, Thomas; Hafermann, Hartmut; Krivenko, Igor; Messio, Laura; Seth, Priyanka
2015-11-01
We present the TRIQS library, a Toolbox for Research on Interacting Quantum Systems. It is an open-source, computational physics library providing a framework for the quick development of applications in the field of many-body quantum physics, and in particular, strongly-correlated electronic systems. It supplies components to develop codes in a modern, concise and efficient way: e.g. Green's function containers, a generic Monte Carlo class, and simple interfaces to HDF5. TRIQS is a C++/Python library that can be used from either language. It is distributed under the GNU General Public License (GPLv3). State-of-the-art applications based on the library, such as modern quantum many-body solvers and interfaces between density-functional-theory codes and dynamical mean-field theory (DMFT) codes are distributed along with it.
Libstatmech and applications to astrophysics
NASA Astrophysics Data System (ADS)
Yu, Tianhong
In this work an introduction to Libstatmech is presented and applications, especially to astrophysics, are discussed. Libstatmech is a C toolkit for computing the statistical mechanics of fermions and bosons, written on top of libxml and the GNU Scientific Library (GSL). Calculations of the Thomas-Fermi screening model and of a Bose-Einstein condensate based on libstatmech demonstrate the expected results. As an astrophysics application, a simple Type Ia supernova model is established to run a network calculation with weak reactions, in which libstatmech is used to compute the electron chemical potential and allows the weak reverse rates to be calculated from detailed balance. Starting with pure ¹²C and T₉ = 1.8, we find that at high initial density (ρ ≈ 9×10⁹ g/cm³) there are relatively large abundances of neutron-rich iron-group isotopes (e.g. ⁶⁶Ni, ⁵⁰Ti, ⁴⁸Ca) produced during the explosion, and Y_e can drop to ~0.4, which indicates that rare, high-density Type Ia supernovae may help to explain the ⁴⁸Ca and ⁵⁰Ti effect in FUN CAIs.
System for Automated Geoscientific Analyses (SAGA) v. 2.1.4
NASA Astrophysics Data System (ADS)
Conrad, O.; Bechtel, B.; Bock, M.; Dietrich, H.; Fischer, E.; Gerlitz, L.; Wehberg, J.; Wichmann, V.; Böhner, J.
2015-02-01
The System for Automated Geoscientific Analyses (SAGA) is an open-source Geographic Information System (GIS), mainly licensed under the GNU General Public License. Since its first release in 2004, SAGA has rapidly developed from a specialized tool for digital terrain analysis to a comprehensive and globally established GIS platform for scientific analysis and modeling. SAGA is coded in C++ in an object oriented design and runs under several operating systems including Windows and Linux. Key functional features of the modular organized software architecture comprise an application programming interface for the development and implementation of new geoscientific methods, an easily approachable graphical user interface with many visualization options, a command line interpreter, and interfaces to scripting and low level programming languages like R and Python. The current version 2.1.4 offers more than 700 tools, which are implemented in dynamically loadable libraries or shared objects and represent the broad scopes of SAGA in numerous fields of geoscientific endeavor and beyond. In this paper, we inform about the system's architecture, functionality, and its current state of development and implementation. Further, we highlight the wide spectrum of scientific applications of SAGA in a review of published studies with special emphasis on the core application areas digital terrain analysis, geomorphology, soil science, climatology and meteorology, as well as remote sensing.
System for Automated Geoscientific Analyses (SAGA) v. 2.1.4
NASA Astrophysics Data System (ADS)
Conrad, O.; Bechtel, B.; Bock, M.; Dietrich, H.; Fischer, E.; Gerlitz, L.; Wehberg, J.; Wichmann, V.; Böhner, J.
2015-07-01
The System for Automated Geoscientific Analyses (SAGA) is an open source geographic information system (GIS), mainly licensed under the GNU General Public License. Since its first release in 2004, SAGA has rapidly developed from a specialized tool for digital terrain analysis to a comprehensive and globally established GIS platform for scientific analysis and modeling. SAGA is coded in C++ in an object oriented design and runs under several operating systems including Windows and Linux. Key functional features of the modular software architecture comprise an application programming interface for the development and implementation of new geoscientific methods, a user friendly graphical user interface with many visualization options, a command line interpreter, and interfaces to interpreted languages like R and Python. The current version 2.1.4 offers more than 600 tools, which are implemented in dynamically loadable libraries or shared objects and represent the broad scopes of SAGA in numerous fields of geoscientific endeavor and beyond. In this paper, we inform about the system's architecture, functionality, and its current state of development and implementation. Furthermore, we highlight the wide spectrum of scientific applications of SAGA in a review of published studies, with special emphasis on the core application areas digital terrain analysis, geomorphology, soil science, climatology and meteorology, as well as remote sensing.
Libsharp - spherical harmonic transforms revisited
NASA Astrophysics Data System (ADS)
Reinecke, M.; Seljebotn, D. S.
2013-06-01
We present libsharp, a code library for spherical harmonic transforms (SHTs), which evolved from the libpsht library and addresses several of its shortcomings, such as adding MPI support for distributed memory systems and SHTs of fields with arbitrary spin, but also supporting new developments in CPU instruction sets like the Advanced Vector Extensions (AVX) or fused multiply-accumulate (FMA) instructions. The library is implemented in portable C99 and provides an interface that can be easily accessed from other programming languages such as C++, Fortran, Python, etc. Generally, libsharp's performance is at least on par with that of its predecessor; however, significant improvements were made to the algorithms for scalar SHTs, which are roughly twice as fast when using the same CPU capabilities. The library is available at
A basic analysis toolkit for biological sequences
Giancarlo, Raffaele; Siragusa, Alessandro; Siragusa, Enrico; Utro, Filippo
2007-01-01
This paper presents a software library, nicknamed BATS, for some basic sequence analysis tasks: local alignments, via approximate string matching, and global alignments, via longest common subsequence and alignments with affine and concave gap cost functions. Moreover, it also supports filtering operations to select strings from a set and establish their statistical significance, via z-score computation. None of the algorithms is new, but although they are generally regarded as fundamental for sequence analysis, they have not previously been implemented in a single and consistent software package, as we do here. Therefore, our main contribution is to fill this gap between algorithmic theory and practice by providing an extensible and easy-to-use software library that includes algorithms for the mentioned string matching and alignment problems. The library consists of C/C++ library functions as well as Perl library functions. It can be interfaced with Bioperl and can also be used as a stand-alone system with a GUI. The software is freely available under the GNU GPL. PMID:17877802
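As one example of the alignment algorithms collected in BATS, a global comparison via longest common subsequence can be computed with the classic dynamic programme. The short C++ sketch below only illustrates that idea; it is not the BATS implementation, which also provides affine/concave gap costs, filtering and z-scores.

    #include <algorithm>
    #include <cstdio>
    #include <string>
    #include <vector>

    // Length of the longest common subsequence of a and b, O(|a|*|b|) time and space.
    std::size_t lcs_length(const std::string& a, const std::string& b)
    {
        std::vector<std::vector<std::size_t>> d(a.size() + 1,
                                                std::vector<std::size_t>(b.size() + 1, 0));
        for (std::size_t i = 1; i <= a.size(); ++i)
            for (std::size_t j = 1; j <= b.size(); ++j)
                d[i][j] = (a[i - 1] == b[j - 1])
                              ? d[i - 1][j - 1] + 1
                              : std::max(d[i - 1][j], d[i][j - 1]);
        return d[a.size()][b.size()];
    }

    int main()
    {
        std::printf("LCS length = %zu\n", lcs_length("ACCGGTCG", "ACGGATCG"));
        return 0;
    }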
A new version of the CADNA library for estimating round-off error propagation in Fortran programs
NASA Astrophysics Data System (ADS)
Jézéquel, Fabienne; Chesneaux, Jean-Marie; Lamotte, Jean-Luc
2010-11-01
The CADNA library enables one to estimate, using a probabilistic approach, round-off error propagation in any simulation program. CADNA provides new numerical types, the so-called stochastic types, on which round-off errors can be estimated. Furthermore CADNA contains the definition of arithmetic and relational operators which are overloaded for stochastic variables and the definition of mathematical functions which can be used with stochastic arguments. On 64-bit processors, depending on the rounding mode chosen, the mathematical library associated with the GNU Fortran compiler may provide incorrect results or generate severe bugs. Therefore the CADNA library has been improved to enable the numerical validation of programs on 64-bit processors. New version program summaryProgram title: CADNA Catalogue identifier: AEAT_v1_1 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEAT_v1_1.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 28 488 No. of bytes in distributed program, including test data, etc.: 463 778 Distribution format: tar.gz Programming language: Fortran NOTE: A C++ version of this program is available in the Library as AEGQ_v1_0 Computer: PC running LINUX with an i686 or an ia64 processor, UNIX workstations including SUN, IBM Operating system: LINUX, UNIX Classification: 6.5 Catalogue identifier of previous version: AEAT_v1_0 Journal reference of previous version: Comput. Phys. Commun. 178 (2008) 933 Does the new version supersede the previous version?: Yes Nature of problem: A simulation program which uses floating-point arithmetic generates round-off errors, due to the rounding performed at each assignment and at each arithmetic operation. Round-off error propagation may invalidate the result of a program. The CADNA library enables one to estimate round-off error propagation in any simulation program and to detect all numerical instabilities that may occur at run time. Solution method: The CADNA library [1-3] implements Discrete Stochastic Arithmetic [4,5] which is based on a probabilistic model of round-off errors. The program is run several times with a random rounding mode generating different results each time. From this set of results, CADNA estimates the number of exact significant digits in the result that would have been computed with standard floating-point arithmetic. Reasons for new version: On 64-bit processors, the mathematical library associated with the GNU Fortran compiler may provide incorrect results or generate severe bugs with rounding towards -∞ and +∞, which the random rounding mode is based on. Therefore a particular definition of mathematical functions for stochastic arguments has been included in the CADNA library to enable its use with the GNU Fortran compiler on 64-bit processors. Summary of revisions: If CADNA is used on a 64-bit processor with the GNU Fortran compiler, mathematical functions are computed with rounding to the nearest, otherwise they are computed with the random rounding mode. It must be pointed out that the knowledge of the accuracy of the stochastic argument of a mathematical function is never lost. Restrictions: CADNA requires a Fortran 90 (or newer) compiler. In the program to be linked with the CADNA library, round-off errors on complex variables cannot be estimated. 
Furthermore array functions such as product or sum must not be used. Only the arithmetic operators and the abs, min, max and sqrt functions can be used for arrays. Additional comments: In the library archive, users are advised to read the INSTALL file first. The doc directory contains a user guide named ug.cadna.pdf which shows how to control the numerical accuracy of a program using CADNA, provides installation instructions and describes test runs. The source code, which is located in the src directory, consists of one assembly language file (cadna_rounding.s) and eighteen Fortran language files. cadna_rounding.s is a symbolic link to the assembly file corresponding to the processor and the Fortran compiler used. This assembly file contains routines which are frequently called in the CADNA Fortran files to change the rounding mode. The Fortran language files contain the definition of the stochastic types on which the control of accuracy can be performed, CADNA specific functions (for instance to enable or disable the detection of numerical instabilities), the definition of arithmetic and relational operators which are overloaded for stochastic variables and the definition of mathematical functions which can be used with stochastic arguments. The examples directory contains seven test runs which illustrate the use of the CADNA library and the benefits of Discrete Stochastic Arithmetic. Running time: The version of a code which uses CADNA runs at least three times slower than its floating-point version. This cost depends on the computer architecture and can be higher if the detection of numerical instabilities is enabled. In this case, the cost may be related to the number of instabilities detected.
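The random-rounding idea behind Discrete Stochastic Arithmetic can be illustrated without CADNA itself. The C++ sketch below (an illustration only, not the CADNA implementation, which is in Fortran and far more systematic) evaluates the same accumulation under rounding towards -∞ and +∞ and compares the two results to estimate how many significant digits survive round-off.

    #include <cfenv>
    #include <cmath>
    #include <cstdio>

    // Sum 0.1 (not exactly representable) n times; the result depends on the
    // current IEEE rounding mode, which is what random rounding exploits.
    // Strictly conforming code would also require FENV_ACCESS to be enabled.
    double accumulate_tenths(int n)
    {
        volatile double s = 0.0;
        for (int i = 0; i < n; ++i) s += 0.1;
        return s;
    }

    int main()
    {
        const int n = 10000000;                    // exact sum would be 1e6

        std::fesetround(FE_DOWNWARD);
        const double lo = accumulate_tenths(n);
        std::fesetround(FE_UPWARD);
        const double hi = accumulate_tenths(n);
        std::fesetround(FE_TONEAREST);

        const double spread = hi - lo;
        std::printf("rounded down: %.10f\nrounded up:   %.10f\n", lo, hi);
        std::printf("estimated exact significant digits: %.1f\n",
                    std::log10(std::fabs(0.5 * (hi + lo)) / spread));
        return 0;
    }

CADNA automates this comparison by overloading the stochastic types so that every operation is performed several times with randomly chosen rounding directions.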
blend4php: a PHP API for galaxy
Wytko, Connor; Soto, Brian; Ficklin, Stephen P.
2017-01-01
Galaxy is a popular framework for the execution of complex analytical pipelines, typically for large data sets, and is commonly used for (but not limited to) genomic, genetic and related biological analyses. It provides a web front-end and integrates with high-performance computing resources. Here we report the development of the blend4php library, which wraps Galaxy's RESTful API into a PHP-based library. PHP-based web applications can use blend4php to automate the execution, monitoring and management of a remote Galaxy server, including its users, workflows, jobs and more. The blend4php library was specifically developed for the integration of Galaxy with Tripal, the open-source toolkit for the creation of online genomic and genetic web sites. However, it was designed as an independent library for use by any application, and is freely available under version 3 of the GNU Lesser General Public License (LGPL v3.0) at https://github.com/galaxyproject/blend4php. Database URL: https://github.com/galaxyproject/blend4php PMID:28077564
S2PLOT: Three-dimensional (3D) Plotting Library
NASA Astrophysics Data System (ADS)
Barnes, D. G.; Fluke, C. J.; Bourke, P. D.; Parry, O. T.
2011-03-01
We present a new, three-dimensional (3D) plotting library with advanced features, and support for standard and enhanced display devices. The library - S2PLOT - is written in C and can be used by C, C++ and FORTRAN programs on GNU/Linux and Apple/OSX systems. S2PLOT draws objects in a 3D (x,y,z) Cartesian space and the user interactively controls how this space is rendered at run time. With a PGPLOT inspired interface, S2PLOT provides astronomers with elegant techniques for displaying and exploring 3D data sets directly from their program code, and the potential to use stereoscopic and dome display devices. The S2PLOT architecture supports dynamic geometry and can be used to plot time-evolving data sets, such as might be produced by simulation codes. In this paper, we introduce S2PLOT to the astronomical community, describe its potential applications, and present some example uses of the library.
An Advanced, Three-Dimensional Plotting Library for Astronomy
NASA Astrophysics Data System (ADS)
Barnes, David G.; Fluke, Christopher J.; Bourke, Paul D.; Parry, Owen T.
2006-07-01
We present a new, three-dimensional (3D) plotting library with advanced features, and support for standard and enhanced display devices. The library - S2PLOT - is written in C and can be used by C, C++, and FORTRAN programs on GNU/Linux and Apple/OSX systems. S2PLOT draws objects in a 3D (x,y,z) Cartesian space and the user interactively controls how this space is rendered at run time. With a PGPLOT-inspired interface, S2PLOT provides astronomers with elegant techniques for displaying and exploring 3D data sets directly from their program code, and the potential to use stereoscopic and dome display devices. The S2PLOT architecture supports dynamic geometry and can be used to plot time-evolving data sets, such as might be produced by simulation codes. In this paper, we introduce S2PLOT to the astronomical community, describe its potential applications, and present some example uses of the library.
LOOS: an extensible platform for the structural analysis of simulations.
Romo, Tod D; Grossfield, Alan
2009-01-01
We have developed LOOS (Lightweight Object-Oriented Structure-analysis library) as an object-oriented library designed to facilitate the rapid development of tools for the structural analysis of simulations. LOOS supports the native file formats of most common simulation packages including AMBER, CHARMM, CNS, Gromacs, NAMD, Tinker, and X-PLOR. Encapsulation and polymorphism are used to simultaneously provide a stable interface to the programmer and make LOOS easily extensible. A rich atom selection language based on the C expression syntax is included as part of the library. LOOS enables students and casual programmer-scientists to rapidly write their own analytical tools in a compact and expressive manner resembling scripting. LOOS is written in C++ and makes extensive use of the Standard Template Library and Boost, and is freely available under the GNU General Public License (version 3). LOOS has been tested on Linux and MacOS X, but is written to be portable and should work on most Unix-based platforms.
Accelerating numerical solution of stochastic differential equations with CUDA
NASA Astrophysics Data System (ADS)
Januszewski, M.; Kostur, M.
2010-01-01
Numerical integration of stochastic differential equations is commonly used in many branches of science. In this paper we present how to accelerate this kind of numerical calculations with popular NVIDIA Graphics Processing Units using the CUDA programming environment. We address general aspects of numerical programming on stream processors and illustrate them by two examples: the noisy phase dynamics in a Josephson junction and the noisy Kuramoto model. In presented cases the measured speedup can be as high as 675× compared to a typical CPU, which corresponds to several billion integration steps per second. This means that calculations which took weeks can now be completed in less than one hour. This brings stochastic simulation to a completely new level, opening for research a whole new range of problems which can now be solved interactively. Program summaryProgram title: SDE Catalogue identifier: AEFG_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEFG_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Gnu GPL v3 No. of lines in distributed program, including test data, etc.: 978 No. of bytes in distributed program, including test data, etc.: 5905 Distribution format: tar.gz Programming language: CUDA C Computer: any system with a CUDA-compatible GPU Operating system: Linux RAM: 64 MB of GPU memory Classification: 4.3 External routines: The program requires the NVIDIA CUDA Toolkit Version 2.0 or newer and the GNU Scientific Library v1.0 or newer. Optionally gnuplot is recommended for quick visualization of the results. Nature of problem: Direct numerical integration of stochastic differential equations is a computationally intensive problem, due to the necessity of calculating multiple independent realizations of the system. We exploit the inherent parallelism of this problem and perform the calculations on GPUs using the CUDA programming environment. The GPU's ability to execute hundreds of threads simultaneously makes it possible to speed up the computation by over two orders of magnitude, compared to a typical modern CPU. Solution method: The stochastic Runge-Kutta method of the second order is applied to integrate the equation of motion. Ensemble-averaged quantities of interest are obtained through averaging over multiple independent realizations of the system. Unusual features: The numerical solution of the stochastic differential equations in question is performed on a GPU using the CUDA environment. Running time: < 1 minute
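The second-order stochastic Runge-Kutta (Heun) step used by the program is easy to state in plain C++. The sketch below integrates a noisy Josephson-junction-like phase equation, dφ = (i0 − sin φ) dt + √(2D) dW, on the CPU with the GNU Scientific Library's Gaussian generator. It illustrates the scheme only; the distributed code evaluates many such trajectories in parallel in CUDA C.

    #include <gsl/gsl_randist.h>
    #include <gsl/gsl_rng.h>
    #include <cmath>
    #include <cstdio>

    // Drift of the overdamped Josephson phase: dphi/dt = i0 - sin(phi) + noise.
    static double drift(double phi, double i0) { return i0 - std::sin(phi); }

    int main()
    {
        const double i0 = 0.9, D = 0.05;           // bias current, noise intensity
        const double dt = 1.0e-3, T = 100.0;
        gsl_rng *rng = gsl_rng_alloc(gsl_rng_mt19937);

        double phi = 0.0;
        for (double t = 0.0; t < T; t += dt) {
            // Heun (second-order stochastic Runge-Kutta) predictor-corrector step,
            // reusing the same Wiener increment dW in both stages.
            const double dW   = gsl_ran_gaussian(rng, std::sqrt(dt));
            const double pred = phi + drift(phi, i0) * dt + std::sqrt(2.0 * D) * dW;
            phi += 0.5 * (drift(phi, i0) + drift(pred, i0)) * dt + std::sqrt(2.0 * D) * dW;
        }
        // The long-time mean voltage is proportional to the mean phase velocity.
        std::printf("phi(T) = %.4f, <dphi/dt> ~ %.4f\n", phi, phi / T);

        gsl_rng_free(rng);
        return 0;
    }

On a GPU, each thread runs an independent copy of this loop with its own random stream, and the ensemble average is taken over threads.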
2013-06-01
GNU Radio is a software development toolkit that provides signal processing blocks to drive the SDR. GNU Radio has many strong points – it is actively...maintained with a large user base, new capabilities are constantly being added, and compiled C code is fast for many real-time applications such as...programming interface (API) makes learning the architecture a daunting task, even for the experienced software developer. This requirement poses many
A Code Generation Approach for Auto-Vectorization in the Spade Compiler
NASA Astrophysics Data System (ADS)
Wang, Huayong; Andrade, Henrique; Gedik, Buğra; Wu, Kun-Lung
We describe an auto-vectorization approach for the Spade stream processing programming language, comprising two ideas. First, we provide support for vectors as a primitive data type. Second, we provide a C++ library with architecture-specific implementations of a large number of pre-vectorized operations as the means to support language extensions. We evaluate our approach with several stream processing operators, contrasting Spade's auto-vectorization with the native auto-vectorization provided by the GNU gcc and Intel icc compilers.
DOVIS 2.0: An Efficient and Easy to Use Parallel Virtual Screening Tool Based on AutoDock 4.0
2008-09-08
under the GNU General Public License. Background Molecular docking is a computational method that pre- dicts how a ligand interacts with a receptor...Hence, it is an important tool in studying receptor-ligand interactions and plays an essential role in drug design. Particularly, molecular docking has...libraries from OpenBabel and setup a molecular data structure as a C++ object in our program. This makes handling of molecular structures (e.g., atoms
PAL: an object-oriented programming library for molecular evolution and phylogenetics.
Drummond, A; Strimmer, K
2001-07-01
Phylogenetic Analysis Library (PAL) is a collection of Java classes for use in molecular evolution and phylogenetics. PAL provides a modular environment for the rapid construction of both special-purpose and general analysis programs. PAL version 1.1 consists of 145 public classes or interfaces in 13 packages, including classes for models of character evolution, maximum-likelihood estimation, and the coalescent, with a total of more than 27000 lines of code. The PAL project is set up as a collaborative project to facilitate contributions from other researchers. AVAILABILITY: The program is free and is available at http://www.pal-project.org. It requires Java 1.1 or later. PAL is licensed under the GNU General Public License.
Continuous-time quantum Monte Carlo impurity solvers
NASA Astrophysics Data System (ADS)
Gull, Emanuel; Werner, Philipp; Fuchs, Sebastian; Surer, Brigitte; Pruschke, Thomas; Troyer, Matthias
2011-04-01
Continuous-time quantum Monte Carlo impurity solvers are algorithms that sample the partition function of an impurity model using diagrammatic Monte Carlo techniques. The present paper describes codes that implement the interaction expansion algorithm originally developed by Rubtsov, Savkin, and Lichtenstein, as well as the hybridization expansion method developed by Werner, Millis, Troyer, et al. These impurity solvers are part of the ALPS-DMFT application package and are accompanied by an implementation of dynamical mean-field self-consistency equations for (single orbital single site) dynamical mean-field problems with arbitrary densities of states. Program summaryProgram title: dmft Catalogue identifier: AEIL_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEIL_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: ALPS LIBRARY LICENSE version 1.1 No. of lines in distributed program, including test data, etc.: 899 806 No. of bytes in distributed program, including test data, etc.: 32 153 916 Distribution format: tar.gz Programming language: C++ Operating system: The ALPS libraries have been tested on the following platforms and compilers: Linux with GNU Compiler Collection (g++ version 3.1 and higher), and Intel C++ Compiler (icc version 7.0 and higher) MacOS X with GNU Compiler (g++ Apple-version 3.1, 3.3 and 4.0) IBM AIX with Visual Age C++ (xlC version 6.0) and GNU (g++ version 3.1 and higher) compilers Compaq Tru64 UNIX with Compq C++ Compiler (cxx) SGI IRIX with MIPSpro C++ Compiler (CC) HP-UX with HP C++ Compiler (aCC) Windows with Cygwin or coLinux platforms and GNU Compiler Collection (g++ version 3.1 and higher) RAM: 10 MB-1 GB Classification: 7.3 External routines: ALPS [1], BLAS/LAPACK, HDF5 Nature of problem: (See [2].) Quantum impurity models describe an atom or molecule embedded in a host material with which it can exchange electrons. They are basic to nanoscience as representations of quantum dots and molecular conductors and play an increasingly important role in the theory of "correlated electron" materials as auxiliary problems whose solution gives the "dynamical mean field" approximation to the self-energy and local correlation functions. Solution method: Quantum impurity models require a method of solution which provides access to both high and low energy scales and is effective for wide classes of physically realistic models. The continuous-time quantum Monte Carlo algorithms for which we present implementations here meet this challenge. Continuous-time quantum impurity methods are based on partition function expansions of quantum impurity models that are stochastically sampled to all orders using diagrammatic quantum Monte Carlo techniques. For a review of quantum impurity models and their applications and of continuous-time quantum Monte Carlo methods for impurity models we refer the reader to [2]. Additional comments: Use of dmft requires citation of this paper. Use of any ALPS program requires citation of the ALPS [1] paper. Running time: 60 s-8 h per iteration.
Amber Plug-In for Protein Shop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oliva, Ricardo
2004-05-10
The Amber Plug-in for ProteinShop has two main components: an AmberEngine library to compute the protein energy models, and a module to solve the energy minimization problem using an optimization algorithm in the OPT++ library. Together, these components allow the visualization of the protein folding process in ProteinShop. AmberEngine is an object-oriented library to compute molecular energies based on the Amber model. The main class is called ProteinEnergy. Its main interface methods are (1) "init" to initialize internal variables needed to compute the energy, and (2) "eval" to evaluate the total energy given a vector of coordinates. Additional methods allow the user to evaluate the individual components of the energy model (bond, angle, dihedral, non-bonded 1-4, and non-bonded energies) and to obtain the energy of each individual atom. The AmberEngine library source code includes examples and test routines that illustrate the use of the library in stand-alone programs. The energy minimization module uses the AmberEngine library and the nonlinear optimization library OPT++. OPT++ is open source software available under the GNU Lesser General Public License. The minimization module currently makes use of the LBFGS optimization algorithm in OPT++ to perform the energy minimization. Future releases may give the user a choice of other algorithms available in OPT++.
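Based only on the interface described above (a ProteinEnergy class with "init" and "eval" methods), a calling pattern might look like the hedged sketch below. The real class is not reproduced here; the stub merely mirrors the documented interface so the usage compiles, and the exact signatures are assumptions, not the AmberEngine API.

    #include <cstdio>
    #include <numeric>
    #include <vector>

    // Stand-in for the AmberEngine ProteinEnergy class described in the abstract.
    class ProteinEnergy {
    public:
        // (1) "init": initialize internal variables needed to compute the energy.
        void init(double scale) { scale_ = scale; }
        // (2) "eval": total energy for a vector of coordinates. Placeholder formula;
        // the real library evaluates the Amber bond/angle/dihedral/non-bonded terms.
        double eval(const std::vector<double>& coords) const {
            return scale_ * std::accumulate(coords.begin(), coords.end(), 0.0);
        }
    private:
        double scale_ = 1.0;
    };

    int main()
    {
        ProteinEnergy energy;
        energy.init(1.0);                                // set up internal state
        std::vector<double> coords(3 * 22, 0.1);         // x,y,z per atom (toy values)
        const double e_total = energy.eval(coords);      // total energy for these coordinates
        std::printf("total energy (toy model) = %.3f\n", e_total);
        return 0;
    }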
XTALOPT: An open-source evolutionary algorithm for crystal structure prediction
NASA Astrophysics Data System (ADS)
Lonie, David C.; Zurek, Eva
2011-02-01
The implementation and testing of XTALOPT, an evolutionary algorithm for crystal structure prediction, is outlined. We present our new periodic displacement (ripple) operator which is ideally suited to extended systems. It is demonstrated that hybrid operators, which combine two pure operators, reduce the number of duplicate structures in the search. This allows for better exploration of the potential energy surface of the system in question, while simultaneously zooming in on the most promising regions. A continuous workflow, which makes better use of computational resources as compared to traditional generation based algorithms, is employed. Various parameters in XTALOPT are optimized using a novel benchmarking scheme. XTALOPT is available under the GNU Public License, has been interfaced with various codes commonly used to study extended systems, and has an easy to use, intuitive graphical interface. Program summaryProgram title:XTALOPT Catalogue identifier: AEGX_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEGX_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GPL v2.1 or later [1] No. of lines in distributed program, including test data, etc.: 36 849 No. of bytes in distributed program, including test data, etc.: 1 149 399 Distribution format: tar.gz Programming language: C++ Computer: PCs, workstations, or clusters Operating system: Linux Classification: 7.7 External routines: QT [2], OpenBabel [3], AVOGADRO [4], SPGLIB [8] and one of: VASP [5], PWSCF [6], GULP [7]. Nature of problem: Predicting the crystal structure of a system from its stoichiometry alone remains a grand challenge in computational materials science, chemistry, and physics. Solution method: Evolutionary algorithms are stochastic search techniques which use concepts from biological evolution in order to locate the global minimum on their potential energy surface. Our evolutionary algorithm, XTALOPT, is freely available to the scientific community for use and collaboration under the GNU Public License. Running time: User dependent. The program runs until stopped by the user.
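For orientation, the skeleton below sketches the generic evolutionary loop that codes such as XTALOPT build on: a random initial population, fitness ranking, and offspring produced by crossover and mutation. It is deliberately schematic, uses a toy scalar "structure", and is not XTALOPT's implementation, which adds duplicate detection, the ripple operator, external relaxation codes and a continuous workflow.

    #include <algorithm>
    #include <cmath>
    #include <cstdio>
    #include <random>
    #include <vector>

    // Toy "structure": a single scalar gene; its "enthalpy" is the objective to minimise.
    struct Candidate { double gene; double enthalpy; };

    static double enthalpy(double x) { return (x - 1.7) * (x - 1.7) + 0.1 * std::sin(20.0 * x); }

    int main()
    {
        std::mt19937 rng(42);
        std::uniform_real_distribution<double> uni(-5.0, 5.0);
        std::normal_distribution<double> mutate(0.0, 0.2);
        auto by_enthalpy = [](const Candidate &a, const Candidate &b)
                           { return a.enthalpy < b.enthalpy; };

        // Random initial population.
        std::vector<Candidate> pop(20);
        for (auto &c : pop) { c.gene = uni(rng); c.enthalpy = enthalpy(c.gene); }

        for (int gen = 0; gen < 100; ++gen) {
            std::sort(pop.begin(), pop.end(), by_enthalpy);   // rank by fitness

            // Replace the worst half with offspring of the best half:
            // crossover (average of two parents) followed by Gaussian mutation.
            const std::size_t half = pop.size() / 2;
            std::uniform_int_distribution<std::size_t> pick(0, half - 1);
            for (std::size_t i = half; i < pop.size(); ++i) {
                const double child = 0.5 * (pop[pick(rng)].gene + pop[pick(rng)].gene)
                                     + mutate(rng);
                pop[i].gene = child;
                pop[i].enthalpy = enthalpy(child);   // in XTALOPT: a DFT/force-field relaxation
            }
        }
        std::sort(pop.begin(), pop.end(), by_enthalpy);
        std::printf("best gene %.4f with enthalpy %.5f\n", pop[0].gene, pop[0].enthalpy);
        return 0;
    }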
OpenMP-accelerated SWAT simulation using Intel C and FORTRAN compilers: Development and benchmark
NASA Astrophysics Data System (ADS)
Ki, Seo Jin; Sugimura, Tak; Kim, Albert S.
2015-02-01
We developed a practical method to accelerate execution of the Soil and Water Assessment Tool (SWAT) using open (free) computational resources. The SWAT source code (rev 622) was recompiled using a non-commercial Intel FORTRAN compiler on the Ubuntu 12.04 LTS Linux platform, and newly named iOMP-SWAT in this study. GNU utilities of make, gprof, and diff were used to develop the iOMP-SWAT package, profile memory usage, and check that parallel and serial simulations were identical. Among 302 SWAT subroutines, the slowest routines were identified using GNU gprof, and later modified using the Open Multi-Processing (OpenMP) library in an 8-core shared memory system. In addition, a C wrapping function was used to rapidly set large arrays to zero by cross-compiling with the original SWAT FORTRAN package. A universal speedup ratio of 2.3 was achieved using input data sets with a large number of hydrological response units. As we specifically focus on acceleration of a single SWAT run, the use of iOMP-SWAT for parameter calibrations will significantly improve the performance of SWAT optimization.
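The kind of change applied to the slow routines is illustrated below in a hedged C++ sketch (the actual modifications were made in FORTRAN with Intel compilers): an OpenMP parallel loop that zeroes a large working array and another that accumulates a reduction over hydrological-response-unit-like work items.

    // Compile with -fopenmp (GCC/Clang) or the equivalent Intel flag.
    #include <omp.h>
    #include <cstdio>
    #include <vector>

    int main()
    {
        const std::size_t n_hru = 5000000;          // stand-in for hydrological response units
        std::vector<double> runoff(n_hru);

        // Rapidly reset a large working array (the role of the C wrapper in the paper).
        #pragma omp parallel for
        for (long i = 0; i < (long)n_hru; ++i)
            runoff[i] = 0.0;

        // A typical per-unit loop: independent work plus a reduction, shared across threads.
        double basin_total = 0.0;
        #pragma omp parallel for reduction(+ : basin_total)
        for (long i = 0; i < (long)n_hru; ++i) {
            runoff[i] = 1.0e-3 * (i % 100);         // placeholder for the real per-unit computation
            basin_total += runoff[i];
        }

        std::printf("threads: %d, basin total: %.3f\n", omp_get_max_threads(), basin_total);
        return 0;
    }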
A self-referential HOWTO on release engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galassi, Mark C.
Release engineering is a fundamental part of the software development cycle: it is the point at which quality control is exercised and bug fixes are integrated. The way in which software is released also gives the end user her first experience of a software package, while in scientific computing release engineering can guarantee reproducibility. For these reasons and others, the release process is a good indicator of the maturity and organization of a development team. Software teams often do not put in place a release process at the beginning. This is unfortunate because the team does not have early and continuous execution of test suites, and it does not exercise the software in the same conditions as the end users. I describe an approach to release engineering based on the software tools developed and used by the GNU project, together with several specific proposals related to packaging and distribution. I do this in a step-by-step manner, demonstrating how this very paper is written and built using proper release engineering methods. Because many aspects of release engineering are not exercised in the building of the paper, the accompanying software repository also contains examples of software libraries.
XtalOpt version r9: An open-source evolutionary algorithm for crystal structure prediction
Falls, Zackary; Lonie, David C.; Avery, Patrick; ...
2015-10-23
This is a new version of XtalOpt, an evolutionary algorithm for crystal structure prediction, available for download from the CPC library or the XtalOpt website, http://xtalopt.github.io. XtalOpt is published under the GNU Public License (GPL), an open source license that is recognized by the Open Source Initiative. The new version, detailed here, incorporates many bug fixes and new features, and predicts the crystal structure of a system from its stoichiometry alone using evolutionary algorithms.
The new protein topology graph library web server.
Schäfer, Tim; Scheck, Andreas; Bruneß, Daniel; May, Patrick; Koch, Ina
2016-02-01
We present a new, extended version of the Protein Topology Graph Library web server. The Protein Topology Graph Library describes protein topology on the super-secondary structure level. It allows users to compute and visualize protein ligand graphs and to search for protein structural motifs. The new server features additional information on ligand binding to secondary structure elements, increased usability and an application programming interface (API) to retrieve data, allowing for an automated analysis of protein topology. The Protein Topology Graph Library server is freely available on the web at http://ptgl.uni-frankfurt.de. The website is implemented in PHP, JavaScript, PostgreSQL and Apache. It is supported by all major browsers. The VPLG software that was used to compute the protein ligand graphs and all other data in the database is available under the GNU Public License 2.0 from http://vplg.sourceforge.net. Contact: tim.schaefer@bioinformatik.uni-frankfurt.de; ina.koch@bioinformatik.uni-frankfurt.de. Supplementary data are available at Bioinformatics online.
Elegent—An elastic event generator
NASA Astrophysics Data System (ADS)
Kašpar, J.
2014-03-01
Although elastic scattering of nucleons may look like a simple process, it presents a long-lasting challenge for theory. Due to the missing hard energy scale, perturbative QCD cannot be applied; instead, many phenomenological/theoretical models have emerged. In this paper we present a unified implementation of some of the most prominent models in a C++ library, extended moreover to account for effects of the electromagnetic interaction. The library is complemented with a number of utilities, for instance programs to sample many distributions of interest in four-momentum transfer squared, t, impact parameter, b, and collision energy √{s}. These distributions at ISR, Spp¯S, RHIC, Tevatron and LHC energies are available for download from the project web site, both as ROOT files and as PDF figures providing comparisons among the models. The package also includes a tool for Monte-Carlo generation of elastic scattering events, which can easily be embedded in any other program framework. Catalogue identifier: AERT_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERT_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License, version 3 No. of lines in distributed program, including test data, etc.: 10551 No. of bytes in distributed program, including test data, etc.: 126316 Distribution format: tar.gz Programming language: C++. Computer: Any in principle, tested on x86-64 architecture. Operating system: Any in principle, tested on GNU/Linux. RAM: Strongly depends on the task, but typically below 20 MB Classification: 11.6. External routines: ROOT, HepMC Nature of problem: Monte-Carlo simulation of elastic nucleon-nucleon collisions Solution method: Implementation of some of the most prominent phenomenological/theoretical models, providing a cumulative distribution function that is used for random event generation. Running time: Strongly depends on the task, but typically below 1 h.
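The event-generation step mentioned above relies on a cumulative distribution function. The following self-contained C++ sketch illustrates the generic inverse-CDF technique on a toy exponential dσ/dt; it is not the Elegent API, and the slope and |t| range are arbitrary choices.

// Generic CDF-inversion event generation (illustrative only): tabulate the
// cumulative distribution of a toy dsigma/dt on a |t| grid, then map uniform
// random numbers through the inverse CDF to draw |t| values.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

int main() {
    const int n = 1000;
    const double tmax = 2.0;                        // |t| range in GeV^2 (toy choice)
    std::vector<double> t(n), cdf(n);

    auto dsdt = [](double x) { return std::exp(-8.0 * x); };  // toy exponential slope

    // Tabulate the cumulative distribution by trapezoidal integration.
    t[0] = 0.0; cdf[0] = 0.0;
    for (int i = 1; i < n; ++i) {
        t[i] = tmax * i / (n - 1);
        cdf[i] = cdf[i - 1] + 0.5 * (dsdt(t[i]) + dsdt(t[i - 1])) * (t[i] - t[i - 1]);
    }
    for (double& c : cdf) c /= cdf.back();          // normalise to [0,1]

    // Draw events by mapping uniform numbers through the inverse CDF.
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> u(0.0, 1.0);
    for (int k = 0; k < 5; ++k) {
        double r = u(rng);
        std::size_t i = std::lower_bound(cdf.begin(), cdf.end(), r) - cdf.begin();
        if (i == 0) i = 1;                          // guard against r == 0
        double f = (r - cdf[i - 1]) / (cdf[i] - cdf[i - 1]);
        std::printf("|t| = %.4f GeV^2\n", t[i - 1] + f * (t[i] - t[i - 1]));
    }
}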
NASA Astrophysics Data System (ADS)
Giraldo, Francis; Abdi, Daniel; Kopera, Michal
2017-04-01
We have built a Galerkin-based Numerical Modeling Environment (GNuMe) for nonhydrostatic atmospheric and ocean processes. GNuMe uses continuous Galerkin and discontinuous Galerkin (CG/DG) discretizations as well as non-conforming adaptive mesh refinement (AMR), along with advanced time-integration methods that exploit both the CG/DG and AMR capabilities. GNuMe currently solves the compressible and incompressible Navier-Stokes equations and the shallow water equations (with wetting and drying), and work is underway to include other types of equations. Moreover, GNuMe can run in both 2D and 3D modes on accelerator hardware such as Nvidia GPUs and Intel KNL, as well as on standard x86 cores. In this talk, we present representative solutions obtained with GNuMe and discuss where we think such a modeling framework could fit within standard Earth System Models. For further information on GNuMe please visit: http://frankgiraldo.wixsite.com/mysite/gnume.
NASA Astrophysics Data System (ADS)
Shameoni Niaei, M.; Kilic, Y.; Yildiran, B. E.; Yüzlükoglu, F.; Yesilyaprak, C.
2016-12-01
We describe new software (MIPS) for the analysis and image processing of meteorological satellite (Meteosat) data for an astronomical observatory. This software helps produce atmospheric forecasts (cloud, humidity, rain) from Meteosat data for robotic telescopes. MIPS uses a Python library for EUMETSAT data, aims to be completely open source, and is licensed under the GNU General Public Licence (GPL). MIPS is platform independent and uses h5py, NumPy, and PIL together with the general-purpose, high-level programming language Python and the Qt framework.
FFT-split-operator code for solving the Dirac equation in 2+1 dimensions
NASA Astrophysics Data System (ADS)
Mocken, Guido R.; Keitel, Christoph H.
2008-06-01
The main part of the code presented in this work represents an implementation of the split-operator method [J.A. Fleck, J.R. Morris, M.D. Feit, Appl. Phys. 10 (1976) 129-160; R. Heather, Comput. Phys. Comm. 63 (1991) 446] for calculating the time-evolution of Dirac wave functions. It allows to study the dynamics of electronic Dirac wave packets under the influence of any number of laser pulses and its interaction with any number of charged ion potentials. The initial wave function can be either a free Gaussian wave packet or an arbitrary discretized spinor function that is loaded from a file provided by the user. The latter option includes Dirac bound state wave functions. The code itself contains the necessary tools for constructing such wave functions for a single-electron ion. With the help of self-adaptive numerical grids, we are able to study the electron dynamics for various problems in 2+1 dimensions at high spatial and temporal resolutions that are otherwise unachievable. Along with the position and momentum space probability density distributions, various physical observables, such as the expectation values of position and momentum, can be recorded in a time-dependent way. The electromagnetic spectrum that is emitted by the evolving particle can also be calculated with this code. Finally, for planning and comparison purposes, both the time-evolution and the emission spectrum can also be treated in an entirely classical relativistic way. Besides the implementation of the above-mentioned algorithms, the program also contains a large C++ class library to model the geometric algebra representation of spinors that we use for representing the Dirac wave function. This is why the code is called "Dirac++". Program summaryProgram title: Dirac++ or (abbreviated) d++ Catalogue identifier: AEAS_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEAS_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 474 937 No. of bytes in distributed program, including test data, etc.: 4 128 347 Distribution format: tar.gz Programming language: C++ Computer: Any, but SMP systems are preferred Operating system: Linux and MacOS X are actively supported by the current version. Earlier versions were also tested successfully on IRIX and AIX Number of processors used: Generally unlimited, but best scaling with 2-4 processors for typical problems RAM: 160 Megabytes minimum for the examples given here Classification: 2.7 External routines: FFTW Library [3,4], Gnu Scientific Library [5], bzip2, bunzip2 Nature of problem: The relativistic time evolution of wave functions according to the Dirac equation is a challenging numerical task. Especially for an electron in the presence of high intensity laser beams and/or highly charged ions, this type of problem is of considerable interest to atomic physicists. Solution method: The code employs the split-operator method [1,2], combined with fast Fourier transforms (FFT) for calculating any occurring spatial derivatives, to solve the given problem. An autocorrelation spectral method [6] is provided to generate a bound state for use as the initial wave function of further dynamical studies. Restrictions: The code in its current form is restricted to problems in two spatial dimensions. 
Otherwise it is only limited by CPU time and memory that one can afford to spend on a particular problem. Unusual features: The code features dynamically adapting position and momentum space grids to keep execution time and memory requirements as small as possible. It employs an object-oriented approach, and it relies on a Clifford algebra class library to represent the mathematical objects of the Dirac formalism which we employ. Besides that it includes a feature (typically called "checkpointing") which allows the resumption of an interrupted calculation. Additional comments: Along with the program's source code, we provide several sample configuration files, a pre-calculated bound state wave function, and template files for the analysis of the results with both MatLab and Igor Pro. Running time: Running time ranges from a few minutes for simple tests up to several days, even weeks for real-world physical problems that require very large grids or very small time steps. References:J.A. Fleck, J.R. Morris, M.D. Feit, Time-dependent propagation of high energy laser beams through the atmosphere, Appl. Phys. 10 (1976) 129-160. R. Heather, An asymptotic wavefunction splitting procedure for propagating spatially extended wavefunctions: Application to intense field photodissociation of H +2, Comput. Phys. Comm. 63 (1991) 446. M. Frigo, S.G. Johnson, FFTW: An adaptive software architecture for the FFT, in: Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing, vol. 3, IEEE, 1998, pp. 1381-1384. M. Frigo, S.G. Johnson, The design and implementation of FFTW3, in: Proceedings of the IEEE, vol. 93, IEEE, 2005, pp. 216-231. URL: http://www.fftw.org/. M. Galassi, J. Davies, J. Theiler, B. Gough, G. Jungman, M. Booth, F. Rossi, GNU Scientific Library Reference Manual, second ed., Network Theory Limited, 2006. URL: http://www.gnu.org/software/gsl/. M.D. Feit, J.A. Fleck, A. Steiger, Solution of the Schrödinger equation by a spectral method, J. Comput. Phys. 47 (1982) 412-433.
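As a much-simplified illustration of the split-operator idea (a 1-D Schrödinger propagation rather than the 2+1-dimensional Dirac case handled by Dirac++), the following sketch alternates half-steps of the potential propagator in position space with a kinetic step applied in momentum space via FFTW; the grid size, potential and initial wave packet are arbitrary choices, and the program links against -lfftw3.

// Minimal 1-D split-operator step: apply exp(-i V dt/2) in position space,
// exp(-i k^2 dt/2) in momentum space via FFT, then exp(-i V dt/2) again.
// Simplified sketch only; it is not the 2+1-D Dirac propagation of Dirac++.
#include <cmath>
#include <complex>
#include <vector>
#include <fftw3.h>

int main() {
    const int n = 256;
    const double L = 20.0, dx = L / n, dt = 0.01;
    std::vector<std::complex<double>> psi(n);
    std::vector<double> V(n), k(n);

    // Gaussian wave packet with momentum kick, harmonic potential (toy choices).
    for (int i = 0; i < n; ++i) {
        double x = -L / 2 + i * dx;
        psi[i] = std::exp(std::complex<double>(-x * x, 5.0 * x));
        V[i] = 0.5 * x * x;
        int m = (i <= n / 2) ? i : i - n;          // FFT frequency ordering
        k[i] = 2.0 * M_PI * m / L;
    }

    fftw_complex* data = reinterpret_cast<fftw_complex*>(psi.data());
    fftw_plan fwd = fftw_plan_dft_1d(n, data, data, FFTW_FORWARD, FFTW_ESTIMATE);
    fftw_plan bwd = fftw_plan_dft_1d(n, data, data, FFTW_BACKWARD, FFTW_ESTIMATE);

    for (int step = 0; step < 100; ++step) {
        for (int i = 0; i < n; ++i) psi[i] *= std::exp(std::complex<double>(0.0, -0.5 * V[i] * dt));
        fftw_execute(fwd);
        // Kinetic phase; divide by n to undo FFTW's unnormalised transform pair.
        for (int i = 0; i < n; ++i) psi[i] *= std::exp(std::complex<double>(0.0, -0.5 * k[i] * k[i] * dt)) / double(n);
        fftw_execute(bwd);
        for (int i = 0; i < n; ++i) psi[i] *= std::exp(std::complex<double>(0.0, -0.5 * V[i] * dt));
    }

    fftw_destroy_plan(fwd);
    fftw_destroy_plan(bwd);
}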
blend4php: a PHP API for galaxy.
Wytko, Connor; Soto, Brian; Ficklin, Stephen P
2017-01-01
Galaxy is a popular framework for execution of complex analytical pipelines, typically for large data sets, and is commonly used for (but not limited to) genomic, genetic and related biological analysis. It provides a web front-end and integrates with high performance computing resources. Here we report the development of the blend4php library, which wraps Galaxy's RESTful API into a PHP-based library. PHP-based web applications can use blend4php to automate execution, monitoring and management of a remote Galaxy server, including its users, workflows, jobs and more. The blend4php library was specifically developed for the integration of Galaxy with Tripal, the open-source toolkit for the creation of online genomic and genetic web sites. However, it was designed as an independent library for use by any application, and is freely available under version 3 of the GNU Lesser General Public License (LGPL v3.0) at https://github.com/galaxyproject/blend4php. Database URL: https://github.com/galaxyproject/blend4php. © The Author(s) 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Optimized multiple quantum MAS lineshape simulations in solid state NMR
NASA Astrophysics Data System (ADS)
Brouwer, William J.; Davis, Michael C.; Mueller, Karl T.
2009-10-01
The majority of nuclei available for study in solid state Nuclear Magnetic Resonance have half-integer spin I>1/2, with corresponding electric quadrupole moment. As such, they may couple with a surrounding electric field gradient. This effect introduces anisotropic line broadening to spectra, arising from distinct chemical species within polycrystalline solids. In Multiple Quantum Magic Angle Spinning (MQMAS) experiments, a second frequency dimension is created, devoid of quadrupolar anisotropy. As a result, the center of gravity of peaks in the high resolution dimension is a function of isotropic second order quadrupole and chemical shift alone. However, for complex materials, these parameters take on a stochastic nature due in turn to structural and chemical disorder. Lineshapes may still overlap in the isotropic dimension, complicating the task of assignment and interpretation. A distributed computational approach is presented here which permits simulation of the two-dimensional MQMAS spectrum, generated by random variates from model distributions of isotropic chemical and quadrupole shifts. Owing to the non-convex nature of the residual sum of squares (RSS) function between experimental and simulated spectra, simulated annealing is used to optimize the simulation parameters. In this manner, local chemical environments for disordered materials may be characterized, and via a re-sampling approach, error estimates for parameters produced. Program summaryProgram title: mqmasOPT Catalogue identifier: AEEC_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEEC_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 3650 No. of bytes in distributed program, including test data, etc.: 73 853 Distribution format: tar.gz Programming language: C, OCTAVE Computer: UNIX/Linux Operating system: UNIX/Linux Has the code been vectorised or parallelized?: Yes RAM: Example: (1597 powder angles) × (200 Samples) × (81 F2 frequency pts) × (31 F1 frequency points) = 3.5M, SMP AMD opteron Classification: 2.3 External routines: OCTAVE ( http://www.gnu.org/software/octave/), GNU Scientific Library ( http://www.gnu.org/software/gsl/), OPENMP ( http://openmp.org/wp/) Nature of problem: The optimal simulation and modeling of multiple quantum magic angle spinning NMR spectra, for general systems, especially those with mild to significant disorder. The approach outlined and implemented in C and OCTAVE also produces model parameter error estimates. Solution method: A model for each distinct chemical site is first proposed, for the individual contribution of crystallite orientations to the spectrum. This model is averaged over all powder angles [1], as well as the (stochastic) parameters; isotropic chemical shift and quadrupole coupling constant. The latter is accomplished via sampling from a bi-variate Gaussian distribution, using the Box-Muller algorithm to transform Sobol (quasi) random numbers [2]. A simulated annealing optimization is performed, and finally the non-linear jackknife [3] is applied in developing model parameter error estimates. Additional comments: The distribution contains a script, mqmasOpt.m, which runs in the OCTAVE language workspace. Running time: Example: (1597 powder angles) × (200 Samples) × (81 F2 frequency pts) × (31 F1 frequency points) = 58.35 seconds, SMP AMD opteron. References:S.K. 
Zaremba, Annali di Matematica Pura ed Applicata 73 (1966) 293. H. Niederreiter, Random Number Generation and Quasi-Monte Carlo Methods, SIAM, 1992. T. Fox, D. Hinkley, K. Larntz, Technometrics 22 (1980) 29.
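The sampling step described in the summary (Sobol quasi-random numbers mapped through the Box-Muller transform to a bivariate Gaussian) can be sketched in a few lines of C++ with the GNU Scientific Library; the means, widths and correlation below are placeholders, not parameters taken from mqmasOPT, and the program links against -lgsl -lgslcblas.

// Hedged sketch of the sampling step: GSL Sobol quasi-random pairs pushed
// through the Box-Muller transform to draw correlated (chemical shift,
// quadrupole coupling) samples.  All numerical values are placeholders.
#include <cmath>
#include <cstdio>
#include <gsl/gsl_qrng.h>

int main() {
    const double mu_cs = 0.0, sig_cs = 2.0;      // isotropic chemical shift (ppm)
    const double mu_cq = 3.0, sig_cq = 0.5;      // quadrupole coupling (MHz)
    const double rho = 0.3;                      // assumed correlation

    gsl_qrng* q = gsl_qrng_alloc(gsl_qrng_sobol, 2);
    for (int i = 0; i < 8; ++i) {
        double u[2];
        gsl_qrng_get(q, u);
        double u1 = (u[0] > 0.0) ? u[0] : 1e-12;            // avoid log(0)
        double r = std::sqrt(-2.0 * std::log(u1));
        double z0 = r * std::cos(2.0 * M_PI * u[1]);        // Box-Muller pair
        double z1 = r * std::sin(2.0 * M_PI * u[1]);
        double cs = mu_cs + sig_cs * z0;
        double cq = mu_cq + sig_cq * (rho * z0 + std::sqrt(1.0 - rho * rho) * z1);
        std::printf("sample %d: delta_iso = %6.3f ppm, Cq = %5.3f MHz\n", i, cs, cq);
    }
    gsl_qrng_free(q);
}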
Simulation Platform: a cloud-based online simulation environment.
Yamazaki, Tadashi; Ikeno, Hidetoshi; Okumura, Yoshihiro; Satoh, Shunji; Kamiyama, Yoshimi; Hirata, Yutaka; Inagaki, Keiichiro; Ishihara, Akito; Kannon, Takayuki; Usui, Shiro
2011-09-01
For multi-scale and multi-modal neural modeling, it is necessary to handle multiple neural models described at different levels seamlessly. Database technology will become more important for these studies, specifically for downloading and handling the neural models seamlessly and effortlessly. To date, conventional neuroinformatics databases have been designed solely to archive model files, but the databases should also give users a chance to validate the models before downloading them. In this paper, we report our on-going project to develop a cloud-based web service for online simulation called "Simulation Platform". Simulation Platform is a cloud of virtual machines running GNU/Linux. On a virtual machine, various software packages are pre-installed, including developer tools such as compilers and libraries, popular neural simulators such as GENESIS, NEURON and NEST, and scientific software such as Gnuplot, R and Octave. When a user posts a request, a virtual machine is assigned to the user, and the simulation starts on that machine. The user remotely accesses the machine through a web browser and carries out the simulation, without the need to install any software other than a web browser on the user's own computer. Therefore, Simulation Platform is expected to eliminate impediments to handling multiple neural models that require multiple software packages. Copyright © 2011 Elsevier Ltd. All rights reserved.
Reprint of: Simulation Platform: a cloud-based online simulation environment.
Yamazaki, Tadashi; Ikeno, Hidetoshi; Okumura, Yoshihiro; Satoh, Shunji; Kamiyama, Yoshimi; Hirata, Yutaka; Inagaki, Keiichiro; Ishihara, Akito; Kannon, Takayuki; Usui, Shiro
2011-11-01
For multi-scale and multi-modal neural modeling, it is necessary to handle multiple neural models described at different levels seamlessly. Database technology will become more important for these studies, specifically for downloading and handling the neural models seamlessly and effortlessly. To date, conventional neuroinformatics databases have been designed solely to archive model files, but the databases should also give users a chance to validate the models before downloading them. In this paper, we report our on-going project to develop a cloud-based web service for online simulation called "Simulation Platform". Simulation Platform is a cloud of virtual machines running GNU/Linux. On a virtual machine, various software packages are pre-installed, including developer tools such as compilers and libraries, popular neural simulators such as GENESIS, NEURON and NEST, and scientific software such as Gnuplot, R and Octave. When a user posts a request, a virtual machine is assigned to the user, and the simulation starts on that machine. The user remotely accesses the machine through a web browser and carries out the simulation, without the need to install any software other than a web browser on the user's own computer. Therefore, Simulation Platform is expected to eliminate impediments to handling multiple neural models that require multiple software packages. Copyright © 2011 Elsevier Ltd. All rights reserved.
An integrated tool for loop calculations: AITALC
NASA Astrophysics Data System (ADS)
Lorca, Alejandro; Riemann, Tord
2006-01-01
AITALC, a new tool for automating loop calculations in high energy physics, is described. The package creates Fortran code for two-fermion scattering processes automatically, starting from the generation and analysis of the Feynman graphs. We describe the modules of the tool, the intercommunication between them, and illustrate its use with three examples. Program summary Title of the program: AITALC version 1.2.1 (9 August 2005) Catalogue identifier: ADWO Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADWO Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Computer: PC i386 Operating system: GNU/LINUX, tested on different distributions SuSE 8.2 to 9.3, Red Hat 7.2, Debian 3.0, Ubuntu 5.04. Also on SOLARIS Programming language used: GNU MAKE, DIANA, FORM, FORTRAN77 Additional programs/libraries used: DIANA 2.35 (QGRAF 2.0), FORM 3.1, LOOPTOOLS 2.1 (FF) Memory required to execute with typical data: Up to about 10 MB No. of processors used: 1 No. of lines in distributed program, including test data, etc.: 40 926 No. of bytes in distributed program, including test data, etc.: 371 424 Distribution format: tar gzip file High-speed storage required: from 1.5 to 30 MB, depending on modules present and unfolding of examples Nature of the physical problem: Calculation of differential cross sections for ee annihilation in one-loop approximation. Method of solution: Generation and perturbative analysis of Feynman diagrams with later evaluation of matrix elements and form factors. Restriction of the complexity of the problem: The limit of application is, for the moment, the 2→2 particle reactions in the electro-weak standard model. Typical running time: A few minutes, depending strongly on the complexity of the process and the FORTRAN compiler.
StrBioLib: a Java library for development of custom computationalstructural biology applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chandonia, John-Marc
2007-05-14
Summary: StrBioLib is a library of Java classes useful for developing software for computational structural biology research. StrBioLib contains classes to represent and manipulate protein structures, biopolymer sequences, sets of biopolymer sequences, and alignments between biopolymers based on either sequence or structure. Interfaces are provided to interact with commonly used bioinformatics applications, including (PSI)-BLAST, MODELLER, MUSCLE, and Primer3, and tools are provided to read and write many file formats used to represent bioinformatic data. The library includes a general-purpose neural network object with multiple training algorithms, the Hooke and Jeeves nonlinear optimization algorithm, and tools for efficient C-style string parsing and formatting. StrBioLib is the basis for the Pred2ary secondary structure prediction program, is used to build the ASTRAL compendium for sequence and structure analysis, and has been extensively tested through use in many smaller projects. Examples and documentation are available at the site below. Availability: StrBioLib may be obtained under the terms of the GNU LGPL license from http://strbio.sourceforge.net/ Contact: JMChandonia@lbl.gov
StrBioLib: a Java library for development of custom computational structural biology applications.
Chandonia, John-Marc
2007-08-01
StrBioLib is a library of Java classes useful for developing software for computational structural biology research. StrBioLib contains classes to represent and manipulate protein structures, biopolymer sequences, sets of biopolymer sequences, and alignments between biopolymers based on either sequence or structure. Interfaces are provided to interact with commonly used bioinformatics applications, including (psi)-blast, modeller, muscle and Primer3, and tools are provided to read and write many file formats used to represent bioinformatic data. The library includes a general-purpose neural network object with multiple training algorithms, the Hooke and Jeeves non-linear optimization algorithm, and tools for efficient C-style string parsing and formatting. StrBioLib is the basis for the Pred2ary secondary structure prediction program, is used to build the astral compendium for sequence and structure analysis, and has been extensively tested through use in many smaller projects. Examples and documentation are available at the site below. StrBioLib may be obtained under the terms of the GNU LGPL license from http://strbio.sourceforge.net/
Fast computation of close-coupling exchange integrals using polynomials in a tree representation
NASA Astrophysics Data System (ADS)
Wallerberger, Markus; Igenbergs, Katharina; Schweinzer, Josef; Aumayr, Friedrich
2011-03-01
The semi-classical atomic-orbital close-coupling method is a well-known approach for the calculation of cross sections in ion-atom collisions. It strongly relies on the fast and stable computation of exchange integrals. We present an upgrade to earlier implementations of the Fourier-transform method. For this purpose, we implement an extensive library for symbolic storage of polynomials, relying on sophisticated tree structures to allow fast manipulation and numerically stable evaluation. Using this library, we considerably speed up creation and computation of exchange integrals. This enables us to compute cross sections for more complex collision systems. Program summary Program title: TXINT Catalogue identifier: AEHS_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHS_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 12 332 No. of bytes in distributed program, including test data, etc.: 157 086 Distribution format: tar.gz Programming language: Fortran 95 Computer: All with a Fortran 95 compiler Operating system: All with a Fortran 95 compiler RAM: Depends heavily on input, usually less than 100 MiB Classification: 16.10 Nature of problem: Analytical calculation of one- and two-center exchange matrix elements for the close-coupling method in the impact parameter model. Solution method: Similar to the code of Hansen and Dubois [1], we use the Fourier-transform method suggested by Shakeshaft [2] to compute the integrals. However, we heavily speed up the calculation using a library for symbolic manipulation of polynomials. Restrictions: We restrict ourselves to a defined collision system in the impact parameter model. Unusual features: A library for symbolic manipulation of polynomials, where polynomials are stored in a space-saving left-child right-sibling binary tree. This provides stable numerical evaluation and fast mutation while maintaining full compatibility with the original code. Additional comments: This program makes heavy use of the new features provided by the Fortran 90 standard, most prominently pointers, derived types and allocatable structures, as well as a small portion of Fortran 95. Only newer compilers support these features. The following compilers support all features needed by the program: the GNU Fortran compiler "gfortran" from version 4.3.0, the GNU Fortran 95 compiler "g95" from version 4.2.0, and the Intel Fortran compiler "ifort" from version 11.0.
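To make the left-child right-sibling idea concrete, here is a minimal illustrative C++ polynomial tree with the same flavour; it is not the TXINT Fortran data layout. Each tree level corresponds to one variable, siblings are terms in that variable, and a node's first child holds the factor polynomial in the next variable.

// Illustrative left-child right-sibling polynomial tree (not the TXINT layout).
#include <cmath>
#include <cstdio>

struct Node {
    int exponent = 0;
    double coeff = 0.0;      // used only by leaf nodes
    Node* child = nullptr;   // factor polynomial in the next variable
    Node* sibling = nullptr; // next term in the same variable
};

// Evaluate the (sub)polynomial rooted at n at vars[level], vars[level+1], ...
double eval(const Node* n, const double* vars, int level) {
    double sum = 0.0;
    for (; n != nullptr; n = n->sibling) {
        double factor = n->child ? eval(n->child, vars, level + 1) : n->coeff;
        sum += std::pow(vars[level], n->exponent) * factor;
    }
    return sum;
}

int main() {
    // p(x, y) = x^2 * (3*y) + 5, built by hand for brevity.
    Node y1{1, 3.0, nullptr, nullptr};
    Node t2{0, 5.0, nullptr, nullptr};
    Node t1{2, 0.0, &y1, &t2};

    double point[2] = {2.0, 4.0};
    std::printf("p(2,4) = %g\n", eval(&t1, point, 0));   // 2^2*3*4 + 5 = 53
}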
NASA Astrophysics Data System (ADS)
Vukics, András
2012-06-01
C++QED is a versatile framework for simulating open quantum dynamics. It allows to build arbitrarily complex quantum systems from elementary free subsystems and interactions, and simulate their time evolution with the available time-evolution drivers. Through this framework, we introduce a design which should be generic for high-level representations of composite quantum systems. It relies heavily on the object-oriented and generic programming paradigms on one hand, and on the other hand, compile-time algorithms, in particular C++ template-metaprogramming techniques. The core of the design is the data structure which represents the state vectors of composite quantum systems. This data structure models the multi-array concept. The use of template metaprogramming is not only crucial to the design, but with its use all computations pertaining to the layout of the simulated system can be shifted to compile time, hence cutting on runtime. Program summaryProgram title: C++QED Catalogue identifier: AELU_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AELU_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions:http://cpc.cs.qub.ac.uk/licence/aelu_v1_0.html. The C++QED package contains other software packages, Blitz, Boost and FLENS, all of which may be distributed freely but have individual license requirements. Please see individual packages for license conditions. No. of lines in distributed program, including test data, etc.: 597 974 No. of bytes in distributed program, including test data, etc.: 4 874 839 Distribution format: tar.gz Programming language: C++ Computer: i386-i686, x86_64 Operating system: In principle cross-platform, as yet tested only on UNIX-like systems (including Mac OS X). RAM: The framework itself takes about 60 MB, which is fully shared. The additional memory taken by the program which defines the actual physical system (script) is typically less than 1 MB. The memory storing the actual data scales with the system dimension for state-vector manipulations, and the square of the dimension for density-operator manipulations. This might easily be GBs, and often the memory of the machine limits the size of the simulated system. Classification: 4.3, 4.13, 6.2, 20 External routines: Boost C++ libraries (http://www.boost.org/), GNU Scientific Library (http://www.gnu.org/software/gsl/), Blitz++ (http://www.oonumerics.org/blitz/), Linear Algebra Package - Flexible Library for Efficient Numerical Solutions (http://flens.sourceforge.net/). Nature of problem: Definition of (open) composite quantum systems out of elementary building blocks [1]. Manipulation of such systems, with emphasis on dynamical simulations such as Master-equation evolution [2] and Monte Carlo wave-function simulation [3]. Solution method: Master equation, Monte Carlo wave-function method. Restrictions: Total dimensionality of the system. Master equation - few thousands. Monte Carlo wave-function trajectory - several millions. Unusual features: Because of the heavy use of compile-time algorithms, compilation of programs written in the framework may take a long time and much memory (up to several GBs). Additional comments: The framework is not a program, but provides and implements an application-programming interface for developing simulations in the indicated problem domain. Supplementary information: http://cppqed.sourceforge.net/. Running time: Depending on the magnitude of the problem, can vary from a few seconds to weeks.
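A toy C++ sketch of the compile-time layout idea follows; it is illustrative only and not the C++QED interface. The dimensions of the elementary subsystems are template parameters, so the total dimension and storage of the composite state vector are fixed during compilation rather than at run time.

// Toy illustration of compile-time composition of subsystem dimensions
// (not the C++QED API).  Requires C++17 for the fold expression.
#include <array>
#include <complex>
#include <cstdio>

template <unsigned... Dims>
struct Composite {
    static constexpr unsigned rank  = sizeof...(Dims);
    static constexpr unsigned total = (Dims * ...);          // product of subsystem dimensions
    using StateVector = std::array<std::complex<double>, total>;
};

int main() {
    // e.g. a two-level atom coupled to two cavity modes truncated at 8 photons
    using System = Composite<2, 8, 8>;
    static_assert(System::total == 128, "layout resolved at compile time");

    System::StateVector psi{};              // contiguous storage, size known statically
    psi[0] = {1.0, 0.0};                    // ground state amplitude
    std::printf("rank = %u, dimension = %u\n", System::rank, System::total);
}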
Distributed databases for materials study of thermo-kinetic properties
NASA Astrophysics Data System (ADS)
Toher, Cormac
2015-03-01
High-throughput computational materials science provides researchers with the opportunity to rapidly generate large databases of materials properties. To rapidly add thermal properties to the AFLOWLIB consortium and Materials Project repositories, we have implemented an automated quasi-harmonic Debye model, the Automatic GIBBS Library (AGL). This enables us to screen thousands of materials for thermal conductivity, bulk modulus, thermal expansion and related properties. The search and sort functions of the online database can then be used to identify suitable materials for more in-depth study using more precise computational or experimental techniques. The AFLOW-AGL source code is public domain and will soon be released under the GNU GPL license.
PD5: a general purpose library for primer design software.
Riley, Michael C; Aubrey, Wayne; Young, Michael; Clare, Amanda
2013-01-01
Complex PCR applications for large genome-scale projects require fast, reliable and often highly sophisticated primer design software applications. Presently, such applications use pipelining methods to utilise many third party applications, and this involves file parsing, interfacing and data conversion, which is slow and prone to error. A fully integrated suite of software tools for primer design would considerably improve the development time, the processing speed, and the reliability of bespoke primer design software applications. The PD5 software library is an open-source collection of classes and utilities, providing a complete collection of software building blocks for primer design and analysis. It is written in object-oriented C++ with an emphasis on classes suitable for efficient and rapid development of bespoke primer design programs. The modular design of the software library simplifies the development of specific applications and also integration with existing third party software where necessary. We demonstrate several applications created using this software library that have already proved to be effective, but we view the project as a dynamic environment for building primer design software and it is open for future development by the bioinformatics community. Therefore, the PD5 software library is published under the terms of the GNU General Public License, which guarantees access to the source code and allows redistribution and modification. The PD5 software library is downloadable from Google Code and the accompanying Wiki includes instructions and examples: http://code.google.com/p/primer-design.
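As a hypothetical example of the kind of low-level building block such a library provides (the function names below are not part of the PD5 API), a primer candidate can be scored for GC content and given a Wallace-rule melting-temperature estimate in a few lines of C++.

// Hypothetical primer utilities, not the PD5 API: GC fraction and the
// Wallace rule Tm = 2*(A+T) + 4*(G+C) degrees C, reasonable for short primers.
#include <cstdio>
#include <string>

double gc_fraction(const std::string& seq) {
    int gc = 0;
    for (char b : seq)
        if (b == 'G' || b == 'C') ++gc;
    return seq.empty() ? 0.0 : static_cast<double>(gc) / seq.size();
}

double wallace_tm(const std::string& seq) {
    int at = 0, gc = 0;
    for (char b : seq)
        (b == 'G' || b == 'C') ? ++gc : ++at;
    return 2.0 * at + 4.0 * gc;
}

int main() {
    std::string primer = "ATGCGTACCTGA";
    std::printf("GC = %.0f%%, Tm (Wallace) = %.1f C\n",
                100.0 * gc_fraction(primer), wallace_tm(primer));
}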
Using Spherical-Harmonics Expansions for Optics Surface Reconstruction from Gradients.
Solano-Altamirano, Juan Manuel; Vázquez-Otero, Alejandro; Khikhlukha, Danila; Dormido, Raquel; Duro, Natividad
2017-11-30
In this paper, we propose a new algorithm to reconstruct optics surfaces (aka wavefronts) from gradients, defined on a circular domain, by means of the Spherical Harmonics. The experimental results indicate that this algorithm renders the same accuracy, compared to the reconstruction based on classical Zernike polynomials, using a smaller number of polynomial terms, which potentially speeds up the wavefront reconstruction. Additionally, we provide an open-source C++ library, released under the terms of the GNU General Public License version 2 (GPLv2), wherein several polynomial sets are coded. Therefore, this library constitutes a robust software alternative for wavefront reconstruction in a high energy laser field, optical surface reconstruction, and, more generally, in surface reconstruction from gradients. The library is a candidate for being integrated in control systems for optical devices, or similarly to be used in ad hoc simulations. Moreover, it has been developed with flexibility in mind, and, as such, the implementation includes the following features: (i) a mock-up generator of various incident wavefronts, intended to simulate the wavefronts commonly encountered in the field of high-energy lasers production; (ii) runtime selection of the library in charge of performing the algebraic computations; (iii) a profiling mechanism to measure and compare the performance of different steps of the algorithms and/or third-party linear algebra libraries. Finally, the library can be easily extended to include additional dependencies, such as porting the algebraic operations to specific architectures, in order to exploit hardware acceleration features.
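The reconstruction-from-gradients problem itself can be sketched generically; the following is not the library's API or its spherical-harmonics basis, only a toy modal fit. Sample the measured slopes on a circular domain, build the normal equations for a small polynomial basis with known analytic gradients, and solve for the modal coefficients by least squares.

// Generic gradient-based modal reconstruction sketch (toy basis, synthetic data).
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
    // Basis {x, y, x*y, x^2 - y^2} with analytic gradients.
    const int nb = 4;
    auto dfdx = [](int j, double x, double y) {
        switch (j) { case 0: return 1.0; case 1: return 0.0; case 2: return y; default: return 2.0 * x; } };
    auto dfdy = [](int j, double x, double y) {
        switch (j) { case 0: return 0.0; case 1: return 1.0; case 2: return x; default: return -2.0 * y; } };

    // Synthetic slopes of W(x,y) = 0.5*x + 2*(x^2 - y^2) sampled on the unit disk.
    std::vector<double> A(nb * nb, 0.0), b(nb, 0.0);
    for (double x = -1.0; x <= 1.0; x += 0.1)
        for (double y = -1.0; y <= 1.0; y += 0.1) {
            if (x * x + y * y > 1.0) continue;            // circular domain
            double gx = 0.5 + 4.0 * x, gy = -4.0 * y;     // "measured" gradients
            for (int j = 0; j < nb; ++j) {
                b[j] += dfdx(j, x, y) * gx + dfdy(j, x, y) * gy;
                for (int k = 0; k < nb; ++k)
                    A[j * nb + k] += dfdx(j, x, y) * dfdx(k, x, y) + dfdy(j, x, y) * dfdy(k, x, y);
            }
        }

    // Solve the small SPD system A c = b by Gaussian elimination.
    std::vector<double> c = b;
    for (int i = 0; i < nb; ++i)
        for (int k = i + 1; k < nb; ++k) {
            double f = A[k * nb + i] / A[i * nb + i];
            for (int j = i; j < nb; ++j) A[k * nb + j] -= f * A[i * nb + j];
            c[k] -= f * c[i];
        }
    for (int i = nb - 1; i >= 0; --i) {
        for (int j = i + 1; j < nb; ++j) c[i] -= A[i * nb + j] * c[j];
        c[i] /= A[i * nb + i];
    }
    std::printf("coefficients: %.3f %.3f %.3f %.3f (expect 0.5, 0, 0, 2)\n", c[0], c[1], c[2], c[3]);
}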
Using Spherical-Harmonics Expansions for Optics Surface Reconstruction from Gradients
Solano-Altamirano, Juan Manuel; Khikhlukha, Danila
2017-01-01
In this paper, we propose a new algorithm to reconstruct optics surfaces (aka wavefronts) from gradients, defined on a circular domain, by means of the Spherical Harmonics. The experimental results indicate that this algorithm renders the same accuracy, compared to the reconstruction based on classical Zernike polynomials, using a smaller number of polynomial terms, which potentially speeds up the wavefront reconstruction. Additionally, we provide an open-source C++ library, released under the terms of the GNU General Public License version 2 (GPLv2), wherein several polynomial sets are coded. Therefore, this library constitutes a robust software alternative for wavefront reconstruction in a high energy laser field, optical surface reconstruction, and, more generally, in surface reconstruction from gradients. The library is a candidate for being integrated in control systems for optical devices, or similarly to be used in ad hoc simulations. Moreover, it has been developed with flexibility in mind, and, as such, the implementation includes the following features: (i) a mock-up generator of various incident wavefronts, intended to simulate the wavefronts commonly encountered in the field of high-energy lasers production; (ii) runtime selection of the library in charge of performing the algebraic computations; (iii) a profiling mechanism to measure and compare the performance of different steps of the algorithms and/or third-party linear algebra libraries. Finally, the library can be easily extended to include additional dependencies, such as porting the algebraic operations to specific architectures, in order to exploit hardware acceleration features. PMID:29189722
The orbifolder: A tool to study the low-energy effective theory of heterotic orbifolds
NASA Astrophysics Data System (ADS)
Nilles, H. P.; Ramos-Sánchez, S.; Vaudrevange, P. K. S.; Wingerter, A.
2012-06-01
The orbifolder is a program developed in C++ that computes and analyzes the low-energy effective theory of heterotic orbifold compactifications. The program includes routines to compute the massless spectrum, to identify the allowed couplings in the superpotential, to automatically generate large sets of orbifold models, to identify phenomenologically interesting models (e.g. MSSM-like models) and to analyze their vacuum configurations. Program summaryProgram title: orbifolder Catalogue identifier: AELR_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AELR_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License version 3 No. of lines in distributed program, including test data, etc.: 145 572 No. of bytes in distributed program, including test data, etc.: 930 517 Distribution format: tar.gz Programming language:C++ Computer: Personal computer Operating system: Tested on Linux (Fedora 15, Ubuntu 11, SuSE 11) Word size: 32 bits or 64 bits Classification: 11.1 External routines: Boost (http://www.boost.org/), GSL (http://www.gnu.org/software/gsl/) Nature of problem: Calculating the low-energy spectrum of heterotic orbifold compactifications. Solution method: Quadratic equations on a lattice; representation theory; polynomial algebra. Running time: Less than a second per model.
Interference Cancellation System Design Using GNU Radio
2015-12-01
ARL-TR-7546, December 2015. US Army Research Laboratory, Sensors and Electron Devices Directorate. Final report by Jan Paolo Acosta: Interference Cancellation System Design Using GNU Radio.
MinFinder v2.0: An improved version of MinFinder
NASA Astrophysics Data System (ADS)
Tsoulos, Ioannis G.; Lagaris, Isaac E.
2008-10-01
A new version of the "MinFinder" program is presented that offers an augmented linking procedure for Fortran-77 subprograms, two additional stopping rules and a new start-point rejection mechanism that saves a significant portion of gradient and function evaluations. The method is applied on a set of standard test functions and the results are reported. New version program summaryProgram title: MinFinder v2.0 Catalogue identifier: ADWU_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADWU_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC Licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 14 150 No. of bytes in distributed program, including test data, etc.: 218 144 Distribution format: tar.gz Programming language used: GNU C++, GNU FORTRAN, GNU C Computer: The program is designed to be portable in all systems running the GNU C++ compiler Operating system: Linux, Solaris, FreeBSD RAM: 200 000 bytes Classification: 4.9 Catalogue identifier of previous version: ADWU_v1_0 Journal reference of previous version: Computer Physics Communications 174 (2006) 166-179 Does the new version supersede the previous version?: Yes Nature of problem: A multitude of problems in science and engineering are often reduced to minimizing a function of many variables. There are instances that a local optimum does not correspond to the desired physical solution and hence the search for a better solution is required. Local optimization techniques can be trapped in any local minimum. Global optimization is then the appropriate tool. For example, solving a non-linear system of equations via optimization, one may encounter many local minima that do not correspond to solutions, i.e. they are far from zero. Solution method: Using a uniform pdf, points are sampled from a rectangular domain. A clustering technique, based on a typical distance and a gradient criterion, is used to decide from which points a local search should be started. Further searching is terminated when all the local minima inside the search domain are thought to be found. This is accomplished via three stopping rules: the "double-box" stopping rule, the "observables" stopping rule and the "expected minimizers" stopping rule. Reasons for the new version: The link procedure for source code in Fortran 77 is enhanced, two additional stopping rules are implemented and a new criterion for accepting-start points, that economizes on function and gradient calls, is introduced. Summary of revisions:Addition of command line parameters to the utility program make_program. Augmentation of the link process for Fortran 77 subprograms, by linking the final executable with the g2c library. Addition of two probabilistic stopping rules. Introduction of a rejection mechanism to the Checking step of the original method, that reduces the number of gradient evaluations. Additional comments: A technical report describing the revisions, experiments and test runs is packaged with the source code. Running time: Depending on the objective function.
WeBIAS: a web server for publishing bioinformatics applications.
Daniluk, Paweł; Wilczyński, Bartek; Lesyng, Bogdan
2015-11-02
One of the requirements for a successful scientific tool is its availability. Developing a functional web service, however, is usually considered a mundane and ungratifying task, and quite often neglected. When publishing bioinformatic applications, such an attitude puts an additional burden on the reviewers, who have to cope with poorly designed interfaces in order to assess the quality of the presented methods, and it impairs the actual usefulness to the scientific community at large. In this note we present WeBIAS, a simple, self-contained solution to make command-line programs accessible through web forms. It comprises a web portal capable of serving several applications and backend schedulers which carry out computations. The server handles user registration and authentication, stores queries and results, and provides a convenient administrator interface. WeBIAS is implemented in Python and available under the GNU Affero General Public License. It has been developed and tested on GNU/Linux compatible platforms covering a vast majority of operational WWW servers. Since it is written in pure Python, it should be easy to deploy also on all other platforms supporting Python (e.g. Windows, Mac OS X). Documentation and source code, as well as a demonstration site, are available at http://bioinfo.imdik.pan.pl/webias. WeBIAS has been designed specifically with ease of installation and deployment of services in mind. Setting up a simple application requires minimal effort, yet it is possible to create visually appealing, feature-rich interfaces for query submission and presentation of results.
STILTS -- Starlink Tables Infrastructure Library Tool Set
NASA Astrophysics Data System (ADS)
Taylor, Mark
STILTS is a set of command-line tools for processing tabular data. It has been designed for, but is not restricted to, use on astronomical data such as source catalogues. It contains both generic (format-independent) table processing tools and tools for processing VOTable documents. Facilities offered include crossmatching, format conversion, format validation, column calculation and rearrangement, row selection, sorting, plotting, statistical calculations and metadata display. Calculations on cell data can be performed using a powerful and extensible expression language. The package is written in pure Java and based on STIL, the Starlink Tables Infrastructure Library. This gives it high portability, support for many data formats (including FITS, VOTable, text-based formats and SQL databases), extensibility and scalability. Where possible the tools are written to accept streamed data so the size of tables which can be processed is not limited by available memory. As well as the tutorial and reference information in this document, detailed on-line help is available from the tools themselves. STILTS is available under the GNU General Public Licence.
Libpsht - algorithms for efficient spherical harmonic transforms
NASA Astrophysics Data System (ADS)
Reinecke, M.
2011-02-01
Libpsht (or "library for performant spherical harmonic transforms") is a collection of algorithms for efficient conversion between spatial-domain and spectral-domain representations of data defined on the sphere. The package supports both transforms of scalars and spin-1 and spin-2 quantities, and can be used for a wide range of pixelisations (including HEALPix, GLESP, and ECP). It will take advantage of hardware features such as multiple processor cores and floating-point vector operations, if available. Even without this additional acceleration, the employed algorithms are among the most efficient (in terms of CPU time, as well as memory consumption) currently being used in the astronomical community. The library is written in strictly standard-conforming C90, ensuring portability to many different hard- and software platforms, and allowing straightforward integration with codes written in various programming languages like C, C++, Fortran, Python etc. Libpsht is distributed under the terms of the GNU General Public License (GPL) version 2 and can be downloaded from .
Libpsht: Algorithms for Efficient Spherical Harmonic Transforms
NASA Astrophysics Data System (ADS)
Reinecke, Martin
2010-10-01
Libpsht (or "library for Performing Spherical Harmonic Transforms") is a collection of algorithms for efficient conversion between spatial-domain and spectral-domain representations of data defined on the sphere. The package supports transforms of scalars as well as spin-1 and spin-2 quantities, and can be used for a wide range of pixelisations (including HEALPix, GLESP and ECP). It will take advantage of hardware features like multiple processor cores and floating-point vector operations, if available. Even without this additional acceleration, the employed algorithms are among the most efficient (in terms of CPU time as well as memory consumption) currently being used in the astronomical community. The library is written in strictly standard-conforming C90, ensuring portability to many different hard- and software platforms, and allowing straightforward integration with codes written in various programming languages like C, C++, Fortran, Python etc. Libpsht is distributed under the terms of the GNU General Public License (GPL) version 2. Development on this project has ended; its successor is libsharp (ascl:1402.033).
Advanced complex trait analysis.
Gray, A; Stewart, I; Tenesa, A
2012-12-01
The Genome-wide Complex Trait Analysis (GCTA) software package can quantify the contribution of genetic variation to phenotypic variation for complex traits. However, as those datasets of interest continue to increase in size, GCTA becomes increasingly computationally prohibitive. We present an adapted version, Advanced Complex Trait Analysis (ACTA), demonstrating dramatically improved performance. We restructure the genetic relationship matrix (GRM) estimation phase of the code and introduce the highly optimized parallel Basic Linear Algebra Subprograms (BLAS) library combined with manual parallelization and optimization. We introduce the Linear Algebra PACKage (LAPACK) library into the restricted maximum likelihood (REML) analysis stage. For a test case with 8999 individuals and 279,435 single nucleotide polymorphisms (SNPs), we reduce the total runtime, using a compute node with two multi-core Intel Nehalem CPUs, from ∼17 h to ∼11 min. The source code is fully available under the GNU Public License, along with Linux binaries. For more information see http://www.epcc.ed.ac.uk/software-products/acta. a.gray@ed.ac.uk Supplementary data are available at Bioinformatics online.
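For readers unfamiliar with the GRM phase that ACTA restructures, the following hedged C++ sketch computes a GCTA-style genetic relationship matrix from a tiny made-up genotype matrix; the naive triple loop at the end is exactly the kind of kernel that an optimized BLAS routine (for example a single symmetric rank-k update) replaces in the accelerated code.

// Hedged GRM sketch: standardise genotype dosages by allele frequency and form
// GRM = Z Z^T / m.  The data is made up; real code would call BLAS for Z Z^T.
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
    const int n = 4;                 // individuals
    const int m = 3;                 // SNPs
    // Genotype dosages (0, 1 or 2 copies of the reference allele), n x m row-major.
    std::vector<double> x = {0,1,2, 1,1,0, 2,2,1, 0,0,1};

    std::vector<double> p(m, 0.0);   // allele frequencies
    for (int s = 0; s < m; ++s) {
        for (int i = 0; i < n; ++i) p[s] += x[i*m + s];
        p[s] /= 2.0 * n;
    }

    // z_is = (x_is - 2 p_s) / sqrt(2 p_s (1 - p_s))
    std::vector<double> z(n * m), grm(n * n, 0.0);
    for (int i = 0; i < n; ++i)
        for (int s = 0; s < m; ++s)
            z[i*m + s] = (x[i*m + s] - 2.0*p[s]) / std::sqrt(2.0*p[s]*(1.0 - p[s]));

    for (int i = 0; i < n; ++i)
        for (int j = 0; j < n; ++j) {
            for (int s = 0; s < m; ++s) grm[i*n + j] += z[i*m + s] * z[j*m + s];
            grm[i*n + j] /= m;
        }

    for (int i = 0; i < n; ++i) {
        for (int j = 0; j < n; ++j) std::printf("%7.3f ", grm[i*n + j]);
        std::printf("\n");
    }
}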
XTALOPT version r11: An open-source evolutionary algorithm for crystal structure prediction
NASA Astrophysics Data System (ADS)
Avery, Patrick; Falls, Zackary; Zurek, Eva
2018-01-01
Version 11 of XTALOPT, an evolutionary algorithm for crystal structure prediction, has now been made available for download from the CPC library or the XTALOPT website, http://xtalopt.github.io. Whereas the previous versions of XTALOPT were published under the Gnu Public License (GPL), the current version is made available under the 3-Clause BSD License, which is an open source license that is recognized by the Open Source Initiative. Importantly, the new version can be executed via a command line interface (i.e., it does not require the use of a Graphical User Interface). Moreover, the new version is written as a stand-alone program, rather than an extension to AVOGADRO.
Online data analysis using Web GDL
NASA Astrophysics Data System (ADS)
Jaffey, A.; Cheung, M.; Kobashi, A.
2008-12-01
The ever improving capability of modern astronomical instruments to capture data at high spatial resolution and cadence is opening up unprecedented opportunities for scientific discovery. When data sets become so large that they cannot be easily transferred over the internet, the researcher must find alternative ways to perform data analysis. One strategy is to bring the data analysis code to where the data resides. We present Web GDL, an implementation of GDL (GNU Data Language, open source incremental compiler compatible with IDL) that allows users to perform interactive data analysis within a web browser.
NASA Astrophysics Data System (ADS)
Jarecka, D.; Arabas, S.; Fijalkowski, M.; Gaynor, A.
2012-04-01
The language of choice for numerical modelling in geoscience has long been Fortran. A choice of a particular language and coding paradigm comes with different set of tradeoffs such as that between performance, ease of use (and ease of abuse), code clarity, maintainability and reusability, availability of open source compilers, debugging tools, adequate external libraries and parallelisation mechanisms. The availability of trained personnel and the scale and activeness of the developer community is of importance as well. We present a short comparison study aimed at identification and quantification of these tradeoffs for a particular example of an object oriented implementation of a parallel 2D-advection-equation solver in Python/NumPy, C++/Blitz++ and modern Fortran. The main angles of comparison will be complexity of implementation, performance of various compilers or interpreters and characterisation of the "added value" gained by a particular choice of the language. The choice of the numerical problem is dictated by the aim to make the comparison useful and meaningful to geoscientists. Python is chosen as a language that traditionally is associated with ease of use, elegant syntax but limited performance. C++ is chosen for its traditional association with high performance but even higher complexity and syntax obscurity. Fortran is included in the comparison for its widespread use in geoscience often attributed to its performance. We confront the validity of these traditional views. We point out how the usability of a particular language in geoscience depends on the characteristics of the language itself and the availability of pre-existing software libraries (e.g. NumPy, SciPy, PyNGL, PyNIO, MPI4Py for Python and Blitz++, Boost.Units, Boost.MPI for C++). Having in mind the limited complexity of the considered numerical problem, we present a tentative comparison of performance of the three implementations with different open source compilers including CPython and PyPy, Clang++ and GNU g++, and GNU gfortran.
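To make the underlying numerical problem concrete, here is a minimal 1-D upwind advection step written in plain C++; the study itself compares a 2-D object-oriented solver in Python/NumPy, C++/Blitz++ and Fortran, so this is only an illustration of the kind of kernel being compared.

// Minimal 1-D upwind advection of a rectangular signal with periodic boundaries.
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    const int nx = 100, nt = 50;
    const double courant = 0.5;                  // u*dt/dx, assumed positive flow
    std::vector<double> psi(nx, 0.0), psi_new(nx);
    for (int i = 40; i < 60; ++i) psi[i] = 1.0;  // rectangular signal

    for (int t = 0; t < nt; ++t) {
        for (int i = 0; i < nx; ++i) {
            int im1 = (i + nx - 1) % nx;         // periodic boundary
            psi_new[i] = psi[i] - courant * (psi[i] - psi[im1]);
        }
        psi.swap(psi_new);
    }
    std::printf("signal maximum after %d steps: %.3f\n",
                nt, *std::max_element(psi.begin(), psi.end()));
}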
MinFinder: Locating all the local minima of a function
NASA Astrophysics Data System (ADS)
Tsoulos, Ioannis G.; Lagaris, Isaac E.
2006-01-01
A new stochastic clustering algorithm is introduced that aims to locate all the local minima of a multidimensional continuous and differentiable function inside a bounded domain. The accompanying software (MinFinder) is written in ANSI C++. However, the user may code his objective function either in C++, C or Fortran 77. We compare the performance of this new method to the performance of Multistart and Topographical Multilevel Single Linkage Clustering on a set of benchmark problems. Program summary Title of program: MinFinder Catalogue identifier: ADWU Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADWU Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Computer for which the program is designed and others on which it has been tested: The tool is designed to be portable in all systems running the GNU C++ compiler Installation: University of Ioannina, Greece Programming language used: GNU-C++, GNU-C, GNU Fortran 77 Memory required to execute with typical data: 200 KB No. of bits in a word: 32 No. of processors used: 1 Has the code been vectorized or parallelized?: no No. of lines in distributed program, including test data, etc.: 5797 No. of bytes in distributed program, including test data, etc.: 588 121 Distribution format: gzipped tar file Nature of the physical problem: A multitude of problems in science and engineering are often reduced to minimizing a function of many variables. There are instances that a local optimum does not correspond to the desired physical solution and hence the search for a better solution is required. Local optimization techniques can be trapped in any local minimum. Global optimization is then the appropriate tool. For example, solving a non-linear system of equations via optimization, employing a "least squares" type of objective, one may encounter many local minima that do not correspond to solutions, i.e. they are far from zero. Method of solution: Using a uniform pdf, points are sampled from the rectangular search domain. A clustering technique, based on a typical distance and a gradient criterion, is used to decide from which points a local search should be started. The employed local procedure is a BFGS version due to Powell. Further searching is terminated when all the local minima inside the search domain are thought to be found. This is accomplished via the double-box rule. Typical running time: Depending on the objective function.
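The multistart-with-clustering flavour of the method can be sketched loosely as follows; plain gradient descent stands in for the BFGS local search used by MinFinder, the gradient criterion and the double-box stopping rule are omitted, and the test function is made up. Sampled points that fall within a typical distance of an already-located minimum do not seed a new local search.

// Loose sketch of multistart with start-point rejection (not the MinFinder code).
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

struct Pt { double x, y; };

double f(const Pt& p)        { return std::cos(p.x) + std::cos(p.y) + 0.05 * (p.x*p.x + p.y*p.y); }
Pt     gradient(const Pt& p) { return { -std::sin(p.x) + 0.1*p.x, -std::sin(p.y) + 0.1*p.y }; }

Pt local_search(Pt p) {
    for (int i = 0; i < 2000; ++i) {             // fixed-step gradient descent
        Pt g = gradient(p);
        p.x -= 0.1 * g.x;  p.y -= 0.1 * g.y;
    }
    return p;
}

int main() {
    std::mt19937 rng(1);
    std::uniform_real_distribution<double> box(-6.0, 6.0);
    const double typical_distance = 1.0;
    std::vector<Pt> minima;

    for (int s = 0; s < 200; ++s) {
        Pt start{box(rng), box(rng)};
        bool near_known = false;                 // start-point rejection
        for (const Pt& m : minima)
            near_known |= std::hypot(start.x - m.x, start.y - m.y) < typical_distance;
        if (near_known) continue;

        Pt m = local_search(start);
        bool is_new = true;                      // keep only distinct minima
        for (const Pt& k : minima)
            is_new &= std::hypot(m.x - k.x, m.y - k.y) > 1e-3;
        if (is_new) minima.push_back(m);
    }
    std::printf("distinct local minima found: %zu\n", minima.size());
}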
GLoBES: General Long Baseline Experiment Simulator
NASA Astrophysics Data System (ADS)
Huber, Patrick; Kopp, Joachim; Lindner, Manfred; Rolinec, Mark; Winter, Walter
2007-09-01
GLoBES (General Long Baseline Experiment Simulator) is a flexible software package to simulate neutrino oscillation long baseline and reactor experiments. On the one hand, it contains a comprehensive abstract experiment definition language (AEDL), which allows to describe most classes of long baseline experiments at an abstract level. On the other hand, it provides a C-library to process the experiment information in order to obtain oscillation probabilities, rate vectors, and Δχ-values. Currently, GLoBES is available for GNU/Linux. Since the source code is included, the port to other operating systems is in principle possible. GLoBES is an open source code that has previously been described in Computer Physics Communications 167 (2005) 195 and in Ref. [7]). The source code and a comprehensive User Manual for GLoBES v3.0.8 is now available from the CPC Program Library as described in the Program Summary below. The home of GLobES is http://www.mpi-hd.mpg.de/~globes/. Program summaryProgram title: GLoBES version 3.0.8 Catalogue identifier: ADZI_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADZI_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 145 295 No. of bytes in distributed program, including test data, etc.: 1 811 892 Distribution format: tar.gz Programming language: C Computer: GLoBES builds and installs on 32bit and 64bit Linux systems Operating system: 32bit or 64bit Linux RAM: Typically a few MBs Classification: 11.1, 11.7, 11.10 External routines: GSL—The GNU Scientific Library, www.gnu.org/software/gsl/ Nature of problem: Neutrino oscillations are now established as the leading flavor transition mechanism for neutrinos. In a long history of many experiments, see, e.g., [1], two oscillation frequencies have been identified: The fast atmospheric and the slow solar oscillations, which are driven by the respective mass squared differences. In addition, there could be interference effects between these two oscillations, provided that the coupling given by the small mixing angle θ is large enough. Such interference effects include, for example, leptonic CP violation. In order to test the unknown oscillation parameters, i.e. the mixing angle θ, the leptonic CP phase, and the neutrino mass hierarchy, new long-baseline and reactor experiments are proposed. These experiments send an artificial neutrino beam to a detector, or detect the neutrinos produced by a nuclear fission reactor. However, the presence of multiple solutions which are intrinsic to neutrino oscillation probabilities [2-5] affect these measurements. Thus optimization strategies are required which maximally exploit complementarity between experiments. Therefore, a modern, complete experiment simulation and analysis tool does not only need to have a highly accurate beam and detector simulation, but also powerful means to analyze correlations and degeneracies, especially for the combination of several experiments. The GLoBES software package is such a tool [6,7]. Solution method: GLoBES is a flexible software tool to simulate and analyze neutrino oscillation long-baseline and reactor experiments using a complete three-flavor description. 
On the one hand, it contains a comprehensive abstract experiment definition language (AEDL), which makes it possible to describe most classes of long baseline and reactor experiments at an abstract level. On the other hand, it provides a C-library to process the experiment information in order to obtain oscillation probabilities, rate vectors, and Δχ²-values. In addition, it provides a binary program to test experiment definitions very quickly, before they are used by the application software. Restrictions: Currently restricted to discrete sets of sources and detectors. For example, the simulation of an atmospheric neutrino flux is not supported. Unusual features: Clear separation between experiment description and the simulation software. Additional comments: To find information on the latest version of the software and user manual, please check the authors' web site, http://www.mpi-hd.mpg.de/~globes Running time: The examples included in the distribution take only a few minutes to complete. More sophisticated problems can take up to several days. References [1] V. Barger, D. Marfatia, K. Whisnant, Int. J. Mod. Phys. E 12 (2003) 569, hep-ph/0308123, and references therein. [2] G.L. Fogli, E. Lisi, Phys. Rev. D 54 (1996) 3667, hep-ph/9604415. [3] J. Burguet-Castell, M.B. Gavela, J.J. Gomez-Cadenas, P. Hernandez, O. Mena, Nucl. Phys. B 608 (2001) 301, hep-ph/0103258. [4] H. Minakata, H. Nunokawa, JHEP 0110 (2001) 001, hep-ph/0108085. [5] V. Barger, D. Marfatia, K. Whisnant, Phys. Rev. D 65 (2002) 073023, hep-ph/0112119. [6] P. Huber, M. Lindner, W. Winter, Comput. Phys. Commun. 167 (2005) 195. [7] P. Huber, J. Kopp, M. Lindner, M. Rolinec, W. Winter, Comput. Phys. Commun. 177 (2007) 432.
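As a toy illustration of the oscillation probabilities such a simulator computes (this is not the GLoBES API), the two-flavour vacuum appearance probability can be written down directly; GLoBES generalises this to a complete three-flavour description with matter effects. The parameter values below are representative, not taken from the paper.

```python
import numpy as np

def p_appearance(L_km, E_GeV, dm2_eV2=2.5e-3, sin2_2theta=0.09):
    """Two-flavour vacuum appearance probability:
    P = sin^2(2 theta) * sin^2(1.267 * dm^2 [eV^2] * L [km] / E [GeV])."""
    return sin2_2theta * np.sin(1.267 * dm2_eV2 * L_km / E_GeV) ** 2

# Probability for a typical long-baseline configuration
print(p_appearance(L_km=1300.0, E_GeV=2.0))
```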
Scientific Digital Libraries, Interoperability, and Ontologies
NASA Technical Reports Server (NTRS)
Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris A.
2009-01-01
Scientific digital libraries serve complex and evolving research communities. Justifications for the development of scientific digital libraries include the desire to preserve science data and the promises of information interconnectedness, correlative science, and system interoperability. Shared ontologies are fundamental to fulfilling these promises. We present a tool framework, some informal principles, and several case studies where shared ontologies are used to guide the implementation of scientific digital libraries. The tool framework, based on an ontology modeling tool, was configured to develop, manage, and keep shared ontologies relevant within changing domains and to promote the interoperability, interconnectedness, and correlation desired by scientists.
NASA Astrophysics Data System (ADS)
Foucar, Lutz; Barty, Anton; Coppola, Nicola; Hartmann, Robert; Holl, Peter; Hoppe, Uwe; Kassemeyer, Stephan; Kimmel, Nils; Küpper, Jochen; Scholz, Mirko; Techert, Simone; White, Thomas A.; Strüder, Lothar; Ullrich, Joachim
2012-10-01
The Max Planck Advanced Study Group (ASG) at the Center for Free Electron Laser Science (CFEL) has created the CFEL-ASG Software Suite CASS to view, process and analyse multi-parameter experimental data acquired at Free Electron Lasers (FELs) using the CFEL-ASG Multi Purpose (CAMP) instrument Strüder et al. (2010) [6]. The software is based on a modular design so that it can be adjusted to accommodate the needs of all the various experiments that are conducted with the CAMP instrument. In fact, this allows the use of the software in all experiments where multiple detectors are involved. One of the key aspects of CASS is that it can be used either 'on-line', using a live data stream from the free-electron laser facility's data acquisition system to guide the experiment, or 'off-line', on data acquired from a previous experiment which has been saved to file. Program summary Program title: CASS Catalogue identifier: AEMP_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMP_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public Licence, version 3 No. of lines in distributed program, including test data, etc.: 167073 No. of bytes in distributed program, including test data, etc.: 1065056 Distribution format: tar.gz Programming language: C++. Computer: Intel x86-64. Operating system: GNU/Linux (for information about restrictions see outlook). RAM: >8 GB Classification: 2.3, 3, 15, 16.4. External routines: Qt-Framework[1], SOAP[2], (optional HDF5[3], VIGRA[4], ROOT[5], QWT[6]) Nature of problem: Analysis and visualisation of scientific data acquired at Free Electron Lasers Solution method: Generalise data access and storage so that a variety of small programming pieces can be linked to form a complex analysis chain. Unusual features: Complex analysis chains can be built without recompiling the program Additional comments: An updated extensive documentation of CASS is available at [7]. Running time: Depending on the data size and complexity of analysis algorithms. References: [1] http://qt.nokia.com [2] http://www.cs.fsu.edu/~engelen/soap.html [3] http://www.hdfgroup.org/HDF5/ [4] http://hci.iwr.uni-heidelberg.de/vigra/ [5] http://root.cern.ch [6] http://qwt.sourceforge.net/ [7] http://www.mpi-hd.mpg.de/personalhomes/gitasg/cass
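The modular design described above, small programming pieces linked into an analysis chain, can be illustrated with a generic sketch; CASS itself wires its processors together from a run-time configuration rather than in code, and the step names below are invented for illustration.

```python
from typing import Callable, Dict, List

Event = Dict[str, object]
Processor = Callable[[Event], Event]

def make_chain(processors: List[Processor]) -> Processor:
    """Compose small, independent processing pieces into one analysis chain."""
    def run(event: Event) -> Event:
        for process in processors:
            event = process(event)
        return event
    return run

def baseline_subtract(event):        # toy detector-specific step
    trace = event["trace"]
    event["trace"] = [v - min(trace) for v in trace]
    return event

def integrate(event):                # toy reduction step
    event["charge"] = sum(event["trace"])
    return event

chain = make_chain([baseline_subtract, integrate])
print(chain({"trace": [3, 5, 9, 4]})["charge"])
```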
Scientific and Technical Information in Canada, Part II, Chapter 6: Libraries.
ERIC Educational Resources Information Center
Science Council of Canada, Ottawa (Ontario).
The four types of libraries - special, academic, public, and school - collectively constitute a large part of the knowledge available in Canada. Consequently, a scientific and technical information network will be heavily dependent on these established library collections. Communications across the "type of library" boundaries is…
Digital beacon receiver for ionospheric TEC measurement developed with GNU Radio
NASA Astrophysics Data System (ADS)
Yamamoto, M.
2008-11-01
A simple digital receiver named GNU Radio Beacon Receiver (GRBR) was developed for the satellite-ground beacon experiment to measure the ionospheric total electron content (TEC). GNU Radio, the open-source software toolkit for software-defined radio, is utilized to realize the basic functions of the receiver and perform fast signal processing. The software is written in Python for a Linux PC. The open-source hardware called Universal Software Radio Peripheral (USRP), which best matches GNU Radio, is used as a front-end to acquire the satellite beacon signals at 150 and 400 MHz. The first experiment was successful, as results from GRBR showed very good agreement with those from the co-located analog beacon receiver. Detailed design information and software code are available at http://www.rish.kyoto-u.ac.jp/digitalbeacon/.
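For orientation, the textbook first-order relation between the 150/400 MHz differential carrier phase and the slant TEC is sketched below. This is not GRBR's actual processing chain, only relative TEC is recovered, and the sign convention and integration constant are arbitrary here.

```python
import numpy as np

F1, F2 = 150e6, 400e6        # coherent beacon frequencies (Hz)
C = 2.998e8                  # speed of light (m/s)
K = 40.308                   # first-order ionospheric constant (m^3 s^-2)

def relative_tec(phi1, phi2):
    """Relative slant TEC (electrons/m^2, up to an arbitrary offset) from the
    carrier phases phi1, phi2 (radians) measured at F1 and F2.  The
    combination phi1 - (F1/F2)*phi2 cancels the non-dispersive geometric
    phase, leaving only the dispersive ionospheric term."""
    dphi = np.asarray(phi1) - (F1 / F2) * np.asarray(phi2)
    scale = C * F1 * F2**2 / (2.0 * np.pi * K * (F2**2 - F1**2))
    return dphi * scale
```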
GALARIO: a GPU accelerated library for analysing radio interferometer observations
NASA Astrophysics Data System (ADS)
Tazzari, Marco; Beaujean, Frederik; Testi, Leonardo
2018-06-01
We present GALARIO, a computational library that exploits the power of modern graphical processing units (GPUs) to accelerate the analysis of observations from radio interferometers like the Atacama Large Millimeter/submillimeter Array or the Karl G. Jansky Very Large Array. GALARIO speeds up the computation of synthetic visibilities from a generic 2D model image or a radial brightness profile (for axisymmetric sources). On a GPU, GALARIO is 150 times faster than standard PYTHON and 10 times faster than serial C++ code on a CPU. Highly modular, easy to use, and easy to adopt in existing code, GALARIO comes as two compiled libraries, one for Nvidia GPUs and one for multicore CPUs, where both have the same functions with identical interfaces. GALARIO comes with PYTHON bindings but can also be directly used in C or C++. The versatility and the speed of GALARIO open new analysis pathways that would otherwise be prohibitively time consuming, e.g. fitting high-resolution observations of a large number of objects, or entire spectral cubes of molecular gas emission. It is a general tool that can be applied to any field that uses radio interferometer observations. The source code is available online at http://github.com/mtazzari/galario under the open source GNU Lesser General Public License v3.
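The core operation, computing synthetic visibilities by Fourier transforming a model image and sampling the transform at the observed (u, v) points, can be sketched in a few lines. This is a slow NumPy illustration with assumed sign and orientation conventions, not GALARIO's API or its GPU implementation.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def sample_image(image, dxy, u, v):
    """Toy visibility sampler: FFT an (N, N) sky model with pixel size dxy
    (radians) and interpolate the transform at (u, v) given in wavelengths."""
    n = image.shape[0]
    vis_grid = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(image))) * dxy**2
    freqs = np.fft.fftshift(np.fft.fftfreq(n, d=dxy))     # spatial frequencies
    re = RegularGridInterpolator((freqs, freqs), vis_grid.real)
    im = RegularGridInterpolator((freqs, freqs), vis_grid.imag)
    pts = np.column_stack([v, u])       # grid rows correspond to the v axis
    return re(pts) + 1j * im(pts)

# Example: a Gaussian source sampled at a couple of baselines
n, dxy = 256, 1e-7                      # pixels, pixel size in radians
x = (np.arange(n) - n / 2) * dxy
xx, yy = np.meshgrid(x, x)
model = np.exp(-(xx**2 + yy**2) / (2 * (5 * dxy) ** 2))
print(sample_image(model, dxy, u=np.array([0.0, 2e4]), v=np.array([0.0, 1e4])))
```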
GenomeD3Plot: a library for rich, interactive visualizations of genomic data in web applications.
Laird, Matthew R; Langille, Morgan G I; Brinkman, Fiona S L
2015-10-15
A simple static image of genomes and associated metadata is very limiting, as researchers expect rich, interactive tools similar to the web applications found in the post-Web 2.0 world. GenomeD3Plot is a lightweight visualization library written in JavaScript using the D3 library. GenomeD3Plot provides a rich API to allow the rapid visualization of complex genomic data using a convenient standards-based JSON configuration file. When integrated into existing web services, GenomeD3Plot allows researchers to interact with data, dynamically alter the view, or even resize or reposition the visualization in their browser window. In addition, GenomeD3Plot has built-in functionality to export any resulting genome visualization in PNG or SVG format for easy inclusion in manuscripts or presentations. GenomeD3Plot is being utilized in the recently released IslandViewer 3 (www.pathogenomics.sfu.ca/islandviewer/) to visualize predicted genomic islands with other genome annotation data. However, its features enable it to be more widely applicable for dynamic visualization of genomic data in general. GenomeD3Plot is licensed under the GNU-GPL v3 at https://github.com/brinkmanlab/GenomeD3Plot/. brinkman@sfu.ca. © The Author 2015. Published by Oxford University Press.
Sharing electronic structure and crystallographic data with ETSF_IO
NASA Astrophysics Data System (ADS)
Caliste, D.; Pouillon, Y.; Verstraete, M. J.; Olevano, V.; Gonze, X.
2008-11-01
We present a library of routines whose main goal is to read and write exchangeable files (NetCDF file format) storing electronic structure and crystallographic information. It is based on the specification agreed inside the European Theoretical Spectroscopy Facility (ETSF). Accordingly, this library is nicknamed ETSF_IO. The purpose of this article is to give both an overview of the ETSF_IO library and a closer look at its usage. ETSF_IO is designed to be robust and easy to use, close to Fortran read and write routines. To facilitate its adoption, a complete documentation of the input and output arguments of the routines is available in the package, as well as six tutorials explaining in detail various possible uses of the library routines. Catalogue identifier: AEBG_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEBG_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Gnu Lesser General Public License No. of lines in distributed program, including test data, etc.: 63 156 No. of bytes in distributed program, including test data, etc.: 363 390 Distribution format: tar.gz Programming language: Fortran 95 Computer: All systems with a Fortran95 compiler Operating system: All systems with a Fortran95 compiler Classification: 7.3, 8 External routines: NetCDF, http://www.unidata.ucar.edu/software/netcdf Nature of problem: Store and exchange electronic structure data and crystallographic data independently of the computational platform, language and generating software Solution method: Implement a library based both on NetCDF file format and an open specification (http://etsf.eu/index.php?page=standardization)
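The kind of exchangeable file this targets can be illustrated with a small, generic NetCDF example. The sketch uses the netCDF4 Python package purely for illustration (ETSF_IO itself is a Fortran 95 library), and the dimension and variable names below are assumptions, not the ETSF specification.

```python
import numpy as np
from netCDF4 import Dataset

# Write a toy density-on-a-grid file, then read it back.
with Dataset("density.nc", "w") as nc:
    for name, size in (("x", 32), ("y", 32), ("z", 32)):
        nc.createDimension(name, size)
    rho = nc.createVariable("density", "f8", ("x", "y", "z"))
    rho.units = "electrons/Bohr^3"                 # illustrative metadata
    rho[:] = np.random.default_rng(0).random((32, 32, 32))

with Dataset("density.nc") as nc:
    print(nc.variables["density"].shape, nc.variables["density"].units)
```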
The Structure of the Library Market for Scientific Journals: The Case of Chemistry.
ERIC Educational Resources Information Center
Bensman, Stephen J.
1996-01-01
An analysis of price and scientific value of chemistry journals concluded that scientific value does not play a role in the pricing of scientific journals and that consequently little relationship exists between scientific value and the prices charged libraries for journals. Describes a software package, Serials Evaluator, being developed at…
CLIPS++: Embedding CLIPS into C++
NASA Technical Reports Server (NTRS)
Obermeyer, Lance; Miranker, Daniel P.
1994-01-01
This paper describes a set of C++ extensions to the CLIPS language and their embodiment in CLIPS++. These extensions and the implementation approach of CLIPS++ provide a new level of embeddability with C and C++. These extensions are a C++ include statement and a defcontainer construct: (include (c++-header-file.h)) and (defcontainer (c++-type)). The include construct allows C++ functions to be embedded in both the LHS and RHS of CLIPS rules. The header file in an include construct is the same header file the programmer uses for his/her own C++ code, independent of CLIPS. The defcontainer construct allows the inference engine to treat C++ class instances as CLIPS deftemplate facts. Consequently, existing C++ class libraries may be transparently imported into CLIPS. These C++ types may use advanced features like inheritance, virtual functions, and templates. The implementation has been tested with several class libraries, including Rogue Wave Software's Tools.h++, GNU's libg++, and USL's C++ Standard Components. The execution speed of CLIPS++ has been determined to be 5 to 700 times the execution speed of CLIPS 6.0 (10 to 20X typical).
NASA Astrophysics Data System (ADS)
Lundberg, J.; Conrad, J.; Rolke, W.; Lopez, A.
2010-03-01
A C++ class was written for the calculation of frequentist confidence intervals using the profile likelihood method. Seven combinations of Binomial, Gaussian and Poissonian uncertainties are implemented. The package provides routines for the calculation of upper and lower limits, sensitivity and related properties. It also supports hypothesis tests which take uncertainties into account. It can be used in compiled C++ code, in Python or interactively via the ROOT analysis framework. Program summary Program title: TRolke version 2.0 Catalogue identifier: AEFT_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFT_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: MIT license No. of lines in distributed program, including test data, etc.: 3431 No. of bytes in distributed program, including test data, etc.: 21 789 Distribution format: tar.gz Programming language: ISO C++. Computer: Unix, GNU/Linux, Mac. Operating system: Linux 2.6 (Scientific Linux 4 and 5, Ubuntu 8.10), Darwin 9.0 (Mac-OS X 10.5.8). RAM: ~20 MB Classification: 14.13. External routines: ROOT (http://root.cern.ch/drupal/) Nature of problem: The problem is to calculate a frequentist confidence interval on the parameter of a Poisson process with statistical or systematic uncertainties in signal efficiency or background. Solution method: Profile likelihood method, analytical. Running time: <10 seconds per extracted limit.
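To make the statistical technique concrete, here is a generic sketch of a profile-likelihood interval for a Poisson signal with a Gaussian-constrained background, using SciPy. It illustrates the method only and is not the TRolke API (which covers seven uncertainty models and is used through ROOT).

```python
import numpy as np
from scipy.optimize import minimize_scalar, brentq

def nll(s, b, n_obs, b0, sigma_b):
    """-2 log L for a Poisson count n_obs with mean s + b and a Gaussian
    constraint b0 +/- sigma_b on the background (constant terms dropped)."""
    mu = s + b
    return 2.0 * (mu - n_obs * np.log(mu)) + ((b - b0) / sigma_b) ** 2

def profiled(s, n_obs, b0, sigma_b):
    """Minimise over the nuisance parameter b for fixed signal s."""
    res = minimize_scalar(lambda b: nll(s, b, n_obs, b0, sigma_b),
                          bounds=(1e-9, b0 + 10.0 * sigma_b), method="bounded")
    return res.fun

def interval(n_obs, b0, sigma_b, delta=1.0):
    """~68.3% CL interval from -2 Delta log-likelihood <= 1 (chi^2, 1 dof)."""
    s_grid = np.linspace(1e-6, n_obs + 10.0 * np.sqrt(n_obs + 1.0), 400)
    prof = np.array([profiled(s, n_obs, b0, sigma_b) for s in s_grid])
    s_hat = s_grid[np.argmin(prof)]
    q = lambda s: profiled(s, n_obs, b0, sigma_b) - prof.min() - delta
    lower = brentq(q, s_grid[0], s_hat) if q(s_grid[0]) > 0 else 0.0
    upper = brentq(q, s_hat, s_grid[-1])
    return lower, upper

print(interval(n_obs=10, b0=3.0, sigma_b=1.0))
```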
Summer Events at the Scientific Library | Poster
If it’s summer, then it’s time for Jeopardy and videos at the National Cancer Institute at Frederick! Traditionally, the Scientific Library has hosted a Science Jeopardy Tournament in July and offered a Summer Video Series in June, July, and August. This year will be no different, as the Scientific Library will host the 11th Annual Student Science Jeopardy Tournament and a six-week series of film screenings.
[GNU Pattern: open source pattern hunter for biological sequences based on SPLASH algorithm].
Xu, Ying; Li, Yi-xue; Kong, Xiang-yin
2005-06-01
To construct a high-performance open source software engine based on the IBM SPLASH algorithm for later research on pattern discovery, GNU Pattern (Gpat) was developed using open source software. Gpat efficiently implements the core part of the SPLASH algorithm, and its full source code is available for other researchers to modify under the GNU license. Gpat is a successful implementation of the SPLASH algorithm and can be used as a basic framework for later research on pattern recognition in biological sequences.
Scientific and Technical Libraries in Kentucky.
ERIC Educational Resources Information Center
Powell, Russell H.; Gleim, David E.
Based on initial questionnaires, plus followup contacts and interviews, this survey documents for the first time the holdings, rates of growth, and information resources available at 72 of Kentucky's scientific and technical libraries. Included are library book collections that emphasize the business, economic, biological, physical, medical, and…
NCI at Frederick Scientific Library Reintroduces Scientific Publications Database | Poster
A 20-year-old database of scientific publications by NCI at Frederick, FNLCR, and affiliated employees has gotten a significant facelift. Maintained by the Scientific Library, the redesigned database—which is linked from each of the Scientific Library’s web pages—offers features that were not available in previous versions, such as additional search limits and non-traditional metrics for scholarly and scientific publishing known as altmetrics.
Space Missions: Long Term Preservation of IDL-based Software using GDL
NASA Astrophysics Data System (ADS)
Coulais, A.; Schellens, M.; Arabas, S.; Lenoir, M.; Noreskal, L.; Erard, S.
2012-09-01
GNU Data Language (GDL) is a free software clone of IDL, an interactive language that has been widely used in Astronomy and in space missions for decades. Proprietary status, license restrictions, price, sustainability and continuity of support for particular platforms are recurrent concerns in the Astronomy community, especially concerning space missions, which require long-term support. In this paper, we describe the key features of GDL and the main achievements from recent development work. We illustrate the maturity of GDL by presenting two examples of application: reading spectral cubes in PDS format and use of the HEALPix library. These examples support the main argument of the paper: that GDL has reached a level of maturity and usability ensuring long-term preservation of analysis capabilities for numerous ground experiments and space missions based on IDL.
BnmrOffice: A Free Software for β-nmr Data Analysis
NASA Astrophysics Data System (ADS)
Saadaoui, Hassan
A data-analysis framework with a graphical user interface (GUI) is developed to analyze β-nmr spectra in an automated and intuitive way. This program, named BnmrOffice, is written in C++ and employs the QT libraries and tools for designing the GUI, and CERN's Minuit optimization routines for minimization. The program runs under multiple platforms, and is available for free under the terms of the GNU GPL. The GUI is structured in tabs to search, plot and analyze data, along with other functionalities. The user can tweak the minimization options, and fit multiple data files (or runs) using single or global fitting routines with pre-defined or new models. Currently, BnmrOffice reads TRIUMF's MUD data and ASCII files, and can be extended to other formats.
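As a generic illustration of the kind of fit such a program performs (not BnmrOffice's actual models or its Minuit-based interface), a single-exponential relaxation of the asymmetry can be fitted with SciPy; the model form and parameter values below are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def asymmetry(t, a0, rate):
    """Assumed model: single-exponential relaxation of the beta-decay asymmetry."""
    return a0 * np.exp(-rate * t)

t = np.linspace(0.0, 10.0, 200)                                   # seconds
rng = np.random.default_rng(1)
data = asymmetry(t, 0.10, 0.5) + rng.normal(0.0, 0.005, t.size)   # synthetic run
popt, pcov = curve_fit(asymmetry, t, data, p0=[0.05, 1.0])
print("a0, rate =", popt, "+/-", np.sqrt(np.diag(pcov)))
```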
Visiting Old Libraries: Scientific Books in the Religious Institutions of Early Modern Portugal.
Giurgevich, Luana
2016-08-01
Knowledge of libraries and book collecting is a preliminary task for the characterisation of scientific culture and practice. In the case of Iberia, and especially Portugal, this is still a desideratum. This paper provides a first global look at this issue. In early modern Portugal, religious institutions organised impressive collections of books, by far the largest in the country. These libraries not only served the religious institutions themselves, but also supplied books to lesser libraries, such as the University Library of Coimbra and the Royal Library. The Portuguese book market mirrored the purchase and selection of books made by religious congregations. This was also true for the circulation of scientific books, which depended above all on the interests, choices and cultural relations of these most peculiar book collectors.
A Web-Based Library and Algorithm System for Satellite and Airborne Image Products
2011-06-28
Sequoia Scientific, Inc., and Dr. Paul Bissett at FERI, under other 6.1/6.2 program funding. ... A Web-Based Library And Algorithm System For ... of the spectrum matching approach to inverting hyperspectral imagery created by Drs. C. Mobley (Sequoia Scientific) and P. Bissett (FERI) ... algorithms developed by Sequoia Scientific and FERI. Testing and Implementation of Library: This project will result in the delivery of a WeoGeo
NASA Astrophysics Data System (ADS)
Gribov, I. A.; Trigger, S. A.
2016-11-01
A large-scale self-similar crystallized phase of a finite gravitationally neutral universe (GNU), a huge GNU-ball with a spherical 2D-boundary immersed into an endless empty 3D-space, is considered. The main assumptions of this universe model are: (1) existence of stable elementary particles-antiparticles with the opposite gravitational "charges" (M_gr^+ and M_gr^-), which have the same positive inertial mass M_in = |M_gr^±| ≥ 0 and are equally presented in the universe during all universe evolution epochs; (2) the gravitational interaction between the masses of the opposite "charges" is repulsive; (3) the unbroken baryon-antibaryon symmetry; (4) M_gr^+/M_gr^- "charges" symmetry, valid for two equally presented matter-antimatter GNU-components: (a) ordinary matter (OM)-ordinary antimatter (OAM), (b) dark matter (DM)-dark antimatter (DAM). The GNU-ball is weightless crystallized dust of equally presented, mutually repulsive (OM+DM) clusters and (OAM+DAM) anticlusters. Newtonian GNU-hydrodynamics gives the observable spatial flatness and ideal Hubble flow. The GNU in the obtained large-scale self-similar crystallized phase preserves the absence of cluster-anticluster collisions and simultaneously explains the observable large-scale universe phenomena: (1) the absence of matter-antimatter cluster annihilation, (2) the self-similar Hubble flow stability and homogeneity, (3) flatness, (4) bubble and cosmic-net structures as 3D-2D-1D decrystallization phases with decelerative (a ≤ 0) and accelerative (a ≥ 0) expansion epochs, (5) the dark energy (DE) phenomena with Λ_vacuum = 0, (6) the DE and DM fine-tuning nature, and predicts (7) evaporation into isolated huge M_gr^± superclusters without a Big Rip.
ERIC Educational Resources Information Center
Marcondes, Carlos Henrique; Sayao, Luis Fernando; Diaz, Paloma; Gibbons, Susan; Pinfield, Stephen; Kenning, Arlitsch; Edge, Karen; Yapp, L.; Witten, Ian H.
2003-01-01
Includes six articles that focus on practical uses of technologies developed from digital library research in the areas of education and scholarship reflecting the international impact of digital library research initiatives. Includes the Scientific Electronic Library Online (SciELO) (Brazil); the National Science Foundation (NSF) (US); the Joint…
Iskander, John; Bang, Gail; Stupp, Emma; Connick, Kathy; Gomez, Onnalee; Gidudu, Jane
2016-01-01
To describe scientific information usage and publication patterns of the Centers for Disease Control and Prevention (CDC) Public Health Library and Information Center patrons. Administratively collected patron usage data and aggregate data on CDC-authored publications from the CDC Library for 3 consecutive years were analyzed. The CDC Public Health Library and Information Center, which serves CDC employees nationally and internationally. Internal patrons and external users of the CDC Library. Three-year trends in full-text article publication and downloads including most common journals used for each purpose, systematic literature searches requested and completed, and subscriptions to a weekly public health current literature awareness service. From 2011 to 2013, CDC scientists published a total of 7718 articles in the peer-reviewed literature. During the same period, article downloads from the CDC Library increased 25% to more than 1.1 million, completed requests for reviews of the scientific literature increased by 34%, and electronic subscriptions to literature compilation services increased by 23%. CDC's scientific output and information use via the CDC Library are both increasing. Researchers and field staff are making greater use of literature review services and other customized information content delivery. Virtual public health library access is an increasingly important resource for the scientific practice of public health.
JUDE: An Ultraviolet Imaging Telescope pipeline
NASA Astrophysics Data System (ADS)
Murthy, J.; Rahna, P. T.; Sutaria, F.; Safonova, M.; Gudennavar, S. B.; Bubbly, S. G.
2017-07-01
The Ultraviolet Imaging Telescope (UVIT) was launched as part of the multi-wavelength Indian AstroSat mission on 28 September, 2015 into a low Earth orbit. A 6-month performance verification (PV) phase ended in March 2016, and the instrument is now in the general observing phase. UVIT operates in three channels: visible, near-ultraviolet (NUV) and far-ultraviolet (FUV), each with a choice of broad and narrow band filters, and has NUV and FUV gratings for low-resolution spectroscopy. We have written a software package (JUDE) to convert the Level 1 data from UVIT into scientifically useful photon lists and images. The routines are written in the GNU Data Language (GDL) and are compatible with the IDL software package. We use these programs in our own scientific work, and will continue to update the programs as we gain better understanding of the UVIT instrument and its performance. We have released JUDE under an Apache License.
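The central data-reduction step, turning a photon event list into an image, amounts to 2D binning of the event coordinates. The NumPy sketch below (written in Python rather than GDL, with an assumed 512x512 detector grid and synthetic events) shows the idea.

```python
import numpy as np

rng = np.random.default_rng(0)
n_events = 10000
x = rng.uniform(0.0, 512.0, n_events)        # synthetic detector x coordinates
y = rng.uniform(0.0, 512.0, n_events)        # synthetic detector y coordinates

# Bin the photon list onto a 512 x 512 image grid
image, xedges, yedges = np.histogram2d(x, y, bins=512,
                                       range=[[0.0, 512.0], [0.0, 512.0]])
print(image.shape, int(image.sum()))         # (512, 512) and the event count
```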
Frame Decoder for Consultative Committee for Space Data Systems (CCSDS)
NASA Technical Reports Server (NTRS)
Reyes, Miguel A. De Jesus
2014-01-01
GNU Radio is a free and open source development toolkit that provides signal processing blocks to implement software radios. It can be used with low-cost external RF hardware to create software-defined radios, or without hardware in a simulation-like environment. GNU Radio applications are primarily written in Python and C++. The Universal Software Radio Peripheral (USRP) is a computer-hosted software radio designed by Ettus Research. The USRP connects to a host computer via high-speed Gigabit Ethernet. Using the open source Universal Hardware Driver (UHD), we can run GNU Radio applications using the USRP. An SDR is a "radio in which some or all physical layer functions are software defined" (IEEE definition). A radio is any kind of device that wirelessly transmits or receives radio frequency (RF) signals. An SDR is a radio communication system where components that have typically been implemented in hardware are implemented in software. GNU Radio has a generic packet decoder block that is not optimized for CCSDS frames. Using this generic packet decoder would add bytes to the CCSDS frames and would not permit bit error correction using Reed-Solomon. The CCSDS frames consist of 256 bytes, including a 32-bit sync marker (0x1ACFFC1D). These frames are generated by the Space Data Processor, and GNU Radio performs the modulation and framing operations, including frame synchronization.
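A minimal sketch of the frame-synchronisation step described above: scan a byte stream for the 32-bit sync marker 0x1ACFFC1D and slice out 256-byte frames. Real decoders additionally handle bit slips, polarity inversion and Reed-Solomon decoding, none of which is shown here.

```python
SYNC = bytes.fromhex("1ACFFC1D")   # 32-bit CCSDS attached sync marker
FRAME_LEN = 256                    # frame length in bytes, per the text above

def extract_frames(stream: bytes):
    """Return every 256-byte frame that starts with the sync marker."""
    frames = []
    i = stream.find(SYNC)
    while i != -1 and i + FRAME_LEN <= len(stream):
        frames.append(stream[i:i + FRAME_LEN])
        i = stream.find(SYNC, i + FRAME_LEN)
    return frames

# Example: two zero-filled frames embedded in filler bytes
frame = SYNC + bytes(FRAME_LEN - len(SYNC))
stream = b"\x55" * 17 + frame + b"\xaa" * 5 + frame
print(len(extract_frames(stream)))   # -> 2
```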
Component Technology for High-Performance Scientific Simulation Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Epperly, T; Kohn, S; Kumfert, G
2000-11-09
We are developing scientific software component technology to manage the complexity of modern, parallel simulation software and increase the interoperability and re-use of scientific software packages. In this paper, we describe a language interoperability tool named Babel that enables the creation and distribution of language-independent software libraries using interface definition language (IDL) techniques. We have created a scientific IDL that focuses on the unique interface description needs of scientific codes, such as complex numbers, dense multidimensional arrays, complicated data types, and parallelism. Preliminary results indicate that in addition to language interoperability, this approach provides useful tools for thinking about the design of modern object-oriented scientific software libraries. Finally, we also describe a web-based component repository called Alexandria that facilitates the distribution, documentation, and re-use of scientific components and libraries.
Mechanisation and Automation of Information Library Procedures in the USSR.
ERIC Educational Resources Information Center
Batenko, A. I.
Scientific and technical libraries represent a fundamental link in a complex information storage and retrieval system. The handling of a large volume of scientific and technical data and provision of information library services requires the utilization of computing facilities and automation equipment, and was started in the Soviet Union on a…
Scientific Library Offers New Training Options | Poster
The Scientific Library is expanding its current training opportunities by offering webinars, allowing employees to take advantage of trainings from the comfort of their own offices. Due to the nature of their work, some employees find it inconvenient to attend in-person training classes; others simply prefer to use their own computers. The Scientific Library has been experimenting with webinar sessions since 2016 and expanded the service in 2017. Now, due to the popularity of webinars, it plans to offer even more webinar training sessions.
ERIC Educational Resources Information Center
Lage, Kathryn; Losoff, Barbara; Maness, Jack
2011-01-01
Increasingly libraries are expected to play a role in scientific data curation initiatives, i.e., "the management and preservation of digital data over the long-term." This case study offers a novel approach for identifying researchers who are receptive toward library involvement in data curation. The authors interviewed researchers at…
Open source pipeline for ESPaDOnS reduction and analysis
NASA Astrophysics Data System (ADS)
Martioli, Eder; Teeple, Doug; Manset, Nadine; Devost, Daniel; Withington, Kanoa; Venne, Andre; Tannock, Megan
2012-09-01
OPERA is a Canada-France-Hawaii Telescope (CFHT) open source collaborative software project currently under development for an ESPaDOnS echelle spectro-polarimetric image reduction pipeline. OPERA is designed to be fully automated, performing calibrations and reduction, producing one-dimensional intensity and polarimetric spectra. The calibrations are performed on two-dimensional images. Spectra are extracted using an optimal extraction algorithm. While primarily designed for CFHT ESPaDOnS data, the pipeline is being written to be extensible to other echelle spectrographs. A primary design goal is to make use of fast, modern object-oriented technologies. Processing is controlled by a harness, which manages a set of processing modules that make use of a collection of native OPERA software libraries and standard external software libraries. The harness and modules are completely parametrized by site configuration and instrument parameters. The software is open-ended, permitting users of OPERA to extend the pipeline capabilities. All these features have been designed to provide a portable infrastructure that facilitates collaborative development, code re-usability and extensibility. OPERA is free software with support for both GNU/Linux and MacOSX platforms. The pipeline is hosted on SourceForge under the name "opera-pipeline".
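A minimal, hypothetical sketch of the harness/module pattern described above: modules are looked up by name and executed in the order given by a site/instrument configuration, sharing one parameter set. The module names and parameters below are invented for illustration and are not OPERA's actual modules.

```python
def bias_subtract(data, params):
    """Toy calibration module."""
    return {**data, "bias_subtracted": True}

def extract_spectrum(data, params):
    """Toy extraction module producing a 1D 'spectrum'."""
    return {**data, "spectrum": [1.0, 2.0, 3.0]}

MODULES = {"bias": bias_subtract, "extract": extract_spectrum}

def run_harness(config, data):
    """Run the configured modules in order, passing the shared parameters."""
    for name in config["module_order"]:
        data = MODULES[name](data, config["instrument_params"])
    return data

config = {"module_order": ["bias", "extract"],
          "instrument_params": {"gain": 1.3, "readout_noise": 4.2}}
print(run_harness(config, {"raw": "frame_0001"}))
```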
Two C++ Libraries for Counting Trees on a Phylogenetic Terrace.
Biczok, R; Bozsoky, P; Eisenmann, P; Ernst, J; Ribizel, T; Scholz, F; Trefzer, A; Weber, F; Hamann, M; Stamatakis, A
2018-05-08
The presence of terraces in phylogenetic tree space, that is, a potentially large number of distinct tree topologies that have exactly the same analytical likelihood score, was first described by Sanderson et al. (2011). However, popular software tools for maximum likelihood and Bayesian phylogenetic inference do not yet routinely report whether inferred phylogenies reside on a terrace. We believe this is due to the lack of an efficient library to (i) determine if a tree resides on a terrace, (ii) calculate how many trees reside on a terrace, and (iii) enumerate all trees on a terrace. In our bioinformatics practical, which is set up as a programming contest, we developed two efficient and independent C++ implementations of the SUPERB algorithm by Constantinescu and Sankoff (1995) for counting and enumerating trees on a terrace. Both implementations yield exactly the same results, are more than one order of magnitude faster, and require one order of magnitude less memory than a previous third-party Python implementation. The source codes are available under the GNU GPL at https://github.com/terraphast. Alexandros.Stamatakis@h-its.org. Supplementary data are available at Bioinformatics online.
FOAM: the modular adaptive optics framework
NASA Astrophysics Data System (ADS)
van Werkhoven, T. I. M.; Homs, L.; Sliepen, G.; Rodenhuis, M.; Keller, C. U.
2012-07-01
Control software for adaptive optics systems is mostly custom built and very specific in nature. We have developed FOAM, a modular adaptive optics framework for controlling and simulating adaptive optics systems in various environments. Portability is provided both for different control hardware and for different adaptive optics setups. To achieve this, FOAM is written in C++ and runs on standard CPUs. Furthermore, we use standard Unix libraries and compilation procedures and implemented a hardware abstraction layer in FOAM. We have successfully implemented FOAM on the adaptive optics system of ExPo, a high-contrast imaging polarimeter developed at our institute, in the lab and will test it on-sky in late June 2012. We also plan to implement FOAM on adaptive optics systems for microscopy and solar adaptive optics. FOAM is available under the GNU GPL license and is free to be used by anyone.
ERIC Educational Resources Information Center
Tyshkevich, N. I.
Emphasis is placed on the role of scientific and technical libraries in the education of Soviet workers. One of the main tasks of technical libraries is to educate workers to respect their professions, and to maintain communistic attitudes towards labor. Librarians acquaint younger workers with literature on the history of their plants with…
Scientific writing and editing: a new role for the library.
Stephens, P A; Campbell, J M
1995-01-01
Traditional library instruction programs teach scientists how to find and manage information, but not how to report their research findings effectively. Since 1990, the William H. Welch Medical Library has sponsored classes on scientific writing and, since 1991, has offered a fee-based editing service for affiliates of the Johns Hopkins Medical Institutions. These programs were designed to fill an educational gap: Although formal instruction was offered to support other phases of the scientific communication process, the medical institutions had no central resource designed to help scientists develop and improve their writing skills. The establishment of such a resource at Welch has been well received by the community. Attendance at classes has grown steadily, and in 1993 a credit course on biomedical writing was added to the curriculum. The editing service, introduced in late 1991, has generated more requests for assistance than can be handled by the library's editor. This service not only extends the library's educational outreach but also generates a revenue stream. The Welch program in scientific writing and editing, or elements of it, could provide a model for other academic medical libraries interested in moving in this new direction. PMID:8547910
Creating a Canonical Scientific and Technical Information Classification System for NCSTRL+
NASA Technical Reports Server (NTRS)
Tiffany, Melissa E.; Nelson, Michael L.
1998-01-01
The purpose of this paper is to describe the new subject classification system for the NCSTRL+ project. NCSTRL+ is a canonical digital library (DL) based on the Networked Computer Science Technical Report Library (NCSTRL). The current NCSTRL+ classification system uses the NASA Scientific and Technical (STI) subject classifications, which has a bias towards the aerospace, aeronautics, and engineering disciplines. Examination of other scientific and technical information classification systems showed similar discipline-centric weaknesses. Traditional, library-oriented classification systems represented all disciplines, but were too generalized to serve the needs of a scientific and technically oriented digital library. Lack of a suitable existing classification system led to the creation of a lightweight, balanced, general classification system that allows the mapping of more specialized classification schemes into the new framework. We have developed the following classification system to give equal weight to all STI disciplines, while being compact and lightweight.
Controlador para un Reloj GPS de Referencia en el Protocolo NTP
NASA Astrophysics Data System (ADS)
Hauscarriaga, F.; Bareilles, F. A.
The synchronization between computers in a local network plays a very important role in environments similar to the IAR's. Calculations of the exact time are needed before, during and after an observation. For this purpose, the IAR's GNU/Linux Software Development Team implemented a driver within the NTP protocol (an Internet standard for time synchronization of computers) for a GPS receiver acquired a few years ago by the IAR, which previously had no support in that protocol. Today our Institute has a stable and reliable time base, synchronized to the atomic clocks on board GPS satellites according to the standard for computer synchronization, offering precise time services to the whole scientific community and particularly to the University of La Plata. FULL TEXT IN SPANISH
Scientific Library to Hold Annual Winter Video Series | Poster
The Scientific Library is getting ready for its Annual Winter Video Series. Beginning on Monday, January 9 and concluding on Friday, February 17, the Winter Video Series will consist of two different PBS programs, each with three episodes.
Scrapbooking at the Scientific Library | Poster
This year, the Scientific Library marks the 25th anniversary of its NCI at Frederick scrapbook, the pages of which tell the story of a quarter century of cancer research and engagement with both the Frederick community and the world.
ERIC Educational Resources Information Center
International Federation of Library Associations, The Hague (Netherlands).
Papers on Third World academic, research, and medical libraries and their role in scientific and technical information transfer, which were presented at the 1983 UNESCO/IFLA (United Nations Educational, Scientific, and Cultural Organization/International Federation of Library Associations) seminar, include: (1) "Development of Effective…
ERIC Educational Resources Information Center
International Federation of Library Associations, The Hague (Netherlands).
Papers on scientific/technical information and libraries presented at the 1984 IFLA general conference include: (1) "Library Ethics and the Special Library Network in Science and Technology" (Dieter Schmidmaier, East Germany); (2) "The Dissemination of Patent Information by Libraries: An Example Demonstrating the Necessity of…
lsjk—a C++ library for arbitrary-precision numeric evaluation of the generalized log-sine functions
NASA Astrophysics Data System (ADS)
Kalmykov, M. Yu.; Sheplyakov, A.
2005-10-01
Generalized log-sine functions Ls_j^(k)(θ) appear in the higher order ɛ-expansion of different Feynman diagrams. We present an algorithm for the numerical evaluation of these functions for real arguments. This algorithm is implemented as a C++ library with arbitrary-precision arithmetic for integer 0 ≤ k ≤ 9 and j ≥ 2. Some new relations and representations of the generalized log-sine functions are given. Program summary Title of program: lsjk Catalogue number: ADVS Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVS Program obtained from: CPC Program Library, Queen's University of Belfast, N. Ireland Licensing terms: GNU General Public License Computers: all Operating systems: POSIX Programming language: C++ Memory required to execute: Depending on the complexity of the problem, at least 32 MB RAM recommended No. of lines in distributed program, including testing data, etc.: 41 975 No. of bytes in distributed program, including testing data, etc.: 309 156 Distribution format: tar.gz Other programs called: The CLN library for arbitrary-precision arithmetic is required at version 1.1.5 or greater External files needed: none Nature of the physical problem: Numerical evaluation of the generalized log-sine functions for real argument in the region 0 < θ < π. These functions appear in Feynman integrals Method of solution: Series representation for the real argument in the region 0 < θ < π Restriction on the complexity of the problem: Limited up to Ls_j^(9)(θ), where j is an arbitrary integer number. Thus, all functions up to weight 12 in the region 0 < θ < π can be evaluated. The algorithm can be extended to higher values of k (k > 9) without modification Typical running time: Depending on the complexity of the problem. See text below.
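For a rough double-precision cross-check, the functions can also be evaluated directly from the standard integral definition Ls_j^(k)(θ) = -∫_0^θ x^k ln^(j-k-1)|2 sin(x/2)| dx. The quadrature sketch below is only an illustration and provides none of the arbitrary-precision arithmetic that is the point of the lsjk library.

```python
import numpy as np
from scipy.integrate import quad

def ls(j, k, theta):
    """Generalised log-sine integral Ls_j^(k)(theta), 0 < theta < pi,
    evaluated by direct quadrature of its integral definition."""
    def integrand(x):
        return x**k * np.log(np.abs(2.0 * np.sin(0.5 * x))) ** (j - k - 1)
    value, _ = quad(integrand, 0.0, theta, limit=200)
    return -value

print(ls(2, 0, np.pi / 3))   # lowest-weight case, for a quick sanity check
print(ls(4, 1, np.pi / 2))
```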
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The Library Services Alliance is a unique multi-type library consortium committed to resource sharing. As a voluntary association of university and governmental laboratory libraries supporting scientific research, the Alliance has become a leader in New Mexico in using cooperative ventures to cost-effectively expand resources supporting their scientific and technical communities. During 1994, the alliance continued to expand on their strategic planning foundation to enhance access to research information for the scientific and technical communities. Significant progress was made in facilitating easy access to the on-line catalogs of member libraries via connections through the Internet. Access to Alliance resources is now available via the World Wide Web and Gopher, as well as links to other databases and electronic information. This report highlights the accomplishments of the Alliance during calendar year 1994.
Jflow: a workflow management system for web applications.
Mariette, Jérôme; Escudié, Frédéric; Bardou, Philippe; Nabihoudine, Ibouniyamine; Noirot, Céline; Trotard, Marie-Stéphane; Gaspin, Christine; Klopp, Christophe
2016-02-01
Biologists produce large data sets and are in need of rich and simple web portals in which they can upload and analyze their files. Providing such tools requires masking the complexity induced by the required High Performance Computing (HPC) environment. The connection between interface and computing infrastructure is usually specific to each portal. With Jflow, we introduce a Workflow Management System (WMS) composed of jQuery plug-ins, which can easily be embedded in any web application, and a Python library providing all requested features to set up, run and monitor workflows. Jflow is available under the GNU General Public License (GPL) at http://bioinfo.genotoul.fr/jflow. The package comes with full documentation, a quick start guide and a running test portal. Jerome.Mariette@toulouse.inra.fr. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
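A generic sketch of the server-side job such a WMS performs, running components in dependency order, is shown below. This is not Jflow's API; the component names and the dependency graph are invented for illustration.

```python
from graphlib import TopologicalSorter   # Python >= 3.9

def fastqc(upstream):   return {"report": "qc.html"}
def trim(upstream):     return {"reads": "trimmed.fq"}
def align(upstream):    return {"bam": "aln.bam"}

COMPONENTS = {"fastqc": fastqc, "trim": trim, "align": align}
DEPENDS = {"fastqc": set(), "trim": {"fastqc"}, "align": {"trim"}}

results = {}
for name in TopologicalSorter(DEPENDS).static_order():
    results[name] = COMPONENTS[name](results)   # each step sees earlier outputs
print(results)
```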
High performance geospatial and climate data visualization using GeoJS
NASA Astrophysics Data System (ADS)
Chaudhary, A.; Beezley, J. D.
2015-12-01
GeoJS (https://github.com/OpenGeoscience/geojs) is an open-source library developed to support interactive scientific and geospatial visualization of climate and earth science datasets in a web environment. GeoJS has a convenient application programming interface (API) that enables users to harness the fast performance of WebGL and Canvas 2D APIs with sophisticated Scalable Vector Graphics (SVG) features in a consistent and convenient manner. We started the project in response to the need for an open-source JavaScript library that can combine traditional geographic information systems (GIS) and scientific visualization on the web. Many libraries, some of which are open source, support mapping or other GIS capabilities, but lack the features required to visualize scientific and other geospatial datasets. For instance, such libraries are not capable of rendering climate plots from NetCDF files, and some libraries are limited with regard to geoinformatics (infovis in a geospatial environment). While libraries such as d3.js are extremely powerful for these kinds of plots, in order to integrate them into other GIS libraries, the construction of geoinformatics visualizations must be completed manually and separately, or the code must somehow be mixed in an unintuitive way. We developed GeoJS with the following motivations: • To create an open-source geovisualization and GIS library that combines scientific visualization with GIS and informatics • To develop an extensible library that can combine data from multiple sources and render them using multiple backends • To build a library that works well with existing scientific visualization tools such as VTK. We have successfully deployed GeoJS-based applications for multiple domains across various projects. The ClimatePipes project funded by the Department of Energy, for example, used GeoJS to visualize NetCDF datasets from climate data archives. Other projects built visualizations using GeoJS for interactively exploring data and analysis regarding 1) the human trafficking domain, 2) New York City taxi drop-offs and pick-ups, and 3) the Ebola outbreak. GeoJS supports advanced visualization features such as picking and selecting, as well as clustering. It also supports 2D contour plots, vector plots, heat maps, and geospatial graphs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis, Darren S.; Peterson, Elena S.; Oehmen, Chris S.
2008-05-04
This work presents the ScalaBLAST Web Application (SWA), a web based application implemented using the PHP script language, MySQL DBMS, and Apache web server under a GNU/Linux platform. SWA is an application built as part of the Data Intensive Computer for Complex Biological Systems (DICCBS) project at the Pacific Northwest National Laboratory (PNNL). SWA delivers accelerated throughput of bioinformatics analysis via high-performance computing through a convenient, easy-to-use web interface. This approach greatly enhances emerging fields of study in biology such as ontology-based homology, and multiple whole genome comparisons which, in the absence of a tool like SWA, require a heroic effort to overcome the computational bottleneck associated with genome analysis. The current version of SWA includes a user account management system, a web based user interface, and a backend process that generates the files necessary for the Internet scientific community to submit a ScalaBLAST parallel processing job on a dedicated cluster.
A Constant Envelope OFDM Implementation on GNU Radio
2015-02-02
more advanced schemes like Decision Feedback Equalization or Turbo Equalization must be implemented to avoid the noise enhancement that all linear ... block is coded in C++, and uses a phase unwrapping algorithm similar to MATLAB's unwrap() function. To avoid false wraps propagating throughout the ... outperform the real-time GNU Radio implementation at higher SNRs. While the unequalized experiment with the MATLAB processor usually stayed within 5
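Phase unwrapping is the step the fragment above refers to. A toy demodulation of a constant-envelope, phase-modulated baseband signal with NumPy's unwrap (the analogue of MATLAB's unwrap()) looks as follows; a single sine stands in for the OFDM message waveform, and the sample rate and modulation index are assumed.

```python
import numpy as np

fs = 8000.0                               # sample rate (Hz), assumed
t = np.arange(0.0, 0.1, 1.0 / fs)
message = np.sin(2.0 * np.pi * 50.0 * t)  # stand-in for the OFDM waveform
h = 3.0                                   # modulation index, assumed

tx = np.exp(1j * h * message)             # constant-envelope phase modulation
rx = tx * np.exp(1j * 0.3)                # arbitrary fixed channel phase offset

phase = np.unwrap(np.angle(rx))           # remove 2*pi wraps in the raw phase
recovered = (phase - phase.mean()) / h    # crude removal of the phase offset
print(float(np.max(np.abs(recovered - message))))   # ~0: message recovered
```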
The Astronomy Collections: From the Project to the Laboratory
NASA Astrophysics Data System (ADS)
Bobis, L.
2015-04-01
Within some astronomical libraries, just as it is with other libraries, there are collections we might refer to as being in "the border zone." The materials most representative of this are those that relate to an institution's heritage and history. The challenges of these patrimonial collections are scientific, legal, economic, and political. These collections establish the scientific status of their respective libraries because they extend beyond meeting the needs of astronomers: the material is important in defining the history of the field. The influence of these libraries derives from these heritage materials. From this point of view, the library is a worksite and a laboratory for librarians, project managers, and researchers.
Identifying Opportunities in Citizen Science for Academic Libraries
ERIC Educational Resources Information Center
Cohen, Cynthia M.; Cheney, Liz; Duong, Khue; Lea, Ben; Unno, Zoe Pettway
2015-01-01
Citizen science projects continue to grow in popularity, providing opportunities for nonexpert volunteers to contribute to and become personally invested in rigorous scientific research. Academic libraries, aiming to promote and provide tools and resources to master scientific and information literacy, can support these efforts. While few examples…
Acquisition of Scientific Literature in Developing Countries. 2: Malaysia.
ERIC Educational Resources Information Center
Taib, Rosna
1989-01-01
Describes the acquisition of scientific literature by academic libraries in Malaysia. The discussion covers the impact of government policies, library acquisition policies, the selection process, acquisition of special materials, the role of gifts and exchanges, and problems with customs clearance and censorship. Progress in cooperative…
Science and Technology Libraries Section. Special Libraries Division. Papers.
ERIC Educational Resources Information Center
International Federation of Library Associations, The Hague (Netherlands).
Papers on scientific/technical information and libraries, which were presented at the 1983 International Federation of Library Associations (IFLA) conference, include: (1) "Patents as Information--An Unused Resource" by Richard D. Walker (United States); (2) "Survey of the Information Services of the Library of the German Patent…
ERIC Educational Resources Information Center
Foster, Barbara
1974-01-01
Israel is sprinkled with a noteworthy representation of special libraries which run the gamut from modest kibbutz efforts to highly technical scientific and humanities libraries. A few examples are discussed here. (Author/CH)
Scientific Library Will Hold 16th Annual Book and Media Swap | Poster
The Scientific Library has begun collecting materials for the 16th Annual Book and Media Swap and will continue to do so through Tuesday, October 25. Opening day for the Swap is Wednesday, October 26, and the event will continue through Wednesday, November 30.
The moving mesh code SHADOWFAX
NASA Astrophysics Data System (ADS)
Vandenbroucke, B.; De Rijcke, S.
2016-07-01
We introduce the moving mesh code SHADOWFAX, which can be used to evolve a mixture of gas, subject to the laws of hydrodynamics and gravity, and any collisionless fluid only subject to gravity, such as cold dark matter or stars. The code is written in C++ and its source code is made available to the scientific community under the GNU Affero General Public Licence. We outline the algorithm and the design of our implementation, and demonstrate its validity through the results of a set of basic test problems, which are also part of the public version. We also compare SHADOWFAX with a number of other publicly available codes using different hydrodynamical integration schemes, illustrating the advantages and disadvantages of the moving mesh technique.
GNU Radio Sandia Utilities v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, Jacob; Knee, Peter
This software adds a data handling module to the GNU Radio (GR) software defined radio (SDR) framework, as well as some general-purpose function blocks (filters, metadata control, etc.). This software is useful for processing bursty RF transmissions with GR, and serves as a base for applying SDR signal processing techniques to a whole burst of data at a time, as opposed to the streaming data model that GR has primarily been focused on.
Web-Based Library and Algorithm System for Satellite and Airborne Image Products
2011-01-01
the spectrum matching approach to inverting hyperspectral imagery created by Drs. C. Mobley (Sequoia Scientific) and P. Bissett (FERI). ... matching algorithms developed by Sequoia Scientific and FERI. Testing and Implementation of Library: This project will result in the delivery of a ... transitioning VSW algorithms developed by Dr. Curtis D. Mobley at Sequoia Scientific, Inc., and Dr. Paul Bissett at FERI, under other 6.1/6.2 program funding.
IQM: An Extensible and Portable Open Source Application for Image and Signal Analysis in Java
Kainz, Philipp; Mayrhofer-Reinhartshuber, Michael; Ahammer, Helmut
2015-01-01
Image and signal analysis applications are substantial in scientific research. Both open source and commercial packages provide a wide range of functions for image and signal analysis, which are sometimes supported very well by the communities in the corresponding fields. Commercial software packages have the major drawback of being expensive and having undisclosed source code, which hampers extending the functionality if there is no plugin interface or similar option available. However, both variants cannot cover all possible use cases and sometimes custom developments are unavoidable, requiring open source applications. In this paper we describe IQM, a completely free, portable and open source (GNU GPLv3) image and signal analysis application written in pure Java. IQM does not depend on any natively installed libraries and is therefore runnable out-of-the-box. Currently, a continuously growing repertoire of 50 image and 16 signal analysis algorithms is provided. The modular functional architecture based on the three-tier model is described along the most important functionality. Extensibility is achieved using operator plugins, and the development of more complex workflows is provided by a Groovy script interface to the JVM. We demonstrate IQM’s image and signal processing capabilities in a proof-of-principle analysis and provide example implementations to illustrate the plugin framework and the scripting interface. IQM integrates with the popular ImageJ image processing software and is aiming at complementing functionality rather than competing with existing open source software. Machine learning can be integrated into more complex algorithms via the WEKA software package as well, enabling the development of transparent and robust methods for image and signal analysis. PMID:25612319
42 CFR 4.3 - Purpose of the Library.
Code of Federal Regulations, 2012 CFR
2012-10-01
... medical and related sciences and aid the dissemination and exchange of scientific and other information important to the progress of medicine and the public health. The Library acquires and maintains library...
42 CFR 4.3 - Purpose of the Library.
Code of Federal Regulations, 2014 CFR
2014-10-01
... medical and related sciences and aid the dissemination and exchange of scientific and other information important to the progress of medicine and the public health. The Library acquires and maintains library...
42 CFR 4.3 - Purpose of the Library.
Code of Federal Regulations, 2011 CFR
2011-10-01
... medical and related sciences and aid the dissemination and exchange of scientific and other information important to the progress of medicine and the public health. The Library acquires and maintains library...
42 CFR 4.3 - Purpose of the Library.
Code of Federal Regulations, 2013 CFR
2013-10-01
... medical and related sciences and aid the dissemination and exchange of scientific and other information important to the progress of medicine and the public health. The Library acquires and maintains library...
ERIC Educational Resources Information Center
International Federation of Library Associations, The Hague (Netherlands).
The 23 papers in this collection include papers on special libraries and miscellaneous contributed papers: (1) "Networking Potentialities and Limitations--Special Library Networks in Socialist Countries--An Overview, and the Main Ways of Perestroika in the Work of Scientific and Technical Libraries at the Present Stage" (D. Schmidmaier…
76 FR 31621 - National Library of Medicine; Notice of Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-01
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health National Library of Medicine... personal privacy. Name of Committee: National Library of Medicine Special Emphasis Panel, T15 Review. Date..., Scientific Review Administrator, Division of Extramural Programs, National Library of Medicine, National...
150th Anniversary of the Astronomical Observatory Library of Sciences
NASA Astrophysics Data System (ADS)
Solntseva, T.
The scientific library of the Astronomical Observatory of Kyiv Taras Shevchenko University is one of the oldest of its type in Ukraine. Our Astronomical Observatory and its scientific library will celebrate the 150th anniversary of their foundation. The library was founded on 900 duplicate volumes from Olbers' private library. These were acquired by the Russian Academy of Sciences for the Poulkovo observatory in 1841 but, on Struve's order, were transferred to the Kyiv Saint Volodymyr University. These books are of great value; among them are works published during the lifetimes of Copernicus, Kepler, Galilei, Newton and Descartes. Our library contains more than 100000 units of storage: monographs, astronomical periodicals from their first issues (Astronomische Nachrichten, Astronomical Journal, Monthly Notices, etc.), editions of the majority of the astronomical observatories and institutions of the world, and unique astronomical atlases and maps.
Basic Training Programme for Library Technicians in Mexico.
ERIC Educational Resources Information Center
Vilentchuk, Lydia
The Consejo Nacional de Ciencia y Tecnologia (CONACYT), set up in 1971 to further scientific and technological advancement in Mexico, commissioned this determination of the steps necessary to promote the use of libraries and recorded scientific and technical information, and to foster the reading habits of the population. A brief overview examines…
Only for "purely scientific" institutions: the Medical Library Association's Exchange, 1898-1950s.
Connor, Jennifer J
2011-04-01
Centralized exchanges of scientific materials existed by the late nineteenth century, but they did not include medical publications. North American medical leaders therefore formed an association of institutions to run their own exchange: the Medical Library Association (MLA). After providing background to the exchange concept and the importance of institutional members for MLA, this article examines archival MLA correspondence to consider the role of its Exchange in the association's professional development before the 1950s. MLA's membership policy admitted only libraries open to the medical profession with a large number of volumes. But the correspondence of the MLA Executive Committee reveals that the committee constantly adjusted the definition of library membership: personal, public, sectarian, commercial, allied science, and the then-termed "colored" medical school libraries all were denied membership. Study of these decisions, using commercial and sectarian libraries as a focus, uncovers the primary justification for membership exclusions: a goal of operating a scientific exchange. Also, it shows that in this way, MLA shadowed policies and actions of the American Medical Association. Finally, the study suggests that the medical profession enforced its policies of exclusion through MLA, despite a proclaimed altruistic sharing of medical literature.
SCIFIO: an extensible framework to support scientific image formats.
Hiner, Mark C; Rueden, Curtis T; Eliceiri, Kevin W
2016-12-07
No gold standard exists in the world of scientific image acquisition; a proliferation of instruments each with its own proprietary data format has made out-of-the-box sharing of that data nearly impossible. In the field of light microscopy, the Bio-Formats library was designed to translate such proprietary data formats to a common, open-source schema, enabling sharing and reproduction of scientific results. While Bio-Formats has proved successful for microscopy images, the greater scientific community was lacking a domain-independent framework for format translation. SCIFIO (SCientific Image Format Input and Output) is presented as a freely available, open-source library unifying the mechanisms of reading and writing image data. The core of SCIFIO is its modular definition of formats, the design of which clearly outlines the components of image I/O to encourage extensibility, facilitated by the dynamic discovery of the SciJava plugin framework. SCIFIO is structured to support coexistence of multiple domain-specific open exchange formats, such as Bio-Formats' OME-TIFF, within a unified environment. SCIFIO is a freely available software library developed to standardize the process of reading and writing scientific image formats.
EggLib: processing, analysis and simulation tools for population genetics and genomics.
De Mita, Stéphane; Siol, Mathieu
2012-04-11
With the considerable growth of available nucleotide sequence data over the last decade, integrated and flexible analytical tools have become a necessity. In particular, in the field of population genetics, there is a strong need for automated and reliable procedures to conduct repeatable and rapid polymorphism analyses, coalescent simulations, data manipulation and estimation of demographic parameters under a variety of scenarios. In this context, we present EggLib (Evolutionary Genetics and Genomics Library), a flexible and powerful C++/Python software package providing efficient and easy to use computational tools for sequence data management and extensive population genetic analyses on nucleotide sequence data. EggLib is a multifaceted project involving several integrated modules: an underlying computationally efficient C++ library (which can be used independently in pure C++ applications); two C++ programs; a Python package providing, among other features, a high level Python interface to the C++ library; and the egglib script which provides direct access to pre-programmed Python applications. EggLib has been designed aiming to be both efficient and easy to use. A wide array of methods are implemented, including file format conversion, sequence alignment edition, coalescent simulations, neutrality tests and estimation of demographic parameters by Approximate Bayesian Computation (ABC). Classes implementing different demographic scenarios for ABC analyses can easily be developed by the user and included to the package. EggLib source code is distributed freely under the GNU General Public License (GPL) from its website http://egglib.sourceforge.net/ where a full documentation and a manual can also be found and downloaded.
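As a worked illustration of one routine such a package automates, the short Python sketch below computes nucleotide diversity (the mean pairwise difference per site) for a toy alignment. It uses only the standard library and does not call EggLib itself; the sequences are made-up placeholders.

```python
# A self-contained illustration (not EggLib code) of a basic polymorphism
# statistic: nucleotide diversity (pi), the mean pairwise difference per site
# across aligned sequences. The toy alignment below is made-up data.
from itertools import combinations

def nucleotide_diversity(aligned):
    """Average proportion of differing sites over all pairs of sequences."""
    length = len(aligned[0])
    assert all(len(s) == length for s in aligned), "sequences must be aligned"
    pairs = list(combinations(aligned, 2))
    diffs = sum(sum(a != b for a, b in zip(s1, s2)) for s1, s2 in pairs)
    return diffs / (len(pairs) * length)

toy_alignment = ["ATGCATGC", "ATGCATGA", "ATGTATGA"]
print(round(nucleotide_diversity(toy_alignment), 4))  # 0.1667 for this toy set
```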
Scientific and Information Activities of Institute's Libraries.
ERIC Educational Resources Information Center
Portnov, N.
An analysis of the information service of the Leningrad Institute of Railway Transportation Engineer's libraries led to the following conclusions: Optimal information service for all basic needs of the users in an institute can be ensured by the combined efforts of the libraries and their information services. University libraries should supply…
DataCite - Making data sets citable
NASA Astrophysics Data System (ADS)
Brase, J.
2013-12-01
The scientific and information communities have largely mastered the presentation of, and linkages between, text-based electronic information by assigning persistent identifiers to give scientific literature unique identities and accessibility. Knowledge, as published through scientific literature, is, however, often the last step in a process originating from scientific research data. Today scientists are using simulation, observational, and experimentation techniques that yield massive quantities of research data. These data are analysed, synthesised, interpreted, and the outcome of this process is generally published as a scientific article. Access to the original data as the foundation of knowledge has become an important issue throughout the world and different projects have started to find solutions. Global collaboration and scientific advances could be accelerated through broader access to scientific research data. In other words, data access could be revolutionized through the same technologies used to make textual literature accessible. The most obvious opportunity to broaden visibility of and access to research data is to integrate its access into the medium where it is most often cited: electronic textual information. Besides this opportunity, it is important, irrespective of where they are cited, for research data to have an internet identity. Since 2005, the German National Library of Science and Technology (TIB) has offered a successful Digital Object Identifier (DOI) registration service for persistent identification of research data. Since 2010, these services have been offered by the global consortium DataCite, carried by 17 member organisations from 12 different countries: the German National Library of Science and Technology (TIB), the German National Library of Medicine (ZB MED), the German National Library of Economics (ZBW) and the German GESIS - Leibniz Institute for the Social Sciences. Additional European members are: the Library of ETH Zürich in Switzerland, the Library of TU Delft from the Netherlands, the Institut de l'Information Scientifique et Technique (INIST) from France, the Technical Information Center of Denmark, the British Library, the Swedish National Data Service (SND), and the Conferenza dei Rettori delle Università Italiane (CRUI) from Italy. North America is represented through the California Digital Library, the Office of Scientific and Technical Information (OSTI), Purdue University and the Canada Institute for Scientific and Technical Information (CISTI). Furthermore, the Australian National Data Service (ANDS) and the National Research Council of Thailand (NRCT) are members. DataCite offers, through its members, DOI registration for data centers; currently, over 2 million objects have been registered with a DOI name and are available through a central search portal at http://search.datacite.org. Based on DOI registration, DataCite offers a variety of services such as a detailed statistics portal of the number of DOI names registered and resolved (http://stats.datacite.org). In June 2012 DataCite and the STM association (http://www.stm-assoc.org) signed a joint statement to encourage publishers and data centers to link articles and underlying data (http://www.datacite.org/node/65).
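For readers who want to retrieve metadata for a DataCite-registered DOI programmatically, the hedged Python sketch below queries the DataCite REST API (api.datacite.org), which postdates the search portal named above; the endpoint layout is assumed from the current public API, and the DOI shown is only a placeholder that may not resolve.

```python
# A hedged sketch of a DataCite metadata lookup. The REST endpoint and the
# JSON:API response layout are assumptions based on the current public API;
# the example DOI is a placeholder, not a real dataset.
import requests

doi = "10.5072/example-full"   # placeholder DOI (10.5072 is a test prefix)
resp = requests.get(f"https://api.datacite.org/dois/{doi}",
                    headers={"Accept": "application/vnd.api+json"}, timeout=30)
if resp.ok:
    attrs = resp.json()["data"]["attributes"]
    print(attrs.get("titles"), attrs.get("publicationYear"))
else:
    print("lookup failed:", resp.status_code)
```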
50th Anniversary of the Central Dental Library of School of Dental Medicine University of Zagreb.
Borić, Vesna
2014-12-01
Libraries have an exceptional place in the history, culture, education and scientific life of a nation. They collect all aspects of our linguistics and literacy, all our theoretical assumptions as well as all the results of experience and practice. The importance of a library is not mirrored only in its national and historical role and heritage, but in a more permanent, informational role, since a modern library must, above all, be an effective information system. Since a university library operates as a part of its matrix institution, it is easily overshadowed by other forms of educational and scientific infrastructure. The 50th anniversary of the Central Dental Library of the School of Dental Medicine University of Zagreb is an excellent opportunity to call the attention of the institution and the public to its unique and irreplaceable role.
"A Scientific Library of Some Value": An Early History of the Australian Museum Library
ERIC Educational Resources Information Center
Stephens, Matthew
2007-01-01
The Australian Museum, Sydney, is Australia's oldest museum, internationally recognised for its longstanding scientific contributions. Less well-known is the Museum's fine collection of monographs and journals relating to natural history and anthropology, which has been used to support the work of Museum staff and external enquirers since the late…
ERIC Educational Resources Information Center
Alexander, Jennifer K.; Pradenas, Lorena; Parada, Victor; Scherer, Robert F.
2012-01-01
Access to published research for knowledge creation and education in the administrative science disciplines in South America has been enhanced since the introduction of the Scientific Electronic Library Online (SciELO). Although SciELO has been available as an online journal indexing and publication service since 1998, there have been no…
Evaluation of HITRAN 2012 H2O linelist
NASA Astrophysics Data System (ADS)
Toon, Geoffrey C.
2014-06-01
The HITRAN 2012 H2O linelist has been evaluated in spectral regions used for ground-based remote sensing, such as the NDACC and TCCON networks. Both atmospheric and laboratory spectra have been used in the evaluation, which covers selected regions in the mid-IR and near-IR. Results are compared with some other linelists.
Rapid Prototyping of Application Specific Signal Processors (RASSP)
1993-12-23
Excerpt from the report's tool inventory: Cadre Teamwork (ECAD framework), CodeCenter (CenterLine), dbx/dbxtool (UNIX C debuggers), Falcon (Mentor), FrameMaker (Frame Technology, word processing), gcc (GNU C/C++ compiler), and gprof (GNU software profiling tool). An organization can put its own documentation on-line using the BOLD Composer for FrameMaker. The AMPLE programming language is a C-like language used for...
Flynn, Allen J; Bahulekar, Namita; Boisvert, Peter; Lagoze, Carl; Meng, George; Rampton, James; Friedman, Charles P
2017-01-01
Throughout the world, biomedical knowledge is routinely generated and shared through primary and secondary scientific publications. However, there is too much latency between publication of knowledge and its routine use in practice. To address this latency, what is actionable in scientific publications can be encoded to make it computable. We have created a purpose-built digital library platform to hold, manage, and share actionable, computable knowledge for health called the Knowledge Grid Library. Here we present it with its system architecture.
The Management of the Scientific Information Environment: The Role of the Research Library Web Site.
ERIC Educational Resources Information Center
Arte, Assunta
2001-01-01
Describes the experiences of the Italian National Research Council Library staff in the successful development and implementation of its Web site. Discusses electronic information sources that interface with the Web site; library services; technical infrastructure; and the choice of a Web-based library management system. (Author/LRW)
A-Track: A New Approach for Detection of Moving Objects in FITS Images
NASA Astrophysics Data System (ADS)
Kılıç, Yücel; Karapınar, Nurdan; Atay, Tolga; Kaplan, Murat
2016-07-01
Small planet and asteroid observations are important for understanding the origin and evolution of the Solar System. In this work, we have developed a fast and robust pipeline, called A-Track, for detecting asteroids and comets in sequential telescope images. The moving objects are detected using a modified line detection algorithm, called ILDA. We have coded the pipeline in Python 3, where we have made use of various scientific modules in Python to process the FITS images. We tested the code on photometric data taken by an SI-1100 CCD with a 1-meter telescope at TUBITAK National Observatory, Antalya. The pipeline can be used to analyze large data archives or daily sequential data. The code is hosted on GitHub under the GNU GPL v3 license.
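The sketch below is not A-Track's own API; it is a generic Python/astropy illustration of the kind of processing such a pipeline starts from: reading sequential FITS frames and differencing consecutive exposures so that moving objects stand out. File names are placeholders.

```python
# Generic FITS handling with astropy (not A-Track's internal modules):
# read sequential frames and subtract consecutive images, so that static
# stars largely cancel while a moving object leaves a +/- residual pair.
import glob
import numpy as np
from astropy.io import fits

frames = sorted(glob.glob("night1/frame_*.fits"))      # placeholder path
images = []
for name in frames:
    with fits.open(name) as hdul:
        images.append(hdul[0].data.astype(float))      # primary HDU image

for prev, curr in zip(images, images[1:]):
    diff = curr - prev
    print("max |difference| =", np.max(np.abs(diff)))
```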
ERIC Educational Resources Information Center
Weisbrod, David L.
This booklet, one of a series of background papers for the White House Conference, explores the potential of new technologies to improve library services while reducing library costs. Separate subsections describe the application of technology to the following library functions: acquisitions, catalogs and cataloging, serials control, circulation…
ERIC Educational Resources Information Center
Bensman, Stephen J.; Wilder, Stanley J.
1998-01-01
Analyzes the structure of the library market for scientific and technical (ST) serials. Describes an exercise aimed at a theoretical reconstruction of the ST-serials holdings of Louisiana State University (LSU) Libraries. Discusses the set definitions, measures, and algorithms necessary in the design of a computer program to appraise ST serials.…
Porting the Starlink Software Collection to GNU Autotools
NASA Astrophysics Data System (ADS)
Gray, N.; Jenness, T.; Allan, A.; Berry, D. S.; Currie, M. J.; Draper, P. W.; Taylor, M. B.; Cavanagh, B.
2005-12-01
The Starlink software collection currently runs on three different Unix platforms and contains around 100 separate software items, totaling 2.5 million lines of code, in a mixture of languages. We have changed the build system from a hand-maintained collection of makefiles with hard-wired OS variants to a scheme involving feature-discovery via GNU Autoconf. As a result of this work, we have already ported the collection to Mac OS X and Cygwin. This had some unexpected benefits and costs, and valuable lessons.
Application of the GNU Radio platform in the multistatic radar
NASA Astrophysics Data System (ADS)
Szlachetko, Boguslaw; Lewandowski, Andrzej
2009-06-01
This document presents the application of the Software Defined Radio-based platform in the multistatic radar. This platform consists of four-sensor linear antenna, Universal Software Radio Peripheral (USRP) hardware (radio frequency frontend) and GNU-Radio PC software. The paper provides information about architecture of digital signal processing performed by USRP's FPGA (digital down converting blocks) and PC host (implementation of the multichannel digital beamforming). The preliminary results of the signal recording performed by our experimental platform are presented.
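A minimal GNU Radio flowgraph in Python is sketched below to illustrate the platform pattern; it is not the authors' radar processing chain. A simulated complex source stands in for the USRP front end (which in the real platform would come from gr-uhd), and the null sink stands in for the beamforming and recording stages.

```python
# A minimal GNU Radio (3.7+ Python API) flowgraph sketch, not the authors'
# multistatic radar chain: simulated source -> throttle -> null sink.
from gnuradio import gr, blocks, analog

class MinimalFlowgraph(gr.top_block):
    def __init__(self, samp_rate=1e6):
        gr.top_block.__init__(self, "minimal flowgraph")
        # Complex cosine stands in for the USRP front end (gr-uhd in practice).
        src = analog.sig_source_c(samp_rate, analog.GR_COS_WAVE, 10e3, 1.0)
        throttle = blocks.throttle(gr.sizeof_gr_complex, samp_rate)
        sink = blocks.null_sink(gr.sizeof_gr_complex)   # replace with real processing
        self.connect(src, throttle, sink)

if __name__ == "__main__":
    tb = MinimalFlowgraph()
    tb.start(); tb.stop(); tb.wait()   # start briefly, then shut down cleanly
```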
Masic, Izet; Begic, Edin
2016-12-01
Information technologies have found their application in virtually every branch of health care. In recent years they have demonstrated their potential in the development of online libraries, where scientists and researchers can share their latest findings. Academia.edu, ResearchGate, Mendeley and Kudos, with the support of the GoogleScholar platform, have indeed increased the visibility of an author's scientific work and enabled much greater availability of that work to a broader audience. Online libraries have allowed free access to scientific content for countries that could not bear the economic costs of access to certain scientific databases. The benefit has been especially great in countries in transition and in developing countries. Online libraries have great potential in terms of expanding knowledge, but they also present a major problem for many publishers, because the rights that authors sign over when publishing a paper can be violated. In the future this will lead to a major conflict between authors, editorial boards and online databases about the right to scientific content. This question certainly represents one of the most pressing issues of publishing, whose future in printed form is already in the past, while the future of online editions will be a large-scale problem.
Enabling Scientists: Serving Sci-Tech Library Users with Disabilities.
ERIC Educational Resources Information Center
Coonin, Bryna
2001-01-01
Discusses how librarians in scientific and technical libraries can contribute to an accessible electronic library environment for users with disabilities to ensure independent access to information. Topics include relevant assistive technologies; creating accessible Web pages; monitoring accessibility of electronic databases; preparing accessible…
42 CFR 4.3 - Purpose of the Library.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 1 2010-10-01 2010-10-01 false Purpose of the Library. 4.3 Section 4.3 Public... OF MEDICINE § 4.3 Purpose of the Library. The purpose of the Library is to assist the advancement of medical and related sciences and aid the dissemination and exchange of scientific and other information...
HTSeq--a Python framework to work with high-throughput sequencing data.
Anders, Simon; Pyl, Paul Theodor; Huber, Wolfgang
2015-01-15
A large choice of tools exists for many standard tasks in the analysis of high-throughput sequencing (HTS) data. However, once a project deviates from standard workflows, custom scripts are needed. We present HTSeq, a Python library to facilitate the rapid development of such scripts. HTSeq offers parsers for many common data formats in HTS projects, as well as classes to represent data, such as genomic coordinates, sequences, sequencing reads, alignments, gene model information and variant calls, and provides data structures that allow for querying via genomic coordinates. We also present htseq-count, a tool developed with HTSeq that preprocesses RNA-Seq data for differential expression analysis by counting the overlap of reads with genes. HTSeq is released as an open-source software under the GNU General Public Licence and available from http://www-huber.embl.de/HTSeq or from the Python Package Index at https://pypi.python.org/pypi/HTSeq. © The Author 2014. Published by Oxford University Press.
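The Python sketch below follows the counting pattern described in the HTSeq documentation (building a GenomicArrayOfSets of exons, then assigning reads to genes); the GTF and BAM file names are placeholders. The htseq-count tool adds options for strandedness and overlap-resolution modes on top of this basic loop.

```python
# A counting sketch following the pattern in the HTSeq documentation;
# "annotation.gtf" and "alignments.bam" are placeholder file names.
import collections
import HTSeq

exons = HTSeq.GenomicArrayOfSets("auto", stranded=False)
for feature in HTSeq.GFF_Reader("annotation.gtf"):
    if feature.type == "exon":
        exons[feature.iv] += feature.attr["gene_id"]

counts = collections.Counter()
for almnt in HTSeq.BAM_Reader("alignments.bam"):
    if not almnt.aligned:
        counts["_unmapped"] += 1
        continue
    gene_ids = set()
    for iv, step_genes in exons[almnt.iv].steps():
        gene_ids |= step_genes
    if len(gene_ids) == 1:
        counts[gene_ids.pop()] += 1          # unambiguous assignment
    else:
        counts["_no_feature" if not gene_ids else "_ambiguous"] += 1

for gene, n in counts.most_common(10):
    print(gene, n)
```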
NASA Astrophysics Data System (ADS)
Caliari, Marco; Zuccher, Simone
2017-04-01
Although Fourier series approximation is ubiquitous in computational physics owing to the Fast Fourier Transform (FFT) algorithm, efficient techniques for the fast evaluation of a three-dimensional truncated Fourier series at a set of arbitrary points are quite rare, especially in MATLAB language. Here we employ the Nonequispaced Fast Fourier Transform (NFFT, by J. Keiner, S. Kunis, and D. Potts), a C library designed for this purpose, and provide a Matlab® and GNU Octave interface that makes NFFT easily available to the Numerical Analysis community. We test the effectiveness of our package in the framework of quantum vortex reconnections, where pseudospectral Fourier methods are commonly used and local high resolution is required in the post-processing stage. We show that the efficient evaluation of a truncated Fourier series at arbitrary points provides excellent results at a computational cost much smaller than carrying out a numerical simulation of the problem on a sufficiently fine regular grid that can reproduce comparable details of the reconnecting vortices.
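As a reference point for what the NFFT accelerates, the plain-numpy sketch below evaluates a one-dimensional truncated Fourier series directly at arbitrary points; its cost is O(NM) for N modes and M points, versus roughly O(N log N + M) for the NFFT-based approach. It does not use the NFFT library or the authors' MATLAB/Octave interface.

```python
# Naive direct evaluation of a 1-D truncated Fourier series at arbitrary
# (non-equispaced) points, as a reference for what the NFFT speeds up.
import numpy as np

rng = np.random.default_rng(0)
N = 32                                   # number of Fourier modes
coeffs = rng.standard_normal(N) + 1j * rng.standard_normal(N)
k = np.arange(-N // 2, N // 2)           # frequency indices
x = rng.random(5)                        # arbitrary evaluation points in [0, 1)

# f(x_j) = sum_k c_k * exp(2*pi*i*k*x_j), evaluated directly for every point
f = np.exp(2j * np.pi * np.outer(x, k)) @ coeffs
print(f)
```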
Resource sharing of online teaching materials: The lon-capa project
NASA Astrophysics Data System (ADS)
Bauer, Wolfgang
2004-03-01
The use of information technology resources in conventional lecture-based courses, in distance-learning offerings, as well as in hybrid courses, is increasing. But this may put an additional burden on faculty, who are now asked to deliver this new content. Additionally, it may require the installation of commercial courseware systems, creating new financial licensing dependencies for colleges and universities. To address exactly these two problems, the lon-capa system was created as an open-source courseware system, licensed under the GNU General Public License, that allows for sharing of educational resources across institutional and disciplinary boundaries. This presentation will focus on both aspects of the system: the courseware capabilities that allow for customized environments for individual students, and the educational resources library that enables teachers to take full advantage of the work of their colleagues. Research results on learning effectiveness, resource and system usage patterns, and customization for different learning styles will be shown. Institutional perceptions of and responses to open source courseware systems will be discussed.
MDAnalysis: a toolkit for the analysis of molecular dynamics simulations.
Michaud-Agrawal, Naveen; Denning, Elizabeth J; Woolf, Thomas B; Beckstein, Oliver
2011-07-30
MDAnalysis is an object-oriented library for structural and temporal analysis of molecular dynamics (MD) simulation trajectories and individual protein structures. It is written in the Python language with some performance-critical code in C. It uses the powerful NumPy package to expose trajectory data as fast and efficient NumPy arrays. It has been tested on systems of millions of particles. Many common file formats of simulation packages including CHARMM, Gromacs, Amber, and NAMD and the Protein Data Bank format can be read and written. Atoms can be selected with a syntax similar to CHARMM's powerful selection commands. MDAnalysis enables both novice and experienced programmers to rapidly write their own analytical tools and access data stored in trajectories in an easily accessible manner that facilitates interactive explorative analysis. MDAnalysis has been tested on and works for most Unix-based platforms such as Linux and Mac OS X. It is freely available under the GNU General Public License from http://mdanalysis.googlecode.com. Copyright © 2011 Wiley Periodicals, Inc.
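A short usage sketch is given below; it uses the current select_atoms/radius_of_gyration spellings rather than the camelCase names of the 2011 release, and the topology/trajectory file names are placeholders.

```python
# A short MDAnalysis usage sketch (modern API names); the PSF/DCD file names
# are placeholders, not data from the paper.
import MDAnalysis as mda

u = mda.Universe("protein.psf", "trajectory.dcd")      # placeholder files
calphas = u.select_atoms("protein and name CA")        # CHARMM-style selection
for ts in u.trajectory:                                # iterate over frames
    print(ts.frame, calphas.radius_of_gyration())
```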
Experimental research control software system
NASA Astrophysics Data System (ADS)
Cohn, I. A.; Kovalenko, A. G.; Vystavkin, A. N.
2014-05-01
A software system, intended for the automation of small-scale research, has been developed. The software allows one to control equipment and to acquire and process data by means of simple scripts. The main purpose of the development is to make experiment automation easier, significantly reducing the effort needed to automate an experimental setup. In particular, minimal programming skills are required and supervisors can review scripts without difficulty. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, the control is performed by an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library performs interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework has been developed for fast implementation of new software and hardware interfaces. While the software is in continuous development with new features being implemented, it is already used in our laboratory for the automation of helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU General Public License.
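Since the abstract describes an imperative, script-driven style rather than a published API, the sketch below is purely hypothetical: every name in it (FakeThermometer, FakeHeater, the acquire routine and its target parameter) is invented for illustration and is not part of the actual software.

```python
# A purely hypothetical sketch of the imperative-script style described above;
# the classes and the 'acquire' routine are invented stand-ins, not the real API.
import time, random

class FakeThermometer:                       # stands in for a hardware driver
    def read(self):
        return 4.2 + random.uniform(-0.05, 0.05)

class FakeHeater:
    def set_power(self, fraction):
        print(f"heater power set to {fraction:.2f}")

def acquire(thermometer, heater, target, n_points, delay=0.1):
    """Simple bang-bang control loop with logging, written as a plain script routine."""
    log = []
    for _ in range(n_points):
        temperature = thermometer.read()
        heater.set_power(1.0 if temperature < target else 0.0)
        log.append(temperature)
        time.sleep(delay)
    return log

print(acquire(FakeThermometer(), FakeHeater(), target=4.2, n_points=3))
```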
AutoClickChem: click chemistry in silico.
Durrant, Jacob D; McCammon, J Andrew
2012-01-01
Academic researchers and many in industry often lack the financial resources available to scientists working in "big pharma." High costs include those associated with high-throughput screening and chemical synthesis. In order to address these challenges, many researchers have in part turned to alternate methodologies. Virtual screening, for example, often substitutes for high-throughput screening, and click chemistry ensures that chemical synthesis is fast, cheap, and comparatively easy. Though both in silico screening and click chemistry seek to make drug discovery more feasible, it is not yet routine to couple these two methodologies. We here present a novel computer algorithm, called AutoClickChem, capable of performing many click-chemistry reactions in silico. AutoClickChem can be used to produce large combinatorial libraries of compound models for use in virtual screens. As the compounds of these libraries are constructed according to the reactions of click chemistry, they can be easily synthesized for subsequent testing in biochemical assays. Additionally, in silico modeling of click-chemistry products may prove useful in rational drug design and drug optimization. AutoClickChem is based on the pymolecule toolbox, a framework that may facilitate the development of future python-based programs that require the manipulation of molecular models. Both the pymolecule toolbox and AutoClickChem are released under the GNU General Public License version 3 and are available for download from http://autoclickchem.ucsd.edu.
OneSearch Gives You Access to More Than 7,000 Publishers and Content Providers | Poster
By Robin Meckley, Contributing Writer OneSearch, an exciting new resource from the Scientific Library, is now available to the NCI at Frederick community. This new resource provides a quick and easy way to search multiple Scientific Library resources and collections using a single search box for journal articles, books, media, and more. A large central index is compiled from
CH5M3D: an HTML5 program for creating 3D molecular structures.
Earley, Clarke W
2013-11-18
While a number of programs and web-based applications are available for the interactive display of 3-dimensional molecular structures, few of these provide the ability to edit these structures. For this reason, we have developed a library written in JavaScript to allow for the simple creation of web-based applications that should run on any browser capable of rendering HTML5 web pages. While our primary interest in developing this application was for educational use, it may also prove useful to researchers who want a light-weight application for viewing and editing small molecular structures. Molecular compounds are drawn on the HTML5 Canvas element, with the JavaScript code making use of standard techniques to allow display of three-dimensional structures on a two-dimensional canvas. Information about the structure (bond lengths, bond angles, and dihedral angles) can be obtained using a mouse or other pointing device. Both atoms and bonds can be added or deleted, and rotation about bonds is allowed. Routines are provided to read structures either from the web server or from the user's computer, and creation of galleries of structures can be accomplished with only a few lines of code. Documentation and examples are provided to demonstrate how users can access all of the molecular information for creation of web pages with more advanced features. A light-weight (≈ 75 kb) JavaScript library has been made available that allows for the simple creation of web pages containing interactive 3-dimensional molecular structures. Although this library is designed to create web pages, a web server is not required. Installation on a web server is straightforward and does not require any server-side modules or special permissions. The ch5m3d.js library has been released under the GNU GPL version 3 open-source license and is available from http://sourceforge.net/projects/ch5m3d/.
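The "standard techniques" for drawing three-dimensional structures on a two-dimensional canvas mentioned above amount to rotating the coordinates and discarding the depth axis. The sketch below illustrates that projection with numpy (in Python rather than the library's JavaScript); the coordinates are toy data, not ch5m3d code.

```python
# A generic numpy illustration of projecting rotated 3-D atomic coordinates
# onto a 2-D canvas (orthographic projection); toy coordinates, not ch5m3d code.
import numpy as np

def rotation_y(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

atoms = np.array([[0.0, 0.0, 0.0],      # toy atom at the origin
                  [1.1, 0.0, 0.0],      # toy bond partner
                  [0.0, 1.1, 0.5]])
rotated = atoms @ rotation_y(np.pi / 6).T
screen_xy = rotated[:, :2]              # drop z for the 2-D canvas
depth = rotated[:, 2]                   # z can still order overlapping atoms
print(screen_xy, depth, sep="\n")
```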
More than 3,200 Books and DVDs Donated to Annual Book Swap | Poster
By Robin Meckley, Contributing Writer The Scientific Library’s 14th Annual Book and Media Swap, held on April 16 in the lobby of Building 549, proved to be a popular event. When the swap was rescheduled from fall 2013 to spring 2014, the library staff was uncertain if the response would be equal to previous years, said Sue Wilson, principal manager of the Scientific Library. NCI
Administering Our State Library Agencies
ERIC Educational Resources Information Center
DuFrane, Gerard
1970-01-01
A satire on the application of scientific management principles to a state library agency. Covers relationships of the state librarian to staff, the profession, and state and federal governments. (Author/JS)
E-Approval Plans in Research Libraries
ERIC Educational Resources Information Center
Pickett, Carmelita; Tabacaru, Simona; Harrell, Jeanne
2014-01-01
Research libraries have long invested in approval plan services, which offer an economical way to acquire scholarly and scientific publications. Traditional approval plans have evolved and now enable libraries to expand their e-book offerings to better serve researchers. Publishers offer a myriad of e-book purchasing options. These range from…
Digital Libraries: Situating Use in Changing Information Infrastructure.
ERIC Educational Resources Information Center
Bishop, Ann Peterson; Neumann, Laura J.; Star, Susan Leigh; Merkel, Cecelia; Ignacio, Emily; Sandusky, Robert J.
2000-01-01
Reviews empirical studies about how digital libraries evolve for use in scientific and technical work based on the Digital Libraries Initiative (DLI) at the University of Illinois. Discusses how users meet infrastructure and document disaggregation; describes use of the DLI testbed of full text journal articles; and explains research methodology.…
How to Use the Marine Realms Information Bank (MRIB) Digital Libraries
Lightsom, Frances L.; Allwardt, Alan O.
2009-01-01
Marine Realms Information Bank (MRIB) digital libraries provide access to free online scientific resources about oceans, coasts, and coastal watersheds. MRIB allows category, geographic, and keyword searching, alone or in combination. Instructions for searching the three MRIB libraries and for refining the searches are explained in detail.
NASA Astrophysics Data System (ADS)
Khademizadeh, Shahnaz
2012-08-01
The explosion of information communication technology (ICT) since the beginning of the 20th century has been rendering manual-based library systems in academic, research, special and public libraries less relevant. This is because using and implementing information communication technology in the library depends largely on librarians' attitudes toward the current digital age. This study examined the attitudinal correlates of selected scientific and research institute libraries in Iran towards the use and application of ICT in their libraries. A total of ten libraries from all forty-nine libraries in Iran formed the study's population. It is observed that 'Internet/intranet etc.' (1046; 67.5%) is the most important source through which users become aware of the modern information technologies used in their libraries. The vast majority of respondents (1313; 84.7%) answered that electronic sources make it 'easier' to gather and use information. The results indicate that there is a significant relationship between e-environment and collection development (χ2 = 62.86, p = 0.000). Findings further show that all of the librarians (9; 100%) felt that ICT application affects the library's collection development. Based on these findings, it is recommended that libraries in developing countries should consider training those librarians who do not have knowledge of ICT in order to remove the fear and anxiety hindering them from developing a good attitude towards the use of ICT in their libraries.
A main path domain map as digital library interface
NASA Astrophysics Data System (ADS)
Demaine, Jeffrey
2009-01-01
The shift to electronic publishing of scientific journals is an opportunity for the digital library to provide non-traditional ways of accessing the literature. One method is to use citation metadata drawn from a collection of electronic journals to generate maps of science. These maps visualize the communication patterns in the collection, giving the user an easy-to-grasp view of the semantic structure underlying the scientific literature. For this visualization to be understandable, the complexity of the citation network must be reduced through an algorithm. This paper describes the Citation Pathfinder application and its integration into a prototype digital library. This application generates small-scale citation networks that expand upon the search results of the digital library. These domain maps are linked to the collection, creating an interface that is based on the communication patterns in science. The Main Path Analysis technique is employed to simplify these networks into linear, sequential structures. By identifying patterns that characterize the evolution of the research field, Citation Pathfinder uses citations to give users a deeper understanding of the scientific literature.
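For readers unfamiliar with Main Path Analysis, the Python sketch below shows one common variant: weighting each citation edge by its Search Path Count (SPC) and then greedily following the heaviest edges through the network. It is a generic illustration on a toy DAG, not Citation Pathfinder's implementation; ties are broken by iteration order.

```python
# A minimal sketch of SPC-weighted (local) main path analysis on a toy citation
# DAG built with networkx; node names are invented, not data from the paper.
import networkx as nx

def spc_weights(g):
    """SPC of edge (u, v) = (#source-to-u paths) * (#v-to-sink paths)."""
    order = list(nx.topological_sort(g))
    n_from_source = {v: 1 if g.in_degree(v) == 0 else 0 for v in g}
    for v in order:                       # paths reaching v from any source
        for u in g.predecessors(v):
            n_from_source[v] += n_from_source[u]
    n_to_sink = {v: 1 if g.out_degree(v) == 0 else 0 for v in g}
    for v in reversed(order):             # paths from v to any sink
        for w in g.successors(v):
            n_to_sink[v] += n_to_sink[w]
    return {(u, v): n_from_source[u] * n_to_sink[v] for u, v in g.edges()}

def main_path(g):
    """Greedily follow the highest-SPC edges from the best-starting source."""
    w = spc_weights(g)
    sources = [v for v in g if g.in_degree(v) == 0 and g.out_degree(v) > 0]
    start = max(sources, key=lambda s: max(w[(s, t)] for t in g.successors(s)))
    path = [start]
    while list(g.successors(path[-1])):
        path.append(max(g.successors(path[-1]), key=lambda t: w[(path[-1], t)]))
    return path

g = nx.DiGraph([("A", "C"), ("B", "C"), ("C", "D"), ("C", "E"), ("E", "F")])
print(main_path(g))   # one highest-SPC path through the toy DAG
```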
ERIC Educational Resources Information Center
Summit, Roger K.; Firschein, Oscar
Eight public libraries participated in a two-year experiment to investigate the potential of the public library as a "linking agent" between the public and the many machine-readable data bases currently accessible using on line computer terminals. The investigation covered users of the service, impact on the library, conditions for…
Browse without a Browser at the ATRF Library | Poster
By Robin Meckley, Contributing Writer Employees at the Advanced Technology Research Facility (ATRF) asked the Scientific Library, and the library responded: print journals are now available in the ATRF Library. Employees can now browse 20 print journals, which will rotate, with one issue available at a time for each title. The library will also temporarily display some new books each week. ATRF employees may indicate their interest in these books by signing the wait lists.
Tahim, Arpan; Stokes, Oliver; Vedi, Vikas
2012-06-01
NHS Library Services are utilised by NHS staff and junior trainees to locate scientific papers that provide them with the evidence base required for modern medical practice. The cost of accessing articles can be considerable particularly for junior trainees. This survey looks at variations in cost of journal article loans and investigates access to particular orthopaedic journals across the country. A national survey of UK Health Libraries was performed. Access to and costs of journals and interlibrary loan services were assessed. Availability of five wide-reaching orthopaedic journals was investigated. Seven hundred and ten libraries were identified. One hundred and ten libraries completed the questionnaire (16.7%). Of these, 96.2% reported free access to scientific journals for users. 99.1% of libraries used interlibrary loan services with 38.2% passing costs on to the user at an average of £2.99 per article. 72.7% of libraries supported orthopaedic services. Journal of Bone and Joint Surgery (British) had greatest onsite availability. The study demonstrates fluctuations in cost of access to interlibrary loan services and variation in access to important orthopaedic journals. It provides a reflection of current policy of charging for the acquisition of medical evidence by libraries in the UK. © 2012 The authors. Health Information and Libraries Journal © 2012 Health Libraries Group.
From Sky to Archive: Long Term Management of Sky Survey Data
NASA Astrophysics Data System (ADS)
Darch, Peter T.; Sands, Ashley E.; Borgman, Christine; Golshan, Milena S.; Traweek, Sharon
2017-01-01
Sky survey data may remain scientifically valuable long beyond the end of a survey’s operational period, both for continuing inquiry and for calibrating and testing instruments for subsequent generations of surveys. Astronomy infrastructure has many stakeholders, including those concerned with data management. Research libraries are increasingly partnering with scholars to sustain access to data. The Sloan Digital Sky Survey (SDSS) was among the first major scientific projects to partner with libraries in this way, embarking on a data transfer process with two university libraries. We report on a qualitative case study of this process. Ideally, long-term sustainability of sky survey data would be a key part of planning and construction, but rarely does this occur. Teams are under pressure to deliver a project on time and on budget that produces high-quality data during its operational period, leaving few resources available to plan long-term data management. The difficulty of planning is further compounded by the complexity of predicting circumstances and needs of the astronomy community in future decades. SDSS team members regarded libraries, long-lived institutions concerned with access to scholarship, as a potential solution to long-term data sustainability. As the SDSS data transfer was the first of this scale attempted - 160 TB of data - astronomers and library staff were faced with scoping the range of activities involved. They spent two years planning this five-year process. While successful overall as demonstration projects, the libraries encountered many obstacles. We found that all parties experienced difficulty in articulating their notions of “scientific data,” “archiving,” “serving,” and “providing access” to the datasets. Activities and interpretations of the data transfer process varied by institutional motivations for participation and by available infrastructure. We conclude that several “library solutions” for long-term data management, rather than a single one, should be considered. Life cycle models popular in the library community are insufficient to conceptualize data management at this scale. We also identify institutional and policy challenges for curating large scientific datasets.
75 FR 33838 - National Environmental Policy Act; Scientific Balloon Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-15
... may be viewed at the following locations: (a) Fort Sumner Public Library, 235 West Sumner Avenue, Fort Sumner, New Mexico 88119 (575-355-2832). (b) Palestine Public Library, 1101 North Cedar Street, Palestine, Texas 75801 (903-729-4121). (c) NASA Headquarters Library, Room 1J20, 300 E Street, SW., Washington, DC...
intelligentCAPTURE 1.0 Adds Tables of Content to Library Catalogues and Improves Retrieval.
ERIC Educational Resources Information Center
Hauer, Manfred; Simedy, Walton
2002-01-01
Describes an online library catalog that was developed for an Austrian scientific library that includes table of contents in addition to the standard bibliographic information in order to increase relevance for searchers. Discusses the technology involved, including OCR (Optical Character Recognition) and automatic indexing techniques; weighted…
Orientation and Functions of Library in Quality Education of College
ERIC Educational Resources Information Center
Yang, Lan
2011-01-01
Quality education is the core of college education. Libraries serve as a second classroom for students due to their extremely important position and function in quality education. Libraries are the best place for cultivating students' morals, an important front for improving students' scientific and cultural qualities, and effective facilities for…
ERIC Educational Resources Information Center
Bjoernshauge, Lars
The traditional mode of operation of academic libraries is in crisis due to a combination of zero growth funding, rapidly escalating pricing on information resources (especially scientific journals), necessary investments in technology and human resource development, and increasing customer expectations. These issues are addressed as they relate…
ERIC Educational Resources Information Center
Turner, Judith Axler
1988-01-01
Soviet intelligence agents have been collecting scientific and technical documents in research libraries to identify emerging technology before its components become classified or restricted. Librarians are also recruited as spies. However, asking librarians to identify suspicious library users would violate ethics and intellectual freedom. (MSE)
A Quantum Leap : Innovation in the Evolving Digital Library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luce, R. E.
2002-01-01
It is an honor to give the Lazerow lecture tonight and to discuss digital library developments from the perspective of working at a national laboratory. Tonight I would like to consider what lies ahead given the evolution in scientific research, how that impacts the development of digital libraries, and finally, look at some of the challenges ahead of us. I'm particularly interested in giving this talk tonight because it provides an opportunity to talk to those of you who are students. You represent the next generation of professionals who will confront some of the challenges I will outline tonight, as well as those of you who are the mentors and teachers of the next generation. The two roles are pivotal in terms of the challenges on the horizon. Most of you are familiar with the information literacy challenges we face as a nation. As the library director of a national laboratory's science library, I am also acutely aware that we have a real problem with the lack of scientific literacy within the general population in this country, and it has a corresponding impact on decision-making in a technological society. Those of us engaged in supporting scientific research, or just generally interested, should be concerned about this fact because science and technology are at the foundation of our success as a nation in the 20th Century. For our nation to continue to be successful in the 21st Century, we will need to improve on the state of scientific literacy.
A general spectral method for the numerical simulation of one-dimensional interacting fermions
NASA Astrophysics Data System (ADS)
Clason, Christian; von Winckel, Gregory
2012-08-01
This software implements a general framework for the direct numerical simulation of systems of interacting fermions in one spatial dimension. The approach is based on a specially adapted nodal spectral Galerkin method, where the basis functions are constructed to obey the antisymmetry relations of fermionic wave functions. An efficient Matlab program for the assembly of the stiffness and potential matrices is presented, which exploits the combinatorial structure of the sparsity pattern arising from this discretization to achieve optimal run-time complexity. This program allows the accurate discretization of systems with multiple fermions subject to arbitrary potentials, e.g., for verifying the accuracy of multi-particle approximations such as Hartree-Fock in the few-particle limit. It can be used for eigenvalue computations or numerical solutions of the time-dependent Schrödinger equation. The new version includes a Python implementation of the presented approach. New version program summary Program title: assembleFermiMatrix Catalogue identifier: AEKO_v1_1 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKO_v1_1.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 332 No. of bytes in distributed program, including test data, etc.: 5418 Distribution format: tar.gz Programming language: MATLAB/GNU Octave, Python Computer: Any architecture supported by MATLAB, GNU Octave or Python Operating system: Any supported by MATLAB, GNU Octave or Python RAM: Depends on the data Classification: 4.3, 2.2. External routines: Python 2.7+, NumPy 1.3+, SciPy 0.10+ Catalogue identifier of previous version: AEKO_v1_0 Journal reference of previous version: Comput. Phys. Commun. 183 (2012) 405 Does the new version supersede the previous version?: Yes Nature of problem: The direct numerical solution of the multi-particle one-dimensional Schrödinger equation in a quantum well is challenging due to the exponential growth in the number of degrees of freedom with increasing particles. Solution method: A nodal spectral Galerkin scheme is used where the basis functions are constructed to obey the antisymmetry relations of the fermionic wave function. The assembly of these matrices is performed efficiently by exploiting the combinatorial structure of the sparsity patterns. Reasons for new version: A Python implementation is now included. Summary of revisions: Added a Python implementation; small documentation fixes in Matlab implementation. No change in features of the package. Restrictions: Only one-dimensional computational domains with homogeneous Dirichlet or periodic boundary conditions are supported. Running time: Seconds to minutes.
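The antisymmetry construction described above can be illustrated with a few lines of numpy; the sketch below (which is not the distributed assembleFermiMatrix code) builds two-fermion basis functions Psi_ij(x1, x2) = (phi_i(x1) phi_j(x2) - phi_j(x1) phi_i(x2)) / sqrt(2) from a toy set of one-particle functions and checks the antisymmetry numerically.

```python
# A generic numpy sketch (not the distributed assembleFermiMatrix code) of the
# antisymmetrisation idea: two-fermion basis functions built from pairs of
# one-particle functions so that Psi(x1, x2) = -Psi(x2, x1).
import numpy as np

n_grid, n_one = 64, 6
x = np.linspace(-1.0, 1.0, n_grid)
# One-particle basis: sine functions vanishing at the box ends (an assumption).
phi = np.array([np.sin((k + 1) * np.pi * (x + 1) / 2) for k in range(n_one)])

pairs = [(i, j) for i in range(n_one) for j in range(i + 1, n_one)]
psi = np.empty((len(pairs), n_grid, n_grid))
for m, (i, j) in enumerate(pairs):
    psi[m] = (np.outer(phi[i], phi[j]) - np.outer(phi[j], phi[i])) / np.sqrt(2)

# Antisymmetry check: swapping the two particle coordinates flips the sign.
assert np.allclose(psi, -np.swapaxes(psi, 1, 2))
print(len(pairs), "antisymmetric two-fermion basis functions on a", n_grid, "point grid")
```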
Léon, Grégory; Ouimet, Mathieu; Lavis, John N; Grimshaw, Jeremy; Gagnon, Marie-Pierre
2013-03-21
Evidence-informed health policymaking logically depends on timely access to research evidence. To our knowledge, despite the substantial political and societal pressure to enhance the use of the best available research evidence in public health policy and program decision making, there is no study addressing availability of peer-reviewed research in Canadian health ministries. To assess availability of (1) a purposive sample of high-ranking scientific journals, (2) bibliographic databases, and (3) health library services in the fourteen Canadian health ministries. From May to October 2011, we conducted a cross-sectional survey among librarians employed by Canadian health ministries to collect information relative to availability of scientific journals, bibliographic databases, and health library services. Availability of scientific journals in each ministry was determined using a sample of 48 journals selected from the 2009 Journal Citation Reports (Sciences and Social Sciences Editions). Selection criteria were: relevance for health policy based on scope note information about subject categories and journal popularity based on impact factors. We found that the majority of Canadian health ministries did not have subscription access to key journals and relied heavily on interlibrary loans. Overall, based on a sample of high-ranking scientific journals, availability of journals through interlibrary loans, online and print-only subscriptions was estimated at 63%, 28% and 3%, respectively. Health Canada had a 2.3-fold higher number of journal subscriptions than that of the provincial ministries' average. Most of the organisations provided access to numerous discipline-specific and multidisciplinary databases. Many organisations provided access to the library resources described through library partnerships or consortia. No professionally led health library environment was found in four out of fourteen Canadian health ministries (i.e. Manitoba Health, Northwest Territories Department of Health and Social Services, Nunavut Department of Health and Social Services and Yukon Department of Health and Social Services). There is inequity in availability of peer-reviewed research in the fourteen Canadian health ministries. This inequity could present a problem, as each province and territory is responsible for formulating and implementing evidence-informed health policies and services for the benefit of its population.
2013-01-01
Background Evidence-informed health policymaking logically depends on timely access to research evidence. To our knowledge, despite the substantial political and societal pressure to enhance the use of the best available research evidence in public health policy and program decision making, there is no study addressing availability of peer-reviewed research in Canadian health ministries. Objectives To assess availability of (1) a purposive sample of high-ranking scientific journals, (2) bibliographic databases, and (3) health library services in the fourteen Canadian health ministries. Methods From May to October 2011, we conducted a cross-sectional survey among librarians employed by Canadian health ministries to collect information relative to availability of scientific journals, bibliographic databases, and health library services. Availability of scientific journals in each ministry was determined using a sample of 48 journals selected from the 2009 Journal Citation Reports (Sciences and Social Sciences Editions). Selection criteria were: relevance for health policy based on scope note information about subject categories and journal popularity based on impact factors. Results We found that the majority of Canadian health ministries did not have subscription access to key journals and relied heavily on interlibrary loans. Overall, based on a sample of high-ranking scientific journals, availability of journals through interlibrary loans, online and print-only subscriptions was estimated at 63%, 28% and 3%, respectively. Health Canada had a 2.3-fold higher number of journal subscriptions than that of the provincial ministries’ average. Most of the organisations provided access to numerous discipline-specific and multidisciplinary databases. Many organisations provided access to the library resources described through library partnerships or consortia. No professionally led health library environment was found in four out of fourteen Canadian health ministries (i.e. Manitoba Health, Northwest Territories Department of Health and Social Services, Nunavut Department of Health and Social Services and Yukon Department of Health and Social Services). Conclusions There is inequity in availability of peer-reviewed research in the fourteen Canadian health ministries. This inequity could present a problem, as each province and territory is responsible for formulating and implementing evidence-informed health policies and services for the benefit of its population. PMID:23514333
PhyLIS: a simple GNU/Linux distribution for phylogenetics and phyloinformatics.
Thomson, Robert C
2009-07-30
PhyLIS is a free GNU/Linux distribution that is designed to provide a simple, standardized platform for phylogenetic and phyloinformatic analysis. The operating system incorporates most commonly used phylogenetic software, which has been pre-compiled and pre-configured, allowing for straightforward application of phylogenetic methods and development of phyloinformatic pipelines in a stable Linux environment. The software is distributed as a live CD and can be installed directly or run from the CD without making changes to the computer. PhyLIS is available for free at http://www.eve.ucdavis.edu/rcthomson/phylis/.
PhyLIS: A Simple GNU/Linux Distribution for Phylogenetics and Phyloinformatics
Thomson, Robert C.
2009-01-01
PhyLIS is a free GNU/Linux distribution that is designed to provide a simple, standardized platform for phylogenetic and phyloinformatic analysis. The operating system incorporates most commonly used phylogenetic software, which has been pre-compiled and pre-configured, allowing for straightforward application of phylogenetic methods and development of phyloinformatic pipelines in a stable Linux environment. The software is distributed as a live CD and can be installed directly or run from the CD without making changes to the computer. PhyLIS is available for free at http://www.eve.ucdavis.edu/rcthomson/phylis/. PMID:19812729
FROG: Time Series Analysis for the Web Service Era
NASA Astrophysics Data System (ADS)
Allan, A.
2005-12-01
The FROG application is part of the next-generation Starlink (http://www.starlink.ac.uk) software work (Draper et al. 2005) and is released under the GNU Public License (http://www.gnu.org/copyleft/gpl.html) (GPL). Written in Java, it has been designed for the Web and Grid Service era as an extensible, pluggable tool for time-series analysis and display. With an integrated SOAP server, the package's functionality is exposed for use in the user's own code and can be used remotely over the Grid, as part of the Virtual Observatory (VO).
2004-03-01
[Extraction residue from a technical report on honeynets: a hardware table listing Pentium III test systems (Honeynet, Generator) with 3Com 3C905 network cards running Debian GNU/Linux "unstable", citation fragments (e.g., "Debian GNU/Linux 3.0 Released," Debian News), and a summary noting that the remainder of the document is organized into four chapters, beginning with a literature review.]
ERIC Educational Resources Information Center
International Federation of Library Associations, The Hague (Netherlands).
Papers on public library service to rural areas, which were presented at the 1985 UNESCO/IFLA (United Nations Educational, Scientific, and Cultural Organization/International Federation of Library Associations) presession seminar are compiled here and include: (1) "Characteristics and Needs of Various Groups, Families, and Other Community…
Adaptation of Flux-Corrected Transport Algorithms for Modeling Dusty Flows.
1983-12-20
ERIC Educational Resources Information Center
International Federation of Library Associations and Institutions (NJ1), 2004
2004-01-01
This set of guidelines, for audiovisual and multimedia materials in libraries of all kinds and other appropriate institutions, is the product of many years of consultation and collaborative effort. As early as 1972, The UNESCO (United Nations Educational, Scientific and Cultural Organization) Public Library Manifesto had stressed the need for…
ERIC Educational Resources Information Center
Hamaker, Charles; Tagler, John
1988-01-01
The first article describes effects of the devaluation of the dollar on American libraries that purchase scientific, technical and medical serials published in Europe, and the responses of libraries and library associations. The second presents reasons for increased serial prices from the perspective of the publishing industry. (15…
An Array Library for Microsoft SQL Server with Astrophysical Applications
NASA Astrophysics Data System (ADS)
Dobos, L.; Szalay, A. S.; Blakeley, J.; Falck, B.; Budavári, T.; Csabai, I.
2012-09-01
Today's scientific simulations produce output on the 10-100 TB scale. This unprecedented amount of data requires data handling techniques that are beyond what is used for ordinary files. Relational database systems have been successfully used to store and process scientific data, but the new requirements constantly generate new challenges. Moving terabytes of data among servers on a timely basis is a tough problem, even with the newest high-throughput networks. Thus, moving the computations as close to the data as possible and minimizing the client-server overhead are absolutely necessary. At least data subsetting and preprocessing have to be done inside the server process. Out-of-the-box commercial database systems perform very well in scientific applications from the perspective of data storage optimization, data retrieval, and memory management but lack basic functionality like handling scientific data structures or enabling advanced math inside the database server. The most important gap in Microsoft SQL Server is the lack of a native array data type. Fortunately, the technology exists to extend the database server with custom-written code that enables us to address these problems. We present the prototype of a custom-built extension to Microsoft SQL Server that adds array handling functionality to the database system. With our Array Library, fixed-size arrays of all basic numeric data types can be created and manipulated efficiently. Also, the library is designed to integrate seamlessly with the most common math libraries, such as BLAS, LAPACK, FFTW, etc. With the help of these libraries, complex operations, such as matrix inversions or Fourier transformations, can be done on the fly, from SQL code, inside the database server process. We are currently testing the prototype with two different scientific data sets: the Indra cosmological simulation will use it to store particle and density data from N-body simulations, and the Milky Way Laboratory project will use it to store galaxy simulation data.
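As a conceptual illustration of storing fixed-size numeric arrays in a relational column (this is not the Array Library's SQL interface; SQLite and the table layout below are stand-ins), a hedged Python sketch:

```python
# Conceptual sketch only -- not the Array Library's API.  It shows one
# way fixed-size numeric arrays can be packed into a binary column of a
# relational table and unpacked again, with SQLite standing in for the
# database server.
import sqlite3
import numpy as np

def to_blob(a: np.ndarray) -> bytes:
    """Serialize a fixed-size float64 array to raw bytes."""
    return a.astype(np.float64).tobytes()

def from_blob(blob: bytes, shape) -> np.ndarray:
    """Reconstruct the array from raw bytes and a known shape."""
    return np.frombuffer(blob, dtype=np.float64).reshape(shape)

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE density (id INTEGER PRIMARY KEY, shape TEXT, data BLOB)")

grid = np.random.rand(16, 16)              # hypothetical density slice
con.execute("INSERT INTO density (shape, data) VALUES (?, ?)",
            ("16x16", to_blob(grid)))

shape_txt, blob = con.execute("SELECT shape, data FROM density").fetchone()
restored = from_blob(blob, tuple(int(s) for s in shape_txt.split("x")))
assert np.allclose(grid, restored)
```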
Yousefy, Alireza; Malekahmadi, Parisa
2013-01-01
Research is essential for development; the scientific development of a country can be gauged by its researchers' scientific production, so understanding and assessing researchers' activities is essential for planning and policy making. The significance of collaboration in the production of scientific publications in today's complex, technology-driven world is very apparent: scientists have realized that, for their work to be widely used and cited by experts, they must collaborate. Collaboration among researchers advances scientific knowledge and hence gives access to wider information. The main objective of this research is to survey scientific production and the collaboration rate in the philosophy and theoretical bases of medical library and information science in the ISI, SCOPUS, and PubMed databases during 2001-2010. This is a descriptive survey using scientometric methods; data were gathered via a checklist and analyzed with SPSS, and the collaboration rate was calculated according to the formula. Among the 294 related abstracts on the philosophy and theoretical bases of medical library and information science in ISI, SCOPUS, and PubMed during 2001-2010, the year 2007, with 45 articles, had the most and the year 2003, with 16 articles, the fewest related collaborative articles in this scope. "B. Hjorland", with eight collaborative articles, was the most collaborative among library and information science (LIS) professionals in ISI, SCOPUS, and PubMed. The Journal of Documentation, with 29 articles (12 of them collaborative), published the most related articles, and "medical library and information science challenges", with 150 articles, ranked first in number of articles. The results also show that the most collaborative and productive country was the US, and that the University of Washington and the University of Western Ontario were the most collaborative affiliations. The average collaboration rate between researchers in this field over the years studied was 0.25; most of the reviewed articles were single-authored (60.54% of all articles), and only 30.46% had two or more authors.
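The abstract does not state which collaboration-rate formula was used; the following hedged Python sketch simply computes the share of multi-authored papers, with invented counts used only for illustration:

```python
# Hedged sketch: the exact formula is not given in the abstract, so this
# computes the fraction of multi-authored papers, consistent with the
# kind of "collaboration rate" reported.  Counts are illustrative, not
# the study's data.
def collaboration_rate(multi_authored: int, single_authored: int) -> float:
    """Fraction of papers written by two or more authors."""
    total = multi_authored + single_authored
    return multi_authored / total if total else 0.0

print(round(collaboration_rate(multi_authored=74, single_authored=220), 2))
```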
[The virtual library in equity, health, and human development].
Valdés, América
2002-01-01
This article attempts to describe the rationale that has led to the development of information sources dealing with equity, health, and human development in countries of Latin America and the Caribbean within the context of the Virtual Health Library (Biblioteca Virtual en Salud, BVS). Such information sources include the scientific literature, databases in printed and electronic format, institutional directories and lists of specialists, lists of events and courses, distance education programs, specialty journals and bulletins, as well as other means of disseminating health information. The pages that follow deal with the development of a Virtual Library in Equity, Health, and Human Development, an effort rooted in the conviction that decision-making and policy geared toward achieving greater equity in health must, of necessity, be based on coherent, well-organized, and readily accessible first-rate scientific information. Information is useless unless it is converted into knowledge that benefits society. The Virtual Library in Equity, Health, and Human Development is a coordinated effort to develop a decentralized regional network of scientific information sources, with strict quality control, from which public officials can draw data and practical examples that can help them set health and development policies geared toward achieving greater equity for all.
Evolution of Scientific and Technical Information Distribution
NASA Technical Reports Server (NTRS)
Esler, Sandra; Nelson, Michael L.
1998-01-01
The World Wide Web (WWW) and related information technologies are transforming the distribution of scientific and technical information (STI). We examine 11 recent, functioning digital libraries focused on the distribution of STI publications, including journal articles, conference papers, and technical reports, and introduce four main categories of digital library projects, classified by architecture (distributed vs. centralized) and contributor (traditional publisher vs. authoring individual/organization). Many digital library prototypes merely automate existing publishing practices or focus solely on digitizing the output of the publishing cycle, without sampling and capturing elements of its input; still others do not consider the large body of "gray literature" for distribution. We address these deficiencies in the current model of STI exchange by suggesting methods for expanding the scope and target of digital libraries: drawing on a greater range of sources of technical publications and using "buckets," an object-oriented construct for grouping logically related information objects, to include holdings other than technical publications.
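A minimal sketch of the "bucket" idea, assuming hypothetical field names and URIs (this is not the authors' implementation):

```python
# Illustrative sketch: an object that groups logically related
# information objects (report, slides, data) under one handle.
from dataclasses import dataclass, field

@dataclass
class Bucket:
    identifier: str
    metadata: dict = field(default_factory=dict)
    elements: dict = field(default_factory=dict)   # name -> location/URI

    def add(self, name: str, uri: str) -> None:
        """Attach another related information object to the bucket."""
        self.elements[name] = uri

report = Bucket("EXAMPLE-TM-0001", {"title": "Example technical report"})
report.add("report-pdf", "https://example.org/tm-0001.pdf")
report.add("dataset", "https://example.org/tm-0001-data.tar.gz")
print(report.identifier, sorted(report.elements))
```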
OneSearch Gives You Access to More Than 7,000 Publishers and Content Providers | Poster
By Robin Meckley, Contributing Writer OneSearch, an exciting new resource from the Scientific Library, is now available to the NCI at Frederick community. This new resource provides a quick and easy way to search multiple Scientific Library resources and collections using a single search box for journal articles, books, media, and more. A large central index is compiled from more than 7,000 publishers and content providers outside the library’s holdings.
Youpi: YOUr processing PIpeline
NASA Astrophysics Data System (ADS)
Monnerville, Mathias; Sémah, Gregory
2012-03-01
Youpi is a portable, easy to use web application providing high level functionalities to perform data reduction on scientific FITS images. Built on top of various open source reduction tools released to the community by TERAPIX (http://terapix.iap.fr), Youpi can help organize data, manage processing jobs on a computer cluster in real time (using Condor) and facilitate teamwork by allowing fine-grain sharing of results and data. Youpi is modular and comes with plugins which perform, from within a browser, various processing tasks such as evaluating the quality of incoming images (using the QualityFITS software package), computing astrometric and photometric solutions (using SCAMP), resampling and co-adding FITS images (using SWarp) and extracting sources and building source catalogues from astronomical images (using SExtractor). Youpi is useful for small to medium-sized data reduction projects; it is free and is published under the GNU General Public License.
Document delivery by the Jupiter Library Consortium
NASA Technical Reports Server (NTRS)
Wessels, Robert H. A.
1994-01-01
The Jupiter library consortium consists of 4 of the leading libraries in the Netherlands. During 1993 Jupiter received 600,000 requests for copies of journal articles, or 70 percent of all external article requests in the Netherlands. Over 90 percent of the requested documents were delivered from a collection of 40,000 current international journal subscriptions. Jupiter and its affiliate libraries are non-profit organizations belonging to, and serving, the scientific and technical research community. The usage of the current journal collection of the libraries was analyzed to improve the cost/benefit ratio.
Atlas - a data warehouse for integrative bioinformatics.
Shah, Sohrab P; Huang, Yong; Xu, Tao; Yuen, Macaire M S; Ling, John; Ouellette, B F Francis
2005-02-21
We present a biological data warehouse called Atlas that locally stores and integrates biological sequences, molecular interactions, homology information, functional annotations of genes, and biological ontologies. The goal of the system is to provide data, as well as a software infrastructure for bioinformatics research and development. The Atlas system is based on relational data models that we developed for each of the source data types. Data stored within these relational models are managed through Structured Query Language (SQL) calls that are implemented in a set of Application Programming Interfaces (APIs). The APIs include three languages: C++, Java, and Perl. The methods in these API libraries are used to construct a set of loader applications, which parse and load the source datasets into the Atlas database, and a set of toolbox applications which facilitate data retrieval. Atlas stores and integrates local instances of GenBank, RefSeq, UniProt, Human Protein Reference Database (HPRD), Biomolecular Interaction Network Database (BIND), Database of Interacting Proteins (DIP), Molecular Interactions Database (MINT), IntAct, NCBI Taxonomy, Gene Ontology (GO), Online Mendelian Inheritance in Man (OMIM), LocusLink, Entrez Gene and HomoloGene. The retrieval APIs and toolbox applications are critical components that offer end-users flexible, easy, integrated access to this data. We present use cases that use Atlas to integrate these sources for genome annotation, inference of molecular interactions across species, and gene-disease associations. The Atlas biological data warehouse serves as data infrastructure for bioinformatics research and development. It forms the backbone of the research activities in our laboratory and facilitates the integration of disparate, heterogeneous biological sources of data enabling new scientific inferences. Atlas achieves integration of diverse data sets at two levels. First, Atlas stores data of similar types using common data models, enforcing the relationships between data types. Second, integration is achieved through a combination of APIs, ontology, and tools. The Atlas software is freely available under the GNU General Public License at: http://bioinformatics.ubc.ca/atlas/
The EnzymeTracker: an open-source laboratory information management system for sample tracking.
Triplet, Thomas; Butler, Gregory
2012-01-26
In many laboratories, researchers store experimental data on their own workstation using spreadsheets. However, this approach poses a number of problems, ranging from sharing issues to inefficient data-mining. Standard spreadsheets are also error-prone, as data do not undergo any validation process. To overcome spreadsheets' inherent limitations, a number of proprietary systems have been developed, for which laboratories need to pay expensive license fees. Those costs are usually prohibitive for most laboratories and prevent scientists from benefiting from more sophisticated data management systems. In this paper, we propose the EnzymeTracker, a web-based laboratory information management system for sample tracking, as an open-source and flexible alternative that aims at facilitating entry, mining and sharing of experimental biological data. The EnzymeTracker features online spreadsheets and tools for monitoring numerous experiments conducted by several collaborators to identify and characterize samples. It also provides libraries of shared data such as protocols, and administration tools for data access control using OpenID and user/team management. Our system relies on a database management system for efficient data indexing and management and a user-friendly AJAX interface that can be accessed over the Internet. The EnzymeTracker facilitates data entry by dynamically suggesting entries and providing smart data-mining tools to effectively retrieve data. Our system features a number of tools to visualize and annotate experimental data, and export highly customizable reports. It also supports QR matrix barcoding to facilitate sample tracking. The EnzymeTracker was designed to be easy to use and offers many benefits over spreadsheets, thus presenting the characteristics required to facilitate acceptance by the scientific community. It has been successfully used for 20 months on a daily basis by over 50 scientists. The EnzymeTracker is freely available online at http://cubique.fungalgenomics.ca/enzymedb/index.html under the GNU GPLv3 license.
The EnzymeTracker: an open-source laboratory information management system for sample tracking
2012-01-01
Background In many laboratories, researchers store experimental data on their own workstation using spreadsheets. However, this approach poses a number of problems, ranging from sharing issues to inefficient data-mining. Standard spreadsheets are also error-prone, as data do not undergo any validation process. To overcome spreadsheets' inherent limitations, a number of proprietary systems have been developed, for which laboratories need to pay expensive license fees. Those costs are usually prohibitive for most laboratories and prevent scientists from benefiting from more sophisticated data management systems. Results In this paper, we propose the EnzymeTracker, a web-based laboratory information management system for sample tracking, as an open-source and flexible alternative that aims at facilitating entry, mining and sharing of experimental biological data. The EnzymeTracker features online spreadsheets and tools for monitoring numerous experiments conducted by several collaborators to identify and characterize samples. It also provides libraries of shared data such as protocols, and administration tools for data access control using OpenID and user/team management. Our system relies on a database management system for efficient data indexing and management and a user-friendly AJAX interface that can be accessed over the Internet. The EnzymeTracker facilitates data entry by dynamically suggesting entries and providing smart data-mining tools to effectively retrieve data. Our system features a number of tools to visualize and annotate experimental data, and export highly customizable reports. It also supports QR matrix barcoding to facilitate sample tracking. Conclusions The EnzymeTracker was designed to be easy to use and offers many benefits over spreadsheets, thus presenting the characteristics required to facilitate acceptance by the scientific community. It has been successfully used for 20 months on a daily basis by over 50 scientists. The EnzymeTracker is freely available online at http://cubique.fungalgenomics.ca/enzymedb/index.html under the GNU GPLv3 license. PMID:22280360
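As an illustration of the kind of server-side validation that such online spreadsheets can add and plain spreadsheets lack (the field names and rules below are hypothetical, not EnzymeTracker's), a small Python sketch:

```python
# Hypothetical validation of one sample row before it is accepted into
# the tracking database; identifiers and rules are invented.
import re

SAMPLE_ID = re.compile(r"^ENZ-\d{4}-\d{3}$")

def validate_sample(record: dict) -> list:
    """Return a list of validation errors for one sample row."""
    errors = []
    if not SAMPLE_ID.match(record.get("sample_id", "")):
        errors.append("sample_id must look like ENZ-2012-001")
    if record.get("ph") is not None and not (0 <= record["ph"] <= 14):
        errors.append("ph must be between 0 and 14")
    if not record.get("owner"):
        errors.append("owner is required")
    return errors

print(validate_sample({"sample_id": "ENZ-2012-001", "ph": 7.4, "owner": "lab-A"}))
print(validate_sample({"sample_id": "bad", "ph": 19}))
```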
Atlas – a data warehouse for integrative bioinformatics
Shah, Sohrab P; Huang, Yong; Xu, Tao; Yuen, Macaire MS; Ling, John; Ouellette, BF Francis
2005-01-01
Background We present a biological data warehouse called Atlas that locally stores and integrates biological sequences, molecular interactions, homology information, functional annotations of genes, and biological ontologies. The goal of the system is to provide data, as well as a software infrastructure for bioinformatics research and development. Description The Atlas system is based on relational data models that we developed for each of the source data types. Data stored within these relational models are managed through Structured Query Language (SQL) calls that are implemented in a set of Application Programming Interfaces (APIs). The APIs include three languages: C++, Java, and Perl. The methods in these API libraries are used to construct a set of loader applications, which parse and load the source datasets into the Atlas database, and a set of toolbox applications which facilitate data retrieval. Atlas stores and integrates local instances of GenBank, RefSeq, UniProt, Human Protein Reference Database (HPRD), Biomolecular Interaction Network Database (BIND), Database of Interacting Proteins (DIP), Molecular Interactions Database (MINT), IntAct, NCBI Taxonomy, Gene Ontology (GO), Online Mendelian Inheritance in Man (OMIM), LocusLink, Entrez Gene and HomoloGene. The retrieval APIs and toolbox applications are critical components that offer end-users flexible, easy, integrated access to this data. We present use cases that use Atlas to integrate these sources for genome annotation, inference of molecular interactions across species, and gene-disease associations. Conclusion The Atlas biological data warehouse serves as data infrastructure for bioinformatics research and development. It forms the backbone of the research activities in our laboratory and facilitates the integration of disparate, heterogeneous biological sources of data enabling new scientific inferences. Atlas achieves integration of diverse data sets at two levels. First, Atlas stores data of similar types using common data models, enforcing the relationships between data types. Second, integration is achieved through a combination of APIs, ontology, and tools. The Atlas software is freely available under the GNU General Public License at: PMID:15723693
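A conceptual sketch of the loader/toolbox split described above, with SQLite standing in for Atlas's relational backend; the table layout and records are invented for illustration and do not reflect Atlas's actual data models or APIs:

```python
# Illustrative only: a "loader" that parses records into a common table
# and a "toolbox" helper that retrieves from the same schema.
import sqlite3

def load_records(con: sqlite3.Connection, records) -> None:
    """Loader: parse source records and insert them into common tables."""
    con.executemany(
        "INSERT INTO sequence (accession, organism, length) VALUES (?, ?, ?)",
        [(r["accession"], r["organism"], r["length"]) for r in records],
    )

def sequences_for_organism(con: sqlite3.Connection, organism: str):
    """Toolbox: a retrieval helper built on the same schema."""
    cur = con.execute(
        "SELECT accession, length FROM sequence WHERE organism = ?", (organism,))
    return cur.fetchall()

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sequence (accession TEXT PRIMARY KEY, organism TEXT, length INTEGER)")
load_records(con, [
    {"accession": "SEQ-0001", "organism": "Homo sapiens", "length": 2300},
    {"accession": "SEQ-0002", "organism": "Mus musculus", "length": 1800},
])
print(sequences_for_organism(con, "Homo sapiens"))
```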
HepML, an XML-based format for describing simulated data in high energy physics
NASA Astrophysics Data System (ADS)
Belov, S.; Dudko, L.; Kekelidze, D.; Sherstnev, A.
2010-10-01
In this paper we describe the HepML format and a corresponding C++ library developed for keeping a complete description of parton-level events in a unified and flexible form. HepML tags contain enough information to understand what kind of physics the simulated events describe and how the events have been prepared. A HepML block can be included in event files in the LHEF format. The structure of the HepML block is described by means of several XML Schemas. The Schemas define the necessary information for the HepML block and how this information should be located within the block. The library libhepml is a C++ library intended for parsing and serialization of HepML tags, and representing the HepML block in computer memory. The library is an API for external software. For example, Matrix Element Monte Carlo event generators can use the library for preparing and writing a header of an LHEF file in the form of HepML tags. In turn, Showering and Hadronization event generators can parse the HepML header and get the information in the form of C++ classes. libhepml can be used in C++, C, and Fortran programs. All necessary parts of HepML have been prepared and we present the project to the HEP community. Program summary Program title: libhepml Catalogue identifier: AEGL_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGL_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU GPLv3 No. of lines in distributed program, including test data, etc.: 138 866 No. of bytes in distributed program, including test data, etc.: 613 122 Distribution format: tar.gz Programming language: C++, C Computer: PCs and workstations Operating system: Scientific Linux CERN 4/5, Ubuntu 9.10 RAM: 1 073 741 824 bytes (1 Gb) Classification: 6.2, 11.1, 11.2 External routines: Xerces XML library (http://xerces.apache.org/xerces-c/), Expat XML Parser (http://expat.sourceforge.net/) Nature of problem: Monte Carlo simulation in high energy physics is divided into several stages. Various programs exist for these stages. In this article we are interested in interfacing different Monte Carlo event generators via data files, in particular, Matrix Element (ME) generators and Showering and Hadronization (SH) generators. There is a widely accepted format for data files for such interfaces - Les Houches Event Format (LHEF). Although the information kept in an LHEF file is enough for the proper working of SH generators, it is insufficient for understanding how the events in the LHEF file have been prepared and which physical model has been applied. In this paper we propose an extension of the format for keeping additional information available in generators. We propose to add a new information block, marked up with XML tags, to the LHEF file. This block describes the events in the file in more detail. In particular, it stores information about the physical model, kinematical cuts, generator, etc. This helps to make LHEF files self-documented. Certainly, HepML can be applied in a more general context, not in LHEF files only. Solution method: In order to overcome drawbacks of the original LHEF accord we propose to add a new information block of HepML tags. HepML is an XML-based markup language. We designed several XML Schemas for all tags in the language. Any HepML document should follow the rules of the Schemas. The language is equipped with a library for operation with HepML tags and documents.
This C++ library, called libhepml, consists of classes for HepML objects, which represent a HepML document in computer memory, parsing classes, serialization classes, and some auxiliary classes. Restrictions: The software is adapted for solving the problems described in the article. There are no additional restrictions. Running time: Tests have been done on a computer with an Intel(R) Core(TM)2 Solo, 1.4 GHz. Parsing of a HepML file: 6 ms (the size of the HepML file is 12.5 Kb). Writing of a HepML block to file: 14 ms (file size 12.5 Kb). Merging of two HepML blocks and writing to file: 18 ms (file size 25.0 Kb).
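Since the actual HepML tag set is defined by its XML Schemas, the element names in the following Python sketch are placeholders; it only illustrates how an XML header block embedded in an event file can be parsed into plain objects:

```python
# Hedged sketch: tag and attribute names below are invented, not the
# real HepML vocabulary.  The point is the parsing pattern itself.
import xml.etree.ElementTree as ET

header = """
<hepml>
  <generator name="ExampleME" version="1.0"/>
  <cuts>
    <cut object="jet" variable="pT" min="20.0"/>
  </cuts>
</hepml>
"""

root = ET.fromstring(header)
gen = root.find("generator")
print("generator:", gen.get("name"), gen.get("version"))
for cut in root.findall("./cuts/cut"):
    print("cut:", cut.get("object"), cut.get("variable"), ">=", cut.get("min"))
```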
ERIC Educational Resources Information Center
International Research and Exchange Board, New York, NY.
This document contains 13 papers by Soviet participants in the U.S.-U.S.S.R. Seminar on Access to Library Resources through Technology and Preservation: (1) "Automation of Information-Library Work at Scientific and Technical Libraries of the U.S.S.R." (A. S. Sorokin and V. M. Rostovtsev); (2) "Automated Information Systems for…
ERIC Educational Resources Information Center
Summit, Roger K.; Firschein, Oscar
Project Dialib was a two-year investigation of the impact of on line information retrieval in a public library setting. This volume documents the efforts that were made to publicize the project and to promote the use of the service among library patrons. (JY)
A Potential Theory for the Steady Separated Flow about an Aerofoil Section
1988-02-01
Adviser (3 copies Doc Data sheet) Aircraft Maintenance and Flight Trials Unit Director of Naval Aircraft Engineering Director of Naval Air Warfare...Superintendent, Aircraft Maintenance and Repair Army Office Scientific Adviser - Army (Doc Data sheet only) Engineering Development Establishment, Library...Flight Group Library Technical Division Library Director General Aircraft Engineering - Air Force Director General Operational Requirements - Air Force
ERIC Educational Resources Information Center
Mackenzie, A. Graham, Ed.; Stuart, Ian M., Ed.
This proceedings volume of a seminar on planning library services is the third in a series of papers, published at irregular intervals, to report on research work by members of the University of Lancaster library staff. From January 1967 until June 1969 the Office of Scientific and Technical Information (OSTI) organized regular meetings under the…
NASA Astrophysics Data System (ADS)
Krumholz, Mark R.
2014-01-01
I describe DESPOTIC, a code to Derive the Energetics and SPectra of Optically Thick Interstellar Clouds. DESPOTIC represents such clouds using a one-zone model, and can calculate line luminosities, line cooling rates, and in restricted cases line profiles using an escape probability formalism. It also includes approximate treatments of the dominant heating, cooling and chemical processes for the cold interstellar medium, including cosmic ray and X-ray heating, grain photoelectric heating, heating of the dust by infrared and ultraviolet radiation, thermal cooling of the dust, collisional energy exchange between dust and gas, and a simple network for carbon chemistry. Based on these heating, cooling and chemical rates, DESPOTIC can calculate clouds' equilibrium gas and dust temperatures, equilibrium carbon chemical state and time-dependent thermal and chemical evolution. The software is intended to allow rapid and interactive calculation of clouds' characteristic temperatures, identification of their dominant heating and cooling mechanisms and prediction of their observable spectra across a wide range of interstellar environments. DESPOTIC is implemented as a PYTHON package, and is released under the GNU General Public License.
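As a toy illustration of the one-zone equilibrium idea (the heating and cooling laws and coefficients below are invented placeholders, not DESPOTIC's physics or API), a short Python sketch that balances heating against cooling:

```python
# Toy one-zone thermal balance: find the temperature where an invented
# heating rate equals an invented cooling law.
from scipy.optimize import brentq

def heating(T):            # stand-in for a roughly T-independent heating term
    return 1.0e-26         # erg s^-1 per H (made-up value)

def cooling(T):            # steep power-law stand-in for line cooling
    return 3.0e-30 * T**2.5

T_eq = brentq(lambda T: heating(T) - cooling(T), 1.0, 1.0e4)
print(f"toy equilibrium temperature: {T_eq:.1f} K")
```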
NASA Astrophysics Data System (ADS)
Herbrechtsmeier, Stefan; Witkowski, Ulf; Rückert, Ulrich
Mobile robots are becoming more and more important in current research and education. Small ’on the table’ experiments are especially attractive because they need no additional or special laboratory equipment. In this context, platforms are desirable that are small, simple to access, and relatively easy to program. An additional powerful information-processing unit is advantageous, as it simplifies the implementation of algorithms and the porting of software from desktop computers to the robot platform. In this paper we present a new versatile miniature robot that is well suited for research and education. The small size of the robot, with an edge length of about 9 cm, its robust drive, and its modular structure make it a general-purpose device for single- and multi-robot experiments executed ’on the table’. For programming and evaluation, the robot can be connected wirelessly via Bluetooth or WiFi. The operating system of the robot is based on the standard Linux kernel and the GNU C standard library, and a Player/Stage model eases software development and testing.
High-Performance Mixed Models Based Genome-Wide Association Analysis with omicABEL software
Fabregat-Traver, Diego; Sharapov, Sodbo Zh.; Hayward, Caroline; Rudan, Igor; Campbell, Harry; Aulchenko, Yurii; Bientinesi, Paolo
2014-01-01
To raise the power of genome-wide association studies (GWAS) and avoid false-positive results in structured populations, one can rely on mixed-model-based tests. When large samples are used, and when multiple traits are to be studied in the ’omics’ context, this approach becomes computationally challenging. Here we consider the problem of mixed-model-based GWAS for an arbitrary number of traits, and demonstrate that different computational algorithms are optimal for the single-trait and multiple-trait scenarios. We implement these optimal algorithms in a high-performance computing framework that uses state-of-the-art linear algebra kernels, incorporates optimizations, and avoids redundant computations, increasing throughput while reducing memory usage and energy consumption. We show that, compared to existing libraries, our algorithms and software achieve considerable speed-ups. The OmicABEL software described in this manuscript is available under the GNU GPL v. 3 license as part of the GenABEL project for statistical genomics at http://www.genabel.org/packages/OmicABEL. PMID:25717363
High-Performance Mixed Models Based Genome-Wide Association Analysis with omicABEL software.
Fabregat-Traver, Diego; Sharapov, Sodbo Zh; Hayward, Caroline; Rudan, Igor; Campbell, Harry; Aulchenko, Yurii; Bientinesi, Paolo
2014-01-01
To raise the power of genome-wide association studies (GWAS) and avoid false-positive results in structured populations, one can rely on mixed-model-based tests. When large samples are used, and when multiple traits are to be studied in the 'omics' context, this approach becomes computationally challenging. Here we consider the problem of mixed-model-based GWAS for an arbitrary number of traits, and demonstrate that different computational algorithms are optimal for the single-trait and multiple-trait scenarios. We implement these optimal algorithms in a high-performance computing framework that uses state-of-the-art linear algebra kernels, incorporates optimizations, and avoids redundant computations, increasing throughput while reducing memory usage and energy consumption. We show that, compared to existing libraries, our algorithms and software achieve considerable speed-ups. The OmicABEL software described in this manuscript is available under the GNU GPL v. 3 license as part of the GenABEL project for statistical genomics at http://www.genabel.org/packages/OmicABEL.
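A sketch of the single-trait generalized least-squares step at the core of mixed-model association tests, written in NumPy on simulated data; it is not OmicABEL's optimized implementation:

```python
# Simulated single-trait GLS: given a phenotypic covariance V, whiten
# the design matrix and phenotype with a Cholesky factor and solve by
# ordinary least squares.  All data are simulated.
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.integers(0, 3, n)])   # intercept + genotype
d = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
V = 0.5 * np.exp(-d / 50.0) + 0.5 * np.eye(n)              # toy relatedness + noise
y = X @ np.array([1.0, 0.2]) + rng.multivariate_normal(np.zeros(n), V)

L = np.linalg.cholesky(V)                  # V = L L^T
Xw = np.linalg.solve(L, X)                 # whiten: L^{-1} X
yw = np.linalg.solve(L, y)                 # whiten: L^{-1} y
beta, *_ = np.linalg.lstsq(Xw, yw, rcond=None)
print("estimated (intercept, SNP effect):", np.round(beta, 3))
```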
BiDiBlast: comparative genomics pipeline for the PC.
de Almeida, João M G C F
2010-06-01
Bi-directional BLAST is a simple approach to detect, annotate, and analyze candidate orthologous or paralogous sequences in a single go. This procedure is usually confined to the realm of customized Perl scripts tuned for UNIX-like environments. Porting those scripts to other operating systems involves refactoring them, as well as installing the Perl programming environment with the required libraries. To overcome these limitations, a data pipeline was implemented in Java. This application submits two batches of sequences to local versions of the NCBI BLAST tool, manages result lists, and refines both bi-directional and simple hits. GO Slim terms are attached to hits, several statistics are derived, and molecular evolution rates are estimated through PAML. The results are written to a set of delimited text tables intended for further analysis. The provided graphical user interface allows friendly interaction with this application, which is documented and available for download at http://moodle.fct.unl.pt/course/view.php?id=2079 or https://sourceforge.net/projects/bidiblast/ under the GNU GPL license. Copyright 2010 Beijing Genomics Institute. Published by Elsevier Ltd. All rights reserved.
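A minimal Python sketch of the reciprocal-best-hit logic that bi-directional BLAST pipelines implement (illustrative only, not BiDiBlast's code; the hit tuples are invented):

```python
# Reciprocal best hits from two simplified hit tables of
# (query, subject, bit score) tuples.
def best_hits(hits):
    """Map each query to its highest-scoring subject."""
    best = {}
    for query, subject, score in hits:
        if query not in best or score > best[query][1]:
            best[query] = (subject, score)
    return {q: s for q, (s, _) in best.items()}

def reciprocal_best_hits(a_vs_b, b_vs_a):
    """Pairs (a, b) where a's best hit is b and b's best hit is a."""
    fwd, rev = best_hits(a_vs_b), best_hits(b_vs_a)
    return [(a, b) for a, b in fwd.items() if rev.get(b) == a]

a_vs_b = [("geneA1", "geneB7", 310.0), ("geneA1", "geneB2", 55.0),
          ("geneA2", "geneB2", 120.0)]
b_vs_a = [("geneB7", "geneA1", 305.0), ("geneB2", "geneA9", 60.0)]
print(reciprocal_best_hits(a_vs_b, b_vs_a))   # -> [('geneA1', 'geneB7')]
```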
Basic Scientific Subroutines, Volume II.
ERIC Educational Resources Information Center
Ruckdeschel, F. R.
This book, second in a series dealing with scientific programming in the BASIC language, provides students, engineers, and scientists with a documented library of subroutines for scientific applications. Subjects of the eight chapters include: (1) least-squares approximation of functions and smoothing of data; (2) approximating functions by series…
Welcome to the National Wetlands Research Center Library: Successful Research Begins @ Your Library
Broussard, Linda
2007-01-01
The National Wetlands Research Center (NWRC) library is part of the U.S. Geological Survey (USGS) and is the only USGS library dedicated to wetland science. The mission of the NWRC library is to support the research and information needs of scientists, managers, and support personnel by providing a specialized, scientific collection of library materials and related information services that are responsive to and reflect internal and external customer needs and work processes. The NWRC library participates in international cataloging and resource sharing that allows libraries from throughout the world to borrow from its collections and lend to NWRC. This sharing of materials facilitates the research of other governmental agencies, universities, and those interested in the study of wetlands.
Hughes, C
1998-01-01
Academic medical libraries have a responsibility to inform library users regarding retracted publications. Many have created policies and procedures that identify flawed journal articles. A questionnaire was sent to the 129 academic medical libraries in the United States and Canada to find out how many had policies and procedures for identifying retracted publications. Of the returned questionnaires, 59% had no policy and no practice for calling the attention of the library user to retracted publications. Forty-one percent of the libraries called attention to retractions with or without a formal policy for doing so. Several responding libraries included their policy statement with the survey. The increasing number of academic medical libraries that realize the importance of having policies and practices in place highlights the necessity for this procedure.
ERIC Educational Resources Information Center
Proceedings of the ASIS Mid-Year Meeting, 1992
1992-01-01
Lists the speakers and summarizes the issues addressed for 12 panel sessions on topics related to networking, including libraries and national networks, federal national resources and energy programs, multimedia issues, telecommuting, remote image serving, accessing the Internet, library automation, scientific information, applications of Z39.50,…
The Dissemination and Accessibility of Canadian Government Information.
ERIC Educational Resources Information Center
Morton, Bruce; Zink, Steven D.
1992-01-01
Discusses information agencies and issues that affect the dissemination and accessibility of Canadian government information, including the Canada Communication Group, depository libraries, the National Library, bibliographic control of government information, the Canada Institute for Scientific and Technical Information, Statistics Canada,…
The Scientific Library Presents “How to Get Published in a Research Journal” on May 16 | Poster
When aiming to publish a scientific work, every writer should consider the following questions:
- Do you know the best way to structure a scientific paper?
- Have you identified the most appropriate journal?
- Do you understand the peer-review process?
ERIC Educational Resources Information Center
Mackenzie, A. Graham, Ed.; Stuart, Ian M., Ed.
This proceedings volume of a seminar on planning library services is the third in a series of papers, published at irregular intervals, to report on research work by members of the University of Lancaster library staff. From January 1967 until June 1969 the Office of Scientific and Technical Information (OSTI) organized regular meetings under the…
Measuring Academic Productivity and Changing Definitions of Scientific Impact
Sarli, Cathy C.; Carpenter, Christopher R.
2016-01-01
This manuscript provides a brief overview of the history of communication of scientific research and reporting of scientific research impact outcomes. Current day practices are outlined along with examples of how organizations and libraries are providing tools to evaluate and document the impact of scientific research to provide a meaningful narrative suitable for a variety of purposes and audiences. PMID:25438359
De Castro, Paola; Marsili, Daniela; Poltronieri, Elisabetta; Calderón, Carlos Agudelo
2012-06-01
Open Access (OA) to scientific information is an important step forward in communication patterns, yet we still need to reinforce OA principles to promote a cultural change of traditional publishing practices. The advantages of free access to scientific information are even more evident in public health where knowledge is directly associated with human wellbeing. An OA 'consolidation' initiative in public health is presented to show how the involvement of people and institutions is fundamental to create awareness on OA and promote a cultural change. This initiative is developed within the project NEtwork of COllaboration Between Europe and Latin American Caribbean countries (NECOBELAC), financed by the European Commission. Three actions are envisaged: Capacity building through a flexible and sustainable training programme on scientific writing and OA publishing; creation of training tools based on semantic web technologies; development of a network of supporting institutions. In 2010-2011, 23 training initiatives were performed involving 856 participants from 15 countries; topic maps on scientific publication and OA were produced; 195 institutions are included in the network. Cultural change in scientific dissemination practices is a long process requiring a flexible approach and strong commitment by all stakeholders. © 2012 The authors. Health Information and Libraries Journal © 2012 Health Libraries Group Health Information and Libraries Journal.
ERIC Educational Resources Information Center
Kurtz, Michael J.; Eichorn, Guenther; Accomazzi, Alberto; Grant, Carolyn S.; Demleitner, Markus; Murray, Stephen S.; Jones, Michael L. W.; Gay, Geri K.; Rieger, Robert H.; Millman, David; Bruggemann-Klein, Anne; Klein, Rolf; Landgraf, Britta; Wang, James Ze; Li, Jia; Chan, Desmond; Wiederhold, Gio; Pitti, Daniel V.
1999-01-01
Includes six articles that discuss a digital library for astronomy; comparing evaluations of digital collection efforts; cross-organizational access management of Web-based resources; searching scientific bibliographic databases based on content-based relations between documents; semantics-sensitive retrieval for digital picture libraries; and…
SCTE: An open-source Perl framework for testing equipment control and data acquisition
NASA Astrophysics Data System (ADS)
Mostaço-Guidolin, Luiz C.; Frigori, Rafael B.; Ruchko, Leonid; Galvão, Ricardo M. O.
2012-07-01
SCTE intends to provide a simple, yet powerful, framework for building data acquisition and equipment control systems for experimental physics and related areas. Via its SCTE::Instrument module, RS-232, USB, and LAN buses are supported, and the intricacies of hardware communication are encapsulated underneath an object-oriented abstraction layer. Written in Perl and using the SCPI protocol, the framework allows enabled instruments to be easily programmed to perform a wide variety of tasks. While this work presents general aspects of the development of data acquisition systems using the SCTE framework, it is illustrated by particular applications designed for the calibration of several in-house developed devices for power measurement in the tokamak TCABR Alfvén Waves Excitement System. Catalogue identifier: AELZ_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELZ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License Version 3 No. of lines in distributed program, including test data, etc.: 13 811 No. of bytes in distributed program, including test data, etc.: 743 709 Distribution format: tar.gz Programming language: Perl version 5.10.0 or higher. Computer: PC. SCPI capable digital oscilloscope, with RS-232, USB, or LAN communication ports, null modem, USB, or Ethernet cables Operating system: GNU/Linux (2.6.28-11), should also work on any Unix-based operational system Classification: 4.14 External routines: Perl modules: Device::SerialPort, Term::ANSIColor, Math::GSL, Net::HTTP. Gnuplot 4.0 or higher Nature of problem: Automation of experiments and data acquisition often requires expensive equipment and in-house development of software applications. Nowadays personal computers and test equipment come with fast and easy-to-use communication ports. Instrument vendors often supply application programs capable of controlling such devices, but these are very restricted in functionality. For instance, they are not capable of controlling more than one test instrument at the same time or of automating repetitive tasks. SCTE provides a way of using auxiliary equipment in order to automate experimental procedures at low cost, using only a free and open-source operating system and libraries. Solution method: SCTE provides a Perl module that implements RS-232, USB, and LAN communication, allowing the use of SCPI-capable instruments [1], thereby providing a straightforward way of creating automation and data acquisition applications using personal computers and testing instruments [2]. SCPI Consortium, Standard Commands for Programmable Instruments, 1999, http://www.scpiconsortium.org. L.C.B. Mostaço-Guidolin, Determinação da configuração de ondas de Alfvén excitadas no tokamak TCABR, Master's thesis, Universidade de São Paulo (2007), http://www.teses.usp.br/teses/disponiveis/43/43134/tde-23042009-230419/.
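For illustration only (this is not the SCTE::Instrument interface), a Python sketch of the bare SCPI exchange that such frameworks wrap, assuming a LAN-connected instrument and the commonly used raw-SCPI port 5025:

```python
# Minimal SCPI query over a TCP socket; the host address and port are
# assumptions for the example, and no vendor-specific API is used.
import socket

def scpi_query(host: str, command: str, port: int = 5025, timeout: float = 2.0) -> str:
    """Send one newline-terminated SCPI command and return the reply line."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall((command + "\n").encode("ascii"))
        reply = b""
        while not reply.endswith(b"\n"):
            chunk = sock.recv(4096)
            if not chunk:
                break
            reply += chunk
    return reply.decode("ascii", errors="replace").strip()

# Example (requires a reachable SCPI instrument):
# print(scpi_query("192.168.0.42", "*IDN?"))
```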
What makes computational open source software libraries successful?
NASA Astrophysics Data System (ADS)
Bangerth, Wolfgang; Heister, Timo
2013-01-01
Software is the backbone of scientific computing. Yet, while we regularly publish detailed accounts about the results of scientific software, and while there is a general sense of which numerical methods work well, our community is largely unaware of best practices in writing the large-scale, open source scientific software upon which our discipline rests. This is particularly apparent in the commonly held view that writing successful software packages is largely the result of simply ‘being a good programmer’ when in fact there are many other factors involved, for example the social skill of community building. In this paper, we consider what we have found to be the necessary ingredients for successful scientific software projects and, in particular, for software libraries upon which the vast majority of scientific codes are built today. In particular, we discuss the roles of code, documentation, communities, project management and licenses. We also briefly comment on the impact on academic careers of engaging in software projects.
Chemical Space of DNA-Encoded Libraries.
Franzini, Raphael M; Randolph, Cassie
2016-07-28
In recent years, DNA-encoded chemical libraries (DECLs) have attracted considerable attention as a potential discovery tool in drug development. Screening encoded libraries may offer advantages over conventional hit discovery approaches and has the potential to complement such methods in pharmaceutical research. As a result of the increased application of encoded libraries in drug discovery, a growing number of hit compounds are emerging in scientific literature. In this review we evaluate reported encoded library-derived structures and identify general trends of these compounds in relation to library design parameters. We in particular emphasize the combinatorial nature of these libraries. Generally, the reported molecules demonstrate the ability of this technology to afford hits suitable for further lead development, and on the basis of them, we derive guidelines for DECL design.
ERIC Educational Resources Information Center
Science Council of Canada, Ottawa (Ontario).
Canada's major scientific and technical information resources are supported largely by the Federal Government. They consist of libraries, data files, specialized information centers, and field services. The Canadian Government has no overall policy concerning the handling of scientific and technical information. The need for a national information…
ERIC Educational Resources Information Center
Weiss, Charles J.
2017-01-01
The Scientific Computing for Chemists course taught at Wabash College teaches chemistry students to use the Python programming language, Jupyter notebooks, and a number of common Python scientific libraries to process, analyze, and visualize data. Assuming no prior programming experience, the course introduces students to basic programming and…
Enhancing Scientific Practice and Education through Collaborative Digital Libraries.
ERIC Educational Resources Information Center
Maini, Gaurav; Leggett, John J.; Ong, Teongjoo; Wilson, Hugh D.; Reed, Monique D.; Hatch, Stephan L.; Dawson, John E.
The need for accurate and current scientific information in the fast paced Internet-aware world has prompted the scientific community to develop tools that reduce the scientist's time and effort to make digital information available to all interested parties. The availability of such tools has made the Internet a vast digital repository of…
Experiences with the New TEST Thesaurus and the New NASA Thesaurus
ERIC Educational Resources Information Center
Rainey, Laura
1970-01-01
Paper presented at Special Libraries Association Annual Conference (Montreal, June 1969). A survey of 75 special libraries on use of the NASA Thesaurus and Thesaurus of Engineering and Scientific Terms (TEST). The findings reveal wide use and satisfaction with both. (JS)
Automatic Publishing of Library Bulletins.
ERIC Educational Resources Information Center
Inbal, Moshe
1980-01-01
Describes the use of a computer to publish library bulletins that list recent accessions of technical reports according to the subject classification scheme of NTIS/SRIM (National Technical Information Service's Scientific Reports in Microfiche). The codes file, the four computer program functions, and costs/economy are discussed. (JD)
Photocopying For Researchers Held Legal
ERIC Educational Resources Information Center
Chemical and Engineering News, 1973
1973-01-01
Reports on a recent decision of the United States Court of Claims in Washington which permits the National Institutes of Health library and the National Library of Medicine to photocopy copyrighted scientific articles for researchers without paying royalties to Williams and Wilkins Company, a medical publisher. (JR)
A Bioinformatic Strategy to Rapidly Characterize cDNA Libraries
G. Charles Ostermeier1, David J. Dix2 and Stephen A. Krawetz1.
1Departments of Obstetrics and Gynecology, Center for Molecular Medicine and Genetics, & Institute for Scientific Computing, Wayne State Univer...
Field Museum of Natural History Library.
ERIC Educational Resources Information Center
Williams, Benjamin W.; Fawcett, W. Peyton
1986-01-01
Founded in 1894 to support museum research, the Field Museum of Natural History Library specializes in fields of anthropology, archaeology, botany, geology, palaeontology, and zoology. A rich serials collection and numerous special collections serve both the scientific community and wider public as noncirculating reference collection and through…
ERIC Educational Resources Information Center
Shank, Russell
Access to scientific and technical information is essential to the conduct of high quality research and development work. Indonesia's scientists and engineers in Government research institutes are generally not being well-served by their own libraries. The most serious deficiencies are: (1) inadequately trained library staffs, (2) lack of…
Only for “purely scientific” institutions: the Medical Library Association's Exchange, 1898–1950s
Connor, Jennifer J
2011-01-01
Objective: Centralized exchanges of scientific materials existed by the late nineteenth century, but they did not include medical publications. North American medical leaders therefore formed an association of institutions to run their own exchange: the Medical Library Association (MLA). After providing background to the exchange concept and the importance of institutional members for MLA, this article examines archival MLA correspondence to consider the role of its Exchange in the association's professional development before the 1950s. Results: MLA's membership policy admitted only libraries open to the medical profession with a large number of volumes. But the correspondence of the MLA Executive Committee reveals that the committee constantly adjusted the definition of library membership: personal, public, sectarian, commercial, allied science, and the then-termed “colored” medical school libraries all were denied membership. Conclusion: Study of these decisions, using commercial and sectarian libraries as a focus, uncovers the primary justification for membership exclusions: a goal of operating a scientific exchange. Also, it shows that in this way, MLA shadowed policies and actions of the American Medical Association. Finally, the study suggests that the medical profession enforced its policies of exclusion through MLA, despite a proclaimed altruistic sharing of medical literature. PMID:21464849
SOCIB Glider toolbox: from sensor to data repository
NASA Astrophysics Data System (ADS)
Pau Beltran, Joan; Heslop, Emma; Ruiz, Simón; Troupin, Charles; Tintoré, Joaquín
2015-04-01
Nowadays in oceanography, gliders constitute a mature, cost-effective technology for the acquisition of measurements independently of the sea state (unlike ships), providing subsurface data over sustained periods, including during extreme weather events. The SOCIB glider toolbox is a set of MATLAB/Octave scripts and functions developed to manage the data collected by a glider fleet. They cover the main stages of the data management process, in both real-time and delayed-time modes: metadata aggregation, downloading, processing, and automatic generation of data products and figures. The toolbox is distributed under the GNU licence (http://www.gnu.org/copyleft/gpl.html) and is available at http://www.socib.es/users/glider/glider_toolbox.
NASA Astrophysics Data System (ADS)
Sylwestrzak, Marcin; Szlag, Daniel; Marchand, Paul J.; Kumar, Ashwin S.; Lasser, Theo
2017-08-01
We present an application of massively parallel processing of quantitative flow measurement data acquired using spectral optical coherence microscopy (SOCM). The need for massive signal processing of these particular datasets has been a major hurdle for many applications based on SOCM. In view of this difficulty, we implemented and adapted quantitative total flow estimation algorithms on graphics processing units (GPU) and achieved a 150-fold reduction in processing time when compared to a former CPU implementation. As SOCM constitutes the microscopy counterpart to spectral optical coherence tomography (SOCT), the developed processing procedure can be applied to both imaging modalities. We present the developed DLL library integrated in MATLAB (with an example) and have included the source code for adaptations and future improvements. Catalogue identifier: AFBT_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AFBT_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU GPLv3 No. of lines in distributed program, including test data, etc.: 913552 No. of bytes in distributed program, including test data, etc.: 270876249 Distribution format: tar.gz Programming language: CUDA/C, MATLAB. Computer: Intel x64 CPU, GPU supporting CUDA technology. Operating system: 64-bit Windows 7 Professional. Has the code been vectorized or parallelized?: Yes, CPU code has been vectorized in MATLAB, CUDA code has been parallelized. RAM: Dependent on user parameters, typically between several gigabytes and several tens of gigabytes Classification: 6.5, 18. Nature of problem: Speed-up of data processing in optical coherence microscopy Solution method: Utilization of GPU for massively parallel data processing Additional comments: Compiled DLL library with source code and documentation, example of utilization (MATLAB script with raw data) Running time: 1.8 s for one B-scan (150× faster in comparison to the CPU data processing time)
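The headline speed-up comes from treating a whole B-scan as one data-parallel operation instead of processing A-lines one by one. The following minimal sketch is a NumPy analogy on the CPU, not the published CUDA/DLL code, and the array sizes are invented; it only shows the batching idea.

    # Illustrative sketch only: batched (data-parallel) processing of spectral interferograms
    # versus a per-A-line loop. On a GPU the batched form maps naturally to many threads.
    import numpy as np

    rng = np.random.default_rng(0)
    spectra = rng.standard_normal((2048, 1024))   # hypothetical B-scan: 2048 A-lines x 1024 pixels

    # Looped version: one FFT per A-line (analogous to unvectorized CPU processing).
    loop_result = np.stack([np.abs(np.fft.fft(line)) for line in spectra])

    # Batched version: a single FFT call over the whole array (data-parallel).
    batch_result = np.abs(np.fft.fft(spectra, axis=1))

    assert np.allclose(loop_result, batch_result)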
Development of web-GIS system for analysis of georeferenced geophysical data
NASA Astrophysics Data System (ADS)
Okladnikov, I.; Gordov, E. P.; Titov, A. G.; Bogomolov, V. Y.; Genina, E.; Martynova, Y.; Shulgina, T. M.
2012-12-01
Georeferenced datasets (meteorological databases, modeling and reanalysis results, remote sensing products, etc.) are actively used in numerous applications, including modeling, interpretation and forecasting of climatic and ecosystem changes on various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their huge size (which might reach tens of terabytes for a single dataset), studies of climate and environmental change require special software support. A dedicated web-GIS information-computational system for analysis of georeferenced climatological and meteorological data has been created. The system consists of four basic parts: a computational kernel developed using GNU Data Language (GDL), a set of PHP controllers run within a specialized web portal, JavaScript class libraries for the development of typical components of the web-mapping application graphical user interface (GUI) based on AJAX technology, and an archive of geophysical datasets. The computational kernel comprises a number of dedicated modules for querying and extraction of data, mathematical and statistical data analysis, visualization, and preparation of output files containing processing results in geoTIFF and netCDF formats. The specialized web portal consists of the Apache web server, the OGC-compliant GeoServer software, which is used as a base for presenting cartographical information over the Web, and a set of PHP controllers implementing the web-mapping application logic and governing the computational kernel. The JavaScript libraries for graphical user interface development are based on the GeoExt library, combining the ExtJS framework and OpenLayers software. The archive of geophysical data consists of a number of structured environmental datasets represented by data files in netCDF, HDF, GRIB and ESRI Shapefile formats. Available for processing by the system are: two editions of NCEP/NCAR Reanalysis, JMA/CRIEPI JRA-25 Reanalysis, ECMWF ERA-40 Reanalysis, ECMWF ERA Interim Reanalysis, MRI/JMA APHRODITE's Water Resources Project Reanalysis, DWD Global Precipitation Climatology Centre's data, GMAO Modern Era-Retrospective analysis for Research and Applications, meteorological observational data for the territory of the former USSR for the 20th century, results of modeling by global and regional climatological models, and others. The system is already used in scientific research; in particular, it was recently applied to the analysis of climate change in Siberia and its regional impacts. The web-GIS information-computational system for geophysical data analysis provides specialists involved in multidisciplinary research projects with reliable and practical instruments for complex analysis of climate and ecosystem changes on global and regional scales. Even a user without specific technical knowledge can perform computational processing and visualization of large meteorological, climatological and satellite monitoring datasets through a unified web interface in a common graphical web browser. This work is partially supported by the Ministry of Education and Science of the Russian Federation (contract #07.514.114044), projects IV.31.1.5 and IV.31.2.7, RFBR grants #10-07-00547a and #11-05-01190a, and integrated project SB RAS #131.
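To give a feel for the kind of operation the computational kernel performs, here is a minimal Python analogue (not the system's GDL code) that extracts a georeferenced field from a NetCDF file and computes a time-mean map; the file name and variable names are hypothetical, reanalysis-style placeholders.

    # Illustrative sketch: read a (time, lat, lon) field from NetCDF and compute a mean map.
    from netCDF4 import Dataset
    import numpy as np

    with Dataset("air.mon.mean.nc") as ds:          # hypothetical dataset path
        air = ds.variables["air"][:]                # assumed variable name, shape (time, lat, lon)
        lat = ds.variables["lat"][:]
        lon = ds.variables["lon"][:]

    time_mean = np.asarray(air).mean(axis=0)        # climatological mean field
    print(time_mean.shape, lat.shape, lon.shape)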
The Open Access Model of Meteorologische Zeitschrift and other meteorological journals
NASA Astrophysics Data System (ADS)
Emeis, S.
2009-09-01
Today's availability and possibilities of the internet have already brought significant changes to the means of scientific communication. This also affects the publication and reception of peer-reviewed papers in scientific journals. In pre-internet times, the publication of scientific journals was mainly financed through subscription fees paid by libraries and other subscribers. The readers went to the libraries of their institution to search, read, and photocopy these papers. Today, everybody expects to have scientific papers more or less freely available on their desktop computers and from their printers. This has forced the publishers to change the financial model for the publication of scientific papers. An increasing number of journals now publish papers whose production costs have to be paid before publication by the author or the author's institution. Those "pre-paid" papers are then freely available from the internet. This publication model has become known as the "Open Access" (OA) model. The 126-year-old Meteorologische Zeitschrift has also changed its publication model, to an Optional Open Access model. The features of this model will be presented and compared to the OA models of other meteorological journals. This change in publication models, with its shift of payment from the end of the publication process (libraries and subscribers) to its beginning (authors), has also confronted scientific research and funding institutions with some problems. They must now change the structures by which they finance one of their major outputs, the publications of their researchers. A few aspects of the present state of this shift will be addressed.
Journal pricing issues: an economic perspective.
Hafner, A W; Podsadecki, T J; Whitely, W P
1990-01-01
Scientific journal prices have increased markedly in the past two decades, outpacing inflation by severalfold. Such increases challenge the librarian's ability to manage acquisitions resources effectively and threaten the mission of the health sciences library as a resource for present and future scientific information needs. Explanations for serial price increases vary with the point of view considered. Publishers, librarians, faculty, and consumers of scientific information perceive the situation differently. This paper provides an economic analysis of each group's views. Particular emphasis is given to the aspects of journal publishing and pricing that foster price increases. In addition, the paper examines the problems of dual-pricing structures and narrowly focused journals that cater to subspecialties of medicine. Suggested responses to subscription rate increases are offered to curtail further increases and to avoid the potential detrimental effects of reduced library collections. Since one of the underpinnings of education is threatened by reductions in library collections, actions must be taken by publishers, librarians, faculty, and professional associations to ameliorate the present situation and to limit additional increases in serial prices. PMID:2203496
OsiriX: an open-source software for navigating in multidimensional DICOM images.
Rosset, Antoine; Spadola, Luca; Ratib, Osman
2004-09-01
Multidimensional image navigation and display software was designed for the display and interpretation of large sets of multidimensional and multimodality images, such as combined PET-CT studies. The software is developed in Objective-C on a Macintosh platform under the MacOS X operating system using the GNUstep development environment. It also benefits from the extremely fast and optimized 3D graphic capabilities of the OpenGL graphics standard, which is widely used for computer games and optimized to take advantage of any available hardware graphics accelerator boards. In the design of the software, special attention was given to adapting the user interface to the specific and complex tasks of navigating through large sets of image data. An interactive jog-wheel device, widely used in the video and movie industry, was implemented to allow users to navigate in the different dimensions of an image set much faster than with a traditional mouse or with on-screen cursors and sliders. The program can easily be adapted for very specific tasks that require a limited number of functions, by adding and removing tools from the program's toolbar and avoiding an overwhelming number of unnecessary tools and functions. The processing and image rendering tools of the software are based on the open-source libraries ITK and VTK. This ensures that all new developments in image processing that could emerge from other academic institutions using these libraries can be directly ported to the OsiriX program. OsiriX is provided free of charge under the GNU open-source licensing agreement at http://homepage.mac.com/rossetantoine/osirix.
LIB LAB the Library Laboratory: hands-on multimedia science communication
NASA Astrophysics Data System (ADS)
Fillo, Aaron; Niemeyer, Kyle
2017-11-01
Teaching scientific research topics to K-12 audiences in an engaging and meaningful way does not need to be hard; with the right insight and techniques it can be fun to encourage self-guided STEAM (science, technology, engineering, arts, and mathematics) exploration. LIB LAB, short for Library Laboratory, is an educational video series produced by Aaron J. Fillo at Oregon State University in partnership with the Corvallis-Benton County Public Library targeted at K-12 students. Each episode explores a variety of scientific fundamentals with playful experiments and demonstrations. The video lessons are developed using evidence-based practices such as dispelling misconceptions, and language immersion. Each video includes directions for a related experiment that young viewers can conduct at home. In addition, science kits for these at-home experiments are distributed for free to students through the public library network in Benton County, Oregon. This talk will focus on the development of multimedia science education tools and several techniques that scientists can use to engage with a broad audience more effectively. Using examples from the LIB LAB YouTube Channel and collection of hands-on science demonstrations and take-home kits, this talk will present STEAM education in action. Corvallis-Benton County Public Library.
NASA Astrophysics Data System (ADS)
Kuipers, J.; Ueda, T.; Vermaseren, J. A. M.; Vollinga, J.
2013-05-01
We present version 4.0 of the symbolic manipulation system FORM. The most important new features are manipulation of rational polynomials and the factorization of expressions. Many other new functions and commands are also added; some of them are very general, while others are designed for building specific high level packages, such as one for Gröbner bases. Also new is the checkpoint facility, which allows for periodic backups during long calculations. Finally, FORM 4.0 has become available as open source under the GNU General Public License version 3. Program summary Program title: FORM. Catalogue identifier: AEOT_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOT_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License, version 3 No. of lines in distributed program, including test data, etc.: 151599 No. of bytes in distributed program, including test data, etc.: 1 078 748 Distribution format: tar.gz Programming language: The FORM language. FORM itself is programmed in a mixture of C and C++. Computer: All. Operating system: UNIX, LINUX, Mac OS, Windows. Classification: 5. Nature of problem: FORM defines a symbolic manipulation language in which the emphasis lies on fast processing of very large formulas. It has been used successfully for many calculations in Quantum Field Theory and mathematics. In speed and in the size of formulas that can be handled, it typically outperforms other systems by an order of magnitude. Special features in this version: Version 4.0 contains many new features. Most important are factorization and rational arithmetic. The program has also become open source under the GPL. Solution method: See "Nature of problem", above. Additional comments: NOTE: The code in CPC is for reference. You are encouraged to download the most recent sources from www.nikhef.nl/form/formcvs.php because of frequent bug fixes.
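FORM's own syntax is not reproduced here; as a loose analogy for the two headline features, rational polynomial arithmetic and factorization, the same operations look like this in Python's sympy (a deliberate substitution, not FORM code).

    # Analogy only: rational polynomial arithmetic and factorization in sympy.
    import sympy as sp

    x, y = sp.symbols("x y")

    expr = (x**4 - y**4) / (x - y)
    print(sp.cancel(expr))            # rational arithmetic -> x**3 + x**2*y + x*y**2 + y**3
    print(sp.factor(x**4 - y**4))     # factorization -> (x - y)*(x + y)*(x**2 + y**2)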
CINDOC, CSIC, and Spanish R and D
NASA Technical Reports Server (NTRS)
Delaviesca, Rosa
1994-01-01
The organizational structure and functional activities of the Spanish Center for Scientific Information and Documentation (CINDOC) are discussed. The library holds 8,500 journals, including all the Spanish scientific journals; 16,000 books and 20 CD-ROM data bases. CINDOC creates and distributes its own data bases that include all the articles published in Spanish scientific journals.
ERIC Educational Resources Information Center
Kubow, Stefan
The history of library science in Poland and a number of Polish research projects are reviewed in this paper. It is concluded that a considerable amount of research has been done on the history of libraries in Poland, but that this research is fragmented and separated by its focus on theory or methodology. The methodology of scientific research in…
Scientific Library’s Book and Media Swap Coming April 16 | Poster
By Robin Meckley, Contributing Writer The 14th annual Book and Media Swap will be held on Wednesday, April 16, from 10 a.m. to 2 p.m., in the lobby of the Conference Center in Building 549. The staff is holding the swap to coincide with National Library Week, an annual celebration of libraries that occurs in April. As of April 10, the library had collected nearly 2,000 books,
NASA Astrophysics Data System (ADS)
Moore, R.; Faerman, M.; Minster, J.; Day, S. M.; Ely, G.
2003-12-01
A community digital library provides support for ingestion, organization, description, preservation, and access of digital entities. The technologies that traditionally provide these capabilities are digital libraries (ingestion, organization, description), persistent archives (preservation) and data grids (access). We present a design for the SCEC community digital library that incorporates aspects of all three systems. Multiple groups have created integrated environments that sustain large-scale scientific data collections. By examining these projects, the following stages of implementation can be identified: definition of semantic terms to associate with relevant information, including definition of uniform content descriptors to describe physical quantities relevant to the scientific discipline and creation of concept spaces to define how the uniform content descriptors are logically related; organization of digital entities into logical collections that make it simple to browse and manage related material; definition of services that are used to access and manipulate material in the collection; and creation of a preservation environment for the long-term management of the collection. Each community is faced with heterogeneity that is introduced when data is distributed across multiple sites, when multiple sets of collection semantics are used, or when multiple scientific sub-disciplines are federated. We will present the relevant standards that simplify the implementation of the SCEC community library, the resource requirements for different types of data sets that drive the implementation, and the digital library processes that the SCEC community library will support. The SCEC community library can be viewed as the set of processing steps that are required to build the appropriate SCEC reference data sets (SCEC approved encoding format, SCEC approved descriptive metadata, SCEC approved collection organization, and SCEC managed storage location). Each digital entity that is ingested into the SCEC community library is processed and validated for conformance to SCEC standards. These steps generate provenance, descriptive, administrative, structural, and behavioral metadata. Using data grid technology, the descriptive metadata can be registered onto a logical name space that is controlled and managed by the SCEC digital library. A version of the SCEC community digital library is being implemented in the Storage Resource Broker (SRB). The SRB system provides almost all the features enumerated above. The peer-to-peer federation of metadata catalogs is planned for release in September 2003. The SRB system is in production use in multiple projects, from high-energy physics to astronomy, earth systems science, and bio-informatics. The SCEC community library will be based on the definition of standard metadata attributes, the creation of logical collections within the SRB, the creation of access services, and the demonstration of a preservation environment. The use of the SRB for the SCEC digital library will sustain the expected collection size and collection capabilities.
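The ingest-validate-register pattern described above can be pictured with a small sketch; this is illustrative Python only, not the Storage Resource Broker API, and the required metadata attribute names are hypothetical.

    # Illustrative sketch: validate required descriptive metadata before registering a
    # digital entity under a logical path in a collection namespace.
    REQUIRED = {"encoding_format", "creator", "simulation_code", "collection"}

    catalog = {}   # logical name space: logical path -> descriptive metadata

    def ingest(logical_path, metadata):
        missing = REQUIRED - metadata.keys()
        if missing:
            raise ValueError(f"entity rejected, missing attributes: {sorted(missing)}")
        catalog[logical_path] = dict(metadata)     # register descriptive metadata

    ingest("/SCEC/ruptures/run001", {               # hypothetical logical path and values
        "encoding_format": "netCDF",
        "creator": "example_user",
        "simulation_code": "wave_prop_1d",
        "collection": "rupture-models",
    })
    print(sorted(catalog))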
Buckets: A New Digital Library Technology for Preserving NASA Research.
ERIC Educational Resources Information Center
Nelson, Michael L.
2001-01-01
Discusses the need for preserving and disseminating scientific and technical information through digital libraries and describes buckets, an intelligent construct for publishing that contains data and metadata and methods for accessing them. Explains SODA (Smart Object, Dumb Archive) and discusses experiences using these technologies in NASA and…
Can We Future-Proof Library Automation?
ERIC Educational Resources Information Center
Breeding, Marshall
2010-01-01
It's an obvious observation that librarians today find themselves dealing with collections of ever larger proportions of electronic content. The degree to which that shift has already taken place varies from one type of library to another. Some organizations, especially those involved with specializations in biomedical, scientific, or business,…
Library-Labs-for-Science Literacy Courses.
ERIC Educational Resources Information Center
Pestel, Beverly C.; Engeldinger, Eugene A.
1992-01-01
Describes two library-lab exercises the authors have incorporated into their college chemistry course. The first exercise introduces students to scientific information and familiarizes them with the tools for accessing it. The second provides a framework for evaluating the reliability of that information and addresses the criteria that should be…
A Solution in Search of a Problem: Bibliometrics and Libraries.
ERIC Educational Resources Information Center
Wallace, Danny P.
1987-01-01
The literature of bibliometrics suggests that the results of bibliometric studies can be of practical use in libraries by providing a scientific basis for collection management decisions. Reports on the application of these studies are virtually nonexistent and a number of reasons for this are suggested. (EM)
"Library Quarterly," 1956-2004: An Exploratory Bibliometric Analysis
ERIC Educational Resources Information Center
Young, Arthur P.
2006-01-01
"Library Quarterly's" seventy-fifth anniversary invites an analysis of the journal's bibliometric dimension, including contributor attributes, various author rankings, and citation impact. Eugene Garfield's HistCite software, linked to Thomson Scientific's Web of Science, as made available by Garfield, for the period 1956-2004, was used as the…
ERIC Educational Resources Information Center
Rux, Paul
1993-01-01
Discussion of the application of TQM (Total Quality Management) to libraries addresses planning based on customer needs and use of the scientific method to evaluate customer satisfaction. A TQM experiment at Marquette Middle School (Wisconsin) is examined, and ways that TQM was used to meet the needs of homeless students are described. (MES)
ERIC Educational Resources Information Center
Cote, L. G.
A system in which the function of the library is to acquire, store, and organize materials is proposed which separates the reference function into a group of subject specialists backed up by computerized information retrieval systems. This division of labor is caused by the scientific community's need for access to graphic and other specific (not…
Adult Education in Museums and Public Libraries.
ERIC Educational Resources Information Center
Miller, Harry G.
Both museums and public libraries are available sources of education for adults. Besides their traditional functions of collecting and preserving items from human artistic or scientific history, museums have taken on a more active role in educating the public, particularly adults. Some educational services provided by museums are dioramas, period…
How People Use Books and Journals.
ERIC Educational Resources Information Center
Sabine, Gordon A.; Sabine, Patricia L.
1986-01-01
Reports results of a survey of 613 heavy users of scientific and technical books and journals in U.S. libraries. Information obtained through interviews includes sex and occupation of interviewees; the type of resource most recently used; why and how it was used; and the type of library used. (EM)
76 FR 79200 - National Cancer Institute; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-21
... proposed to be performed at NCI-Frederick. Place: NCI-Frederick Library and Conference Center, Building 549... Rosemont Ave. Note that the intended destination is the Conference Center/Scientific Library (Bldg. 549). A... Nos. 93.392, Cancer Construction; 93.393, Cancer Cause and Prevention Research; 93.394, Cancer...
Information sources in science and technology in Finland
NASA Technical Reports Server (NTRS)
Haarala, Arja-Riitta
1994-01-01
Finland poses some problems to be overcome in the field of scientific and technical information: a small user community which makes domestic systems costly; great distances within the country between users and suppliers of information; great distances to international data systems and large libraries abroad; and inadequate collections of scientific and technical information. The national bibliography Fennica includes all books and journals published in Finland. Data base services available in Finland include: reference data bases in science and technology; data banks for decision making such as statistical time series or legal proceedings; national bibliographies; and library catalogs.
The SCEC Broadband Platform: Open-Source Software for Strong Ground Motion Simulation and Validation
NASA Astrophysics Data System (ADS)
Goulet, C.; Silva, F.; Maechling, P. J.; Callaghan, S.; Jordan, T. H.
2015-12-01
The Southern California Earthquake Center (SCEC) Broadband Platform (BBP) is a carefully integrated collection of open-source scientific software programs that can simulate broadband (0-100Hz) ground motions for earthquakes at regional scales. The BBP scientific software modules implement kinematic rupture generation, low and high-frequency seismogram synthesis using wave propagation through 1D layered velocity structures, seismogram ground motion amplitude calculations, and goodness of fit measurements. These modules are integrated into a software system that provides user-defined, repeatable, calculation of ground motion seismograms, using multiple alternative ground motion simulation methods, and software utilities that can generate plots, charts, and maps. The BBP has been developed over the last five years in a collaborative scientific, engineering, and software development project involving geoscientists, earthquake engineers, graduate students, and SCEC scientific software developers. The BBP can run earthquake rupture and wave propagation modeling software to simulate ground motions for well-observed historical earthquakes and to quantify how well the simulated broadband seismograms match the observed seismograms. The BBP can also run simulations for hypothetical earthquakes. In this case, users input an earthquake location and magnitude description, a list of station locations, and a 1D velocity model for the region of interest, and the BBP software then calculates ground motions for the specified stations. The SCEC BBP software released in 2015 can be compiled and run on recent Linux systems with GNU compilers. It includes 5 simulation methods, 7 simulation regions covering California, Japan, and Eastern North America, the ability to compare simulation results against GMPEs, updated ground motion simulation methods, and a simplified command line user interface.
Selected Mechanized Scientific and Technical Information Systems.
ERIC Educational Resources Information Center
Ackerman, Lynn, Ed.; And Others
The publication describes the following thirteen computer-based, operational systems designed primarily for the announcement, storage, retrieval and secondary distribution of scientific and technical reports: Defense Documentation Center; Highway Research Board; National Aeronautics and Space Administration; National Library of Medicine; U.S.…
2011-05-01
iTunes illustrate the difference between the centralized approach of digital library systems and the distributed approach of container file formats...metadata in a container file format. Apple’s iTunes uses a centralized metadata approach and allows users to maintain song metadata in a single...one iTunes library to another the metadata must be copied separately or reentered in the new library. This demonstrates the utility of storing metadata
NASA Astrophysics Data System (ADS)
Ridgeway, William K.; Millar, David P.; Williamson, James R.
2013-04-01
Fluorescence Correlation Spectroscopy (FCS) is widely used to quantify reaction rates and concentrations of molecules in vitro and in vivo. We recently reported Fluorescence Triple Correlation Spectroscopy (F3CS), which correlates three signals together instead of two. F3CS can analyze the stoichiometries of complex mixtures and detect irreversible processes by identifying time-reversal asymmetries. Here we report the computational developments that were required for the realization of F3CS and present the results as the Triple Correlation Toolbox suite of programs. Triple Correlation Toolbox is a complete data analysis pipeline capable of acquiring, correlating and fitting large data sets. Each segment of the pipeline handles error estimates for accurate error-weighted global fitting. Data acquisition was accelerated with a combination of off-the-shelf counter-timer chips and vectorized operations on 128-bit registers. This allows desktop computers with inexpensive data acquisition cards to acquire hours of multiple-channel data with sub-microsecond time resolution. Off-line correlation integrals were implemented as a two delay time multiple-tau scheme that scales efficiently with multiple processors and provides an unprecedented view of linked dynamics. Global fitting routines are provided to fit FCS and F3CS data to models containing up to ten species. Triple Correlation Toolbox is a complete package that enables F3CS to be performed on existing microscopes. Catalogue identifier: AEOP_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEOP_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 50189 No. of bytes in distributed program, including test data, etc.: 6135283 Distribution format: tar.gz Programming language: C/Assembly. Computer: Any with GCC and library support. Operating system: Linux and OS X (data acq. for Linux only due to library availability), not tested on Windows. RAM: ≥512 MB. Classification: 16.4. External routines: NIDAQmx (National Instruments), Gnu Scientific Library, GTK+, PLplot (optional) Nature of problem: Fluorescence Triple Correlation Spectroscopy required three things: data acquisition at faster speeds than were possible without expensive custom hardware, triple-correlation routines that could process 1/2 TB data sets rapidly, and fitting routines capable of handling several to a hundred fit parameters and 14,000 + data points, each with error estimates. Solution method: A novel data acquisition concept mixed signal processing with off-the-shelf hardware and data-parallel processing using 128-bit registers found in desktop CPUs. Correlation algorithms used fractal data structures and multithreading to reduce data analysis times. Global fitting was implemented with robust minimization routines and provides feedback that allows the user to critically inspect initial guesses and fits. Restrictions: Data acquisition only requires a National Instruments data acquisition card (it was tested on Linux using card PCIe-6251) and a simple home-built circuit. Unusual features: Hand-coded ×86-64 assembly for data acquisition loops (platform-independent C code also provided). Additional comments: A complete collection of tools to perform Fluorescence Triple Correlation Spectroscopy-from data acquisition to two-tau correlation of large data sets, to model fitting. 
Running time: 1-5 h of data analysis per hour of data collected. Varies depending on data-acquisition length, time resolution, data density and number of cores used for correlation integrals.
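For orientation, the triple correlation of a single fluctuating signal can be written down directly; the sketch below is a naive, brute-force Python/NumPy version for small lag ranges (synthetic data, no multiple-tau binning, not the Triple Correlation Toolbox code), just to make the quantity being computed concrete.

    # Illustrative sketch: unnormalized triple correlation G3(tau1, tau2) = <dI(t) dI(t+tau1) dI(t+tau2)>
    # for one channel, computed by direct summation over a synthetic intensity trace.
    import numpy as np

    rng = np.random.default_rng(1)
    intensity = rng.poisson(5.0, size=20000).astype(float)   # hypothetical photon-count trace
    d = intensity - intensity.mean()                          # fluctuations dI(t)

    max_tau = 8
    n = len(d) - 2 * max_tau
    G3 = np.empty((max_tau, max_tau))
    for t1 in range(max_tau):
        for t2 in range(max_tau):
            G3[t1, t2] = np.mean(d[:n] * d[t1:t1 + n] * d[t2:t2 + n])

    G3 /= intensity.mean() ** 3                               # conventional normalization
    print(G3[0, 0], G3.shape)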
PyMICE: A Python library for analysis of IntelliCage data.
Dzik, Jakub M; Puścian, Alicja; Mijakowska, Zofia; Radwanska, Kasia; Łęski, Szymon
2018-04-01
IntelliCage is an automated system for recording the behavior of a group of mice housed together. It produces rich, detailed behavioral data calling for new methods and software for their analysis. Here we present PyMICE, a free and open-source library for analysis of IntelliCage data in the Python programming language. We describe the design and demonstrate the use of the library through a series of examples. PyMICE provides easy and intuitive access to IntelliCage data, and thus facilitates the use of numerous other Python scientific libraries to form a complete data analysis workflow.
CADNA_C: A version of CADNA for use with C or C++ programs
NASA Astrophysics Data System (ADS)
Lamotte, Jean-Luc; Chesneaux, Jean-Marie; Jézéquel, Fabienne
2010-11-01
The CADNA library enables one to estimate round-off error propagation using a probabilistic approach. The CADNA_C version enables this estimation in C or C++ programs, while the previous version had been developed for Fortran programs. The CADNA_C version has the same features as the previous one: with CADNA the numerical quality of any simulation program can be controlled. Furthermore by detecting all the instabilities which may occur at run time, a numerical debugging of the user code can be performed. CADNA provides new numerical types on which round-off errors can be estimated. Slight modifications are required to control a code with CADNA, mainly changes in variable declarations, input and output. New version program summaryProgram title: CADNA_C Catalogue identifier: AEGQ_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEGQ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 60 075 No. of bytes in distributed program, including test data, etc.: 710 781 Distribution format: tar.gz Programming language: C++ Computer: PC running LINUX with an i686 or an ia64 processor, UNIX workstations including SUN, IBM Operating system: LINUX, UNIX Classification: 6.5 Catalogue identifier of previous version: AEAT_v1_0 Journal reference of previous version: Comput. Phys. Comm. 178 (2008) 933 Does the new version supersede the previous version?: No Nature of problem: A simulation program which uses floating-point arithmetic generates round-off errors, due to the rounding performed at each assignment and at each arithmetic operation. Round-off error propagation may invalidate the result of a program. The CADNA library enables one to estimate round-off error propagation in any simulation program and to detect all numerical instabilities that may occur at run time. Solution method: The CADNA library [1-3] implements Discrete Stochastic Arithmetic [4,5] which is based on a probabilistic model of round-off errors. The program is run several times with a random rounding mode generating different results each time. From this set of results, CADNA estimates the number of exact significant digits in the result that would have been computed with standard floating-point arithmetic. Reasons for new version: The previous version (AEAT_v1_0) enables the estimation of round-off error propagation in Fortran programs [2]. The new version has been developed to enable this estimation in C or C++ programs. Summary of revisions: The CADNA_C source code consists of one assembly language file (cadna_rounding.s) and twenty-three C++ language files (including three header files). cadna_rounding.s is a symbolic link to the assembly file corresponding to the processor and the C++ compiler used. This assembly file contains routines which are frequently called in the CADNA_C C++ files to change the rounding mode. The C++ language files contain the definition of the stochastic types on which the control of accuracy can be performed, CADNA_C specific functions (for instance to enable or disable the detection of numerical instabilities), the definition of arithmetic and relational operators which are overloaded for stochastic variables and the definition of mathematical functions which can be used with stochastic arguments. 
As a remark, on 64-bit processors the mathematical library associated with the GNU C++ compiler may provide incorrect results or generate severe bugs with rounding towards -∞ and +∞, on which the random rounding mode is based. Therefore, if CADNA_C is used on a 64-bit processor with the GNU C++ compiler, mathematical functions are computed with rounding to the nearest; otherwise they are computed with the random rounding mode. It must be pointed out that the knowledge of the accuracy of the argument of a mathematical function is never lost. Additional comments: In the library archive, users are advised to read the INSTALL file first. The doc directory contains a user guide named ug.cadna.pdf and a reference guide named ref_cadna.pdf. The user guide shows how to control the numerical accuracy of a program using CADNA, provides installation instructions and describes test runs. The reference guide briefly describes each function of the library. The source code (which consists of C++ and assembly files) is located in the src directory. The examples directory contains seven test runs which illustrate the use of the CADNA library and the benefits of Discrete Stochastic Arithmetic. Running time: The version of a code which uses CADNA runs at least three times slower than its floating-point version. This cost depends on the computer architecture and can be higher if the detection of numerical instabilities is enabled. In this case, the cost may be related to the number of instabilities detected.
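The probabilistic principle behind Discrete Stochastic Arithmetic can be caricatured in a few lines: run the same computation several times while randomly perturbing each operation at the level of the rounding error, then estimate how many significant digits the samples share. The sketch below is a deliberately simplified Python illustration of that idea, not the CADNA library or its rounding-mode mechanism.

    # Simplified illustration of the probabilistic approach (not CADNA itself).
    import math
    import random

    def perturb(x, eps=2.0 ** -52):
        # Mimic a random rounding direction with a random relative error of order eps.
        return x * (1.0 + random.uniform(-eps, eps))

    def unstable_sum():
        # A cancellation-prone computation: (1e16 + 1.0) - 1e16 performed "stochastically".
        a = perturb(1.0e16 + 1.0)
        return perturb(a - 1.0e16)

    samples = [unstable_sum() for _ in range(3)]
    mean = sum(samples) / len(samples)
    std = math.sqrt(sum((s - mean) ** 2 for s in samples) / (len(samples) - 1))

    if std == 0.0:
        digits = 15.0      # all samples agree: essentially full double precision
    elif mean == 0.0:
        digits = 0.0
    else:
        digits = max(0.0, math.log10(abs(mean) / std))

    print(samples, "estimated significant digits ~", round(digits, 1))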
Determination of the Conservation Time of Periodicals for Optimal Shelf Maintenance of a Library.
ERIC Educational Resources Information Center
Miyamoto, Sadaaki; Nakayama, Kazuhiko
1981-01-01
Presents a method based on a constrained optimization technique that determines the time of removal of scientific periodicals from the shelf of a library. A geometrical interpretation of the theoretical result is given, and a numerical example illustrates how the technique is applicable to real bibliographic data. (FM)
Poster Sessions in Library Science. Guidelines. IFLA Professional Reports, No. 3.
ERIC Educational Resources Information Center
Schmidmaier, Dieter
This text is based on a comprehensive study of the literature on posters and poster sessions, a conference held by the International Association of Technological University Libraries (IATUL) on the problems of scientific communication, and on the author's experiences in dealing with posters and poster sessions. A conference poster is described as…
ERIC Educational Resources Information Center
Cotter, Gladys A.; And Others
The Defense Department Scientific and Technical Information (STI) network is composed of over 200 technical libraries and information centers tied together by the Defense Technical Information Center (DTIC), an organization which seeks to improve the flow of information throughout the STI network by promoting shared cataloging and integrated…
Zoo and Wildlife Libraries: An International Survey
ERIC Educational Resources Information Center
Coates, Linda L.; Tierney, Kaitlyn Rose
2010-01-01
The conservation and well-being of exotic animals is core to the mission of zoos, aquariums and many small nonprofit wildlife groups. Increasingly, these organizations are committed to scientific research, both basic and applied. To ascertain the current state of the libraries that support their efforts, librarians at the San Diego Zoo conducted…
RAS Corner at the ATRF Library Keeps You Up-to-Date on the Research | Poster
By Robin Meckley, Contributing Writer The new RAS initiative recently undertaken at the Frederick National Laboratory for Cancer Research has prompted the Scientific Library to provide support in a creative way to the laboratories at the Advanced Technology Research Facility (ATRF), where the research is centered.
The persistence of error: a study of retracted articles on the Internet and in personal libraries*
Davis, Philip M.
2012-01-01
Objective: To determine the accessibility of retracted articles residing on non-publisher websites and in personal libraries. Methods: Searches were performed to locate Internet copies of 1,779 retracted articles identified in MEDLINE, published between 1973 and 2010, excluding the publishers' websites. Found copies were classified by article version and location. Mendeley (a bibliographic software tool) was searched for copies residing in personal libraries. Results: Non-publisher websites provided 321 publicly accessible copies for 289 retracted articles: 304 (95%) copies were the publishers' versions, and 13 (4%) were final manuscripts. PubMed Central had 138 (43%) copies; educational websites 94 (29%); commercial websites 24 (7%); advocacy websites 16 (5%); and institutional repositories 10 (3%). Just 15 (5%) full-article views included a retraction statement. Personal Mendeley libraries contained records for 1,340 (75%) retracted articles, shared by 3.4 users, on average. Conclusions: The benefits of decentralized access to scientific articles may come with the cost of promoting incorrect, invalid, or untrustworthy science. Automated methods to deliver status updates to readers may reduce the persistence of error in the scientific literature. PMID:22879807
The persistence of error: a study of retracted articles on the Internet and in personal libraries.
Davis, Philip M
2012-07-01
To determine the accessibility of retracted articles residing on non-publisher websites and in personal libraries. Searches were performed to locate Internet copies of 1,779 retracted articles identified in MEDLINE, published between 1973 and 2010, excluding the publishers' websites. Found copies were classified by article version and location. Mendeley (a bibliographic software tool) was searched for copies residing in personal libraries. Non-publisher websites provided 321 publicly accessible copies for 289 retracted articles: 304 (95%) copies were the publishers' versions, and 13 (4%) were final manuscripts. PubMed Central had 138 (43%) copies; educational websites 94 (29%); commercial websites 24 (7%); advocacy websites 16 (5%); and institutional repositories 10 (3%). Just 16 [corrected] (5%) full-article views included a retraction statement. Personal Mendeley libraries contained records for 1,340 (75%) retracted articles, shared by 3.4 users, on average. The benefits of decentralized access to scientific articles may come with the cost of promoting incorrect, invalid, or untrustworthy science. Automated methods to deliver status updates to readers may reduce the persistence of error in the scientific literature.
NASA Astrophysics Data System (ADS)
Beck, P. G.; Zotti, G.
2012-05-01
Melk Abbey, a marvel of European high baroque architecture, is one of the most frequently visited tourist attractions in Austria, attracting 450 000 visitors each year. The monastery's museum presents selected aspects of Benedictine life in Melk since the monastery's foundation in 1089. After the church, the library is the second-most important room in a Benedictine monastery. Due to the wide scientific interests and contacts of the medieval monks, these libraries also contain manuscripts on mathematics, physics and astronomy. In 2009, the International Year of Astronomy (IYA2009), the annual library exhibition was fully dedicated to astronomical manuscripts and early prints from the past 1000 years. Following earlier research work on astronomical manuscripts in Melk's library, we were invited to organise the exhibition. In addition, we also presented a lecture series and provided more background in an accompanying book. Because of positive feedback from the visitors, the exhibition was extended until March 2011. In the two years of its duration, the exhibition was seen by more than 900 000 visitors. In this article, we describe the background to the scientific project, how the exhibition was organised and lessons learned from this project.
48 CFR 1435.010 - Scientific and technical reports.
Code of Federal Regulations, 2010 CFR
2010-10-01
... SPECIAL CATEGORIES OF CONTRACTING RESEARCH AND DEVELOPMENT CONTRACTING 1435.010 Scientific and technical reports. If a Research and Development (R&D) contract results involve classified or national security... available. Copies of publications and reports are also required to be sent to the DOI Departmental Library...
48 CFR 1435.010 - Scientific and technical reports.
Code of Federal Regulations, 2013 CFR
2013-10-01
... SPECIAL CATEGORIES OF CONTRACTING RESEARCH AND DEVELOPMENT CONTRACTING 1435.010 Scientific and technical reports. If a Research and Development (R&D) contract results involve classified or national security... available. Copies of publications and reports are also required to be sent to the DOI Departmental Library...
48 CFR 1435.010 - Scientific and technical reports.
Code of Federal Regulations, 2011 CFR
2011-10-01
... SPECIAL CATEGORIES OF CONTRACTING RESEARCH AND DEVELOPMENT CONTRACTING 1435.010 Scientific and technical reports. If a Research and Development (R&D) contract results involve classified or national security... available. Copies of publications and reports are also required to be sent to the DOI Departmental Library...
48 CFR 1435.010 - Scientific and technical reports.
Code of Federal Regulations, 2012 CFR
2012-10-01
... SPECIAL CATEGORIES OF CONTRACTING RESEARCH AND DEVELOPMENT CONTRACTING 1435.010 Scientific and technical reports. If a Research and Development (R&D) contract results involve classified or national security... available. Copies of publications and reports are also required to be sent to the DOI Departmental Library...
48 CFR 1435.010 - Scientific and technical reports.
Code of Federal Regulations, 2014 CFR
2014-10-01
... SPECIAL CATEGORIES OF CONTRACTING RESEARCH AND DEVELOPMENT CONTRACTING 1435.010 Scientific and technical reports. If a Research and Development (R&D) contract results involve classified or national security... available. Copies of publications and reports are also required to be sent to the DOI Departmental Library...
xSDK Foundations: Toward an Extreme-scale Scientific Software Development Kit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heroux, Michael A.; Bartlett, Roscoe; Demeshko, Irina
Here, extreme-scale computational science increasingly demands multiscale and multiphysics formulations. Combining software developed by independent groups is imperative: no single team has resources for all predictive science and decision support capabilities. Scientific libraries provide high-quality, reusable software components for constructing applications with improved robustness and portability. However, without coordination, many libraries cannot be easily composed. Namespace collisions, inconsistent arguments, lack of third-party software versioning, and additional difficulties make composition costly. The Extreme-scale Scientific Software Development Kit (xSDK) defines community policies to improve code quality and compatibility across independently developed packages (hypre, PETSc, SuperLU, Trilinos, and Alquimia) and provides a foundation for addressing broader issues in software interoperability, performance portability, and sustainability. The xSDK provides turnkey installation of member software and seamless combination of aggregate capabilities, and it marks first steps toward extreme-scale scientific software ecosystems from which future applications can be composed rapidly with assured quality and scalability.
xSDK Foundations: Toward an Extreme-scale Scientific Software Development Kit
Heroux, Michael A.; Bartlett, Roscoe; Demeshko, Irina; ...
2017-03-01
Here, extreme-scale computational science increasingly demands multiscale and multiphysics formulations. Combining software developed by independent groups is imperative: no single team has resources for all predictive science and decision support capabilities. Scientific libraries provide high-quality, reusable software components for constructing applications with improved robustness and portability. However, without coordination, many libraries cannot be easily composed. Namespace collisions, inconsistent arguments, lack of third-party software versioning, and additional difficulties make composition costly. The Extreme-scale Scientific Software Development Kit (xSDK) defines community policies to improve code quality and compatibility across independently developed packages (hypre, PETSc, SuperLU, Trilinos, and Alquimia) and provides a foundation for addressing broader issues in software interoperability, performance portability, and sustainability. The xSDK provides turnkey installation of member software and seamless combination of aggregate capabilities, and it marks first steps toward extreme-scale scientific software ecosystems from which future applications can be composed rapidly with assured quality and scalability.
Barty, Anton; Kirian, Richard A.; Maia, Filipe R. N. C.; Hantke, Max; Yoon, Chun Hong; White, Thomas A.; Chapman, Henry
2014-01-01
The emerging technique of serial X-ray diffraction, in which diffraction data are collected from samples flowing across a pulsed X-ray source at repetition rates of 100 Hz or higher, has necessitated the development of new software in order to handle the large data volumes produced. Sorting of data according to different criteria and rapid filtering of events to retain only diffraction patterns of interest results in significant reductions in data volume, thereby simplifying subsequent data analysis and management tasks. Meanwhile the generation of reduced data in the form of virtual powder patterns, radial stacks, histograms and other meta data creates data set summaries for analysis and overall experiment evaluation. Rapid data reduction early in the analysis pipeline is proving to be an essential first step in serial imaging experiments, prompting the authors to make the tool described in this article available to the general community. Originally developed for experiments at X-ray free-electron lasers, the software is based on a modular facility-independent library to promote portability between different experiments and is available under version 3 or later of the GNU General Public License. PMID:24904246
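Two of the reduction steps mentioned above, hit filtering and the accumulation of a virtual powder pattern with a radial average, are easy to picture; the following Python/NumPy sketch uses synthetic detector frames and invented thresholds, and is not the published software.

    # Illustrative sketch: keep likely "hits", then build a virtual powder pattern and its radial average.
    import numpy as np

    rng = np.random.default_rng(2)
    frames = rng.poisson(0.2, size=(50, 128, 128)).astype(float)   # hypothetical detector frames
    frames[::5, 60:68, 60:68] += 50.0                              # fake Bragg-like signal in every 5th frame

    # Hit finding: keep frames whose count of bright pixels exceeds a threshold.
    hits = frames[(frames > 20.0).sum(axis=(1, 2)) > 10]

    # Virtual powder pattern: per-pixel sum over all retained hits.
    powder = hits.sum(axis=0)

    # Radial average of the powder pattern about the detector centre.
    y, x = np.indices(powder.shape)
    r = np.hypot(x - 63.5, y - 63.5).astype(int)
    radial = np.bincount(r.ravel(), weights=powder.ravel()) / np.bincount(r.ravel())
    print(len(hits), powder.shape, radial[:5])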
PsyToolkit: a software package for programming psychological experiments using Linux.
Stoet, Gijsbert
2010-11-01
PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the Gnu Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software easily integrates with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages.
NASA Technical Reports Server (NTRS)
Ramachandran, Ganesh K.; Akopian, David; Heckler, Gregory W.; Winternitz, Luke B.
2011-01-01
Location technologies have many applications in wireless communications, military and space missions, etc. The US Global Positioning System (GPS) and other existing and emerging Global Navigation Satellite Systems (GNSS) are expected to provide accurate location information to enable such applications. While GNSS systems perform very well in strong signal conditions, their operation in many urban, indoor, and space applications is not robust or even possible due to weak signals and strong distortions. The search for less costly, faster and more sensitive receivers is still in progress. As the research community addresses more and more complicated phenomena, there is a demand for flexible multimode reference receivers, associated SDKs, and development platforms which may accelerate and facilitate the research. One such concept is the software GPS/GNSS receiver (GPS SDR), which permits easy access to algorithmic libraries and the possibility of integrating more advanced algorithms without hardware changes or essential software updates. The GNU-SDR and GPS-SDR open source receiver platforms are two popular examples. This paper evaluates the performance of recently proposed block-correlator techniques for acquisition and tracking of GPS signals using the open source GPS-SDR platform.
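The core of block-correlator acquisition is a circular correlation between the received samples and a local code replica, usually computed with FFTs; the sketch below illustrates that step in Python/NumPy with a random +/-1 sequence standing in for a real C/A code (it is not the GPS-SDR implementation).

    # Illustrative sketch: FFT-based circular correlation to recover the code phase.
    import numpy as np

    rng = np.random.default_rng(3)
    code = rng.choice([-1.0, 1.0], size=1023)          # stand-in PRN code (1 sample per chip)
    true_shift = 417                                   # hypothetical code phase
    received = np.roll(code, true_shift) + 0.5 * rng.standard_normal(1023)   # delayed code + noise

    # corr = IFFT( FFT(received) * conj(FFT(code)) ); the peak lag is the code phase estimate.
    corr = np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(code)))
    estimated_shift = int(np.argmax(np.abs(corr)))
    print(estimated_shift, true_shift)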
Siamaki, Saba; Geraei, Ehsan; Zare- Farashbandi, Firoozeh
2014-01-01
Background: Scientific collaboration is among the most important subjects in scientometrics, and many studies have investigated this concept to this day. The goal of the current study is investigation of scientific collaboration and co-authorship patterns of researchers in the field of library and information science in Iran between years 2005 and 2009. Materials and Methods: The current study uses scientometrics method. The statistical population consists of 942 documents published in Iranian library and information science journals between years 2005 and 2009. Collaboration coefficient, collaboration index (CI), and degree of collaboration (DC) were used for data analysis. Findings: The findings showed that among 942 investigated documents, 506 documents (53.70%) was created by one individual researcher and 436 documents (46.30%) were the result of collaboration between two or more researchers. Also, the highest rank of different authorship patterns belonged to National Journal of Librarianship and Information Organization (code H). Conclusion: The average collaboration coefficient for the library and information science researchers in the investigated time frame was 0.23. The closer this coefficient is to 1, the higher is the level of collaboration between authors, and a coefficient near zero shows a tendency to prefer individual articles. The highest collaboration index with an average of 1.92 authors per paper was seen in year 1388. The five year collaboration index in library and information science in Iran was 1.58, and the average degree of collaboration between researchers in the investigated papers was 0.46, which shows that library and information science researchers have a tendency for co-authorship. However, the co-authorship had increased in recent years reaching its highest number in year 1388. The researchers’ collaboration coefficient also shows relative increase between years 1384 and 1388. National Journal of Librarianship and Information Organization has the highest rank among all the investigated journals based on collaboration coefficient, collaboration index (CI), and degree of collaboration (DC). PMID:25250365
Siamaki, Saba; Geraei, Ehsan; Zare-Farashbandi, Firoozeh
2014-01-01
Scientific collaboration is among the most important subjects in scientometrics, and many studies have investigated this concept to this day. The goal of the current study is investigation of scientific collaboration and co-authorship patterns of researchers in the field of library and information science in Iran between years 2005 and 2009. The current study uses scientometrics method. The statistical population consists of 942 documents published in Iranian library and information science journals between years 2005 and 2009. Collaboration coefficient, collaboration index (CI), and degree of collaboration (DC) were used for data analysis. The findings showed that among 942 investigated documents, 506 documents (53.70%) was created by one individual researcher and 436 documents (46.30%) were the result of collaboration between two or more researchers. Also, the highest rank of different authorship patterns belonged to National Journal of Librarianship and Information Organization (code H). The average collaboration coefficient for the library and information science researchers in the investigated time frame was 0.23. The closer this coefficient is to 1, the higher is the level of collaboration between authors, and a coefficient near zero shows a tendency to prefer individual articles. The highest collaboration index with an average of 1.92 authors per paper was seen in year 1388. The five year collaboration index in library and information science in Iran was 1.58, and the average degree of collaboration between researchers in the investigated papers was 0.46, which shows that library and information science researchers have a tendency for co-authorship. However, the co-authorship had increased in recent years reaching its highest number in year 1388. The researchers' collaboration coefficient also shows relative increase between years 1384 and 1388. National Journal of Librarianship and Information Organization has the highest rank among all the investigated journals based on collaboration coefficient, collaboration index (CI), and degree of collaboration (DC).
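The three indicators used in the study follow standard scientometric definitions: collaboration index CI as mean authors per paper, degree of collaboration DC as the multi-authored share, and the collaborative coefficient CC = 1 - (1/N) * sum_j f_j / j (Ajiferuke et al.). The Python sketch below recomputes them from the abstract's own totals (942 papers, 506 single-authored); the split of the multi-authored papers by author count is assumed, so CI and CC will not exactly match the reported values, but DC reproduces the stated 0.46.

    # Worked check of CI, DC, CC; only the 942/506 totals come from the abstract,
    # the 2-, 3-, and 4-author counts are an assumed split for illustration.
    author_counts = [1] * 506 + [2] * 300 + [3] * 100 + [4] * 36   # 942 papers in total

    N = len(author_counts)
    CI = sum(author_counts) / N                          # mean authors per paper
    DC = sum(1 for a in author_counts if a > 1) / N      # share of multi-authored papers
    CC = 1 - sum(1 / a for a in author_counts) / N       # collaborative coefficient

    print(f"N={N}  CI={CI:.2f}  DC={DC:.2f}  CC={CC:.2f}")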
Vascular knowledge in medieval times was the turning point for the humanistic trend.
Ducasse, E; Speziale, F; Baste, J C; Midy, D
2006-06-01
Knowledge of the history of our surgical specialty may broaden our viewpoint for everyday practice. We illustrate the scientific progress made in medieval times relevant to the vascular system and blood circulation, progress made despite prevailing religious and philosophical dogma. We located all articles concerning vascular knowledge and historical reviews in databases such as MEDLINE, EMBASE and the database of abstracts of reviews (DARE). We also explored the database of the register from the French National Library, the French Medical Inter-University (BIUM), the Italian National Library and the French and Italian Libraries in the Vatican. All data were collected and analysed in chronological order. Medieval vascular knowledge was inherited from Greek via Byzantine and Arabic writings, the first controversies against the recognized vascular schema emanating from an Arabian physician in the 13th century. Dissection was forbidden and clerical rules instilled a fear of blood. Major contributions to scientific progress in the vascular field in medieval times came from Ibn-al-Nafis and Harvey. Vascular specialists today may feel proud to recall that once religious dogma declined in early medieval times, vascular anatomic and physiological discoveries led the way to scientific progress.
1992-11-01
… field of inquiry. Good solid research advances the state of the art by contributing to the … fault library science and information science for not ask- … further research and application. Ennis (1967) commented that library science research is … problems faced each day by practitioners. Robert Smith … under the title "On Information Science." Keren raises interest- … ogy, and technical communications. Library science and information science have been …
Oldfield, Ron A.; Sjaardema, Gregory D.; Lofstead II, Gerald F.; ...
2012-01-01
Trilinos I/O Support (Trios) is a new capability area in Trilinos that serves two important roles: (1) it provides and supports I/O libraries used by in-production scientific codes; (2) it provides a research vehicle for the evaluation and distribution of new techniques to improve I/O on advanced platforms. This paper provides a brief overview of the production-grade I/O libraries in Trios as well as some of the ongoing research efforts that contribute to the experimental libraries in Trios.
ERIC Educational Resources Information Center
Relyea, Harold C.; Halchin, L. Elaine; Hogue, Henry B.; Agnew, Grace; Martin, Mairead; Schottlaender, Brian E. C.; Jackson, Mary E.
2003-01-01
These five reports address five special issues: the effects of the September 11 attacks on information management, including homeland security, Web site information removal, scientific and technical information, and privacy concerns; federal policy for electronic government information; digital rights management and libraries; library Web portal…
SEGY to ASCII: Conversion and Plotting Program
Goldman, Mark R.
1999-01-01
This report documents a computer program to convert standard 4-byte, IBM floating-point SEGY files to ASCII xyz format. The program then optionally plots the seismic data using the GMT plotting package. The material for this publication is contained in a standard tar file (of99-126.tar) that is uncompressed and 726 K in size. It can be downloaded to any Unix machine. Move the tar file to the directory you wish to use it in, then type 'tar xvf of99-126.tar'. The archive files (and diskette) contain a NOTE file, a README file, a version-history file, source code, a makefile for easy compilation, and an ASCII version of the documentation. The archive files (and diskette) also contain example test files, including a typical SEGY file along with the resulting ASCII xyz and postscript files. Compiling the source code into an executable requires a C++ compiler. The program has been successfully compiled using Gnu's g++ version 2.8.1; use of other compilers may require modifications to the existing source code. The g++ compiler is a free, high-quality C++ compiler and may be downloaded from the ftp site: ftp://ftp.gnu.org/gnu Plotting the seismic data requires the GMT plotting package. The GMT plotting package may be downloaded from the web site: http://www.soest.hawaii.edu/gmt/
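The core of such a conversion is decoding IBM System/360 single-precision floats, which use a sign bit, a 7-bit excess-64 base-16 exponent, and a 24-bit fraction rather than the IEEE 754 layout. The sketch below is not Goldman's program, only a minimal illustration of that decoding step; the function name is hypothetical.

```python
import struct

def ibm32_to_float(word: bytes) -> float:
    """Decode one 4-byte, big-endian IBM System/360 float.

    Layout: 1 sign bit, 7-bit excess-64 exponent (base 16),
    24-bit fraction interpreted as fraction / 2**24.
    """
    (u,) = struct.unpack(">I", word)
    sign = -1.0 if u & 0x80000000 else 1.0
    exponent = (u >> 24) & 0x7F
    fraction = (u & 0x00FFFFFF) / float(1 << 24)
    return sign * fraction * 16.0 ** (exponent - 64)

# 0x42640000 encodes +100.0: exponent 0x42 -> 16**2, fraction 0x640000/2**24 = 0.390625
print(ibm32_to_float(bytes.fromhex("42640000")))   # -> 100.0
```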
MyMolDB: a micromolecular database solution with open source and free components.
Xia, Bing; Tai, Zheng-Fu; Gu, Yu-Cheng; Li, Bang-Jing; Ding, Li-Sheng; Zhou, Yan
2011-10-01
Managing chemical structures is one of the important daily tasks in small laboratories. Few solutions are available on the internet, and most of them are closed-source applications; the open-source applications typically have limited capability and only basic cheminformatics functionality. In this article, we describe an open-source solution to manage chemicals in research groups, based on open-source and free components. It has a user-friendly interface with functions for chemical handling and intensive searching. MyMolDB is a micromolecular database solution that supports exact, substructure, similarity, and combined searching. This solution is mainly implemented in the scripting language Python, with a web-based interface for compound management and searching. Almost all the searches are in essence done with pure SQL on the database, exploiting the high performance of the database engine. Thus, impressive searching speed has been achieved on large data sets, because no external Central Processing Unit (CPU)-consuming languages are involved in the key steps of the search. MyMolDB is open-source software and can be modified and/or redistributed under the GNU General Public License version 3 published by the Free Software Foundation (Free Software Foundation Inc. The GNU General Public License, Version 3, 2007. Available at: http://www.gnu.org/licenses/gpl.html). The software itself can be found at http://code.google.com/p/mymoldb/. Copyright © 2011 Wiley Periodicals, Inc.
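MyMolDB performs its searching in SQL inside the database engine; purely to illustrate the underlying idea of fingerprint-based similarity searching, here is a hedged sketch using RDKit (an assumption for the example, not a stated MyMolDB component) with a Tanimoto score over Morgan fingerprints.

```python
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

# Tiny in-memory "database" of SMILES strings (illustrative data only).
db = {
    "aspirin": "CC(=O)Oc1ccccc1C(=O)O",
    "salicylic acid": "O=C(O)c1ccccc1O",
    "caffeine": "Cn1cnc2c1c(=O)n(C)c(=O)n2C",
}

def fingerprint(smiles):
    """Morgan (circular) fingerprint as a 2048-bit vector."""
    mol = Chem.MolFromSmiles(smiles)
    return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

query = fingerprint("CC(=O)Oc1ccccc1C(=O)O")   # query with aspirin itself
for name, smiles in db.items():
    sim = DataStructs.TanimotoSimilarity(query, fingerprint(smiles))
    print(f"{name}: Tanimoto similarity = {sim:.2f}")
```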
NASA Astrophysics Data System (ADS)
Veneranda, M.; Negro, J. I.; Medina, J.; Rull, F.; Lantz, C.; Poulet, F.; Cousin, A.; Dypvik, H.; Hellevang, H.; Werner, S. C.
2018-04-01
The PTAL website will store multispectral analyses of samples collected from several terrestrial analogue sites and aims to become a cornerstone tool for the scientific community interested in deepening knowledge of Martian geological processes.
Integration of Information and Scientific Literacy: Promoting Literacy in Undergraduates
ERIC Educational Resources Information Center
Porter, Jason A.; Wolbach, Kevin C.; Purzycki, Catherine B.; Bowman, Leslie A.; Agbada, Eva; Mostrom, Alison M.
2010-01-01
The Association of College and Research Libraries recommends incorporating information literacy (IL) skills across university and college curricula, with the goal of developing information-literate graduates. Congruent with this goal, the Departments of Biological Sciences and Information Science developed an integrated IL and scientific literacy…
A Case Study in E-Journal Developments: The Scandinavian Position.
ERIC Educational Resources Information Center
Joa, Harald
1997-01-01
Provides an overview of peer-reviewed scientific and scholarly electronic journals in Scandinavia from a publisher's point of view. Discusses the electronic journals market in Scandinavia, international electronic publishing, the Institute for Scientific Information's Electronic Library Project, the one-stop shopping concept, and copyright and…
QuTiP 2: A Python framework for the dynamics of open quantum systems
NASA Astrophysics Data System (ADS)
Johansson, J. R.; Nation, P. D.; Nori, Franco
2013-04-01
We present version 2 of QuTiP, the Quantum Toolbox in Python. Compared to the preceding version [J.R. Johansson, P.D. Nation, F. Nori, Comput. Phys. Commun. 183 (2012) 1760.], we have introduced numerous new features, enhanced performance, and made changes in the Application Programming Interface (API) for improved functionality and consistency within the package, as well as increased compatibility with existing conventions used in other scientific software packages for Python. The most significant new features include efficient solvers for arbitrary time-dependent Hamiltonians and collapse operators, support for the Floquet formalism, and new solvers for Bloch-Redfield and Floquet-Markov master equations. Here we introduce these new features, demonstrate their use, and give a summary of the important backward-incompatible API changes introduced in this version. Catalog identifier: AEMB_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMB_v2_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: GNU General Public License, version 3 No. of lines in distributed program, including test data, etc.: 33625 No. of bytes in distributed program, including test data, etc.: 410064 Distribution format: tar.gz Programming language: Python. Computer: i386, x86-64. Operating system: Linux, Mac OSX. RAM: 2+ Gigabytes Classification: 7. External routines: NumPy, SciPy, Matplotlib, Cython Catalog identifier of previous version: AEMB_v1_0 Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 1760 Does the new version supersede the previous version?: Yes Nature of problem: Dynamics of open quantum systems Solution method: Numerical solutions to Lindblad, Floquet-Markov, and Bloch-Redfield master equations, as well as the Monte Carlo wave function method. Reasons for new version: Compared to the preceding version we have introduced numerous new features, enhanced performance, and made changes in the Application Programming Interface (API) for improved functionality and consistency within the package, as well as increased compatibility with existing conventions used in other scientific software packages for Python. The most significant new features include efficient solvers for arbitrary time-dependent Hamiltonians and collapse operators, support for the Floquet formalism, and new solvers for Bloch-Redfield and Floquet-Markov master equations. Restrictions: Problems must meet the criteria for using the master equation in Lindblad, Floquet-Markov, or Bloch-Redfield form. Running time: A few seconds up to several tens of hours, depending on size of the underlying Hilbert space.
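For flavour, the following minimal sketch shows the kind of calculation the package targets: a qubit relaxing under a Lindblad master equation, solved with mesolve. All parameters are arbitrary illustrations and are not taken from the paper.

```python
import numpy as np
from qutip import basis, sigmaz, sigmam, mesolve

# Qubit with splitting 2*pi*0.1 (arbitrary units) relaxing at rate gamma = 0.05.
H = 2 * np.pi * 0.1 * sigmaz()
psi0 = basis(2, 0)                      # start in the upper state
c_ops = [np.sqrt(0.05) * sigmam()]      # collapse operator for relaxation
tlist = np.linspace(0.0, 50.0, 200)

result = mesolve(H, psi0, tlist, c_ops, e_ops=[sigmaz()])
print(result.expect[0][:5])             # <sigma_z>(t) decays from +1 toward -1
```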
Kostagiolas, Petros A; Aggelopoulou, Vasiliki A; Niakas, Dimitris
2011-12-01
Hospital pharmacists need access to high-quality information in order to constantly update their knowledge and improve their skills. In their modern role, they are expected to address three types of challenges: scientific, organizational and administrative, and thus have an increased need for adequate information and library services. This study investigates the information-seeking behaviour of public hospital pharmacists, providing evidence from Greece that could be used to encourage the development of effective hospital information services and to study the links between the information-seeking behaviour of hospital pharmacists and their modern scientific and professional role. An empirical study was conducted between January and February 2010, based on the development and distribution of a structured questionnaire. The questionnaire was filled in and returned by 88 public hospital pharmacists from a total of 286 working in all Greek public hospitals, giving a response rate of 31%. The hospital pharmacists in Greece are in search of scientific information and, more particularly, pharmaceutical information (e.g., drug indications, storage, dosage and prices). The Internet and the National Organization of Medicines are their main information sources, while lack of time and of organized information are the main obstacles they face when seeking information. The modern professional role of hospital pharmacists as invaluable contributors to efficient and safer healthcare services may be further supported through the development of specialized libraries and information services within Greek public hospitals. © 2011 The authors. Health Information and Libraries Journal © 2011 Health Libraries Group.
[Cardiology writings in New Spain and in the first century of the Independent period].
de Micheli, Alfredo
2015-01-01
The first writings on cardioangiology found in public and private libraries of New Spain from the xvi century to the first century of the Independent period in Mexico are mentioned. These range from the truly incunabular ones, books printed until the year 1500, to the physiology treatises published by European authors in the xvii and xviii centuries, as well as the cardiology texts from French authors of the first half of the xix century. The writings were recorded in the catalogs of the University library, founded in 1762, as well as in the library of a master builder of the Metropolitan Cathedral of the xvii century and that of a physician of the xviii century, Dr. José Ignacio Bartolache. The latter, in turn, edited for a brief period, from October 1772 to February 1773, a scientific-medical journal, «Mercurio Volante», which was the first weekly scientific publication in the Americas. Likewise, in the libraries of New Spain, several European scientific journals could be found, such as the one edited by the abbot Rozier, in which the initial writings of Lavoisier appeared. The exchange of ideas and knowledge described here attests to the enduring interest of certain individuals from New Spain in the vast and fascinating domain of cardioangiology. Copyright © 2013 Instituto Nacional de Cardiología Ignacio Chávez. Published by Masson Doyma México S.A. All rights reserved.
Mass Spectral Library Quality Assurance by Inter-Library Comparison
NASA Astrophysics Data System (ADS)
Wallace, William E.; Ji, Weihua; Tchekhovskoi, Dmitrii V.; Phinney, Karen W.; Stein, Stephen E.
2017-04-01
A method to discover and correct errors in mass spectral libraries is described. Comparing across a set of highly curated reference libraries compounds that have the same chemical structure quickly identifies entries that are outliers. In cases where three or more entries for the same compound are compared, the outlier as determined by visual inspection was almost always found to contain the error. These errors were either in the spectrum itself or in the chemical descriptors that accompanied it. The method is demonstrated on finding errors in compounds of forensic interest in the NIST/EPA/NIH Mass Spectral Library. The target list of compounds checked was the Scientific Working Group for the Analysis of Seized Drugs (SWGDRUG) mass spectral library. Some examples of errors found are described. A checklist of errors that curators should look for when performing inter-library comparisons is provided.
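The inter-library comparison rests on pairwise spectrum matching; a simple starting point is a cosine (dot-product) score between unit-binned peak lists. The sketch below is a generic illustration, not the NIST match-factor algorithm (which applies additional m/z and intensity weighting), and the example spectra are made up.

```python
import math

def cosine_match(spec_a, spec_b):
    """Cosine similarity between two stick spectra given as {m/z: intensity}
    dicts with integer (unit-binned) m/z keys. Returns a value in [0, 1]."""
    mzs = set(spec_a) | set(spec_b)
    dot = sum(spec_a.get(mz, 0.0) * spec_b.get(mz, 0.0) for mz in mzs)
    norm_a = math.sqrt(sum(v * v for v in spec_a.values()))
    norm_b = math.sqrt(sum(v * v for v in spec_b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Two made-up library entries for the "same" compound; a low score flags an outlier.
entry_1 = {41: 120.0, 43: 999.0, 58: 450.0, 71: 80.0}
entry_2 = {41: 110.0, 43: 999.0, 58: 430.0, 71: 95.0}
print(round(cosine_match(entry_1, entry_2), 3))
```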
[Primary care resources available in digital libraries in Spanish Autonomous Regions].
Juan-Quilis, Verónica
2013-03-01
The Statement by the Spanish Society of Family and Community Medicine (SemFYC) on access to scientific information highlights the need to provide digital libraries in the Autonomous Regions with certain resources. The primary goal is to study how well the regional virtual libraries cover the evidence-based medicine (EBM) resources that SemFYC recommends. The regional health virtual libraries were identified, and the access provided to health professionals, Internet presence, remote access and resources were studied. The results suggest there is ample coverage in 8 Autonomous Regions. At the top of the list were the Health Sciences Virtual Library of Navarre, the Balearic Islands Health Sciences Virtual Library, and the Virtual Library of the Andalusian Public Health System. The present study needs to be extended to the other biomedical sciences in order to obtain more accurate results. Copyright © 2012 Elsevier España, S.L. All rights reserved.
Mass Spectral Library Quality Assurance by Inter-Library Comparison
Wallace, W.E.; Ji, W.; Tchekhovskoi, D.V.; Phinney, K.W.; Stein, S.E.
2017-01-01
A method to discover and correct errors in mass spectral libraries is described. Comparing across a set of highly curated reference libraries compounds that have the same chemical structure quickly identifies entries that are outliers. In cases where three or more entries for the same compound are compared the outlier as determined by visual inspection was almost always found to contain the error. These errors were either in the spectrum itself or in the chemical descriptors that accompanied it. The method is demonstrated on finding errors in compounds of forensic interest in the NIST/EPA/NIH Mass Spectral Library. The target list of compounds checked was the Scientific Working Group for the Analysis of Seized Drugs (SWGDRUG) mass spectral library. Some examples of errors found are described. A checklist of errors that curators should look for when performing inter-library comparisons is provided. PMID:28127680
Comparison and Evaluation of End-User Interfaces for Online Public Access Catalogs.
ERIC Educational Resources Information Center
Zumer, Maja
End-user interfaces for the online public access catalogs (OPACs) of OhioLINK, a system linking major university and research libraries in Ohio, and its 16 member libraries, accessible through the Internet, are compared and evaluated from the user-oriented perspective. A common, systematic framework was used for the scientific observation of the…
Libraries as a Means of Education and Enlightenment.
ERIC Educational Resources Information Center
Mokhov, N. J.
Soviet libraries play a great role in the spiritual life of the country, in the education and enlightenment of broad sections of the population, and in dissemination of the cultural, scientific and technical achievements among the people. Due to this care and broad initiative of the people, the U.S.S.R. has the largest number of networks of…
Developing a Science Cafe Program for Your University Library
ERIC Educational Resources Information Center
Scaramozzino, Jeanine Marie; Trujillo, Catherine
2010-01-01
The Science Cafe is a national movement that attempts to foster community dialog and inquiry on scientific topics in informal venues such as coffee houses, bookstores, restaurants and bars. The California Polytechnic State University, San Luis Obispo, Robert E. Kennedy Library staff have taken the Science Cafe model out of bars and cafes and into…
Archives: New Horizons in Astronomy
NASA Astrophysics Data System (ADS)
Bobis, L.; Laurenceau, A.
2010-10-01
The scientific archives in the Paris Observatory's library date back to the XVIIth century. In addition to the preservation and the valorisation of these historic archives, the library is also responsible for the efficient and timely management of contemporary documents to ensure their optimum conservation and identification once they become historical. Oral, iconographic and electronic documents complement these paper archives.
ERIC Educational Resources Information Center
Stackpole, Laurie
2001-01-01
The Naval Research Laboratory Library has made significant progress providing its distributed user community with a single point-of-access to information needed to support scientific research through TORPEDO "Ultra," a digital archive that in many respects functions as an electronic counterpart of a traditional library. It consists of…
Survey of Scientific-Technical Tape Services.
ERIC Educational Resources Information Center
Carroll, Kenneth D., Ed.
The results of a survey of commercially available tape services which can provide libraries and information centers with data bases of scientific and technical literature are reported. During the past few years there has been an increasing number of tape services entering the information resources market. Each of these services makes available to…
Upcoming Summer Programs for Students and Staff | Poster
By Robin Meckley, Contributing Writer This summer, the Scientific Library is hosting three programs for students and NCI at Frederick staff: the Summer Video Series, Mini Science Film & Discussion Series, and Eighth Annual Student Science Jeopardy Tournament. Complete information on the programs is available on the Scientific Library’s website.
Omonode, Rex A.; Halvorson, Ardell D.; Gagnon, Bernard; Vyn, Tony J.
2017-01-01
Few studies have assessed the common, yet unproven, hypothesis that an increase of plant nitrogen (N) uptake and/or recovery efficiency (NRE) will reduce nitrous oxide (N2O) emission during crop production. Understanding the relationships between N2O emissions and crop N uptake and use efficiency parameters can help inform crop N management recommendations for both efficiency and environmental goals. Analyses were conducted to determine which of several commonly used crop N uptake-derived parameters related most strongly to growing season N2O emissions under varying N management practices in North American maize systems. Nitrogen uptake-derived variables included total aboveground N uptake (TNU), grain N uptake (GNU), N recovery efficiency (NRE), net N balance (NNB) in relation to GNU [NNB(GNU)] and TNU [NNB(TNU)], and surplus N (SN). The relationship between N2O and N application rate was sigmoidal with relatively small emissions for N rates <130 kg ha−1, and a sharp increase for N rates from 130 to 220 kg ha−1; on average, N2O increased linearly by about 5 g N per kg of N applied for rates up to 220 kg ha−1. Fairly strong and significant negative relationships existed between N2O and NRE when management focused on N application rate (r2 = 0.52) or rate and timing combinations (r2 = 0.65). For every percentage point increase, N2O decreased by 13 g N ha−1 in response to N rates, and by 20 g N ha−1 for NRE changes in response to rate-by-timing treatments. However, more consistent positive relationships (R2 = 0.73–0.77) existed between N2O and NNB(TNU), NNB(GNU), and SN, regardless of rate and timing of N application; on average N2O emission increased by about 5, 7, and 8 g N, respectively, per kg increase of NNB(GNU), NNB(TNU), and SN. Neither N source nor placement influenced the relationship between N2O and NRE. Overall, our analysis indicated that a careful selection of appropriate N rate applied at the right time can both increase NRE and reduce N2O. However, N2O reduction benefits of optimum N rate-by-timing practices were achieved most consistently with management systems that reduced NNB through an increase of grain N removal or total plant N uptake relative to the total fertilizer N applied to maize. Future research assessing crop or N management effects on N2O should include N uptake parameter measurements to better understand N2O emission relationships to plant NRE and N uptake. PMID:28690623
mr: A C++ library for the matching and running of the Standard Model parameters
NASA Astrophysics Data System (ADS)
Kniehl, Bernd A.; Pikelner, Andrey F.; Veretin, Oleg L.
2016-09-01
We present the C++ program library mr that allows us to reliably calculate the values of the running parameters in the Standard Model at high energy scales. The initial conditions are obtained by relating the running parameters in the MS bar renormalization scheme to observables at lower energies with full two-loop precision. The evolution is then performed in accordance with the renormalization group equations with full three-loop precision. Pure QCD corrections to the matching and running are included through four loops. We also provide a Mathematica interface for this program library. Catalogue identifier: AFAI_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AFAI_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License, version 3 No. of lines in distributed program, including test data, etc.: 517613 No. of bytes in distributed program, including test data, etc.: 2358729 Distribution format: tar.gz Programming language: C++. Computer: IBM PC. Operating system: Linux, Mac OS X. RAM: 1 GB Classification: 11.1. External routines: TSIL [1], OdeInt [2], boost [3] Nature of problem: The running parameters of the Standard Model renormalized in the MS bar scheme at some high renormalization scale, which is chosen by the user, are evaluated in perturbation theory as precisely as possible in two steps. First, the initial conditions at the electroweak energy scale are evaluated from the Fermi constant GF and the pole masses of the W, Z, and Higgs bosons and the bottom and top quarks including the full two-loop threshold corrections. Second, the evolution to the high energy scale is performed by numerically solving the renormalization group evolution equations through three loops. Pure QCD corrections to the matching and running are included through four loops. Solution method: Numerical integration of analytic expressions Additional comments: Available for download from URL: http://apik.github.io/mr/. The MathLink interface is tested to work with Mathematica 7-9 and, with an additional flag, also with Mathematica 10 under Linux and with Mathematica 10 under Mac OS X. Running time: less than 1 second References: [1] S. P. Martin and D. G. Robertson, Comput. Phys. Commun. 174 (2006) 133-151 [hep-ph/0501132]. [2] K. Ahnert and M. Mulansky, AIP Conf. Proc. 1389 (2011) 1586-1589 [arxiv:1110.3397 [cs.MS
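mr performs the matching at two loops and the running at three to four loops in C++; purely to illustrate what "running" means, the toy sketch below evolves the strong coupling with the one-loop renormalization group equation dα_s/d ln μ² = −b₀ α_s², b₀ = (33 − 2n_f)/(12π), at fixed n_f = 5 and with an illustrative boundary value at the Z mass. It is not the mr algorithm.

```python
import math
from scipy.integrate import solve_ivp

# One-loop QCD beta-function coefficient for n_f = 5 active flavours.
n_f = 5
b0 = (33.0 - 2.0 * n_f) / (12.0 * math.pi)

def rge(t, alpha):
    # t = ln(mu^2); one-loop running: d(alpha_s)/dt = -b0 * alpha_s^2
    return [-b0 * alpha[0] ** 2]

mz, alpha_mz = 91.1876, 0.1179        # illustrative boundary condition at the Z mass
mu_target = 1000.0                    # run up to 1 TeV (ignoring thresholds)
sol = solve_ivp(rge, [math.log(mz**2), math.log(mu_target**2)], [alpha_mz],
                rtol=1e-10, atol=1e-12)

alpha_numeric = sol.y[0, -1]
# Closed-form one-loop solution, used here to cross-check the integration.
alpha_exact = alpha_mz / (1.0 + b0 * alpha_mz * math.log(mu_target**2 / mz**2))
print(alpha_numeric, alpha_exact)
```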
QuTiP: An open-source Python framework for the dynamics of open quantum systems
NASA Astrophysics Data System (ADS)
Johansson, J. R.; Nation, P. D.; Nori, Franco
2012-08-01
We present an object-oriented open-source framework for solving the dynamics of open quantum systems written in Python. Arbitrary Hamiltonians, including time-dependent systems, may be built up from operators and states defined by a quantum object class, and then passed on to a choice of master equation or Monte Carlo solvers. We give an overview of the basic structure for the framework before detailing the numerical simulation of open system dynamics. Several examples are given to illustrate the build up to a complete calculation. Finally, we measure the performance of our library against that of current implementations. The framework described here is particularly well suited to the fields of quantum optics, superconducting circuit devices, nanomechanics, and trapped ions, while also being ideal for use in classroom instruction. Catalogue identifier: AEMB_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEMB_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License, version 3 No. of lines in distributed program, including test data, etc.: 16 482 No. of bytes in distributed program, including test data, etc.: 213 438 Distribution format: tar.gz Programming language: Python Computer: i386, x86-64 Operating system: Linux, Mac OSX, Windows RAM: 2+ Gigabytes Classification: 7 External routines: NumPy (http://numpy.scipy.org/), SciPy (http://www.scipy.org/), Matplotlib (http://matplotlib.sourceforge.net/) Nature of problem: Dynamics of open quantum systems. Solution method: Numerical solutions to Lindblad master equation or Monte Carlo wave function method. Restrictions: Problems must meet the criteria for using the master equation in Lindblad form. Running time: A few seconds up to several tens of minutes, depending on size of underlying Hilbert space.
Haidar, Azzam; Jagode, Heike; Vaccaro, Phil; ...
2018-03-22
The emergence of power efficiency as a primary constraint in processor and system design poses new challenges concerning power and energy awareness for numerical libraries and scientific applications. Power consumption also plays a major role in the design of data centers, which may house petascale or exascale-level computing systems. At these extreme scales, understanding and improving the energy efficiency of numerical libraries and their related applications becomes a crucial part of the successful implementation and operation of the computing system. In this paper, we study and investigate the practice of controlling a compute system's power usage, and we explore how different power caps affect the performance of numerical algorithms with different computational intensities. Further, we determine the impact, in terms of performance and energy usage, that these caps have on a system running scientific applications. This analysis will enable us to characterize the types of algorithms that benefit most from these power management schemes. Our experiments are performed using a set of representative kernels and several popular scientific benchmarks. Lastly, we quantify a number of power and performance measurements and draw observations and conclusions that can be viewed as a roadmap to achieving energy efficiency in the design and execution of scientific algorithms.
Tigres Workflow Library: Supporting Scientific Pipelines on HPC Systems
Hendrix, Valerie; Fox, James; Ghoshal, Devarshi; ...
2016-07-21
The growth in scientific data volumes has resulted in the need for new tools that enable users to operate on and analyze data on large-scale resources. In the last decade, a number of scientific workflow tools have emerged. These tools often target distributed environments, and often need expert help to compose and execute the workflows. Data-intensive workflows are often ad hoc; they involve an iterative development process that includes users composing and testing their workflows on desktops, and scaling up to larger systems. In this paper, we present the design and implementation of Tigres, a workflow library that supports the iterative workflow development cycle of data-intensive workflows. Tigres provides an application programming interface to a set of programming templates, i.e., sequence, parallel, split, merge, that can be used to compose and execute computational and data pipelines. We discuss the results of our evaluation of scientific and synthetic workflows showing Tigres performs with minimal template overheads (mean of 13 seconds over all experiments). We also discuss various factors (e.g., I/O performance, execution mechanisms) that affect the performance of scientific workflows on HPC systems.
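Tigres exposes these templates through its own API, which is not reproduced here; the generic Python sketch below only illustrates the split/parallel/merge pattern the abstract describes, using the standard library rather than Tigres.

```python
from concurrent.futures import ProcessPoolExecutor

def split(dataset, n_chunks):
    """Split a list into roughly equal chunks (the 'split' template)."""
    k = max(1, len(dataset) // n_chunks)
    return [dataset[i:i + k] for i in range(0, len(dataset), k)]

def analyze(chunk):
    """Stand-in for a per-chunk task run by the 'parallel' template."""
    return sum(x * x for x in chunk)

def merge(partials):
    """Combine per-chunk results (the 'merge' template)."""
    return sum(partials)

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = split(data, n_chunks=8)
    with ProcessPoolExecutor() as pool:           # parallel stage
        partials = list(pool.map(analyze, chunks))
    print(merge(partials))                        # sequence: split -> parallel -> merge
```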
Capturing Petascale Application Characteristics with the Sequoia Toolkit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vetter, Jeffrey S; Bhatia, Nikhil; Grobelny, Eric M
2005-01-01
Characterization of the computation, communication, memory, and I/O demands of current scientific applications is crucial for identifying which technologies will enable petascale scientific computing. In this paper, we present the Sequoia Toolkit for characterizing HPC applications. The Sequoia Toolkit consists of the Sequoia trace capture library and the Sequoia Event Analysis Library, or SEAL, that facilitates the development of tools for analyzing Sequoia event traces. Using the Sequoia Toolkit, we have characterized the behavior of application runs with up to 2048 application processes. To illustrate the use of the Sequoia Toolkit, we present a preliminary characterization of LAMMPS, a molecular dynamics application of great interest to the computational biology community.
More than 3,200 Books and DVDs Donated to Annual Book Swap | Poster
Robin Meckley, Contributing Writer The Scientific Library’s 14th Annual Book and Media Swap, held on April 16 in the lobby of Building 549, proved to be a popular event. When the swap was rescheduled from fall 2013 to spring 2014, the library staff was uncertain if the response would be equal to previous years, said Sue Wilson, principal manager of the Scientific Library. NCI at Frederick employees rose to the challenge, however, with 87 people donating more than 3,200 books and DVDs, according to Pam Noble, serials technician and book swap team leader. By the end of the first day of the swap, almost half of the materials had been claimed.
ERIC Educational Resources Information Center
1970
The Conference was held because of a recognition by the Committee on Scientific and Technical Information (COSATI) Task Group on Library Programs and the Federal Library Committee of a fundamental responsibility to interact in a meaningful way with the non-Federal sector--the state, local and private users of Federal information resources. This…
Building a Multi-Discipline Digital Library Through Extending the Dienst Protocol
NASA Technical Reports Server (NTRS)
Nelson, Michael L.; Maly, Kurt; Shen, Stewart N. T.
1997-01-01
The purpose of this project is to establish a multi-discipline capability for a unified, canonical digital library service for scientific and technical information (STI). This is accomplished by extending the Dienst Protocol to be aware of the subject classification of a server's holdings. We propose a hierarchical, general, and extendible subject classification that can encapsulate existing classification systems.
NASA Astrophysics Data System (ADS)
Müller, Thomas
2011-06-01
The new version of the Motion4D-library now also includes the integration of a Sachs basis and the Jacobi equation to determine gravitational lensing of pointlike sources for arbitrary spacetimes. New version program summary Program title: Motion4D-library Catalogue identifier: AEEX_v3_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEX_v3_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 219 441 No. of bytes in distributed program, including test data, etc.: 6 968 223 Distribution format: tar.gz Programming language: C++ Computer: All platforms with a C++ compiler Operating system: Linux, Windows RAM: 61 Mbytes Classification: 1.5 External routines: Gnu Scientific Library (GSL) (http://www.gnu.org/software/gsl/) Catalogue identifier of previous version: AEEX_v2_0 Journal reference of previous version: Comput. Phys. Comm. 181 (2010) 703 Does the new version supersede the previous version?: Yes Nature of problem: Solve geodesic equation, parallel and Fermi-Walker transport in four-dimensional Lorentzian spacetimes. Determine gravitational lensing by integration of the Jacobi equation and parallel transport of the Sachs basis. Solution method: Integration of ordinary differential equations. Reasons for new version: The main novelty of the current version is the extension to integrate the Jacobi equation and the parallel transport of the Sachs basis along null geodesics. In combination, the change of the cross section of a light bundle and thus the gravitational lensing effect of a spacetime can be determined. Furthermore, we have implemented several new metrics. Summary of revisions: The main novelty of the current version is the integration of the Jacobi equation and the parallel transport of the Sachs basis along null geodesics. The corresponding set of equations reads
$$\frac{d^2 x^\mu}{d\lambda^2} = -\Gamma^\mu_{\rho\sigma}\,\frac{dx^\rho}{d\lambda}\frac{dx^\sigma}{d\lambda}, \qquad (1)$$
$$\frac{d s_{1,2}^\mu}{d\lambda} = -\Gamma^\mu_{\rho\sigma}\,\frac{dx^\rho}{d\lambda}\, s_{1,2}^\sigma, \qquad (2)$$
$$\frac{d^2 Y_{1,2}^\mu}{d\lambda^2} = -2\,\Gamma^\mu_{\rho\sigma}\,\frac{dx^\rho}{d\lambda}\frac{dY_{1,2}^\sigma}{d\lambda} - \Gamma^\mu_{\rho\sigma,\nu}\,\frac{dx^\rho}{d\lambda}\frac{dx^\sigma}{d\lambda}\, Y_{1,2}^\nu, \qquad (3)$$
where (1) is the geodesic equation, (2) represents the parallel transport of the two Sachs basis vectors $s_{1,2}$, and (3) is the Jacobi equation for the two Jacobi fields $Y_{1,2}$. The initial directions of the Sachs basis vectors $s_{1,2} = (0,\vec{s}_{1,2}) = s_{1,2}^\mu\,\partial_\mu$ are defined perpendicular to the initial direction $\vec{\upsilon}$ of the light ray, see also Fig. 1. A congruence of null geodesics with central null geodesic γ, which starts at the observer O with an infinitesimal circular cross section, is defined by the above-mentioned two Jacobi fields with initial conditions $Y_{1,2}^\mu|_{\lambda=0}=0$ and $(dY_{1,2}^\mu/d\lambda)|_{\lambda=0}=s_{1,2}^\mu$. The cross section of this congruence along γ is described by the Jacobian $J_{ij}(\lambda) = g_{\mu\nu}\,Y_i^\mu s_j^\nu$. However, to determine the gravitational lensing of a pointlike source S that is connected to the observer via γ, we need the reverse Jacobian $J_S$. Fortunately, the reverse Jacobian is just the negative transpose of the original Jacobian $J_O$, $\tilde J := J_S = -(J_O)^T$. The Jacobian $\tilde J$ transforms the circular shape of the congruence into an ellipse whose shape parameters ($M_{1,2}$: major/minor axes, ψ: angle of the major axis, ε: ellipticity) read
$$M_{1,2} = \sqrt{2\alpha\sin\zeta_{1,2}\cos\zeta_{1,2} - \beta\sin^2\zeta_{1,2} + \tilde J_{11}^2 + \tilde J_{21}^2},$$
$$\psi = \arctan2\!\left(\tilde J_{21}\cos\zeta_1 + \tilde J_{22}\sin\zeta_1,\;\tilde J_{11}\cos\zeta_1 + \tilde J_{12}\sin\zeta_1\right),$$
$$\varepsilon = \frac{\|M_1 - M_2\|}{\|M_1 + M_2\|}, \qquad \text{with } \zeta_1 = \frac{1}{2}\arctan\frac{2\alpha}{\beta},\ \zeta_2 = \zeta_1 + \frac{\pi}{2},$$
and the parameters $\alpha = \tilde J_{11}\tilde J_{12} + \tilde J_{21}\tilde J_{22}$, $\beta = \tilde J_{11}^2 - \tilde J_{12}^2 + \tilde J_{21}^2 - \tilde J_{22}^2$. The magnification factor is given by $\mu = \lambda^2/(M_1 M_2)$. These shape parameters can be easily visualized in the new version of the GeodesicViewer, see Ref. [1].
A detailed discussion of gravitational lensing can be found, for example, in Schneider et al. [2]. In the following, a list of newly implemented metrics is given:
BertottiKasner: see Rindler [3].
BesselGravWaveCart: gravitational Bessel wave from Kramer [4].
DeSitterUniv, DeSitterUnivConf: de Sitter universe in Cartesian and conformal coordinates.
Ernst: black hole in a magnetic universe by Ernst [5].
ExtremeReissnerNordstromDihole: see Chandrasekhar [6].
HalilsoyWave: see Ref. [7].
JaNeWi: Janis-Newman-Winicour metric, see Ref. [8].
MinkowskiConformal: Minkowski metric in conformally rescaled coordinates.
PTD_AI, PTD_AII, PTD_AIII, PTD_BI, PTD_BII, PTD_BIII, PTD_C: Petrov type D - Levi-Civita spacetimes, see Ref. [7].
PainleveGullstrand: Schwarzschild metric in Painlevé-Gullstrand coordinates, see Ref. [9].
PlaneGravWave: plane gravitational wave, see Ref. [10].
SchwarzschildIsotropic: Schwarzschild metric in isotropic coordinates, see Ref. [11].
SchwarzschildTortoise: Schwarzschild metric in tortoise coordinates, see Ref. [11].
Sultana-Dyer: a black hole in the Einstein-de Sitter universe by Sultana and Dyer [12].
TaubNUT: see Ref. [13].
The Christoffel symbols and the natural local tetrads of these new metrics are given in the Catalogue of Spacetimes, Ref. [14]. To study the behavior of geodesics, it is often useful to determine an effective potential as in classical mechanics. For several metrics, we followed the Euler-Lagrange approach as described by Rindler [10] and implemented an effective potential for a specific situation. As an example, consider the Lagrangian $\mathcal{L} = -\alpha\,\dot t^2 + \alpha^{-1}\dot r^2 + r^2\dot\varphi^2$ for timelike geodesics in the ϑ = π/2 hypersurface of the Schwarzschild spacetime with α = 1 - 2m/r. The Euler-Lagrange equations lead to the energy balance equation $\dot r^2 + V(r) = k^2$ with the effective potential $V(r) = (r-2m)(r^2+h^2)/r^3$ and the constants of motion $k = \alpha\dot t$ and $h = r^2\dot\varphi$. The constants of motion for a timelike geodesic that starts at (r = 10m, φ = 0) with initial direction ξ = π/4 with respect to the black hole direction and with initial velocity β = 0.7 read k ≈ 1.252 and h ≈ 6.931. Then, from the energy balance equation we immediately obtain the radius of closest approach r ≈ 5.927. Besides a standard fourth-order Runge-Kutta integrator and the integrators of the Gnu Scientific Library (GSL), we also implemented a standard Bulirsch-Stoer integrator. Running time: The test runs provided with the distribution require only a few seconds to run.
References:
[1] T. Müller, New version announcement to the GeodesicViewer, http://cpc.cs.qub.ac.uk/summaries/AEFP_v2_0.html.
[2] P. Schneider, J. Ehlers, E.E. Falco, Gravitational Lenses, Springer, 1992.
[3] W. Rindler, Phys. Lett. A 245 (1998) 363.
[4] D. Kramer, Ann. Phys. 9 (2000) 331.
[5] F.J. Ernst, J. Math. Phys. 17 (1976) 54.
[6] S. Chandrasekhar, Proc. R. Soc. Lond. A 421 (1989) 227.
[7] H. Stephani, D. Kramer, M. MacCallum, C. Hoenselaers, E. Herlt, Exact Solutions of the Einstein Field Equations, Cambridge University Press, 2009.
[8] A.I. Janis, E.T. Newman, J. Winicour, Phys. Rev. Lett. 20 (1968) 878.
[9] K. Martel, E. Poisson, Am. J. Phys. 69 (2001) 476.
[10] W. Rindler, Relativity - Special, General, and Cosmology, Oxford University Press, Oxford, 2007.
[11] C.W. Misner, K.S. Thorne, J.A. Wheeler, Gravitation, W.H. Freeman, 1973.
[12] J. Sultana, C.C. Dyer, Gen. Relativ. Gravit. 37 (2005) 1349.
[13] D. Bini, C. Cherubini, R.T. Jantzen, Class. Quantum Grav. 19 (2002) 5481.
[14] T. Müller, F. Grave, arXiv:0904.4184 [gr-qc].
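The effective-potential example quoted in this record can be checked numerically. The sketch below is not part of the Motion4D distribution; it recomputes the constants of motion and the radius of closest approach for the ϑ = π/2 Schwarzschild geodesic starting at r = 10m with ξ = π/4 and β = 0.7.

```python
import math
from scipy.optimize import brentq

m = 1.0                                # Schwarzschild mass parameter (geometric units)
r0, xi, beta = 10.0 * m, math.pi / 4.0, 0.7

alpha = 1.0 - 2.0 * m / r0             # alpha = 1 - 2m/r at the start point
gamma = 1.0 / math.sqrt(1.0 - beta**2)

k = gamma * math.sqrt(alpha)           # k = alpha * dt/dlambda (energy constant)
h = r0 * gamma * beta * math.sin(xi)   # h = r^2 * dphi/dlambda (angular momentum)
print(round(k, 3), round(h, 3))        # ~1.252 and ~6.931, as quoted in the record

def radial_energy(r):
    """k^2 - V(r) with V(r) = (1 - 2m/r)(1 + h^2/r^2); zero at a turning point."""
    return k**2 - (1.0 - 2.0 * m / r) * (1.0 + h**2 / r**2)

# Bracket the outer turning point between r = 3m (inside the centrifugal
# barrier for these parameters) and the start radius r0.
r_closest = brentq(radial_energy, 3.0 * m, r0)
print(round(r_closest, 3))             # ~5.927, the radius of closest approach
```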
Graves, J R
2001-02-01
To inform oncology nurses about the electronic knowledge resources offered by the Sigma Theta Tau International Virginia Henderson International Nursing Library. Published articles and research studies. Clinical nursing research dissemination has been seriously affected by publication bias. The Virginia Henderson International Nursing Library has introduced both a new publishing paradigm for research and a new knowledge indexing strategy for improving electronic access to research knowledge (findings). The ability of oncology nursing to evolve, as an evidence-based practice, is largely dependent on access to research findings.
User Interface Technology Transfer to NASA's Virtual Wind Tunnel System
NASA Technical Reports Server (NTRS)
vanDam, Andries
1998-01-01
Funded by NASA grants for four years, the Brown Computer Graphics Group has developed novel 3D user interfaces for desktop and immersive scientific visualization applications. This past grant period supported the design and development of a software library, the 3D Widget Library, which supports the construction and run-time management of 3D widgets. The 3D Widget Library is a mechanism for transferring user interface technology from the Brown Graphics Group to the Virtual Wind Tunnel system at NASA Ames as well as the public domain.
The business of negotiating for hospital librarians.
Orick, Jan T
2004-01-01
Although many hospital librarians may find it difficult, negotiating with vendors has become a basic skill of library acquisitions. This article reports the results of a non-scientific questionnaire administered to hospital librarians and vendors attending a chapter meeting of the Medical Library Association in 2003. The answers revealed that vendors regard libraries as businesses, and while admitting that the role is often uncomfortable for them, librarians acknowledged that negotiating skills have become an important aspect of their jobs. Questions to help guide librarians through the negotiation process are provided in the Appendix.
Core List of Astronomy and Physics Journals
NASA Astrophysics Data System (ADS)
Bryson, Liz; Fortner, Diane; Yorks, Pamela
This is a list of highly-used and highly-cited physics and astronomy journals. "Use" is measured largely on paper-journal counts from selective academic research-level libraries. Citation count titles are drawn from Institute for Scientific Information (ISI) data. Recognition is given to entrepreneurial electronic-only or new-style electronic journals. Selective news, magazine, and general science journals are omitted. The compilers welcome questions, suggestions for additions, or other advice. Comments may be sent c/o Diane Fortner, Physics Library, University of California, Berkeley. Dfortner@library.berkeley.edu
Big Data Smart Socket (BDSS): a system that abstracts data transfer habits from end users.
Watts, Nicholas A; Feltus, Frank A
2017-02-15
The ability to centralize and store data for long periods on an end user's computational resources is increasingly difficult for many scientific disciplines. For example, genomics data is increasingly large and distributed, and the data needs to be moved into workflow execution sites ranging from lab workstations to the cloud. However, the typical user is not always informed on emerging network technology or the most efficient methods to move and share data. Thus, the user defaults to using inefficient methods for transfer across the commercial internet. To accelerate large data transfer, we created a tool called the Big Data Smart Socket (BDSS) that abstracts data transfer methodology from the user. The user provides BDSS with a manifest of datasets stored in a remote storage repository. BDSS then queries a metadata repository for curated data transfer mechanisms and optimal path to move each of the files in the manifest to the site of workflow execution. BDSS functions as a standalone tool or can be directly integrated into a computational workflow such as provided by the Galaxy Project. To demonstrate applicability, we use BDSS within a biological context, although it is applicable to any scientific domain. BDSS is available under version 2 of the GNU General Public License at https://github.com/feltus/BDSS . ffeltus@clemson.edu. © The Author 2016. Published by Oxford University Press.
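None of the names below belong to BDSS's actual API; they are hypothetical stand-ins sketching the idea the abstract describes: consult a curated mapping from storage endpoints to transfer mechanisms and fall back to a plain HTTPS fetch when no curated mechanism is known.

```python
from urllib.parse import urlparse

# Hypothetical curated mapping, standing in for a metadata repository:
# endpoint host -> preferred transfer command template (tool names are made up).
CURATED_MECHANISMS = {
    "dtn.example.edu": "fast-parallel-copy {src} {dst}",
    "gridftp.example.org": "striped-grid-copy {src} {dst}",
}

def plan_transfers(manifest, dest_dir):
    """Return one shell-style command per manifest URL (illustration only)."""
    plans = []
    for url in manifest:
        host = urlparse(url).netloc
        template = CURATED_MECHANISMS.get(host, "curl -o {dst} {src}")  # default path
        dst = f"{dest_dir}/{url.rsplit('/', 1)[-1]}"
        plans.append(template.format(src=url, dst=dst))
    return plans

manifest = ["https://dtn.example.edu/data/reads_01.fastq.gz",
            "https://downloads.example.com/reference/genome.fa"]
for cmd in plan_transfers(manifest, "/scratch/workflow"):
    print(cmd)
```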
Big Data Smart Socket (BDSS): a system that abstracts data transfer habits from end users
Watts, Nicholas A.
2017-01-01
Motivation: The ability to centralize and store data for long periods on an end user’s computational resources is increasingly difficult for many scientific disciplines. For example, genomics data is increasingly large and distributed, and the data needs to be moved into workflow execution sites ranging from lab workstations to the cloud. However, the typical user is not always informed on emerging network technology or the most efficient methods to move and share data. Thus, the user defaults to using inefficient methods for transfer across the commercial internet. Results: To accelerate large data transfer, we created a tool called the Big Data Smart Socket (BDSS) that abstracts data transfer methodology from the user. The user provides BDSS with a manifest of datasets stored in a remote storage repository. BDSS then queries a metadata repository for curated data transfer mechanisms and optimal path to move each of the files in the manifest to the site of workflow execution. BDSS functions as a standalone tool or can be directly integrated into a computational workflow such as provided by the Galaxy Project. To demonstrate applicability, we use BDSS within a biological context, although it is applicable to any scientific domain. Availability and Implementation: BDSS is available under version 2 of the GNU General Public License at https://github.com/feltus/BDSS. Contact: ffeltus@clemson.edu PMID:27797780
Data management routines for reproducible research using the G-Node Python Client library
Sobolev, Andrey; Stoewer, Adrian; Pereira, Michael; Kellner, Christian J.; Garbers, Christian; Rautenberg, Philipp L.; Wachtler, Thomas
2014-01-01
Structured, efficient, and secure storage of experimental data and associated meta-information constitutes one of the most pressing technical challenges in modern neuroscience, and does so particularly in electrophysiology. The German INCF Node aims to provide open-source solutions for this domain that support the scientific data management and analysis workflow, and thus facilitate future data access and reproducible research. G-Node provides a data management system, accessible through an application interface, that is based on a combination of standardized data representation and flexible data annotation to account for the variety of experimental paradigms in electrophysiology. The G-Node Python Library exposes these services to the Python environment, enabling researchers to organize and access their experimental data using their familiar tools while gaining the advantages that a centralized storage entails. The library provides powerful query features, including data slicing and selection by metadata, as well as fine-grained permission control for collaboration and data sharing. Here we demonstrate key actions in working with experimental neuroscience data, such as building a metadata structure, organizing recorded data in datasets, annotating data, or selecting data regions of interest, that can be automated to large degree using the library. Compliant with existing de-facto standards, the G-Node Python Library is compatible with many Python tools in the field of neurophysiology and thus enables seamless integration of data organization into the scientific data workflow. PMID:24634654
Science Information Programs: The Argentine Telex Network for Scientific and Technical Information.
ERIC Educational Resources Information Center
National Academy of Sciences, Washington, DC.
This document reports on two projects jointly sponsored by the National Academy of Science (NAS) (USA) and the Consejo Nacional de Investigaciones Cientificas y Tecnicas (CONICET) (ARGENTINA). The first is the creation of a telex network for scientific libraries and documentation centers in Argentina, designed to improve access to, and delivery…
Longitudinal Study of Scientific Journal Prices in a Research Library.
ERIC Educational Resources Information Center
Marks, Kenneth E; And Others
1991-01-01
Describes a study that evaluated the determinants of price increases of scientific journals over time from a variety of publishers, disciplines, and countries. It was found that inflation and greater journal length explained most price increases, and that journal prices from commercial publishers increased much more rapidly than those from…
ERIC Educational Resources Information Center
Reed, Robyn B.; Butkovich, Nancy J.
2017-01-01
Discussions abound regarding current and future roles of academic science and medical librarians. As changes in scientific approaches, technology, scholarly communication, and funding mechanisms occur, libraries supporting scientific areas must be equipped to handle the various needs of these researchers. The purpose of this study was to examine…
ERIC Educational Resources Information Center
Hartt, Richard W.
This report discusses the characteristics, operations, and automation requirements of technical libraries providing services to organizations involved in aerospace and defense scientific and technical work, and describes the Local Automation Model project. This on-going project is designed to demonstrate the concept of a fully integrated library…
Greenwood, M R
1995-01-01
Scientific life is changing in fundamental ways as the twenty-first century approaches. Advances in technology are changing methods of scientific communications and dissemination of information, while diminishing resources lead to stabilization, politicization, increased public oversight, and the potential for significant downsizing. Libraries can foster the crucial interdisciplinary connections necessary to forge a new vision of scholarship. PMID:7703945
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quinlan, D.; Yi, Q.; Buduc, R.
2005-02-17
ROSE is an object-oriented software infrastructure for source-to-source translation that provides an interface for programmers to write their own specialized translators for optimizing scientific applications. ROSE is a part of current research on telescoping languages, which provides optimizations of the use of libraries in scientific applications. ROSE defines approaches to extend the optimization techniques, common in well defined languages, to the optimization of scientific applications using well defined libraries. ROSE includes a rich set of tools for generating customized transformations to support optimization of applications codes. We currently support full C and C++ (including template instantiation etc.), with Fortran 90 support under development as part of a collaboration and contract with Rice to use their version of the open source Open64 F90 front-end. ROSE represents an attempt to define an open compiler infrastructure to handle the full complexity of full scale DOE applications codes using the languages common to scientific computing within DOE. We expect that such an infrastructure will also be useful for the development of numerous tools that may then realistically expect to work on DOE full scale applications.
SDS: A Framework for Scientific Data Services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong, Bin; Byna, Surendra; Wu, Kesheng
2013-10-31
Large-scale scientific applications typically write their data to parallel file systems with organizations designed to achieve fast write speeds. Analysis tasks frequently read the data in a pattern that is different from the write pattern, and therefore experience poor I/O performance. In this paper, we introduce a prototype framework for bridging the performance gap between write and read stages of data access from parallel file systems. We call this framework Scientific Data Services, or SDS for short. This initial implementation of SDS focuses on reorganizing previously written files into data layouts that benefit read patterns, and transparently directs read calls to the reorganized data. SDS follows a client-server architecture. The SDS Server manages partial or full replicas of reorganized datasets and serves SDS Clients' requests for data. The current version of the SDS client library supports the HDF5 programming interface for reading data. The client library intercepts HDF5 calls and transparently redirects them to the reorganized data. The SDS client library also provides a querying interface for reading part of the data based on user-specified selective criteria. We describe the design and implementation of the SDS client-server architecture, and evaluate the response time of the SDS Server and the performance benefits of SDS.
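The client-side read pattern that SDS intercepts is an ordinary HDF5 call; the snippet below shows such a read with h5py (file and dataset names invented), which under SDS would be served transparently from a read-optimized replica. It is not SDS code.

```python
import h5py
import numpy as np

# Create a small example file so the read below is self-contained.
with h5py.File("simulation_output.h5", "w") as f:
    f.create_dataset("/fields/temperature", data=np.random.rand(1024, 1024))

# An analysis-style read: a strided sub-selection of a large 2-D dataset.
# Under SDS, a call like this would be redirected to the reorganized layout.
with h5py.File("simulation_output.h5", "r") as f:
    block = f["/fields/temperature"][::4, 100:200]
    print(block.shape, float(block.mean()))
```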
[G. Baglivi and scientific European community between rationalism and enlightenment].
Toscano, A
2000-01-01
The Baglivi Correspondence, kept in the Waller Collection at the University Library of Uppsala, was published in Italy for the first time in 1999. This correspondence kept in Sweden provides new information about Italian scientific culture between the second half of the seventeenth century and the beginning of the eighteenth. Moreover, it provides important knowledge on the diffusion of Baglivi's work in the European scientific context of that time.
NASA Technical Reports Server (NTRS)
Pinelli, Thomas E.; Kennedy, John M.; White, Terry F.
1991-01-01
Phase 3 of a 4-part study was undertaken to study the use of scientific and technical information (STI) in the academic aerospace community. Phase 3 of this project used three questionnaires that were sent to three groups (i.e., faculty, librarians, and students) in the academic aerospace community. Specific attention was paid to the types of STI used and the methods by which academic users acquire STI. The responses of the academic libraries are the focus here. Demographic information on academic aerospace libraries is provided. Data regarding NASA interaction with academic aerospace libraries are also included, as is the survey instrument.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koning, A.J.; Bersillon, O.; Forrest, R. A.
The status of the Joint Evaluated Fission and Fusion file (JEFF) is described. The next version of the library, JEFF-3.1, comprises a significant update of actinide evaluations, evaluations emerging from European nuclear data projects, the activation library JEFF-3/A, the decay data and fission yield library, and fusion-related data files from the EFF project. The revisions were motivated by the availability of new measurements, modelling capabilities, or trends from integral experiments. Various pre-release validation efforts are underway, mainly for criticality and shielding of thermal and fast systems. This JEFF-3.1 library is expected to provide improved performance with respect to previous releases for a variety of scientific and industrial applications.
Data publishing - visions of the future
NASA Astrophysics Data System (ADS)
Schäfer, Leonie; Klump, Jens; Bertelmann, Roland; Klar, Jochen; Enke, Harry; Rathmann, Torsten; Koudela, Daniela; Köhler, Klaus; Müller-Pfefferkorn, Ralph; van Uytvanck, Dieter; Strathmann, Stefan; Engelhardt, Claudia
2013-04-01
This poster describes future scenarios of information infrastructures in science and other fields of research. The scenarios presented are based on practical experience resulting from interaction with research data in a research center and its library, and further enriched by the results of a baseline study of existing data repositories and data infrastructures. The baseline study was conducted as part of the project "Requirements for a multi-disciplinary research data infrastructure (Radieschen)", which is funded by the German Research Foundation (DFG). Current changes in information infrastructures pose new challenges to libraries and scientific journals, which both act as information service providers, facilitate access to digital media, support the publication of research data and enable their long-term archiving. Digital media and research data open new aspects in the field of activity of libraries and scientific journals. What will a library of the future look like? Will a library purely serve as an interface to data centres? Will libraries and data centres merge into a new service unit? Will a future library be the interface to academic cloud services? Scientific journals have already converted from mostly print editions to print and e-journals. What type of journals will emerge in the future? Is there a role for data-centred journals? Will there be journals that publish software code to make this type of research result citable and a part of the record of science? Just as users evolve from being consumers of information into producers, the role of information service providers, such as libraries, changes from a purely supporting to a contributing role. Furthermore, the role of the library changes from a central point of access for the search of publications to an important link in the value-adding chain from author to publication. Journals for software publication might be another vision for the future in data publishing. Software forms the missing link between big data collected by experiments, monitoring or simulation and the results derived from them. In order to verify the results presented, a paper should also report on the process of data analysis applied to the data sets stored at data centers. In this case data, software, and interpretation supplement each other as a trustworthy, reproducible presentation of research results. Another approach is suggested by researchers of the EU-funded project "Liquid Publications" (1). Instead of traditional publications the researchers propose liquid journals as evolving collections of links and material, and recommend new methods for reviewing and assessing publications. Another point of interest is workflows in data publication. The commonly used model to depict the data life cycle might look appealing but does not necessarily represent reality. The model proposed by Treloar et al. (2) offers a better approach to depict the transition of research data between different domains of use, e.g. from the group domain to the public domain. However, several questions need to be addressed, such as how to avoid the loss of contextual information during transitions between domains, and the influence of the size of the data on the workflow process. This poster aims to present different scenarios of the future - from the point of view of researchers, libraries and scientific journals - and will invite further discussion. (1) LiquidPub Green Paper, https://dev.liquidpub.org/svn/liquidpub/papers/deliverables/LPGreenPaper.pdf (2) Treloar, A., Harboe-Ree, C. (2008). Data management and the curation continuum: how the Monash experience is informing repository relationships. In VALA2008, Melbourne, Australia. Retrieved from http://www.valaconf.org.au/vala2008/papers2008/111_Treloar_Final.pdf
Visions of the Future - the Changing Role of Actors in Data-Intensive Science
NASA Astrophysics Data System (ADS)
Schäfer, L.; Klump, J. F.
2013-12-01
Around the world, scientific disciplines are increasingly facing the challenge of a burgeoning volume of research data. This data avalanche consists of a stream of information generated from sensors and scientific instruments, digital recordings, social-science surveys or drawn from the World Wide Web. All areas of the scientific enterprise are affected by this rapid growth in data, from the logging of digs in Archaeology to telescope observations of distant galaxies in Astrophysics to data from polls and surveys in the Social Sciences. The challenge for science is not only to process the data through analysis, reduction and visualization, but also to set up infrastructures for provisioning and storing the data. The rise of new technologies and developments also poses new challenges for the actors in the area of research data infrastructures. Libraries, as one of the actors, enable access to digital media and support the publication of research data and its long-term archiving. Digital media and research data, however, introduce new aspects into the libraries' range of activities. How are we to imagine the library of the future? The library as an interface to the computer centers? Will library and computer center fuse into a new service unit? What role will scientific publishers play in the future? Currently the traditional forms of publication - articles for conferences and journals - still carry greater weight. But will this still be the case in the future? New forms of publication are already making their presence felt. The tasks of the computer centers may also change. Yesterday their remit was the provisioning of fast hardware, whereas now everything revolves around the topic of data and services. Finally, what about the researchers themselves? Not such a long time ago, Geoscience was not necessarily seen as linked to Computer Science. Nowadays, modern Geoscience relies heavily on IT and its techniques. To what extent, then, will the profile of the modern geoscientist change? This gives rise to the question of what tools are required to locate and pursue the correct course in a networked world. One tool from the area of innovation management is the scenario technique. This poster will outline visions of the future as possible developments of the scientific world in 2020 (or later). The scenarios presented will show possible developments - both positive and negative. It is then up to the actors themselves to define their own position in this context, to rethink it and to consider steps that can achieve a positive development for the future.
NASA Astrophysics Data System (ADS)
Gasperini, A.; Abrami, L.; Olostro Cirella, E.
2007-10-01
Until 2002, the Italian astronomical observatories were independent research institutes. Their libraries, though different in their origins and history, shared common bibliographical materials, users and aims. This situation prompted a first experience of unofficial cooperation between astronomical observatory libraries, which produced outstanding results, in particular a detailed survey of the nature, cost and use of scientific journals. Starting from 2002, when the 12 observatories merged into a single institution, the National Institute for Astrophysics (INAF), the experience of cooperation between the libraries became official. The INAF headquarters, in fact, has recently established the Library Documentary and Archive Service of the National Institute for Astrophysics (SBDA-INAF) in order to have a centralized astronomical bibliographical service and to promote cooperation among libraries. At the end of 2004, following the INAF rearrangement, 5 Institutes of the National Research Council (CNR) joined the still new organization introducing further complications. In this work we explain all the problems faced by a working group to elaborate an efficient plan of coordinated acquisition of journals: the difficulties in coordinating 17 different sites distributed over the whole national territory, the not so easy negotiation with the publishers, the choice between e-only or print & online and, last but not least, the psychological impact on the scientific community. The cooperation among Italian astronomical libraries was a plan begun many years ago and has continued through various events over the years. This presentation takes into consideration the various stages of our project focusing on some crucial aspects.
NASA Astrophysics Data System (ADS)
Yang, Zengzhang
2017-11-01
The design of natural lighting in the reading spaces of university libraries not only influences the physical and mental health of readers but also affects the energy consumption of the libraries. Scientific and rational design of natural lighting is the key to energy-saving design for the physical environment of the reading space. The paper elaborates on the present situation and existing problems of natural lighting in reading spaces of university libraries across the Jinan region, based on the characteristics of the light climate of the Jinan region and the concrete utilization of reading spaces in university libraries, combining field measurement, survey, research and data analysis of reading spaces in Shandong Women’s University’s library. From an energy-efficiency perspective, the paper puts forward proposals to improve natural lighting in the reading spaces of university libraries in five respects: adjustment of the interior layout, optimization of outer window design, employment of reflector panels, the design of lighting windows on inner walls, and the utilization of adjustable sun-shading facilities.
Give a Book, Take a Book | Poster
Collection has begun for the 15th Annual Book & Media Swap sponsored by the Scientific Library. NCI at Frederick staff can use this opportunity to clear out personal book and DVD shelves of unwanted materials, donate them to the swap, and then receive “new” materials in return. The library staff will collect materials through Tuesday, Oct. 27. Kick-off day for the event is
ERIC Educational Resources Information Center
Lockheed Research Lab., Palo Alto, CA.
The DIALIB Project was a 3-year experiment that investigated the potential of the public library as a "linking agent" between the public and the many machine-readable data bases currently accessible via the telephone using online terminals. The study investigated the following questions: (1) Is online search of use to the patrons of a…
ERIC Educational Resources Information Center
Terner, Janet
The purpose of this project was to specifically identify important works within the National Bureau of Standards library collection of approximately 125,000 items that are generally acknowledged to be pertinent to the development of modern science and technology. Presented is an annotated list including 197 items selected from the pre-1900…
Analysis of Scientific Research Related Anxiety Levels of Undergraduate Students
ERIC Educational Resources Information Center
Yildirim, Sefa; Hasiloglu, Mehmet Akif
2018-01-01
In this study, the aim was to identify the scientific research-related anxiety levels of undergraduate students studying in the faculty of science and letters and the faculty of education, and to analyse these anxiety levels in terms of various variables (students' gender, using web-based information sources, going to the library,…
Damsel: A Data Model Storage Library for Exascale Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koziol, Quincey
The goal of this project is to enable exascale computational science applications to interact conveniently and efficiently with storage through abstractions that match their data models. We will accomplish this through three major activities: (1) identifying major data model motifs in computational science applications and developing representative benchmarks; (2) developing a data model storage library, called Damsel, that supports these motifs, provides efficient storage data layouts, incorporates optimizations to enable exascale operation, and is tolerant to failures; and (3) productizing Damsel and working with computational scientists to encourage adoption of this library by the scientific community.
Hazan, Lynn; Zugaro, Michaël; Buzsáki, György
2006-09-15
Recent technological advances now allow for simultaneous recording of large populations of anatomically distributed neurons in behaving animals. The free software package described here was designed to help neurophysiologists process and view recorded data in an efficient and user-friendly manner. This package consists of several well-integrated applications, including NeuroScope (http://neuroscope.sourceforge.net), an advanced viewer for electrophysiological and behavioral data with limited editing capabilities; Klusters (http://klusters.sourceforge.net), a graphical cluster cutting application for manual and semi-automatic spike sorting; and NDManager, an experimental parameter and data processing manager. All of these programs are distributed under the GNU General Public License (GPL, see http://www.gnu.org/licenses/gpl.html), which gives its users legal permission to copy, distribute and/or modify the software. Also included are extensive user manuals and sample data, as well as source code and documentation.
The SCEC Broadband Platform: Open-Source Software for Strong Ground Motion Simulation and Validation
NASA Astrophysics Data System (ADS)
Silva, F.; Goulet, C. A.; Maechling, P. J.; Callaghan, S.; Jordan, T. H.
2016-12-01
The Southern California Earthquake Center (SCEC) Broadband Platform (BBP) is a carefully integrated collection of open-source scientific software programs that can simulate broadband (0-100 Hz) ground motions for earthquakes at regional scales. The BBP can run earthquake rupture and wave propagation modeling software to simulate ground motions for well-observed historical earthquakes and to quantify how well the simulated broadband seismograms match the observed seismograms. The BBP can also run simulations for hypothetical earthquakes. In this case, users input an earthquake location and magnitude description, a list of station locations, and a 1D velocity model for the region of interest, and the BBP software then calculates ground motions for the specified stations. The BBP scientific software modules implement kinematic rupture generation, low- and high-frequency seismogram synthesis using wave propagation through 1D layered velocity structures, several ground motion intensity measure calculations, and various ground motion goodness-of-fit tools. These modules are integrated into a software system that provides user-defined, repeatable, calculation of ground-motion seismograms, using multiple alternative ground motion simulation methods, and software utilities to generate tables, plots, and maps. The BBP has been developed over the last five years in a collaborative project involving geoscientists, earthquake engineers, graduate students, and SCEC scientific software developers. The SCEC BBP software released in 2016 can be compiled and run on recent Linux and Mac OS X systems with GNU compilers. It includes five simulation methods, seven simulation regions covering California, Japan, and Eastern North America, and the ability to compare simulation results against empirical ground motion models (aka GMPEs). The latest version includes updated ground motion simulation methods, a suite of new validation metrics and a simplified command line user interface.
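As an illustration of the kind of goodness-of-fit measure mentioned above (not the BBP's actual code), a common choice is the natural-log residual between observed and simulated intensity measures, averaged over stations; the sketch below computes such a bias for one period, with all names and data hypothetical.

    // Hypothetical goodness-of-fit sketch: mean ln(observed/simulated) residual
    // of a ground-motion intensity measure (e.g. spectral acceleration at one
    // period) over a set of stations. A bias near zero indicates that the
    // simulation neither over- nor under-predicts on average.
    #include <cmath>
    #include <iostream>
    #include <vector>

    double ln_bias(const std::vector<double>& observed,
                   const std::vector<double>& simulated) {
        double sum = 0.0;
        for (std::size_t i = 0; i < observed.size(); ++i)
            sum += std::log(observed[i] / simulated[i]);
        return sum / static_cast<double>(observed.size());
    }

    int main() {
        // Made-up spectral accelerations (in g) at four stations.
        std::vector<double> obs = {0.21, 0.35, 0.12, 0.48};
        std::vector<double> sim = {0.19, 0.40, 0.10, 0.45};
        std::cout << "ln bias = " << ln_bias(obs, sim) << std::endl;
        return 0;
    }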
1997-12-01
Armed Forces Radiobiology Research Institute. Retrospective Reconstruction of Radiation Doses of Chernobyl Liquidators by Electron Paramagnetic Resonance. Authored by the Scientific Center of Radiation Medicine, Academy of Medical...libraries associated with the U.S. Government's Depository Library System. Preface: On April 26, 1986, Reactor #4 at the Chernobyl Nuclear Power Plant near
ERIC Educational Resources Information Center
BIVONA, WILLIAM A.
This volume presents the results of a nine-month test of a prototype Selective Dissemination of Information (SDI) system developed for the Army technical libraries. During the pilot test, one thousand documents were cataloged, indexed, and disseminated to twenty-five scientific and technical personnel. Matching of the interest profiles of these…
[Use of cyber library and digital tools are crucial for academic surgeons].
Tomizawa, Yasuko
2010-10-01
In addition to busy clinical work, an academic surgeon has to spend a great deal of time and effort writing and submitting articles to scientific journals, teaching young surgical trainees to write articles, organizing and updating his/her academic record in the curriculum vitae, and writing research grant applications. The use of a cyber library and commercially available computer software is helpful in saving time and effort.
NASA Astrophysics Data System (ADS)
Bytev, Vladimir V.; Kniehl, Bernd A.
2016-09-01
We present a further extension of the HYPERDIRE project, which is devoted to the creation of a set of Mathematica-based program packages for manipulations with Horn-type hypergeometric functions on the basis of differential equations. Specifically, we present the implementation of the differential reduction for the Lauricella function FC of three variables. Catalogue identifier: AEPP_v4_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEPP_v4_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License, version 3 No. of lines in distributed program, including test data, etc.: 243461 No. of bytes in distributed program, including test data, etc.: 61610782 Distribution format: tar.gz Programming language: Mathematica. Computer: All computers running Mathematica. Operating system: Operating systems running Mathematica. Classification: 4.4. Does the new version supersede the previous version?: No, it significantly extends the previous version. Nature of problem: Reduction of hypergeometric function FC of three variables to a set of basis functions. Solution method: Differential reduction. Reasons for new version: The extension package allows the user to handle the Lauricella function FC of three variables. Summary of revisions: The previous version goes unchanged. Running time: Depends on the complexity of the problem.
OpenStructure: a flexible software framework for computational structural biology.
Biasini, Marco; Mariani, Valerio; Haas, Jürgen; Scheuber, Stefan; Schenk, Andreas D; Schwede, Torsten; Philippsen, Ansgar
2010-10-15
Developers of new methods in computational structural biology are often hampered in their research by incompatible software tools and non-standardized data formats. To address this problem, we have developed OpenStructure as a modular open source platform to provide a powerful, yet flexible general working environment for structural bioinformatics. OpenStructure consists primarily of a set of libraries written in C++ with a cleanly designed application programmer interface. All functionality can be accessed directly in C++ or in a Python layer, meeting both the requirements for high efficiency and ease of use. Powerful selection queries and the notion of entity views to represent these selections greatly facilitate the development and implementation of algorithms on structural data. The modular integration of computational core methods with powerful visualization tools makes OpenStructure an ideal working and development environment. Several applications, such as the latest versions of IPLT and QMean, have been implemented based on OpenStructure, demonstrating its value for the development of next-generation structural biology algorithms. Source code licensed under the GNU Lesser General Public License and binaries for Mac OS X, Linux and Windows are available for download at http://www.openstructure.org. Contact: torsten.schwede@unibas.ch. Supplementary data are available at Bioinformatics online.
Compression and fast retrieval of SNP data.
Sambo, Francesco; Di Camillo, Barbara; Toffolo, Gianna; Cobelli, Claudio
2014-11-01
The increasing interest in rare genetic variants and epistatic genetic effects on complex phenotypic traits is currently pushing genome-wide association study design towards datasets of increasing size, both in the number of studied subjects and in the number of genotyped single nucleotide polymorphisms (SNPs). This, in turn, is leading to a compelling need for new methods for compression and fast retrieval of SNP data. We present a novel algorithm and file format for compressing and retrieving SNP data, specifically designed for large-scale association studies. Our algorithm is based on two main ideas: (i) compress linkage disequilibrium blocks in terms of differences with a reference SNP and (ii) compress reference SNPs exploiting information on their call rate and minor allele frequency. Tested on two SNP datasets and compared with several state-of-the-art software tools, our compression algorithm is shown to be competitive in terms of compression rate and to outperform all tools in terms of time to load compressed data. Our compression and decompression algorithms are implemented in a C++ library, are released under the GNU General Public License and are freely downloadable from http://www.dei.unipd.it/~sambofra/snpack.html.
Compression and fast retrieval of SNP data
Sambo, Francesco; Di Camillo, Barbara; Toffolo, Gianna; Cobelli, Claudio
2014-01-01
Motivation: The increasing interest in rare genetic variants and epistatic genetic effects on complex phenotypic traits is currently pushing genome-wide association study design towards datasets of increasing size, both in the number of studied subjects and in the number of genotyped single nucleotide polymorphisms (SNPs). This, in turn, is leading to a compelling need for new methods for compression and fast retrieval of SNP data. Results: We present a novel algorithm and file format for compressing and retrieving SNP data, specifically designed for large-scale association studies. Our algorithm is based on two main ideas: (i) compress linkage disequilibrium blocks in terms of differences with a reference SNP and (ii) compress reference SNPs exploiting information on their call rate and minor allele frequency. Tested on two SNP datasets and compared with several state-of-the-art software tools, our compression algorithm is shown to be competitive in terms of compression rate and to outperform all tools in terms of time to load compressed data. Availability and implementation: Our compression and decompression algorithms are implemented in a C++ library, are released under the GNU General Public License and are freely downloadable from http://www.dei.unipd.it/~sambofra/snpack.html. Contact: sambofra@dei.unipd.it or cobelli@dei.unipd.it. PMID:25064564
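To make idea (i) above concrete, the following toy sketch (not the authors' snpack code) encodes the genotypes of the SNPs in a linkage-disequilibrium block as differences from a reference SNP, so that runs of zeros can later be compressed cheaply; the genotype coding and block layout are assumptions for illustration only.

    // Toy illustration of difference coding within an LD block (not snpack):
    // genotypes are coded 0/1/2 (minor-allele counts); each SNP in the block is
    // stored as (genotype - reference genotype) per subject, which is mostly
    // zero when SNPs are in strong linkage disequilibrium and thus compresses well.
    #include <cstdint>
    #include <iostream>
    #include <vector>

    using Genotypes = std::vector<std::int8_t>;  // one entry per subject

    std::vector<Genotypes> diff_encode_block(const Genotypes& reference,
                                             const std::vector<Genotypes>& block) {
        std::vector<Genotypes> encoded;
        for (const Genotypes& snp : block) {
            Genotypes diff(snp.size());
            for (std::size_t i = 0; i < snp.size(); ++i)
                diff[i] = static_cast<std::int8_t>(snp[i] - reference[i]);
            encoded.push_back(diff);
        }
        return encoded;   // feed a run-length or entropy coder next
    }

    int main() {
        Genotypes ref = {0, 1, 2, 1, 0};
        std::vector<Genotypes> block = {{0, 1, 2, 1, 0},   // identical to reference
                                        {0, 1, 2, 2, 0}};  // differs at one subject
        auto enc = diff_encode_block(ref, block);
        int nz = 0;
        for (auto d : enc[1]) nz += (d != 0);
        std::cout << "non-zero differences in SNP 2: " << nz << std::endl;  // prints 1
        return 0;
    }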
NASA Astrophysics Data System (ADS)
Wright, D. G.; Feistel, R.; Reissmann, J. H.; Miyagawa, K.; Jackett, D. R.; Wagner, W.; Overhoff, U.; Guder, C.; Feistel, A.; Marion, G. M.
2010-03-01
The SCOR/IAPSO1 Working Group 127 on Thermodynamics and Equation of State of Seawater has prepared recommendations for new methods and algorithms for numerical estimation of the thermophysical properties of seawater. As an outcome of this work, a new International Thermodynamic Equation of Seawater (TEOS-10) was endorsed by IOC/UNESCO2 in June 2009 as the official replacement and extension of the 1980 International Equation of State, EOS-80. As part of this new standard a source code package has been prepared that is now made freely available to users via the World Wide Web. This package includes two libraries referred to as the SIA (Sea-Ice-Air) library and the GSW (Gibbs SeaWater) library. Information on the GSW library may be found on the TEOS-10 web site (http://www.TEOS-10.org). This publication provides an introduction to the SIA library which contains routines to calculate various thermodynamic properties as discussed in the companion paper. The SIA library is very comprehensive, including routines to deal with fluid water, ice, seawater and humid air as well as equilibrium states involving various combinations of these, with equivalent code developed in different languages. The code is hierarchically structured in modules that support (i) almost unlimited extension with respect to additional properties or relations, (ii) an extraction of self-contained sub-libraries, (iii) separate updating of the empirical thermodynamic potentials, and (iv) code verification on different platforms and between different languages. Error trapping is implemented to identify when one or more of the primary routines are accessed significantly beyond their established range of validity. The initial version of the SIA library is available in Visual Basic and FORTRAN as a supplement to this publication and updates will be maintained on the TEOS-10 web site. 1 SCOR/IAPSO: Scientific Committee on Oceanic Research/International Association for the Physical Sciences of the Oceans 2 IOC/UNESCO: Intergovernmental Oceanographic Commission/United Nations Educational, Scientific and Cultural Organization
NASA Astrophysics Data System (ADS)
Wright, D. G.; Feistel, R.; Reissmann, J. H.; Miyagawa, K.; Jackett, D. R.; Wagner, W.; Overhoff, U.; Guder, C.; Feistel, A.; Marion, G. M.
2010-07-01
The SCOR/IAPSO1 Working Group 127 on Thermodynamics and Equation of State of Seawater has prepared recommendations for new methods and algorithms for numerical estimation of the thermophysical properties of seawater. As an outcome of this work, a new International Thermodynamic Equation of Seawater (TEOS-10) was endorsed by IOC/UNESCO2 in June 2009 as the official replacement and extension of the 1980 International Equation of State, EOS-80. As part of this new standard a source code package has been prepared that is now made freely available to users via the World Wide Web. This package includes two libraries referred to as the SIA (Sea-Ice-Air) library and the GSW (Gibbs SeaWater) library. Information on the GSW library may be found on the TEOS-10 web site (http://www.TEOS-10.org). This publication provides an introduction to the SIA library which contains routines to calculate various thermodynamic properties as discussed in the companion paper. The SIA library is very comprehensive, including routines to deal with fluid water, ice, seawater and humid air as well as equilibrium states involving various combinations of these, with equivalent code developed in different languages. The code is hierarchically structured in modules that support (i) almost unlimited extension with respect to additional properties or relations, (ii) an extraction of self-contained sub-libraries, (iii) separate updating of the empirical thermodynamic potentials, and (iv) code verification on different platforms and between different languages. Error trapping is implemented to identify when one or more of the primary routines are accessed significantly beyond their established range of validity. The initial version of the SIA library is available in Visual Basic and FORTRAN as a supplement to this publication and updates will be maintained on the TEOS-10 web site. 1 SCOR/IAPSO: Scientific Committee on Oceanic Research/International Association for the Physical Sciences of the Oceans 2 IOC/UNESCO: Intergovernmental Oceanographic Commission/United Nations Educational, Scientific and Cultural Organization
ERIC Educational Resources Information Center
Mackenzie, A. Graham
This technical report presents recommendations and plans which are the result of a mission undertaken as part of a project to promote a scientific and technological information service and establish a popular science resource center in Korea. The mission's main emphasis was to help Korean authorities and the United Nations Development Programme…
SciELO, Scientific Electronic Library Online, a Database of Open Access Journals
ERIC Educational Resources Information Center
Meneghini, Rogerio
2013-01-01
This essay discusses SciELO, a scientific journal database operating in 14 countries. It covers over 1000 journals providing open access to full text and table sets of scientometrics data. In Brazil it is responsible for a collection of nearly 300 journals, selected along 15 years as the best Brazilian periodicals in natural and social sciences.…
Methodology for fast detection of false sharing in threaded scientific codes
Chung, I-Hsin; Cong, Guojing; Murata, Hiroki; Negishi, Yasushi; Wen, Hui-Fang
2014-11-25
A profiling tool identifies a code region with a false sharing potential. A static analysis tool classifies variables and arrays in the identified code region. A mapping detection library correlates memory access instructions in the identified code region with variables and arrays in the identified code region while a processor is running the identified code region. The mapping detection library identifies one or more instructions at risk, in the identified code region, which are subject to an analysis by a false sharing detection library. A false sharing detection library performs a run-time analysis of the one or more instructions at risk while the processor is re-running the identified code region. The false sharing detection library determines, based on the performed run-time analysis, whether two different portions of the cache memory line are accessed by the generated binary code.
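For readers unfamiliar with the phenomenon being detected, the sketch below (an illustration only, unrelated to the tool described above) shows the classic false-sharing pattern: two threads update adjacent counters that share a cache line, so the line ping-pongs between cores even though no datum is logically shared; padding each counter onto its own cache line removes the effect.

    // Classic false-sharing example (illustration only, not the detection tool):
    // counters a and b sit on the same cache line, so concurrent updates by two
    // threads repeatedly invalidate each other's copy of that line.
    #include <iostream>
    #include <thread>

    struct Counters {
        long a = 0;   // updated by thread 1
        long b = 0;   // updated by thread 2; shares a's cache line -> false sharing
        // Declaring b as "alignas(64) long b;" (or inserting 64 bytes of padding)
        // would place the fields on separate cache lines and avoid the problem.
    };

    int main() {
        Counters c;
        auto bump_a = [&c] { for (long i = 0; i < 50000000; ++i) ++c.a; };
        auto bump_b = [&c] { for (long i = 0; i < 50000000; ++i) ++c.b; };
        std::thread t1(bump_a), t2(bump_b);
        t1.join();
        t2.join();
        std::cout << c.a + c.b << std::endl;
        return 0;
    }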
A Multi-Discipline, Multi-Genre Digital Library for Research and Education
NASA Technical Reports Server (NTRS)
Nelson, Michael L.; Maly, Kurt; Shen, Stewart N. T.
2004-01-01
We describe NCSTRL+, a unified, canonical digital library for educational and scientific and technical information (STI). NCSTRL+ is based on the Networked Computer Science Technical Report Library (NCSTRL), a World Wide Web (WWW) accessible digital library (DL) that provides access to over 100 university departments and laboratories. NCSTRL+ implements two new technologies: cluster functionality and publishing "buckets". We have extended the Dienst protocol, the protocol underlying NCSTRL, to provide the ability to "cluster" independent collections into a logically centralized digital library based upon subject category classification, type of organization, and genres of material. The concept of "buckets" provides a mechanism for publishing and managing logically linked entities with multiple data formats. The NCSTRL+ prototype DL contains the holdings of NCSTRL and the NASA Technical Report Server (NTRS). The prototype demonstrates the feasibility of publishing into a multi-cluster DL, searching across clusters, and storing and presenting buckets of information.
Rux, Erika M.; Flaspohler, John A.
2007-01-01
Contemporary undergraduates in the biological sciences have unprecedented access to scientific information. Although many of these students may be savvy technologists, studies from the field of library and information science consistently show that undergraduates often struggle to locate, evaluate, and use high-quality, reputable sources of information. This study demonstrates the efficacy and pedagogical value of a collaborative teaching approach designed to enhance information literacy competencies among undergraduate biology majors who must write a formal scientific research paper. We rely on the triangulation of assessment data to determine the effectiveness of a substantial research paper project completed by students enrolled in an upper-level biology course. After enhancing library-based instruction, adding an annotated bibliography requirement, and using multiple assessment techniques, we show fundamental improvements in students' library research abilities. Ultimately, these improvements make it possible for students to more independently and effectively complete this challenging science-based writing assignment. We document critical information literacy advances in several key areas: student source-type use, annotated bibliography enhancement, plagiarism reduction, as well as student and faculty/librarian satisfaction. PMID:18056306
Analyzing microtomography data with Python and the scikit-image library.
Gouillart, Emmanuelle; Nunez-Iglesias, Juan; van der Walt, Stéfan
2017-01-01
The exploration and processing of images is a vital aspect of the scientific workflows of many X-ray imaging modalities. Users require tools that combine interactivity, versatility, and performance. scikit-image is an open-source image processing toolkit for the Python language that supports a large variety of file formats and is compatible with 2D and 3D images. The toolkit exposes a simple programming interface, with thematic modules grouping functions according to their purpose, such as image restoration, segmentation, and measurements. scikit-image users benefit from a rich scientific Python ecosystem that contains many powerful libraries for tasks such as visualization or machine learning. scikit-image combines a gentle learning curve, versatile image processing capabilities, and the scalable performance required for the high-throughput analysis of X-ray imaging data.
Earth Sciences data access and preservation with gLibrary
NASA Astrophysics Data System (ADS)
Guidetti, Veronica; Calanducci, Antonio
2010-05-01
ESA-ESRIN, the European Space Agency Centre for Earth Observation (EO), is the largest European EO data provider and operates as the reference European centre for EO payload data exploitation. EO data acquired from space have become powerful scientific tools to enable better understanding and management of the Earth and its resources. Large international initiatives such as GMES and GEO, supported by the European Commission, focus on coordinating international efforts to environmental monitoring, i.e. to provide political and technical solutions to global issues, such as climate change, global environment monitoring, management of natural resources and humanitarian response. Since the time-span of EO data archives extends from a few years to decades, their value as scientific time-series increases considerably, especially for the topic of global change. It will be soon necessary to re-analyse on global scale the information currently locked inside large thematic archives. Future research in the field of Earth Sciences is of invaluable importance: to carry it on researchers worldwide must be enabled to find and access data of interest in a quick and easy way. At present, several thousands of scientists, principal investigators and operators, access EO missions' metadata, data and derived information on a daily basis. Main objectives may be to study the global climate change, to check the status of the instrument on-board and the quality of EO data. There is a huge worldwide scientific community calling for the need to keep EO data accessible without time constrains, easily and quickly. In collaboration with ESA-ESRIN, INFN, the National Institute for Nuclear Physics, is implementing a demonstrative use case where satellite remote sensing data, including in-situ data and other kind of digital assets, are made available to the scientific community via gLibrary (https://glibrary.ct.infn.it), the INFN digital library platform. gLibrary can be used to store, organise, browse, retrieve, annotate and replicate any kind of digital asset on data grids or distributed storage environments. It provides digital assets preservation capabilities, making use of distributed replication of assets, decoupling from the underlying storage technology, and adoption of standard interfaces and metadata descriptions. In its future development gLibrary will investigate and possibly provide integration with grid and HPC processing services, including the ESA G-POD facility (http://eogrid.esrin.esa.int). Currently, gLibrary features encompass fast data access, quick retrieval of digital assets, metadata handling and sharing (including text annotation), high availability and scalability (due to its distributed architecture), (meta)data replication and, last but not least, authentication and authorisation. Much of the experimentation is on-going at EC and international level to provide coordinated and interoperable access to EO data and satellite imagery including any kind of related digital assets (metadata, documents, product guidelines, auxiliary data, mission/sensor specifications, environmental reports). The work with gLibrary comes as a best effort initiative and targets a full interoperability with ESA EO data dissemination, recovering and processing services and intends to demonstrate the benefit the scientific community can gain from this kind of integrated data access. 
It contributes to responding to the needs of Earth Sciences data users, advancing technology development to facilitate highly interactive EO information sharing, analysis and interoperability on the Web.
ERIC Educational Resources Information Center
Summit, Roger K.; Firschein, Oscar
In conjunction with the National Science Foundation (NSF), an on-going experiment is being conducted to test the feasibility of increasing public access to 16 major data bases by providing public libraries with on-line, interactive retrieval capacity. During the period from January to March 1976, the major activities of the study were: (1) a…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moussa, Jonathan E.
2013-05-13
This piece of software is a new feature implemented inside an existing open-source library. Specifically, it is a new implementation of a density functional (HSE, short for Heyd-Scuseria-Ernzerhof) for a repository of density functionals, the libxc library. It fixes some numerical problems with existing implementations, as outlined in a scientific paper recently submitted for publication. Density functionals are components of electronic structure simulations, which model properties of electrons inside molecules and crystals.
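A minimal sketch of how a caller might evaluate the semilocal part of such a functional through the libxc C interface is shown below; the functional identifier XC_HYB_GGA_XC_HSE06 and the xc_gga_exc call follow common libxc usage, but exact names and signatures vary between libxc versions, so treat this as an assumption-laden illustration rather than the patched implementation itself.

    // Hedged sketch of evaluating an exchange-correlation functional via libxc
    // (C API, callable from C++). Names and signatures follow common libxc usage
    // and may differ between library versions; the density values are made up.
    #include <cstdio>
    #include <xc.h>   // libxc public header

    int main() {
        xc_func_type func;
        // Initialize the HSE06 hybrid GGA functional for a spin-unpolarized system.
        if (xc_func_init(&func, XC_HYB_GGA_XC_HSE06, XC_UNPOLARIZED) != 0) {
            std::fprintf(stderr, "functional not available in this libxc build\n");
            return 1;
        }

        // Toy input: densities rho and reduced gradients sigma = |grad rho|^2.
        const int npoints = 3;
        double rho[npoints]   = {0.1, 0.2, 0.3};
        double sigma[npoints] = {0.01, 0.02, 0.03};
        double exc[npoints];  // energy density per particle at each grid point

        // Evaluate the semilocal (GGA) part of the functional on the grid points;
        // the exact-exchange fraction of a hybrid is handled by the calling code.
        xc_gga_exc(&func, npoints, rho, sigma, exc);

        for (int i = 0; i < npoints; ++i)
            std::printf("exc[%d] = %f\n", i, exc[i]);

        xc_func_end(&func);
        return 0;
    }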
Artificial intelligence support for scientific model-building
NASA Technical Reports Server (NTRS)
Keller, Richard M.
1992-01-01
Scientific model-building can be a time-intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientific development team to understand. We believe that artificial intelligence techniques can facilitate both the model-building and model-sharing process. In this paper, we overview our effort to build a scientific modeling software tool that aids the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high-level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities.
NASA Astrophysics Data System (ADS)
Besara, Rachel
2015-03-01
For years the cost of STEM databases has grown faster than the rate of inflation. Libraries have reallocated funds for years to continue to provide support to their scientific communities, but at many institutions they are reaching a point where they are no longer able to provide access to many databases considered standard for supporting research. A possible or partial alleviation of this problem is the federal open access mandate. However, this shift challenges the current model of publishing and data management in the sciences. This talk will discuss these topics from the perspective of research libraries supporting physics and the STEM disciplines.
Author identities an interoperability problem solved by a collaborative solution
NASA Astrophysics Data System (ADS)
Fleischer, D.; Czerniak, A.; Schirnick, C.
2012-12-01
The identity of authors and data providers is crucial for personalized interoperability. The marketplace of available identifiers is crowded, and the right choice is getting more and more complicated. Even though there are more than 15 different systems available, some are still under development and were proposed to launch by the end of 2012 ('PubMed Central Author ID' and ORCID). Data management on a scale beyond the size of a single research institute - on the scale of a scientific site including a university with a student education program - needs to tackle this problem, and so did the Kiel Data Management Infrastructure. The main problem with the identities of researchers is the rather frequent change of positions during a scientist's life. The required system needed to already contain the potential of preregistered people with their scientific publications from other countries, institutions and organizations. Scanning the author ID marketplace revealed a high risk of additional workload for the researchers themselves or the administration, because individuals need to register an ID for themselves or the chosen register is not yet big enough to simply find the right entry. On the other hand, libraries have dealt with authors and their publications for centuries, and they have high-quality catalogs of personal identities already available. Millions of internationally mapped records are available through collaboration with libraries and can be used in exactly the same scope. The international collaboration between libraries (VIAF) provides a mapping between libraries from the US, CA, UK, FR, GER and many more. The international library author identification system made it possible to match 60% of all scientists at the first attempt. An additional advantage is that librarians can finalize the identity system in a kind of background process. The Kiel Data Management Infrastructure initiated a web service at Kiel for mapping from one ID to another. This web service supports the scientific workflows for automation of the data archiving process at the world data archive PANGAEA. The long-lasting concept of the library identifier enables the use of these identifiers beyond the employment period, while it has nothing to do with the institutional IDM. The access rights and ownership of data can be assured for a very long time, since the national library with its national scope hosts the basic system. Making use of this existing system released resources planned for this task and opened the chance of interoperability on an international scale for a regional data management infrastructure.
MCdevelop - a universal framework for Stochastic Simulations
NASA Astrophysics Data System (ADS)
Slawinska, M.; Jadach, S.
2011-03-01
We present MCdevelop, a universal computer framework for developing and exploiting the wide class of Stochastic Simulations (SS) software. This powerful universal SS software development tool has been derived from a series of scientific projects for precision calculations in high energy physics (HEP), which feature a wide range of functionality in the SS software needed for advanced precision Quantum Field Theory calculations for the past LEP experiments and for the ongoing LHC experiments at CERN, Geneva. MCdevelop is a "spin-off" product of HEP to be exploited in other areas, while it will still serve to develop new SS software for HEP experiments. Typically SS involve independent generation of large sets of random "events", often requiring considerable CPU power. Since SS jobs usually do not share memory, they are easy to parallelize. Efficient development, testing and parallel running of SS software requires a convenient framework to develop source code, deploy and monitor batch jobs, and merge and analyse results from multiple parallel jobs, even before the production runs are terminated. Throughout the years of development of stochastic simulations for HEP, a sophisticated framework featuring all the above-mentioned functionality has been implemented. MCdevelop represents its latest version, written mostly in C++ (GNU compiler gcc). It uses Autotools to build binaries (optionally managed within the KDevelop 3.5.3 Integrated Development Environment (IDE)). It uses the open-source ROOT package for histogramming, graphics and the mechanism of persistency for the C++ objects. MCdevelop helps to run multiple parallel jobs on any computer cluster with an NQS-type batch system. Program summary Program title: MCdevelop Catalogue identifier: AEHW_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHW_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 48 136 No. of bytes in distributed program, including test data, etc.: 355 698 Distribution format: tar.gz Programming language: ANSI C++ Computer: Any computer system or cluster with C++ compiler and UNIX-like operating system. Operating system: Most UNIX systems, Linux. The application programs were thoroughly tested under Ubuntu 7.04, 8.04 and CERN Scientific Linux 5. Has the code been vectorised or parallelised?: Tools (scripts) for optional parallelisation on a PC farm are included. RAM: 500 bytes Classification: 11.3 External routines: ROOT package version 5.0 or higher (http://root.cern.ch/drupal/). Nature of problem: Developing any type of stochastic simulation program for high energy physics and other areas. Solution method: Object Oriented programming in C++ with added persistency mechanism, batch scripts for running on PC farms and Autotools.
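Since MCdevelop relies on ROOT for histogramming and persistency of C++ objects, the minimal sketch below shows the kind of ROOT usage involved when one parallel job books a histogram, fills it with pseudo-random "events", and writes it to a file that can later be merged with the output of other jobs; the file and histogram names are illustrative and not taken from MCdevelop itself.

    // Minimal ROOT usage sketch (illustration, not MCdevelop code): book a
    // histogram, fill it with pseudo-random "events", and persist it so that
    // results from many parallel jobs can later be merged.
    #include "TFile.h"
    #include "TH1D.h"
    #include "TRandom3.h"

    int main() {
        TFile outfile("job_output.root", "RECREATE");   // hypothetical file name
        TH1D  hist("h_observable", "Toy observable;x;events", 100, -5.0, 5.0);

        TRandom3 rng(12345);                            // fixed seed for this job
        for (int i = 0; i < 100000; ++i)
            hist.Fill(rng.Gaus(0.0, 1.0));              // one "event" per fill

        hist.Write();      // store the histogram in the ROOT file
        outfile.Close();
        return 0;
    }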
Summer Events at the Scientific Library | Poster
Two exciting events are coming this summer from the Scientific Library—the annual Student Science Jeopardy Tournament and the Summer Video Series. This year, the 10th Annual Student Science Jeopardy Tournament will be held on Wednesday, July 20, beginning at 10 a.m. in the auditorium of Building 549. The event will also be streamed live to the Advanced Technology Research Facility (ATRF), room E1203.
Activities at the Lunar and Planetary Institute
NASA Technical Reports Server (NTRS)
1985-01-01
The activities of the Lunar and Planetary Institute for the period July to December 1984 are discussed. Functions of its departments and projects are summarized. These include: planetary image center; library information center; computer center; production services; scientific staff; visitors program; scientific projects; conferences; workshops; seminars; publications and communications; panels, teams, committees and working groups; NASA-AMES vertical gun range (AVGR); and lunar and planetary science council.
NASA Astrophysics Data System (ADS)
Neal, J. G.
2008-12-01
Research libraries provide a set of core services to the scholarly and educational communities. This includes: information acquisition, synthesis, navigation, discovery, dissemination, interpretation, presentation, understanding and archiving. Researchers across the science disciplines and increasingly in multi disciplinary projects are producing massive amounts of data, and they seek the infrastructure, the strategies and the partnerships that will enable rigorous and sustained tools for extraction, distribution, collaboration, application and permanent availability. This paper will address the role of the research library from three perspectives. First, the view of scientific datasets as information assets that would benefit from traditional library collection development practice will be explored. Second, the agenda on e-science developed by the Association of Research Libraries will be outlined with a focus on the need for policy and standards development, for resources assessment and allocation, for new approaches to the preparation of the library professional, and library leadership in campus planning and innovative collaborations for research cyberinfrastructure. And third, the responses to the call for proposals from the National Science Foundation's DataNet program will be analyzed and the role of the research library in these project plans will be summarized as an indicator of the expanding responsibility of the library for research data stewardship.
[The future of scientific libraries].
De Fiore, Luca
2013-10-01
"Making predictions is always very difficult, especially about the future". Niels Bohr's quote is very appropriate when looking into the future of libraries. If the Web is now the richest library in the world, it is also the most friendly and therefore the most convenient. The evolution of libraries in the coming years - both traditional and online - will probably depend on their ability to meet the information needs of users: improved ease of use and better reliability of the information. These are objectives that require money and - given the general reduction in budgets - it is not obvious that the results will be achieved. However, there are many promising experiences at the international level that show that the world of libraries is populated by projects and creativity. Traditional or digital, libraries will increasingly present themselves more as a sharing tool than as a repository of information: it is the sharing that translates data into knowledge. In the healthcare field, the integration of online libraries with the epidemiological information systems could favor the fulfillment of unconscious information needs of health personnel; libraries will therefore be a key tool for an integrated answer to the challenge of continuing education in medicine. The Internet is no longer a library but an information ecosystem where the data are transformed into knowledge by sharing and discussion.
[SciELO: method for electronic publishing].
Laerte Packer, A; Rocha Biojone, M; Antonio, I; Mayumi Takemaka, R; Pedroso García, A; Costa da Silva, A; Toshiyuki Murasaki, R; Mylek, C; Carvalho Reisl, O; Rocha F Delbucio, H C
2001-01-01
It describes the SciELO (Scientific Electronic Library Online) Methodology for electronic publishing of scientific periodicals, examining issues such as the transition from traditional printed publication to electronic publishing, the scientific communication process, the principles which founded the methodology development, its application in the building of the SciELO site, its modules and components, the tools used for its construction, etc. The article also discusses the potentialities and trends for the area in Brazil and Latin America, pointing out questions and proposals which should be investigated and solved by the methodology. It concludes that the SciELO Methodology is an efficient, flexible and comprehensive solution for scientific electronic publishing.
FAST: FAST Analysis of Sequences Toolbox
Lawrence, Travis J.; Kauffman, Kyle T.; Amrine, Katherine C. H.; Carper, Dana L.; Lee, Raymond S.; Becich, Peter J.; Canales, Claudia J.; Ardell, David H.
2015-01-01
FAST (FAST Analysis of Sequences Toolbox) provides simple, powerful open source command-line tools to filter, transform, annotate and analyze biological sequence data. Modeled after the GNU (GNU's Not Unix) Textutils such as grep, cut, and tr, FAST tools such as fasgrep, fascut, and fastr make it easy to rapidly prototype expressive bioinformatic workflows in a compact and generic command vocabulary. Compact combinatorial encoding of data workflows with FAST commands can simplify the documentation and reproducibility of bioinformatic protocols, supporting better transparency in biological data science. Interface self-consistency and conformity with conventions of GNU, Matlab, Perl, BioPerl, R, and GenBank help make FAST easy and rewarding to learn. FAST automates numerical, taxonomic, and text-based sorting, selection and transformation of sequence records and alignment sites based on content, index ranges, descriptive tags, annotated features, and in-line calculated analytics, including composition and codon usage. Automated content- and feature-based extraction of sites and support for molecular population genetic statistics make FAST useful for molecular evolutionary analysis. FAST is portable, easy to install and secure thanks to the relative maturity of its Perl and BioPerl foundations, with stable releases posted to CPAN. Development as well as a publicly accessible Cookbook and Wiki are available on the FAST GitHub repository at https://github.com/tlawrence3/FAST. The default data exchange format in FAST is Multi-FastA (specifically, a restriction of BioPerl FastA format). Sanger and Illumina 1.8+ FastQ formatted files are also supported. FAST makes it easier for non-programmer biologists to interactively investigate and control biological data at the speed of thought. PMID:26042145
Cunneen, Monica M.; Liu, Bin; Wang, Lei; Reeves, Peter R.
2013-01-01
We have undertaken an extensive survey of a group of epimerases originally named Gne, that were thought to be responsible for inter-conversion of UDP-N-acetylglucosamine (UDP-GlcNAc) and UDP-N-acetylgalactosamine (UDP-GalNAc). The analysis builds on recent work clarifying the specificity of some of these epimerases. We find three well defined clades responsible for inter-conversion of the gluco- and galacto-configuration at C4 of different N-acetylhexosamines. Their major biological roles are the formation of UDP-GalNAc, UDP-N-acetylgalactosaminuronic acid (UDP-GalNAcA) and undecaprenyl pyrophosphate-N-acetylgalactosamine (UndPP-GalNAc) from the corresponding glucose forms. We propose that the clade of UDP-GlcNAcA epimerase genes be named gnaB and the clade of UndPP-GlcNAc epimerase genes be named gnu, while the UDP-GlcNAc epimerase genes retain the name gne. The Gne epimerases, as now defined after exclusion of those to be named GnaB or Gnu, are in the same clade as the GalE 4-epimerases for inter-conversion of UDP-glucose (UDP-Glc) and UDP-galactose (UDP-Gal). This work brings clarity to an area that had become quite confusing. The identification of distinct enzymes for epimerisation of UDP-GlcNAc, UDP-GlcNAcA and UndPP-GlcNAc will greatly facilitate allocation of gene function in polysaccharide gene clusters, including those found in bacterial genome sequences. A table of the accession numbers for the 295 proteins used in the analysis is provided to enable the major tree to be regenerated with the inclusion of additional proteins of interest. This and other suggestions for annotation of 4-epimerase genes will facilitate annotation. PMID:23799153
Genetics Home Reference: GRACILE syndrome
The librarian as research informationist: a case study.
Federer, Lisa
2013-10-01
How can an embedded research informationist add value to the scientific output of research teams? The University of California-Los Angeles (UCLA) Louise M. Darling Biomedical Library is an academic health sciences library serving the clinical, educational, and research needs of the UCLA community. A grant from the National Library of Medicine funded a librarian to join a UCLA research team as an informationist. The informationist meets regularly with the research team and provides guidance related to data management, preservation, and other information-related issues. Early results suggest that the informationist's involvement has influenced the team's data gathering, storage, and curation methods. The UCLA Library has also changed the librarian's title to research informationist to reflect the new activities that she performs. The research informationist role provides an opportunity for librarians to become effective members of research teams and improve research output.
ERIC Educational Resources Information Center
Melo, Luiza Baptista; Pires, Cesaltina
2012-01-01
This paper investigates the factors that influence the value for the users of the Portuguese electronic scientific information consortium b-on (Biblioteca do Conhecimento Online). We used the contingent valuation method based on a willingness to pay scenario to estimate the value that each user is willing to pay. Data were collected through an…
Phyx: phylogenetic tools for unix.
Brown, Joseph W; Walker, Joseph F; Smith, Stephen A
2017-06-15
The ease with which phylogenomic data can be generated has drastically escalated the computational burden for even routine phylogenetic investigations. To address this, we present phyx: a collection of programs written in C++ to explore, manipulate, analyze and simulate phylogenetic objects (alignments, trees and MCMC logs). Modelled after Unix/GNU/Linux command line tools, individual programs perform a single task and operate on standard I/O streams that can be piped to quickly and easily form complex analytical pipelines. Because of the stream-centric paradigm, memory requirements are minimized (often only a single tree or sequence in memory at any instance), and hence phyx is capable of efficiently processing very large datasets. phyx runs on POSIX-compliant operating systems. Source code, installation instructions, documentation and example files are freely available under the GNU General Public License at https://github.com/FePhyFoFum/phyx. eebsmith@umich.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
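A minimal sketch of the stream-centric paradigm described above, in Python rather than phyx's C++ (the input file name and the 50-taxon threshold are arbitrary placeholders): trees are read and filtered one at a time, so only a single tree is ever held in memory.
```python
def stream_trees(path):
    """Yield one newick string per line; only a single tree is in memory at a time."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line:
                yield line

def n_leaves(newick):
    """Leaf count of a newick tree: commas + 1 (assumes labels contain no quoted commas)."""
    return newick.count(",") + 1

# Keep only trees with at least 50 taxa; 'trees.nwk' is a placeholder file name.
big = (t for t in stream_trees("trees.nwk") if n_leaves(t) >= 50)
for tree in big:
    print(n_leaves(tree))
```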
Otegui, Javier; Ariño, Arturo H
2012-08-15
In any data quality workflow, data publishers must become aware of issues in their data so these can be corrected. User feedback mechanisms provide one avenue, while global assessments of datasets provide another. To date, there is no publicly available tool to allow both biodiversity data institutions sharing their data through the Global Biodiversity Information Facility network and its potential users to assess datasets as a whole. Contributing to bridge this gap both for publishers and users, we introduce BIoDiversity DataSets Assessment Tool, an online tool that enables selected diagnostic visualizations on the content of data publishers and/or their individual collections. The online application is accessible at http://www.unav.es/unzyec/mzna/biddsat/ and is supported by all major browsers. The source code is licensed under the GNU GPLv3 license (http://www.gnu.org/licenses/gpl-3.0.txt) and is available at https://github.com/jotegui/BIDDSAT.
Common Graphics Library (CGL). Volume 1: LEZ user's guide
NASA Technical Reports Server (NTRS)
Taylor, Nancy L.; Hammond, Dana P.; Hofler, Alicia S.; Miner, David L.
1988-01-01
Users are introduced to and instructed in the use of the Langley Easy (LEZ) routines of the Common Graphics Library (CGL). The LEZ routines form an application independent graphics package which enables the user community to view data quickly and easily, while providing a means of generating scientific charts conforming to the publication and/or viewgraph process. A distinct advantage for using the LEZ routines is that the underlying graphics package may be replaced or modified without requiring the users to change their application programs. The library is written in ANSI FORTRAN 77, and currently uses a CORE-based underlying graphics package, and is therefore machine independent, providing support for centralized and/or distributed computer systems.
The fast azimuthal integration Python library: pyFAI.
Ashiotis, Giannis; Deschildre, Aurore; Nawaz, Zubair; Wright, Jonathan P; Karkoulis, Dimitrios; Picca, Frédéric Emmanuel; Kieffer, Jérôme
2015-04-01
pyFAI is an open-source software package designed to perform azimuthal integration and, correspondingly, two-dimensional regrouping on area-detector frames for small- and wide-angle X-ray scattering experiments. It is written in Python (with binary submodules for improved performance), a language widely accepted and used by the scientific community today, which enables users to easily incorporate the pyFAI library into their processing pipeline. This article focuses on recent work, especially the ease of calibration, its accuracy and the execution speed for integration.
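A hedged usage sketch of the azimuthal-integration step described above; the PONI calibration file and image file names are placeholders, and minor details of the pyFAI/fabio API may differ between versions.
```python
import fabio          # image I/O companion package commonly used with pyFAI
import pyFAI

ai = pyFAI.load("calibration.poni")                    # detector geometry from a PONI file
img = fabio.open("scattering_frame.edf").data          # 2-D detector frame as a numpy array
result = ai.integrate1d(img, 2000, unit="q_nm^-1")     # 1-D azimuthal integration, 2000 bins
q, intensity = result.radial, result.intensity
print(q[:5], intensity[:5])
```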
Give a Book, Take a Book | Poster
Collection has begun for the 15th Annual Book & Media Swap sponsored by the Scientific Library. NCI at Frederick staff can use this opportunity to clear out personal book and DVD shelves of unwanted materials, donate them to the swap, and then receive “new” materials in return. The library staff will collect materials through Tuesday, Oct. 27. Kick-off day for the event is Wednesday, Oct. 28, 10 a.m. to 2 p.m., in the lobby of the Conference Center in Building 549.
Lyceum: A Multi-Protocol Digital Library Gateway
NASA Technical Reports Server (NTRS)
Maa, Ming-Hokng; Nelson, Michael L.; Esler, Sandra L.
1997-01-01
Lyceum is a prototype scalable query gateway that provides a logically central interface to multi-protocol and physically distributed, digital libraries of scientific and technical information. Lyceum processes queries to multiple syntactically distinct search engines used by various distributed information servers from a single logically central interface without modification of the remote search engines. A working prototype (http://www.larc.nasa.gov/lyceum/) demonstrates the capabilities, potentials, and advantages of this type of meta-search engine by providing access to over 50 servers covering over 20 disciplines.
Caribbean Equal Access Program: HIV/AIDS Information Resources from the National Library of Medicine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nancy Dancy, NLM, and Wilma Templin-Branner, ORISE
2009-01-01
As the treatment and management of HIV/AIDS continues to evolve with new scientific breakthroughs, treatment discoveries, and management challenges, it is difficult for people living with HIV/AIDS and those who care for them to keep up with the latest information on HIV/AIDS prevention, treatment, and research. The National Library of Medicine, of the National Institutes of Health, has a wealth of health information resources freely available on the Internet to address these needs.
ERIC Educational Resources Information Center
Oficina de Educacion Iberoamericana, Madrid (Spain).
The Office of Iberoamerican Education, an intergovernmental body based on educational and cultural cooperation for the purpose of disseminating information, documentation, advice, and assistance in the field of education, co-sponsors (with UNESCO) the work represented in this study of library and information planning and facilities in the Andean…
Charon Message-Passing Toolkit for Scientific Computations
NASA Technical Reports Server (NTRS)
VanderWijngaart, Rob F.; Yan, Jerry (Technical Monitor)
2000-01-01
Charon is a library, callable from C and Fortran, that aids the conversion of structured-grid legacy codes, such as those used in the numerical computation of fluid flows, into parallel, high-performance codes. Key among its functions are those that define distributed arrays, map between distributed and non-distributed arrays, and allow easy specification of common communications on structured grids. The library is based on the widely accepted MPI message-passing standard. We present an overview of the functionality of Charon and some representative results.
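The core idea of mapping a global structured grid onto per-process blocks can be sketched without reproducing Charon's actual API; the plain-Python helper below only illustrates the kind of index bookkeeping such a library automates.
```python
def block_range(n_global: int, nprocs: int, rank: int) -> range:
    """Contiguous slice of 0..n_global-1 owned by 'rank' out of 'nprocs' processes."""
    base, extra = divmod(n_global, nprocs)
    start = rank * base + min(rank, extra)
    size = base + (1 if rank < extra else 0)
    return range(start, start + size)

# Example: distribute a 10-point 1-D grid over 3 ranks.
for rank in range(3):
    print(rank, list(block_range(10, 3, rank)))
# rank 0 -> indices 0..3, rank 1 -> 4..6, rank 2 -> 7..9
```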
[Tomaso Rangone (1493-1577): an Italian physician and his library].
Herrmann, Sabine
2012-01-01
The private library of Tomaso Rangone (1493-1577), famous for his patronage of Jacopo Sansovino and Alessandro Vittoria, reflects not only the personal interests of a medical practitioner in the Italian Renaissance, but also the social and scientific development of the first half of the Cinquecento: the popularity of astrology, the effect of the European expansion on geography, the growing interest in historiography, the advances in medicine and botany, and the lingering influence of medieval scholasticism.
American Academy of Forensic Sciences
ERIC Educational Resources Information Center
Proceedings of the ASIS Annual Meeting, 1996
1996-01-01
Includes abstracts of special interest group (SIG) sessions. Highlights include digital imagery; text summarization; browsing; digital libraries; icons and the Web; information management; curricula planning; interfaces; information systems; theories; scholarly and scientific communication; global development; archives; document delivery;…
Rapid development of medical imaging tools with open-source libraries.
Caban, Jesus J; Joshi, Alark; Nagy, Paul
2007-11-01
Rapid prototyping is an important element in researching new imaging analysis techniques and developing custom medical applications. In the last ten years, the open source community and the number of open source libraries and freely available frameworks for biomedical research have grown significantly. What they offer are now considered standards in medical image analysis, computer-aided diagnosis, and medical visualization. A cursory review of the peer-reviewed literature in imaging informatics (indeed, in almost any information technology-dependent scientific discipline) indicates the current reliance on open source libraries to accelerate development and validation of processes and techniques. In this survey paper, we review and compare a few of the most successful open source libraries and frameworks for medical application development. Our dual intentions are to provide evidence that these approaches already constitute a vital and essential part of medical image analysis, diagnosis, and visualization and to motivate the reader to use open source libraries and software for rapid prototyping of medical applications and tools.
A Padawan Programmer's Guide to Developing Software Libraries.
Yurkovich, James T; Yurkovich, Benjamin J; Dräger, Andreas; Palsson, Bernhard O; King, Zachary A
2017-11-22
With the rapid adoption of computational tools in the life sciences, scientists are taking on the challenge of developing their own software libraries and releasing them for public use. This trend is being accelerated by popular technologies and platforms, such as GitHub, Jupyter, R/Shiny, that make it easier to develop scientific software and by open-source licenses that make it easier to release software. But how do you build a software library that people will use? And what characteristics do the best libraries have that make them enduringly popular? Here, we provide a reference guide, based on our own experiences, for developing software libraries along with real-world examples to help provide context for scientists who are learning about these concepts for the first time. While we can only scratch the surface of these topics, we hope that this article will act as a guide for scientists who want to write great software that is built to last. Copyright © 2017 Elsevier Inc. All rights reserved.
Dee, C R; Rankin, J A; Burns, C A
1998-07-01
Journal usage studies, which are useful for budget management and for evaluating collection performance relative to library use, have generally described a single library or subject discipline. The Southern Chapter/Medical Library Association (SC/MLA) study has examined journal usage at the aggregate data level with the long-term goal of developing hospital library benchmarks for journal use. Thirty-six SC/MLA hospital libraries, categorized for the study by size as small, medium, or large, reported current journal title use centrally for a one-year period following standardized data collection procedures. Institutional and aggregate data were analyzed for the average annual frequency of use, average costs per use and non-use, and average percent of non-used titles. Permutation F-type tests were used to measure difference among the three hospital groups. Averages were reported for each data set analysis. Statistical tests indicated no significant differences between the hospital groups, suggesting that benchmarks can be derived applying to all types of hospital libraries. The unanticipated lack of commonality among heavily used titles pointed to a need for uniquely tailored collections. Although the small sample size precluded definitive results, the study's findings constituted a baseline of data that can be compared against future studies.
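As a generic illustration of the permutation-testing idea used above (not the SC/MLA data or their exact F-type statistic; the counts below are invented), group labels are shuffled and the observed between-group spread is compared with its permutation distribution.
```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical annual use counts for small / medium / large hospital libraries.
groups = [np.array([110.0, 95.0, 130.0]),
          np.array([150.0, 140.0, 160.0, 155.0]),
          np.array([210.0, 190.0, 205.0])]

def between_group_stat(gs):
    """Between-group sum of squares, an F-type numerator."""
    grand = np.concatenate(gs).mean()
    return sum(len(g) * (g.mean() - grand) ** 2 for g in gs)

observed = between_group_stat(groups)
pooled = np.concatenate(groups)
sizes = [len(g) for g in groups]

count, n_perm = 0, 10_000
for _ in range(n_perm):
    perm = rng.permutation(pooled)
    resampled, start = [], 0
    for s in sizes:
        resampled.append(perm[start:start + s])
        start += s
    if between_group_stat(resampled) >= observed:
        count += 1
print("permutation p-value ~", count / n_perm)
```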
An Open-Source Approach for Catchment's Physiographic Characterization
NASA Astrophysics Data System (ADS)
Di Leo, M.; Di Stefano, M.
2013-12-01
A water catchment's hydrologic response is intimately linked to its morphological shape, which is a signature on the landscape of the particular climate conditions that generated the hydrographic basin over time. Furthermore, geomorphologic structures influence hydrologic regimes and land cover (vegetation). For these reasons, a basin's characterization is a fundamental element in hydrological studies. Physiographic descriptors have been extracted manually for a long time, but currently Geographic Information System (GIS) tools ease this task, offering hydrologists a powerful instrument to save time and improve the accuracy of results. Here we present a program that combines the flexibility of the Python programming language with the reliability of GRASS GIS and automatically performs the catchment's physiographic characterization. GRASS (Geographic Resources Analysis Support System) is a Free and Open Source GIS that today can look back on 30 years of successful development in geospatial data management and analysis, image processing, graphics and map production, spatial modeling and visualization. The recent development of new hydrologic tools, coupled with the tremendous boost in the existing flow-routing algorithms, reduced the computational time and made GRASS a complete toolset for hydrological analysis, even for large datasets. The tool presented here is a module called r.basin, following GRASS' traditional nomenclature, where the "r" stands for "raster"; it is available for GRASS version 6.x and, more recently, for GRASS 7. As input it uses a Digital Elevation Model and the coordinates of the outlet, and, powered by the recently developed r.stream.* hydrological tools, it performs the flow calculation, delimits the basin's boundaries and extracts the drainage network, returning the flow direction and accumulation, distance-to-outlet and hillslope length maps. Based on those maps, it calculates hydrologically meaningful shape factors and morphological parameters such as topological diameter, drainage density, Horton's ratios, concentration time, and many more, besides producing statistics on the main channel and elevation, and geometric features such as the centroid's coordinates and the rectangle containing the basin. Exploiting Python libraries such as NumPy and Matplotlib, it produces graphics such as the hypsographic and hypsometric curves and the width function. The results are exported as a spreadsheet in CSV format and the graphics as PNG files. The advantages offered by the implementation in Python and GRASS are manifold. Python is a powerful scripting language with huge potential for researchers due to its relative simplicity, high flexibility, and the broad availability of scientific libraries. GRASS, and as a consequence r.basin, is platform-independent, so it is available for GNU/Linux, MS Windows, Mac, etc. Furthermore, the module is constantly maintained and improved according to users' feedback, with the valuable help of expert developers. The code is available for review in the official GRASS add-ons repository, allowing hydrologists and researchers to knowingly use, inspect, modify, reuse, and even incorporate it in other projects, such as web services.
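Two of the morphometric descriptors mentioned above can be written down compactly; the numbers below are hypothetical and the snippet only illustrates the standard formulas, not r.basin's code.
```python
import numpy as np

basin_area_km2 = 45.0
total_stream_length_km = 82.0
drainage_density = total_stream_length_km / basin_area_km2      # km of channel per km^2

# Streams per Strahler order (orders 1..4); Horton's law predicts a roughly
# constant bifurcation ratio N_i / N_{i+1}, estimated here as the mean of successive ratios.
streams_per_order = np.array([48, 11, 3, 1])
bifurcation_ratios = streams_per_order[:-1] / streams_per_order[1:]
print(drainage_density, bifurcation_ratios.mean())
```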
An open-source library for the numerical modeling of mass-transfer in solid oxide fuel cells
NASA Astrophysics Data System (ADS)
Novaresio, Valerio; García-Camprubí, María; Izquierdo, Salvador; Asinari, Pietro; Fueyo, Norberto
2012-01-01
The generation of direct current electricity using solid oxide fuel cells (SOFCs) involves several interplaying transport phenomena. Their simulation is crucial for the design and optimization of reliable and competitive equipment, and for the eventual market deployment of this technology. An open-source library for the computational modeling of mass-transport phenomena in SOFCs is presented in this article. It includes several multicomponent mass-transport models ( i.e. Fickian, Stefan-Maxwell and Dusty Gas Model), which can be applied both within porous media and in porosity-free domains, and several diffusivity models for gases. The library has been developed for its use with OpenFOAM ®, a widespread open-source code for fluid and continuum mechanics. The library can be used to model any fluid flow configuration involving multicomponent transport phenomena and it is validated in this paper against the analytical solution of one-dimensional test cases. In addition, it is applied for the simulation of a real SOFC and further validated using experimental data. Program summaryProgram title: multiSpeciesTransportModels Catalogue identifier: AEKB_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEKB_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License No. of lines in distributed program, including test data, etc.: 18 140 No. of bytes in distributed program, including test data, etc.: 64 285 Distribution format: tar.gz Programming language:: C++ Computer: Any x86 (the instructions reported in the paper consider only the 64 bit case for the sake of simplicity) Operating system: Generic Linux (the instructions reported in the paper consider only the open-source Ubuntu distribution for the sake of simplicity) Classification: 12 External routines: OpenFOAM® (version 1.6-ext) ( http://www.extend-project.de) Nature of problem: This software provides a library of models for the simulation of the steady state mass and momentum transport in a multi-species gas mixture, possibly in a porous medium. The software is particularly designed to be used as the mass-transport library for the modeling of solid oxide fuel cells (SOFC). When supplemented with other sub-models, such as thermal and charge-transport ones, it allows the prediction of the cell polarization curve and hence the cell performance. Solution method: Standard finite volume method (FVM) is used for solving all the conservation equations. The pressure-velocity coupling is solved using the SIMPLE algorithm (possibly adding a porous drag term if required). The mass transport can be calculated using different alternative models, namely Fick, Maxwell-Stefan or dusty gas model. The code adopts a segregated method to solve the resulting linear system of equations. The different regions of the SOFC, namely gas channels, electrodes and electrolyte, are solved independently, and coupled through boundary conditions. Restrictions: When extremely large species fluxes are considered, current implementation of the Neumann and Robin boundary conditions do not avoid negative values of molar and/or mass fractions, which finally end up with numerical instability. However this never happened in the documented runs. Eventually these boundary conditions could be reformulated to become more robust. Running time: From seconds to hours depending on the mesh size and number of species. 
For example, on a 64 bit machine with Intel Core Duo T8300 and 3 GBytes of RAM, the provided test run requires less than 1 second.
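As a minimal illustration of the simplest of the transport models listed above (Fick's first law in one dimension, not the OpenFOAM library itself; the diffusivity and concentration profile are assumed values):
```python
import numpy as np

D = 2.0e-5                                   # m^2/s, assumed effective diffusivity in the mixture
z = np.linspace(0.0, 1.0e-3, 101)            # 1 mm thick porous anode
c = 40.0 - 15.0e3 * z                        # mol/m^3, assumed linear concentration profile
flux = -D * np.gradient(c, z)                # Fick's first law: J = -D dc/dz, in mol m^-2 s^-1
print(flux[:3])
```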
Pcetk: A pDynamo-based Toolkit for Protonation State Calculations in Proteins.
Feliks, Mikolaj; Field, Martin J
2015-10-26
Pcetk (a pDynamo-based continuum electrostatic toolkit) is an open-source, object-oriented toolkit for the calculation of proton binding energetics in proteins. The toolkit is a module of the pDynamo software library, combining the versatility of the Python scripting language and the efficiency of the compiled languages, C and Cython. In the toolkit, we have connected pDynamo to the external Poisson-Boltzmann solver, extended-MEAD. Our goal was to provide a modern and extensible environment for the calculation of protonation states, electrostatic energies, titration curves, and other electrostatic-dependent properties of proteins. Pcetk is freely available under the CeCILL license, which is compatible with the GNU General Public License. The toolkit can be found on the Web at the address http://github.com/mfx9/pcetk. The calculation of protonation states in proteins requires a knowledge of pKa values of protonatable groups in aqueous solution. However, for some groups, such as protonatable ligands bound to protein, the pKa aq values are often difficult to obtain from experiment. As a complement to Pcetk, we revisit an earlier computational method for the estimation of pKa aq values that has an accuracy of ± 0.5 pKa-units or better. Finally, we verify the Pcetk module and the method for estimating pKa aq values with different model cases.
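For a single independent site, the titration curves mentioned above reduce to the Henderson-Hasselbalch relation; the sketch below is a generic illustration with an assumed pKa, not Pcetk output.
```python
import numpy as np

pKa = 6.5                                    # assumed aqueous pKa of the site
pH = np.linspace(2.0, 12.0, 101)
protonated_fraction = 1.0 / (1.0 + 10.0 ** (pH - pKa))
print(protonated_fraction[[0, 45, 100]])     # ~1 at pH 2, 0.5 at pH 6.5, ~0 at pH 12
```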
Reproducibility of neuroimaging analyses across operating systems
Glatard, Tristan; Lewis, Lindsay B.; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C.
2015-01-01
Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed. PMID:25964757
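The role of single-precision arithmetic highlighted above is easy to demonstrate in miniature: sequentially accumulating many small increments in float32 gives a visibly different result from float64 (a toy example, unrelated to the actual neuroimaging pipelines).
```python
import numpy as np

x = np.full(10_000_000, 1e-7)                # ten million increments of 1e-7
print(x.sum())                               # float64 accumulation, very close to 1.0
print(np.cumsum(x.astype(np.float32))[-1])   # sequential float32 accumulation ends up visibly different
```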
Lambda: A Mathematica package for operator product expansions in vertex algebras
NASA Astrophysics Data System (ADS)
Ekstrand, Joel
2011-02-01
We give an introduction to the Mathematica package Lambda, designed for calculating λ-brackets in both vertex algebras, and in SUSY vertex algebras. This is equivalent to calculating operator product expansions in two-dimensional conformal field theory. The syntax of λ-brackets is reviewed, and some simple examples are shown, both in component notation, and in N=1 superfield notation. Program summaryProgram title: Lambda Catalogue identifier: AEHF_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEHF_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License No. of lines in distributed program, including test data, etc.: 18 087 No. of bytes in distributed program, including test data, etc.: 131 812 Distribution format: tar.gz Programming language: Mathematica Computer: See specifications for running Mathematica V7 or above. Operating system: See specifications for running Mathematica V7 or above. RAM: Varies greatly depending on calculation to be performed. Classification: 4.2, 5, 11.1. Nature of problem: Calculate operator product expansions (OPEs) of composite fields in 2d conformal field theory. Solution method: Implementation of the algebraic formulation of OPEs given by vertex algebras, and especially by λ-brackets. Running time: Varies greatly depending on calculation requested. The example notebook provided takes about 3 s to run.
[Visual representation of biological structures in teaching material].
Morato, M A; Struchiner, M; Bordoni, E; Ricciardi, R M
1998-01-01
Parameters must be defined for presenting and handling scientific information presented in the form of teaching materials. Through library research and consultations with specialists in the health sciences and in graphic arts and design, this study undertook a comparative description of the first examples of scientific illustrations of anatomy and the evolution of visual representations of knowledge on the cell. The study includes significant examples of illustrations which served as elements of analysis.
Indonesia: Development of a Scientific Information Network.
ERIC Educational Resources Information Center
Hernandono
1978-01-01
Discusses the development of a library network in Indonesia, including problems encountered due to inadequate manpower, the need for the support of a powerful national advisory committee, and the possibility of utilizing telecommunication facilities in the future. (CWM)
La Ciencia de los Antiguos Mexicanos: Una Bibliografia Selecta
ERIC Educational Resources Information Center
Ortiz-Franco, Luis; Magana, Maria
1973-01-01
Fifty-five citations pertaining to the scientific and mathematic development of ancient Mexicans, particularly the Mayas, are given in this select bibliography. The introduction and descriptions of resource libraries in 8 States are in Spanish. (NQ)
[Chemotherapy-induced peripheral neuropathies: an integrative review of the literature].
Costa, Talita Cassanta; Lopes, Miriam; Anjos, Anna Cláudia Yokoyama Dos; Zago, Marcia Maria Fontão
2015-04-01
To identify scientific studies and to deepen the knowledge of peripheral neuropathies induced by antineoplastic chemotherapy, seeking evidence to support the care of cancer patients. Integrative review of the literature conducted in the databases Latin American and Caribbean Health Sciences (LILACS), Scientific Electronic Library Online (SciELO), Medical Literature Analysis (PubMed/MEDLINE), the Cochrane Library and the Spanish Bibliographic Index of Health Sciences (IBECS). The sample consisted of 15 studies published between 2005 and 2014 that met the inclusion criteria. The studies addressed advanced age, the main symptoms of neuropathy, and the chemotherapy agents for which neuropathy is an important adverse effect. We identified a small number of studies on the topic, as well as a low production of evidence on interventions with positive results. New studies on prevention and/or treatment are needed, enabling adjustment of the patient's cancer chemotherapy and, consequently, better care.
WE-F-211-01: The Evolving Landscape of Scientific Publishing.
Armato, S; Hendee, W; Marshall, C; Curran, B
2012-06-01
The dissemination of scientific advances has changed little since the first peer-reviewed journal was published in 1665 - that is, until this past decade. The print journal, delivered by mail and stored on office shelves and in library reading rooms around the world, has been transformed by immediate, on-demand access to scientific discovery in electronic form. At the same time, the producers and consumers of that scientific content have greatly increased in number, and the balance between supply and demand has required innovations in the world of scientific publishing. In light of technological advances and societal expectations, the dissemination of scientific knowledge has assumed a new form, one that is dynamic and rapidly changing. The academic medical physicist must understand this evolution to ensure that appropriate decisions are made with regard to journal submission strategies and that relevant information on new findings is obtained in a timely manner. Medical Physics is adapting to these changes in substantive ways. This new scientific publishing landscape has implications for subscription models, targeted access through semantic enrichment, user interactivity with content, customized content delivery, and advertising opportunities. Many organizations, including the AAPM, depend on scientific publishing as a significant source of revenue, but web-based delivery raises the expectation that access should be free and threatens this model. The purpose of this symposium is to explore the factors that have contributed to the current state of scientific publishing, to anticipate future directions in this arena, and to convey how medical physicists may benefit from the expanded opportunities, both as authors and as readers. 1. To appreciate the importance of scientific and clinical practice communication for the advancement of the medical physics field 2. To understand the roles of the Editorial Board and the Journal Business Management Committee in the promotion and advancement of Medical Physics 3. To explore technology-driven content delivery mechanisms and their role in facilitating content access and driving content usage 4. To understand the potential benefits and pitfalls of various economic and editorial models of scientific publications and the recent shifts away from the traditional role of libraries. © 2012 American Association of Physicists in Medicine.
Clinical simulation with dramatization: gains perceived by students and health professionals.
Negri, Elaine Cristina; Mazzo, Alessandra; Martins, José Carlos Amado; Pereira, Gerson Alves; Almeida, Rodrigo Guimarães Dos Santos; Pedersoli, César Eduardo
2017-08-03
To identify in the literature the gains that health students and professionals perceive when using clinical simulation with dramatization resources. An integrative literature review was conducted using the method proposed by the Joanna Briggs Institute (JBI), with searches in the following databases: Latin American and Caribbean Health Sciences Literature, Web of Science, National Library of Medicine, Cumulative Index to Nursing and Allied Health Literature, The Cochrane Library, Scopus, and Scientific Electronic Library Online. Fifty-three studies that met the established inclusion criteria were analyzed. Among the gains reported, satisfaction, self-confidence, knowledge, empathy, realism, reduced anxiety, comfort, communication, motivation, capacity for reflection and critical thinking, and teamwork stand out. The evidence demonstrates the broad possibilities for using dramatization in the context of clinical simulation, with gains in the different health areas as well as interprofessional gains.
[Reproductive health: a contribution to the evaluation of a virtual library].
Alvarez, Maria do Carmo Avamilano; Cuenca, Angela Maria Belloni; Noronha, Daisy Pires; Schor, Néia
2007-10-01
Virtual libraries have been implemented in an attempt to organize scientific information found in the Internet, including the Biblioteca Virtual de Saúde Reprodutiva (BVSR), or Virtual Library on Reproductive Health. The aim is to provide quality information to researchers in the reproductive health field. The current study evaluates the use of the BVSR, emphasizing the users' expectations, difficulties, and suggestions. The study adopted a qualitative methodology. The focus group technique was applied to Internet chat groups through which reproductive health researchers communicated. Users expressed their expectations regarding information, highlighting the lack of time and the need to quickly obtain precise data. Use of virtual libraries for research increases where there is more trust in the institutions responsible for maintaining them. Researchers suggested the following: greater dissemination of the BVSR, publication of an electronic newsletter, and creation of a communications channel between the BVSR and users in order to foster intelligent collective communication.
NASA Technical Reports Server (NTRS)
Nelson, Michael L.; Maly, Kurt; Shen, Stewart N. T.; Zubair, Mohammad
1998-01-01
We describe NCSTRL+, a unified, canonical digital library for scientific and technical information (STI). NCSTRL+ is based on the Networked Computer Science Technical Report Library (NCSTRL), a World Wide Web (WWW) accessible digital library (DL) that provides access to over 100 university departments and laboratories. NCSTRL+ implements two new technologies: cluster functionality and publishing buckets. We have extended Dienst, the protocol underlying NCSTRL, to provide the ability to cluster independent collections into a logically centralized digital library based upon subject category classification, type of organization, and genres of material. The bucket construct provides a mechanism for publishing and managing logically linked entities with multiple data forms as a single object. The NCSTRL+ prototype DL contains the holdings of NCSTRL and the NASA Technical Report Server (NTRS). The prototype demonstrates the feasibility of publishing into a multi-cluster DL, searching across clusters, and storing and presenting buckets of information.
NASA Technical Reports Server (NTRS)
Pinelli, Thomas E.; Kennedy, John M.; White, Terry F.
1991-01-01
Phase 2 of the four-phase NASA/DoD Aerospace Knowledge Diffusion Research Project was undertaken to study the transfer of scientific and technical information (STI) from government to the aerospace industry and the role of librarians and technical information specialists in the transfer process. Data were collected through a self-administered mailback questionnaire. Libraries identified as holding substantial aerospace or aeronautical technical report collections were selected to receive the questionnaires. Within each library, the person responsible for technical reports was asked to answer the questionnaire. Questionnaires were returned from approximately 68 percent of the libraries. The respondents indicated that scientists and engineers are not aware of the services available from libraries/technical information centers and also under-utilize those services. The respondents also indicated they should be more involved in the process.
CRITIC2: A program for real-space analysis of quantum chemical interactions in solids
NASA Astrophysics Data System (ADS)
Otero-de-la-Roza, A.; Johnson, Erin R.; Luaña, Víctor
2014-03-01
We present CRITIC2, a program for the analysis of quantum-mechanical atomic and molecular interactions in periodic solids. This code, a greatly improved version of the previous CRITIC program (Otero-de-la Roza et al., 2009), can: (i) find critical points of the electron density and related scalar fields such as the electron localization function (ELF), Laplacian, … (ii) integrate atomic properties in the framework of Bader’s Atoms-in-Molecules theory (QTAIM), (iii) visualize non-covalent interactions in crystals using the non-covalent interactions (NCI) index, (iv) generate relevant graphical representations including lines, planes, gradient paths, contour plots, atomic basins, … and (v) perform transformations between file formats describing scalar fields and crystal structures. CRITIC2 can interface with the output produced by a variety of electronic structure programs including WIEN2k, elk, PI, abinit, Quantum ESPRESSO, VASP, Gaussian, and, in general, any other code capable of writing the scalar field under study to a three-dimensional grid. CRITIC2 is parallelized, completely documented (including illustrative test cases) and publicly available under the GNU General Public License. Catalogue identifier: AECB_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AECB_v2_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: yes No. of lines in distributed program, including test data, etc.: 11686949 No. of bytes in distributed program, including test data, etc.: 337020731 Distribution format: tar.gz Programming language: Fortran 77 and 90. Computer: Workstations. Operating system: Unix, GNU/Linux. Has the code been vectorized or parallelized?: Shared-memory parallelization can be used for most tasks. Classification: 7.3. Catalogue identifier of previous version: AECB_v1_0 Journal reference of previous version: Comput. Phys. Comm. 180 (2009) 157 Nature of problem: Analysis of quantum-chemical interactions in periodic solids by means of atoms-in-molecules and related formalisms. Solution method: Critical point search using Newton’s algorithm, atomic basin integration using bisection, qtree and grid-based algorithms, diverse graphical representations and computation of the non-covalent interactions index on a three-dimensional grid. Additional comments: !!!!! The distribution file for this program is over 330 Mbytes and therefore is not delivered directly when download or Email is requested. Instead a html file giving details of how the program can be obtained is sent. !!!!! Running time: Variable, depending on the crystal and the source of the underlying scalar field.
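The critical-point search named in the solution method above boils down to Newton iteration on the gradient of a scalar field; the sketch below applies it to a simple analytic 2-D function, as an illustration of the idea only, not CRITIC2's Fortran implementation.
```python
import numpy as np

def grad_hess(p):
    """Gradient and Hessian of f(x, y) = exp(-x**2 - 2*y**2) at point p."""
    x, y = p
    f = np.exp(-x**2 - 2*y**2)
    g = np.array([-2*x*f, -4*y*f])
    h = np.array([[(4*x**2 - 2) * f, 8*x*y*f],
                  [8*x*y*f, (16*y**2 - 4) * f]])
    return g, h

p = np.array([0.4, -0.3])                    # starting guess
for _ in range(50):
    g, h = grad_hess(p)
    if np.linalg.norm(g) < 1e-12:            # stop when the gradient vanishes
        break
    p = p - np.linalg.solve(h, g)            # Newton step on the gradient
print(p)                                     # converges to the maximum at (0, 0)
```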
Runwien: a text-based interface for the WIEN package
NASA Astrophysics Data System (ADS)
Otero de la Roza, A.; Luaña, Víctor
2009-05-01
A new text-based interface for WIEN2k, the full-potential linearized augmented plane-waves (FPLAPW) program, is presented. This code provides an easy to use, yet powerful way of generating arbitrarily large sets of calculations. Thus, properties over a potential energy surface and WIEN2k parameter exploration can be calculated using a simple input text file. This interface also provides new capabilities to the WIEN2k package, such as the calculation of elastic constants on hexagonal systems or the automatic gathering of relevant information. Additionally, runwien is modular, flexible and intuitive. Program summaryProgram title: runwien Catalogue identifier: AECM_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AECM_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GPL version 3 No. of lines in distributed program, including test data, etc.: 62 567 No. of bytes in distributed program, including test data, etc.: 610 973 Distribution format: tar.gz Programming language: gawk (with locale POSIX or similar) Computer: All running Unix, Linux Operating system: Unix, GNU/Linux Classification: 7.3 External routines: WIEN2k ( http://www.wien2k.at/), GAWK ( http://www.gnu.org/software/gawk/), rename by L. Wall, a Perl script which renames files, modified by R. Barker to check for the existence of target files, gnuplot ( http://www.gnuplot.info/) Subprograms used:Cat Id: ADSY_v1_0/AECB_v1_0, Title: GIBBS/CRITIC, Reference: CPC 158 (2004) 57/CPC 999 (2009) 999 Nature of problem: Creation of a text-based, batch-oriented interface for the WIEN2k package. Solution method: WIEN2k solves the Kohn-Sham equations of a solid using the FPLAPW formalism. Runwien interprets an input file containing the description of the geometry and structure of the solid and drives the execution of the WIEN2k programs. The input is simplified thanks to the default values of the WIEN2k parameters known to runwien. Additional comments: Designed for WIEN2k versions 06.4, 07.2, 08.2, and 08.3. Running time: For the test case (TiC), a single geometry takes 5 to 10 minutes on a typical desktop PC (Intel Pentium 4, 3.4 GHz, 1 GB RAM). The full example including the calculation of the elastic constants and the equation of state, takes 9 hours and 32 minutes.
Milnthorpe, Andrew T; Soloviev, Mikhail
2011-04-15
The Cancer Genome Anatomy Project (CGAP) xProfiler and cDNA Digital Gene Expression Displayer (DGED) were made available to the scientific community over a decade ago and have since been used widely to find genes that are differentially expressed between cancer and normal tissues. The tissue types are usually chosen according to the ontology hierarchy developed by NCBI. The xProfiler uses an internally available flat file database to determine the presence or absence of genes in the chosen libraries, while cDNA DGED uses the publicly available UniGene Expression and Gene relational databases to count the sequences found for each gene in the presented libraries. We discovered that the CGAP approach often includes libraries from dependent or irrelevant tissues (on average, one third of the libraries were incorrect, and for some tissue searches no correct libraries were selected at all). We also discovered that the CGAP approach reports genes from outside the selected libraries and may omit genes found within them. Other errors include the incorrect estimation of the significance values and inaccurate settings for the library size cut-off values. We advocate a revised approach to finding libraries associated with tissues, so that libraries from dependent or irrelevant tissues do not get included in the final library pool. We also revised the method for determining the presence or absence of a gene by searching the UniGene relational database, revised the calculation of statistical significance and corrected the library cut-off filter. Our results justify re-evaluation of all previously reported results where NCBI CGAP expression data and tools were used.
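For readers unfamiliar with digital gene expression comparisons, a two-library test can be illustrated with a 2x2 Fisher's exact test on tag counts; the counts below are invented and this is not the DGED statistic itself.
```python
from scipy.stats import fisher_exact

gene_cancer, total_cancer = 45, 120_000      # tags for the gene / library size (hypothetical)
gene_normal, total_normal = 12, 100_000

table = [[gene_cancer, total_cancer - gene_cancer],
         [gene_normal, total_normal - gene_normal]]
odds_ratio, p_value = fisher_exact(table)    # tests association between library and gene counts
print(odds_ratio, p_value)
```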
Interfaces for Distributed Systems of Information Servers.
ERIC Educational Resources Information Center
Kahle, Brewster M.; And Others
1993-01-01
Describes five interfaces to remote, full-text databases accessed through distributed systems of servers. These are WAIStation for the Macintosh, XWAIS for X-Windows, GWAIS for Gnu-Emacs; SWAIS for dumb terminals, and Rosebud for the Macintosh. Sixteen illustrations provide examples of display screens. Problems and needed improvements are…
Global manipulation of digital images can lead to variation in cytological diagnosis
Prasad, H; Wanjari, Sangeeta; Parwani, Rajkumar
2011-01-01
Background: With the adoption of a completely electronic workflow by several journals and the advent of telepathology, digital imaging has become an integral part of scientific research. However, manipulating digital images is very easy, and it can lead to misinterpretations. Aim: To analyse the impact of manipulating digital images on their diagnosis. Design: Digital images were obtained from Papanicolaou-stained smears of dysplastic and normal oral epithelium. They were manipulated using the GNU Image Manipulation Program (GIMP) to alter their brightness, contrast and color levels. A PowerPoint presentation was created from slides of these manipulated images, randomly interleaved with the unaltered originals. The presentation was shown individually to five observers, who rated the images as normal, mild, moderate or severe dysplasia. Weighted κ statistics were used to measure and assess the levels of agreement between observers. Results: Levels of agreement between manipulated images and original images varied greatly among observers. Variation in diagnosis was in the form of overdiagnosis or under-diagnosis, usually by one grade. Conclusion: Global manipulations of digital images of cytological slides can significantly affect their interpretation. Such manipulations should therefore be kept to a minimum and avoided wherever possible. PMID:21572507
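The kind of global brightness and contrast manipulation examined above can be reproduced in a few lines with Pillow instead of GIMP; the file names are placeholders.
```python
from PIL import Image, ImageEnhance

img = Image.open("smear.png")                              # placeholder input image
brighter = ImageEnhance.Brightness(img).enhance(1.3)       # +30% brightness
punchier = ImageEnhance.Contrast(brighter).enhance(1.2)    # +20% contrast
punchier.save("smear_adjusted.png")
```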
Stochastic hyperfine interactions modeling library
NASA Astrophysics Data System (ADS)
Zacate, Matthew O.; Evenson, William E.
2011-04-01
The stochastic hyperfine interactions modeling library (SHIML) provides a set of routines to assist in the development and application of stochastic models of hyperfine interactions. The library provides routines written in the C programming language that (1) read a text description of a model for fluctuating hyperfine fields, (2) set up the Blume matrix, upon which the evolution operator of the system depends, and (3) find the eigenvalues and eigenvectors of the Blume matrix so that theoretical spectra of experimental techniques that measure hyperfine interactions can be calculated. The optimized vector and matrix operations of the BLAS and LAPACK libraries are utilized; however, there was a need to develop supplementary code to find an orthonormal set of (left and right) eigenvectors of complex, non-Hermitian matrices. In addition, example code is provided to illustrate the use of SHIML to generate perturbed angular correlation spectra for the special case of polycrystalline samples when anisotropy terms of higher order than A can be neglected. Program summaryProgram title: SHIML Catalogue identifier: AEIF_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEIF_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU GPL 3 No. of lines in distributed program, including test data, etc.: 8224 No. of bytes in distributed program, including test data, etc.: 312 348 Distribution format: tar.gz Programming language: C Computer: Any Operating system: LINUX, OS X RAM: Varies Classification: 7.4 External routines: TAPP [1], BLAS [2], a C-interface to BLAS [3], and LAPACK [4] Nature of problem: In condensed matter systems, hyperfine methods such as nuclear magnetic resonance (NMR), Mössbauer effect (ME), muon spin rotation (μSR), and perturbed angular correlation spectroscopy (PAC) measure electronic and magnetic structure within Angstroms of nuclear probes through the hyperfine interaction. When interactions fluctuate at rates comparable to the time scale of a hyperfine method, there is a loss in signal coherence, and spectra are damped. The degree of damping can be used to determine fluctuation rates, provided that theoretical expressions for spectra can be derived for relevant physical models of the fluctuations. SHIML provides routines to help researchers quickly develop code to incorporate stochastic models of fluctuating hyperfine interactions in calculations of hyperfine spectra. Solution method: Calculations are based on the method for modeling stochastic hyperfine interactions for PAC by Winkler and Gerdau [5]. The method is extended to include other hyperfine methods following the work of Dattagupta [6]. The code provides routines for reading model information from text files, allowing researchers to develop new models quickly without the need to modify computer code for each new model to be considered. Restrictions: In the present version of the code, only methods that measure the hyperfine interaction on one probe spin state, such as PAC, μSR, and NMR, are supported. Running time: Varies
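The supplementary linear-algebra step described above, finding left and right eigenvectors of a complex non-Hermitian matrix, is available off the shelf in SciPy; the sketch below uses a random toy matrix rather than an actual Blume matrix.
```python
import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))   # toy non-Hermitian matrix

w, vl, vr = eig(B, left=True, right=True)
# Columns of vr are right eigenvectors, columns of vl are left eigenvectors;
# verify B @ vr[:, 0] == w[0] * vr[:, 0] for the first eigenpair.
print(np.allclose(B @ vr[:, 0], w[0] * vr[:, 0]))
```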
The High Level Data Reduction Library
NASA Astrophysics Data System (ADS)
Ballester, P.; Gabasch, A.; Jung, Y.; Modigliani, A.; Taylor, J.; Coccato, L.; Freudling, W.; Neeser, M.; Marchetti, E.
2015-09-01
The European Southern Observatory (ESO) provides pipelines to reduce data for most of the instruments at its Very Large Telescope (VLT). These pipelines are written as part of the development of VLT instruments, and are used both in ESO's operational environment and by science users who receive VLT data. All of the pipelines are highly instrument-specific. However, experience showed that the independently developed pipelines include significant overlap, duplication and slight variations of similar algorithms. In order to reduce the cost of development, verification and maintenance of ESO pipelines, and at the same time improve the scientific quality of pipeline data products, ESO decided to develop a limited set of versatile high-level scientific functions to be used in all future pipelines. These routines are provided by the High-level Data Reduction Library (HDRL). To reach this goal, we first compare several candidate algorithms and verify them during a prototype phase using data sets from several instruments. Once the best algorithm and error model have been chosen, we start a design and implementation phase. The coding of HDRL is done in plain C, using Common Pipeline Library (CPL) functionality. HDRL adopts consistent function naming conventions and a well-defined API to minimise future maintenance costs, implements error propagation, uses pixel quality information, employs OpenMP to take advantage of multi-core processors, and is verified with extensive unit and regression tests. This poster describes the status of the project and the lessons learned during the development of reusable code implementing algorithms of high scientific quality.
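HDRL itself is written in C on top of CPL; the NumPy sketch below only illustrates the kind of bookkeeping the abstract describes (error propagation and pixel-quality masks for a simple image operation) and is not HDRL's API.

```python
# Illustrative sketch (not HDRL's C API): propagate errors and pixel-quality
# masks through a pixel-wise image sum, the kind of bookkeeping the abstract
# attributes to HDRL.
import numpy as np

def add_images(img_a, err_a, bad_a, img_b, err_b, bad_b):
    """Pixel-wise sum with Gaussian error propagation and mask combination."""
    out = img_a + img_b
    # Independent Gaussian errors add in quadrature.
    err = np.sqrt(err_a**2 + err_b**2)
    # A pixel is bad in the result if it is bad in either input.
    bad = bad_a | bad_b
    return out, err, bad

a = np.array([[1.0, 2.0], [3.0, 4.0]])
b = np.array([[0.5, 0.5], [0.5, 0.5]])
ea = np.full_like(a, 0.1)
eb = np.full_like(b, 0.2)
bad_a = np.array([[False, True], [False, False]])
bad_b = np.zeros_like(bad_a)

summed, err, bad = add_images(a, ea, bad_a, b, eb, bad_b)
print(summed, err, bad, sep="\n")
```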
Common Graphics Library (CGL). Volume 2: Low-level user's guide
NASA Technical Reports Server (NTRS)
Taylor, Nancy L.; Hammond, Dana P.; Theophilos, Pauline M.
1989-01-01
The intent is to instruct users of the Low-Level routines of the Common Graphics Library (CGL). The Low-Level routines form an application-independent graphics package enabling the user community to construct and design scientific charts conforming to the publication and/or viewgraph process. The Low-Level routines allow the user to design unique or unusual report-quality charts from a set of graphics utilities. The features of these routines can be used stand-alone or in conjunction with other packages to enhance or augment their capabilities. The library is written in ANSI FORTRAN 77 and currently uses a CORE-based underlying graphics package; it is therefore machine-independent and supports centralized and/or distributed computer systems.
The effect of U.S. policies on the economics of libraries.
Cummings, M M
1985-01-01
The decline in federal support of educational programs has made it difficult for libraries to apply new technologies to improve practices and services. While federal support has declined in constant dollars, there has been a modest increase in grants from private foundations. Current U.S. policies require federal agencies to recover full costs of rendering services (Circular A-25) and require the transfer of many federal service-oriented activities to the commercial sector (Circular A-76). Additionally, the Paperwork Reduction Act of 1980 is inhibiting the production and dissemination of federal publications. Government pursuit of these policies adds a heavy economic burden to libraries and threatens to reduce access to the scholarly and scientific record. PMID:3978292
NOAA Launches Deepwater Horizon Library | NOAA Gulf Spill Restoration
NOAA has unveiled a web archive of the maps, wildlife reports, scientific reports, press releases, and other material related to the Deepwater Horizon spill and its restoration areas, including more than 100 wildlife reports.
Toward a Calculus of Collection Development.
ERIC Educational Resources Information Center
Hamaker, Charles
1993-01-01
Discusses problems in scholarly communication, particularly pricing strategies for scientific and technical journals and their impact on academic libraries' collection development. Highlights include the growth rate in scholarly titles; serials expenditures; research and development expenditures; expenditures at Louisiana State University; shared…
The virtual library: Coming of age
NASA Technical Reports Server (NTRS)
Hunter, Judy F.; Cotter, Gladys A.
1994-01-01
With the high-speed networking capabilities, multiple media options, and massive amounts of information that exist in electronic format today, the concept of a 'virtual' library or 'library without walls' is becoming viable. In the virtual library environment, the information processed goes beyond the traditional definition of documents to include the results of scientific and technical research and development (reports, software, data) recorded in any format or medium: electronic, audio, video, or scanned images. Network access to information must include tools to help locate information sources and navigate the networks to connect to the sources, as well as methods to extract the relevant information. Graphical User Interfaces (GUIs) that are intuitive, and navigational tools such as Intelligent Gateway Processors (IGP), will provide users with seamless and transparent use of high-speed networks to access, organize, and manage information. Traditional libraries will become points of electronic access to information on multiple media. The emphasis will be towards unique collections of information at each library rather than entire collections at every library. It is no longer a question of whether there is enough information available; it is more a question of how to manage the vast volumes of information. The future equation will involve being able to organize knowledge, manage information, and provide access at the point of origin.
Opisthorchis felineus infection prevalence in Western Siberia: A review of Russian literature.
Fedorova, Olga S; Fedotova, Marina M; Sokolova, Tatiana S; Golovach, Ekaterina A; Kovshirina, Yulia V; Ageeva, Tatiana S; Kovshirina, Anna E; Kobyakova, Olga S; Ogorodova, Ludmila M; Odermatt, Peter
2018-02-01
In this study we reviewed Russian scientific literature (scientific publications, book chapters, monographs) published between 1 January 1979 and 31 August 2015 from two sources: the main database of the Russian Scientific Electronic Library (eLIBRARY, http://elibrary.ru/), and the Scientific Medical Library of Siberian State Medical University (http://medlib.tomsk.ru/). Specifically, the review details the infection prevalence of Opisthorchis felineus (O. felineus) in Western Siberia, Russian Federation. From the primary keyword screening, 1591 records were identified, of which 32 Russian-language publications were relevant. The lowest O. felineus infection rate of 0.4% was reported in the Tatarstan Republic, and the highest reached 83.9% in the Khanty-Mansiysk Autonomous Okrug. The infection prevalence was lower in children than in adults and increased with age. O. felineus infection was detected more often in the indigenous population than in migrants. Infection intensity in western regions (Permskaya, Bryanskaya Oblast) was low and varied from 15 to 336 eggs per gram of stool (epg), while in endemic regions it reached more than 2000 epg. In some settlements the mean infection intensity was 5234 epg. High intensities were registered in regions with a high prevalence of infection. Based on the obtained data, a map of O. felineus infection prevalence in Western Siberia was developed. After mapping the results, the highest prevalence was found in Tyumenskaya Oblast at over 60%, while Tomskaya Oblast had the lowest prevalence at less than 19.0%. Khanty-Mansiysk Autonomous Okrug, Altaiskii Krai, Novosibirskaya Oblast and Omskaya Oblast had an average level of O. felineus infection of 20-39%. According to the results of the review, Western Siberia must be considered a highly endemic region for opisthorchiasis in the Russian Federation. The development of a control program specific to the Russian community is warranted. Copyright © 2017 Elsevier B.V. All rights reserved.
Tobacco industry sponsorship of a book and conflict of interest.
Hong, Mi-Kyung; Bero, Lisa A
2006-08-01
The tobacco industry has hidden its involvement in the design, conduct and publication of scientific research articles and has used the articles to argue against tobacco regulation. The objective of this study is to examine tobacco industry involvement in the development of scientific books. Qualitative analysis of previously secret internal tobacco industry documents retrieved from the Legacy Tobacco Documents Library (http://legacy.library.ucsf.edu). Information from the documents was supplemented with material from Internet searches, the National Center for Biotechnology Information Pubmed database and interviews with individuals involved in book publication. Between 1997 and 1999 the tobacco industry sponsored a monograph, entitled 'Analytical Determination of Nicotine and Related Compounds and their Metabolites', that examined the measurement and metabolism of nicotine. The tobacco industry recruited Elsevier Science to publish the monograph. Tobacco industry executives, lawyers and scientists reviewed the chapters. One use of the monograph was to stimulate collaborative efforts between academic and tobacco industry scientists. Another was to provide the book to a government regulatory agency reviewing the teratogenic effects of nicotine. Our findings show the breadth of tobacco industry engagement in scientific knowledge production and dissemination, and its motives for sponsoring scientific literature. The industry's effort to gain credibility through collaboration with academic scientists raises questions regarding the ethics of accepting tobacco industry funding for publication. Scientists who collaborate on publications sponsored by the tobacco industry must consider the full implications of these joint efforts.
Teaching information literacy skills to sophomore-level biology majors.
Thompson, Leigh; Blankinship, Lisa Ann
2015-05-01
Many undergraduate students lack a sound understanding of information literacy. The skills that comprise information literacy are particularly important when combined with scientific writing for biology majors as they are the foundation skills necessary to complete upper-division biology course assignments, better train students for research projects, and prepare students for graduate and professional education. To help undergraduate biology students develop and practice information literacy and scientific writing skills, a series of three one-hour hands-on library sessions, discussions, and homework assignments were developed for Biological Literature, a one-credit, one-hour-per-week, required sophomore-level course. The embedded course librarian developed a learning exercise that reviewed how to conduct database and web searches, the difference between primary and secondary sources, source credibility, and how to access articles through the university's databases. Students used the skills gained in the library training sessions for later writing assignments including a formal lab report and annotated bibliography. By focusing on improving information literacy skills as well as providing practice in scientific writing, Biological Literature students are better able to meet the rigors of upper-division biology courses and communicate research findings in a more professional manner.
Teaching Information Literacy Skills to Sophomore-Level Biology Majors
Thompson, Leigh; Blankinship, Lisa Ann
2015-01-01
Many undergraduate students lack a sound understanding of information literacy. The skills that comprise information literacy are particularly important when combined with scientific writing for biology majors as they are the foundation skills necessary to complete upper-division biology course assignments, better train students for research projects, and prepare students for graduate and professional education. To help undergraduate biology students develop and practice information literacy and scientific writing skills, a series of three one-hour hands-on library sessions, discussions, and homework assignments were developed for Biological Literature, a one-credit, one-hour-per-week, required sophomore-level course. The embedded course librarian developed a learning exercise that reviewed how to conduct database and web searches, the difference between primary and secondary sources, source credibility, and how to access articles through the university’s databases. Students used the skills gained in the library training sessions for later writing assignments including a formal lab report and annotated bibliography. By focusing on improving information literacy skills as well as providing practice in scientific writing, Biological Literature students are better able to meet the rigors of upper-division biology courses and communicate research findings in a more professional manner. PMID:25949754
Lawyers' delights and geneticists' nightmares: at forty, the double helix shows some wrinkles.
Sgaramella, V
1993-12-15
The National Institutes of Health (NIH) request to patent the base sequences of incomplete and uncharacterized fragments of DNA copied on messenger RNAs (cDNAs) extracted from human tissues, the refusal by the patent office, and the appeal placed by NIH, have incited a violent controversy, fueled by rational, as well as emotional elements. In a compromising mode between liberalism and protectionism, I propose that legal protection be considered only for those RNA/DNA sequences, either natural or artificial, which can generate practical applications per se, and not through their expression products. Another controversy is developing around a popular tool for genomic research: the fidelity of yeast artificial chromosome (YAC) libraries being distributed worldwide for physical mapping is being questioned. Some of these libraries have been shown to be affected by surprisingly high levels of co-cloning, in addition to more common gene reshuffling instances. Also in this case, scientific as well as non-scientific components have to be considered. Possible remedies for the underlying problems may be found in the proper use of kinetic, enzymatic and microbiological variables in the production of YACs. Here too, a sharper distinction between the secular and scientific gratifications of research could help.
LibHalfSpace: A C++ object-oriented library to study deformation and stress in elastic half-spaces
NASA Astrophysics Data System (ADS)
Ferrari, Claudio; Bonafede, Maurizio; Belardinelli, Maria Elina
2016-11-01
The study of deformation processes in elastic half-spaces is widely employed for many purposes (e.g. didactics, scientific investigation of real processes, inversion of geodetic data, etc.). We present a coherent programming interface containing a set of tools designed to make the study of processes in an elastic half-space easier and faster. LibHalfSpace is presented in the form of an object-oriented library. A set of well-known and frequently used source models (Mogi source, penny-shaped horizontal crack, inflating spheroid, Okada rectangular dislocation, etc.) are implemented to describe the potential usage and the versatility of the library. The common interface given to library tools enables us to switch easily among the effects produced by different deformation sources that can be monitored at the free surface. Furthermore, the library also offers an interface which simplifies the creation of new source models exploiting the features of object-oriented programming (OOP). These source models can be built as distributions of rectangular boundary elements. In order to better explain how new models can be deployed, some examples are included in the library.
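LibHalfSpace is a C++ library and its interface is not reproduced here; as a hedged illustration of the kind of closed-form source model it implements, the sketch below evaluates the classical Mogi point-pressure source for surface displacement in Python.

```python
# Hedged sketch of a deformation-source model of the kind LibHalfSpace
# provides: the classical Mogi point-pressure source written in Python.
# This is an illustration only, not the library's own C++ interface.
import numpy as np

def mogi_surface_displacement(r, depth, a, dP, mu, nu=0.25):
    """Radial and vertical surface displacement of a Mogi source.

    r     : horizontal distance(s) from the source axis [m]
    depth : source depth [m]
    a     : source radius [m], assumed small compared to depth
    dP    : pressure change [Pa]
    mu    : shear modulus [Pa]
    nu    : Poisson's ratio
    """
    strength = (1.0 - nu) * dP * a**3 / mu
    R3 = (r**2 + depth**2) ** 1.5
    u_r = strength * r / R3
    u_z = strength * depth / R3
    return u_r, u_z

r = np.linspace(0.0, 10_000.0, 5)           # profile out to 10 km
u_r, u_z = mogi_surface_displacement(r, depth=3_000.0, a=500.0,
                                     dP=10e6, mu=30e9)
print(u_z)
```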
[Physiology in the mirror of systematic catalogue of Russian Academy of Sciences Library].
Orlov, I V; Lazurkina, V B
2011-07-01
Representation of publications on general human and animal physiology in the systematic catalogue of the Library of the Russian Academy of Sciences is considered. The organization of the catalogue as applied to the problems of physiology, built on the basis of the library-bibliographic classification used in Russian universal scientific libraries, is described. The card files of the systematic catalogue of the Library contain about 8 million cards. The topics that reflect the problems of general physiology comprise 39 headings. For the full range of sciences, including physiology, tables of general types of divisions were developed; they are marked by indexes using lower-case letters of the Russian alphabet. For further subdivision of these indexes, decimal symbols are used. The indexes are attached directly to the index of the field of knowledge. With the current, relatively easy availability of network resources, the value and relevance of any catalogue are reduced. However, this concerns journal articles much more than reference books, proceedings of various conferences, bibliographies, personalia, and especially the monographs contained in the systematic catalogue. The card systematic catalogue of the Library remains an important source of information on general physiology, as well as on its major and narrower sections.
The role of medical libraries in undergraduate education: a case study in genetics*
Tennant, Michele R.; Miyamoto, Michael M.
2002-01-01
Between 1996 and 2001, the Health Science Center Libraries and Department of Zoology at the University of Florida partnered to provide a cohesive and comprehensive learning experience to undergraduate students in PCB3063, “Genetics.” During one semester each year, a librarian worked with up to 120 undergraduates, providing bibliographic and database instruction in the tools that practicing geneticists use (MEDLINE, GenBank, BLAST, etc.). Students learned to evaluate and synthesize the information that they retrieved, coupling it with information provided in classroom lectures, thus resulting in well-researched short papers on an assigned genetics topic. Exit surveys of students indicated that the majority found the library sessions and librarian's instruction to be useful. Responses also indicated that the project facilitated increased understanding of genetics concepts and appreciation for the scientific research process and the relevance of genetics to the real world. The library benefited from this partnership on a variety of fronts, including the development of skilled library users, pretrained future clientele, and increased visibility among campus research laboratories. The course and associated information instruction and assigned projects can be considered models for course-integrated instruction and the role of medical libraries in undergraduate education. PMID:11999176
Harvey Mudd 2014-2015 Computer Science Conduit Clinic Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aspesi, G; Bai, J; Deese, R
2015-05-12
Conduit, a new open-source library developed at Lawrence Livermore National Laboratory, provides a C++ application programming interface (API) to describe and access scientific data. Conduit's primary use is for in-memory data exchange in high performance computing (HPC) applications. Our team tested and improved Conduit to make it more appealing to potential adopters in the HPC community. We extended Conduit's capabilities by prototyping four libraries: one for parallel communication using MPI, one for I/O functionality, one for aggregating performance data, and one for data visualization.
Argonne Research Library | Argonne National Laboratory
Librarianship, Education and Service.
ERIC Educational Resources Information Center
Carpenter, Ray L.
1988-01-01
Compares differences in the use of technology, attitudes toward it, and educational experiences of librarians between libraries in Europe and the United States, and between countries within Europe. The issues discussed include decentralization, bureaucratic versus professional staff, the use of technology for scientific versus humanities…
NASA Astrophysics Data System (ADS)
Dusenbery, P.; LaConte, K.; Holland, A.; Harold, J. B.; Johnson, A.; Randall, C.; Fitzhugh, G.
2017-12-01
NASA research programs are helping humanity understand the origin and evolution of galaxies, stars, and planets, how our Sun varies and impacts the heliosphere, and define the conditions necessary to support life beyond Earth. As places that offer their services for free, public libraries have become the "public square" by providing a place where members of a community can gather for information, educational programming, and policy discussions. Libraries are also developing new ways to engage their patrons in STEM learning. The Space Science Institute's (SSI) National Center for Interactive Learning (NCIL) was funded by NASA's Science Mission Directorate (SMD) to develop and implement a project called NASA@ My Library: A National Earth and Space Science Initiative That Connects NASA, Public Libraries and Their Communities. NCIL's STAR Library Network (STAR_Net) is providing important leverage to expand its community of practice that serves both librarians and STEM professionals. Seventy-five libraries were selected through a competitive application process to receive NASA STEM Facilitation Kits, NASA STEM Backpacks for circulation, financial resources, training, and partnership opportunities. Initial survey data from the 75 NASA@ My Library partners showed that, while they are actively providing programming, few STEM programs connected with NASA science and engineering. With the launch of the initiative - including training, resources, and STEM-related event opportunities - all 75 libraries are engaged in offering NASA-focused programs, including with NASA subject matter experts. This talk will highlight the impacts the initiative is having on both public library partners and many others across the country.
NASA Technical Reports Server (NTRS)
Agrawal, Gagan; Sussman, Alan; Saltz, Joel
1993-01-01
Scientific and engineering applications often involve structured meshes. These meshes may be nested (for multigrid codes) and/or irregularly coupled (called multiblock or irregularly coupled regular mesh problems). A combined runtime and compile-time approach for parallelizing these applications on distributed memory parallel machines in an efficient and machine-independent fashion was described. A runtime library which can be used to port these applications to distributed memory machines was designed and implemented. The library is currently implemented on several different systems. To further ease the task of application programmers, methods were developed for integrating this runtime library with compilers for HPF-like parallel programming languages. How this runtime library was integrated with the Fortran 90D compiler being developed at Syracuse University is discussed. Experimental results to demonstrate the efficacy of our approach are presented. Experiments were performed with a multiblock Navier-Stokes solver template and a multigrid code. Our experimental results show that our primitives have low runtime communication overheads. Further, the compiler-parallelized codes perform within 20 percent of the codes parallelized by manually inserting calls to the runtime library.
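The runtime library described above targeted Fortran-era distributed-memory codes; purely as an illustration of the ghost-cell (overlap) communication such libraries schedule, here is a minimal mpi4py sketch for a 1-D block decomposition. It is not the described library's interface, and the script name in the comment is a placeholder.

```python
# Illustrative sketch only: a 1-D ghost-cell exchange with mpi4py, standing in
# for the kind of distributed-memory communication primitives a runtime
# library for block-structured meshes provides.
# Run with e.g.: mpiexec -n 4 python ghost_exchange.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_local = 8                                   # interior cells per block
u = np.full(n_local + 2, float(rank))         # one ghost cell on each side

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

# Send the rightmost interior cell to the right neighbour and receive the
# left neighbour's rightmost cell into the left ghost cell, and vice versa.
comm.Sendrecv(u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
comm.Sendrecv(u[1:2], dest=left, recvbuf=u[-1:], source=right)

print(f"rank {rank}: ghost cells = ({u[0]}, {u[-1]})")
```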
Enabling a Scientific Cloud Marketplace: VGL (Invited)
NASA Astrophysics Data System (ADS)
Fraser, R.; Woodcock, R.; Wyborn, L. A.; Vote, J.; Rankine, T.; Cox, S. J.
2013-12-01
The Virtual Geophysics Laboratory (VGL) provides a flexible, web based environment where researchers can browse data and use a variety of scientific software packaged into tool kits that run in the Cloud. Both data and tool kits are published by multiple researchers and registered with the VGL infrastructure forming a data and application marketplace. The VGL provides the basic work flow of Discovery and Access to the disparate data sources and a Library for tool kits and scripting to drive the scientific codes. Computation is then performed on the Research or Commercial Clouds. Provenance information is collected throughout the work flow and can be published alongside the results allowing for experiment comparison and sharing with other researchers. VGL's "mix and match" approach to data, computational resources and scientific codes, enables a dynamic approach to scientific collaboration. VGL allows scientists to publish their specific contribution, be it data, code, compute or work flow, knowing the VGL framework will provide other components needed for a complete application. Other scientists can choose the pieces that suit them best to assemble an experiment. The coarse grain workflow of the VGL framework combined with the flexibility of the scripting library and computational toolkits allows for significant customisation and sharing amongst the community. The VGL utilises the cloud computational and storage resources from the Australian academic research cloud provided by the NeCTAR initiative and a large variety of data accessible from national and state agencies via the Spatial Information Services Stack (SISS - http://siss.auscope.org). VGL v1.2 screenshot - http://vgl.auscope.org
A GPL Relativistic Hydrodynamical Code
NASA Astrophysics Data System (ADS)
Olvera, D.; Mendoza, S.
We are currently building a free (in the sense of a GNU GPL license) 2DRHD code in order to be used for different astrophysical situations. Our final target will be to include strong gravitational fields and magnetic fields. We intend to form a large group of developers as it is usually done for GPL codes.
SCAMP: Automatic Astrometric and Photometric Calibration
NASA Astrophysics Data System (ADS)
Bertin, Emmanuel
2010-10-01
Astrometric and photometric calibrations have remained the most tiresome step in the reduction of large imaging surveys. SCAMP has been written to address this problem. The program efficiently computes accurate astrometric and photometric solutions for any arbitrary sequence of FITS images in a completely automatic way. SCAMP is released under the GNU General Public License.
It's Time to Consider Open Source Software
ERIC Educational Resources Information Center
Pfaffman, Jay
2007-01-01
In 1985 Richard Stallman, a computer programmer, released "The GNU Manifesto" in which he proclaimed a golden rule: One must share computer programs. Software vendors required him to agree to license agreements that forbade sharing programs with others, but he refused to "break solidarity" with other computer users whom he assumed also wanted to…
Design, Development, and Testing of a Network Frequency Selection Service (NFSS)
1994-02-14
...commercial simulation software (Sim++), word processor (FrameMaker), editor (GNU Emacs), software version control (Revision Control System (RCS)), system...of FrameMaker ".mif" files. When viewed using FrameMaker or a PostScript reader, each page of results appears as two columns by four rows of graphics
Modular Open-Source Software for Item Factor Analysis
ERIC Educational Resources Information Center
Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven M.
2015-01-01
This article introduces an item factor analysis (IFA) module for "OpenMx," a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation…
Mighty Math[TM] Zoo Zillions[TM]. [CD-ROM].
ERIC Educational Resources Information Center
1996
Zoo Zillions contains five activities for grades K-2: Annie's Jungle Trail, 3D Gallery, Number Line Express, Gnu Ewe Boutique, and Fish Stories. These activities enable children to review and practice basic mathematics skills; identify three-dimensional shapes, watch them in motion, and create their own three-dimensional designs; locate numbers…
Videos for Science Communication and Nature Interpretation: The TIB|AV-Portal as Resource.
NASA Astrophysics Data System (ADS)
Marín Arraiza, Paloma; Plank, Margret; Löwe, Peter
2016-04-01
Scientific audiovisual media such as videos of research, interactive displays or computer animations have become an important part of scientific communication and education. Dynamic phenomena can be described better by audiovisual media than by words and pictures. For this reason, scientific videos help us to understand and discuss environmental phenomena more efficiently. Moreover, the creation of scientific videos is easier than ever, thanks to mobile devices and open source editing software. Video clips, webinars or even the interactive part of a PICO are formats of scientific audiovisual media used in the Geosciences. This type of media translates location-referenced science communication, such as environmental interpretation, into computer-based science communication. A new way of science communication is video abstracting. A video abstract is a three- to five-minute video statement that provides background information about a research paper. It also gives authors the opportunity to present their research activities to a wider audience. Since this kind of media has become an important part of scientific communication, there is a need for reliable infrastructures capable of managing the digital assets researchers generate. Using the use case of video abstracts as a reference, this paper gives an overview of the activities of the German National Library of Science and Technology (TIB) regarding publishing and linking audiovisual media in a scientifically sound way. The German National Library of Science and Technology (TIB), in cooperation with the Hasso Plattner Institute (HPI), developed a web-based portal (av.tib.eu) that optimises access to scientific videos in the fields of science and technology. Videos from the realms of science and technology can easily be uploaded onto the TIB|AV-Portal. Within a short period of time the videos are assigned a digital object identifier (DOI). This enables them to be referenced, cited, and linked (e.g. to the relevant article or further supplementary materials). By using media fragment identifiers, not only the whole video but also individual parts of it can be cited. Doing so, users are also likely to find high-quality related content (for instance, a video abstract and the corresponding article, or an expedition documentary and its field notebook). Based on automatic analysis of speech, images and text within the videos, a large amount of metadata associated with the segments of the video is automatically generated. These metadata enhance the searchability of the video and make it easier to retrieve and interlink meaningful parts of it. This new and reliable library-driven infrastructure allows all different types of data to be discoverable, accessible, citable, freely reusable, and interlinked. It therefore simplifies science communication.
Gist: A scientific graphics package for Python
DOE Office of Scientific and Technical Information (OSTI.GOV)
Busby, L.E.
1996-05-08
"Gist" is a scientific graphics library written by David H. Munro of Lawrence Livermore National Laboratory (LLNL). It features support for three common graphics output devices: X Windows, (Color) PostScript, and ANSI/ISO Standard Computer Graphics Metafiles (CGM). The library is small (written directly to Xlib), portable, efficient, and full-featured. It produces X versus Y plots with "good" tick marks and tick labels, 2-dimensional quadrilateral mesh plots with contours, vector fields, or pseudo color maps on such meshes, with 3-dimensional plots on the way. The Python Gist module utilizes the new "Numeric" module due to J. Hugunin and others. It is therefore fast and able to handle large datasets. The Gist module includes an X Windows event dispatcher which can be dynamically added (e.g., via importing a dynamically loaded module) to the Python interpreter after a simple two-line modification to the Python core. This makes fast mouse-controlled zoom, pan, and other graphic operations available to the researcher while maintaining the usual Python command-line interface. Munro's Gist library is already freely available. The Python Gist module is currently under review and is also expected to qualify for unlimited release.
Climate tools in mainstream Linux distributions
NASA Astrophysics Data System (ADS)
McKinstry, Alastair
2015-04-01
Debian/meteorology is a project to integrate climate tools and analysis software into the mainstream Debian/Ubuntu Linux distributions. This work describes lessons learnt and recommends practices for scientific software to be adopted and maintained in OS distributions. In addition to standard analysis tools (cdo, grads, ferret, metview, ncl, etc.), software used by the Earth System Grid Federation was chosen for integration, to enable ESGF portals to be built on this base; however, exposing scientific codes via web APIs exposes security weaknesses that are normally ignorable. How tools are hardened, and what changes are required to handle security upgrades, are described. Secondly, enabling libraries and components (e.g. Python modules) to be integrated requires planning by their writers: it is not sufficient to assume users can upgrade their code when you make incompatible changes. Here, practices are recommended to enable upgrades and co-installability of C, C++, Fortran and Python codes. Finally, software packages such as NetCDF and HDF5 can be built in multiple configurations. Tools may then expect incompatible versions of these libraries (e.g. serial and parallel) to be simultaneously available; how this was solved in Debian using "pkg-config" and shared library interfaces is described, and best practices for software writers to enable this are summarised.
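As a hedged illustration of the pkg-config mechanism mentioned above, the sketch below queries compile and link flags from Python; the module name "netcdf" is an assumption about the installed .pc file, not something stated in the entry.

```python
# Hedged sketch: query pkg-config for NetCDF build flags from Python, the
# mechanism the entry describes for selecting a library configuration.
# The module name "netcdf" is an assumption about the installed .pc file.
import shutil
import subprocess

def pkg_config(module, *flags):
    """Return pkg-config output for `module`, split into tokens."""
    if shutil.which("pkg-config") is None:
        raise RuntimeError("pkg-config not found on PATH")
    out = subprocess.run(["pkg-config", *flags, module],
                         check=True, capture_output=True, text=True)
    return out.stdout.split()

cflags = pkg_config("netcdf", "--cflags")
libs = pkg_config("netcdf", "--libs")
print("compile flags:", cflags)
print("link flags:", libs)
```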
BioMake: a GNU make-compatible utility for declarative workflow management.
Holmes, Ian H; Mungall, Christopher J
2017-11-01
The Unix 'make' program is widely used in bioinformatics pipelines, but suffers from problems that limit its application to large analysis datasets. These include reliance on file modification times to determine whether a target is stale, lack of support for parallel execution on clusters, and restricted flexibility to extend the underlying logic program. We present BioMake, a make-like utility that is compatible with most features of GNU Make and adds support for popular cluster-based job-queue engines, MD5 signatures as an alternative to timestamps, and logic programming extensions in Prolog. BioMake is available for MacOSX and Linux systems from https://github.com/evoldoers/biomake under the BSD3 license. The only dependency is SWI-Prolog (version 7), available from http://www.swi-prolog.org/. ihholmes+biomake@gmail.com or cmungall+biomake@gmail.com. Feature table comparing BioMake to similar tools. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
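A hedged Python sketch of the MD5-signature staleness check that BioMake offers as an alternative to timestamps; this illustrates the idea only and is not BioMake's Prolog implementation (file and database names are made up).

```python
# Sketch of MD5-based staleness checking, the alternative to file timestamps
# that BioMake provides; an illustration of the idea, not BioMake's own code.
import hashlib
import json
from pathlib import Path

SIGNATURE_DB = Path(".md5_signatures.json")   # made-up signature store

def md5sum(path):
    return hashlib.md5(Path(path).read_bytes()).hexdigest()

def is_stale(target, dependencies):
    """A target is stale if it is missing or any dependency's MD5 changed."""
    db = json.loads(SIGNATURE_DB.read_text()) if SIGNATURE_DB.exists() else {}
    recorded = db.get(target, {})
    return (not Path(target).exists() or
            any(recorded.get(d) != md5sum(d) for d in dependencies))

def record_signatures(target, dependencies):
    """Record current dependency MD5s after the target is rebuilt."""
    db = json.loads(SIGNATURE_DB.read_text()) if SIGNATURE_DB.exists() else {}
    db[target] = {d: md5sum(d) for d in dependencies}
    SIGNATURE_DB.write_text(json.dumps(db, indent=2))
```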
MCPB.py: A Python Based Metal Center Parameter Builder.
Li, Pengfei; Merz, Kenneth M
2016-04-25
MCPB.py, a Python-based metal center parameter builder, has been developed to build force fields for the simulation of metal complexes employing the bonded model approach. It has an optimized code structure, with far fewer required steps than the previously developed MCPB program. It supports various AMBER force fields and more than 80 metal ions. A series of parametrization schemes to derive force constants and charge parameters are available within the program. We give two examples (one metalloprotein example and one organometallic compound example), indicating the program's ability to build reliable force fields for different metal-ion-containing complexes. The original version was released with AmberTools15. It is provided under the GNU General Public License v3.0 (GNU_GPL_v3) agreement and is free to download and distribute. MCPB.py provides a bridge between quantum mechanical calculations and molecular dynamics simulation software packages, thereby enabling the modeling of metal ion centers. It offers an entry into simulating metal ions in a number of situations by providing an efficient way for researchers to handle the vagaries and difficulties associated with metal ion modeling.
Carnivore fecal chemicals suppress feeding by Alpine goats (Capra hircus).
Weldon, P J; Graham, D P; Mears, L P
1993-12-01
The efficacy of carnivore and ungulate fecal chemicals in suppressing the feeding behavior of Alpine goats (Capra hircus) was examined. In the first four experiments, goats were offered food covered with paper strips treated with fecal extracts of the Bengal tiger, Siberian tiger, African lion, and brown bear, respectively; food covered with solvent-treated and untreated (plain) papers served as controls in each experiment. Goats made fewer head entries into, and ate less food from, buckets containing fecal extracts. In the fifth experiment, goats were offered food covered with paper strips treated with fecal extracts of the puma, Dorcas gazelle, white-bearded gnu, and conspecifics; food covered with solvent-treated and plain papers again served as controls. The amounts of food consumed from buckets containing puma, gazelle, gnu, and solvent treatments were statistically indistinguishable, but less food was consumed from them than from buckets containing the goat-scented or plain papers. No significant differences among treatments were detected with respect to head entries. Field experiments are needed on the use of predator-derived chemicals to reduce damage by goats to vegetation.
Scientific misconduct and theft: case report from 17th century.
Fatović-Ferencić, Stella
2008-02-01
Gjuro Armen Baglivi was one of the most famous medical authorities of the 17th century. Apart from his numerous books and publications, several extensive collections of his correspondence have been preserved and are available in libraries around the world. They provide new information about the 17th century scientific culture and place of Baglivi's work in the scientific European context. Also, they shed light on his personality more than other writings intended for the public eye. In this paper I will present the case of a theft of intellectual property, which Baglivi described in one of his letters to Jean Jacques Manget.
Scientific Misconduct and Theft: Case Report from 17th Century
Fatović-Ferenčić, Stella
2008-01-01
Gjuro Armen Baglivi was one of the most famous medical authorities of the 17th century. Apart from his numerous books and publications, several extensive collections of his correspondence have been preserved and are available in libraries around the world. They provide new information about the 17th century scientific culture and place of Baglivi’s work in the scientific European context. Also, they shed light on his personality more than other writings intended for the public eye. In this paper I will present the case of a theft of intellectual property, which Baglivi described in one of his letters to Jean Jacques Manget. PMID:18293461
Integration of Information and Scientific Literacy: Promoting Literacy in Undergraduates
Wolbach, Kevin C.; Purzycki, Catherine B.; Bowman, Leslie A.; Agbada, Eva; Mostrom, Alison M.
2010-01-01
The Association of College and Research Libraries recommends incorporating information literacy (IL) skills across university and college curricula, for the goal of developing information literate graduates. Congruent with this goal, the Departments of Biological Sciences and Information Science developed an integrated IL and scientific literacy (SL) exercise for use in a first-year biology course. Students were provided the opportunity to access, retrieve, analyze, and evaluate primary scientific literature. By the completion of this project, student responses improved concerning knowledge and relevance of IL and SL skills. This project exposes students to IL and SL early in their undergraduate experience, preparing them for future academic advancement. PMID:21123700
DOE Office of Scientific and Technical Information (OSTI.GOV)
Helmus, Jonathan J.; Collis, Scott M.
The Python ARM Radar Toolkit is a package for reading, visualizing, correcting and analysing data from weather radars. Development began to meet the needs of the Atmospheric Radiation Measurement Climate Research Facility and has since expanded to provide a general-purpose framework for working with data from weather radars in the Python programming language. The toolkit is built on top of libraries in the Scientific Python ecosystem including NumPy, SciPy, and matplotlib, and makes use of Cython for interfacing with existing radar libraries written in C and to speed up computationally demanding algorithms. As a result, the source code for the toolkit is available on GitHub and is distributed under a BSD license.
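A minimal, hedged usage sketch of the toolkit; the file name is a placeholder, and the calls (pyart.io.read, pyart.graph.RadarDisplay) follow Py-ART's commonly documented interface rather than anything stated in the abstract above.

```python
# Minimal Py-ART usage sketch; "radar_volume.nc" is a placeholder file name,
# and the calls follow Py-ART's commonly documented interface (an assumption).
import matplotlib.pyplot as plt
import pyart

radar = pyart.io.read("radar_volume.nc")        # read a radar volume file
display = pyart.graph.RadarDisplay(radar)       # matplotlib-based display helper

fig, ax = plt.subplots()
display.plot("reflectivity", sweep=0, ax=ax)    # plot the lowest sweep
plt.savefig("reflectivity_sweep0.png")
```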
Learning To Live with Complexity.
ERIC Educational Resources Information Center
Dosa, Marta
Neither the design of information systems and networks nor the delivery of library services can claim true user centricity without an understanding of the multifaceted psychological environment of users and potential users. The complexity of the political process, social problems, challenges to scientific inquiry, entrepreneurship, and…
Access to the scientific literature
NASA Astrophysics Data System (ADS)
Albarède, Francis
The Public Library of Science Open Letter (http://www.publiclibraryofscience.org) is a very generous initiative but, like most similar initiatives since the advent of electronic publishing, it misses the critical aspects of electronic publishing. Ten years ago, a Publisher would be in charge of running a system called a "scientific journal." In such a system, the presence of an Editor and peer Reviewers secures the strength of the science and the rigor of the writing; the Publisher guarantees the professional quality of printing, efficient dissemination, and long-term archiving. Publishing used to be in everyone's best interest, or nearly everyone's. The Publisher, because he/she is financially motivated, ensures widespread dissemination of the journal amongst libraries and individual subscribers. The interest of the Author is that the system guarantees a broad potential readership. The interest of the Reader is that a line is drawn between professionally edited literature, presumably of better quality, and gray literature or home publishing, so that he/she does not waste time going through 'low-yield' ungraded information. The Publisher could be a private company, an academic institution, or a scholarly society. My experience is that, when page charges and subscription rates are compounded, journals published by scholarly societies are not necessarily cheaper. The difference between these cases is not the cost of running an office with rents, wages, printing, postage, advertisement, and archiving, but that a private Publisher pays shareholders. Shareholders have the bad habit of minding their own business and may therefore interfere negatively with scientific publishing. Nevertheless, while the stranglehold imposed by private Publishers on our libraries over the last 10 years through increasing subscription rates may in part be due to shareholders' greed, this is true only in part. The increases are also a consequence of the booming number of pages being printed.
NASA Astrophysics Data System (ADS)
Perez, Tracie Renea Conn
Over the past 15 years, there has been a growing interest in femtosatellites, a class of tiny satellites having mass less than 100 grams. Research groups from Peru, Spain, England, Canada, and the United States have proposed femtosat designs and novel mission concepts for them. In fact, Peru made history in 2013 by releasing the first - and still only - femtosat tracked from LEO. However, femtosatellite applications in interplanetary missions have yet to be explored in detail. An interesting operations concept would be for a space probe to release numerous femtosatellites into orbit around a planetary object of interest, thereby augmenting the overall data collection capability of the mission. A planetary probe releasing hundreds of femtosats could complete an in-situ, simultaneous 3D mapping of a physical property of interest, achieving scientific investigations not possible for one probe operating alone. To study the technical challenges associated with such a mission, a conceptual mission design is proposed where femtosats are deployed from a host satellite orbiting Titan. The conceptual mission objective is presented: to study Titan's dynamic atmosphere. Then, the design challenges are addressed in turn. First, any science payload measurements that the femtosats provide are only useful if their corresponding locations can be determined. Specifically, what is required is a method of position determination for femtosatellites operating beyond Medium Earth Orbit and therefore beyond the help of GPS. A technique is presented which applies Kalman filter techniques to Doppler shift measurements, allowing for orbit determination of the femtosats. Several case studies are presented demonstrating the usefulness of this approach. Second, due to the inherent power and computational limitations of a femtosatellite design, establishing a radio link between each chipsat and the mothersat will be difficult. To provide a mathematical gain, a particular form of forward error correction (FEC) called low-density parity-check (LDPC) codes is recommended. A specific low-complexity encoder, and an accompanying decoder, have been implemented in the open-source software radio library, GNU Radio. Simulation results demonstrating bit error rate (BER) improvement are presented. Hardware for implementing the LDPC methods in a benchtop test is described and future work on this topic is suggested. Third, the power and spatial constraints of femtosatellite designs likely restrict the payload to one or two sensors. Therefore, it is desired to extract as much useful scientific data as possible from secondary sources, such as radiometric data. Estimating the atmospheric density model from different measurement sources is simulated; results are presented. The overall goal for this effort is to advance the field of miniature spacecraft-based technology and to highlight the advantages of using femtosatellites in future planetary exploration missions. By addressing several subsystem design challenges in this context, such a femtosat mission concept is one step closer to being feasible.
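A toy Python sketch of the measurement side of the approach described above: converting a Doppler shift to a range rate and applying a scalar Kalman update. The carrier frequency and noise values are assumptions, and this stands in for, rather than reproduces, the dissertation's orbit-determination filter.

```python
# Toy sketch of Doppler-based tracking: convert an observed Doppler shift to a
# range rate and apply a scalar Kalman update. Illustration only; the carrier
# frequency and noise values are made up, not taken from the abstract.
C = 299_792_458.0          # speed of light [m/s]
F_CARRIER = 437.0e6        # assumed UHF carrier [Hz]

def doppler_to_range_rate(f_doppler):
    """Range rate implied by a Doppler shift (positive = receding)."""
    return -C * f_doppler / F_CARRIER

def kalman_update(x, P, z, R):
    """Scalar Kalman measurement update for a directly observed state."""
    K = P / (P + R)                 # Kalman gain (H = 1)
    x_new = x + K * (z - x)         # corrected estimate
    P_new = (1.0 - K) * P           # corrected variance
    return x_new, P_new

# Predicted range rate 1200 m/s with (50 m/s)^2 variance; measured Doppler
# shift of -1.80 kHz with (10 m/s)^2 equivalent measurement noise.
z = doppler_to_range_rate(-1.80e3)
x, P = kalman_update(1200.0, 50.0**2, z, 10.0**2)
print(f"measured range rate {z:.1f} m/s, updated estimate {x:.1f} m/s")
```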
R.E.DD.B.: A database for RESP and ESP atomic charges, and force field libraries
Dupradeau, François-Yves; Cézard, Christine; Lelong, Rodolphe; Stanislawiak, Élodie; Pêcher, Julien; Delepine, Jean Charles; Cieplak, Piotr
2008-01-01
The web-based RESP ESP charge DataBase (R.E.DD.B., http://q4md-forcefieldtools.org/REDDB) is a free and new source of RESP and ESP atomic charge values and force field libraries for model systems and/or small molecules. R.E.DD.B. stores highly effective and reproducible charge values and molecular structures in the Tripos mol2 file format, information about the charge derivation procedure, scripts to integrate the charges and molecular topology in the most common molecular dynamics packages. Moreover, R.E.DD.B. allows users to freely store and distribute RESP or ESP charges and force field libraries to the scientific community, via a web interface. The first version of R.E.DD.B., released in January 2006, contains force field libraries for molecules as well as molecular fragments for standard residues and their analogs (amino acids, monosaccharides, nucleotides and ligands), hence covering a vast area of relevant biological applications. PMID:17962302
NASA Technical Reports Server (NTRS)
Nelson, Michael L.; Maly, Kurt; Shen, Stewart N. T.
1997-01-01
In this paper we describe NCSTRL+, a unified, canonical digital library for scientific and technical information (STI). NCSTRL+ is based on the Networked Computer Science Technical Report Library (NCSTRL), a World Wide Web (WWW) accessible digital library (DL) that provides access to over 80 university departments and laboratories. NCSTRL+ implements two new technologies: cluster functionality and publishing "buckets." We have extended the Dienst protocol, the protocol underlying NCSTRL, to provide the ability to "cluster" independent collections into a logically centralized digital library based upon subject category classification, type of organization, and genres of material. The concept of "buckets" provides a mechanism for publishing and managing logically linked entities with multiple data formats. The NCSTRL+ prototype DL contains the holdings of NCSTRL and the NASA Technical Report Server (NTRS). The prototype demonstrates the feasibility of publishing into a multi-cluster DL, searching across clusters, and storing and presenting buckets of information. We show that the overhead for these additional capabilities is minimal to both the author and the user when compared to the equivalent process within NCSTRL.
NSUF Irradiated Materials Library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, James Irvin
The Nuclear Science User Facilities (NSUF) has been in the process of establishing an innovative Irradiated Materials Library concept for maximizing the value of previous and on-going materials and nuclear fuels irradiation test campaigns, including utilization of real-world components retrieved from current and decommissioned reactors. When the ATR National Scientific User Facility was established in 2007, one of the goals of the program was to establish a library of irradiated samples for users to access and conduct research on through a competitively reviewed proposal process. As part of the initial effort, staff at the user facility identified legacy materials from previous programs that are still being stored in laboratories and hot-cell facilities at the INL. In addition, other materials of interest were identified that are being stored outside the INL, which the current owners have volunteered to enter into the library. Finally, over the course of the last several years, the ATR NSUF has irradiated more than 3500 specimens as part of NSUF competitively awarded research projects. The logistics of managing this large inventory of highly radioactive specimens poses unique challenges. This document describes the materials in the library, outlines the policy for accessing these materials, and puts forth a strategy for making new additions to the library, as well as establishing guidelines for the minimum pedigree needed for inclusion in the library, to limit the amount of material stored indefinitely without identified value.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-19
... Medicines Agency (EMA) European Community Herbal Monographs, and World Health Organization (WHO) Monographs... that several authoritative labeling standards monographs for herbal products specify traditional use... the major scientific reference databases, such as the National Library of Medicine's literature...
OAI and NASA's Scientific and Technical Information.
ERIC Educational Resources Information Center
Nelson, Michael L.; Rocker, JoAnne; Harrison, Terry L.
2003-01-01
Details NASA's (National Aeronautics & Space Administration (USA)) involvement in defining and testing the Open Archives Initiative (OAI) Protocol for Metadata Harvesting (OAI-PMH) and experience with adapting existing NASA distributed searching DLs (digital libraries) to use the OAI-PMH and metadata harvesting. Discusses some new digital…
Historiography and History of Information Science (SIG HFIS)
ERIC Educational Resources Information Center
Breitenstein, Mikel
2000-01-01
Presents abstracts of papers for a planned session dealing with the historiography and history of information science. Highlights include probability distributions underlying the use of library materials, particularly scientific journals; the temporal and historical orientation of the rhetoric of information science; and concepts of information…
NASA Scientific and Technical Information System (STI) and New Directory of Numerical Data Bases
NASA Technical Reports Server (NTRS)
Wilson, J.
1984-01-01
The heart of NASA's STI system is a collection of scientific and technical information gathered from worldwide sources. Currently containing over 2.2 million items, the data base is growing at the rate of 140,000 items per year. In addition to announcement journals, information is disseminated through the NASA RECON on-line bibliographic search system. One part of RECON is NALNET, which lists journals and books held by the NASA Centers. Another service now accessible by RECON is a directory of numerical data bases (DND) which can be shared by NASA staff and contractors. The DND describes each data base and gives the name and phone number of a contact person. A NASA-wide integrated library system is being developed for the Center libraries which will include an on-line catalog and subsystems for acquisition, circulation control, information retrieval, management information, and an authority file. These subsystems can interact with on-line bibliographic, patron, and vendor files.
Public humanization policies: integrative literature review.
Moreira, Márcia Adriana Dias Meirelles; Lustosa, Abdon Moreira; Dutra, Fernando; Barros, Eveline de Oliveira; Batista, Jaqueline Brito Vidal; Duarte, Marcella Costa Souto
2015-10-01
The study aimed to investigate the scientific literature on public humanization policies in the health field, available in online periodicals from 2009 to 2012. This is an integrative literature review conducted in the Virtual Health Library databases: Latin American and Caribbean Health Sciences Literature (Lilacs), the Scientific Electronic Library Online (SciELO) and Portal Capes. Data were collected in July 2013. To this end, the following Health Sciences Descriptors (DeCS) were used: "Humanization of Care," "Public Policies," "National Humanization Policy". The sample consisted of 27 articles on the investigated theme. From the publications selected for the research, three categories emerged according to their respective approaches: National Humanization Policy: history and processes involved in its implementation; National Humanization Policy: health professionals' contribution; and Humanization in the care process. The study showed that the National Humanization Policy is an important benchmark in the development of health practices. For this reason, there is a pressing need to multiply reflections on ways to promote humanization in health services.
Tera-Op Reliable Intelligently Adaptive Processing System (TRIPS)
2004-04-01
...flop creates a loadable FIFO queue, fifo pload. A prototype of the HML (Hardware Meta Language) simulator is implemented using the functional language OCaml. ... operates on the TRIPS Intermediate Language (TIL) produced by the Scale compiler. We also adapted the GNU binary utilities to implement an assembler and ...
autokonf - A Configuration Script Generator Implemented in Perl
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reus, J F
This paper discusses configuration scripts in general and the scripting language issues involved. A brief description of GNU autoconf is provided along with a contrasting overview of autokonf, a configuration script generator implemented in Perl, whose macros are implemented in Perl, generating a configuration script in Perl. It is very portable, easily extensible, and readily mastered.
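The abstract gives no code, but the core task of any configuration-script generator (probing the build environment for required tools and recording what was found) can be sketched in a few lines. The sketch below is hypothetical Python, not autokonf's Perl macro interface, and the tool list is invented.

    # Hypothetical sketch of a configuration check, not autokonf's API: probe the
    # build environment for required tools and write the results to a status file.
    import shutil

    REQUIRED_TOOLS = ["perl", "make", "cc"]   # illustrative requirements only

    def probe(tool):
        """Return the full path of a tool if it is on PATH, otherwise None."""
        return shutil.which(tool)

    def write_config(path="config.status"):
        with open(path, "w") as out:
            for tool in REQUIRED_TOOLS:
                location = probe(tool)
                status = location if location else "MISSING"
                print(f"checking for {tool}... {status}")
                out.write(f"{tool}={status}\n")

    if __name__ == "__main__":
        write_config()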
2015-02-03
Report excerpts: … requiring the least hardware investment is to localize by received signal strength [1, 4, 5]. Because our intended scenario of low-complexity … MHz were taken with a spectrum analyzer program on the USRP, and a range finder was used to measure the distance between the emitter and the sensor.
ERIC Educational Resources Information Center
Spante, Maria; Karlsen, Asgjerd Vea; Nortvig, Anne-Mette; Christiansen, Rene B.
2014-01-01
Gränsöverskridande Nordisk Undervisning/Utdanelse (GNU, meaning Cross-Border Nordic Education), the larger Nordic project, under which this case study was carried out, aims at developing innovative, cross-border teaching models in different subject domains in elementary school, including mathematics, language, science, social studies and history.…
Rondon, Michelle R.; Raffel, Sandra J.; Goodman, Robert M.; Handelsman, Jo
1999-01-01
As the study of microbes moves into the era of functional genomics, there is an increasing need for molecular tools for analysis of a wide diversity of microorganisms. Currently, biological study of many prokaryotes of agricultural, medical, and fundamental scientific interest is limited by the lack of adequate genetic tools. We report the application of the bacterial artificial chromosome (BAC) vector to prokaryotic biology as a powerful approach to address this need. We constructed a BAC library in Escherichia coli from genomic DNA of the Gram-positive bacterium Bacillus cereus. This library provides 5.75-fold coverage of the B. cereus genome, with an average insert size of 98 kb. To determine the extent of heterologous expression of B. cereus genes in the library, we screened it for expression of several B. cereus activities in the E. coli host. Clones expressing 6 of 10 activities tested were identified in the library, namely, ampicillin resistance, zwittermicin A resistance, esculin hydrolysis, hemolysis, orange pigment production, and lecithinase activity. We analyzed selected BAC clones genetically to rapidly identify specific B. cereus loci. These results suggest that BAC libraries will provide a powerful approach for studying gene expression from diverse prokaryotes. PMID:10339608
The Acquisition of Electronic Books in the Area of Astronomy in the UNAM
NASA Astrophysics Data System (ADS)
Juarez, B.
2015-04-01
The current high cost of electronic books, coupled with the low budget for the astronomy libraries of the National Autonomous University of Mexico (UNAM), led the three libraries in the area of Astronomy to become part of the Group of Libraries in the area of sciences in 2011. This group was formed to support the purchasing of e-books: because of their high cost, it was impossible for each library to acquire the materials of interest on its own. As a result, a working group was formed to prepare lists of e-books to purchase in order to avoid duplication, acquire all needed titles, and combine resources. The goal was to purchase e-books from large publishing companies such as Springer and Elsevier, and also Cambridge UP, Oxford UP, World Scientific, the Astronomical Society of the Pacific, and others. Through these joint purchases, the three campus Astronomy libraries — University City, Ensenada, and Morelia — have benefited from the acquisition of e-books from 2010 to 2013. This paper will present the way the working group functioned, the policies that needed to be followed with regard to the selection and acquisition of e-books, and the benefits at both the library and group level.
Re-inventing Data Libraries: Ensuring Continuing Access To Curated (Value-added) Data
NASA Astrophysics Data System (ADS)
Burnhill, P.; Medyckyj-Scott, D.
2008-12-01
How many years of inexperience do we need in using, and in particular sharing, digital data generated by others? That history pre-dates, but must also gain leverage from, the emergence of the digital library. Much of this sharing was done within research groups, but recent attention to spatial data infrastructure highlights the importance of achieving several 'right mixes': * between Internet standards, geo-specific referencing, and domain-specific vocabulary (cf. ontology); * between attention to user-focused services and machine-to-machine interoperability; * between the demands of current high-quality services, the practice of data curation, and the need for long-term preservation. This presentation will draw upon ideas and experience from data library services in research universities, a national (UK) academic data centre, and developments in digital curation. It will be argued that the 1980s term 'data library' has some polemic value in that we have yet to learn what it means to 'do library' for data: more than "a bit like inter-galactic library loan", perhaps. Illustrations will be drawn from a multi-faceted database of digitized boundaries (UKBORDERS), through the first Internet map delivery of national mapping agency data (Digimap), to strategic positioning to help geo-enable academic and scientific data and so enhance research (in the UK, in Europe, and beyond).
A UV Spectral Library of Metal-Poor Massive Stars
NASA Astrophysics Data System (ADS)
Robert, Carmelle
1994-01-01
We propose to use the FOS to build a snapshot library of UV spectra of a sample of about 50 metal-poor massive stars located in the Magellanic Clouds. Most existing libraries contain spectra of hot stars with chemical abundances close to solar. The high spectral resolution achieved with the FOS will be a major factor in the uniqueness of this new library. UV spectral libraries represent fundamental tools for the study of the massive star populations of young star-forming regions. Massive stars, which are impossible to identify directly in the optical-IR part of a composite spectrum, nevertheless display key signatures in the UV region. These signatures are mainly broad, metallicity-dependent spectral features formed in the hot star winds. They require a high spectral resolution (of the order of 200-300 km/s) for an adequate study. A spectral library of metal-poor massive stars also represents a unique source of data for stellar atmosphere analysis. In less than 10 min we will obtain a high signal-to-noise ratio of at least 30. Finally, since short exposure times are possible, this proposal makes extremely good use of the capabilities of HST. We designed an observing strategy which yields a maximum scientific return at a minimum cost of spacecraft time.
NASA Technical Reports Server (NTRS)
Nelson, Michael L.
1997-01-01
Our objective was to study the feasibility of extending the Dienst protocol to enable a multi-discipline, multi-format digital library. We implemented two new technologies: cluster functionality and publishing buckets. We have designed a possible implementation of clusters and buckets, and have prototyped some aspects of the resultant digital library. Currently, digital libraries are segregated by the disciplines they serve (computer science, aeronautics, etc.), and by the format of their holdings (reports, software, datasets, etc.). NCSTRL+ is a multi-discipline, multi-format digital library (DL) prototype created to explore the design and implementation issues involved in creating a unified, canonical scientific and technical information (STI) DL. NCSTRL+ is based on the Networked Computer Science Technical Report Library (NCSTRL), a World Wide Web (WWW) accessible DL that provides access to over 80 university departments and laboratories. We have extended the Dienst protocol (version 4.1.8), the protocol underlying NCSTRL, to provide the ability to cluster independent collections into a logically centralized DL based upon subject category classification, type of organization, and genre of material. The concept of buckets provides a mechanism for publishing and managing logically linked entities with multiple data formats.
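The 'bucket' idea (one logically linked publication carrying several data formats, grouped into clusters) is described only abstractly here. A minimal, hypothetical sketch of such a container, with invented field names rather than the actual Dienst/NCSTRL+ data model, could look like this:

    # Hypothetical sketch of a publishing "bucket"; names and fields are illustrative only.
    from dataclasses import dataclass, field

    @dataclass
    class Bucket:
        identifier: str                              # e.g. a report number
        cluster: str                                 # subject, organization, or genre grouping
        formats: dict = field(default_factory=dict)  # format name -> file path or URL

        def publish(self, fmt, location):
            """Attach one representation (report PDF, dataset, software archive, ...)."""
            self.formats[fmt] = location

    bucket = Bucket("report-0001", cluster="aeronautics")
    bucket.publish("report-pdf", "report-0001.pdf")
    bucket.publish("dataset", "report-0001-data.tar.gz")
    print(bucket.formats)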
Rübel, Oliver; Dougherty, Max; Prabhat; Denes, Peter; Conant, David; Chang, Edward F.; Bouchard, Kristofer
2016-01-01
Neuroscience continues to experience a tremendous growth in data: in terms of the volume and variety of data, the velocity at which data is acquired, and in turn the veracity of data. These challenges are a serious impediment to sharing of data, analyses, and tools within and across labs. Here, we introduce BRAINformat, a novel data standardization framework for the design and management of scientific data formats. The BRAINformat library defines application-independent design concepts and modules that together create a general framework for standardization of scientific data. We describe the formal specification of scientific data standards, which facilitates sharing and verification of data and formats. We introduce the concept of Managed Objects, enabling semantic components of data formats to be specified as self-contained units, supporting modular and reusable design of data format components and file storage. We also introduce the novel concept of Relationship Attributes for modeling and use of semantic relationships between data objects. Based on these concepts we demonstrate the application of our framework to design and implement a standard format for electrophysiology data and show how data standardization and relationship-modeling facilitate data analysis and sharing. The format uses HDF5, enabling portable, scalable, and self-describing data storage and integration with modern high-performance computing for data-driven discovery. The BRAINformat library is open source and easy to use, provides detailed user and developer documentation, and is freely available at: https://bitbucket.org/oruebel/brainformat. PMID:27867355
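As a rough illustration of the kind of self-describing HDF5 storage the abstract refers to, the following sketch uses h5py directly; it is not the BRAINformat API, and every group, dataset, and attribute name is invented.

    # Generic HDF5 sketch with h5py (illustrative only, not the BRAINformat library).
    import numpy as np
    import h5py

    with h5py.File("ephys_example.h5", "w") as f:
        rec = f.create_group("recording_0")                      # a self-contained unit
        volts = rec.create_dataset("voltage", data=np.random.randn(4, 1000))
        volts.attrs["units"] = "mV"                              # self-describing metadata
        volts.attrs["sampling_rate_hz"] = 30000.0
        rec.create_dataset("electrode_ids", data=np.arange(4))
        # A simple attribute recording which object the data relates to, loosely
        # in the spirit of relationship modelling between data objects.
        volts.attrs["electrodes"] = "/recording_0/electrode_ids"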
National Technology Center and photonics
NASA Astrophysics Data System (ADS)
Vlannes, Nickolas P.
1992-05-01
A National Technology Center is proposed in order to meet the international challenges to the economy and security of the United States. This center would be tasked with the acquisition, analysis, assessment, and dissemination of worldwide scientific and technical information and data; technology transfer to the United States; and research and development in information and library sciences and technology. The National Technology Center would form a national network linking centers of excellence and expertise, and maintain a national technology library. With these functions, the National Technology Center has inherent requirements for technologies based on photonics, and will further motivate developments in this field.
Things Change, People Change, Libraries Go on: E-books or Not E-books?
NASA Astrophysics Data System (ADS)
Martines, F.
2015-04-01
The aim of this paper is to describe how e-books work and how they can be managed in a scientific or research library; specifically, to discuss the viability of e-lending. The results were somewhat surprising and even slightly confusing. Unquestionably, e-books have enormous potential, but much of this potential is untapped. Although there is widespread awareness of the advantages of e-books among users and librarians, the problems and challenges are not as well known. After a discussion of the potential advantages, I will concentrate on some of the real drawbacks of e-books.
Numerical ‘health check’ for scientific codes: the CADNA approach
NASA Astrophysics Data System (ADS)
Scott, N. S.; Jézéquel, F.; Denis, C.; Chesneaux, J.-M.
2007-04-01
Scientific computation has unavoidable approximations built into its very fabric. One important source of error that is difficult to detect and control is round-off error propagation which originates from the use of finite precision arithmetic. We propose that there is a need to perform regular numerical 'health checks' on scientific codes in order to detect the cancerous effect of round-off error propagation. This is particularly important in scientific codes that are built on legacy software. We advocate the use of the CADNA library as a suitable numerical screening tool. We present a case study to illustrate the practical use of CADNA in scientific codes that are of interest to the Computer Physics Communications readership. In doing so we hope to stimulate a greater awareness of round-off error propagation and present a practical means by which it can be analyzed and managed.
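The kind of error the authors want to screen for is easy to reproduce. The small example below uses plain Python floats, not the CADNA library (which instruments a code with stochastic arithmetic to estimate how many digits of a result are reliable); it only shows the cancellation problem itself.

    # Catastrophic cancellation in finite-precision arithmetic (illustration only).
    import math

    x = 1e-8
    naive  = (1.0 - math.cos(x)) / x**2          # cancellation: evaluates to 0.0
    stable = 2.0 * math.sin(x / 2.0)**2 / x**2   # algebraically identical, gives ~0.5

    # The exact value of (1 - cos x)/x**2 tends to 0.5 as x -> 0.
    print(naive, stable)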
Management of scientific information with Google Drive.
Kubaszewski, Łukasz; Kaczmarczyk, Jacek; Nowakowski, Andrzej
2013-09-20
The amount and diversity of scientific publications require a modern management system. By "management" we mean the process of gathering interesting information for the purpose of reading and archiving it for quick access in future clinical practice and research activity. In the past, such a system required the physical existence of a library, either institutional or private. Nowadays, in an era dominated by electronic information, it is natural to migrate entire systems to a digital form. In the following paper we describe the structure and functions of an individual electronic library system (IELiS) for the management of scientific publications based on the Google Drive service. Architecture of the system. The system consists of a central element and peripheral devices. The central element of the system is the virtual Google Drive provided by Google Inc. The physical elements of the system include a tablet with the Android operating system and a personal computer, both with internet access. Required software includes a program to view and edit files in PDF format for mobile devices and another to synchronize the files. Functioning of the system. The first step in creating the system is the collection of scientific papers in PDF format and their analysis. This step is performed most frequently on a tablet. At this stage, after being read, the papers are cataloged in a system of folders and subfolders, according to individual demands. During this stage, but not exclusively, the PDF files are annotated by the reader. This allows the user to quickly track down interesting information in the review or research process. Modification of the document title is performed at this stage as well. The second element of the system is the creation of a mirror database in the Google Drive virtual storage. Modified and cataloged papers are synchronized with Google Drive. At this stage, a fully functional scientific information electronic library becomes available online. The third element of the system is a periodic two-way synchronization of data between Google Drive and the tablet, as occasional modification of the files by annotation or recataloging may be performed at both locations. The system architecture is designed to gather, catalog and analyze scientific publications. All steps are electronic, eliminating paper forms. Indexed files are available for re-reading and modification. The system allows fast full-text search, with additional features making research easier. Team collaboration is also possible, with full control of user privileges. Particularly important is the safety of collected data. In our opinion, the system exceeds many commercially available applications in terms of functionality and versatility.
Lambert W function for applications in physics
NASA Astrophysics Data System (ADS)
Veberič, Darko
2012-12-01
The Lambert W(x) function and its possible applications in physics are presented. The actual numerical implementation in C++ consists of Halley's and Fritsch's iterations with initial approximations based on branch-point expansion, asymptotic series, rational fits, and continued-logarithm recursion. Program summary Program title: LambertW Catalogue identifier: AENC_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AENC_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License version 3 No. of lines in distributed program, including test data, etc.: 1335 No. of bytes in distributed program, including test data, etc.: 25 283 Distribution format: tar.gz Programming language: C++ (with suitable wrappers it can be called from C, Fortran etc.), the supplied command-line utility is suitable for other scripting languages like sh, csh, awk, perl etc. Computer: All systems with a C++ compiler. Operating system: All Unix flavors, Windows. It might work with others. RAM: Small memory footprint, less than 1 MB Classification: 1.1, 4.7, 11.3, 11.9. Nature of problem: Find fast and accurate numerical implementation for the Lambert W function. Solution method: Halley's and Fritsch's iterations with initial approximations based on branch-point expansion, asymptotic series, rational fits, and continued logarithm recursion. Additional comments: Distribution file contains the command-line utility lambert-w. Doxygen comments, included in the source files. Makefile. Running time: The tests provided take only a few seconds to run.
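For orientation, Halley's iteration for the principal branch W0 can be sketched in a few lines of Python. This is only a rough sketch with a crude starting value for x >= 0, not the C++ implementation described above, which uses far more careful branch-point and asymptotic initial approximations.

    # Minimal sketch of Halley's iteration for the principal branch W0(x), x >= 0.
    import math

    def lambert_w0(x, tol=1e-15, max_iter=50):
        if x < 0:
            raise ValueError("this sketch handles only x >= 0")
        # Crude initial guess: log-based far from zero, log(1 + x) near zero.
        w = math.log(1.0 + x) if x < math.e else math.log(x) - math.log(math.log(x))
        for _ in range(max_iter):
            ew = math.exp(w)
            f = w * ew - x                                    # we want f(w) = 0
            step = f / (ew * (w + 1.0) - (w + 2.0) * f / (2.0 * w + 2.0))
            w -= step
            if abs(step) <= tol * (1.0 + abs(w)):
                break
        return w

    print(lambert_w0(1.0))   # ~0.567143290409784, the omega constant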
EEGVIS: A MATLAB Toolbox for Browsing, Exploring, and Viewing Large Datasets.
Robbins, Kay A
2012-01-01
Recent advances in data monitoring and sensor technology have accelerated the acquisition of very large data sets. Streaming data sets from instrumentation such as multi-channel EEG recording usually must undergo substantial pre-processing and artifact removal. Even when using automated procedures, most scientists engage in laborious manual examination and processing to assure high-quality data and to identify interesting or problematic data segments. Researchers also do not have a convenient method of visually assessing the effects of applying any stage in a processing pipeline. EEGVIS is a MATLAB toolbox that allows users to quickly explore multi-channel EEG and other large array-based data sets using multi-scale drill-down techniques. Customizable summary views reveal potentially interesting sections of data, which users can explore further by clicking to examine them with detailed viewing components. The viewer and a companion browser are built on our MoBBED framework, which has a library of modular viewing components that can be mixed and matched to best reveal structure. Users can easily create new viewers for their specific data without any programming during the exploration process. These viewers automatically support pan, zoom, resizing of individual components, and cursor exploration. The toolbox can be used directly in MATLAB at any stage in a processing pipeline, as a plug-in for EEGLAB, or as a standalone precompiled application without MATLAB running. EEGVIS and its supporting packages are freely available under the GNU General Public License at http://visual.cs.utsa.edu/eegvis.
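The 'summary view' idea (block-wise statistics that point the user at windows worth drilling into) can be illustrated generically. The sketch below uses NumPy rather than the MATLAB toolbox itself, and the window length and flagging rule are arbitrary choices.

    # Generic sketch of a summary view over a channels-by-samples array (not EEGVIS).
    import numpy as np

    def window_summary(data, window):
        """Return the per-window standard deviation, shape (channels, windows)."""
        n_channels, n_samples = data.shape
        n_windows = n_samples // window
        trimmed = data[:, :n_windows * window].reshape(n_channels, n_windows, window)
        return trimmed.std(axis=2)

    rng = np.random.default_rng(0)
    eeg = rng.standard_normal((8, 10_000))        # stand-in for an 8-channel recording
    summary = window_summary(eeg, window=500)
    threshold = np.percentile(summary, 95)        # flag the most variable 5% of windows
    flagged = np.argwhere(summary > threshold)    # (channel, window) pairs to drill into
    print(flagged[:5])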
XSUMMER - Transcendental functions and symbolic summation in FORM
NASA Astrophysics Data System (ADS)
Moch, S.; Uwer, P.
2006-05-01
Harmonic sums and their generalizations are extremely useful in the evaluation of higher-order perturbative corrections in quantum field theory. Of particular interest have been the so-called nested sums, where the harmonic sums and their generalizations appear as building blocks, originating, for example, from the expansion of generalized hypergeometric functions around integer values of the parameters. In this paper we discuss the implementation of several algorithms to solve these sums by algebraic means, using the computer algebra system FORM. Program summary Title of program: XSUMMER Catalogue identifier: ADXQ_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXQ_v1_0 Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland License: GNU Public License and FORM License Computers: all Operating system: all Program language: FORM Memory required to execute: Depending on the complexity of the problem, recommended at least 64 MB RAM No. of lines in distributed program, including test data, etc.: 9854 No. of bytes in distributed program, including test data, etc.: 126 551 Distribution format: tar.gz Other programs called: none External files needed: none Nature of the physical problem: Systematic expansion of higher transcendental functions in a small parameter. The expansions arise in the calculation of loop integrals in perturbative quantum field theory. Method of solution: Algebraic manipulations of nested sums. Restrictions on complexity of the problem: Usually limited only by the available disk space. Typical running time: Dependent on the complexity of the problem.
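For readers unfamiliar with the objects involved, nested sums of Euler-Zagier type can be evaluated numerically in a few lines of Python under one common convention (strictly ordered summation indices). XSUMMER itself manipulates such sums symbolically in FORM, so this is only an illustration of the definition.

    # Numerical illustration of nested sums Z(n; m1, m2, ...) with strictly ordered
    # indices n >= i1 > i2 > ... >= 1 (one common convention; not XSUMMER's algebra).
    from fractions import Fraction

    def nested_sum(n, weights):
        """Z(n; m1, m2, ...) = sum over n >= i1 > i2 > ... >= 1 of 1/(i1**m1 * i2**m2 * ...)."""
        if not weights:
            return Fraction(1)
        m, rest = weights[0], weights[1:]
        return sum(Fraction(1, i**m) * nested_sum(i - 1, rest) for i in range(1, n + 1))

    print(nested_sum(10, [1]))      # the harmonic number H_10 = 7381/2520
    print(nested_sum(10, [2, 1]))   # a weight-3 nested sum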
NASA Astrophysics Data System (ADS)
Kroll, Peter
The real heritage of Sonneberg Observatory consists of several buildings with seven domes, a number of telescopes for photographic and photoelectric measurements, a plate archive - which is the second-largest in the world - and a scientific library. While the instruments are today mainly used for public observing tours and to a limited degree for continuing sky patrol, the plate archive is systematically scanned in order to make the whole information stored in the emulsion of the plates accessible to the astronomical community and to allow the scientific study of all stars ever recorded. First pilot studies give a taste of what output can be expected from the digitized plate archive.
Front End Software for Online Database Searching. Part 2: The Marketplace.
ERIC Educational Resources Information Center
Levy, Louise R.; Hawkins, Donald T.
1986-01-01
This article analyzes the front end software marketplace and discusses some of the complex forces influencing it. Discussion covers intermediary market; end users (library customers, scientific and technical professionals, corporate business specialists, consumers); marketing strategies; a British front end development firm; competitive pressures;…
Online Bioinformatics Tutorials | Office of Cancer Genomics
Bioinformatics is a scientific discipline that applies computer science and information technology to help understand biological processes. The NIH provides a list of free online bioinformatics tutorials, either generated by the NIH Library or other institutes, which includes introductory lectures and "how to" videos on using various tools.
Defending Champions Reign Supreme at 2017 Student Science Jeopardy Tournament | Poster
Anuk Dayaprema and Evan Yamaguchi, champions of the 2016 Student Science Jeopardy Tournament, have done it again. After a grueling competition, they emerged victorious for the second year in a row at the 2017 Student Science Jeopardy Tournament, sponsored by the Scientific Library.
Information Competencies for Chemistry Undergraduates and Related Collaborative Endeavors
ERIC Educational Resources Information Center
Peters, Marion C.
2014-01-01
"Information Competencies for Chemistry Undergraduates: The Elements of Information Literacy", (2012-) now in its second edition and available as a Wikibook since 2012, resulted from collaboration by chemistry librarians participating in several professional organizations. Sections covering a) the library and scientific literature and b)…
48 CFR 53.235 - Research and Development Contracting (SF 298).
Code of Federal Regulations, 2013 CFR
2013-10-01
Excerpt: 48 Federal Acquisition Regulations System, 2013-10-01, Federal Acquisition Regulation (continued), Clauses and Forms, Prescription of Forms, 53.235 Research and Development Contracting (SF 298): … scientific and technical reports to contracting officers and to technical information libraries, as specified …