ERIC Educational Resources Information Center
McDonald, Joseph
1986-01-01
Focusing on management decisions in academic libraries, this article compares management information systems (MIS) with decision support systems (DSS) and discusses the decision-making process, information needs of library managers, sources of data, reasons for choosing a microcomputer, preprogrammed application software, prototyping a system, and…
Library-Specific Microcomputer Software.
ERIC Educational Resources Information Center
Levert, Virginia M.
1985-01-01
Discusses number and type of microcomputer software programs useful to libraries and types of hardware on which they run, as identified by Nolan Information Management Services. Highlights include general application programs, applications designed to support library technical processes, producers of library software, and choosing among options.…
Microcomputer Software for Libraries: A Survey.
ERIC Educational Resources Information Center
Nolan, Jeanne M.
1983-01-01
Reports on findings of research done by Nolan Information Management Services concerning availability of microcomputer software for libraries. Highlights include software categories (specific, generic-database management programs, original); number of programs available in 1982 for 12 applications; projections for 1983; and future software…
Teaching Reprint File Management: Basic Principles and Software Programs.
ERIC Educational Resources Information Center
Wood, Elizabeth H.
1989-01-01
Describes a workshop for teaching library users how to manage reprint files which was developed at the University of Southern California Norris Medical Library. Software programs designed for this purpose are suggested, and a sidebar lists software features to consider. (eight references) (MES)
Microcomputer Software Packages--Choose with Caution.
ERIC Educational Resources Information Center
Naumer, Janet Noll
1983-01-01
Briefly discusses types of software available for library and media center operations and library instruction, suggests three sources of software reviews, and describes almost 50 specific application programs available for bibliographic management, cataloging, circulation, inventory and purchasing, readability, and teaching library skills in…
Selecting Software for Libraries.
ERIC Educational Resources Information Center
Beiser, Karl
1993-01-01
Discusses resources and strategies that libraries can use to evaluate competing database management software for purchase. Needs assessments, types of software available, features of good software, evaluation aids, shareware, and marketing and product trends are covered. (KRN)
Managing Automation: A Process, Not a Project.
ERIC Educational Resources Information Center
Hoffmann, Ellen
1988-01-01
Discussion of issues in management of library automation includes: (1) hardware, including systems growth and contracts; (2) software changes, vendor relations, local systems, and microcomputer software; (3) item and authority databases; (4) automation and library staff, organizational structure, and managing change; and (5) environmental issues,…
The Vendors' Corner: Biblio-Techniques' Library and Information System (BLIS).
ERIC Educational Resources Information Center
Library Software Review, 1984
1984-01-01
Describes online catalog and integrated library computer system designed to enhance Washington Library Network's software. Highlights include system components; implementation options; system features (integrated library functions, database design, system management facilities); support services (installation and training, software maintenance and…
ERIC Educational Resources Information Center
Benson, Allen C.
This handbook is designed to help readers identify and eliminate security risks, with sound recommendations and library-tested security software. Chapter 1 "Managing Your Facilities and Assessing Your Risks" addresses fundamental management responsibilities including planning for a secure system, organizing computer-related information, assessing…
GeoTess: A generalized Earth model software utility
Ballard, Sanford; Hipp, James; Kraus, Brian; ...
2016-03-23
GeoTess is a model parameterization and software support library that manages the construction, population, storage, and interrogation of data stored in 2D and 3D Earth models. Here, the software is available in Java and C++, with a C interface to the C++ library.
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, Roy H.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.
1987-01-01
The Software Automation, Generation and Administration (SAGA) project is investigating the design and construction of practical software engineering environments for developing and maintaining aerospace systems and applications software. The research includes the practical organization of the software lifecycle, configuration management, software requirements specifications, executable specifications, design methodologies, programming, verification, validation and testing, version control, maintenance, the reuse of software, software libraries, documentation, and automated management.
The Use of BookshelF in Teaching Students of Information and Library Management.
ERIC Educational Resources Information Center
Rowley, Jenny E.; Fisher, Shelagh
1992-01-01
Describes how BookshelF software, a library management system, has been integrated into the library and information management curriculum at Manchester Polytechnic (England), and discusses the learning objectives that have been achieved. The use of BookshelF modules for ordering, circulation control, cataloging, OPAC (online public access…
Libraries for Software Use on Peregrine | High-Performance Computing | NREL
…-specific libraries. Libraries list (name: description): BLAS: Basic Linear Algebra Subroutines, libraries only; …: managing hierarchically structured data; LAPACK: Standard Netlib offering for computational linear algebra.
Mining Software Usage with the Automatic Library Tracking Database (ALTD)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hadri, Bilel; Fahey, Mark R
2013-01-01
Tracking software usage is important for HPC centers, computer vendors, code developers and funding agencies to provide more efficient and targeted software support, and to forecast needs and guide HPC software effort towards the Exascale era. However, accurately tracking software usage on HPC systems has been a challenging task. In this paper, we present a tool called Automatic Library Tracking Database (ALTD) that has been developed and put in production on several Cray systems. The ALTD infrastructure prototype automatically and transparently stores information about libraries linked into an application at compilation time and also the executables launched in a batch job. We will illustrate the usage of libraries, compilers and third party software applications on a system managed by the National Institute for Computational Sciences.
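As a rough illustration of the idea behind ALTD (not its actual schema or implementation), the following Python sketch records which libraries a hypothetical link command pulls into an executable and then reports library usage counts from a small SQLite database; the table name, parsing rules, and demo command line are assumptions made here for illustration.

```python
# Hedged sketch: a minimal, hypothetical imitation of the ALTD idea --
# intercept a link command, record which libraries an executable pulls in,
# and store that in a small database for later usage reports.
# Table name and fields are illustrative, not ALTD's actual schema.
import sqlite3
import shlex

def record_link(db_path: str, executable: str, link_cmd: str) -> None:
    """Parse a linker command line and store the libraries it references."""
    libs = []
    for tok in shlex.split(link_cmd):
        if tok.startswith("-l"):           # e.g. -lfftw3 -> fftw3
            libs.append(tok[2:])
        elif tok.endswith((".a", ".so")):  # explicit archive / shared object paths
            libs.append(tok)
    with sqlite3.connect(db_path) as con:
        con.execute("CREATE TABLE IF NOT EXISTS link_records "
                    "(executable TEXT, library TEXT)")
        con.executemany("INSERT INTO link_records VALUES (?, ?)",
                        [(executable, lib) for lib in libs])

def library_usage(db_path: str):
    """Count how many executables link each library (a simple usage report)."""
    with sqlite3.connect(db_path) as con:
        return con.execute("SELECT library, COUNT(DISTINCT executable) "
                           "FROM link_records GROUP BY library "
                           "ORDER BY 2 DESC").fetchall()

if __name__ == "__main__":
    record_link("altd_demo.db", "a.out",
                "cc main.o -L/opt/fftw/lib -lfftw3 -lm /opt/hdf5/lib/libhdf5.a")
    print(library_usage("altd_demo.db"))
```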
Computers and Library Management.
ERIC Educational Resources Information Center
Cooke, Deborah M.; And Others
1985-01-01
This five-article section discusses changes in the management of the school library resulting from use of the computer. Topics covered include data management programs (record keeping, word processing, and bibliographies); practical applications of a database; evaluation of "Circulation Plus" software; ergonomics and computers; and…
ERIC Educational Resources Information Center
Matoria, Ram Kumar; Upadhyay, P. K.; Moni, Madaswamy
2007-01-01
Purpose: To describe the development of the library management system, e-Granthalaya, for public libraries in India. This is an initiative of the Indian government's National Informatics Centre (NIC). The paper outlines the challenges and the potential of a full-scale deployment of this software at a national level. Design/methodology/approach:…
Migration of Data from One Library Management System to Another: A Case Study in India
ERIC Educational Resources Information Center
Matoria, Ram Kumar; Upadhyay, P. K.
2005-01-01
Purpose: To share the experiences gained during the migration of library data from one library management system (LibSys[TM]) to another (e-Granthalaya[TM]). Design/methodology/approach: The paper describes the step-by-step approach taken to migrate the existing library data to the new software. The paper also discusses the peculiarities of the…
Selecting Data-Base Management Software for Microcomputers in Libraries and Information Units.
ERIC Educational Resources Information Center
Pieska, K. A. O.
1986-01-01
Presents a model for the evaluation of database management systems software from the viewpoint of librarians and information specialists. The properties of data management systems, database management systems, and text retrieval systems are outlined and compared. (10 references) (CLB)
1994-02-28
improvements. Release Manager: The Release Manager provides franchisees with media copies of existing libraries, as needed. Security… implementors, and potential library franchisees. Security Team: The Security Team assists the Security Officer with security analysis. Team members are… and Franchisees. A Potential User is an individual who requests a Library Account. A User Recruit has been sent a CARDS Library Account Registration…
ERIC Educational Resources Information Center
Ertel, Monica M.
1984-01-01
This discussion of current microcomputer technologies available to libraries focuses on software applications in four major classifications: communications (online database searching); word processing; administration; and database management systems. Specific examples of library applications are given and six references are cited. (EJS)
Real Time with the Librarian: Using Web Conferencing Software to Connect to Distance Students
ERIC Educational Resources Information Center
Riedel, Tom; Betty, Paul
2013-01-01
A pilot program to provide real-time library webcasts to Regis University distance students using Adobe Connect software was initiated in fall of 2011. Previously, most interaction between librarians and online students had been accomplished by asynchronous discussion threads in the Learning Management System. Library webcasts were offered in…
ERIC Educational Resources Information Center
See, Andrew; Teetor, Travis Stephen
2014-01-01
In the summer of 2012, the University of Arizona Libraries implemented an online training program to effectively train Access Services staff and student employees at a large academic research library. This article discusses the program, which was built using a course management system (D2L) and various e-Learning software applications (Articulate…
Repository-Based Software Engineering Program: Working Program Management Plan
NASA Technical Reports Server (NTRS)
1993-01-01
Repository-Based Software Engineering Program (RBSE) is a National Aeronautics and Space Administration (NASA) sponsored program dedicated to introducing and supporting common, effective approaches to software engineering practices. The process of conceiving, designing, building, and maintaining software systems by using existing software assets that are stored in a specialized operational reuse library or repository, accessible to system designers, is the foundation of the program. In addition to operating a software repository, RBSE promotes (1) software engineering technology transfer, (2) academic and instructional support of reuse programs, (3) the use of common software engineering standards and practices, (4) software reuse technology research, and (5) interoperability between reuse libraries. This Program Management Plan (PMP) is intended to communicate program goals and objectives, describe major work areas, and define a management report and control process. This process will assist the Program Manager, University of Houston at Clear Lake (UHCL) in tracking work progress and describing major program activities to NASA management. The goal of this PMP is to make managing the RBSE program a relatively easy process that improves the work of all team members. The PMP describes work areas addressed and work efforts being accomplished by the program; however, it is not intended as a complete description of the program. Its focus is on providing management tools and management processes for monitoring, evaluating, and administering the program; and it includes schedules for charting milestones and deliveries of program products. The PMP was developed by soliciting and obtaining guidance from appropriate program participants, analyzing program management guidance, and reviewing related program management documents.
NA-42 TI Shared Software Component Library FY2011 Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knudson, Christa K.; Rutz, Frederick C.; Dorow, Kevin E.
The NA-42 TI program initiated an effort in FY2010 to standardize its software development efforts with the long term goal of migrating toward a software management approach that will allow for the sharing and reuse of code developed within the TI program, improve integration, ensure a level of software documentation, and reduce development costs. The Pacific Northwest National Laboratory (PNNL) has been tasked with two activities that support this mission. PNNL has been tasked with the identification, selection, and implementation of a Shared Software Component Library. The intent of the library is to provide a common repository that is accessible by all authorized NA-42 software development teams. The repository facilitates software reuse through a searchable and easy to use web based interface. As software is submitted to the repository, the component registration process captures meta-data and provides version control for compiled libraries, documentation, and source code. This meta-data is then available for retrieval and review as part of library search results. In FY2010, PNNL and staff from the Remote Sensing Laboratory (RSL) teamed up to develop a software application with the goal of replacing the aging Aerial Measuring System (AMS). The application under development includes an Advanced Visualization and Integration of Data (AVID) framework and associated AMS modules. Throughout development, PNNL and RSL have utilized a common AMS code repository for collaborative code development. The AMS repository is hosted by PNNL, is restricted to the project development team, is accessed via two different geographic locations and continues to be used. The knowledge gained from the collaboration and hosting of this repository in conjunction with PNNL software development and systems engineering capabilities were used in the selection of a package to be used in the implementation of the software component library on behalf of NA-42 TI. The second task managed by PNNL is the development and continued maintenance of the NA-42 TI Software Development Questionnaire. This questionnaire is intended to help software development teams working under NA-42 TI in documenting their development activities. When sufficiently completed, the questionnaire illustrates that the software development activities recorded incorporate significant aspects of the software engineering lifecycle. The questionnaire template is updated as comments are received from NA-42 and/or its development teams and revised versions distributed to those using the questionnaire. PNNL also maintains a list of questionnaire recipients. The blank questionnaire template, the AVID and AMS software being developed, and the completed AVID AMS specific questionnaire are being used as the initial content to be established in the TI Component Library. This report summarizes the approach taken to identify requirements, search for and evaluate technologies, and the approach taken for installation of the software needed to host the component library. Additionally, it defines the process by which users request access for the contribution and retrieval of library content.
Component Technology for High-Performance Scientific Simulation Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Epperly, T; Kohn, S; Kumfert, G
2000-11-09
We are developing scientific software component technology to manage the complexity of modern, parallel simulation software and increase the interoperability and re-use of scientific software packages. In this paper, we describe a language interoperability tool named Babel that enables the creation and distribution of language-independent software libraries using interface definition language (IDL) techniques. We have created a scientific IDL that focuses on the unique interface description needs of scientific codes, such as complex numbers, dense multidimensional arrays, complicated data types, and parallelism. Preliminary results indicate that in addition to language interoperability, this approach provides useful tools for thinking about the design of modern object-oriented scientific software libraries. Finally, we also describe a web-based component repository called Alexandria that facilitates the distribution, documentation, and re-use of scientific components and libraries.
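For context, the interoperability boilerplate that an IDL tool such as Babel generates automatically looks roughly like the by-hand version below, sketched here in Python with ctypes under the assumption that a C math library is available on the system; none of this reflects Babel's own API.

```python
# Hedged sketch: the manual language-interoperability work that an IDL tool
# like Babel automates. Here Python calls a C function via ctypes; note that
# argument and return types must be declared by hand for every entry point.
# The library name is platform dependent (libm.so.6 on most Linux systems).
import ctypes
import ctypes.util

libm_path = ctypes.util.find_library("m") or "libm.so.6"
libm = ctypes.CDLL(libm_path)

# Without explicit declarations, ctypes would assume int arguments/returns.
libm.cos.argtypes = [ctypes.c_double]
libm.cos.restype = ctypes.c_double

print(libm.cos(0.0))   # 1.0
```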
Airland Battlefield Environment (ALBE) Tactical Decision Aid (TDA) Demonstration Program,
1987-11-12
Management System (DBMS) software, GKS graphics libraries, and user interface software. These components of the ATB system software architecture will be… knowledge base and augment the decision making process by providing information useful in the formulation and execution of battlefield strategies… Topographic Laboratories as an Engineer. Ms. Capps is managing the software development of the AirLand Battlefield Environment (ALBE) geographic…
ERIC Educational Resources Information Center
Alloro, Giovanna; Ugolini, Donatella
1992-01-01
Describes the implementation of an online catalog in the library of the National Institute for Cancer Research and the Clinical and Experimental Oncology Institute of the University of Genoa. Topics addressed include automation of various library functions, software features, database management, training, and user response. (10 references) (MES)
International Inventory of Software Packages in the Information Field.
ERIC Educational Resources Information Center
Keren, Carl, Ed.; Sered, Irina, Ed.
Designed to provide guidance in selecting appropriate software for library automation, information storage and retrieval, or management of bibliographic databases, this inventory describes 188 computer software packages. The information was obtained through a questionnaire survey of 600 software suppliers and developers who were asked to describe…
Integration of the NRL Digital Library.
ERIC Educational Resources Information Center
King, James
2001-01-01
The Naval Research Laboratory (NRL) Library has identified six primary areas that need improvement: infrastructure, InfoWeb, TORPEDO Ultra, journal data management, classified data, and linking software. It is rebuilding InfoWeb and TORPEDO Ultra as database-driven Web applications, upgrading the STILAS library catalog, and creating other support…
Modular Filter and Source-Management Upgrade of RADAC
NASA Technical Reports Server (NTRS)
Lanzi, R. James; Smith, Donna C.
2007-01-01
In an upgrade of the Range Data Acquisition Computer (RADAC) software, a modular software object library was developed to implement required functionality for filtering of flight-vehicle-tracking data and management of tracking-data sources. (The RADAC software is used to process flight-vehicle metric data for realtime display in the Wallops Flight Facility Range Control Center and Mobile Control Center.)
From LAMP to Koha: Case Study of the Pakistan Legislative Assembly Libraries
ERIC Educational Resources Information Center
Shafi-Ullah, Farasat; Qutab, Saima
2012-01-01
Purpose: This paper aims to elaborate the library data migration process from LAMP (Library Automation Management Program) to the open source software Koha's (2.2.8 Windows based) Pakistani flavour PakLAG-Koha in six legislative assembly libraries of Pakistan. Design/methodology/approach: The paper explains different steps of the data migration…
CONTENTdm Digital Collection Management Software and End-User Efficacy
ERIC Educational Resources Information Center
Dickson, Maggie
2008-01-01
Digital libraries and collections are a growing facet of today's traditional library. Digital library technologies have become increasingly more sophisticated in the effort to provide more and better access to the collections they contain. The evaluation of the usability of these technologies has not kept pace with technological developments,…
Bibliographic Management Software Seminars: Funding and Implementation.
ERIC Educational Resources Information Center
Henry, Marcia
This paper contains the grant proposal and final report for a project conducted by the California State University at Northridge library to demonstrate online database searching and introduce the use of bibliographic management software to faculty and graduate students. Day-long, discipline-oriented seminars were planned to increase the…
A conceptual model for megaprogramming
NASA Technical Reports Server (NTRS)
Tracz, Will
1990-01-01
Megaprogramming is component-based software engineering and life-cycle management. Megaprogramming and its relationship to other research initiatives (common prototyping system/common prototyping language, domain specific software architectures, and software understanding) are analyzed. The desirable attributes of megaprogramming software components are identified and a software development model and resulting prototype megaprogramming system (library interconnection language extended by annotated Ada) are described.
ERIC Educational Resources Information Center
Barker, Philip
1986-01-01
Discussion of developments in information storage technology likely to have significant impact upon library utilization focuses on hardware (videodisc technology) and software developments (knowledge databases; computer networks; database management systems; interactive video, computer, and multimedia user interfaces). Three generic computer-based…
EST Express: PHP/MySQL based automated annotation of ESTs from expression libraries
Smith, Robin P; Buchser, William J; Lemmon, Marcus B; Pardinas, Jose R; Bixby, John L; Lemmon, Vance P
2008-01-01
Background Several biological techniques result in the acquisition of functional sets of cDNAs that must be sequenced and analyzed. The emergence of redundant databases such as UniGene and centralized annotation engines such as Entrez Gene has allowed the development of software that can analyze a great number of sequences in a matter of seconds. Results We have developed "EST Express", a suite of analytical tools that identify and annotate ESTs originating from specific mRNA populations. The software consists of a user-friendly GUI powered by PHP and MySQL that allows for online collaboration between researchers and continuity with UniGene, Entrez Gene and RefSeq. Two key features of the software include a novel, simplified Entrez Gene parser and tools to manage cDNA library sequencing projects. We have tested the software on a large data set (2,016 samples) produced by subtractive hybridization. Conclusion EST Express is an open-source, cross-platform web server application that imports sequences from cDNA libraries, such as those generated through subtractive hybridization or yeast two-hybrid screens. It then provides several layers of annotation based on Entrez Gene and RefSeq to allow the user to highlight useful genes and manage cDNA library projects. PMID:18402700
EST Express: PHP/MySQL based automated annotation of ESTs from expression libraries.
Smith, Robin P; Buchser, William J; Lemmon, Marcus B; Pardinas, Jose R; Bixby, John L; Lemmon, Vance P
2008-04-10
Several biological techniques result in the acquisition of functional sets of cDNAs that must be sequenced and analyzed. The emergence of redundant databases such as UniGene and centralized annotation engines such as Entrez Gene has allowed the development of software that can analyze a great number of sequences in a matter of seconds. We have developed "EST Express", a suite of analytical tools that identify and annotate ESTs originating from specific mRNA populations. The software consists of a user-friendly GUI powered by PHP and MySQL that allows for online collaboration between researchers and continuity with UniGene, Entrez Gene and RefSeq. Two key features of the software include a novel, simplified Entrez Gene parser and tools to manage cDNA library sequencing projects. We have tested the software on a large data set (2,016 samples) produced by subtractive hybridization. EST Express is an open-source, cross-platform web server application that imports sequences from cDNA libraries, such as those generated through subtractive hybridization or yeast two-hybrid screens. It then provides several layers of annotation based on Entrez Gene and RefSeq to allow the user to highlight useful genes and manage cDNA library projects.
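As a hedged sketch of the kind of annotation join described above, and not the actual EST Express code or the real Entrez Gene file layout, the following Python fragment attaches gene symbols and descriptions to EST hits using an assumed three-column gene table.

```python
# Hedged sketch of the annotation step EST Express automates: join EST hits
# (est_id -> gene_id) against a tab-delimited gene table to produce annotated
# records for a cDNA library project. Column layout here is illustrative,
# not the actual Entrez Gene distribution format.
import csv

def load_gene_table(path):
    """Map gene_id -> (symbol, description) from an assumed 3-column TSV file."""
    genes = {}
    with open(path, newline="") as fh:
        for gene_id, symbol, description in csv.reader(fh, delimiter="\t"):
            genes[gene_id] = (symbol, description)
    return genes

def annotate_ests(est_hits, genes):
    """est_hits: iterable of (est_id, gene_id); yields annotated records."""
    for est_id, gene_id in est_hits:
        symbol, description = genes.get(gene_id, ("unknown", "no annotation"))
        yield {"est": est_id, "gene": gene_id,
               "symbol": symbol, "description": description}

if __name__ == "__main__":
    genes = {"2099": ("ESR1", "estrogen receptor 1")}   # stand-in for a parsed file
    hits = [("EST0001", "2099"), ("EST0002", "9999")]
    for rec in annotate_ests(hits, genes):
        print(rec)
```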
The U. S. Geological Survey, Digital Spectral Library: Version 1 (0.2 to 3.0um)
Clark, Roger N.; Swayze, Gregg A.; Gallagher, Andrea J.; King, Trude V.V.; Calvin, Wendy M.
1993-01-01
We have developed a digital reflectance spectral library, with management and spectral analysis software. The library includes 498 spectra of 444 samples (some samples include a series of grain sizes) measured from approximately 0.2 to 3.0 um. The spectral resolution (Full Width Half Maximum) of the reflectance data is <= 4 nm in the visible (0.2-0.8 um) and <= 10 nm in the NIR (0.8-2.35 um). All spectra were corrected to absolute reflectance using an NIST Halon standard. Library management software lets users search on parameters (e.g. chemical formulae, chemical analyses, purity of samples, mineral groups, etc.) as well as spectral features. Minerals from borate, carbonate, chloride, element, halide, hydroxide, nitrate, oxide, phosphate, sulfate, sulfide, sulfosalt, and the silicate (cyclosilicate, inosilicate, nesosilicate, phyllosilicate, sorosilicate, and tectosilicate) classes are represented. X-Ray and chemical analyses are tabulated for many of the entries, and all samples have been evaluated for spectral purity. The library also contains end and intermediate members for the olivine, garnet, scapolite, montmorillonite, muscovite, jarosite, and alunite solid-solution series. We have included representative spectra of H2O ice, kerogen, ammonium-bearing minerals, rare-earth oxides, desert varnish coatings, kaolinite crystallinity series, kaolinite-smectite series, zeolite series, and an extensive evaporite series. Because of the importance of vegetation to climate-change studies, we have included 17 spectra of tree leaves, bushes, and grasses. The library and software are available as a series of U.S.G.S. Open File reports. PC user software is available to convert the binary data to ascii files (a separate U.S.G.S. open file report). Additionally, binary data files are online at the U.S.G.S. in Denver for anonymous ftp to users on the Internet. The library search software enables a user to search on documentation parameters as well as spectral features. The analysis system includes general spectral analysis routines, plotting packages, radiative transfer software for computing intimate mixtures, routines to derive optical constants from reflectance spectra, tools to analyze spectral features, and the capability to access imaging spectrometer data cubes for spectral analysis. Users may build customized libraries (at specific wavelengths and spectral resolution) for their own instruments using the library software. We are currently extending spectral coverage to 150 um. The libraries (original and convolved) will be made available in the future on a CD-ROM.
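A minimal sketch of the "customized library at an instrument's resolution" step mentioned above, assuming NumPy is available and using a synthetic spectrum rather than actual library data: the high-resolution spectrum is convolved with a Gaussian whose FWHM matches the target instrument.

```python
# Hedged sketch of convolving a high-resolution reflectance spectrum to a
# coarser instrument resolution. Wavelength grid and spectrum are synthetic
# placeholders, not data from the U.S.G.S. library.
import numpy as np

def convolve_to_fwhm(wavelength_um, reflectance, fwhm_um):
    """Gaussian-smooth a spectrum sampled on a uniform wavelength grid."""
    step = wavelength_um[1] - wavelength_um[0]
    sigma = fwhm_um / (2.0 * np.sqrt(2.0 * np.log(2.0)))   # FWHM -> sigma
    half_width = int(np.ceil(4 * sigma / step))
    x = np.arange(-half_width, half_width + 1) * step
    kernel = np.exp(-0.5 * (x / sigma) ** 2)
    kernel /= kernel.sum()
    return np.convolve(reflectance, kernel, mode="same")

wl = np.linspace(0.2, 3.0, 2801)                          # 1 nm sampling, 0.2-3.0 um
spectrum = 0.5 + 0.3 * np.exp(-((wl - 2.2) / 0.01) ** 2)  # one synthetic feature
smoothed = convolve_to_fwhm(wl, spectrum, fwhm_um=0.010)  # 10 nm instrument FWHM
print(smoothed.shape)
```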
Library Information System Time-Sharing (LISTS) Project. Final Report.
ERIC Educational Resources Information Center
Black, Donald V.
The Library Information System Time-Sharing (LISTS) experiment was based on three innovations in data processing technology: (1) the advent of computer time-sharing on third-generation machines, (2) the development of general-purpose file-management software and (3) the introduction of large, library-oriented data bases. The main body of the…
Microcomputers in Libraries: The Quiet Revolution.
ERIC Educational Resources Information Center
Boss, Richard
1985-01-01
This article defines three separate categories of microcomputers--personal, desk-top, multi-user devices--and relates storage capabilities (expandability, floppy disks) to library applications. Highlights include de facto standards, operating systems, database management systems, applications software, circulation control systems, dumb and…
Two Years on: Koha 3.0 in Use at the CAMLIS Library, Royal London Homoeopathic Hospital
ERIC Educational Resources Information Center
Bissels, Gerhard; Chandler, Andrea
2010-01-01
Purpose: The purpose of this paper is to describe the further development of the Koha 3.0 library management system (LMS) and the involvement of external software consultants at the Complementary and Alternative Medicine Library and Information Service (CAMLIS), Royal London Homoeopathic Hospital. Design/methodology/approach: The paper takes the…
Upgrading a ColdFusion-Based Academic Medical Library Staff Intranet
ERIC Educational Resources Information Center
Vander Hart, Robert; Ingrassia, Barbara; Mayotte, Kerry; Palmer, Lisa A.; Powell, Julia
2010-01-01
This article details the process of upgrading and expanding an existing academic medical library intranet to include a wiki, blog, discussion forum, and photo collection manager. The first version of the library's intranet from early 2002 was powered by ColdFusion software and existed primarily to allow staff members to author and store minutes of…
ERIC Educational Resources Information Center
Jackson, Mary E.
2002-01-01
Explains portals as tools that gather a variety of electronic information resources, including local library resources, into a single Web page. Highlights include cross-database searching; integration with university portals and course management software; the ARL (Association of Research Libraries) Scholars Portal Initiative; and selected vendors…
The Virginia Tech Library System (VTLS).
ERIC Educational Resources Information Center
McGrath, Deborah Hall; Lee, Carl R.
1989-01-01
Discusses topics relating to the Virginia Tech Library System: the company (VTLS, Inc.); the software; data structure; cataloging, status, and authority control; circulation; serials control and acquisitions; the online catalog; management reporting; networking; and the operating environment. Sidebars discuss the Vanilla Network; LINNEA--a network…
Improving Software Sustainability: Lessons Learned from Profiles in Science.
Gallagher, Marie E
2013-01-01
The Profiles in Science® digital library features digitized surrogates of historical items selected from the archival collections of the U.S. National Library of Medicine as well as collaborating institutions. In addition, it contains a database of descriptive, technical and administrative metadata. It also contains various software components that allow creation of the metadata, management of the digital items, and access to the items and metadata through the Profiles in Science Web site [1]. The choices made building the digital library were designed to maximize the sustainability and long-term survival of all of the components of the digital library [2]. For example, selecting standard and open digital file formats rather than proprietary formats increases the sustainability of the digital files [3]. Correspondingly, using non-proprietary software may improve the sustainability of the software--either through in-house expertise or through the open source community. Limiting our digital library software exclusively to open source software or to software developed in-house has not been feasible. For example, we have used proprietary operating systems, scanning software, a search engine, and office productivity software. We did this when either lack of essential capabilities or the cost-benefit trade-off favored using proprietary software. We also did so knowing that in the future we would need to replace or upgrade some of our proprietary software, analogous to migrating from an obsolete digital file format to a new format as the technological landscape changes. Since our digital library's start in 1998, all of its software has been upgraded or replaced, but the digitized items have not yet required migration to other formats. Technological changes that compelled us to replace proprietary software included the cost of product licensing, product support, incompatibility with other software, prohibited use due to evolving security policies, and product abandonment. Sometimes these changes happen on short notice, so we continually monitor our library's software for signs of endangerment. We have attempted to replace proprietary software with suitable in-house or open source software. When the replacement involves a standalone piece of software with a nearly equivalent version, such as replacing a commercial HTTP server with an open source HTTP server, the replacement is straightforward. Recently we replaced software that functioned not only as our search engine but also as the backbone of the architecture of our Web site. In this paper, we describe the lessons learned and the pros and cons of replacing this software with open source software.
An Overview of Public Access Computer Software Management Tools for Libraries
ERIC Educational Resources Information Center
Wayne, Richard
2004-01-01
An IT decision maker gives an overview of public access PC software that's useful in controlling session length and scheduling, Internet access, print output, security, and the latest headaches: spyware and adware. In this article, the author describes a representative sample of software tools in several important categories such as setup…
Computers in Libraries, 2000: Proceedings (15th, Washington, D.C., March 15-17, 2000).
ERIC Educational Resources Information Center
Nixon, Carol, Comp.; Burmood, Jennifer, Comp.
Topics of the Proceedings of the 15th Annual Computers in Libraries Conference (March 15-17, 2000) include: Linux and open source software in an academic library; a Master Trainer Program; what educators need to know about multimedia and copyright; how super searchers find business information online; managing print costs; new technologies in wide…
Software Library: A Reusable Software Issue.
1984-06-01
Keywords: Software Library; Program Library; Reusability; Generator. Abstract: … Software Library. A particular example of the Software Library, the Program Library, is described as a prototype of a reusable library. A hierarchical… programming libraries are described. Finally, non-code products in the Software Library are discussed.
The M68HC11 gripper controller software. Thesis
NASA Technical Reports Server (NTRS)
Tsai, Jodi Wei-Duk
1991-01-01
This thesis discusses the development of firmware for the 68HC11 gripper controller. A general description of the software and hardware interfaces is given. The C library interface for the gripper is then described and followed by a detailed discussion of the software architecture of the firmware. A procedure to assemble and download 68HC11 programs is presented in the form of a tutorial. The tools used to implement this environment are then described. Finally, the implementation of the configuration management scheme used to manage all CIRSSE software is presented.
The Houston Academy of Medicine--Texas Medical Center Library management information system.
Camille, D; Chadha, S; Lyders, R A
1993-01-01
A management information system (MIS) provides a means for collecting, reporting, and analyzing data from all segments of an organization. Such systems are common in business but rare in libraries. The Houston Academy of Medicine-Texas Medical Center Library developed an MIS that operates on a system of networked IBM PCs and Paradox, a commercial database software package. The data collected in the system include monthly reports, client profile information, and data collected at the time of service requests. The MIS assists with enforcement of library policies, ensures that correct information is recorded, and provides reports for library managers. It also can be used to help answer a variety of ad hoc questions. Future plans call for the development of an MIS that could be adapted to other libraries' needs, and a decision-support interface that would facilitate access to the data contained in the MIS databases. PMID:8251972
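A minimal sketch of such an MIS, using Python's built-in sqlite3 rather than the Paradox package the article describes; the schema, field names, and sample data are invented for illustration.

```python
# Hedged sketch of the kind of MIS the article describes: service requests
# recorded against client profiles, rolled up into a monthly report by
# department. Schema and data are illustrative, not the library's actual design.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE clients  (client_id INTEGER PRIMARY KEY, name TEXT, department TEXT);
CREATE TABLE requests (request_id INTEGER PRIMARY KEY, client_id INTEGER,
                       service TEXT, request_date TEXT,
                       FOREIGN KEY (client_id) REFERENCES clients(client_id));
""")
con.executemany("INSERT INTO clients VALUES (?, ?, ?)",
                [(1, "A. Smith", "Cardiology"), (2, "B. Jones", "Pathology")])
con.executemany("INSERT INTO requests VALUES (?, ?, ?, ?)",
                [(1, 1, "literature search", "1993-01-12"),
                 (2, 1, "interlibrary loan", "1993-01-20"),
                 (3, 2, "literature search", "1993-02-02")])

# Monthly report: requests per department per month.
for row in con.execute("""
    SELECT strftime('%Y-%m', request_date) AS month, c.department, COUNT(*)
    FROM requests r JOIN clients c ON c.client_id = r.client_id
    GROUP BY month, c.department ORDER BY month"""):
    print(row)
```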
Towards a high performance geometry library for particle-detector simulations
Apostolakis, J.; Bandieramonte, M.; Bitzes, G.; ...
2015-05-22
Thread-parallelization and single-instruction multiple data (SIMD) "vectorisation" of software components in HEP computing has become a necessity to fully benefit from current and future computing hardware. In this context, the Geant-Vector/GPU simulation project aims to re-engineer current software for the simulation of the passage of particles through detectors in order to increase the overall event throughput. As one of the core modules in this area, the geometry library plays a central role and vectorising its algorithms will be one of the cornerstones towards achieving good CPU performance. Here, we report on the progress made in vectorising the shape primitives, as well as in applying new C++ template based optimizations of existing code available in the Geant4, ROOT or USolids geometry libraries. We will focus on a presentation of our software development approach that aims to provide optimized code for all use cases of the library (e.g., single particle and many-particle APIs) and to support different architectures (CPU and GPU) while keeping the code base small, manageable and maintainable. We report on a generic and templated C++ geometry library as a continuation of the AIDA USolids project. As a result, the experience gained with these developments will be beneficial to other parts of the simulation software, such as for the optimization of the physics library, and possibly to other parts of the experiment software stack, such as reconstruction and analysis.
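To make the single-particle versus many-particle API distinction concrete, here is a hedged NumPy sketch (not the C++ library's code) using a sphere's safety distance as a stand-in shape primitive; the bundle version processes all particles in one vectorised call, which is the effect the SIMD work aims for.

```python
# Hedged sketch of single-particle vs. many-particle shape APIs, using the
# distance to the surface of an origin-centred sphere as the primitive.
import numpy as np

def safety_to_sphere_single(point, radius):
    """Scalar API: one 3-vector in, one distance out."""
    return abs(np.linalg.norm(point) - radius)

def safety_to_sphere_many(points, radius):
    """Bundle API: (N, 3) array in, (N,) distances out, no Python loop."""
    return np.abs(np.linalg.norm(points, axis=1) - radius)

pts = np.random.default_rng(0).uniform(-2.0, 2.0, size=(1000, 3))
single = np.array([safety_to_sphere_single(p, 1.0) for p in pts])
many = safety_to_sphere_many(pts, 1.0)
assert np.allclose(single, many)
```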
ERIC Educational Resources Information Center
Cummins, Caroline
2006-01-01
In this article, the author discusses the benefits offered by integrated library systems (ILS) for making more informed decisions. Library software vendors, realizing ILS products can reveal business intelligence, have begun to offer tools like Director's Station to help library managers get more out of their data, and librarians are taking…
Administrative Issues in Planning a Library End User Searching Program. ERIC Digest.
ERIC Educational Resources Information Center
Machovec, George S.
This digest presents a reprint of an article which examines management principles that should be considered when implementing library end user searching programs. A brief discussion of specific implementation issues includes needs assessment, hardware, software, training, budgeting, what systems to offer, publicity and marketing, policies and…
NASA Technical Reports Server (NTRS)
Bickham, Grandin; Saile, Lynn; Havelka, Jacque; Fitts, Mary
2011-01-01
Introduction: Johnson Space Center (JSC) offers two extensive libraries that contain journals, research literature and electronic resources. Searching capabilities are available to those individuals residing onsite or through a librarian's search. Many individuals have rich collections of references, but no mechanisms to share reference libraries across researchers, projects, or directorates exist. Likewise, information regarding which references are provided to which individuals is not available, resulting in duplicate requests, redundant labor costs and associated copying fees. In addition, this tends to limit collaboration between colleagues and promotes the establishment of individual, unshared silos of information. The Integrated Medical Model (IMM) team has utilized a centralized reference management tool during the development, test, and operational phases of this project. The Enterprise Reference Library project expands the capabilities developed for IMM to address the above issues and enhance collaboration across JSC. Method: After significant market analysis for a multi-user reference management tool, no available commercial tool was found to meet this need, so a software program was built around a commercial tool, Reference Manager 12 by The Thomson Corporation. A use case approach guided the requirements development phase. The premise of the design is that individuals use their own reference management software and export to SharePoint when their library is incorporated into the Enterprise Reference Library. This results in a searchable user-specific library application. An accompanying share folder will warehouse the electronic full-text articles, which allows the global user community to access full-text articles. Discussion: An enterprise reference library solution can provide a multidisciplinary collection of full-text articles. This approach improves efficiency in obtaining and storing reference material while greatly reducing labor, purchasing and duplication costs. Most importantly, increasing collaboration across research groups provides unprecedented access to information relevant to NASA's mission. Conclusion: This project is an expansion and cost-effective leveraging of the existing JSC centralized library. Adding key word and author search capabilities and an alert function for notifications about new articles, based on users' profiles, represent examples of future enhancements.
MicroUse: The Database on Microcomputer Applications in Libraries and Information Centers.
ERIC Educational Resources Information Center
Chen, Ching-chih; Wang, Xiaochu
1984-01-01
Describes MicroUse, a microcomputer-based database on microcomputer applications in libraries and information centers which was developed using relational database manager dBASE II. The description includes its system configuration, software utilized, the in-house-developed dBASE programs, multifile structure, basic functions, MicroUse records,…
What makes computational open source software libraries successful?
NASA Astrophysics Data System (ADS)
Bangerth, Wolfgang; Heister, Timo
2013-01-01
Software is the backbone of scientific computing. Yet, while we regularly publish detailed accounts about the results of scientific software, and while there is a general sense of which numerical methods work well, our community is largely unaware of best practices in writing the large-scale, open source scientific software upon which our discipline rests. This is particularly apparent in the commonly held view that writing successful software packages is largely the result of simply ‘being a good programmer’ when in fact there are many other factors involved, for example the social skill of community building. In this paper, we consider what we have found to be the necessary ingredients for successful scientific software projects and, in particular, for software libraries upon which the vast majority of scientific codes are built today. In particular, we discuss the roles of code, documentation, communities, project management and licenses. We also briefly comment on the impact on academic careers of engaging in software projects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lumsdaine, Andrew
2013-03-08
The main purpose of the Coordinated Infrastructure for Fault Tolerance in Systems initiative has been to conduct research with a goal of providing end-to-end fault tolerance on a systemwide basis for applications and other system software. While fault tolerance has been an integral part of most high-performance computing (HPC) system software developed over the past decade, it has been treated mostly as a collection of isolated stovepipes. Visibility and response to faults has typically been limited to the particular hardware and software subsystems in which they are initially observed. Little fault information is shared across subsystems, allowing little flexibility or control on a system-wide basis, making it practically impossible to provide cohesive end-to-end fault tolerance in support of scientific applications. As an example, consider faults such as communication link failures that can be seen by a network library but are not directly visible to the job scheduler, or consider faults related to node failures that can be detected by system monitoring software but are not inherently visible to the resource manager. If information about such faults could be shared by the network libraries or monitoring software, then other system software, such as a resource manager or job scheduler, could ensure that failed nodes or failed network links were excluded from further job allocations and that further diagnosis could be performed. As a founding member and one of the lead developers of the Open MPI project, our efforts over the course of this project have been focused on making Open MPI more robust to failures by supporting various fault tolerance techniques, and using fault information exchange and coordination between MPI and the HPC system software stack from the application, numeric libraries, and programming language runtime to other common system components such as jobs schedulers, resource managers, and monitoring tools.
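A conceptual sketch of the cross-subsystem fault sharing argued for above, written as plain Python rather than the actual CIFTS interfaces: a monitor publishes a node-failure event and a subscribed resource manager excludes the node from later allocations. All class and event names are invented for illustration.

```python
# Hedged sketch of fault-information sharing between subsystems. Not the
# CIFTS/FTB API; a minimal publish/subscribe illustration of the idea.
from collections import defaultdict

class FaultBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self._subscribers[event_type].append(handler)

    def publish(self, event_type, **details):
        for handler in self._subscribers[event_type]:
            handler(details)

class ResourceManager:
    def __init__(self):
        self.excluded_nodes = set()

    def on_node_failure(self, details):
        self.excluded_nodes.add(details["node"])

    def allocate(self, candidate_nodes):
        return [n for n in candidate_nodes if n not in self.excluded_nodes]

bus = FaultBus()
rm = ResourceManager()
bus.subscribe("node_failure", rm.on_node_failure)

bus.publish("node_failure", node="n0042", source="system_monitor")
print(rm.allocate(["n0041", "n0042", "n0043"]))   # n0042 is excluded
```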
A verification library for multibody simulation software
NASA Technical Reports Server (NTRS)
Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.
1989-01-01
A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC Robot arm and CASE backhoe validation and a comparative study of DADS, DISCOS, and CONTOPS, which are existing public domain and commercial multibody dynamic simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.
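A hedged sketch of the cross-check such a verification library automates, with synthetic arrays standing in for DADS/DISCOS/CONTOPS output and NumPy assumed available.

```python
# Hedged sketch: run the same representative problem through several simulators
# and flag any trajectory that drifts from the stored reference beyond a
# tolerance. The arrays here are synthetic placeholders.
import numpy as np

def cross_check(reference, results, rtol=1e-3, atol=1e-6):
    """Return {code_name: True/False} for agreement with the reference run."""
    report = {}
    for name, trajectory in results.items():
        report[name] = np.allclose(trajectory, reference, rtol=rtol, atol=atol)
    return report

t = np.linspace(0.0, 1.0, 101)
reference = np.sin(t)                      # stored validation data
results = {
    "code_A": np.sin(t) + 1e-7,            # agrees within tolerance
    "code_B": np.sin(t) * 1.01,            # 1% drift, gets flagged
}
print(cross_check(reference, results))     # {'code_A': True, 'code_B': False}
```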
The Weapons Laboratory Technical Library: Automating with ’Stilas’
1990-03-01
version of the system to LC in October 1988. A small business specializing in library automation, SIRSI was founded in 1979 by library and… computer specialists, and has a strong reputation based upon the success of their UNIX-based Unicorn Collection Management System. SIRSI offers a complete… system based on the Unicorn and BRS/Search systems. The contracted STILAS package includes UNISYS hardware, software written in the C language…
Creation of a Book Order Management System Using a Microcomputer and a DBMS.
ERIC Educational Resources Information Center
Neill, Charlotte; And Others
1985-01-01
Describes management decisions and resultant technology-based system that allowed a medical library to meet increasing workloads without accompanying increases in resources available. Discussion covers system analysis; capabilities of book-order management system, "BOOKDIRT;" software and training; hardware; data files; data entry;…
Software for Studying and Enhancing Educational Uses of Geospatial Semantics and Data
ERIC Educational Resources Information Center
Nodenot, Thierry; Sallaberry, Christian; Gaio, Mauro
2010-01-01
Geographically related queries form nearly one-fifth of all queries submitted to the Excite search engine and the most frequently occurring terms are names of places. This paper focuses on digital libraries and extends the basic services of existing library management systems to include new ones that are dedicated to geographic information…
Performance testing of 3D point cloud software
NASA Astrophysics Data System (ADS)
Varela-González, M.; González-Jorge, H.; Riveiro, B.; Arias, P.
2013-10-01
LiDAR systems have been used widely in recent years for many applications in the engineering field: civil engineering, cultural heritage, mining, industry and environmental engineering. One of the most important limitations of this technology is the large computational requirement involved in data processing, especially for large mobile LiDAR datasets. Several software solutions for data management are available in the market, including open source suites; however, users often lack methodologies to verify their performance properly. In this work a methodology for LiDAR software performance testing is presented and four different suites are studied: QT Modeler, VR Mesh, AutoCAD 3D Civil and the Point Cloud Library running in software developed at the University of Vigo (SITEGI). The software based on the Point Cloud Library shows better results in the loading time of the point clouds and CPU usage. However, it is not as strong as commercial suites in working set and commit size tests.
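A minimal sketch of a loading-time and memory measurement harness in the spirit of the tests described (loading time, working set), using only the Python standard library; measuring compiled commercial suites would of course require OS-level counters instead.

```python
# Hedged sketch: time an arbitrary loader callable and report peak
# Python-level allocations via tracemalloc. The loader below is a stand-in
# for reading a real LiDAR file.
import time
import tracemalloc

def measure_load(loader, *args, **kwargs):
    tracemalloc.start()
    t0 = time.perf_counter()
    result = loader(*args, **kwargs)
    elapsed = time.perf_counter() - t0
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return result, elapsed, peak

def fake_point_cloud_loader(n_points):
    """Stand-in for a point cloud reader: builds an n_points x 3 list."""
    return [(float(i), float(i) + 0.5, 0.0) for i in range(n_points)]

cloud, seconds, peak_bytes = measure_load(fake_point_cloud_loader, 200_000)
print(f"{len(cloud)} points in {seconds:.2f} s, peak ~{peak_bytes / 1e6:.1f} MB")
```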
Serials Evaluation: An Innovative Approach.
ERIC Educational Resources Information Center
Berger, Marilyn; Devine, Jane
1990-01-01
Describes a method of analyzing serials collections in special libraries that combines evaluative criteria with database management technology. Choice of computer software is discussed, qualitative information used to evaluate subject coverage is examined, and quantitative and descriptive data that can be used for collection management are…
Streamlining the Process of Acquiring Secure Open Architecture Software Systems
2013-10-08
Microsoft.NET, Enterprise Java Beans, GNU Lesser General Public License (LGPL) libraries, and data communication protocols like the Hypertext Transfer… NetBeans development environments), customer relationship management (SugarCRM), database management systems (PostgreSQL, MySQL), operating…
The Software Jungle: To Guide or Not To Guide.
ERIC Educational Resources Information Center
Stigleman, Sue E.
This report describes a project which evaluated five IBM and Macintosh bibliographic formatting software programs--ProCite, Sci-Mate, Reference Manager, Notebook II, and Bibliography--and which was conducted by the Health Sciences Library at the University of North Carolina at Chapel Hill at the request of the campus microcomputer support center.…
A UNIMARC Bibliographic Format Database for ABCD
ERIC Educational Resources Information Center
Megnigbeto, Eustache
2012-01-01
Purpose: ABCD is a web-based open and free software suite for library management derived from the UNESCO CDS/ISIS software technology. The first version was launched officially in December 2009 with a MARC 21 bibliographic format database. This paper aims to detail the building of the UNIMARC bibliographic format database for ABCD.…
Managing Digital Archives Using Open Source Software Tools
NASA Astrophysics Data System (ADS)
Barve, S.; Dongare, S.
2007-10-01
This paper describes the use of open source software tools such as MySQL and PHP for creating database-backed websites. Such websites offer many advantages over ones built from static HTML pages. The paper discusses how OSS tools are used and their benefits, and how, after the successful implementation of these tools, the library took the initiative to implement an institutional repository using the DSpace open source software.
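As a hedged illustration of the database-backed page idea (using Python's built-in sqlite3 in place of the MySQL/PHP stack the paper actually describes), the page body below is generated from table rows rather than stored as static HTML.

```python
# Hedged sketch: generate an HTML listing from database rows, the core of a
# database-backed website. Table name and sample rows are invented.
import sqlite3
from html import escape

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE new_titles (title TEXT, author TEXT)")
con.executemany("INSERT INTO new_titles VALUES (?, ?)",
                [("Open Source in Libraries", "A. Author"),
                 ("Managing Digital Archives", "B. Writer")])

rows = con.execute("SELECT title, author FROM new_titles ORDER BY title").fetchall()
items = "\n".join(f"  <li>{escape(t)} ({escape(a)})</li>" for t, a in rows)
page = f"<html><body><h1>New titles</h1>\n<ul>\n{items}\n</ul></body></html>"
print(page)
```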
The Internet Knowledge Manager, Dynamic Digital Libraries, and Agents You Can Understand.
ERIC Educational Resources Information Center
Walker, Adrian
1998-01-01
Discusses the Internet Knowledge Manager (IKM) which provides an understandable way of representing knowledge, as readable software agents. Gives an example of writing and running an IKM agent for transfer pricing in corporations. Describes how the technology works. Concludes that the IKM could trigger new ways of performing knowledge management,…
Specification of Fenix MPI Fault Tolerance library version 1.0.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamble, Marc; Van Der Wijngaart, Rob; Teranishi, Keita
This document provides a specification of Fenix, a software library compatible with the Message Passing Interface (MPI) to support fault recovery without application shutdown. The library consists of two modules. The first, termed process recovery, restores an application to a consistent state after it has suffered a loss of one or more MPI processes (ranks). The second specifies functions the user can invoke to store application data in Fenix-managed redundant storage, and to retrieve it from that storage after process recovery.
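The following is a conceptual Python sketch of the two-module pattern described in the specification, not the Fenix/MPI API itself: application data is stored redundantly on a buddy rank and pulled back from a surviving copy after a rank is lost. The buddy scheme and class names are assumptions made for illustration.

```python
# Hedged conceptual sketch of redundant storage plus recovery, written as a
# plain in-process simulation rather than actual Fenix or MPI calls.
class RedundantStore:
    def __init__(self, n_ranks):
        self.n_ranks = n_ranks
        self.copies = {r: {} for r in range(n_ranks)}   # rank -> {owner: data}

    def store(self, owner_rank, data):
        buddy = (owner_rank + 1) % self.n_ranks          # simple buddy scheme
        self.copies[owner_rank][owner_rank] = data
        self.copies[buddy][owner_rank] = data

    def recover(self, lost_rank):
        """Process recovery: pull the lost rank's data from any surviving copy."""
        for holder, held in self.copies.items():
            if holder != lost_rank and lost_rank in held:
                return held[lost_rank]
        raise KeyError(f"no surviving copy for rank {lost_rank}")

store = RedundantStore(n_ranks=4)
store.store(2, {"iteration": 118, "local_field": [0.1, 0.2]})
store.copies[2].clear()                 # simulate losing rank 2's local state
print(store.recover(2))                 # restored from the buddy's copy
```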
ERIC Educational Resources Information Center
Bernstein, Allison; Baule, Steve; Farmer, Lesley S. J.; Anderson, Cyndee; Barringer, Crystal; Buchanan, Peggy; Fullner, Sheryl Kindle; Knop, Kathi; Larson, Chris
2002-01-01
Includes eight articles that address issues related to budgeting and finances in secondary school libraries. Topics include limited budgets; budget proposals; administering grants; writing grants; using Microsoft's Excel software for budgeting; using credit cards for library purchases; offering prizes for donating books; and offering coffee and…
ERIC Educational Resources Information Center
Miley, David W.
Many reference librarians still rely on manual searches to access vertical files, ready reference files, and other information stored in card files, drawers, and notebooks scattered around the reference department. Automated access to these materials via microcomputers using database management software may speed up the process. This study focuses…
Information Technology and the Evolution of the Library
2009-03-01
Resource Commons/ Repository/ Federated Search; ILS (GLADIS/Pathfinder - Millenium)/ Catalog/ Circulation/ Acquisitions/ Digital Object Content… content management services to help centralize and distribute digital content from across the institution, software to allow for seamless federated searching across multiple databases, and imaging software to allow for daily reimaging of terminals to reduce security concerns that otherwise…
NASA software documentation standard software engineering program
NASA Technical Reports Server (NTRS)
1991-01-01
The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.
Debugging and Logging Services for Defence Service Oriented Architectures
2012-02-01
Service: A software component and callable end point that provides a logically related set of operations, each of which performs a logical step in a... important to note that in some cases, when the fault is identified to lie in uneditable code such as program libraries or outsourced software services... debugging is limited to characterisation of the fault, reporting it to the software or service provider, and development of work-arounds and management
Integrating an Information Literacy Quiz into the Learning Management System
ERIC Educational Resources Information Center
Lowe, M. Sara; Booth, Char; Tagge, Natalie; Stone, Sean
2014-01-01
The Claremont Colleges Library Instruction Services Department developed a quiz that could be integrated into the consortial learning management software to accompany a local online, open-source information literacy tutorial. The quiz is integrated into individual course pages, allowing students to receive a grade for completion and improving…
Characterizing and Modeling the Cost of Rework in a Library of Reusable Software Components
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Condon, Steven E.; ElEmam, Khaled; Hendrick, Robert B.; Melo, Walcelio
1997-01-01
In this paper we characterize and model the cost of rework in a Component Factory (CF) organization. A CF is responsible for developing and packaging reusable software components. Data was collected on corrective maintenance activities for the Generalized Support Software reuse asset library located at the Flight Dynamics Division of NASA's GSFC. We then constructed a predictive model of the cost of rework using the C4.5 system for generating a logical classification model. The predictor variables for the model are measures of internal software product attributes. The model demonstrates good prediction accuracy, and can be used by managers to allocate resources for corrective maintenance activities. Furthermore, we used the model to generate proscriptive coding guidelines to improve programming practices so that the cost of rework can be reduced in the future. The general approach we have used is applicable to other environments.
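A hedged illustration of the modeling step described above, using scikit-learn's CART-style decision tree as a stand-in for C4.5; the metric names and toy data are invented for the example.

```python
# Illustrative stand-in for the approach described above: train a decision-tree
# classifier to predict high- vs. low-cost rework from internal product metrics.
# The paper used C4.5; scikit-learn's CART-style tree is used here instead, and
# the metric names and toy data below are invented for illustration only.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Columns: module size (SLOC), cyclomatic complexity, number of prior changes
X = np.array([
    [120, 4, 1], [300, 12, 5], [80, 3, 0], [450, 20, 7],
    [200, 8, 2], [500, 25, 9], [150, 5, 1], [350, 15, 6],
])
y = np.array([0, 1, 0, 1, 0, 1, 0, 1])  # 1 = high rework cost, 0 = low

model = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(model, feature_names=["sloc", "complexity", "prior_changes"]))
print(model.predict([[260, 10, 3]]))  # classify a new component
```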
Perez-Riverol, Yasset; Wang, Rui; Hermjakob, Henning; Müller, Markus; Vesada, Vladimir; Vizcaíno, Juan Antonio
2014-01-01
Data processing, management and visualization are central and critical components of a state-of-the-art high-throughput mass spectrometry (MS)-based proteomics experiment, and are often some of the most time-consuming steps, especially for labs without much bioinformatics support. The growing interest in the field of proteomics has triggered an increase in the development of new software libraries, including freely available and open-source software. Although the objectives of these libraries and packages, which span database search analysis through post-processing of the identification results, can vary significantly, they usually share a number of features. Common use cases include the handling of protein and peptide sequences, the parsing of output files from various proteomics search engines, and the visualization of MS-related information (including mass spectra and chromatograms). In this review, we provide an overview of the existing software libraries and open-source frameworks, and we also give information on some of the freely available applications that make use of them. This article is part of a Special Issue entitled: Computational Proteomics in the Post-Identification Era. Guest Editors: Martin Eisenacher and Christian Stephan. PMID:23467006
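One of the shared use cases named above, handling peptide sequences, can be illustrated in a few lines of Python; the sketch below computes an approximate monoisotopic peptide mass from standard residue masses (only a handful of residues are included) and does not rely on any particular proteomics library's API.

```python
# Minimal example of one shared use case mentioned above: handling peptide
# sequences. Computes an approximate monoisotopic peptide mass from residue
# masses (only a handful of residues shown); no particular proteomics library's
# API is implied here.
MONOISOTOPIC = {          # residue masses in Da (standard values, abbreviated table)
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "V": 99.06841,
    "L": 113.08406, "K": 128.09496, "E": 129.04259, "F": 147.06841,
    "R": 156.10111,
}
WATER = 18.01056          # mass of H2O added for the peptide termini

def peptide_mass(sequence: str) -> float:
    """Sum residue masses plus one water for an unmodified linear peptide."""
    return sum(MONOISOTOPIC[aa] for aa in sequence) + WATER

if __name__ == "__main__":
    print(round(peptide_mass("GASEK"), 4))
```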
Automation and hypermedia technology applications
NASA Technical Reports Server (NTRS)
Jupin, Joseph H.; Ng, Edward W.; James, Mark L.
1993-01-01
This paper represents a progress report on HyLite (Hypermedia Library technology): a research and development activity to produce a versatile system as part of NASA's technology thrusts in automation, information sciences, and communications. HyLite can be used as a system or tool to facilitate the creation and maintenance of large distributed electronic libraries. The contents of such a library may be software components, hardware parts or designs, scientific data sets or databases, configuration management information, etc. Proliferation of computer use has made the diversity and quantity of information too large for any single user to sort, process, and utilize effectively. In response to this information deluge, we have created HyLite to enable the user to process relevant information into a more efficient organization for presentation, retrieval, and readability. To accomplish this end, we have incorporated various AI techniques into the HyLite hypermedia engine to facilitate parameters and properties of the system. The proposed techniques include intelligent searching tools for the libraries, intelligent retrievals, and navigational assistance based on user histories. HyLite itself is based on an earlier project, the Encyclopedia of Software Components (ESC) which used hypermedia to facilitate and encourage software reuse.
Library and Classroom Use of Copyrighted Videotapes and Computer Software.
ERIC Educational Resources Information Center
Reed, Mary Hutchings; Stanek, Debra
1986-01-01
This pullout guide addresses issues regarding library use of copyrighted videotapes and computer software. Guidelines for videotapes cover in-classroom use, in-library use in a public library, loan, and duplication. Guidelines for computer software cover purchase conditions, avoiding license restrictions, loaning, archival copies, and in-library and…
The US Geological Survey, digital spectral reflectance library: version 1: 0.2 to 3.0 microns
NASA Technical Reports Server (NTRS)
Clark, Roger N.; Swayze, Gregg A.; King, Trude V. V.; Gallagher, Andrea J.; Calvin, Wendy M.
1993-01-01
We have developed a digital reflectance spectral library, with management and spectral analysis software. The library includes 500 spectra of 447 samples (some samples include a series of grain sizes) measured from approximately 0.2 to 3.0 microns. The spectral resolution (Full Width Half Maximum) of the reflectance data is less than or equal to 4 nm in the visible (0.2-0.8 microns) and less than or equal to 10 nm in the NIR (0.8-2.35 microns). All spectra were corrected to absolute reflectance using an NBS Halon standard. Library management software lets users search on parameters (e.g. chemical formulae, chemical analyses, purity of samples, mineral groups, etc.) as well as spectral features. Minerals from sulfide, oxide, hydroxide, halide, carbonate, nitrate, borate, phosphate, and silicate groups are represented. X-ray and chemical analyses are tabulated for many of the entries, and all samples have been evaluated for spectral purity. The library also contains end and intermediate members for the olivine, garnet, scapolite, montmorillonite, muscovite, jarosite, and alunite solid-solution series. We have included representative spectra of H2O ice, kerogen, ammonium-bearing minerals, rare-earth oxides, desert varnish coatings, kaolinite crystallinity series, kaolinite-smectite series, zeolite series, and an extensive evaporite series. Because of the importance of vegetation to climate-change studies, we have included 17 spectra of tree leaves, bushes, and grasses.
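A small sketch of the kind of combined parameter-and-feature query the library management software supports, assuming a hypothetical record layout; this is illustrative Python, not the USGS software.

```python
# Sketch of the kind of search described above: filter library entries by a
# sample parameter (mineral group) and by a spectral feature (an absorption
# minimum near a target wavelength). The record layout and entries below are
# hypothetical; this is not the USGS software.
def has_feature(wavelengths_um, reflectance, center_um, tol_um=0.05):
    """True if the reflectance minimum in [center-tol, center+tol] sits strictly
    inside the window (a crude local-minimum test)."""
    window = [(w, r) for w, r in zip(wavelengths_um, reflectance)
              if center_um - tol_um <= w <= center_um + tol_um]
    if len(window) < 3:
        return False
    w_min, _ = min(window, key=lambda wr: wr[1])
    return window[0][0] < w_min < window[-1][0]

library = [
    {"name": "sample_A", "group": "carbonate",
     "wl": [2.25, 2.30, 2.33, 2.36, 2.40], "refl": [0.80, 0.55, 0.35, 0.60, 0.82]},
    {"name": "sample_B", "group": "silicate",
     "wl": [2.25, 2.30, 2.33, 2.36, 2.40], "refl": [0.70, 0.71, 0.72, 0.73, 0.74]},
]

hits = [rec["name"] for rec in library
        if rec["group"] == "carbonate" and has_feature(rec["wl"], rec["refl"], 2.33)]
print(hits)   # -> ['sample_A']
```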
Automating the Presentation of Computer Networks
2006-12-01
software to overlay operational state information. Other network management tools like Computer Associates Unicenter [6,7] generate internal network... and required manual placement assistance. A number of software libraries [20] offer a wealth of automatic layout algorithms and presentation... FX010857971033.aspx [2] Microsoft (2005) Visio 2003 Product Demo, http://www.microsoft.com/office/visio/prodinfo/demo.mspx [3] Smartdraw (2005) Network
Crawford, S; Halbrook, B
1990-01-01
In an era of great technological and socioeconomic changes, the Washington University School of Medicine conceptualized and built its first Library and Biomedical Communications Center in seventy-eight years. The planning process, evolution of the electronic library, and translation of functions into operating spaces are discussed. Since 1983, when the project was approved, a whole range of information technologies and services has emerged. The authors consider the kind of library that would operate in a setting where people can do their own searches, order data and materials through an electronic network, analyze and manage information, and use software to create their own publications. PMID:2393757
User Interface Technology Transfer to NASA's Virtual Wind Tunnel System
NASA Technical Reports Server (NTRS)
vanDam, Andries
1998-01-01
Funded by NASA grants for four years, the Brown Computer Graphics Group has developed novel 3D user interfaces for desktop and immersive scientific visualization applications. This past grant period supported the design and development of a software library, the 3D Widget Library, which supports the construction and run-time management of 3D widgets. The 3D Widget Library is a mechanism for transferring user interface technology from the Brown Graphics Group to the Virtual Wind Tunnel system at NASA Ames as well as the public domain.
Chapelle, D; Fragu, M; Mallet, V; Moireau, P
2013-11-01
We present the fundamental principles of data assimilation underlying the Verdandi library, and how they are articulated with the modular architecture of the library. This translates, in particular, into the definition of standardized interfaces through which the data assimilation library interoperates with the model simulation software and the so-called observation manager. We also survey various examples of data assimilation applied to the personalization of biophysical models, in particular for cardiac modeling applications within the euHeart European project. This illustrates the power of data assimilation concepts in such novel applications, with tremendous potential in clinical diagnosis assistance.
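The standardized-interface idea can be sketched as follows: the assimilation driver talks only to a model object (state access plus a forward step) and an observation manager. The class and method names are hypothetical rather than Verdandi's actual API, and a simple nudging update stands in for a full assimilation scheme.

```python
# Hedged sketch of the interface separation described above: the assimilation
# driver talks only to a model (state access + forward step) and an observation
# manager. Names are hypothetical, not Verdandi's API, and a simple nudging
# update stands in for a full assimilation scheme.
class ScalarDecayModel:
    """Toy model: x_{k+1} = a * x_k, with a poorly known initial state."""
    def __init__(self, x0, a=0.95):
        self.x, self.a = x0, a
    def get_state(self): return self.x
    def set_state(self, x): self.x = x
    def forward(self): self.x *= self.a

class ObservationManager:
    """Serves one (possibly noisy) observation of the state per time index."""
    def __init__(self, observations): self.observations = observations
    def get_observation(self, k): return self.observations[k]

def assimilate(model, obs_manager, n_steps, gain=0.5):
    trajectory = []
    for k in range(n_steps):
        model.forward()
        x = model.get_state()
        y = obs_manager.get_observation(k)
        model.set_state(x + gain * (y - x))   # nudge the state toward the data
        trajectory.append(model.get_state())
    return trajectory

truth = [10.0 * 0.95 ** (k + 1) for k in range(5)]
print(assimilate(ScalarDecayModel(x0=2.0), ObservationManager(truth), n_steps=5))
```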
A global snapshot of the state of digital collections in the health sciences, 2013*
Pickett, Keith M.; Knapp, Maureen M.
2014-01-01
Two hundred twenty-nine health sciences libraries (HSLs) worldwide were surveyed regarding the availability of digital collections, evidence of the type of digital collections, level of access, software used, and HSL type. Of the surveyed libraries, 69% (n = 157) had digital collections, with an average of 1,531 items in each collection; 49% (n = 112) also had institutional repositories. In most cases (n = 147), these collections were publicly available. The predominant platforms for disseminating these digital collections were CONTENTdm and library web pages. Only 50% (n = 77) of these collections were managed by the health sciences library itself. PMID:24860271
37 CFR 201.24 - Warning of copyright for software lending by nonprofit libraries.
Code of Federal Regulations, 2012 CFR
2012-07-01
... software lending by nonprofit libraries. 201.24 Section 201.24 Patents, Trademarks, and Copyrights... copyright for software lending by nonprofit libraries. (a) Definition. A Warning of Copyright for Software... States Code, as amended by the Computer Software Rental Amendments Act of 1990, Public Law 101-650. As...
37 CFR 201.24 - Warning of copyright for software lending by nonprofit libraries.
Code of Federal Regulations, 2013 CFR
2013-07-01
... software lending by nonprofit libraries. 201.24 Section 201.24 Patents, Trademarks, and Copyrights... copyright for software lending by nonprofit libraries. (a) Definition. A Warning of Copyright for Software... States Code, as amended by the Computer Software Rental Amendments Act of 1990, Public Law 101-650. As...
37 CFR 201.24 - Warning of copyright for software lending by nonprofit libraries.
Code of Federal Regulations, 2011 CFR
2011-07-01
... software lending by nonprofit libraries. 201.24 Section 201.24 Patents, Trademarks, and Copyrights... copyright for software lending by nonprofit libraries. (a) Definition. A Warning of Copyright for Software... States Code, as amended by the Computer Software Rental Amendments Act of 1990, Public Law 101-650. As...
37 CFR 201.24 - Warning of copyright for software lending by nonprofit libraries.
Code of Federal Regulations, 2010 CFR
2010-07-01
... software lending by nonprofit libraries. 201.24 Section 201.24 Patents, Trademarks, and Copyrights... copyright for software lending by nonprofit libraries. (a) Definition. A Warning of Copyright for Software... States Code, as amended by the Computer Software Rental Amendments Act of 1990, Public Law 101-650. As...
37 CFR 201.24 - Warning of copyright for software lending by nonprofit libraries.
Code of Federal Regulations, 2014 CFR
2014-07-01
... software lending by nonprofit libraries. 201.24 Section 201.24 Patents, Trademarks, and Copyrights U.S... copyright for software lending by nonprofit libraries. (a) Definition. A Warning of Copyright for Software... States Code, as amended by the Computer Software Rental Amendments Act of 1990, Public Law 101-650. As...
Bhargava, Puneet; Patel, Vatsal B; Iyer, Ramesh S; Moshiri, Mariam; Robinson, Tracy J; Lall, Chandana; Heller, Matthew T
2015-02-01
The academic portfolio has become an integral part of the promotions process. Creating and maintaining an academic portfolio in paper-based or web-based formats can be a cumbersome and time-consuming task. In this article, we describe an alternative way to efficiently organize an academic portfolio using reference manager software, and discuss some of the advantages this affords. The reference manager software Papers (Mekentosj, Amsterdam, The Netherlands) was used to create an academic portfolio. The article outlines the key steps in creating and maintaining a digital academic portfolio. Using reference manager software (Papers), we created an academic portfolio that allows the user to digitally organize clinical, teaching, and research accomplishments in an indexed library, enabling efficient updating, rapid retrieval, and easy sharing. To our knowledge, this is the first digital portfolio of its kind.
The virtual library: Coming of age
NASA Technical Reports Server (NTRS)
Hunter, Judy F.; Cotter, Gladys A.
1994-01-01
With the high-speed networking capabilities, multiple media options, and massive amounts of information that exist in electronic format today, the concept of a 'virtual' library or 'library without walls' is becoming viable. In a virtual library environment, the information processed goes beyond the traditional definition of documents to include the results of scientific and technical research and development (reports, software, data) recorded in any format or medium: electronic, audio, video, or scanned images. Network access to information must include tools to help locate information sources and navigate the networks to connect to the sources, as well as methods to extract the relevant information. Graphical User Interfaces (GUIs) that are intuitive, and navigational tools such as Intelligent Gateway Processors (IGPs), will provide users with seamless and transparent use of high-speed networks to access, organize, and manage information. Traditional libraries will become points of electronic access to information on multiple media. The emphasis will be on unique collections of information at each library rather than entire collections at every library. It is no longer a question of whether there is enough information available; it is more a question of how to manage the vast volumes of information. The future equation will involve being able to organize knowledge, manage information, and provide access at the point of origin.
NASA Technical Reports Server (NTRS)
Merwarth, P., D.
1983-01-01
The Common Software Module Repository (CSMR) is a computerized library system with high product and service visibility to potential users. The online capabilities of the system allow both the librarian and the user to interact with the library. The librarian is responsible for maintaining information in the CSMR library. The user searches the library to locate software modules that meet his or her current needs.
Mission and data operations IBM 360 user's guide
NASA Technical Reports Server (NTRS)
Balakirsky, J.
1973-01-01
The M and DO computer systems are introduced and supplemented. The hardware and software status is discussed, along with standard processors and user libraries. Data management techniques are presented, as well as machine independence, debugging facilities, and overlay considerations.
Generic, Type-Safe and Object Oriented Computer Algebra Software
NASA Astrophysics Data System (ADS)
Kredel, Heinz; Jolly, Raphael
Advances in computer science, in particular object-oriented programming, and software engineering have had little practical impact on computer algebra systems in the last 30 years. The software design of existing systems is still dominated by ad-hoc memory management, weakly typed algorithm libraries and proprietary domain-specific interactive expression interpreters. We discuss a modular approach to computer algebra software: usage of state-of-the-art memory management and run-time systems (e.g. the JVM); usage of strongly typed, generic, object-oriented programming languages (e.g. Java); and usage of general-purpose, dynamic interactive expression interpreters (e.g. Python). To illustrate the workability of this approach, we have implemented and studied computer algebra systems in Java and Scala. In this paper we report on the current state of this work by presenting new examples.
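As a loose analogy to the strongly typed, generic ingredient of this approach (the systems discussed are written in Java and Scala), the sketch below writes one algorithm against a ring-like protocol in Python and reuses it for integer and exact rational coefficients.

```python
# Small analogy (in Python, not Java/Scala) to the "strongly typed, generic"
# ingredient discussed above: an algorithm written once against a ring-like
# protocol and reused for ints and exact rationals alike.
from fractions import Fraction
from typing import List, Protocol, TypeVar

class RingElement(Protocol):
    def __add__(self, other): ...
    def __mul__(self, other): ...

T = TypeVar("T", bound=RingElement)

def poly_add(p: List[T], q: List[T]) -> List[T]:
    """Coefficient-wise sum of two dense polynomials (lowest degree first)."""
    if len(p) < len(q):
        p, q = q, p
    out = list(p)
    for i, c in enumerate(q):
        out[i] = out[i] + c
    return out

print(poly_add([1, 2, 3], [4, 5]))                                   # integer coefficients
print(poly_add([Fraction(1, 2), Fraction(1, 3)], [Fraction(1, 6)]))  # rational coefficients
```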
ERIC Educational Resources Information Center
Salem, Jamie; Fehrmann, Paul
2013-01-01
With the growing population of undergraduate students on our campus and an increased focus on their success, librarians at a large midwestern university were interested in the citation management styles of this university cohort. Our university library spends considerable resources each year to maintain and promote access to the robust…
Application Reuse Library for Software, Requirements, and Guidelines
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Thronesbery, Carroll
1994-01-01
Better designs are needed for expert systems and other operations automation software, for more reliable, usable and effective human support. A prototype computer-aided Application Reuse Library shows feasibility of supporting concurrent development and improvement of advanced software by users, analysts, software developers, and human-computer interaction experts. Such a library expedites development of quality software, by providing working, documented examples, which support understanding, modification and reuse of requirements as well as code. It explicitly documents and implicitly embodies design guidelines, standards and conventions. The Application Reuse Library provides application modules with Demo-and-Tester elements. Developers and users can evaluate applicability of a library module and test modifications, by running it interactively. Sub-modules provide application code and displays and controls. The library supports software modification and reuse, by providing alternative versions of application and display functionality. Information about human support and display requirements is provided, so that modifications will conform to guidelines. The library supports entry of new application modules from developers throughout an organization. Example library modules include a timer, some buttons and special fonts, and a real-time data interface program. The library prototype is implemented in the object-oriented G2 environment for developing real-time expert systems.
Computer Software: Copyright and Licensing Considerations for Schools and Libraries. ERIC Digest.
ERIC Educational Resources Information Center
Reed, Mary Hutchings
This digest notes that the terms and conditions of computer software package license agreements control the use of software in schools and libraries, and examines the implications of computer software license agreements for classroom use and for library lending policies. Guidelines are provided for interpreting the Copyright Act, and insuring the…
Megasite Management Tool (mmt): a Decision Support System Built Using Mapwindow Activex Control
NASA Astrophysics Data System (ADS)
Pulsani, B. R.
2017-11-01
Megasite Management Tool (MMT) is planning and evaluation software for contaminated sites. Using different statistical modules, MMT produces maps which help decision makers in rehabilitating contaminated sites. The input data used by MMT is geographic in nature and exists in shapefile and raster formats. As MMT is built as a simple Windows Forms application, the objective of the study was to find a way to visualize geographic data and to allow the user to edit its attribute information. Therefore, the application requirement was to find GIS libraries which offer capabilities such as (1) a map viewer with navigation tools, (2) a library to read/write geographic data, and (3) software which allows free distribution of the developed components. Research on these requirements led to the discovery of the MapWindow ActiveX components, which not only offered these capabilities but also provided free and open-source licensing options for redistribution. Although a considerable amount of reports and publications exist on MMT, the major contribution provided by the MapWindow libraries has been underplayed. The current study emphasises the contribution and advantages MapWindow ActiveX provides for incorporating GIS functionality into an already existing application. Similar components for other languages have also been reviewed.
Coordinated Fault-Tolerance for High-Performance Computing Final Project Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Panda, Dhabaleswar Kumar; Beckman, Pete
2011-07-28
With the Coordinated Infrastructure for Fault Tolerance Systems (CIFTS, as the original project came to be called) project, our aim has been to understand and tackle the following broad research questions, the answers to which will help the HEC community analyze and shape the direction of research in the field of fault tolerance and resiliency on future high-end leadership systems. Will availability of global fault information, obtained by fault information exchange between the different HEC software on a system, allow individual system software to better detect, diagnose, and adaptively respond to faults? If fault awareness is raised throughout the system through fault information exchange, is it possible to get all system software working together to provide more comprehensive end-to-end fault management on the system? What are the missing fault-tolerance features that widely used HEC system software lacks today that would inhibit such software from taking advantage of systemwide global fault information? What are the practical limitations of a systemwide approach for end-to-end fault management based on fault awareness and coordination? What mechanisms, tools, and technologies are needed to bring about fault awareness and coordination of responses on a leadership-class system? What standards, outreach, and community interaction are needed for adoption of the concept of fault awareness and coordination for fault management on future systems? Keeping our overall objectives in mind, the CIFTS team has taken a parallel fourfold approach. Our central goal was to design and implement a light-weight, scalable infrastructure with a simple, standardized interface to allow communication of fault-related information through the system and facilitate coordinated responses. This work led to the development of the Fault Tolerance Backplane (FTB) publish-subscribe API specification, together with a reference implementation and several experimental implementations on top of existing publish-subscribe tools. We enhanced the intrinsic fault tolerance capabilities of representative implementations of a variety of key HPC software subsystems and integrated them with the FTB. Targeted software subsystems included MPI communication libraries, checkpoint/restart libraries, resource managers and job schedulers, and system monitoring tools. Leveraging the aforementioned infrastructure, as well as developing and utilizing additional tools, we have examined issues associated with expanded, end-to-end fault response from both system and application viewpoints. From the standpoint of system operations, we have investigated log and root cause analysis, anomaly detection and fault prediction, and generalized notification mechanisms. Our applications work has included libraries for fault-tolerant linear algebra, application frameworks for coupled multiphysics applications, and external frameworks to support monitoring and response for general applications. Our final goal was to engage the high-end computing community to increase awareness of tools and issues around coordinated end-to-end fault management.
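The publish-subscribe coordination idea behind the FTB can be sketched with an in-memory event bus; the event names, fields, and subscribers below are invented, and this is not the FTB API.

```python
# In-memory sketch of the publish/subscribe coordination idea behind the FTB
# described above. The event fields and class names are illustrative; this is
# not the FTB API.
from collections import defaultdict

class FaultBackplane:
    def __init__(self):
        self._subscribers = defaultdict(list)   # event_type -> callbacks

    def subscribe(self, event_type, callback):
        self._subscribers[event_type].append(callback)

    def publish(self, event_type, **details):
        for callback in self._subscribers[event_type]:
            callback(event_type, details)

bus = FaultBackplane()
# A job scheduler reacts to node failures reported by the monitoring layer...
bus.subscribe("node.failure", lambda t, d: print(f"scheduler: drain node {d['node']}"))
# ...and an MPI library reacts to the same event by triggering recovery.
bus.subscribe("node.failure", lambda t, d: print(f"mpi: rebuild communicator without {d['node']}"))

bus.publish("node.failure", node="n0042", reason="ECC errors")
```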
2010-12-01
PSP and TSP books by Watts Humphrey or in the TSP-MT (multi-team) process extension. A few additional items should be created, e.g., see OPD-2... Institute, Carnegie Mellon University, 2000. www.sei.cmu.edu/library/abstracts/reports/00tr023.cfm [Humphrey 2005] Humphrey, Watts S. PSP: A Self... [Humphrey 2006] Humphrey, Watts S. TSP: Coaching Development Teams. Addison Wesley, 2006 (ISBN 978-0201731132). www.sei.cmu.edu/library/abstracts/
Library reuse in a rapid development environment
NASA Technical Reports Server (NTRS)
Uhde, JO; Weed, Daniel; Gottlieb, Robert; Neal, Douglas
1995-01-01
The Aeroscience and Flight Mechanics Division (AFMD) established a Rapid Development Laboratory (RDL) to investigate and improve new 'rapid development' software production processes and refine the use of commercial, off-the-shelf (COTS) tools. These tools and processes take an avionics design project from initial inception through high-fidelity, real-time, hardware-in-the-loop (HIL) testing. One central theme of a rapid development process is the use and integration of a variety of COTS tools. This paper discusses the RDL MATRIXx libraries, as well as the techniques for managing and documenting these libraries. This paper also shows the methods used for building simulations with the Advanced Simulation Development System (ASDS) libraries, and provides metrics to illustrate the amount of reuse for five complete simulations. Combining ASDS libraries with MATRIXx libraries is discussed.
Coordinated Fault Tolerance for High-Performance Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dongarra, Jack; Bosilca, George; et al.
2013-04-08
Our work to meet our goal of end-to-end fault tolerance has focused on two areas: (1) improving fault tolerance in various software currently available and widely used throughout the HEC domain and (2) using fault information exchange and coordination to achieve holistic, systemwide fault tolerance and understanding how to design and implement interfaces for integrating fault tolerance features for multiple layers of the software stack—from the application, math libraries, and programming language runtime to other common system software such as jobs schedulers, resource managers, and monitoring tools.
Testing, Testing...Managing Electronic Access in Disparate Times.
ERIC Educational Resources Information Center
Carrington, Bessie M.
1996-01-01
Duke University's Perkins Library (North Carolina) tests electronic resources and services for remote accessibility by examining capabilities on various platforms, operating systems, communications software, and World Wide Web browsers. Problems occur in establishing connections, screen display, navigation or retrieval, keyboard variations, and in…
The Resource Manager of the ATLAS Trigger and Data Acquisition System
NASA Astrophysics Data System (ADS)
Aleksandrov, I.; Avolio, G.; Lehmann Miotto, G.; Soloviev, I.
2017-10-01
The Resource Manager is one of the core components of the Data Acquisition system of the ATLAS experiment at the LHC. The Resource Manager marshals the right of applications to access resources which may exist in multiple but limited copies, in order to avoid conflicts due to program faults or operator errors. The access to resources is managed in a manner similar to what a lock manager would do in other software systems. All the available resources and their association to software processes are described in the Data Acquisition configuration database. The Resource Manager is queried about the availability of resources every time an application needs to be started. The Resource Manager’s design is based on a client-server model, hence it consists of two components: the Resource Manager “server” application and the “client” shared library. The Resource Manager server implements all the needed functionalities, while the Resource Manager client library provides remote access to the “server” (i.e., to allocate and free resources, and to query the status of resources). During the LHC’s Long Shutdown period, the Resource Manager’s requirements were reviewed in the light of the experience gained during the LHC’s Run 1. As a consequence, the Resource Manager has undergone a full re-design and re-implementation cycle, with the result of a reduction of the code base by 40% with respect to the previous implementation. This contribution will focus on the way the design and the implementation of the Resource Manager could leverage the new features available in the C++11 standard, and how the introduction of external libraries (such as the Boost multi-index container) led to a more maintainable system. Additionally, particular attention will be given to the technical solutions adopted to ensure the Resource Manager can sustain the typical request rates of the Data Acquisition system, which amount to about 30,000 requests in a time window of a few seconds coming from more than 1,000 clients.
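A toy model of the lock-manager-like behaviour described here: resources exist in a limited number of copies, and an allocation request fails when all copies are already held. The resource names and counts are invented, and this Python sketch does not reflect the ATLAS C++ implementation.

```python
# Sketch of the lock-manager-like behaviour described above: resources exist in
# a limited number of copies, and an allocation request is rejected when all
# copies are already held. Resource names and counts are invented.
class ResourceManager:
    def __init__(self, resource_copies):
        self.free = dict(resource_copies)        # resource -> copies still available
        self.holders = {}                        # (resource, client) -> copies held

    def allocate(self, resource, client):
        if self.free.get(resource, 0) == 0:
            return False                         # conflict: all copies in use
        self.free[resource] -= 1
        self.holders[(resource, client)] = self.holders.get((resource, client), 0) + 1
        return True

    def release(self, resource, client):
        key = (resource, client)
        if self.holders.get(key, 0) > 0:
            self.holders[key] -= 1
            self.free[resource] += 1

rm = ResourceManager({"readout_link": 2})
print(rm.allocate("readout_link", "app1"))   # True
print(rm.allocate("readout_link", "app2"))   # True
print(rm.allocate("readout_link", "app3"))   # False: only two copies exist
rm.release("readout_link", "app1")
print(rm.allocate("readout_link", "app3"))   # True
```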
PAVE: program for assembling and viewing ESTs.
Soderlund, Carol; Johnson, Eric; Bomhoff, Matthew; Descour, Anne
2009-08-26
New sequencing technologies are rapidly emerging. Many laboratories are simultaneously working with the traditional Sanger ESTs and experimenting with ESTs generated by the 454 Life Science sequencers. Though Sanger ESTs have been used to generate contigs for many years, no program takes full advantage of the 5' and 3' mate-pair information; hence, many tentative transcripts are assembled into two separate contigs. The new 454 technology has the benefit of high-throughput expression profiling, but introduces time and space problems for assembling large contigs. The PAVE (Program for Assembling and Viewing ESTs) assembler takes advantage of the 5' and 3' mate-pair information by requiring that the mate-pairs be assembled into the same contig and joined by n's if the two sub-contigs do not overlap. It handles the depth of 454 data sets by "burying" similar ESTs during assembly, which retains the expression level information while circumventing time and space problems. PAVE uses MegaBLAST for the clustering step and CAP3 for assembly; however, it assembles incrementally to enforce the mate-pair constraint, bury ESTs, and reduce incorrect joins and splits. The PAVE data management system uses a MySQL database to store multiple libraries of ESTs along with their metadata; the management system allows multiple assemblies with variations on libraries and parameters. Analysis routines provide standard annotation for the contigs, including a measure of differentially expressed genes across the libraries. A Java viewer program is provided for display and analysis of the results. Our results clearly show the benefit of using the PAVE assembler to explicitly use mate-pair information and bury ESTs for large contigs. The PAVE assembler provides a software package for assembling Sanger and/or 454 ESTs. The assembly software, data management software, Java viewer and user's guide are freely available.
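The mate-pair joining rule described above can be illustrated with a simplified sketch in which overlap detection is reduced to an exact suffix/prefix match; real assemblers use scored alignments, and this is not the PAVE code.

```python
# Toy illustration of the joining rule described above: a 5' and a 3' sub-contig
# from the same clone are merged if they share an exact suffix/prefix overlap,
# and otherwise are placed in one contig joined by a run of n's. Real assemblers
# use scored alignments; this exact-match check is a simplification.
def join_mate_pair(five_prime, three_prime, min_overlap=10, gap_n=20):
    max_olap = min(len(five_prime), len(three_prime))
    for k in range(max_olap, min_overlap - 1, -1):        # longest overlap first
        if five_prime[-k:] == three_prime[:k]:
            return five_prime + three_prime[k:]            # overlapping: merge
    return five_prime + "n" * gap_n + three_prime          # no overlap: bridge with n's

a = "ACGTACGTACGTTTTTGGGGCCCC"
b = "TTTTTGGGGCCCCAAAAACGCGCG"
print(join_mate_pair(a, b, min_overlap=5))                        # 13-base overlap -> merged
print(join_mate_pair("AAAAAAAAAA", "CCCCCCCCCC", min_overlap=5))  # no overlap -> n's inserted
```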
NASA Software Documentation Standard
NASA Technical Reports Server (NTRS)
1991-01-01
The NASA Software Documentation Standard (hereinafter referred to as "Standard") is designed to support the documentation of all software developed for NASA; its goal is to provide a framework and model for recording the essential information needed throughout the development life cycle and maintenance of a software system. The NASA Software Documentation Standard can be applied to the documentation of all NASA software. The Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. The basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.
CGNS Mid-Level Software Library and Users Guide
NASA Technical Reports Server (NTRS)
Poirier, Diane; Smith, Charles A. (Technical Monitor)
1998-01-01
The "CFD General Notation System" (CGNS) consists of a collection of conventions, and conforming software, for the storage and retrieval of Computational Fluid Dynamics (CFD) data. It facilitates the exchange of data between sites and applications, and helps stabilize the archiving of aerodynamic data. This effort was initiated in order to streamline the procedures in exchanging data and software between NASA and its customers, but the goal is to develop CGNS into a National Standard for the exchange of aerodynamic data. The CGNS development team is comprised of members from Boeing Commercial Airplane Group, NASA-Ames, NASA-Langley, NASA-Lewis, McDonnell-Douglas Corporation (now Boeing-St. Louis), Air Force-Wright Lab., and ICEM-CFD Engineering. The elements of CGNS address all activities associated with the storage of data on external media and its movement to and from application programs. These elements include: - The Advanced Data Format (ADF) Database manager, consisting of both a file format specification and its I/O software, which handles the actual reading and writing of data from and to external storage media; - The Standard Interface Data Structures (SIDS), which specify the intellectual content of CFD data and the conventions governing naming and terminology; - The SIDS-to-ADF File Mapping conventions, which specify the exact location where the CFD data defined by the SIDS is to be stored within the ADF file(s); and - The CGNS Mid-level Library, which provides CFD-knowledgeable routines suitable for direct installation into application codes. The CGNS Mid-level Library was designed to ease the implementation of CGNS by providing developers with a collection of handy I/O functions. Since knowledge of the ADF core is not required to use this library, it will greatly facilitate the task of interfacing with CGNS. There are currently 48 user callable functions that comprise the Mid-level library and are described in the Users Guide. The library is written in C, but each function has a FORTRAN counterpart.
Advanced Transport Operating System (ATOPS) utility library software description
NASA Technical Reports Server (NTRS)
Clinedinst, Winston C.; Slominski, Christopher J.; Dickson, Richard W.; Wolverton, David A.
1993-01-01
The individual software processes used in the flight computers on-board the Advanced Transport Operating System (ATOPS) aircraft have many common functional elements. A library of commonly used software modules was created for general uses among the processes. The library includes modules for mathematical computations, data formatting, system database interfacing, and condition handling. The modules available in the library and their associated calling requirements are described.
Proposal for a CLIPS software library
NASA Technical Reports Server (NTRS)
Porter, Ken
1991-01-01
This paper is a proposal to create a software library for the C Language Integrated Production System (CLIPS) expert system shell developed by NASA. Many innovative ideas for extending CLIPS were presented at the First CLIPS Users Conference, including useful user and database interfaces. CLIPS developers would benefit from a software library of reusable code. The CLIPS Users Group should establish a software library; a course of action to make that happen is proposed. Open discussion to revise this library concept is essential, since only a group effort is likely to succeed. A response form intended to solicit opinions and support from the CLIPS community is included.
PD5: a general purpose library for primer design software.
Riley, Michael C; Aubrey, Wayne; Young, Michael; Clare, Amanda
2013-01-01
Complex PCR applications for large genome-scale projects require fast, reliable and often highly sophisticated primer design software applications. Presently, such applications use pipelining methods to utilise many third party applications and this involves file parsing, interfacing and data conversion, which is slow and prone to error. A fully integrated suite of software tools for primer design would considerably improve the development time, the processing speed, and the reliability of bespoke primer design software applications. The PD5 software library is an open-source collection of classes and utilities, providing a complete collection of software building blocks for primer design and analysis. It is written in object-oriented C++ with an emphasis on classes suitable for efficient and rapid development of bespoke primer design programs. The modular design of the software library simplifies the development of specific applications and also integration with existing third party software where necessary. We demonstrate several applications created using this software library that have already proved to be effective, but we view the project as a dynamic environment for building primer design software and it is open for future development by the bioinformatics community. Therefore, the PD5 software library is published under the terms of the GNU General Public License, which guarantee access to source code and allow redistribution and modification. The PD5 software library is downloadable from Google Code and the accompanying Wiki includes instructions and examples: http://code.google.com/p/primer-design.
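As an example of the kind of primer-analysis building block such a library provides, the sketch below computes GC fraction and a rough Wallace-rule melting temperature; PD5 itself is a C++ library, so this Python snippet is purely illustrative, and the Wallace rule is only a crude estimate for short oligonucleotides.

```python
# Example of a primer-analysis building block: GC fraction and a rough
# Wallace-rule melting temperature, Tm = 2*(A+T) + 4*(G+C) degrees C, which is
# only a crude estimate for short (~14-20 nt) primers. PD5 itself is a C++
# library; this Python sketch is purely illustrative.
def gc_fraction(primer: str) -> float:
    primer = primer.upper()
    return (primer.count("G") + primer.count("C")) / len(primer)

def wallace_tm(primer: str) -> int:
    primer = primer.upper()
    at = primer.count("A") + primer.count("T")
    gc = primer.count("G") + primer.count("C")
    return 2 * at + 4 * gc

primer = "ATGCGTACGTTAGCAT"
print(gc_fraction(primer))   # 0.4375 (7 of 16 bases are G or C)
print(wallace_tm(primer))    # 46 degrees C by the Wallace rule
```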
Preparation of cherry-picked combinatorial libraries by string synthesis.
Furka, Arpád; Dibó, Gábor; Gombosuren, Naran
2005-03-01
String synthesis [1-3] is an efficient and cheap manual method for the preparation of combinatorial libraries using macroscopic solid support units. Sorting the units between two synthetic steps is an important operation of the procedure. The software developed to guide sorting can be used only when complete combinatorial libraries are prepared. Since very often only selected components of the full libraries are needed, new software was constructed that guides sorting in the preparation of non-complete combinatorial libraries. Application of the software is described in detail.
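The sorting decision the software guides can be sketched as routing each support unit to the vessel of the building block it needs at the current coupling step, given only the selected target sequences; the sequences and step ordering below are invented and deliberately simplified.

```python
# Sketch of the sorting decision the software guides: given only the selected
# (cherry-picked) target sequences, each support unit is routed to the reaction
# vessel of the building block it needs at the current coupling step. The target
# sequences below are invented and the representation is simplified.
from collections import defaultdict

def sort_units_for_step(targets, step):
    """Map building block -> list of unit indices that need it at this step."""
    vessels = defaultdict(list)
    for unit_index, sequence in enumerate(targets):
        vessels[sequence[step]].append(unit_index)
    return dict(vessels)

# One support unit per selected library member (simplified sequence notation).
targets = ["AGK", "AGE", "VGK", "VLE", "ALK"]
for step in range(3):
    print(f"step {step}: {sort_units_for_step(targets, step)}")
```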
libdrdc: software standards library
NASA Astrophysics Data System (ADS)
Erickson, David; Peng, Tie
2008-04-01
This paper presents the libdrdc software standards library, including internal nomenclature, definitions, units of measure, coordinate reference frames, and representations for use in autonomous systems research. This library is a configurable, portable, C-function-wrapped C++ / Object Oriented C library developed to be independent of software middleware, system architecture, processor, or operating system. It is designed to use the Automatically Tuned Linear Algebra Software (ATLAS) and Basic Linear Algebra Subprograms (BLAS) and to port to firmware and software. The library goal is to unify data collection and representation for various microcontrollers and Central Processing Unit (CPU) cores and to provide a common Application Binary Interface (ABI) for research projects at all scales. The library supports multi-platform development and currently works on Windows, Unix, GNU/Linux, and Real-Time Executive for Multiprocessor Systems (RTEMS). This library is made available under the LGPL version 2.1 license.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moody, A. T.
2014-04-20
This code adds an implementation of PMIX_Ring to the existing PMI2 library in the SLURM open-source software package (Simple Linux Utility for Resource Management). PMIX_Ring executes a particular communication pattern that is used to bootstrap connections between MPI processes in a parallel job.
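The ring pattern such a call supports can be illustrated with a small pure-Python simulation in which each rank learns its left and right neighbours' endpoints; this does not reflect the PMI2/PMIx API or the real call signature.

```python
# Pure-Python illustration of the ring pattern such a call supports: each rank
# learns its left and right neighbours' addresses so processes can bootstrap
# point-to-point connections. This simulates the idea only; it is not the
# PMI2/PMIx API and does not reflect the real call signature.
def ring_neighbors(addresses):
    """For each rank, return (left_address, right_address) around the ring."""
    n = len(addresses)
    return [(addresses[(rank - 1) % n], addresses[(rank + 1) % n]) for rank in range(n)]

addresses = [f"node{rank % 2}:{5000 + rank}" for rank in range(4)]   # invented endpoints
for rank, (left, right) in enumerate(ring_neighbors(addresses)):
    print(f"rank {rank}: left={left} right={right}")
```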
ERIC Educational Resources Information Center
Jackson, Mary E.
1998-01-01
Assesses the changes in interlibrary loan (ILL) practices, and points the way to an ideal future. Discusses patron-initiated document request systems; library-mediated ordering systems; document delivery suppliers; accessing electronic resources; ILL management software; paying ILL invoices; new electronic delivery options; and results of a…
NASA Technical Reports Server (NTRS)
Jones, Kennie H.; Randall, Donald P.; Stallcup, Scott S.; Rowell, Lawrence F.
1988-01-01
The Environment for Application Software Integration and Execution, EASIE, provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers who face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational data base management system. In volume 2, a SYSTEM LIBRARY PROCESSOR is used to construct a DATA DICTIONARY describing all relations defined in the data base, and a TEMPLATE LIBRARY. A TEMPLATE is a description of all subsets of relations (including conditional selection criteria and sorting specifications) to be accessed as input or output for a given application. Together, these form the SYSTEM LIBRARY, which is used to automatically produce the data base schema, FORTRAN subroutines to retrieve/store data from/to the data base, and instructions to a generic REVIEWER program providing review/modification of data for a given template. Automation of these functions eliminates much of the tedious, error-prone work required by the usual approach to data base integration.
A Padawan Programmer's Guide to Developing Software Libraries.
Yurkovich, James T; Yurkovich, Benjamin J; Dräger, Andreas; Palsson, Bernhard O; King, Zachary A
2017-11-22
With the rapid adoption of computational tools in the life sciences, scientists are taking on the challenge of developing their own software libraries and releasing them for public use. This trend is being accelerated by popular technologies and platforms, such as GitHub, Jupyter, and R/Shiny, that make it easier to develop scientific software, and by open-source licenses that make it easier to release software. But how do you build a software library that people will use? And what characteristics do the best libraries have that make them enduringly popular? Here, we provide a reference guide, based on our own experiences, for developing software libraries along with real-world examples to help provide context for scientists who are learning about these concepts for the first time. While we can only scratch the surface of these topics, we hope that this article will act as a guide for scientists who want to write great software that is built to last. Copyright © 2017 Elsevier Inc. All rights reserved.
A portable structural analysis library for reaction networks.
Bedaso, Yosef; Bergmann, Frank T; Choi, Kiri; Medley, Kyle; Sauro, Herbert M
2018-07-01
The topology of a reaction network can have a significant influence on the network's dynamical properties. Such influences can include constraints on network flows and concentration changes or, more insidiously, result in the emergence of feedback loops. These effects are due entirely to mass constraints imposed by the network configuration and are important considerations before any dynamical analysis is made. Most established simulation software tools usually carry out some kind of structural analysis of a network before any attempt is made at dynamic simulation. In this paper, we describe a portable software library, libStructural, that can carry out a variety of popular structural analyses, including conservation analysis, flux dependency analysis and enumerating elementary modes. The library employs robust algorithms that allow it to be used on large networks with more than two thousand nodes. The library accepts either a raw or fully labeled stoichiometry matrix or models written in SBML format. The software is written in standard C/C++ and comes with extensive on-line documentation and a test suite. The software is available for Windows and Mac OS X, and can be compiled easily on any Linux operating system. A language binding for Python is also available through the pip package manager, making it simple to install on any standard Python distribution. The bulk of the source code is licensed under the open-source BSD license, with other parts using either the MIT license or, more simply, the public domain. All source is available on GitHub (https://github.com/sys-bio/Libstructural). Copyright © 2018 Elsevier B.V. All rights reserved.
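One of the analyses named above, conservation analysis, can be illustrated directly: mass-conservation relations are vectors in the left null space of the stoichiometry matrix. The sketch below computes them with SciPy for an invented three-species network and does not use libStructural itself.

```python
# Toy illustration of the conservation analysis mentioned above: mass-conservation
# relations are vectors in the left null space of the stoichiometry matrix N
# (l such that l.T @ N = 0). Computed here directly with SciPy for a small
# invented network (A -> B -> C); this does not use libStructural itself.
import numpy as np
from scipy.linalg import null_space

# Rows: species A, B, C.  Columns: reactions A->B and B->C.
N = np.array([
    [-1,  0],
    [ 1, -1],
    [ 0,  1],
])

conservation = null_space(N.T)       # columns span the left null space of N
print(np.round(conservation, 3))     # one relation, proportional to [1, 1, 1]:
                                     # the total A + B + C is conserved
```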
ERIC Educational Resources Information Center
Oliver, Astrid; Dahlquist, Janet; Tankersley, Jan; Emrich, Beth
2010-01-01
This article discusses the processes that occurred when the Library, Controller's Office, and Information Technology Department agreed to create an interface between the Library's Innovative Interfaces patron database and campus administrative software, Banner, using file transfer protocol, in an effort to streamline the Library's accounts…
Building a Snow Data Management System using Open Source Software (and IDL)
NASA Astrophysics Data System (ADS)
Goodale, C. E.; Mattmann, C. A.; Ramirez, P.; Hart, A. F.; Painter, T.; Zimdars, P. A.; Bryant, A.; Brodzik, M.; Skiles, M.; Seidel, F. C.; Rittger, K. E.
2012-12-01
At NASA's Jet Propulsion Laboratory, free and open-source software is used every day to support a wide range of projects, from planetary to climate to research and development. In this abstract I will discuss the key role that open-source software has played in building a robust science data processing pipeline for snow hydrology research, and how the system is also able to leverage programs written in IDL, making JPL's Snow Data System a hybrid of open-source and proprietary software.
Main points:
- The design of the Snow Data System (illustrating how the collection of sub-systems is combined to create a complete data processing pipeline)
- The challenges of moving from a single algorithm on a laptop to running hundreds of parallel algorithms on a cluster of servers (lessons learned): code changes, software-license-related challenges, and storage requirements
- System evolution (from data archiving, to data processing, to data on a map, to near-real-time products and maps)
- Road map for the next 6 months (including how easily we re-used the snowDS code base to support the Airborne Snow Observatory Mission)
Software in use and their licenses:
- IDL: used for pre- and post-processing of data. Licensed under a proprietary software license held by Excelis.
- Apache OODT: used for data management and workflow processing. Licensed under the Apache License Version 2.
- GDAL: geospatial data processing library, currently used for data re-projection. Licensed under the X/MIT license.
- GeoServer: WMS server. Licensed under the General Public License Version 2.0.
- Leaflet.js: JavaScript web mapping library. Licensed under the Berkeley Software Distribution License.
- Python: glue code and miscellaneous data processing support. Licensed under the Python Software Foundation License.
- Perl: script wrapper for running the SCAG algorithm. Licensed under the General Public License Version 3.
- PHP: front-end web application programming. Licensed under the PHP License Version 3.01.
System and Mass Storage Study for Defense Mapping Agency Topographic Center (DMATC/HC)
1977-04-01
...view. The assessment should be based on carefully designed control conditions (data volume, resolution, function, etc.)... categories: hardware control and library management support. This software is designed to interface with IBM 360/370 OS and OS/VS. No interface with a... laser recording unit includes a programmable recorder control subsystem which can be designed to provide a hardware and software interface compatible
Mining collections of compounds with Screening Assistant 2.
Guilloux, Vincent Le; Arrault, Alban; Colliandre, Lionel; Bourg, Stéphane; Vayer, Philippe; Morin-Allory, Luc
2012-08-31
High-throughput screening assays have become the starting point of many drug discovery programs for large pharmaceutical companies as well as academic organisations. Despite the increasing throughput of screening technologies, the almost infinite chemical space remains out of reach, calling for tools dedicated to the analysis and selection of the compound collections intended to be screened. We present Screening Assistant 2 (SA2), an open-source Java software package dedicated to the storage and analysis of small to very large chemical libraries. SA2 stores unique molecules in a MySQL database, and encapsulates several chemoinformatics methods, among which are provider management, interactive visualisation, scaffold analysis, diverse subset creation, descriptor calculation, sub-structure/SMARTS search, similarity search and filtering. We illustrate the use of SA2 by analysing the composition of a database of 15 million compounds collected from 73 providers, in terms of scaffolds, frameworks, and undesired properties as defined by recently proposed HTS SMARTS filters. We also show how the software can be used to create diverse libraries based on existing ones. Screening Assistant 2 is a user-friendly, open-source software package that can be used to manage collections of compounds and perform simple to advanced chemoinformatics analyses. Its modular design and growing documentation facilitate the addition of new functionalities, calling for contributions from the community. The software can be downloaded at http://sa2.sourceforge.net/.
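The SMARTS-based filtering step mentioned above can be illustrated with RDKit rather than SA2's own Java code; the two "undesired" patterns and the example molecules below are invented for the demo and are not the published HTS filters.

```python
# Small illustration of the SMARTS-based filtering step named above, using RDKit
# rather than SA2's own (Java) code base. The two "undesired" patterns and the
# example molecules are invented for the demo, not the published HTS filters.
from rdkit import Chem

undesired_smarts = {
    "aldehyde": "[CX3H1](=O)[#6]",
    "acyl_halide": "[CX3](=O)[F,Cl,Br,I]",
}
undesired = {name: Chem.MolFromSmarts(s) for name, s in undesired_smarts.items()}

library_smiles = ["CCO", "CC(=O)Cl", "CC=O", "c1ccccc1O"]

for smiles in library_smiles:
    mol = Chem.MolFromSmiles(smiles)
    flags = [name for name, patt in undesired.items() if mol.HasSubstructMatch(patt)]
    print(smiles, "->", "rejected: " + ",".join(flags) if flags else "kept")
```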
NASA Technical Reports Server (NTRS)
McComas, David
2013-01-01
The flight software (FSW) math library is a collection of reusable math components that provides typical math utilities required by spacecraft flight software. These utilities are intended to increase flight software quality, reusability, and maintainability by providing a set of consistent, well-documented, and tested math utilities. This library only has dependencies on ANSI C, so it is easily ported. Prior to this library, each mission typically created its own math utilities using ideas/code from previous missions. Part of the reason for this is that math libraries can be written with different strategies in areas like error handling, parameter ordering, naming conventions, etc. Changing the utilities for each mission introduces risks and costs. The obvious risks and costs are that the utilities must be coded and revalidated. The hidden risks and costs arise in miscommunication between engineers. These utilities must be understood by both the flight software engineers and other subsystem engineers (primarily guidance, navigation, and control). The FSW math library is part of a larger goal to produce a library of reusable Guidance, Navigation and Control (GN&C) FSW components. A GN&C FSW library cannot be created unless a standardized math basis is created. This library solves the standardization problem by defining a common feature set and establishing policies for the library's design. This allows the libraries to be maintained with the same strategy used in their initial development, which supports a library of reusable GN&C FSW components. The FSW math library is written for an embedded software environment in C. This places restrictions on the language features that can be used by the library. Another advantage of the FSW math library is that it can be used in the FSW as well as in other environments like the GN&C analysts' simulators. This helps communication between the teams because they can use the same utilities with the same feature set and syntax.
Database Software for the 1990s.
ERIC Educational Resources Information Center
Beiser, Karl
1990-01-01
Examines trends in the design of database management systems for microcomputers and predicts developments that may occur in the next decade. Possible developments are discussed in the areas of user interfaces, database programing, library systems, the use of MARC data, CD-ROM applications, artificial intelligence features, HyperCard, and…
Contrasting LMS Marketing Approaches
ERIC Educational Resources Information Center
Carriere, Brain; Challborn, Carl; Moore, James; Nibourg, Theodorus
2005-01-01
The first section of this report examines the CourseCompass learning management system (LMS), made available to educators by the Pearson publishing group as a vehicle for the company's extensive content library. The product's features are discussed, and the implications of Pearson's software/textbook "bundling" policy for the integrity of course…
Open source pipeline for ESPaDOnS reduction and analysis
NASA Astrophysics Data System (ADS)
Martioli, Eder; Teeple, Doug; Manset, Nadine; Devost, Daniel; Withington, Kanoa; Venne, Andre; Tannock, Megan
2012-09-01
OPERA is a Canada-France-Hawaii Telescope (CFHT) open source collaborative software project currently under development for an ESPaDOnS echelle spectro-polarimetric image reduction pipeline. OPERA is designed to be fully automated, performing calibrations and reduction, producing one-dimensional intensity and polarimetric spectra. The calibrations are performed on two-dimensional images. Spectra are extracted using an optimal extraction algorithm. While primarily designed for CFHT ESPaDOnS data, the pipeline is being written to be extensible to other echelle spectrographs. A primary design goal is to make use of fast, modern object-oriented technologies. Processing is controlled by a harness that manages a set of processing modules, which make use of a collection of native OPERA software libraries and standard external software libraries. The harness and modules are completely parametrized by site configuration and instrument parameters. The software is open-ended, permitting users of OPERA to extend the pipeline capabilities. All these features have been designed to provide a portable infrastructure that facilitates collaborative development, code re-usability and extensibility. OPERA is free software with support for both GNU/Linux and MacOSX platforms. The pipeline is hosted on SourceForge under the name "opera-pipeline".
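The harness-and-modules architecture is the most distinctive part of this design; a minimal sketch of the pattern is given below in Python. The class and function names and the configuration keys are invented for illustration and are not OPERA's API.

```python
# Minimal sketch of a "harness + processing modules" pipeline, parametrized by a
# site/instrument configuration. Names are illustrative, not OPERA's API.
def bias_subtract(config, data):
    data["frames"] = [f - config["bias_level"] for f in data["frames"]]
    return data

def extract_spectrum(config, data):
    # Placeholder "extraction": just average the calibrated frames.
    data["spectrum"] = sum(data["frames"]) / len(data["frames"])
    return data

class Harness:
    """Runs a configured sequence of processing modules over a data payload."""
    def __init__(self, config, modules):
        self.config, self.modules = config, modules

    def run(self, data):
        for module in self.modules:
            data = module(self.config, data)
        return data

site_config = {"bias_level": 100.0}
pipeline = Harness(site_config, [bias_subtract, extract_spectrum])
print(pipeline.run({"frames": [101.0, 103.0, 102.0]}))   # spectrum: 2.0
```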
GABBs: Cyberinfrastructure for Self-Service Geospatial Data Exploration, Computation, and Sharing
NASA Astrophysics Data System (ADS)
Song, C. X.; Zhao, L.; Biehl, L. L.; Merwade, V.; Villoria, N.
2016-12-01
Geospatial data are present everywhere today with the proliferation of location-aware computing devices. This is especially true in the scientific community where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example, in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise among other challenges. In addressing these needs, the Geospatial data Analysis Building Blocks (GABBs) project aims at building geospatial modeling, data analysis and visualization capabilities in an open source web platform, HUBzero. Funded by NSF's Data Infrastructure Building Blocks initiative, GABBs is creating a geospatial data architecture that integrates spatial data management, mapping and visualization, and interfaces in the HUBzero platform for scientific collaborations. The geo-rendering enabled Rappture toolkit, a generic Python mapping library, geospatial data exploration and publication tools, and an integrated online geospatial data management solution are among the software building blocks from the project. The GABBs software will be available through Amazon's AWS Marketplace VM images and as open source. Hosting services are also available to the user community. The outcome of the project will enable researchers and educators to self-manage their scientific data, rapidly create GIS-enabled tools, share geospatial data and tools on the web, and build dynamic workflows connecting data and tools, all without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the GABBs architecture, toolkits and libraries, and showcase the scientific use cases that utilize GABBs capabilities, as well as the challenges and solutions for GABBs to interoperate with other cyberinfrastructure platforms.
Standardized Sky Partitioning for the Next Generation Astronomy and Space Science Archives
NASA Technical Reports Server (NTRS)
Lal, Nand (Technical Monitor); McLean, Brian
2004-01-01
The Johns Hopkins University and Space Telescope Science Institute are working together on this project to develop a library of standard software for data archives that will benefit the wider astronomical community. The ultimate goal was to develop and distribute a software library aimed at providing a common system for partitioning and indexing the sky in regions of manageable size and providing complex queries on the objects stored in this system. Whilst ongoing maintenance work will continue, the primary goal has been completed. Most of the next generation sky surveys in the different wavelengths, like 2MASS, GALEX, SDSS, GSC-II, DPOSS and FIRST, have agreed on this common set of utilities. In this final report, we summarize work on the work elements assigned to the STScI project team.
ERIC Educational Resources Information Center
Cibbarelli, Pamela
1996-01-01
Profiles the top-selling IOLS (integrated online library systems) software for libraries based on sales figures reported in the 1996 "Library Journal" annual survey of the library automation marketplace. Highlights include microcomputer-based systems and minicomputer-based systems, system components, MARC formats, and market sectors.…
Klein, M S; Ross, F
1997-01-01
Using the results of the 1993 Medical Library Association (MLA) Hospital Libraries Section survey of hospital-based end-user search services, this article describes how end-user search services can become an impetus for an expanded information management and technology role for the hospital librarian. An end-user services implementation plan is presented that focuses on software, hardware, finances, policies, staff allocations and responsibilities, educational program design, and program evaluation. Possibilities for extending end-user search services into information technology and informatics, specialized end-user search systems, and Internet access are described. Future opportunities are identified for expanding the hospital librarian's role in the face of changing health care management, advances in information technology, and increasing end-user expectations. PMID:9285126
Automating Document Delivery: A Conference Report.
ERIC Educational Resources Information Center
Ensor, Pat
1992-01-01
Describes presentations made at a forum on automation, interlibrary loan (ILL), and document delivery sponsored by the Houston Area Library Consortium. Highlights include access versus ownership; software for ILL; fee-based services; automated management systems for ILL; and electronic mail and online systems for end-user-generated ILL requests.…
A Computerized Cataloging System for an Outdoor Program Library or Resource Center.
ERIC Educational Resources Information Center
Watters, Ron
The Outdoor Resource Library Cataloging System is a computer software program designed primarily for outdoor programs with small to medium-sized resource centers. The software is free to nonprofit organizations and is available from the Idaho State University Outdoor Program. The software is used to construct a database of library materials, which…
ERIC Educational Resources Information Center
Tolulope, Akano
2017-01-01
Libraries before the 21st century carried out routine daily library tasks such as cataloguing and classification, acquisition, reference services, etc. using manual procedures only, but the advent of Information Technology has transformed these routine tasks, and libraries can now automate their activities by deploying library software in…
Duffy, Fergal J; Verniere, Mélanie; Devocelle, Marc; Bernard, Elise; Shields, Denis C; Chubb, Anthony J
2011-04-25
We introduce CycloPs, software for the generation of virtual libraries of constrained peptides including natural and nonnatural commercially available amino acids. The software is written in the cross-platform Python programming language, and features include generating virtual libraries in one-dimensional SMILES and three-dimensional SDF formats, suitable for virtual screening. The stand-alone software is capable of filtering the virtual libraries using empirical measurements, including peptide synthesizability by standard peptide synthesis techniques, stability, and the druglike properties of the peptide. The software and accompanying Web interface is designed to enable the rapid generation of large, structurally diverse, synthesizable virtual libraries of constrained peptides quickly and conveniently, for use in virtual screening experiments. The stand-alone software, and the Web interface for evaluating these empirical properties of a single peptide, are available at http://bioware.ucd.ie .
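The core idea, enumerating a combinatorial peptide library and then filtering it on empirical properties, can be sketched in a few lines of Python. The residue alphabet, the toy filter, and the function name below are illustrative assumptions and do not reflect CycloPs' actual implementation or API.

```python
# Conceptual sketch (not CycloPs code): enumerate short peptide sequences and
# apply a toy "synthesizability" filter before they would be converted to
# SMILES/SDF for virtual screening.
from itertools import product

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"   # the 20 natural residues
DIFFICULT = set("C")                   # assumption: limit cysteine content

def virtual_library(length):
    for seq in product(AMINO_ACIDS, repeat=length):
        peptide = "".join(seq)
        if sum(peptide.count(r) for r in DIFFICULT) <= 1:   # crude filter
            yield peptide

library = list(virtual_library(2))
print(len(library), library[:5])       # 399 dipeptides pass the toy filter
```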
Computer Software Management and Information Center
NASA Technical Reports Server (NTRS)
1983-01-01
Computer programs for passive anti-roll tank, earth resources laboratory applications, the NIMBUS-7 coastal zone color scanner derived products, transportable applications executive, plastic and failure analysis of composites, velocity gradient method for calculating velocities in an axisymmetric annular duct, an integrated procurement management system, data I/O PRON for the Motorola exorcisor, aerodynamic shock-layer shape, kinematic modeling, hardware library for a graphics computer, and a file archival system are documented.
ERIC Educational Resources Information Center
England, Lenore; Fu, Li
2011-01-01
A critical part of electronic resources management, the electronic resources evaluation process is multi-faceted and includes a seemingly endless range of resources and tools involving numerous library staff. A solution is to build a Web site to bring all of the components together that can be implemented quickly and result in an organizational…
Automated reuseable components system study results
NASA Technical Reports Server (NTRS)
Gilroy, Kathy
1989-01-01
The Automated Reusable Components System (ARCS) was developed under a Phase 1 Small Business Innovative Research (SBIR) contract for the U.S. Army CECOM. The objectives of the ARCS program were: (1) to investigate issues associated with automated reuse of software components, identify alternative approaches, and select promising technologies, and (2) to develop tools that support component classification and retrieval. The approach followed was to research emerging techniques and experimental applications associated with reusable software libraries, to investigate the more mature information retrieval technologies for applicability, and to investigate the applicability of specialized technologies to improve the effectiveness of a reusable component library. Various classification schemes and retrieval techniques were identified and evaluated for potential application in an automated library system for reusable components. Strategies for library organization and management, component submittal and storage, and component search and retrieval were developed. A prototype ARCS was built to demonstrate the feasibility of automating the reuse process. The prototype was created using a subset of the classification and retrieval techniques that were investigated. The demonstration system was exercised and evaluated using reusable Ada components selected from the public domain. A requirements specification for a production-quality ARCS was also developed.
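Classification and retrieval of reusable parts reduces, at its simplest, to faceted matching against component metadata; the toy sketch below shows the shape of such a query, with an invented facet scheme that is not the one used by ARCS.

```python
# Toy faceted retrieval over a reusable-component catalog (facet names invented).
components = {
    "queue_mgr.ada":  {"function": "data_structure", "object": "queue",   "language": "Ada"},
    "stack_mgr.ada":  {"function": "data_structure", "object": "stack",   "language": "Ada"},
    "msg_router.ada": {"function": "communication",  "object": "message", "language": "Ada"},
}

def retrieve(**facets):
    """Return components whose classification matches every requested facet."""
    return [name for name, cls in components.items()
            if all(cls.get(k) == v for k, v in facets.items())]

print(retrieve(function="data_structure"))                   # two candidate parts
print(retrieve(function="data_structure", object="queue"))   # ['queue_mgr.ada']
```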
Advanced Data Format (ADF) Software Library and Users Guide
NASA Technical Reports Server (NTRS)
Smith, Matthew; Smith, Charles A. (Technical Monitor)
1998-01-01
The "CFD General Notation System" (CGNS) consists of a collection of conventions, and conforming software, for the storage and retrieval of Computational Fluid Dynamics (CFD) data. It facilitates the exchange of data between sites and applications, and helps stabilize the archiving of aerodynamic data. This effort was initiated in order to streamline the procedures in exchanging data and software between NASA and its customers, but the goal is to develop CGNS into a National Standard for the exchange of aerodynamic data. The CGNS development team is comprised of members from Boeing Commercial. Airplane Group, NASA-Ames, NASA-Langley, NASA-Lewis, McDonnell-Douglas Corporation (now Boeing-St. Louis), Air Force-Wright Lab., and ICEM-CFD Engineering. The elements of CGNS address all activities associated with the storage of data on external media and its movement to and from application programs. These elements include: 1) The Advanced Data Format (ADF) Database manager, consisting of both a file format specification and its 1/0 software, which handles the actual reading and writing of data from and to external storage media; 2) The Standard Interface Data Structures (SIDS), which specify the intellectual content of CFD data and the conventions governing naming and terminology; 3) The SIDS-to-ADF File Mapping conventions, which specify the exact location where the CFD data defined by the SIDS is to be stored within the ADF file(s); and 4) The CGNS Mid-level Library, which provides CFD-knowledgeable routines suitable for direct installation into application codes. The ADF is a generic database manager with minimal intrinsic capability. It was written for the purpose of storing large numerical datasets in an efficient, platform independent manner. To be effective, it must be used in conjunction with external agreements on how the data will be organized within the ADF database such defined by the SIDS. There are currently 34 user callable functions that comprise the ADF Core library and are described in the Users Guide. The library is written in C, but each function has a FORTRAN counterpart.
Nuclear Data Online Services at Peking University
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fan, T.S.; Guo, Z.Y.; Ye, W.G.
2005-05-24
The Institute of Heavy Ion Physics at Peking University has developed a new nuclear data online services software package. Through the web site (http://ndos.nst.pku.edu.cn), it offers online access to main relational nuclear databases: five evaluated neutron libraries (BROND, CENDL, ENDF, JEF, JENDL), the ENSDF library, the EXFOR library, the IAEA photonuclear library and the charged particle data of the FENDL library. This software allows the comparison and graphic representations of the different data sets. The computer programs of this package are based on the Linux implementation of PHP and the MySQL software.
Nuclear Data Online Services at Peking University
NASA Astrophysics Data System (ADS)
Fan, T. S.; Guo, Z. Y.; Ye, W. G.; Liu, W. L.; Liu, T. J.; Liu, C. X.; Chen, J. X.; Tang, G. Y.; Shi, Z. M.; Huang, X. L.; Chen, J. E.
2005-05-01
The Institute of Heavy Ion Physics at Peking University has developed a new nuclear data online services software package. Through the web site (http://ndos.nst.pku.edu.cn), it offers online access to main relational nuclear databases: five evaluated neutron libraries (BROND, CENDL, ENDF, JEF, JENDL), the ENSDF library, the EXFOR library, the IAEA photonuclear library and the charged particle data of the FENDL library. This software allows the comparison and graphic representations of the different data sets. The computer programs of this package are based on the Linux implementation of PHP and the MySQL software.
2014-05-18
The team's mission has been to implement new and improved techniques with the intention of offering improved software libraries for GNSS signal acquisition.
ERIC Educational Resources Information Center
Cibbarelli, Pamela
1996-01-01
Examines library automation product introductions and conversions to new operating systems. Compares user satisfaction ratings of the following library software packages: DOS/Windows, UNIX, Macintosh, and DEC VAX/VMS. Software is rated according to documentation, service/support, training, product reliability, product capabilities, ease of use,…
An OpenEarth Framework (OEF) for Integrating and Visualizing Earth Science Data
NASA Astrophysics Data System (ADS)
Moreland, J. L.; Nadeau, D. R.; Baru, C.; Crosby, C. J.
2009-12-01
The integration of data is essential to make transformative progress in understanding the complex processes operating at the Earth’s surface and within its interior. While our current ability to collect massive amounts of data, develop structural models, and generate high-resolution dynamics models is well developed, our ability to quantitatively integrate these data and models into holistic interpretations of Earth systems is poorly developed. We lack the basic tools to realize a first-order goal in Earth science of developing integrated 4D models of Earth structure and processes using a complete range of available constraints, at a time when the research agenda of major efforts such as EarthScope demand such a capability. Among the challenges to 3D data integration are data that may be in different coordinate spaces, units, value ranges, file formats, and data structures. While several file format standards exist, they are infrequently or incorrectly used. Metadata is often missing, misleading, or relegated to README text files alongside the data. This leaves much of the work to integrate data bogged down by simple data management tasks. The OpenEarth Framework (OEF) being developed by GEON addresses these data management difficulties. The software incorporates file format parsers, data interpretation heuristics, user interfaces to prompt for missing information, and visualization techniques to merge data into a common visual model. The OEF’s data access libraries parse formal and de facto standard file formats and map their data into a common data model. The software handles file format quirks, storage details, caching, local and remote file access, and web service protocol handling. Heuristics are used to determine coordinate spaces, units, and other key data features. Where multiple data structure, naming, and file organization conventions exist, those heuristics check for each convention’s use to find a high-confidence interpretation of the data. When no convention or embedded data yields a suitable answer, the user is prompted to fill in the blanks. The OEF’s interaction libraries assist in the construction of user interfaces for data management. These libraries support data import, data prompting, data introspection, the management of the contents of a common data model, and the creation of derived data to support visualization. Finally, visualization libraries provide interactive visualization using an extended version of NASA WorldWind. The OEF viewer supports visualization of terrains, point clouds, 3D volumes, imagery, cutting planes, isosurfaces, and more. Data may be color coded, shaded, and displayed above or below the terrain, and always registered into a common coordinate space. The OEF architecture is open, and cross-platform software libraries are available separately for use with other software projects, while modules from other projects may be integrated into the OEF to extend its features. The OEF is currently being used to visualize data from EarthScope-related research in the Western US.
Software Quality Assurance and Verification for the MPACT Library Generation Process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Yuxuan; Williams, Mark L.; Wiarda, Dorothea
This report fulfills the requirements for the Consortium for the Advanced Simulation of Light-Water Reactors (CASL) milestone L2:RTM.P14.02, “SQA and Verification for MPACT Library Generation,” by documenting the current status of the software quality, verification, and acceptance testing of nuclear data libraries for MPACT. It provides a brief overview of the library generation process, from general-purpose evaluated nuclear data files (ENDF/B) to a problem-dependent cross section library for modeling of light-water reactors (LWRs). The software quality assurance (SQA) programs associated with each of the software packages used to generate the nuclear data libraries are discussed; specific tests within the SCALE/AMPX and VERA/XSTools repositories are described. The methods and associated tests to verify the quality of the library during the generation process are described in detail. The library generation process has been automated to a degree, to (1) ensure that it can be run without user intervention and (2) ensure that the library can be reproduced. Finally, the acceptance testing process that will be performed by representatives from the Radiation Transport Methods (RTM) Focus Area prior to the production library’s release is described in detail.
NASA Technical Reports Server (NTRS)
Pearson, Don; Hamm, Dustin; Kubena, Brian; Weaver, Jonathan K.
2010-01-01
An updated version of the Platform Independent Software Components for the Exploration of Space (PISCES) software library is available. A previous version was reported in Library for Developing Spacecraft-Mission-Planning Software (MSC-22983), NASA Tech Briefs, Vol. 25, No. 7 (July 2001), page 52. To recapitulate: This software provides for Web-based, collaborative development of computer programs for planning trajectories and trajectory-related aspects of spacecraft-mission design. The library was built using state-of-the-art object-oriented concepts and software-development methodologies. The components of PISCES include Java-language application programs arranged in a hierarchy of classes that facilitates the reuse of the components. As its full name suggests, the PISCES library affords platform independence: the Java language makes it possible to use the classes and application programs with a Java virtual machine, which is available in most Web-browser programs. Another advantage is expandability: object orientation facilitates expansion of the library through creation of a new class. Improvements in the library since the previous version include development of orbital-maneuver-planning and rendezvous-launch-window application programs, enhancement of capabilities for propagation of orbits, and development of a desktop user interface.
Genetic Constructor: An Online DNA Design Platform.
Bates, Maxwell; Lachoff, Joe; Meech, Duncan; Zulkower, Valentin; Moisy, Anaïs; Luo, Yisha; Tekotte, Hille; Franziska Scheitz, Cornelia Johanna; Khilari, Rupal; Mazzoldi, Florencio; Chandran, Deepak; Groban, Eli
2017-12-15
Genetic Constructor is a cloud Computer Aided Design (CAD) application developed to support synthetic biologists from design intent through DNA fabrication and experiment iteration. The platform allows users to design, manage, and navigate complex DNA constructs and libraries, using a new visual language that focuses on functional parts abstracted from sequence. Features like combinatorial libraries and automated primer design allow the user to separate design from construction by focusing on functional intent, and design constraints aid iterative refinement of designs. A plugin architecture enables contributions from scientists and coders to leverage existing powerful software and connect to DNA foundries. The software is easily accessible and platform agnostic, free for academics, and available in an open-source community edition. Genetic Constructor seeks to democratize DNA design, manufacture, and access to tools and services from the synthetic biology community.
A microkernel design for component-based parallel numerical software systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balay, S.
1999-01-13
What is the minimal software infrastructure and what type of conventions are needed to simplify development of sophisticated parallel numerical application codes using a variety of software components that are not necessarily available as source code? We propose an opaque object-based model where the objects are dynamically loadable from the file system or network. The microkernel required to manage such a system needs to include, at most: (1) a few basic services, namely a mechanism for loading objects at run time via dynamic link libraries, and consistent schemes for error handling and memory management; and (2) selected methods that all objects share, to deal with object life (destruction, reference counting, relationships), and object observation (viewing, profiling, tracing). We are experimenting with these ideas in the context of extensible numerical software within the ALICE (Advanced Large-scale Integrated Computational Environment) project, where we are building the microkernel to manage the interoperability among various tools for large-scale scientific simulations. This paper presents some preliminary observations and conclusions from our work with microkernel design.
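The central mechanism, loading opaque components at run time and relying only on an agreed interface, has a direct counterpart in Python's importlib, shown below as a hedged sketch; the module and class names in the commented usage are hypothetical.

```python
# Run-time loading of a component by name (analogous to dynamic link libraries in
# the microkernel design); only the agreed-upon interface is assumed, never the
# component's source.
import importlib

def load_component(module_name, class_name, *args, **kwargs):
    """Dynamically import a module and instantiate the named component class."""
    module = importlib.import_module(module_name)
    component_cls = getattr(module, class_name)
    return component_cls(*args, **kwargs)

# Hypothetical usage -- the "krylov_solvers" module is assumed to exist on the path:
# solver = load_component("krylov_solvers", "GMRES", tol=1e-8)
# x = solver.solve(A, b)
```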
New Support for the Research Process: Desktop Delivery of Microform Content
ERIC Educational Resources Information Center
Weare, William H., Jr.
2011-01-01
While trying to access microform content, patrons at the Christopher Center for Library and Information Resources at Valparaiso University were often hampered by unfamiliar equipment, temperamental software, and a puzzling file management system. In an effort to address these problems, the Access Services Department launched a pilot program for…
Programming biological models in Python using PySB.
Lopez, Carlos F; Muhlich, Jeremy L; Bachman, John A; Sorger, Peter K
2013-01-01
Mathematical equations are fundamental to modeling biological networks, but as networks get large and revisions frequent, it becomes difficult to manage equations directly or to combine previously developed models. Multiple simultaneous efforts to create graphical standards, rule-based languages, and integrated software workbenches aim to simplify biological modeling but none fully meets the need for transparent, extensible, and reusable models. In this paper we describe PySB, an approach in which models are not only created using programs, they are programs. PySB draws on programmatic modeling concepts from little b and ProMot, the rule-based languages BioNetGen and Kappa and the growing library of Python numerical tools. Central to PySB is a library of macros encoding familiar biochemical actions such as binding, catalysis, and polymerization, making it possible to use a high-level, action-oriented vocabulary to construct detailed models. As Python programs, PySB models leverage tools and practices from the open-source software community, substantially advancing our ability to distribute and manage the work of testing biochemical hypotheses. We illustrate these ideas using new and previously published models of apoptosis.
Programming biological models in Python using PySB
Lopez, Carlos F; Muhlich, Jeremy L; Bachman, John A; Sorger, Peter K
2013-01-01
Mathematical equations are fundamental to modeling biological networks, but as networks get large and revisions frequent, it becomes difficult to manage equations directly or to combine previously developed models. Multiple simultaneous efforts to create graphical standards, rule-based languages, and integrated software workbenches aim to simplify biological modeling but none fully meets the need for transparent, extensible, and reusable models. In this paper we describe PySB, an approach in which models are not only created using programs, they are programs. PySB draws on programmatic modeling concepts from little b and ProMot, the rule-based languages BioNetGen and Kappa and the growing library of Python numerical tools. Central to PySB is a library of macros encoding familiar biochemical actions such as binding, catalysis, and polymerization, making it possible to use a high-level, action-oriented vocabulary to construct detailed models. As Python programs, PySB models leverage tools and practices from the open-source software community, substantially advancing our ability to distribute and manage the work of testing biochemical hypotheses. We illustrate these ideas using new and previously published models of apoptosis. PMID:23423320
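A feel for "models as programs" is easiest to get from a few lines of PySB itself. The sketch below, a single reversible ligand-receptor binding step, follows the component syntax shown in PySB's published examples; exact operators and module layout may differ between PySB versions, so treat it as indicative rather than canonical.

```python
# Minimal PySB-style model: one reversible binding reaction written as a program.
# Syntax follows PySB's published examples; details may vary by version.
from pysb import Model, Monomer, Parameter, Initial, Rule, Observable

Model()                                  # components declared below self-register

Monomer('L', ['b'])                      # ligand with one binding site
Monomer('R', ['b'])                      # receptor with one binding site
Parameter('kf', 1e-3)
Parameter('kr', 1e-2)
Parameter('L_0', 100)
Parameter('R_0', 200)

Initial(L(b=None), L_0)
Initial(R(b=None), R_0)

# Free ligand + free receptor <-> bound complex
Rule('L_binds_R', L(b=None) + R(b=None) | L(b=1) % R(b=1), kf, kr)

Observable('LR_complex', L(b=1) % R(b=1))
```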
The Profiles in Science Digital Library: Behind the Scenes.
Gallagher, Marie E; Moffatt, Christie
2012-01-01
This demonstration shows the Profiles in Science ® digital library. Profiles in Science contains digitized selections from the personal manuscript collections of prominent biomedical researchers, medical practitioners, and those fostering science and health. The Profiles in Science Web site is the delivery mechanism for content derived from the digital library system. The system is designed according to our basic principles for digital library development [1]. The digital library includes the rules and software used for digitizing items, creating and editing database records and performing quality control as well as serving the digital content to the public. Among the types of data managed by the digital library are detailed item-level, collection-level and cross-collection metadata, digitized photographs, papers, audio clips, movies, born-digital electronic files, optical character recognized (OCR) text, and annotations (see Figure 1). The digital library also tracks the status of each item, including digitization quality, sensitivity of content, and copyright. Only items satisfying all required criteria are released to the public through the World Wide Web. External factors have influenced all aspects of the digital library's infrastructure.
Flexible server-side processing of climate archives
NASA Astrophysics Data System (ADS)
Juckes, Martin; Stephens, Ag; Damasio da Costa, Eduardo
2014-05-01
The flexibility and interoperability of OGC Web Processing Services are combined with an extensive range of data processing operations supported by the Climate Data Operators (CDO) library to facilitate processing of the CMIP5 climate data archive. The challenges posed by this peta-scale archive allow us to test and develop systems which will help us to deal with approaching exa-scale challenges. The CEDA WPS package allows users to manipulate data in the archive and export the results without first downloading the data -- in some cases this can drastically reduce the data volumes which need to be transferred and greatly reduce the time needed for the scientists to get their results. Reductions in data transfer are achieved at the expense of an additional computational load imposed on the archive (or near-archive) infrastructure. This is managed with a load balancing system. Short jobs may be run in near real-time, longer jobs will be queued. When jobs are queued the user is provided with a web dashboard displaying job status. A clean split between the data manipulation software and the request management software is achieved by exploiting the extensive CDO library. This library has a long history of development to support the needs of the climate science community. Use of the library ensures that operations run on data by the system can be reproduced by users using the same operators installed on their own computers. Examples using the system deployed for the CMIP5 archive will be shown and issues which need to be addressed as archive volumes expand into the exa-scale will be discussed.
Flexible server-side processing of climate archives
NASA Astrophysics Data System (ADS)
Juckes, M. N.; Stephens, A.; da Costa, E. D.
2013-12-01
The flexibility and interoperability of OGC Web Processing Services are combined with an extensive range of data processing operations supported by the Climate Data Operators (CDO) library to facilitate processing of the CMIP5 climate data archive. The challenges posed by this peta-scale archive allow us to test and develop systems which will help us to deal with approaching exa-scale challenges. The CEDA WPS package allows users to manipulate data in the archive and export the results without first downloading the data -- in some cases this can drastically reduce the data volumes which need to be transferred and greatly reduce the time needed for the scientists to get their results. Reductions in data transfer are achieved at the expense of an additional computational load imposed on the archive (or near-archive) infrastructure. This is managed with a load balancing system. Short jobs may be run in near real-time, longer jobs will be queued. When jobs are queued the user is provided with a web dashboard displaying job status. A clean split between the data manipulation software and the request management software is achieved by exploiting the extensive CDO library. This library has a long history of development to support the needs of the climate science community. Use of the library ensures that operations run on data by the system can be reproduced by users using the same operators installed on their own computers. Examples using the system deployed for the CMIP5 archive will be shown and issues which need to be addressed as archive volumes expand into the exa-scale will be discussed.
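Because the server-side operations are standard CDO operators, a user can reproduce them locally with the python-cdo bindings; this is an analogue of what the service runs, not the CEDA WPS interface itself, and it assumes the CDO command-line tools and the cdo Python package are installed (file names below are hypothetical).

```python
# Local reproduction of a typical server-side reduction using python-cdo
# (requires the CDO command-line tools; file names are hypothetical).
from cdo import Cdo

cdo = Cdo()

# Annual mean of near-surface air temperature, selecting the variable first so
# only the reduced field is written out -- the same data-reduction motivation
# that drives processing close to the archive.
cdo.yearmean(input="-selname,tas tas_Amon_example_rcp45.nc",
             output="tas_yearmean.nc")

print(cdo.sinfov(input="tas_yearmean.nc"))   # summary of the reduced file
```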
The MGDO software library for data analysis in Ge neutrinoless double-beta decay experiments
NASA Astrophysics Data System (ADS)
Agostini, M.; Detwiler, J. A.; Finnerty, P.; Kröninger, K.; Lenz, D.; Liu, J.; Marino, M. G.; Martin, R.; Nguyen, K. D.; Pandola, L.; Schubert, A. G.; Volynets, O.; Zavarise, P.
2012-07-01
The Gerda and Majorana experiments will search for neutrinoless double-beta decay of 76Ge using isotopically enriched high-purity germanium detectors. Although the experiments differ in conceptual design, they share many aspects in common, and in particular will employ similar data analysis techniques. The collaborations are jointly developing a C++ software library, MGDO, which contains a set of data objects and interfaces to encapsulate, store and manage physical quantities of interest, such as waveforms and high-purity germanium detector geometries. These data objects define a common format for persistent data, whether it is generated by Monte Carlo simulations or an experimental apparatus, to reduce code duplication and to ease the exchange of information between detector systems. MGDO also includes general-purpose analysis tools that can be used for the processing of measured or simulated digital signals. The MGDO design is based on the Object-Oriented programming paradigm and is very flexible, allowing for easy extension and customization of the components. The tools provided by the MGDO libraries are used by both Gerda and Majorana.
Information Metacatalog for a Grid
NASA Technical Reports Server (NTRS)
Kolano, Paul
2007-01-01
SWIM is a Software Information Metacatalog that gathers detailed information about the software components and packages installed on a grid resource. Information is currently gathered for Executable and Linking Format (ELF) executables and shared libraries, Java classes, shell scripts, and Perl and Python modules. SWIM is built on top of the POUR framework, which is described in the preceding article. SWIM consists of a set of Perl modules for extracting software information from a system, an XML schema defining the format of data that can be added by users, and a POUR XML configuration file that describes how these elements are used to generate periodic, on-demand, and user-specified information. Periodic software information is derived mainly from the package managers used on each system. SWIM collects information from native package managers in FreeBSD, Solaris, and IRIX as well as the RPM, Perl, and Python package managers on multiple platforms. Because not all software is available or installed in package form, SWIM also crawls the set of relevant paths from the File System Hierarchy Standard that defines the standard file system structure used by all major UNIX distributions. Using these two techniques, the vast majority of software installed on a system can be located. SWIM computes the same information gathered by the periodic routines for specific files on specific hosts, and locates software on a system given only its name and type.
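SWIM's periodic gathering is implemented in Perl against native package managers, but the flavour of the crawl is easy to convey in Python: enumerate what a package manager knows about, then sweep standard binary directories for anything installed outside a package. The sketch below is a conceptual analogue, not SWIM code.

```python
# Conceptual analogue of a software-information crawl (not SWIM itself):
# query one package manager (Python's) and scan standard executable directories.
import os
from importlib import metadata

def installed_python_packages():
    """(name, version) pairs for every installed Python distribution."""
    return sorted((d.metadata["Name"] or "", d.version)
                  for d in metadata.distributions())

def executables_in(paths=("/usr/bin", "/usr/local/bin")):
    """Executable files found by crawling standard FHS binary directories."""
    found = []
    for path in paths:
        if not os.path.isdir(path):
            continue
        for entry in os.listdir(path):
            full = os.path.join(path, entry)
            if os.path.isfile(full) and os.access(full, os.X_OK):
                found.append(full)
    return found

print(installed_python_packages()[:5])
print(len(executables_in()), "executables found on standard paths")
```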
Graphics Software For VT Terminals
NASA Technical Reports Server (NTRS)
Wang, Caroline
1991-01-01
VTGRAPH graphics software tool for DEC/VT computer terminal or terminals compatible with it, widely used by government and industry. Callable in FORTRAN or C language, library program enabling user to cope with many computer environments in which VT terminals used for window management and graphic systems. Provides PLOT10-like package plus color or shade capability for VT240, VT241, and VT300 terminals. User can easily design more-friendly user-interface programs and design PLOT10 programs on VT terminals with different computer systems. Requires ReGis graphics set terminal and FORTRAN compiler.
Ground Operations Autonomous Control and Integrated Health Management
NASA Technical Reports Server (NTRS)
Daniels, James
2014-01-01
The Ground Operations Autonomous Control and Integrated Health Management plays a key role for future ground operations at NASA. The software integrated into this system is Gensym's G2 2011. The purpose of this report is to describe the Ground Operations Autonomous Control and Integrated Health Management with the use of the G2 Gensym software and the G2 NASA toolkit for Integrated System Health Management (ISHM), which is a Computer Software Configuration Item (CSCI). The decision rationale for the use of the G2 platform is to develop a modular capability for ISHM and autonomous control (AC). Toolkit modules include knowledge bases that are generic and can be applied in any application domain module. This maximizes reusability, maintainability, systematic evolution, portability, and scalability. Engine modules are generic, while application modules represent the domain model of a specific application. Furthermore, the NASA toolkit, a set of modules developed since 2006, makes it possible to create application domain models quickly, using pre-defined objects that include sensor and component libraries for typical fluid, electrical, and mechanical systems.
A Robust and Scalable Software Library for Parallel Adaptive Refinement on Unstructured Meshes
NASA Technical Reports Server (NTRS)
Lou, John Z.; Norton, Charles D.; Cwik, Thomas A.
1999-01-01
The design and implementation of Pyramid, a software library for performing parallel adaptive mesh refinement (PAMR) on unstructured meshes, is described. This software library can be easily used in a variety of unstructured parallel computational applications, including parallel finite element, parallel finite volume, and parallel visualization applications using triangular or tetrahedral meshes. The library contains a suite of well-designed and efficiently implemented modules that perform operations in a typical PAMR process. Among these are mesh quality control during successive parallel adaptive refinement (typically guided by a local-error estimator), parallel load-balancing, and parallel mesh partitioning using the ParMeTiS partitioner. The Pyramid library is implemented in Fortran 90 with an interface to the Message-Passing Interface (MPI) library, supporting code efficiency, modularity, and portability. An EM waveguide filter application, adaptively refined using the Pyramid library, is illustrated.
Efficient methods and readily customizable libraries for managing complexity of large networks.
Dogrusoz, Ugur; Karacelik, Alper; Safarli, Ilkin; Balci, Hasan; Dervishi, Leonard; Siper, Metin Can
2018-01-01
One common problem in visualizing real-life networks, including biological pathways, is the large size of these networks. Often, users find themselves facing slow, non-scaling operations due to network size, if not a "hairball" network, hindering effective analysis. One extremely useful method for reducing the complexity of large networks is the use of hierarchical clustering and nesting, applying expand-collapse operations on demand during analysis. Another such method is hiding currently unnecessary details, to be gradually revealed later on demand. Major challenges when applying complexity reduction operations on large networks include efficiency and maintaining the user's mental map of the drawing. We developed specialized incremental layout methods for preserving a user's mental map while managing the complexity of large networks through expand-collapse and hide-show operations. We also developed open-source JavaScript libraries as plug-ins to the web-based graph visualization library named Cytoscape.js to implement these methods as complexity management operations. Through efficient specialized algorithms provided by these extensions, one can collapse or hide desired parts of a network, yielding potentially much smaller networks and making them more suitable for interactive visual analysis. This work fills an important gap by making efficient implementations of some already known complexity management techniques freely available to tool developers through a couple of open-source, customizable software libraries, and by introducing some heuristics which can be applied alongside such complexity management techniques to ensure that users' mental maps are preserved.
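The libraries described here are JavaScript plug-ins for Cytoscape.js, but the collapse operation itself is easy to picture: every member of a cluster is contracted into one representative node while edges to the rest of the network are preserved. The Python sketch below uses networkx purely as a conceptual analogue.

```python
# Conceptual analogue of an expand-collapse operation (the real implementation is
# a Cytoscape.js plug-in in JavaScript): contract a cluster into one node.
import networkx as nx

G = nx.Graph()
G.add_edges_from([("a1", "a2"), ("a2", "a3"),      # intra-cluster edges
                  ("a1", "b1"), ("a3", "b2"),      # edges leaving the cluster
                  ("b1", "b2")])
cluster = ["a1", "a2", "a3"]                       # nodes of one compound/cluster

collapsed = G
for node in cluster[1:]:
    # Merge each member into the representative node "a1", dropping self-loops.
    collapsed = nx.contracted_nodes(collapsed, "a1", node, self_loops=False)

print(sorted(collapsed.nodes()))   # ['a1', 'b1', 'b2']
print(sorted(collapsed.edges()))   # cluster's external edges now attach to 'a1'
```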
Digital Library Storage using iRODS Data Grids
NASA Astrophysics Data System (ADS)
Hedges, Mark; Blanke, Tobias; Hasan, Adil
Digital repository software provides a powerful and flexible infrastructure for managing and delivering complex digital resources and metadata. However, issues can arise in managing the very large, distributed data files that may constitute these resources. This paper describes an implementation approach that combines the Fedora digital repository software with a storage layer implemented as a data grid, using the iRODS middleware developed by DICE (Data Intensive Cyber Environments) as the successor to SRB. This approach allows us to use Fedora's flexible architecture to manage the structure of resources and to provide application-layer services to users. The grid-based storage layer provides efficient support for managing and processing the underlying distributed data objects, which may be very large (e.g. audio-visual material). The Rule Engine built into iRODS is used to integrate complex workflows at the data level that need not be visible to users, e.g. digital preservation functionality.
2010-01-01
Background: Suppression subtractive hybridization is a popular technique for gene discovery from non-model organisms without an annotated genome sequence, such as cowpea (Vigna unguiculata (L.) Walp). We aimed to use this method to enrich for genes expressed during drought stress in a drought tolerant cowpea line. However, current methods were inefficient in screening libraries and management of the sequence data, and thus there was a need to develop software tools to facilitate the process.
Results: Forward and reverse cDNA libraries enriched for cowpea drought response genes were screened on microarrays, and the R software package SSHscreen 2.0.1 was developed (i) to normalize the data effectively using spike-in control spot normalization, and (ii) to select clones for sequencing based on the calculation of enrichment ratios with associated statistics. Enrichment ratio 3 values for each clone showed that 62% of the forward library and 34% of the reverse library clones were significantly differentially expressed by drought stress (adjusted p value < 0.05). Enrichment ratio 2 calculations showed that > 88% of the clones in both libraries were derived from rare transcripts in the original tester samples, thus supporting the notion that suppression subtractive hybridization enriches for rare transcripts. A set of 118 clones were chosen for sequencing, and drought-induced cowpea genes were identified, the most interesting encoding a late embryogenesis abundant Lea5 protein, a glutathione S-transferase, a thaumatin, a universal stress protein, and a wound induced protein. A lipid transfer protein and several components of photosynthesis were down-regulated by the drought stress. Reverse transcriptase quantitative PCR confirmed the enrichment ratio values for the selected cowpea genes. SSHdb, a web-accessible database, was developed to manage the clone sequences and combine the SSHscreen data with sequence annotations derived from BLAST and Blast2GO. The self-BLAST function within SSHdb grouped redundant clones together and illustrated that the SSHscreen plots are a useful tool for choosing anonymous clones for sequencing, since redundant clones cluster together on the enrichment ratio plots.
Conclusions: We developed the SSHscreen-SSHdb software pipeline, which greatly facilitates gene discovery using suppression subtractive hybridization by improving the selection of clones for sequencing after screening the library on a small number of microarrays. Annotation of the sequence information and collaboration was further enhanced through a web-based SSHdb database, and we illustrated this through identification of drought responsive genes from cowpea, which can now be investigated in gene function studies. SSH is a popular and powerful gene discovery tool, and therefore this pipeline will have application for gene discovery in any biological system, particularly non-model organisms. SSHscreen 2.0.1 and a link to SSHdb are available from http://microarray.up.ac.za/SSHscreen. PMID:20359330
Copyright: Know Your Electronic Rights!
ERIC Educational Resources Information Center
Valauskas, Edward J.
1992-01-01
Defines copyright and examines the interests of computer software publishers. Issues related to the rights of libraries in the circulation of software are discussed, including the fair use principle, software vendors' licensing agreements, and cooperation between libraries and vendors. An inset describes procedures for internal auditing of…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-07
...., Austin Connectivity Software Group, Austin, TX: July 19, 2008. TA-W-72,913: McNulty Hicken Smith, Inc... Corporation, Greensboro, NC. TA-W-72,926: Freescale Semiconductor, Inc., Libraries Group, Austin, TX. TA-W-73... Resources, Fort Worth, TX: January 12, 2009. TA-W-73,315: Apria Healthcare, Southeast Private Pay Management...
Apples in the Apple Library--How One Library Took a Byte.
ERIC Educational Resources Information Center
Ertel, Monica
1983-01-01
Summarizes automation of a specialized library at Apple Computer, Inc., describing software packages chosen for the following functions: word processing/text editing; cataloging and circulation; reference; and in-house databases. Examples of each function and additional sources of information on software and equipment mentioned in the article are…
A Multi-User Microcomputer System for Small Libraries.
ERIC Educational Resources Information Center
Leggate, Peter
1988-01-01
Describes the development of Bookshelf, a multi-user microcomputer system for small libraries that uses an integrated software package. The discussion covers the design parameters of the package, which were based on a survey of seven small libraries, and some characteristics of the software. (three notes with references) (CLB)
Chen, Yi-Bu; Chattopadhyay, Ansuman; Bergen, Phillip; Gadd, Cynthia; Tannery, Nancy
2007-01-01
To bridge the gap between the rising information needs of biological and medical researchers and the rapidly growing number of online bioinformatics resources, we have created the Online Bioinformatics Resources Collection (OBRC) at the Health Sciences Library System (HSLS) at the University of Pittsburgh. The OBRC, containing 1542 major online bioinformatics databases and software tools, was constructed using the HSLS content management system built on the Zope Web application server. To enhance the output of search results, we further implemented the Vivísimo Clustering Engine, which automatically organizes the search results into categories created dynamically based on the textual information of the retrieved records. As the largest online collection of its kind and the only one with advanced search results clustering, OBRC is aimed at becoming a one-stop guided information gateway to the major bioinformatics databases and software tools on the Web. OBRC is available at the University of Pittsburgh's HSLS Web site (http://www.hsls.pitt.edu/guides/genetics/obrc).
FROMS3D: New Software for 3-D Visualization of Fracture Network System in Fractured Rock Masses
NASA Astrophysics Data System (ADS)
Noh, Y. H.; Um, J. G.; Choi, Y.
2014-12-01
New software (FROMS3D) is presented to visualize fracture network systems in 3-D. The software consists of several modules that play roles in the management of borehole and field fracture data, fracture network modelling, visualization of fracture geometry in 3-D, and calculation and visualization of intersections and equivalent pipes between fractures. Intel Parallel Studio XE 2013, Visual Studio.NET 2010 and the open-source VTK library were utilized as development tools to efficiently implement the modules and the graphical user interface of the software. The results suggest that the developed software is effective in visualizing 3-D fracture network systems, and can provide useful information to tackle the engineering geological problems related to strength, deformability and hydraulic behaviors of fractured rock masses.
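VTK, the rendering library behind FROMS3D, also ships Python bindings; the hedged sketch below (not FROMS3D code) renders a single circular disc of the kind commonly used to represent a fracture plane in 3-D.

```python
# Minimal VTK (Python bindings) scene: one disc standing in for a fracture plane.
import vtk

disc = vtk.vtkDiskSource()
disc.SetInnerRadius(0.0)                  # solid disc
disc.SetOuterRadius(1.0)
disc.SetCircumferentialResolution(60)

mapper = vtk.vtkPolyDataMapper()
mapper.SetInputConnection(disc.GetOutputPort())

actor = vtk.vtkActor()
actor.SetMapper(mapper)
actor.RotateX(30)                         # give the fracture plane an orientation

renderer = vtk.vtkRenderer()
renderer.AddActor(actor)

window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)
interactor = vtk.vtkRenderWindowInteractor()
interactor.SetRenderWindow(window)

window.Render()
interactor.Start()
```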
One library's experience with review and selection of chat software for reference.
Behm, Leslie M
2003-01-01
When Michigan State University (MSU) Libraries decided to make the foray into virtual reference, the first thing that needed to be done was to decide on the software to use. This article discusses the process used including the items considered essential (deal-breakers) for software to make the first cut, what other features needed to be included, and what features would be useful but were not critical. A literature review of some useful current articles on virtual reference is included. The vendor and software ultimately selected was not one of the original vendors; how MSU Libraries was able to evaluate and select Docutek is presented. A matrix for software comparison is included in the appendix.
ERIC Educational Resources Information Center
Kahle, Brewster; Prelinger, Rick; Jackson, Mary E.; Boyack, Kevin W.; Wylie, Brian N.; Davidson, George S.; Witten, Ian H.; Bainbridge, David; Boddie, Stefan J.; Garrison, William A.; Cunningham, Sally Jo; Borgman, Christine L.; Hessel, Heather
2001-01-01
These six articles discuss various issues relating to digital libraries. Highlights include public access to digital materials; intellectual property concerns; the need for collaboration across disciplines; Greenstone software for construction and presentation of digital information collections; the Colorado Digitization Project; and conferences…
Digital Preservation in Open-Source Digital Library Software
ERIC Educational Resources Information Center
Madalli, Devika P.; Barve, Sunita; Amin, Saiful
2012-01-01
Digital archives and digital library projects are being initiated all over the world for materials of different formats and domains. To organize, store, and retrieve digital content, many libraries as well as archiving centers are using either proprietary or open-source software. While it is accepted that print media can survive for centuries with…
The Elusive Cost of Library Software
ERIC Educational Resources Information Center
Breeding, Marshall
2009-01-01
Software pricing is not a straightforward issue, since each procurement involves a special business arrangement between a library and its chosen vendor. The author thinks that it is reasonable to scale the cost of a product to such factors as the size of the library, the complexity of the installation, the number of simultaneous users, or the…
Point Cloud Management Through the Realization of the Intelligent Cloud Viewer Software
NASA Astrophysics Data System (ADS)
Costantino, D.; Angelini, M. G.; Settembrini, F.
2017-05-01
The paper presents software dedicated to the processing of point clouds, called Intelligent Cloud Viewer (ICV), developed in-house by AESEI software (a spin-off of Politecnico di Bari), which allows point clouds of several tens of millions of points to be viewed even on systems that are not very high performance. Operations are carried out on the whole point cloud, while only part of it is displayed in order to speed up rendering. It is designed for 64-bit Windows, is fully written in C++, and integrates different specialized modules for computer graphics (Open Inventor by SGI, Silicon Graphics Inc), maths (BLAS, EIGEN), computational geometry (CGAL, Computational Geometry Algorithms Library), registration and advanced algorithms for point clouds (PCL, Point Cloud Library), advanced data structures (BOOST, Basic Object Oriented Supporting Tools), etc. ICV incorporates a number of features such as cropping, transformation and georeferencing, matching, registration, decimation, sections, and distance calculation between clouds. It has been tested on photographic and TLS (Terrestrial Laser Scanner) data, obtaining satisfactory results. The potential of the software has been tested by carrying out a photogrammetric survey of Castel del Monte, for which a previous ground-based laser scanner survey by the same authors was already available. A flight height of approximately 1000 ft AGL (Above Ground Level) was adopted for the aerial photogrammetric survey; overall, more than 800 photos were acquired in just over 15 minutes, with coverage of not less than 80% at a planned speed of about 90 knots.
AdaNET prototype library administration manual
NASA Technical Reports Server (NTRS)
Hanley, Lionel
1989-01-01
The functions of the AdaNET Prototype Library of Reusable Software Parts are described. Adapted from the Navy Research Laboratory's Reusability Guidebook (V.5.0), this is a working document, customized for use by the AdaNET Project. Within this document, the term part is used to denote the smallest unit controlled by a library and retrievable from it. A part may have several constituents, which may not be individually tracked. Presented are the types of parts which may be stored in the library and the relationships among those parts; a concept of trust indicators, which provide measures of confidence that a user may reasonably place in a previously developed part when applying it to a new application; search and retrieval, configuration management, and communications among those who interact with the AdaNET Prototype Library; and the AdaNET Prototype, described from the perspective of its three major users: the part reuser and retriever, the part submitter, and the librarian and/or administrator.
New software for 3D fracture network analysis and visualization
NASA Astrophysics Data System (ADS)
Song, J.; Noh, Y.; Choi, Y.; Um, J.; Hwang, S.
2013-12-01
This study presents new software to perform analysis and visualization of fracture network systems in 3D. The software modules for analysis and visualization, such as BOUNDARY, DISK3D, FNTWK3D, CSECT and BDM, were developed using Microsoft Visual Basic.NET and the Visualization ToolKit (VTK) open-source library. Two case studies revealed that these modules play roles in construction of the analysis domain, visualization of fracture geometry in 3D, calculation of equivalent pipes, production of cross-section maps and management of borehole data, respectively. The developed software for analysis and visualization of the 3D fractured rock mass can be used to tackle geomechanical problems related to strength, deformability and hydraulic behaviors of fractured rock masses.
Spatial Data Management System (SDMS)
NASA Technical Reports Server (NTRS)
Hutchison, Mark W.
1994-01-01
The Spatial Data Management System (SDMS) is a testbed for retrieval and display of spatially related material. SDMS permits the linkage of large graphical display objects with detail displays and explanations of its smaller components. SDMS combines UNIX workstations, MIT's X Window system, TCP/IP and WAIS information retrieval technology to prototype a means of associating aggregate data linked via spatial orientation. SDMS capitalizes upon and extends previous accomplishments of the Software Technology Branch in the area of Virtual Reality and Automated Library Systems.
The Computational Infrastructure for Geodynamics as a Community of Practice
NASA Astrophysics Data System (ADS)
Hwang, L.; Kellogg, L. H.
2016-12-01
Computational Infrastructure for Geodynamics (CIG), geodynamics.org, originated in 2005 out of community recognition that the effort of individual researchers or small groups to develop scientifically sound software is impossible to sustain, duplicates effort, and makes it difficult for scientists to adopt state-of-the-art computational methods that promote new discovery. As a community of practice, participants in CIG share an interest in computational modeling in geodynamics and work together on open source software to build the capacity to support complex, extensible, scalable, interoperable, reliable, and reusable software in an effort to increase the return on investment in scientific software development and increase the quality of the resulting software. The group interacts regularly to learn from each other and better their practices, formally through webinar series, workshops, and tutorials and informally through listservs and hackathons. Over the past decade, we have learned that successful scientific software development requires at a minimum: collaboration between domain-expert researchers, software developers and computational scientists; clearly identified and committed lead developer(s); well-defined scientific and computational goals that are regularly evaluated and updated; well-defined benchmarks and testing throughout development; attention throughout development to usability and extensibility; understanding and evaluation of the complexity of dependent libraries; and managed user expectations through education, training, and support. CIG's code donation standards provide the basis for recently formalized best practices in software development (geodynamics.org/cig/dev/best-practices/). Best practices include use of version control; widely used, open source software libraries; extensive test suites; portable configuration and build systems; extensive documentation internal and external to the code; and structured, human-readable input formats.
A basic analysis toolkit for biological sequences
Giancarlo, Raffaele; Siragusa, Alessandro; Siragusa, Enrico; Utro, Filippo
2007-01-01
This paper presents a software library, nicknamed BATS, for some basic sequence analysis tasks: namely, local alignments via approximate string matching, and global alignments via longest common subsequence and alignments with affine and concave gap cost functions. Moreover, it also supports filtering operations to select strings from a set and establish their statistical significance via z-score computation. None of the algorithms is new, but although they are generally regarded as fundamental for sequence analysis, they have not previously been implemented in a single and consistent software package, as we do here. Therefore, our main contribution is to fill this gap between algorithmic theory and practice by providing an extensible and easy to use software library that includes algorithms for the mentioned string matching and alignment problems. The library consists of C/C++ library functions as well as Perl library functions. It can be interfaced with Bioperl and can also be used as a stand-alone system with a GUI. The software is available under the GNU GPL. PMID:17877802
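As a small worked example of one of the basic tasks BATS covers, here is a textbook dynamic-programming computation of the longest common subsequence length in Python; it is an illustration only and does not use the BATS C/C++ or Perl interface.

```python
def lcs_length(a: str, b: str) -> int:
    """Length of the longest common subsequence of two sequences,
    computed with the standard O(len(a) * len(b)) dynamic program."""
    prev = [0] * (len(b) + 1)
    for ch_a in a:
        curr = [0] * (len(b) + 1)
        for j, ch_b in enumerate(b, start=1):
            if ch_a == ch_b:
                curr[j] = prev[j - 1] + 1          # extend a match
            else:
                curr[j] = max(prev[j], curr[j - 1])  # skip a symbol
        prev = curr
    return prev[len(b)]

if __name__ == "__main__":
    print(lcs_length("ACCGGTCGAGTG", "GTCGTTCGGAATGCC"))
```

The same row-by-row recurrence underlies global alignment; gap cost functions change only how the "skip" transitions are scored.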
DOCLIB: a software library for document processing
NASA Astrophysics Data System (ADS)
Jaeger, Stefan; Zhu, Guangyu; Doermann, David; Chen, Kevin; Sampat, Summit
2006-01-01
Most researchers would agree that research in the field of document processing can benefit tremendously from a common software library through which institutions are able to develop and share research-related software and applications across academic, business, and government domains. However, despite several attempts in the past, the research community still lacks a widely-accepted standard software library for document processing. This paper describes a new library called DOCLIB, which tries to overcome the drawbacks of earlier approaches. Many of DOCLIB's features are unique either in themselves or in their combination with others, e.g. the factory concept for support of different image types, the juxtaposition of image data and metadata, or the add-on mechanism. We cherish the hope that DOCLIB serves the needs of researchers better than previous approaches and will readily be accepted by a larger group of scientists.
NASA Astrophysics Data System (ADS)
Oliveira, Micael
The CECAM Electronic Structure Library (ESL) is a community-driven effort to segregate shared pieces of software into libraries that can be contributed and used by the community. Besides allowing the community to share the burden of developing and maintaining complex pieces of software, these libraries can also become targets for re-coding by software engineers as hardware evolves, ensuring that electronic structure codes remain at the forefront of HPC trends. In a series of workshops hosted at the CECAM HQ in Lausanne, the tools and infrastructure for the project were prepared, and the first contributions were included and made available online (http://esl.cecam.org). In this talk I will present the different aspects and aims of the ESL and how these can be useful for the electronic structure community.
INTERIM -- Starlink Software Environment
NASA Astrophysics Data System (ADS)
Pearce, Dave; Pavelin, Cliff; Lawden, M. D.
Early versions of this paper were based on a number of other papers produced at a very early stage of the Starlink project. They contained a description of a specific implementation of a subroutine library, speculations on the desirable attributes of a software environment, and future development plans. They reflected the experimental nature of the Starlink software environment at that time. Since then, the situation has changed. The implemented subroutine library, INTERIM_DIR:INTERIM.OLB, is now a well established and widely used piece of software. A completely new Starlink software environment (ADAM) has been developed and distributed. Thus the library released in 1980 as `STARLINK' and now called `INTERIM' has reached the end of its development cycle and is now frozen in its current state, apart from bug corrections. This paper has, therefore, been completely rewritten and restructured to reflect the new situation. Its aim is to describe the facilities of the INTERIM subroutine library as clearly and concisely as possible. It avoids speculation, discussion of design decisions, and announcements of future plans.
Experimental research control software system
NASA Astrophysics Data System (ADS)
Cohn, I. A.; Kovalenko, A. G.; Vystavkin, A. N.
2014-05-01
A software system, intended for the automation of small-scale research, has been developed. The software allows one to control equipment and to acquire and process data by means of simple scripts. The main purpose of the development is to make experiment automation easier, thus significantly reducing the effort of automating an experimental setup. In particular, minimal programming skills are required and the scripts are easy for supervisors to review. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, control is performed by an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library performs interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework is provided for fast implementation of new software and hardware interfaces. While the software is in continuous development with new features being implemented, it is already used in our laboratory for automation of helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU General Public License.
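The abstract does not show the scripting interface itself; the sketch below is a purely hypothetical illustration of how an imperative measurement script and a modular driver registry might fit together. All names here are invented for illustration and are not part of the described system.

```python
# Hypothetical sketch: a tiny driver registry plus an imperative measurement script.
import time

class DriverRegistry:
    """Maps instrument names to driver objects so scripts stay hardware-agnostic."""
    def __init__(self):
        self._drivers = {}

    def register(self, name, driver):
        self._drivers[name] = driver

    def get(self, name):
        return self._drivers[name]

class FakeThermometer:
    """Stand-in driver; a real one would talk to GPIB/serial/Ethernet hardware."""
    def read_kelvin(self):
        return 0.3  # pretend we are at the helium-3 stage

registry = DriverRegistry()
registry.register("cryostat_thermometer", FakeThermometer())

# The "script" part: short, imperative, and easy for a supervisor to review.
thermometer = registry.get("cryostat_thermometer")
for _ in range(3):
    print("T =", thermometer.read_kelvin(), "K")
    time.sleep(0.1)
```

The appeal of the imperative style is exactly this: the measurement logic reads top to bottom, with the hardware details hidden behind the driver layer.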
Software Library for Bruker TopSpin NMR Data Files
DOE Office of Scientific and Technical Information (OSTI.GOV)
A software library for parsing and manipulating frequency-domain data files that have been processed using the Bruker TopSpin NMR software package. In the context of NMR, the term "processed" indicates that the end-user of the Bruker TopSpin NMR software package has (a) Fourier transformed the raw, time-domain data (the Free Induction Decay) into the frequency-domain and (b) has extracted the list of NMR peaks.
Software for Real-Time Analysis of Subsonic Test Shot Accuracy
2014-03-01
used the C++ programming language, the Open Source Computer Vision (OpenCV®) software library, and Microsoft Windows® Application Programming... video for comparison through OpenCV image analysis tools. Based on the comparison, the software then computed the coordinates of each shot relative to... DWB researchers wanted to use the Open Source Computer Vision (OpenCV) software library for capturing and analyzing frames of video. OpenCV contains
Software ``Best'' Practices: Agile Deconstructed
NASA Astrophysics Data System (ADS)
Fraser, Steven
Software “best” practices depend entirely on context - in terms of the problem domain, the system constructed, the software designers, and the “customers” ultimately deriving value from the system. Agile practices no longer have the luxury of “choosing” small non-mission critical projects with co-located teams. Project stakeholders are selecting and adapting practices based on a combination of interest, need and staffing. For example, growing product portfolios through a merger or the acquisition of a company exposes legacy systems to new staff, new software integration challenges, and new ideas. Innovation in communications (tools and processes) to span the growth and contraction of both information and organizations, while managing the adoption of changing software practices, is imperative for success. Traditional web-based tools such as web pages, document libraries, and forums are not sufficient. A blend of tweeting, blogs, wikis, instant messaging, web-based conferencing, and telepresence creates a new dimension of communication “best” practices.
Student project of optical system analysis API-library development
NASA Astrophysics Data System (ADS)
Ivanova, Tatiana; Zhukova, Tatiana; Dantcaranov, Ruslan; Romanova, Maria; Zhadin, Alexander; Ivanov, Vyacheslav; Kalinkina, Olga
2017-08-01
In this paper the API library developed by students of the Applied and Computer Optics Department (ITMO University) for optical system design is presented. The library performs paraxial and real ray tracing, calculates third-order (Seidel) aberrations and real ray aberrations of axial and off-axis beams (wave, lateral, longitudinal, coma, distortion, etc.) and, finally, approximates the wave aberration by Zernike polynomials. The real aperture can be determined by detecting the failure of real ray tracing on each surface. So far we assume that the optical system is centered, with spherical or second-order aspherical surfaces. Optical glasses can be specified directly by refractive index or by dispersion coefficients. The library can be used for education or research purposes in the area of optical system design. It provides ready-to-use software functions for optical system simulation and analysis that developers can simply plug into their own software for different purposes, for example for specific synthesis tasks or for the investigation of new optimization modes. In the paper we present an example of using the library for the development of cemented doublet synthesis software based on Slusarev's methodology. The library is also used in a course on optical system optimization recipes for deeper study of the optimization model and its application to optical system design. Development of such software is an excellent experience for students and helps them understand optical image modeling and quality analysis. The development is organized as a joint student group project. We try to organize the group as in a real research and development project, so each student has his or her own role and then uses the whole library functionality in his or her own master's or bachelor's thesis. Working in such a group gives students useful experience and the opportunity to work as research and development engineers of scientific software in the future.
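As a worked example of the paraxial tracing such a library performs, the textbook refraction and transfer equations (n'u' = nu - y(n' - n)/R, then y_next = y + u'd) can be applied surface by surface. The sketch below is a generic illustration and not the students' actual API.

```python
def paraxial_trace(surfaces, y0=1.0, u0=0.0):
    """Trace a paraxial ray through a centered optical system.

    surfaces: list of (R, n_before, n_after, thickness_after) tuples, where
    R is the surface radius of curvature (inf = flat), n_* are refractive
    indices, and thickness_after is the distance to the next surface.
    Returns the final ray height and angle."""
    y, u = y0, u0
    for R, n, n_prime, t in surfaces:
        power = (n_prime - n) / R if R != float("inf") else 0.0
        u = (n * u - y * power) / n_prime   # paraxial refraction at the surface
        y = y + u * t                       # transfer to the next surface
    return y, u

if __name__ == "__main__":
    # Thin biconvex singlet in air (illustrative numbers only, metres)
    system = [
        (0.050, 1.0, 1.5168, 0.005),    # front surface, BK7-like glass
        (-0.050, 1.5168, 1.0, 0.045),   # back surface, then 45 mm to image plane
    ]
    print(paraxial_trace(system))
```

Seidel sums and Zernike fitting build on exactly these per-surface quantities, which is why a paraxial trace is usually the first function implemented in such a library.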
Delivering customized apps, multimedia and NASA data to libraries and their patrons
NASA Astrophysics Data System (ADS)
Harold, J. B.; Dusenbery, P.; Holland, A.; LaConte, K.; Johnson, A.; Randall, C.; Fitzhugh, G.
2017-12-01
With funding through NASA's Science Mission Directorate, the NASA @ My Library project has been working with the public library community to enhance the STEM literacy of library patrons throughout the nation. One element of the project is to disseminate a variety of materials to 75 partner libraries in order to support their implementation of hands-on, NASA related STEM activities for their patrons. These materials range from very low tech (UV beads) to high tech (a 4.5" Orion Dobsonian telescope), and include an 8" tablet. This tablet provides us with a wide range of possibilities for delivering NASA content. Besides NASA multimedia and real-time spacecraft data, the tablets can be used for interactive activities, including public apps as well as apps specifically designed for this program, such as a green screen app that incorporates NASA imagery as part of a larger storytelling activity. The tablets also include a full sensor suite (magnetometer, light sensor, accelerometer, etc.), allowing us to develop library activities that use the tablet as a measuring device - detecting magnetism in a "Meteorite or Wrong" activity, or using the light sensor as a transit device. The tablet is centrally managed and includes a "kiosk mode", allowing the libraries to use it in either a locked down or conventional mode. The management system also allows us to create a curated collection of apps and multimedia, push out updated software, and collect analytics data on how the tablet is being used. In this presentation we will discuss the library pre-survey that guided our tablet development process, as well as our lessons learned to date, including the practicality and effectiveness of deploying tablets at this scale, their ability to support NASA specific STEM efforts, and what we have learned about library usage.
Investigating interoperability of the LSST data management software stack with Astropy
NASA Astrophysics Data System (ADS)
Jenness, Tim; Bosch, James; Owen, Russell; Parejko, John; Sick, Jonathan; Swinbank, John; de Val-Borro, Miguel; Dubois-Felsmann, Gregory; Lim, K.-T.; Lupton, Robert H.; Schellart, Pim; Krughoff, K. S.; Tollerud, Erik J.
2016-07-01
The Large Synoptic Survey Telescope (LSST) will be an 8.4m optical survey telescope sited in Chile and capable of imaging the entire sky twice a week. The data rate of approximately 15TB per night and the requirements both to issue alerts on transient sources within 60 seconds of observing and to create annual data releases mean that automated data management systems and data processing pipelines are a key deliverable of the LSST construction project. The LSST data management software has been in development since 2004 and is based on a C++ core with a Python control layer. The software consists of nearly a quarter of a million lines of code covering the system from fundamental WCS and table libraries to pipeline environments and distributed process execution. The Astropy project began in 2011 as an attempt to bring together disparate open source Python projects and build a core standard infrastructure that can be used and built upon by the astronomy community. This project has been phenomenally successful in the years since it began and has grown to be the de facto standard for Python software in astronomy. Astropy brings with it considerable expectations from the community on how astronomy Python software should be developed, and it is clear that by the time LSST is fully operational in the 2020s many of the prospective users of the LSST software stack will expect it to be fully interoperable with Astropy. In this paper we describe the overlap between the LSST science pipeline software and Astropy software and investigate areas where the LSST software provides new functionality. We also discuss the possibilities of re-engineering the LSST science pipeline software to build upon Astropy, including the option of contributing affiliated packages.
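As a small illustration of the interoperability users are likely to expect, the snippet below uses only standard Astropy objects (coordinates, units, tables); it is not LSST stack code.

```python
import astropy.units as u
from astropy.coordinates import SkyCoord
from astropy.table import Table

# A sky position expressed with Astropy units and coordinate objects
target = SkyCoord(ra=150.1 * u.deg, dec=2.2 * u.deg, frame="icrs")
print(target.to_string("hmsdms"))

# A small catalogue as an Astropy Table, the community's common currency
catalog = Table(
    {"ra": [150.1, 150.2] * u.deg,
     "dec": [2.2, 2.3] * u.deg,
     "flux": [1.2e-3, 4.5e-4] * u.Jy},
)
print(catalog)
```

Interoperability in practice means that pipeline outputs can be handed to objects like these without lossy, ad hoc conversions.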
Using PATIMDB to Create Bacterial Transposon Insertion Mutant Libraries
Urbach, Jonathan M.; Wei, Tao; Liberati, Nicole; Grenfell-Lee, Daniel; Villanueva, Jacinto; Wu, Gang; Ausubel, Frederick M.
2015-01-01
PATIMDB is a software package for facilitating the generation of transposon mutant insertion libraries. The software has two main functions: process tracking and automated sequence analysis. The process tracking function specifically includes recording the status and fates of multiwell plates and samples in various stages of library construction. Automated sequence analysis refers specifically to the pipeline of sequence analysis starting with ABI files from a sequencing facility and ending with insertion location identifications. The protocols in this unit describe installation and use of PATIMDB software. PMID:19343706
Open source electronic health record and patient data management system for intensive care.
Massaut, Jacques; Reper, Pascal
2008-01-01
In Intensive Care Units, the amount of data to be processed for patient care, the turnover of patients, and the need for reliability and for review processes indicate the use of Patient Data Management Systems (PDMS) and electronic health records (EHR). To respond to the needs of an Intensive Care Unit and not to be locked into proprietary software, we developed a PDMS and EHR based on open source software and components. The software was designed as a client-server architecture running on the Linux operating system and powered by the PostgreSQL database system. The client software was developed in C using the GTK interface library. The application offers users the following functions: capture of medical notes, observations and treatments, nursing charts with administration of medications, scoring systems for classification, and the possibility to encode medical activities for billing processes. Since its deployment in February 2004, the PDMS has been used to care for more than three thousand patients with the expected software reliability and has facilitated data management and review processes. Communications with other medical software were not developed from the start and are realized through the Mirth HL7 communication engine. Further upgrades of the system will include multi-platform support, use of a typed language with static analysis, and a configurable interface. The developed system, based on open source software components, was able to respond to the medical needs of the local ICU environment. The use of OSS for development allowed us to customize the software to the preexisting organization and contributed to the acceptability of the whole system.
A user's guide to Sandia's latin hypercube sampling software : LHS UNIX library/standalone version.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swiler, Laura Painton; Wyss, Gregory Dane
2004-07-01
This document is a reference guide for the UNIX Library/Standalone version of the Latin Hypercube Sampling Software. This software has been developed to generate Latin hypercube multivariate samples. This version runs on Linux or UNIX platforms. This manual covers the use of the LHS code in a UNIX environment, run either as a standalone program or as a callable library. The underlying code in the UNIX Library/Standalone version of LHS is almost identical to the updated Windows version of LHS released in 1998 (SAND98-0210). However, some modifications were made to customize it for a UNIX environment and as a library that is called from the DAKOTA environment. This manual covers the use of the LHS code as a library and in the standalone mode under UNIX.
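A minimal sketch of the Latin hypercube idea itself, in plain NumPy rather than the Sandia LHS library interface: each dimension is divided into n equal-probability strata, one point is drawn per stratum, and the strata are shuffled independently across dimensions.

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, seed=None):
    """Return an (n_samples, n_dims) array of LHS points on [0, 1)^d."""
    rng = np.random.default_rng(seed)
    # One point per stratum in each dimension...
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_dims))) / n_samples
    # ...then shuffle the strata independently per dimension
    for d in range(n_dims):
        rng.shuffle(u[:, d])
    return u

if __name__ == "__main__":
    print(latin_hypercube(5, 2, seed=42))
```

Marginal distributions other than uniform are obtained by pushing these points through the inverse cumulative distribution function of each variable, which is essentially what the LHS code automates.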
NASA Technical Reports Server (NTRS)
Gawadiak, Yuri; Wong, Alan; Maluf, David; Bell, David; Gurram, Mohana; Tran, Khai Peter; Hsu, Jennifer; Yagi, Kenji; Patel, Hemil
2007-01-01
The Program Management Tool (PMT) is a comprehensive, Web-enabled business intelligence software tool for assisting program and project managers within NASA enterprises in gathering, comprehending, and disseminating information on the progress of their programs and projects. The PMT provides planning and management support for implementing NASA programmatic and project management processes and requirements. It provides an online environment for program and line management to develop, communicate, and manage their programs, projects, and tasks in a comprehensive tool suite. The information managed by use of the PMT can include monthly reports as well as data on goals, deliverables, milestones, business processes, personnel, task plans, and budgetary allocations. The PMT provides an intuitive and enhanced Web interface to automate the tedious process of gathering and sharing monthly progress reports, task plans, financial data, and other information on project resources based on technical, schedule, budget, and management criteria and merits. The PMT is consistent with the latest Web standards and software practices, including the use of Extensible Markup Language (XML) for exchanging data and the WebDAV (Web Distributed Authoring and Versioning) protocol for collaborative management of documents. The PMT provides graphical displays of resource allocations in the form of bar and pie charts using Microsoft Excel Visual Basic for Applications (VBA) libraries. The PMT has an extensible architecture that enables integration of PMT with other strategic-information software systems, including, for example, the Erasmus reporting system, now part of the NASA Integrated Enterprise Management Program (IEMP) tool suite, at NASA Marshall Space Flight Center (MSFC). The PMT data architecture provides automated and extensive software interfaces and reports to various strategic information systems to eliminate duplicative human entries and minimize data integrity issues among various NASA systems that impact schedules and planning.
A Language for Specifying Compiler Optimizations for Generic Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willcock, Jeremiah J.
2007-01-01
Compiler optimization is important to software performance, and modern processor architectures make optimization even more critical. However, many modern software applications use libraries providing high levels of abstraction. Such libraries often hinder effective optimization: the libraries are difficult to analyze using current compiler technology. For example, high-level libraries often use dynamic memory allocation and indirectly expressed control structures, such as iterator-based loops. Programs using these libraries often cannot achieve an optimal level of performance. On the other hand, software libraries have also been recognized as potentially aiding in program optimization. One proposed implementation of library-based optimization is to allow the library author, or a library user, to define custom analyses and optimizations. Only limited systems have been created to take advantage of this potential, however. One problem in creating a framework for defining new optimizations and analyses is how users are to specify them: implementing them by hand inside a compiler is difficult and prone to errors. Thus, a domain-specific language for library-based compiler optimizations would be beneficial. Many optimization specification languages have appeared in the literature, but they tend to be either limited in power or unnecessarily difficult to use. Therefore, I have designed, implemented, and evaluated the Pavilion language for specifying program analyses and optimizations, designed for library authors and users. These analyses and optimizations can be based on the implementation of a particular library, its use in a specific program, or on the properties of a broad range of types, expressed through concepts. The new system is intended to provide a high level of expressiveness, even though the intended users are unlikely to be compiler experts.
Building a Library Web Server on a Budget.
ERIC Educational Resources Information Center
Orr, Giles
1998-01-01
Presents a method for libraries with limited budgets to create reliable Web servers with existing hardware and free software available via the Internet. Discusses staff, hardware and software requirements, and security; outlines the assembly process. (PEN)
ERIC Educational Resources Information Center
Lofstrom, Mats
Because experience with large information retrieval (IR) and database management (DBM) systems has shown that they are not adequate for the handling of textual material, two Swedish companies--Paralog and AU-System Network--have joined in a venture to develop a software package which combines features from IR and DBM systems to form a Text Data…
Ray, N J; Hannigan, A
1999-05-01
As dental practice management becomes more computer-based, the efficient functioning of the dentist will become dependent on adequate computer literacy. A survey has been carried out into the computer literacy of a cohort of 140 undergraduate dental students at a University Dental School in Ireland (years 1-5), in the academic year 1997-98. Aspects investigated by anonymous questionnaire were: (1) keyboard skills; (2) computer skills; (3) access to computer facilities; (4) software competencies and (5) use of medical library computer facilities. The students are relatively unfamiliar with basic computer hardware and software: 51.1% considered their expertise with computers as "poor"; 34.3% had taken a formal typewriting or computer keyboarding course; 7.9% had taken a formal computer course at university level and 67.2% were without access to computer facilities at their term-time residences. A majority of students had never used either word-processing, spreadsheet, or graphics programs. Programs relating to "informatics" were more popular, such as literature searching, accessing the Internet and the use of e-mail, which represent the major use of the computers in the medical library. The lack of experience with computers may be addressed by including suitable computing courses at the secondary level (age 13-18 years) and/or tertiary level (FE/HE) education programmes. Such training may promote greater use of generic software, particularly in the library, with a more electronic-based approach to data handling.
Lean and Efficient Software: Whole-Program Optimization of Executables
2015-09-30
libraries. Many levels of library interfaces—where some libraries are dynamically linked and some are provided in binary form only—significantly limit... software at build time. The opportunity: Our objective in this project is to substantially improve the performance, size, and robustness of binary... executables by using static and dynamic binary program analysis techniques to perform whole-program optimization directly on compiled programs
Integrating MPI and deduplication engines: a software architecture roadmap.
Baksi, Dibyendu
2009-03-01
The objective of this paper is to clarify the major concepts related to the architecture and design of patient identity management software systems so that an implementor looking to solve a specific integration problem in the context of a Master Patient Index (MPI) and a deduplication engine can address the relevant issues. The ideas presented are illustrated in the context of a reference use case from the Integrating the Healthcare Enterprise Patient Identifier Cross-referencing (IHE PIX) profile. Sound software engineering principles using the latest design paradigm of model driven architecture (MDA) are applied to define different views of the architecture. The main contribution of the paper is a clear software architecture roadmap for implementors of patient identity management systems. Conceptual design in terms of static and dynamic views of the interfaces is provided as an example of a platform independent model. This makes the roadmap applicable to any specific solution of MPI, deduplication library or software platform. Stakeholders in need of integration of MPIs and deduplication engines can evaluate vendor-specific solutions and software platform technologies in terms of fundamental concepts and can make informed decisions that preserve investment. This also allows freedom from vendor lock-in and the ability to kick-start integration efforts based on a solid architecture.
National Software Reference Library (NSRL)
National Institute of Standards and Technology Data Gateway
National Software Reference Library (NSRL) (PC database for purchase) A collaboration of the National Institute of Standards and Technology (NIST), the National Institute of Justice (NIJ), the Federal Bureau of Investigation (FBI), the Defense Computer Forensics Laboratory (DCFL), the U.S. Customs Service, software vendors, and state and local law enforcement organizations, the NSRL is a tool to assist in fighting crime involving computers.
Managing Scientific Software Complexity with Bocca and CCA
Allan, Benjamin A.; Norris, Boyana; Elwasif, Wael R.; ...
2008-01-01
In high-performance scientific software development, the emphasis is often on short time to first solution. Even when the development of new components mostly reuses existing components or libraries and only small amounts of new code must be created, dealing with the component glue code and software build processes to obtain complete applications is still tedious and error-prone. Component-based software meant to reduce complexity at the application level increases complexity to the extent that the user must learn and remember the interfaces and conventions of the component model itself. To address these needs, we introduce Bocca, the first tool to enable application developers to perform rapid component prototyping while maintaining robust software-engineering practices suitable to HPC environments. Bocca provides project management and a comprehensive build environment for creating and managing applications composed of Common Component Architecture components. Of critical importance for high-performance computing (HPC) applications, Bocca is designed to operate in a language-agnostic way, simultaneously handling components written in any of the languages commonly used in scientific applications: C, C++, Fortran, Python and Java. Bocca automates the tasks related to the component glue code, freeing the user to focus on the scientific aspects of the application. Bocca embraces the philosophy pioneered by Ruby on Rails for web applications: start with something that works, and evolve it to the user's purpose.
A PICKSC Science Gateway for enabling the common plasma physicist to run kinetic software
NASA Astrophysics Data System (ADS)
Hu, Q.; Winjum, B. J.; Zonca, A.; Youn, C.; Tsung, F. S.; Mori, W. B.
2017-10-01
Computer simulations offer tremendous opportunities for studying plasmas, ranging from simulations for students that illuminate fundamental educational concepts to research-level simulations that advance scientific knowledge. Nevertheless, there is a significant hurdle to using simulation tools. Users must navigate codes and software libraries, determine how to wrangle output into meaningful plots, and oftentimes confront a significant cyberinfrastructure with powerful computational resources. Science gateways offer a Web-based environment to run simulations without needing to learn or manage the underlying software and computing cyberinfrastructure. We discuss our progress on creating a Science Gateway for the Particle-in-Cell and Kinetic Simulation Software Center that enables users to easily run and analyze kinetic simulations with our software. We envision that this technology could benefit a wide range of plasma physicists, both in the use of our simulation tools as well as in its adaptation for running other plasma simulation software. Supported by NSF under Grant ACI-1339893 and by the UCLA Institute for Digital Research and Education.
A knowledge discovery object model API for Java
Zuyderduyn, Scott D; Jones, Steven JM
2003-01-01
Background Biological data resources have become heterogeneous and derive from multiple sources. This introduces challenges in the management and utilization of this data in software development. Although efforts are underway to create a standard format for the transmission and storage of biological data, this objective has yet to be fully realized. Results This work describes an application programming interface (API) that provides a framework for developing an effective biological knowledge ontology for Java-based software projects. The API provides a robust framework for the data acquisition and management needs of an ontology implementation. In addition, the API contains classes to assist in creating GUIs to represent this data visually. Conclusions The Knowledge Discovery Object Model (KDOM) API is particularly useful for medium to large applications, or for a number of smaller software projects with common characteristics or objectives. KDOM can be coupled effectively with other biologically relevant APIs and classes. Source code, libraries, documentation and examples are available at . PMID:14583100
COSMIC monthly progress report
NASA Technical Reports Server (NTRS)
1994-01-01
Activities of the Computer Software Management and Information Center (COSMIC) are summarized for the month of May 1994. Tables showing the current inventory of programs available from COSMIC are presented and program processing and evaluation activities are summarized. Nine articles were prepared for publication in the NASA Tech Brief Journal. These articles (included in this report) describe the following software items: (1) WFI - Windowing System for Test and Simulation; (2) HZETRN - A Free Space Radiation Transport and Shielding Program; (3) COMGEN-BEM - Composite Model Generation-Boundary Element Method; (4) IDDS - Interactive Data Display System; (5) CET93/PC - Chemical Equilibrium with Transport Properties, 1993; (6) SDVIC - Sub-pixel Digital Video Image Correlation; (7) TRASYS - Thermal Radiation Analyzer System (HP9000 Series 700/800 Version without NASADIG); (8) NASADIG - NASA Device Independent Graphics Library, Version 6.0 (VAX VMS Version); and (9) NASADIG - NASA Device Independent Graphics Library, Version 6.0 (UNIX Version). Activities in the areas of marketing, customer service, benefits identification, maintenance and support, and dissemination are also described along with a budget summary.
ERIC Educational Resources Information Center
McGlamery, Susan; Coffman, Steve
2000-01-01
Explores the possibility of using Web contact center software to offer reference assistance to remote users. Discusses a project by the Metropolitan Cooperative Library System/Santiago Library System consortium to test contact center software and to develop a virtual reference network. (Author/LRW)
GIS Application System Design Applied to Information Monitoring
NASA Astrophysics Data System (ADS)
Qun, Zhou; Yujin, Yuan; Yuena, Kang
Natural environment information management systems involve on-line instrument monitoring, data communications, database establishment, information management software development and so on. Their core task is to collect effective and reliable environmental information, to increase the utilization and sharing of environmental information by means of advanced information technology, and to provide a timely and scientific foundation for environmental monitoring and management. This thesis adopts C# plug-in application development and uses a complete set of embedded GIS component and tool libraries provided by GIS Engine to implement the core of a plug-in GIS application framework, namely the design and implementation of the framework host program and each functional plug-in, as well as the design and implementation of the plug-in GIS application framework platform. The thesis exploits the advantages of dynamic plug-in loading and configuration, quickly establishes GIS applications through visualized component collaborative modeling, and realizes GIS application integration. The developed platform is applicable to any application integration related to GIS (ESRI platform) and can serve as a base development platform for GIS application development.
Giancarlo, R; Scaturro, D; Utro, F
2015-02-01
The prediction of the number of clusters in a dataset, in particular microarrays, is a fundamental task in biological data analysis, usually performed via validation measures. Unfortunately, it has received very little attention and in fact there is a growing need for software tools/libraries dedicated to it. Here we present ValWorkBench, a software library consisting of eleven well known validation measures, together with novel heuristic approximations for some of them. The main objective of this paper is to provide the interested researcher with the full software documentation of an open source cluster validation platform having the main features of being easily extendible in a homogeneous way and of offering software components that can be readily re-used. Consequently, the focus of the presentation is on the architecture of the library, since it provides an essential map that can be used to access the full software documentation, which is available at the supplementary material website [1]. The mentioned main features of ValWorkBench are also discussed and exemplified, with emphasis on software abstraction design and re-usability. A comparison with existing cluster validation software libraries, mainly in terms of the mentioned features, is also offered. It suggests that ValWorkBench is a much needed contribution to the microarray software development/algorithm engineering community. For completeness, it is important to mention that previous accurate algorithmic experimental analysis of the relative merits of each of the implemented measures [19,23,25], carried out specifically on microarray data, gives useful insights on the effectiveness of ValWorkBench for cluster validation to researchers in the microarray community interested in its use for the mentioned task. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
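As a generic illustration of the kind of internal validation measure such a library implements, the sketch below computes a simple within-cluster sum of squares in NumPy; it is not one of ValWorkBench's own classes.

```python
import numpy as np

def wcss(data, labels):
    """Within-cluster sum of squares: a basic internal validation measure.
    For a fixed number of clusters, lower values indicate tighter clusters."""
    total = 0.0
    for k in np.unique(labels):
        members = data[labels == k]
        centroid = members.mean(axis=0)
        total += ((members - centroid) ** 2).sum()
    return total

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two well-separated synthetic "expression profiles"
    data = np.vstack([rng.normal(0, 1, (50, 4)), rng.normal(5, 1, (50, 4))])
    labels = np.array([0] * 50 + [1] * 50)
    print(wcss(data, labels))
```

Measures of this kind are evaluated over a range of candidate cluster numbers; the library's contribution is to package many such measures, and heuristic speed-ups for them, behind a uniform interface.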
Electronic Library: A TERI Experiment.
ERIC Educational Resources Information Center
Kar, Debal C.; Deb, Subrata; Kumar, Satish
2003-01-01
Discusses the development of Electronic Library at TERI (The Energy and Resources Institute, New Delhi). Highlights include: hardware and software used; the digital library/Virtual Electronic Library; directory of Internet journals; virtual reference resources; electronic collection/Physical Electronic Library; downloaded online full-length…
Research in image management and access
NASA Technical Reports Server (NTRS)
Vondran, Raymond F.; Barron, Billy J.
1993-01-01
Presently, the problem of over-all library system design has been compounded by the accretion of both function and structure to a basic framework of requirements. While more device power has led to increased functionality, opportunities for reducing system complexity at the user interface level have not always been pursued with equal zeal. The purpose of this book is therefore to set forth and examine these opportunities, within the general framework of human factors research in man-machine interfaces. Human factors may be viewed as a series of trade-off decisions among four polarized objectives: machine resources and user specifications; functionality and user requirements. In the past, a limiting factor was the availability of systems. However, in the last two years, over one hundred libraries supported by many different software configurations have been added to the Internet. This document includes a statistical analysis of human responses to five Internet library systems by key features, development of the ideal online catalog system, and ideal online catalog systems for libraries and information centers.
ERIC Educational Resources Information Center
Goodgion, Laurel; And Others
1986-01-01
Eight articles in special supplement to "Library Journal" and "School Library Journal" cover a computer program called "Byte into Books"; microcomputers and the small library; creating databases with students; online searching with a microcomputer; quality automation software; Meckler Publishing Company's…
ERIC Educational Resources Information Center
Library Computing, 1985
1985-01-01
Special supplement to "Library Journal" and "School Library Journal" covers topics of interest to school, public, academic, and special libraries planning for automation: microcomputer use, readings in automation, online searching, databases of microcomputer software, public access to microcomputers, circulation, creating a…
Cross-Platform Mobile Application Development: A Pattern-Based Approach
2012-03-01
Additionally, developers should be aware of different hardware capabilities such as external SD cards and forward facing cameras. Finally, each... applications are written. ... iTunes library, allowing the user to update software and manage content on each device. However, in iOS5, the PC Free feature removes this constraint
ENCOMPASS: A SAGA based environment for the composition of programs and specifications, appendix A
NASA Technical Reports Server (NTRS)
Terwilliger, Robert B.; Campbell, Roy H.
1985-01-01
ENCOMPASS is an example integrated software engineering environment being constructed by the SAGA project. ENCOMPASS supports the specification, design, construction and maintenance of efficient, validated, and verified programs in a modular programming language. The life cycle paradigm, schema of software configurations, and hierarchical library structure used by ENCOMPASS are presented. In ENCOMPASS, the software life cycle is viewed as a sequence of developments, each of which reuses components from the previous ones. Each development proceeds through the phases of planning, requirements definition, validation, design, implementation, and system integration. The components in a software system are modeled as entities which have relationships between them. An entity may have different versions, and different views of the same project are allowed. The simple entities supported by ENCOMPASS may be combined into modules which may be collected into projects. ENCOMPASS supports multiple programmers and projects using a hierarchical library system containing a workspace for each programmer, a project library for each project, and a global library common to all projects.
Using graph approach for managing connectivity in integrative landscape modelling
NASA Astrophysics Data System (ADS)
Rabotin, Michael; Fabre, Jean-Christophe; Libres, Aline; Lagacherie, Philippe; Crevoisier, David; Moussa, Roger
2013-04-01
In cultivated landscapes, many landscape elements such as field boundaries, ditches or banks strongly impact water flows and mass and energy fluxes. At the watershed scale, these impacts are strongly conditioned by the connectivity of these landscape elements. An accurate representation of these elements and of their complex spatial arrangements is therefore of great importance for modelling and predicting these impacts. We developed, in the framework of the OpenFLUID platform (Software Environment for Modelling Fluxes in Landscapes), a digital landscape representation that takes into account the spatial variability and connectivity of diverse landscape elements through the application of graph theory concepts. The proposed landscape representation considers spatial units connected together to represent flux exchanges or any other information exchanges. Each spatial unit of the landscape is represented as a node of a graph and relations between units as graph connections. The connections are of two types - parent-child connections and up/downstream connections - which allows OpenFLUID to handle hierarchical graphs. Connections can also carry information, and graph evolution during simulation is possible (modification of connections or elements). This graph approach allows greater genericity in landscape representation, management of complex connections, and easier development of new landscape representation algorithms. Graph management is fully operational in OpenFLUID for developers and modelers, and several graph tools are available, such as graph traversal algorithms and graph displays. Graph representation can be managed i) manually by the user (for example in simple catchments) through XML-based files in an easily editable and readable format or ii) by using methods of the OpenFLUID-landr library, an OpenFLUID library relying on common open-source spatial libraries (the OGR vector, GEOS topological vector and GDAL raster libraries). The OpenFLUID-landr library has been developed in order i) to be usable without GIS expert skills (common GIS formats can be read and simplified spatial management is provided), ii) to make it easy to develop rules of landscape discretization and graph creation adapted to the requirements of spatialized models and iii) to allow model developers to manage dynamic and complex spatial topology. Graph management in OpenFLUID is illustrated with i) examples of hydrological modelling on complex farmed landscapes and ii) the new implementation of the Geo-MHYDAS tool, based on the OpenFLUID-landr library, which discretizes a landscape and creates the graph structure required by the MHYDAS model.
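A minimal, hypothetical sketch of the graph idea using NetworkX (illustrative only; it does not reproduce the OpenFLUID-landr API), showing spatial units as nodes, typed connections as edges, and a simple traversal query:

```python
import networkx as nx

# Hypothetical landscape graph: nodes are spatial units, edges carry the two
# relation types described above (up/downstream flux routing and parent-child
# nesting). Names and attributes are invented for illustration.
g = nx.DiGraph()
g.add_node("field_1", kind="surface_unit", area_ha=2.4)
g.add_node("ditch_A", kind="linear_unit", length_m=140)
g.add_node("outlet", kind="reach")

g.add_edge("field_1", "ditch_A", relation="downstream")   # runoff routing
g.add_edge("ditch_A", "outlet", relation="downstream")
g.add_edge("outlet", "ditch_A", relation="parent-child")  # nesting example

# Graph-traversal question: which units drain (directly or not) to the outlet?
flow_graph = nx.DiGraph(
    [(src, dst) for src, dst, d in g.edges(data=True)
     if d["relation"] == "downstream"]
)
print(sorted(nx.ancestors(flow_graph, "outlet")))
```

Keeping the relation type on the edge is what lets a single graph answer both connectivity questions (what flows where) and structural questions (what is nested in what).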
Arbitrating Control of Control and Display Units
NASA Technical Reports Server (NTRS)
Sugden, Paul C.
2007-01-01
The ARINC 739 Switch is a computer program that arbitrates control of two multi-function control and display units (MCDUs) between (1) a commercial flight-management computer (FMC) and (2) NASA software used in research on transport aircraft. (MCDUs are the primary interfaces between pilots and FMCs on many commercial aircraft.) This program was recently redesigned into a software library that can be embedded in research application programs. As part of the redesign, this software was combined with software for creating custom pages of information to be displayed on a CDU. This software commands independent switching of the left (pilot's) and right (copilot's) MCDUs. For example, a custom CDU page can control the left CDU while the FMC controls the right CDU. The software uses menu keys to switch control of the CDU between the FMC or a custom CDU page. The software provides an interface that enables custom CDU pages to insert keystrokes into the FMC's CDU input interface. This feature allows the custom CDU pages to manipulate the FMC as if it were a pilot.
Development of a new software for analyzing 3-D fracture network
NASA Astrophysics Data System (ADS)
Um, Jeong-Gi; Noh, Young-Hwan; Choi, Yosoon
2014-05-01
New software is presented to analyze fracture networks in 3-D. Recently, we completed the software package based on information given at EGU 2013. The software consists of several modules that play roles in the management of borehole data, stochastic modelling of fracture networks, construction of the analysis domain, visualization of fracture geometry in 3-D, calculation of equivalent pipes, and production of cross-section diagrams. Intel Parallel Studio XE 2013, Visual Studio.NET 2010 and the open source VTK library were utilized as development tools to efficiently implement the modules and the graphical user interface of the software. A case study was performed to analyze the 3-D fracture network system at the Upper Devonian Grosmont Formation in Alberta, Canada. The results suggest that the developed software is effective in modelling and visualizing 3-D fracture network systems, and can provide useful information to tackle the geomechanical problems related to strength, deformability and hydraulic behaviour of fractured rock masses. This presentation describes the concept and details of the development and implementation of the software.
PTools: an opensource molecular docking library
Saladin, Adrien; Fiorucci, Sébastien; Poulain, Pierre; Prévost, Chantal; Zacharias, Martin
2009-01-01
Background Macromolecular docking is a challenging field of bioinformatics. Developing new algorithms is a slow process generally involving routine tasks that should be found in a robust library and not programmed from scratch for every new software application. Results We present an object-oriented Python/C++ library to help the development of new docking methods. This library contains low-level routines like PDB-format manipulation functions as well as high-level tools for docking and analyzing results. We also illustrate the ease of use of this library with the detailed implementation of a 3-body docking procedure. Conclusion The PTools library can handle molecules at coarse-grained or atomic resolution and allows users to rapidly develop new software. The library is already in use for protein-protein and protein-DNA docking with the ATTRACT program and for simulation analysis. This library is freely available under the GNU GPL license, together with detailed documentation. PMID:19409097
PTools: an opensource molecular docking library.
Saladin, Adrien; Fiorucci, Sébastien; Poulain, Pierre; Prévost, Chantal; Zacharias, Martin
2009-05-01
Macromolecular docking is a challenging field of bioinformatics. Developing new algorithms is a slow process generally involving routine tasks that should be found in a robust library and not programmed from scratch for every new software application. We present an object-oriented Python/C++ library to help the development of new docking methods. This library contains low-level routines like PDB-format manipulation functions as well as high-level tools for docking and analyzing results. We also illustrate the ease of use of this library with the detailed implementation of a 3-body docking procedure. The PTools library can handle molecules at coarse-grained or atomic resolution and allows users to rapidly develop new software. The library is already in use for protein-protein and protein-DNA docking with the ATTRACT program and for simulation analysis. This library is freely available under the GNU GPL license, together with detailed documentation.
The Impact of Electronic Media on Research and Education. Role of Libraries in Information Service
NASA Astrophysics Data System (ADS)
Evstigneeva, Galina; Wegner, Bernd
The paper gives a short survey of how electronic media have changed the working conditions at research institutions, universities and higher schools, which new possibilities in research and education emerge from this, and which problems will have to be solved in this respect in the future. We concentrate our attention on the role of libraries as information brokers in such an environment. In this context, the archiving of electronic documents, software and access systems is addressed as one of the challenging future tasks of libraries. Each of these themes could serve as the subject of a seminar on its own; hence the paper can only highlight some of these features, referring to more detailed work elsewhere. At the beginning, the main classes of electronic offerings providing infrastructure for research and education are introduced. The role of editors, publishers, software producers and web managers is briefly discussed. Information gateways and information brokers are important for the distribution of these offerings. The impact of electronic media on research and education is described through representative examples of different types. Some final conclusions deal with the problems to be solved in the future, when electronic media will occupy the central place in the daily work of professionals, researchers and teachers.
Challenges in Managing Trustworthy Large-scale Digital Science
NASA Astrophysics Data System (ADS)
Evans, B. J. K.
2017-12-01
The increased use of large-scale international digital science has opened up a number of challenges for managing, handling, using and preserving scientific information. The large volumes of information are driven by three main categories - model outputs including coupled models and ensembles, data products that have been processed to a level of usability, and increasingly heuristically driven data analysis. These data products are increasingly the ones that are usable by broad communities, and far exceed the raw instrument data outputs. The data, software and workflows are then shared and replicated to allow broad use at an international scale, which places further demands on infrastructure to support reliable management of the information across distributed resources. Users necessarily rely on these underlying "black boxes" so that they can be productive in producing new scientific outcomes. The software for these systems depends on computational infrastructure, interconnected software systems, and information capture systems. This ranges from the fundamental reliability of the compute hardware, system software stacks and libraries, to the model software itself. Due to these complexities and the capacity of the infrastructure, there is an increased emphasis on transparency of approach and robustness of methods rather than on full reproducibility. Furthermore, with large-volume data management, it is increasingly difficult to store the historical versions of all model and derived data. Instead, the emphasis is on the ability to access the updated products and on the reliability with which previous outcomes remain relevant and can be updated with the new information. We will discuss these challenges and some of the approaches underway to address these issues.
Climbing the Mountain: The Americans with Disabilities Act and Libraries.
ERIC Educational Resources Information Center
Lenn, Katy
1993-01-01
Provides suggestions for academic libraries to comply with the Americans with Disabilities Act. Topics addressed are planning, including patron surveys; physical access to buildings; signage; library security systems; furniture; library services; staff development; telephone access; library acquisitions; and equipment and software. A sidebar lists…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peles, Slaven
2016-11-06
GridKit is a software development kit for interfacing power systems and power grid application software with high performance computing (HPC) libraries developed at National Labs and academia. It is also intended as an interoperability layer between different numerical libraries. GridKit is not a standalone application, but comes with a suite of test examples illustrating possible usage.
Software, Copyright, and Site-License Agreements: Publishers' Perspective of Library Practice.
ERIC Educational Resources Information Center
Happer, Stephanie K.
Thirty-one academic publishers of stand-alone software and book/disk packages were surveyed to determine whether publishers have addressed the copyright issues inherent in circulating these packages within the library environment. Twenty-two questionnaires were returned, providing a 71% return rate. There were 18 usable questionnaires. Publishers…
Copyright Handbook for Videotapes and Microcomputer Software.
ERIC Educational Resources Information Center
Ensign, David; And Others
Developed to help library media personnel, administrators, and educators understand and work with the copyright law as it applies to videotape and microcomputer software, this handbook provides: (1) an overview of copyright, including rights granted to copyright holders and libraries and court interpretation of the copyright law; (2) suggestions…
User’s Manual for the Modular Analysis-Package Libraries ANAPAC and TRANL
1977-09-01
Keywords: computer software, Fourier transforms, computer software library, interpolation software, digitized data... disregarded to give the user a simplified plot. (b) The last digit of ISPACE determines the type of line to be drawn, provided KODE is not... negative. If the last digit of ISPACE is 0, a solid line is drawn; 1, a dashed line is drawn (- - -); 2, a dotted line is drawn (....); 3, a dash-dot line is
Libraries Online!: Microsoft Partnering with American Library Association (ALA).
ERIC Educational Resources Information Center
Machovec, George S., Ed.
1995-01-01
Describes Libraries Online, a pilot project created by Microsoft and the American Library Association to develop ways to provide access to information technologies to underserved populations. Presents the nine public libraries that will receive cash grants, staff training, computer hardware and software, and technical support to help support local…
Wyoming: Open Range for Library Technology.
ERIC Educational Resources Information Center
Maul, Helen Meadors
1996-01-01
Describes the development of library technology and the need for telecommunications in a state with a lack of population density. Topics include the state library's role; shared library resources and library networks; government information; the Wyoming State Home Page on the World Wide Web; Ariel software; network coordinating; and central…
Introduction to Minicomputers in Federal Libraries.
ERIC Educational Resources Information Center
Young, Micki Jo; And Others
This book for library administrators and Federal library staff covers the application of minicomputers in Federal libraries and offers a review of minicomputer technology. A brief overview of automation explains computer technology, hardware, and software. The role of computers in libraries is examined in terms of the history of computers and…
Sex, Kids, and the Public Library.
ERIC Educational Resources Information Center
Mason, Marilyn Gell
1997-01-01
Considers issues relating to children's access to information on the Internet in public libraries. Topics include censorship versus selection; First Amendment rights; the American Library Association's challenge to the Communications Decency Act; filtering software; and the role of the public library. (LRW)
Electronic Health Record for Intensive Care based on Usual Windows Based Software.
Reper, Arnaud; Reper, Pascal
2015-08-01
In Intensive Care Units, the amount of data to be processed for patient care, the turnover of the patients, and the need for reliability and for review processes indicate the use of Patient Data Management Systems (PDMS) and electronic health records (EHR). To respond to the needs of an Intensive Care Unit and to avoid being locked into proprietary software, we developed an EHR based on usual software and components. The software was designed as a client-server architecture running on the Windows operating system and powered by the Access database system. The client software was developed using the Visual Basic interface library. The application offers users the following functions: capture of medical notes, observations and treatments; nursing charts with administration of medications; scoring systems for classification; and the ability to encode medical activities for billing processes. Since its deployment in September 2004, the EHR has been used in the care of more than five thousand patients with the expected software reliability, and it has facilitated data management and review processes. Communication with other medical software was not developed from the start and is realized through a basic communication engine. Further upgrades of the system will include multi-platform support, use of a typed language with static analysis, and a configurable interface. The developed system, based on usual software components, was able to respond to the medical needs of the local ICU environment. The use of Windows for development allowed us to customize the software to the preexisting organization and contributed to the acceptability of the whole system.
Three-Dimensional Audio Client Library
NASA Technical Reports Server (NTRS)
Rizzi, Stephen A.
2005-01-01
The Three-Dimensional Audio Client Library (3DAudio library) is a group of software routines written to facilitate development of both stand-alone (audio only) and immersive virtual-reality application programs that utilize three-dimensional audio displays. The library is intended to enable the development of three-dimensional audio client application programs by use of a code base common to multiple audio server computers. The 3DAudio library calls vendor-specific audio client libraries and currently supports the AuSIM Gold-Server and Lake Huron audio servers. 3DAudio library routines contain common functions for (1) initiation and termination of a client/audio server session, (2) configuration-file input, (3) positioning functions, (4) coordinate transformations, (5) audio transport functions, (6) rendering functions, (7) debugging functions, and (8) event-list-sequencing functions. The 3DAudio software is written in the C++ programming language and currently operates under the Linux, IRIX, and Windows operating systems.
NASA Astrophysics Data System (ADS)
Knox, S.; Meier, P.; Mohammed, K.; Korteling, B.; Matrosov, E. S.; Hurford, A.; Huskova, I.; Harou, J. J.; Rosenberg, D. E.; Thilmant, A.; Medellin-Azuara, J.; Wicks, J.
2015-12-01
Capacity expansion on resource networks is essential to adapting to economic and population growth and pressures such as climate change. Engineered infrastructure systems such as water, energy, or transport networks require sophisticated and bespoke models to refine management and investment strategies. Successful modeling of such complex systems relies on good data management and advanced methods to visualize and share data. Engineered infrastructure systems are often represented as networks of nodes and links with operating rules describing their interactions. Infrastructure system management and planning can be abstracted to simulating or optimizing new operations and extensions of the network. By separating the data storage of abstract networks from manipulation and modeling we have created a system where infrastructure modeling across various domains is facilitated. We introduce Hydra Platform, a Free Open Source Software designed for analysts and modelers to store, manage and share network topology and data. Hydra Platform is a Python library with a web service layer for remote applications, called Apps, to connect. Apps serve various functions including network or results visualization, data export (e.g. into a proprietary format) or model execution. This client-server architecture allows users to manipulate and share centrally stored data. XML templates allow a standardised description of the data structure required for storing network data such that it is compatible with specific models. Hydra Platform represents networks in an abstract way and is therefore not bound to a single modeling domain. It is the Apps that create domain-specific functionality. Using Apps, researchers from different domains can incorporate different models within the same network, enabling cross-disciplinary modeling while minimizing errors and streamlining data sharing. Separating the Python library from the web layer allows developers to natively expand the software or build web-based apps in other languages for remote functionality. Partner CH2M is developing a commercial user interface for Hydra Platform; however, custom interfaces and visualization tools can also be built. Hydra Platform is available on GitHub while Apps will be shared on a central repository.
Documentation Library Application (DLA) Version 2.0.0.1, User Guide
2013-05-08
Excerpt: …Windows XP and access to the DLA SQL Server database. To install the DLA, navigate to N:\Dept 161\3 - PRODUCTS\ Software Installation…Health Research Center. You should see the DLA menu item listed under the NHRC programs there. Contact the DLA software POC if you encounter any…
Development of an Ada package library
NASA Technical Reports Server (NTRS)
Burton, Bruce; Broido, Michael
1986-01-01
A usable prototype Ada package library was developed and is currently being evaluated for use in large software development efforts. The library system is comprised of an Ada-oriented design language used to facilitate the collection of reuse information, a relational data base to store reuse information, a set of reusable Ada components and tools, and a set of guidelines governing the system's use. The prototyping exercise is discussed and the lessons learned from it have led to the definition of a comprehensive tool set to facilitate software reuse.
HiCAT Software Infrastructure: Safe hardware control with object oriented Python
NASA Astrophysics Data System (ADS)
Moriarty, Christopher; Brooks, Keira; Soummer, Remi
2018-01-01
High contrast imaging for Complex Aperture Telescopes (HiCAT) is a testbed designed to demonstrate coronagraphy and wavefront control for segmented on-axis space telescopes such as envisioned for LUVOIR. To limit air movements in the testbed room, software interfaces for several different hardware components were developed to completely automate operations. When developing software interfaces for many different pieces of hardware, unhandled errors are commonplace and can prevent the software from properly closing a hardware resource. Some fragile components (e.g. deformable mirrors) can be permanently damaged because of this. We present an object oriented Python-based infrastructure to safely automate hardware control and optical experiments; specifically, conducting high-contrast imaging experiments while monitoring humidity and power status, with graceful shutdown processes even for unexpected errors. Python contains a construct called a “context manager” that allows you to define code to run when a resource is opened or closed. Context managers ensure that a resource is properly closed, even when unhandled errors occur. Harnessing the context manager design, we also use Python’s multiprocessing library to monitor humidity and power status without interrupting the experiment. Upon detecting a safety problem, the master process sends an event to the child process that triggers the context managers to gracefully close any open resources. This infrastructure allows us to queue up several experiments and safely operate the testbed without a human in the loop.
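A minimal sketch of the pattern described above, assuming nothing about HiCAT's actual classes: a context manager guarantees that a stand-in deformable mirror is closed even on unhandled errors, while a child process started with the multiprocessing library raises an abort flag when a simulated safety reading goes out of range. All names here (DeformableMirror, dm_controller, safety_monitor) are illustrative placeholders.

"""Minimal sketch (not HiCAT's actual code): a context manager guarantees
hardware cleanup and a child process watches a safety signal while an
experiment runs. All names are illustrative placeholders."""
import multiprocessing
import random
import time
from contextlib import contextmanager


class DeformableMirror:
    """Stand-in for a fragile hardware resource."""

    def open(self):
        print("DM connection opened")
        return self

    def close(self):
        print("DM actuators zeroed and connection closed")


@contextmanager
def dm_controller():
    dm = DeformableMirror().open()
    try:
        yield dm
    finally:
        # Runs even when an unhandled error occurs, so the mirror is never left energized.
        dm.close()


def safety_monitor(abort_event):
    """Child process: poll a simulated humidity sensor and raise the abort flag."""
    while not abort_event.is_set():
        humidity = random.uniform(0.0, 40.0)  # placeholder for a real sensor read
        if humidity > 35.0:
            abort_event.set()
        time.sleep(0.5)


def run_experiment():
    abort = multiprocessing.Event()
    monitor = multiprocessing.Process(target=safety_monitor, args=(abort,), daemon=True)
    monitor.start()
    try:
        with dm_controller():
            for _ in range(50):  # placeholder for exposures / DM commands
                if abort.is_set():
                    raise RuntimeError("safety monitor requested shutdown")
                time.sleep(0.1)
    finally:
        abort.set()
        monitor.join()


if __name__ == "__main__":
    try:
        run_experiment()
    except RuntimeError as err:
        print("experiment aborted gracefully:", err)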
The Social Ties That Bind. Online Treasures
ERIC Educational Resources Information Center
Balas, Janet L.
2006-01-01
This article discusses how, in the new world of the information age, libraries must seek to bring relevant services to their technically savvy patrons. This includes designing a library that can serve not only the geographic community, but also the virtual online community. Software used to create online communities is known as social software and…
Using Web Metric Software to Drive: Mobile Website Development
ERIC Educational Resources Information Center
Tidal, Junior
2011-01-01
Many libraries have developed mobile versions of their websites. In order to understand their users, web developers have conducted both usability tests and focus groups, yet analytical software and web server logs can also be used to better understand users. Using data collected from these tools, the Ursula C. Schwerin Library has made informed…
Linux Adventures on a Laptop. Computers in Small Libraries
ERIC Educational Resources Information Center
Roberts, Gary
2005-01-01
This article discusses the pros and cons of open source software, such as Linux. It asserts that despite the technical difficulties of installing and maintaining this type of software, ultimately it is helpful in terms of knowledge acquisition and as a beneficial investment librarians can make in themselves, their libraries, and their patrons.…
A Survey on the Use of Microcomputers in Special Libraries.
ERIC Educational Resources Information Center
Krieger, Tillie
1986-01-01
Describes a survey on the use of microcomputers in special libraries. The discussion of the findings includes types of hardware and software in use; applications in public services, technical processes, and administrative tasks; data back-up techniques; training received; evaluation of software; and future plans for microcomputer applications. (1…
After Losing Users in Catalogs, Libraries Find Better Search Software
ERIC Educational Resources Information Center
Parry, Marc
2010-01-01
Traditional online library catalogs do not tend to order search results by ranked relevance, and they can befuddle users with clunky interfaces. However, that's changing because of two technology trends. First, a growing number of universities are shelling out serious money for sophisticated software that makes exploring their collections more…
ALSC 2011 Notable Videos, Recordings & Interactive Software
ERIC Educational Resources Information Center
School Library Journal, 2011
2011-01-01
This article presents the Notable Children's Videos, Recordings, and Interactive Software for Kids lists which are compiled annually by committees of the Association for Library Service to children (ALSC), a division of the American Library Association (ALA). These lists were released in January 2011 at the ALA Midwinter meeting in San Diego,…
NASA Technical Reports Server (NTRS)
Katz, Daniel
2004-01-01
PVM Wrapper is a software library that makes it possible for code that utilizes the Parallel Virtual Machine (PVM) software library to run using the message-passing interface (MPI) software library, without needing to rewrite the entire code. PVM and MPI are the two most common software libraries used for applications that involve passing of messages among parallel computers. Since about 1996, MPI has been the de facto standard. Codes written when PVM was popular often feature patterns of {"initsend," "pack," "send"} and {"receive," "unpack"} calls. In many cases, these calls are not contiguous and one set of calls may even exist over multiple subroutines. These characteristics make it difficult to obtain equivalent functionality via a single MPI "send" call. Because PVM Wrapper is written to run with MPI- 1.2, some PVM functions are not permitted and must be replaced - a task that requires some programming expertise. The "pvm_spawn" and "pvm_parent" function calls are not replaced, but a programmer can use "mpirun" and knowledge of the ranks of parent and child tasks with supplied macroinstructions to enable execution of codes that use "pvm_spawn" and "pvm_parent."
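Purely as an illustration of the restructuring the abstract describes, and not the PVM Wrapper API itself, the sketch below contrasts the PVM-style initsend/pack/send pattern (shown only as pseudocode comments) with the single send/recv calls used in MPI, here via the mpi4py bindings; run it under an MPI launcher such as mpirun -n 2.

# Illustrative only: the PVM call pattern described above versus one MPI send.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if comm.Get_size() < 2:
    raise SystemExit("run with an MPI launcher, e.g. mpirun -n 2 python demo.py")

# PVM style (pseudocode of the pattern the abstract describes):
#   bufid = pvm_initsend(PvmDataDefault)
#   pvm_pkint(values, n, 1)
#   pvm_pkdouble(weights, n, 1)
#   pvm_send(dest_tid, tag)
# ...and on the receiver: pvm_recv / pvm_upkint / pvm_upkdouble, possibly
# spread across several subroutines, which is what makes mapping to MPI hard.

if rank == 0:
    payload = {"values": [1, 2, 3], "weights": [0.1, 0.2, 0.7]}
    comm.send(payload, dest=1, tag=11)     # one call replaces initsend/pack/send
elif rank == 1:
    payload = comm.recv(source=0, tag=11)  # one call replaces receive/unpack
    print("rank 1 received:", payload)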
ProteoWizard: open source software for rapid proteomics tools development.
Kessner, Darren; Chambers, Matt; Burke, Robert; Agus, David; Mallick, Parag
2008-11-01
The ProteoWizard software project provides a modular and extensible set of open-source, cross-platform tools and libraries. The tools perform proteomics data analyses; the libraries enable rapid tool creation by providing a robust, pluggable development framework that simplifies and unifies data file access, and performs standard proteomics and LCMS dataset computations. The library contains readers and writers of the mzML data format, which has been written using modern C++ techniques and design principles and supports a variety of platforms with native compilers. The software has been specifically released under the Apache v2 license to ensure it can be used in both academic and commercial projects. In addition to the library, we also introduce a rapidly growing set of companion tools whose implementation helps to illustrate the simplicity of developing applications on top of the ProteoWizard library. Cross-platform software that compiles using native compilers (i.e. GCC on Linux, MSVC on Windows and XCode on OSX) is available for download free of charge, at http://proteowizard.sourceforge.net. This website also provides code examples, and documentation. It is our hope the ProteoWizard project will become a standard platform for proteomics development; consequently, code use, contribution and further development are strongly encouraged.
An interactive environment for the analysis of large Earth observation and model data sets
NASA Technical Reports Server (NTRS)
Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.
1993-01-01
We propose to develop an interactive environment for the analysis of large Earth science observation and model data sets. We will use a standard scientific data storage format and a large capacity (greater than 20 GB) optical disk system for data management; develop libraries for coordinate transformation and regridding of data sets; modify the NCSA X Image and X DataSlice software for typical Earth observation data sets by including map transformations and missing data handling; develop analysis tools for common mathematical and statistical operations; integrate the components described above into a system for the analysis and comparison of observations and model results; and distribute software and documentation to the scientific community.
An interactive environment for the analysis of large Earth observation and model data sets
NASA Technical Reports Server (NTRS)
Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.
1992-01-01
We propose to develop an interactive environment for the analysis of large Earth science observation and model data sets. We will use a standard scientific data storage format and a large capacity (greater than 20 GB) optical disk system for data management; develop libraries for coordinate transformation and regridding of data sets; modify the NCSA X Image and X Data Slice software for typical Earth observation data sets by including map transformations and missing data handling; develop analysis tools for common mathematical and statistical operations; integrate the components described above into a system for the analysis and comparison of observations and model results; and distribute software and documentation to the scientific community.
How Much Security Does Your Library Need?
ERIC Educational Resources Information Center
Banerjee, Kyle
2003-01-01
Explains how to keep library systems healthy and functioning by taking sensible security measures. Examines why hackers would target library systems and how library systems are compromised. Describes tools that can help, including: firewalls; antivirus software; alarms; network analysis tools; and encryption. Identifies several strategies for…
NASA Technical Reports Server (NTRS)
Nelson, Michael L.
1997-01-01
Our objective was to study the feasibility of extending the Dienst protocol to enable a multi-discipline, multi-format digital library. We implemented two new technologies: cluster functionality and publishing buckets. We have designed a possible implementation of clusters and buckets, and have prototyped some aspects of the resultant digital library. Currently, digital libraries are segregated by the disciplines they serve (computer science, aeronautics, etc.) and by the format of their holdings (reports, software, datasets, etc.). NCSTRL+ is a multi-discipline, multi-format digital library (DL) prototype created to explore the feasibility of the design and implementation issues involved in creating a unified, canonical scientific and technical information (STI) DL. NCSTRL+ is based on the Networked Computer Science Technical Report Library (NCSTRL), a World Wide Web (WWW) accessible DL that provides access to over 80 university departments and laboratories. We have extended the Dienst protocol (version 4.1.8), the protocol underlying NCSTRL, to provide the ability to cluster independent collections into a logically centralized DL based upon subject category classification, type of organization, and genre of material. The concept of buckets provides a mechanism for publishing and managing logically linked entities with multiple data formats.
Automated Rocket Propulsion Test Management
NASA Technical Reports Server (NTRS)
Walters, Ian; Nelson, Cheryl; Jones, Helene
2007-01-01
The Rocket Propulsion Test-Automated Management System provides a central location for managing activities associated with Rocket Propulsion Test Management Board, National Rocket Propulsion Test Alliance, and the Senior Steering Group business management activities. A set of authorized users, both on-site and off-site with regard to Stennis Space Center (SSC), can access the system through a Web interface. Web-based forms are used for user input with generation and electronic distribution of reports easily accessible. Major functions managed by this software include meeting agenda management, meeting minutes, action requests, action items, directives, and recommendations. Additional functions include electronic review, approval, and signatures. A repository/library of documents is available for users, and all items are tracked in the system by unique identification numbers and status (open, closed, percent complete, etc.). The system also provides queries and version control for input of all items.
ERIC Educational Resources Information Center
Phenix, Katharine; And Others
1990-01-01
Five articles discuss information technology in libraries: (1) "Software for Libraries" (Katharine Phenix); (2) "Online Update: European Online Services" (Martin Kesselman); (3) "Connect Time: Online Pricing Breakthroughs" (Barbara Quint); (4) "Microcomputing: Micro Biology Computer Viruses" (James LaRue);…
Data publishing - visions of the future
NASA Astrophysics Data System (ADS)
Schäfer, Leonie; Klump, Jens; Bertelmann, Roland; Klar, Jochen; Enke, Harry; Rathmann, Torsten; Koudela, Daniela; Köhler, Klaus; Müller-Pfefferkorn, Ralph; van Uytvanck, Dieter; Strathmann, Stefan; Engelhardt, Claudia
2013-04-01
This poster describes future scenarios of information infrastructures in science and other fields of research. The scenarios presented are based on practical experience resulting from interaction with research data in a research center and its library, and further enriched by the results of a baseline study of existing data repositories and data infrastructures. The baseline study was conducted as part of the project "Requirements for a multi-disciplinary research data infrastructure (Radieschen)", which is funded by the German Research Foundation (DFG). Current changes in information infrastructures pose new challenges to libraries and scientific journals, which both act as information service providers, facilitating access to digital media, supporting publication of research data and enabling their long-term archiving. Digital media and research data open new aspects in the field of activity of libraries and scientific journals. What will a library of the future look like? Will a library purely serve as an interface to data centres? Will libraries and data centres merge into a new service unit? Will a future library be the interface to academic cloud services? Scientific journals have already converted from mostly print editions to print and e-journals. What type of journals will emerge in the future? Is there a role for data-centred journals? Will there be journals to publish software code to make this type of research result citable and a part of the record of science? Just as users evolve from being consumers of information into producers, the role of information service providers, such as libraries, changes from a purely supporting to a contributing role. Furthermore, the role of the library changes from a central point of access for the search of publications to an important link in the value-adding chain from author to publication. Journals for software publication might be another vision for the future in data publishing. Software forms the missing link between the big data collected by experiments, monitoring or simulation and the results built on those data. In order to verify the results presented, a paper should also report on the process of data analysis applied to the data sets stored at data centers. In this case data, software, and interpretation supplement each other as a trustworthy, reproducible presentation of research results. Another approach is suggested by researchers of the EU-funded project "Liquid Publications" (1). Instead of traditional publications the researchers propose liquid journals as evolving collections of links and material, and recommend new methods in reviewing and assessing publications. Another point of interest is workflows in data publication. The commonly used model to depict the data life cycle might look appealing but does not necessarily represent reality. The model proposed by Treloar et al. (2) offers a better approach to depict the transition of research data between different domains of use, e.g. from the group domain to the public domain. However, several questions need to be addressed, such as how to avoid the loss of contextual information during transitions between domains, and the influence of the size of the data on the workflow process. This poster aims to present different scenarios of the future from the point of view of researchers, libraries and scientific journals, and will invite further discussion. (1) LiquidPub Green Paper, https://dev.liquidpub.org/svn/liquidpub/papers/deliverables/LPGreenPaper.pdf (2) Treloar, A., Harboe-Ree, C. (2008).
Data management and the curation continuum: how the Monash experience is informing repository relationships. In VALA2008, Melbourne, Australia. Retrieved from http://www.valaconf.org.au/vala2008/papers2008/111_Treloar_Final.pdf
Information Technology in Libraries. A Pakistani Perspective.
ERIC Educational Resources Information Center
Mahmood, Khalid
This book presents an overview of the present status of the use of library automation hardware and software in Pakistan. The following 20 articles are included: (1) "The Status of Library Automation in Pakistan"; (2) "Promoting Information Technology in Pakistan: the Netherlands Library Development Project"; (3) "Library…
Earth Science Markup Language: Transitioning From Design to Application
NASA Technical Reports Server (NTRS)
Moe, Karen; Graves, Sara; Ramachandran, Rahul
2002-01-01
The primary objective of the proposed Earth Science Markup Language (ESML) research is to transition from design to application. The resulting schema and prototype software will foster community acceptance for the "define once, use anywhere" concept central to ESML. Supporting goals include: 1. Refinement of the ESML schema and software libraries in cooperation with the user community. 2. Application of the ESML schema and software libraries to a variety of Earth science data sets and analysis tools. 3. Development of supporting prototype software for enhanced ease of use. 4. Cooperation with standards bodies in order to assure ESML is aligned with related metadata standards as appropriate. 5. Widespread publication of the ESML approach, schema, and software.
The simulation library of the Belle II software system
NASA Astrophysics Data System (ADS)
Kim, D. Y.; Ritter, M.; Bilka, T.; Bobrov, A.; Casarosa, G.; Chilikin, K.; Ferber, T.; Godang, R.; Jaegle, I.; Kandra, J.; Kodys, P.; Kuhr, T.; Kvasnicka, P.; Nakayama, H.; Piilonen, L.; Pulvermacher, C.; Santelj, L.; Schwenker, B.; Sibidanov, A.; Soloviev, Y.; Starič, M.; Uglov, T.
2017-10-01
SuperKEKB, the next generation B factory, has been constructed in Japan as an upgrade of KEKB. This brand new e+ e- collider is expected to deliver a very large data set for the Belle II experiment, which will be 50 times larger than the previous Belle sample. Both the triggered physics event rate and the background event rate will increase by at least a factor of 10 over the previous ones, creating a challenging data-taking environment for the Belle II detector. The software system of the Belle II experiment is designed to execute this ambitious plan. A full detector simulation library, which is a part of the Belle II software system, has been created based on Geant4 and has been tested thoroughly. Recently the library was upgraded to Geant4 version 10.1. The library is behaving as expected and is utilized actively in producing Monte Carlo data sets for various studies. In this paper, we will explain the structure of the simulation library and the various interfaces to other packages, including geometry and beam background simulation.
ERIC Educational Resources Information Center
Samuels, Ruth Gallegos; Griffy, Henry
2012-01-01
This article discusses best practices for evaluating open source software for use in library projects, based on the authors' experience evaluating electronic publishing solutions. First, it presents a brief review of the literature, emphasizing the need to evaluate open source solutions carefully in order to minimize Total Cost of Ownership. Next,…
Hammond Workforce 2000: A Three-Year Project. October 1989 to September 1992.
ERIC Educational Resources Information Center
Meyers, Arthur S.; Somerville, Deborah J.
A 3-year Library Services and Construction Act grant project from 1989-1992 provided for adult learning centers, equipped with Apple IIGS computers and software at each location of the Hammond Public Library (Indiana). User-friendly, job-based software to strengthen reading, writing, mathematics, spelling, and grammar skills, as well as video and…
Library iTour: Introducing the iPod Generation to the Academic Library
ERIC Educational Resources Information Center
Cairns, Virginia; Dean, Toni C.
2009-01-01
Purpose: For many years, the Lupton Library offered a traditional library introduction class to first year students participating in the Freshman Seminar Program at the University of Tennessee at Chattanooga. In 2007, the library applied for and received a campus grant to purchase thirty iPod Touches, along with accompanying hardware and software.…
Engineer’s Handbook. Central Archive for Reusable Defense Software (CARDS)
1994-02-28
Excerpt: …benefit from this reuse effort? Reuse should be done for a domain rather than just for a program. Identify relationships between domains to facilitate…benefits to the government and its contractors. Help provide guidelines to enable domain managers to do a trade-off study on requirements, e.g., does…libraries, if desired or required. This can only occur where the government domain growth matches, or can benefit from, the inclusion or incorporation of the…
SNS programming environment user's guide
NASA Technical Reports Server (NTRS)
Tennille, Geoffrey M.; Howser, Lona M.; Humes, D. Creig; Cronin, Catherine K.; Bowen, John T.; Drozdowski, Joseph M.; Utley, Judith A.; Flynn, Theresa M.; Austin, Brenda A.
1992-01-01
The computing environment is briefly described for the Supercomputing Network Subsystem (SNS) of the Central Scientific Computing Complex of NASA Langley. The major SNS computers are a CRAY-2, a CRAY Y-MP, a CONVEX C-210, and a CONVEX C-220. The software is described that is common to all of these computers, including: the UNIX operating system, computer graphics, networking utilities, mass storage, and mathematical libraries. Also described is file management, validation, SNS configuration, documentation, and customer services.
Chao, Edmund Y S; Armiger, Robert S; Yoshida, Hiroaki; Lim, Jonathan; Haraguchi, Naoki
2007-03-08
The ability to combine physiology and engineering analyses with computer sciences has opened the door to the possibility of creating the "Virtual Human" reality. This paper presents a broad foundation for a full-featured biomechanical simulator for the human musculoskeletal system physiology. This simulation technology unites the expertise in biomechanical analysis and graphic modeling to investigate joint and connective tissue mechanics at the structural level and to visualize the results in both static and animated forms together with the model. Adaptable anatomical models including prosthetic implants and fracture fixation devices and a robust computational infrastructure for static, kinematic, kinetic, and stress analyses under varying boundary and loading conditions are incorporated on a common platform, the VIMS (Virtual Interactive Musculoskeletal System). Within this software system, a manageable database containing long bone dimensions, connective tissue material properties and a library of skeletal joint system functional activities and loading conditions are also available and they can easily be modified, updated and expanded. Application software is also available to allow end-users to perform biomechanical analyses interactively. Examples using these models and the computational algorithms in a virtual laboratory environment are used to demonstrate the utility of these unique database and simulation technology. This integrated system, model library and database will impact on orthopaedic education, basic research, device development and application, and clinical patient care related to musculoskeletal joint system reconstruction, trauma management, and rehabilitation.
Chao, Edmund YS; Armiger, Robert S; Yoshida, Hiroaki; Lim, Jonathan; Haraguchi, Naoki
2007-01-01
The ability to combine physiology and engineering analyses with computer sciences has opened the door to the possibility of creating the "Virtual Human" reality. This paper presents a broad foundation for a full-featured biomechanical simulator for the human musculoskeletal system physiology. This simulation technology unites the expertise in biomechanical analysis and graphic modeling to investigate joint and connective tissue mechanics at the structural level and to visualize the results in both static and animated forms together with the model. Adaptable anatomical models including prosthetic implants and fracture fixation devices and a robust computational infrastructure for static, kinematic, kinetic, and stress analyses under varying boundary and loading conditions are incorporated on a common platform, the VIMS (Virtual Interactive Musculoskeletal System). Within this software system, a manageable database containing long bone dimensions, connective tissue material properties and a library of skeletal joint system functional activities and loading conditions are also available and they can easily be modified, updated and expanded. Application software is also available to allow end-users to perform biomechanical analyses interactively. Examples using these models and the computational algorithms in a virtual laboratory environment are used to demonstrate the utility of these unique database and simulation technology. This integrated system, model library and database will impact on orthopaedic education, basic research, device development and application, and clinical patient care related to musculoskeletal joint system reconstruction, trauma management, and rehabilitation. PMID:17343764
Quarterly Report - May through July 2012
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Laniece E.
2012-08-09
The first quarter of my postgraduate internship has been an extremely varied one, in which I have tackled several different aspects of the project. Because this is the beginning of a new investigation for the Research Library, I think it is appropriate that I explore data management at LANL from multiple perspectives. I have spent a considerable amount of time doing a literature search and taking notes on what I've been reading in preparation for potential writing activities later. The Research Library is not the only research library exploring the possibility of providing services to their user base. The Joint Information Systems Committee (JISC) and the Digital Curation Centre (DCC) in the UK are actively pursuing possibilities to preserve the scientific record. DataOne is a U.S. National Science Foundation (NSF) initiative aimed at helping to curate bioscience data. This is just a tiny sample of the organizations actively looking into the issues surrounding data management on an organizational, cultural, or technical level. I have included a partial bibliography of some papers I have read. Based on what I read, various discussions, and previous library training, I have begun to document the services I feel I could provide researchers in the context of my internship. This is still very much a work in progress as I learn more about the landscape in libraries and at the Laboratory. I have detailed this process and my thoughts on the issue below. As data management is such a complex and interconnected activity, it is impossible to investigate the organizational and cultural needs of the researchers without familiarizing myself with technologies that could facilitate the local cataloging and preservation of data sets. I have spent some time investigating the repository software DSpace. The library has long maintained the digital object repository aDORe, but the differences in features and the lack of a user interface compared to DSpace have made DSpace a good test bed for this project. However, my internship is not about repository software, and DSpace is just one potential tool for supporting researchers and their data. More details on my repository investigation are provided below. The most exciting aspect of the project thus far has been meeting with researchers, some of whom are potential collaborators. Some people I have talked with have been very interested and enthusiastic about the possibility of collaborating, while others have not wanted to discuss the issue at all. I have had discussions with individual researchers managing their own lab as well as with researchers who are part of much larger collaborations. Three of the research groups that I feel are of particular interest are detailed below. I have added an appendix below which goes into more detail about the protein crystallography community, which has addressed the complete data life cycle within their field end to end. The issue of data management is much bigger than just my internship and there are several people and organizations exploring the issues at the Laboratory. I am making every effort to stay focused on small science data sets and ensure that my activities use standards-based approaches and are sustainable.
Small but Pristine--Lessons for Small Library Automation.
ERIC Educational Resources Information Center
Clement, Russell; Robertson, Dane
1990-01-01
Compares the more positive library automation experiences of a small public library with those of a large research library. Topics addressed include collection size; computer size and the need for outside control of a data processing center; staff size; selection process for hardware and software; and accountability. (LRW)
NASA Technical Reports Server (NTRS)
Condon, Steven; Hendrick, Robert; Stark, Michael E.; Steger, Warren
1997-01-01
The Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center (GSFC) recently embarked on a far-reaching revision of its process for developing and maintaining satellite support software. The new process relies on an object-oriented software development method supported by a domain specific library of generalized components. This Generalized Support Software (GSS) Domain Engineering Process is currently in use at the NASA GSFC Software Engineering Laboratory (SEL). The key facets of the GSS process are (1) an architecture for rapid deployment of FDD applications, (2) a reuse asset library for FDD classes, and (3) a paradigm shift from developing software to configuring software for mission support. This paper describes the GSS architecture and process, results of fielding the first applications, lessons learned, and future directions
Expanding roles in a library-based bioinformatics service program: a case study
Li, Meng; Chen, Yi-Bu; Clintworth, William A
2013-01-01
Question: How can a library-based bioinformatics support program be implemented and expanded to continuously support the growing and changing needs of the research community? Setting: A program at a health sciences library serving a large academic medical center with a strong research focus is described. Methods: The bioinformatics service program was established at the Norris Medical Library in 2005. As part of program development, the library assessed users' bioinformatics needs, acquired additional funds, established and expanded service offerings, and explored additional roles in promoting on-campus collaboration. Results: Personnel and software have increased along with the number of registered software users and use of the provided services. Conclusion: With strategic efforts and persistent advocacy within the broader university environment, library-based bioinformatics service programs can become a key part of an institution's comprehensive solution to researchers' ever-increasing bioinformatics needs. PMID:24163602
Ayres, Daniel L; Darling, Aaron; Zwickl, Derrick J; Beerli, Peter; Holder, Mark T; Lewis, Paul O; Huelsenbeck, John P; Ronquist, Fredrik; Swofford, David L; Cummings, Michael P; Rambaut, Andrew; Suchard, Marc A
2012-01-01
Phylogenetic inference is fundamental to our understanding of most aspects of the origin and evolution of life, and in recent years, there has been a concentration of interest in statistical approaches such as Bayesian inference and maximum likelihood estimation. Yet, for large data sets and realistic or interesting models of evolution, these approaches remain computationally demanding. High-throughput sequencing can yield data for thousands of taxa, but scaling to such problems using serial computing often necessitates the use of nonstatistical or approximate approaches. The recent emergence of graphics processing units (GPUs) provides an opportunity to leverage their excellent floating-point computational performance to accelerate statistical phylogenetic inference. A specialized library for phylogenetic calculation would allow existing software packages to make more effective use of available computer hardware, including GPUs. Adoption of a common library would also make it easier for other emerging computing architectures, such as field programmable gate arrays, to be used in the future. We present BEAGLE, an application programming interface (API) and library for high-performance statistical phylogenetic inference. The API provides a uniform interface for performing phylogenetic likelihood calculations on a variety of compute hardware platforms. The library includes a set of efficient implementations and can currently exploit hardware including GPUs using NVIDIA CUDA, central processing units (CPUs) with Streaming SIMD Extensions and related processor supplementary instruction sets, and multicore CPUs via OpenMP. To demonstrate the advantages of a common API, we have incorporated the library into several popular phylogenetic software packages. The BEAGLE library is free open source software licensed under the Lesser GPL and available from http://beagle-lib.googlecode.com. An example client program is available as public domain software.
Ayres, Daniel L.; Darling, Aaron; Zwickl, Derrick J.; Beerli, Peter; Holder, Mark T.; Lewis, Paul O.; Huelsenbeck, John P.; Ronquist, Fredrik; Swofford, David L.; Cummings, Michael P.; Rambaut, Andrew; Suchard, Marc A.
2012-01-01
Abstract Phylogenetic inference is fundamental to our understanding of most aspects of the origin and evolution of life, and in recent years, there has been a concentration of interest in statistical approaches such as Bayesian inference and maximum likelihood estimation. Yet, for large data sets and realistic or interesting models of evolution, these approaches remain computationally demanding. High-throughput sequencing can yield data for thousands of taxa, but scaling to such problems using serial computing often necessitates the use of nonstatistical or approximate approaches. The recent emergence of graphics processing units (GPUs) provides an opportunity to leverage their excellent floating-point computational performance to accelerate statistical phylogenetic inference. A specialized library for phylogenetic calculation would allow existing software packages to make more effective use of available computer hardware, including GPUs. Adoption of a common library would also make it easier for other emerging computing architectures, such as field programmable gate arrays, to be used in the future. We present BEAGLE, an application programming interface (API) and library for high-performance statistical phylogenetic inference. The API provides a uniform interface for performing phylogenetic likelihood calculations on a variety of compute hardware platforms. The library includes a set of efficient implementations and can currently exploit hardware including GPUs using NVIDIA CUDA, central processing units (CPUs) with Streaming SIMD Extensions and related processor supplementary instruction sets, and multicore CPUs via OpenMP. To demonstrate the advantages of a common API, we have incorporated the library into several popular phylogenetic software packages. The BEAGLE library is free open source software licensed under the Lesser GPL and available from http://beagle-lib.googlecode.com. An example client program is available as public domain software. PMID:21963610
[Establishment and management of electronic appointment library for dental implant patients].
Dong, Zheng-jie; Xu, Kan
2013-10-01
To design an Excel form that prompts dental implant patient appointments through color changes and that scientifically manages the implant EMR library through appropriate interlinkage and numbering. An Excel form based on the Windows XP operating system was designed using Microsoft Excel 2003, and was configured to change color with the passage of time by use of the "conditional format" command. An Excel form was designed. The color turned red automatically on the day the patient underwent implant surgery. It turned yellow when the patient was recalled 2 weeks after the first operation, and green when the patient underwent the secondary operation. It was designed to turn gray when all the procedures of implant restoration were finished. In addition, a patient's main implant situation could be viewed by clicking on the patient's name or number to open the EMR directly. Dentists can track implant patient appointment schedules through the color changes of an Excel form, and can consult the implant patient EMR directly through interlinkage or numbering.
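As a hedged illustration only (the authors worked manually in Excel 2003, not in Python), a similar date-driven highlight can be scripted with the openpyxl package; the file name, sample rows, and the "due today or earlier" rule below are assumptions for the sketch.

# Illustrative sketch: reproduce a date-triggered color change with openpyxl.
import datetime
from openpyxl import Workbook
from openpyxl.formatting.rule import CellIsRule
from openpyxl.styles import PatternFill

wb = Workbook()
ws = wb.active
ws.append(["Patient", "Stage", "Next appointment"])
ws.append(["Patient 001", "first-stage surgery", datetime.date(2013, 10, 15)])
ws.append(["Patient 002", "secondary operation", datetime.date(2013, 11, 2)])

red = PatternFill(start_color="FFFF0000", end_color="FFFF0000", fill_type="solid")
# Turn the appointment cell red once its date is today or has passed.
ws.conditional_formatting.add(
    "C2:C200",
    CellIsRule(operator="lessThanOrEqual", formula=["TODAY()"], fill=red),
)
wb.save("implant_appointments.xlsx")  # assumed output file name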
SU-G-BRB-02: An Open-Source Software Analysis Library for Linear Accelerator Quality Assurance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerns, J; Yaldo, D
Purpose: Routine linac quality assurance (QA) tests have become complex enough to require automation of most test analyses. A new data analysis software library was built that allows physicists to automate routine linear accelerator quality assurance tests. The package is open source, code tested, and benchmarked. Methods: Images and data were generated on a TrueBeam linac for the following routine QA tests: VMAT, starshot, CBCT, machine logs, Winston Lutz, and picket fence. The analysis library was built using the general programming language Python. Each test was analyzed with the library algorithms and compared to manual measurements taken at the time of acquisition. Results: VMAT QA results agreed within 0.1% between the library and manual measurements. Machine logs (dynalogs & trajectory logs) were successfully parsed; mechanical axis positions were verified for accuracy and MLC fluence agreed well with EPID measurements. CBCT QA measurements were within 10 HU and 0.2 mm where applicable. Winston Lutz isocenter size measurements were within 0.2 mm of TrueBeam’s Machine Performance Check. Starshot analysis was within 0.2 mm of the Winston Lutz results for the same conditions. Picket fence images with and without a known error showed that the library was capable of detecting MLC offsets within 0.02 mm. Conclusion: A new routine QA software library has been benchmarked and is available for use by the community. The library is open-source and extensible for use in larger systems.
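As an illustrative sketch only, not the library's code or API, the snippet below shows the kind of analysis being automated: locating picket-fence peaks in a synthetic profile with NumPy/SciPy and reporting their deviation from the nominal positions, including one deliberately offset picket.

# Illustrative picket-fence-style analysis on synthetic data (not the library's code).
import numpy as np
from scipy.signal import find_peaks

x = np.arange(500)
nominal_centers = np.arange(25, 500, 50)   # where the pickets should be (pixels)
actual_centers = nominal_centers.astype(float)
actual_centers[5] += 2.0                   # deliberate 2-pixel error in one picket

# Synthetic EPID profile: one Gaussian stripe per picket.
profile = sum(np.exp(-0.5 * ((x - c) / 3.0) ** 2) for c in actual_centers)

peaks, _ = find_peaks(profile, height=0.5)
deviation = peaks - nominal_centers
print("picket deviations (pixels):", deviation)  # near zero except the offset picket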
The Weakest Link: Library Catalogs.
ERIC Educational Resources Information Center
Young, Terrence E., Jr.
2002-01-01
Describes methods of correcting MARC records in online public access catalogs in school libraries. Highlights include in-house methods; professional resources; conforming to library cataloging standards; vendor services, including Web-based services; software specifically developed for record cleanup; and outsourcing. (LRW)
Building Geospatial Web Services for Ecological Monitoring and Forecasting
NASA Astrophysics Data System (ADS)
Hiatt, S. H.; Hashimoto, H.; Melton, F. S.; Michaelis, A. R.; Milesi, C.; Nemani, R. R.; Wang, W.
2008-12-01
The Terrestrial Observation and Prediction System (TOPS) at NASA Ames Research Center is a modeling system that generates a suite of gridded data products in near real-time that are designed to enhance management decisions related to floods, droughts, forest fires, human health, as well as crop, range, and forest production. While these data products introduce great possibilities for assisting management decisions and informing further research, realization of their full potential is complicated by their sheer volume and by the need for an infrastructure for remotely browsing, visualizing, and analyzing the data. In order to address these difficulties we have built an OGC-compliant WMS and WCS server based on an open source software stack that provides standardized access to our archive of data. This server is built using the open source Java library GeoTools, which achieves efficient I/O and image rendering through Java Advanced Imaging. We developed spatio-temporal raster management capabilities using the PostGrid raster indexation engine. We provide visualization and browsing capabilities through a customized Ajax web interface derived from the kaMap project. This interface allows resource managers to quickly assess ecosystem conditions and identify significant trends and anomalies from within their web browser without the need to download source data or install special software. Our standardized web services also expose TOPS data to a range of potential clients, from web mapping applications to virtual globes and desktop GIS packages. However, support for managing the temporal dimension of our data is currently limited in existing software systems. Future work will attempt to overcome this shortcoming by building time-series visualization and analysis tools that can be integrated with existing geospatial software.
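A minimal sketch of how a client might request imagery from such an OGC WMS endpoint, assuming the OWSLib package; the server URL, layer name, and output file are placeholders, not the actual TOPS service.

# Illustrative WMS GetMap request; endpoint and layer are placeholders.
from owslib.wms import WebMapService

wms = WebMapService("https://example.org/tops/wms", version="1.1.1")  # placeholder URL

img = wms.getmap(
    layers=["gross_primary_production"],  # hypothetical layer name
    styles=[""],                          # default style
    srs="EPSG:4326",
    bbox=(-125.0, 32.0, -114.0, 42.0),    # rough California extent
    size=(512, 512),
    format="image/png",
    transparent=True,
)
with open("gpp.png", "wb") as out:
    out.write(img.read())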
PandaEPL: a library for programming spatial navigation experiments.
Solway, Alec; Miller, Jonathan F; Kahana, Michael J
2013-12-01
Recent advances in neuroimaging and neural recording techniques have enabled researchers to make significant progress in understanding the neural mechanisms underlying human spatial navigation. Because these techniques generally require participants to remain stationary, computer-generated virtual environments are used. We introduce PandaEPL, a programming library for the Python language designed to simplify the creation of computer-controlled spatial-navigation experiments. PandaEPL is built on top of Panda3D, a modern open-source game engine. It allows users to construct three-dimensional environments that participants can navigate from a first-person perspective. Sound playback and recording and also joystick support are provided through the use of additional optional libraries. PandaEPL also handles many tasks common to all cognitive experiments, including managing configuration files, logging all internal and participant-generated events, and keeping track of the experiment state. We describe how PandaEPL compares with other software for building spatial-navigation experiments and walk the reader through the process of creating a fully functional experiment.
PandaEPL: A library for programming spatial navigation experiments
Solway, Alec; Miller, Jonathan F.
2013-01-01
Recent advances in neuroimaging and neural recording techniques have enabled researchers to make significant progress in understanding the neural mechanisms underlying human spatial navigation. Because these techniques generally require participants to remain stationary, computer-generated virtual environments are used. We introduce PandaEPL, a programming library for the Python language designed to simplify the creation of computer-controlled spatial-navigation experiments. PandaEPL is built on top of Panda3D, a modern open-source game engine. It allows users to construct three-dimensional environments that participants can navigate from a first-person perspective. Sound playback and recording and also joystick support are provided through the use of additional optional libraries. PandaEPL also handles many tasks common to all cognitive experiments, including managing configuration files, logging all internal and participant-generated events, and keeping track of the experiment state. We describe how PandaEPL compares with other software for building spatial-navigation experiments and walk the reader through the process of creating a fully functional experiment. PMID:23549683
DSPSR: Digital Signal Processing Software for Pulsar Astronomy
NASA Astrophysics Data System (ADS)
van Straten, W.; Bailes, M.
2010-10-01
DSPSR, written primarily in C++, is an open-source, object-oriented, digital signal processing software library and application suite for use in radio pulsar astronomy. The library implements an extensive range of modular algorithms for use in coherent dedispersion, filterbank formation, pulse folding, and other tasks. The software is installed and compiled using the standard GNU configure and make system, and is able to read astronomical data in 18 different file formats, including FITS, S2, CPSR, CPSR2, PuMa, PuMa2, WAPP, ASP, and Mark5.
Enhancements to the EPANET-RTX (Real-Time Analytics) ...
Technical brief and software. The U.S. Environmental Protection Agency (EPA) developed EPANET-RTX as a collection of object-oriented software libraries comprising the core data access, data transformation, and data synthesis (real-time analytics) components of a real-time hydraulic and water quality modeling system. While EPANET-RTX uses the hydraulic and water quality solvers of EPANET, the object libraries are a self-contained set of building blocks for software developers. “Real-time EPANET” promises to change the way water utilities, commercial vendors, engineers, and the water community think about modeling.
Eapen, Bell Raj
2006-01-01
EndNote is useful software for online literature searching and efficient bibliography management. It helps to format the bibliography according to the citation style of each journal. EndNote stores references in a library file, which can be shared with others. It can connect to online resources like PubMed and retrieve search results according to the search criteria. It can also effortlessly integrate with popular word processors like MS Word. The Indian Journal of Dermatology, Venereology and Leprology website has a provision to import references into EndNote.
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Basham, Bryan D.
1989-01-01
CONFIG is a modeling and simulation tool prototype for analyzing the normal and faulty qualitative behaviors of engineered systems. Qualitative modeling and discrete-event simulation have been adapted and integrated, to support early development, during system design, of software and procedures for management of failures, especially in diagnostic expert systems. Qualitative component models are defined in terms of normal and faulty modes and processes, which are defined by invocation statements and effect statements with time delays. System models are constructed graphically by using instances of components and relations from object-oriented hierarchical model libraries. Extension and reuse of CONFIG models and analysis capabilities in hybrid rule- and model-based expert fault-management support systems are discussed.
Knowledge-based reusable software synthesis system
NASA Technical Reports Server (NTRS)
Donaldson, Cammie
1989-01-01
The Eli system, a knowledge-based reusable software synthesis system, is being developed for NASA Langley under a Phase 2 SBIR contract. Named after Eli Whitney, the inventor of interchangeable parts, Eli assists engineers of large-scale software systems in reusing components while they are composing their software specifications or designs. Eli will identify reuse potential, search for components, select component variants, and synthesize components into the developer's specifications. The Eli project began as a Phase 1 SBIR to define a reusable software synthesis methodology that integrates reusability into the top-down development process and to develop an approach for an expert system to promote and accomplish reuse. The objectives of the Eli Phase 2 work are to integrate advanced technologies to automate the development of reusable components within the context of large system developments, to integrate with user development methodologies without significant changes in method or learning of special languages, and to make reuse the easiest operation to perform. Eli will try to address a number of reuse problems including developing software with reusable components, managing reusable components, identifying reusable components, and transitioning reuse technology. Eli is both a library facility for classifying, storing, and retrieving reusable components and a design environment that emphasizes, encourages, and supports reuse.
An object-oriented class library for medical software development.
O'Kane, K C; McColligan, E E
1996-12-01
The objective of this research is the development of a Medical Object Library (MOL) consisting of reusable, inheritable, portable, extendable C++ classes that facilitate rapid development of medical software at reduced cost and increased functionality. The result of this research is a library of class objects that range in function from string and hierarchical file handling entities to high level, procedural agents that perform increasingly complex, integrated tasks. A system built upon these classes is compatible with any other system similarly constructed with respect to data definitions, semantics, data organization and storage. As new objects are built, they can be added to the class library for subsequent use. The MOL is a toolkit of software objects intended to support a common file access methodology, a unified medical record structure, consistent message processing, standard graphical display facilities and uniform data collection procedures. This work emphasizes the relationship that potentially exists between the structure of a hierarchical medical record and procedural language components by means of a hierarchical class library and tree structured file access facility. In doing so, it attempts to establish interest in and demonstrate the practicality of the hierarchical medical record model in the modern context of object oriented programming.
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Jong, Wibe A.; Walker, Andrew M.; Hanwell, Marcus D.
Background: Multidisciplinary integrated research requires the ability to couple the diverse sets of data obtained from a range of complex experiments and computer simulations. Integrating data requires semantically rich information. In this paper the generation of semantically rich data from the NWChem computational chemistry software is discussed within the Chemical Markup Language (CML) framework. Results: The NWChem computational chemistry software has been modified and coupled to the FoX library to write CML-compliant XML data files. The FoX library was expanded to represent the lexical input files used by the computational chemistry software. Conclusions: The production of CML-compliant XML files for the computational chemistry software NWChem can be relatively easily accomplished using the FoX library. A unified computational chemistry or CompChem convention and dictionary needs to be developed through a community-based effort. The long-term goal is to enable a researcher to do Google-style chemistry and physics searches.
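As a rough illustration of what "semantically rich" CML output looks like, the following sketch uses Python's standard library to emit a small CML-style fragment for a water molecule. The element and attribute names follow common CML conventions; nothing here is taken from the NWChem or FoX code itself.

```python
# Illustrative sketch only: emits a small CML-style fragment with the Python
# standard library to show the kind of semantically rich XML described above.
# Element/attribute names follow common CML conventions, not the FoX output.
import xml.etree.ElementTree as ET

CML_NS = "http://www.xml-cml.org/schema"

def water_molecule_cml():
    ET.register_namespace("", CML_NS)
    mol = ET.Element("{%s}molecule" % CML_NS, id="water")
    atoms = ET.SubElement(mol, "{%s}atomArray" % CML_NS)
    for atom_id, elem, x, y, z in [
        ("a1", "O", 0.000, 0.000, 0.117),
        ("a2", "H", 0.000, 0.757, -0.469),
        ("a3", "H", 0.000, -0.757, -0.469),
    ]:
        ET.SubElement(atoms, "{%s}atom" % CML_NS, id=atom_id,
                      elementType=elem, x3=str(x), y3=str(y), z3=str(z))
    return ET.tostring(mol, encoding="unicode")

print(water_molecule_cml())
```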
An Open Software Platform for Sharing Water Resource Models, Code and Data
NASA Astrophysics Data System (ADS)
Knox, Stephen; Meier, Philipp; Mohamed, Khaled; Korteling, Brett; Matrosov, Evgenii; Huskova, Ivana; Harou, Julien; Rosenberg, David; Tilmant, Amaury; Medellin-Azuara, Josue; Wicks, Jon
2016-04-01
The modelling of managed water resource systems requires new approaches in the face of increasing future uncertainty. Water resources management models, even if applied to diverse problem areas, use common approaches such as representing the problem as a network of nodes and links. We propose a data management software platform, called Hydra, that uses this commonality to allow multiple models using a node-link structure to be managed and run using a single software system. Hydra's user interface allows users to manage network topology and associated data. Hydra feeds this data directly into a model, importing from and exporting to different file formats using Apps. An App connects Hydra to a custom model, a modelling system such as GAMS or MATLAB or to different file formats such as MS Excel, CSV and ESRI Shapefiles. Hydra allows users to manage their data in a single, consistent place. Apps can be used to run domain-specific models and allow users to work with their own required file formats. The Hydra App Store offers a collaborative space where model developers can publish, review and comment on Apps, models and data. Example Apps and open-source libraries are available in a variety of languages (Python, Java and .NET). The App Store can act as a hub for water resource modellers to view and share Apps, models and data easily. This encourages an ecosystem of development using a shared platform, resulting in more model integration and potentially greater unity within resource modelling communities. www.hydraplatform.org www.hydraappstore.com
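The node-link abstraction that Hydra builds on can be sketched in a few lines. The example below is not the Hydra API; it simply shows a network held as nodes and links and exported to CSV, the kind of file a format-conversion App might hand to an external model.

```python
# Minimal sketch (not the Hydra API): a water-resource network represented as
# nodes and links, exported to CSV for consumption by an external model.
import csv
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    node_type: str          # e.g. "reservoir", "demand", "junction"

@dataclass
class Link:
    name: str
    start: str              # name of the upstream node
    end: str                # name of the downstream node
    capacity: float         # illustrative attribute

nodes = [Node("res1", "reservoir"), Node("city", "demand")]
links = [Link("main_canal", "res1", "city", capacity=120.0)]

with open("links.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["name", "start", "end", "capacity"])
    for link in links:
        writer.writerow([link.name, link.start, link.end, link.capacity])
```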
The Australian Bibliographic Network--An Introduction.
ERIC Educational Resources Information Center
Hannan, Chris
1982-01-01
Focuses on the centralized, online, shared-cataloging facility of the Australian Bibliographic Network operated by the National Library of Australia. Existing automated library systems in Australia, selection and implementation of Washington Library Network software, comparison of Australian and U.S. automation, and projections for future…
Building a Digital Library for Multibeam Data, Images and Documents
NASA Astrophysics Data System (ADS)
Miller, S. P.; Staudigel, H.; Koppers, A.; Johnson, C.; Cande, S.; Sandwell, D.; Peckman, U.; Becker, J. J.; Helly, J.; Zaslavsky, I.; Schottlaender, B. E.; Starr, S.; Montoya, G.
2001-12-01
The Scripps Institution of Oceanography, the UCSD Libraries and the San Diego Supercomputing Center have joined forces to establish a digital library for accessing a wide range of multibeam and marine geophysical data, to a community that ranges from the MGG researcher to K-12 outreach clients. This digital library collection will include 233 multibeam cruises with grids, plots, photographs, station data, technical reports, planning documents and publications, drawn from the holdings of the Geological Data Center and the SIO Archives. Inquiries will be made through an Ocean Exploration Console, reminiscent of a cockpit display where a multitude of data may be displayed individually or in two or three-dimensional projections. These displays will provide access to cruise data as well as global databases such as Global Topography, crustal age, and sediment thickness, thus meeting the day-to-day needs of researchers as well as educators, students, and the public. The prototype contains a few selected expeditions, and a review of the initial approach will be solicited from the user community during the poster session. The search process can be focused by a variety of constraints: geospatial (lat-lon box), temporal (e.g., since 1996), keyword (e.g., cruise, place name, PI, etc.), or expert-level (e.g., K-6 or researcher). The Storage Resource Broker (SRB) software from the SDSC manages the evolving collection as a series of distributed but related archives in various media, from shipboard data through processing and final archiving. The latest version of MB-System provides for the systematic creation of standard metadata, and for the harvesting of metadata from multibeam files. Automated scripts will be used to load the metadata catalog to enable queries with an Oracle database management system. These new efforts to bridge the gap between libraries and data archives are supported by the NSF Information Technology and National Science Digital Library (NSDL) programs, augmented by UC funds, and closely coordinated with Digital Library for Earth System Education (DLESE) activities.
Using the Intel Math Kernel Library on Peregrine | High-Performance Computing | NREL
Learn how to use the Intel Math Kernel Library (MKL) with Peregrine system software. Core math functions in MKL include BLAS, LAPACK, ScaLAPACK, sparse solvers, and fast Fourier transforms.
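On systems where NumPy and SciPy are built against MKL (a common configuration with Intel toolchains, though this is an assumption about the local install), the BLAS routines listed above can be exercised directly from Python, as in this small check:

```python
# Quick check of a BLAS call from Python, assuming NumPy/SciPy on the cluster
# are linked against MKL (typical when loading the Intel toolchain modules).
import numpy as np
from scipy.linalg import blas

a = np.random.rand(512, 512)
b = np.random.rand(512, 512)

# dgemm computes alpha*a*b (+ beta*c); here just alpha*a*b.
c = blas.dgemm(alpha=1.0, a=a, b=b)
print(c.shape)                      # (512, 512)
print(np.allclose(c, a @ b))        # True
```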
ERIC Educational Resources Information Center
Kenan, Sharon K.
2012-01-01
Technological innovations have transformed all areas of community college libraries. Automated library systems, office software, and Internet access have altered work processes for library personnel and have changed research methodologies for students and faculty. The purpose of this bounded multiple case study was to explore how the adoption of…
Castañón, Jesús; Román, José Pablo; Jessop, Theodore C; de Blas, Jesús; Haro, Rubén
2018-06-01
DNA-encoded libraries (DELs) have emerged as an efficient and cost-effective drug discovery tool for the exploration and screening of very large chemical space using small-molecule collections of unprecedented size. Herein, we report an integrated automation and informatics system designed to enhance the quality, efficiency, and throughput of the production and affinity selection of these libraries. The platform is governed by software developed according to a database-centric architecture to ensure data consistency, integrity, and availability. Through its versatile protocol management functionalities, this application captures the wide diversity of experimental processes involved with DEL technology, keeps track of working protocols in the database, and uses them to command robotic liquid handlers for the synthesis of libraries. This approach provides full traceability of building-blocks and DNA tags in each split-and-pool cycle. Affinity selection experiments and high-throughput sequencing reads are also captured in the database, and the results are automatically deconvoluted and visualized in customizable representations. Researchers can compare results of different experiments and use machine learning methods to discover patterns in data. As of this writing, the platform has been validated through the generation and affinity selection of various libraries, and it has become the cornerstone of the DEL production effort at Lilly.
Flexible session management in a distributed environment
NASA Astrophysics Data System (ADS)
Miller, Zach; Bradley, Dan; Tannenbaum, Todd; Sfiligoi, Igor
2010-04-01
Many secure communication libraries used by distributed systems, such as SSL, TLS, and Kerberos, fail to make a clear distinction between the authentication, session, and communication layers. In this paper we introduce CEDAR, the secure communication library used by the Condor High Throughput Computing software, and present the advantages to a distributed computing system resulting from CEDAR's separation of these layers. Regardless of the authentication method used, CEDAR establishes a secure session key, which has the flexibility to be used for multiple capabilities. We demonstrate how a layered approach to security sessions can avoid round-trips and latency inherent in network authentication. The creation of a distinct session management layer allows for optimizations to improve scalability by way of delegating sessions to other components in the system. This session delegation creates a chain of trust that reduces the overhead of establishing secure connections and enables centralized enforcement of system-wide security policies. Additionally, secure channels based upon UDP datagrams are often overlooked by existing libraries; we show how CEDAR's structure accommodates this as well. As an example of the utility of this work, we show how the use of delegated security sessions and other techniques inherent in CEDAR's architecture enables US CMS to meet their scalability requirements in deploying Condor over large-scale, wide-area grid systems.
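The layering idea, establishing a session key once and reusing it for later connections regardless of how authentication was performed, can be sketched as a simple cache. This is a conceptual illustration only, not CEDAR's API; the names and the lifetime policy are made up.

```python
# Conceptual sketch (not CEDAR): a session cache that stores a key established
# during one authenticated handshake and reuses it for later connections to
# the same peer, avoiding repeated authentication round-trips.
import os
import time

SESSION_LIFETIME = 3600  # seconds; illustrative policy

_sessions = {}  # peer id -> (session_key, expiry)

def expensive_authentication(peer):
    # Stand-in for Kerberos/GSI/password authentication; returns a fresh key.
    return os.urandom(32)

def get_session_key(peer):
    cached = _sessions.get(peer)
    if cached and cached[1] > time.time():
        return cached[0]                      # reuse the established session
    key = expensive_authentication(peer)      # full handshake only when needed
    _sessions[peer] = (key, time.time() + SESSION_LIFETIME)
    return key
```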
Challenges in Small Screening Laboratories: SaaS to the rescue
Lemmon, Vance P.; Jia, Yuanyuan; Shi, Yan; Holbrook, S. Douglas; Bixby, John L; Buchser, William
2012-01-01
The Miami Project to Cure Paralysis, part of the University of Miami Miller School of Medicine, includes a laboratory devoted to High Content Analysis (HCA) of neurons. The goal of the laboratory is to uncover signalling pathways, genes, compounds, or drugs that can be used to promote nerve growth. HCA permits the quantification of neuronal morphology, including the lengths and numbers of axons. HCA screening of various libraries on primary neurons requires a team-based approach, a variety of process steps and complex manipulations of cells and libraries to obtain meaningful results. HCA itself produces vast amounts of information including images, well-based data and cell-based phenotypic measures. Managing experimental workflow and library data, along with the extensive amount of experimental results, is challenging. For academic laboratories generating large data sets from experiments using thousands of perturbagens, a laboratory information management system (LIMS) is the data tracking solution of choice. With both productivity and efficiency as driving rationales, the Miami Project has equipped its HCA laboratory with a Software as a Service (SaaS) LIMS to ensure the quality of its experiments and workflows. The article discusses this application in detail, how the system was selected, and how it was integrated into the laboratory. The advantages of SaaS are described. PMID:21631415
Automated software development workstation
NASA Technical Reports Server (NTRS)
Prouty, Dale A.; Klahr, Philip
1988-01-01
A workstation is being developed that provides a computational environment for all NASA engineers across application boundaries, which automates reuse of existing NASA software and designs, and efficiently and effectively allows new programs and/or designs to be developed, catalogued, and reused. The generic workstation is made domain specific by specialization of the user interface, capturing engineering design expertise for the domain, and by constructing/using a library of pertinent information. The incorporation of software reusability principles and expert system technology into this workstation provide the obvious benefits of increased productivity, improved software use and design reliability, and enhanced engineering quality by bringing engineering to higher levels of abstraction based on a well tested and classified library.
Ueno, Yutaka; Ito, Shuntaro; Konagaya, Akihiko
2014-12-01
To better understand the behaviors and structural dynamics of proteins within a cell, novel software tools are being developed that can create molecular animations based on the findings of structural biology. This study describes a method, developed from our prototypes, for detecting collisions and examining the soft-body dynamics of molecular models. The code was implemented with a software development toolkit for rigid-body dynamics simulation and a three-dimensional graphics library. The essential functions of the target software system included the basic molecular modeling environment, collision detection in the molecular models, and physical simulations of the movement of the model. Taking advantage of recent software technologies such as physics simulation modules and an interpreted scripting language, the functions required for accurate and meaningful molecular animation were implemented efficiently.
ERIC Educational Resources Information Center
Abrams, Mary; Kurlychek, Ken
1989-01-01
This article describes the Software Evaluation Clearinghouse for Educators of the Hearing Impaired at Gallaudet University (Washington, DC). Software compatible with Apple and IBM hardware is collected, rated by clearinghouse members, and described in a printed catalog. Tips on starting a software lending library are offered. (PB)
xSDK Foundations: Toward an Extreme-scale Scientific Software Development Kit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heroux, Michael A.; Bartlett, Roscoe; Demeshko, Irina
Here, extreme-scale computational science increasingly demands multiscale and multiphysics formulations. Combining software developed by independent groups is imperative: no single team has resources for all predictive science and decision support capabilities. Scientific libraries provide high-quality, reusable software components for constructing applications with improved robustness and portability. However, without coordination, many libraries cannot be easily composed. Namespace collisions, inconsistent arguments, lack of third-party software versioning, and additional difficulties make composition costly. The Extreme-scale Scientific Software Development Kit (xSDK) defines community policies to improve code quality and compatibility across independently developed packages (hypre, PETSc, SuperLU, Trilinos, and Alquimia) and provides a foundation for addressing broader issues in software interoperability, performance portability, and sustainability. The xSDK provides turnkey installation of member software and seamless combination of aggregate capabilities, and it marks first steps toward extreme-scale scientific software ecosystems from which future applications can be composed rapidly with assured quality and scalability.
Techniques for Efficiently Managing Large Geosciences Data Sets
NASA Astrophysics Data System (ADS)
Kruger, A.; Krajewski, W. F.; Bradley, A. A.; Smith, J. A.; Baeck, M. L.; Steiner, M.; Lawrence, R. E.; Ramamurthy, M. K.; Weber, J.; Delgreco, S. A.; Domaszczynski, P.; Seo, B.; Gunyon, C. A.
2007-12-01
We have developed techniques and software tools for efficiently managing large geosciences data sets. While the techniques were developed as part of an NSF-funded ITR project that focuses on making NEXRAD weather data and rainfall products available to hydrologists and other scientists, they are relevant to other geosciences disciplines that deal with large data sets. Metadata, relational databases, data compression, and networking are central to our methodology. Data and derived products are stored on file servers in a compressed format. URLs to, and metadata about, the data and derived products are managed in a PostgreSQL database. Virtually all access to the data and products is through this database. Geosciences data normally require a number of processing steps to transform the raw data into useful products: data quality assurance, coordinate transformations and georeferencing, applying calibration information, and many more. We have developed the concept of crawlers that manage this scientific workflow. Crawlers are unattended processes that run indefinitely and at set intervals query the database for their next assignment. A database table functions as a roster for the crawlers. Crawlers perform well-defined tasks that are, except perhaps for sequencing, largely independent from other crawlers. Once a crawler is done with its current assignment, it updates the database roster table and gets its next assignment by querying the database. We have developed a library that enables one to quickly add crawlers. The library provides hooks to external (i.e., C-language) compiled codes, so that developers can work and contribute independently. Processes called ingesters inject data into the system. The bulk of the data comes from a real-time feed using UCAR/Unidata's IDD/LDM software. An exciting recent development is the establishment of a Unidata HYDRO feed that carries value-added metadata over the IDD/LDM. Ingesters grab the metadata and populate the PostgreSQL tables. These and other concepts we have developed have enabled us to efficiently manage a 70 TB (and growing) weather radar data set.
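The crawler-and-roster pattern described above can be sketched as a small polling loop. The table and column names below are hypothetical, and the connection string is a placeholder; the point is only to show a crawler claiming its next assignment from a PostgreSQL roster table and marking it done.

```python
# Sketch of the crawler pattern: an unattended process that polls a roster
# table for its next assignment, does the work, and records completion.
# Table, columns, and connection string are hypothetical.
import time
import psycopg2

def run_crawler(task_name, do_work, poll_seconds=60):
    conn = psycopg2.connect("dbname=radar user=crawler")
    while True:
        with conn, conn.cursor() as cur:
            cur.execute(
                "SELECT id, payload FROM roster "
                "WHERE task = %s AND status = 'pending' "
                "ORDER BY id LIMIT 1 FOR UPDATE SKIP LOCKED",
                (task_name,))
            row = cur.fetchone()
            if row:
                job_id, payload = row
                do_work(payload)                      # the crawler's one task
                cur.execute("UPDATE roster SET status = 'done' WHERE id = %s",
                            (job_id,))
        if not row:
            time.sleep(poll_seconds)                  # idle until next check
```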
Hagerstown-Jefferson Township Public Library Internet Web Site.
ERIC Educational Resources Information Center
Albertson, Marie
1997-01-01
Describes the development of the Hagerstown (Indiana) public library's Web site. Highlights include writing successful grant proposals for funding; software from Microsoft; community support; free community access to the Internet from home computers as well as at the library; problems encountered; and future plans. (LRW)
A Model of RHIC Using the Unified Accelerator Libraries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pilat, F.; Tepikian, S.; Trahern, C. G.
1998-01-01
The Unified Accelerator Library (UAL) is an object-oriented and modular software environment for accelerator physics which comprises an accelerator object model for the description of the machine (SMF, for Standard Machine Format), a collection of physics libraries, and a Perl interface that provides a homogeneous shell for integrating and managing these components. Currently available physics libraries include TEAPOT++, a collection of C++ physics modules conceptually derived from TEAPOT, and DNZLIB, a differential algebra package for map generation. This software environment has been used to build a flat model of RHIC which retains the hierarchical lattice description while assigning specific characteristics to individual elements, such as measured field harmonics. A first application of the model and of the simulation capabilities of UAL has been the study of RHIC stability in the presence of Siberian snakes and spin rotators. The building blocks of RHIC snakes and rotators are helical dipoles, unconventional devices that cannot be modeled by traditional accelerator physics codes and have been implemented in UAL as Taylor maps. Section 2 describes the RHIC data stores, Section 3 the RHIC SMF format and Section 4 the RHIC-specific Perl interface (RHIC Shell). Section 5 explains how the RHIC SMF and UAL have been used to study the RHIC dynamic behavior and presents detuning and dynamic aperture results. If the reader is not familiar with the motivation and characteristics of UAL, we include in the Appendix a useful overview paper. An example of a complete set of Perl scripts for RHIC simulation can also be found in the Appendix.
Continuous quality improvement using intelligent infusion pump data analysis.
Breland, Burnis D
2010-09-01
The use of continuous quality-improvement (CQI) processes in the implementation of intelligent infusion pumps in a community teaching hospital is described. After the decision was made to implement intelligent i.v. infusion pumps in a 413-bed, community teaching hospital, drug libraries for use in the safety software had to be created. Before drug libraries could be created, it was necessary to determine the epidemiology of medication use in various clinical care areas. Standardization of medication administration was performed through the CQI process, using practical knowledge of clinicians at the bedside and evidence-based drug safety parameters in the scientific literature. Post-implementation, CQI allowed refinement of clinically important safety limits while minimizing inappropriate, meaningless soft limit alerts on a few select agents. Assigning individual clinical care areas (CCAs) to individual patient care units facilitated customization of drug libraries and identification of specific CCA compliance concerns. Between June 2007 and June 2008, there were seven library updates. These involved drug additions and deletions, customization of individual CCAs, and alterations of limits. Overall compliance with safety software use rose over time, from 33% in November 2006 to over 98% in December 2009. Many potentially clinically significant dosing errors were intercepted by the safety software, prompting edits by end users. Only 4-6% of soft limit alerts resulted in edits. Compliance rates for use of infusion pump safety software varied among CCAs over time. Education, auditing, and refinement of drug libraries led to improved compliance in most CCAs.
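A drastically simplified sketch of the kind of limit checking the drug-library safety software performs is shown below. The drug, limits, and units are illustrative only and are not taken from the hospital's library; real systems distinguish many more limit types and care-area rules.

```python
# Simplified illustration (not the vendor's safety software) of drug-library
# dose checking: hard-limit violations block the infusion, soft-limit
# violations raise an alert the clinician may confirm or edit.
drug_library = {
    # clinical care area -> drug -> dosing limits (illustrative, mcg/kg/min)
    "ICU": {
        "dopamine": {"soft_max": 20.0, "hard_max": 50.0},
    },
}

def check_dose(cca, drug, dose):
    limits = drug_library[cca][drug]
    if dose > limits["hard_max"]:
        return "blocked: exceeds hard limit"
    if dose > limits["soft_max"]:
        return "soft-limit alert: confirm or edit"
    return "accepted"

print(check_dose("ICU", "dopamine", 25.0))   # soft-limit alert: confirm or edit
```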
Rehabilitation and Prosthetic Services
DMG-α--a computational geometry library for multimolecular systems.
Szczelina, Robert; Murzyn, Krzysztof
2014-11-24
The DMG-α library grants researchers in the fields of computational biology, chemistry, and biophysics access to open-source, easy to use, and intuitive software for performing fine-grained geometric analysis of molecular systems. The library is capable of computing power diagrams (weighted Voronoi diagrams) in three dimensions with 3D periodic boundary conditions, computing approximate projective 2D Voronoi diagrams on arbitrarily defined surfaces, performing shape-property recognition using α-shape theory, and performing exact Solvent Accessible Surface Area (SASA) computation. The software is written mainly as a template-based C++ library for greater performance, but a rich Python interface (pydmga) is provided as a convenient way to manipulate the DMG-α routines. To illustrate possible applications of the DMG-α library, we present results of sample analyses which allowed us to determine nontrivial geometric properties of two Escherichia coli-specific lipids as emerging from molecular dynamics simulations of relevant model bilayers.
Educating Librarians and Information Resource Managers: Differing Management Perspectives?
ERIC Educational Resources Information Center
Bouthillier, France
1993-01-01
Examines differences between library management and information resource management (IRM). Highlights include a historical perspective of library management education and IRM; the organizational perspective of library management and the emphasis of information as a resource in IRM; library management and advances in information technology; and…
2015-08-21
using the Open Computer Vision (OpenCV) libraries [6] for computer vision and the Qt library [7] for the user interface. The software has the...depth. The software application calibrates the cameras using the plane-based calibration model from the OpenCV calib3d module and allows the...
[6] OpenCV. 2015. OpenCV Open Source Computer Vision. [Online]. Available at: opencv.org [Accessed]: 09/01/2015.
[7] Qt. 2015. Qt Project home
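For context, plane-based calibration with OpenCV's calib3d module typically follows the chessboard workflow sketched below. This is a generic illustration, not the report's code; the file pattern and board size are placeholders.

```python
# Generic sketch of plane-based (chessboard) camera calibration with OpenCV.
# File names and board dimensions are placeholders.
import glob
import cv2
import numpy as np

board = (9, 6)  # inner corners per row and column of the calibration target
objp = np.zeros((board[0] * board[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in glob.glob("calib_*.png"):       # placeholder image set
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, board)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Intrinsics and distortion coefficients from the detected planar patterns.
ret, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print(camera_matrix)
```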
Commercial imagery archive, management, exploitation, and distribution project development
NASA Astrophysics Data System (ADS)
Hollinger, Bruce; Sakkas, Alysa
1999-10-01
The Lockheed Martin (LM) team had garnered over a decade of operational experience on the U.S. Government's IDEX II (Imagery Dissemination and Exploitation) system. Recently, it set out to create a new commercial product to serve the needs of large-scale imagery archiving and analysis markets worldwide. LM decided to provide a turnkey commercial solution to receive, store, retrieve, process, analyze and disseminate in 'push' or 'pull' modes imagery, data and data products using a variety of sources and formats. LM selected 'best of breed' hardware and software components and adapted and developed its own algorithms to provide added functionality not commercially available elsewhere. The resultant product, the Intelligent Library System (ILS)™, satisfies requirements for (1) a potentially unbounded data archive (5000 TB range); (2) automated workflow management for increased user productivity; (3) automatic tracking and management of files stored on shelves; (4) ability to ingest, process and disseminate data volumes with bandwidths ranging up to multi-gigabit per second; (5) access through a thin client-to-server network environment; (6) multiple interactive users needing retrieval of files in seconds from both archived images or in real time; and (7) scalability that maintains information throughput performance as the size of the digital library grows.
Commercial imagery archive, management, exploitation, and distribution product development
NASA Astrophysics Data System (ADS)
Hollinger, Bruce; Sakkas, Alysa
1999-12-01
The Lockheed Martin (LM) team had garnered over a decade of operational experience on the U.S. Government's IDEX II (Imagery Dissemination and Exploitation) system. Recently, it set out to create a new commercial product to serve the needs of large-scale imagery archiving and analysis markets worldwide. LM decided to provide a turnkey commercial solution to receive, store, retrieve, process, analyze and disseminate in 'push' or 'pull' modes imagery, data and data products using a variety of sources and formats. LM selected 'best of breed' hardware and software components and adapted and developed its own algorithms to provide added functionality not commercially available elsewhere. The resultant product, the Intelligent Library System (ILS)™, satisfies requirements for (a) a potentially unbounded data archive (5000 TB range); (b) automated workflow management for increased user productivity; (c) automatic tracking and management of files stored on shelves; (d) ability to ingest, process and disseminate data volumes with bandwidths ranging up to multi-gigabit per second; (e) access through a thin client-to-server network environment; (f) multiple interactive users needing retrieval of files in seconds from both archived images or in real time; and (g) scalability that maintains information throughput performance as the size of the digital library grows.
Panorama: A Targeted Proteomics Knowledge Base
2015-01-01
Panorama is a web application for storing, sharing, analyzing, and reusing targeted assays created and refined with Skyline [1], an increasingly popular Windows client software tool for targeted proteomics experiments. Panorama allows laboratories to store and organize curated results contained in Skyline documents with fine-grained permissions, which facilitates distributed collaboration and secure sharing of published and unpublished data via a web-browser interface. It is fully integrated with the Skyline workflow and supports publishing a document directly to a Panorama server from the Skyline user interface. Panorama captures the complete Skyline document information content in a relational database schema. Curated results published to Panorama can be aggregated and exported as chromatogram libraries. These libraries can be used in Skyline to pick optimal targets in new experiments and to validate peak identification of target peptides. Panorama is open-source and freely available. It is distributed as part of LabKey Server [2], an open source biomedical research data management system. Laboratories and organizations can set up Panorama locally by downloading and installing the software on their own servers. They can also request freely hosted projects on https://panoramaweb.org, a Panorama server maintained by the Department of Genome Sciences at the University of Washington. PMID:25102069
Finding the Right Educational Software for Your Child.
ERIC Educational Resources Information Center
Moore, Jack
1990-01-01
Ideas are presented for identifying, evaluating, and selecting instructional software for children with special needs. The article notes several library research tools as sources of information and lists specific questions to consider when evaluating software. (JDD)
Cataloging and Organizing Microcomputer Software--Where Do We Go from First Base?
ERIC Educational Resources Information Center
Choi, Susan E.
This position paper addresses general topics to be considered when organizing library software collections. Tasks involved in organizing and cataloging educational software collections are discussed, including arrangement/classification; the type of catalog; descriptions of the software; the general materials designator; storage requirements; and…
Porting and refurbishment of the WSS TNG control software
NASA Astrophysics Data System (ADS)
Caproni, Alessandro; Zacchei, Andrea; Vuerli, Claudio; Pucillo, Mauro
2004-09-01
The Workstation Software System (WSS) is the high-level control software of the Italian Galileo Galilei Telescope, located on La Palma in the Canary Islands, developed at the beginning of the 1990s for HP-UX workstations. WSS may be seen as a middle-layer software system that manages the communications between the real-time systems (VME), different workstations and high-level applications, providing a uniform distributed environment. The project to port the control software from the HP workstations to the Linux environment started at the end of 2001. It aimed to refurbish the control software by introducing some of the new software technologies and languages available for free in the Linux operating system. The project was realized by gradually substituting each HP workstation with a Linux PC, with the goal of avoiding major changes in the original software running under HP-UX. Three main phases characterized the project: creation of a simulated control room with several Linux PCs running WSS (to check all the functionality); insertion into the simulated control room of some HPs (to check the mixed environment); substitution of the HP workstations in the real control room. From a software point of view, the project introduces some new technologies, like multi-threading, and the possibility to develop high-level WSS applications with almost every programming language that implements the Berkeley sockets. A library to develop Java applications has also been created and tested.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Auld, Joshua; Hope, Michael; Ley, Hubert
This paper discusses the development of an agent-based modelling software development kit, and the implementation and validation of a model built with it that integrates dynamic simulation of travel demand, network supply and network operations. A description is given of the core utilities in the kit: a parallel discrete event engine, an interprocess exchange engine, and a memory allocator, as well as a number of ancillary utilities: a visualization library, a database IO library, and a scenario manager. The overall framework emphasizes the design goals of generality, code agility, and high performance. This framework allows several aspects of the transportation system that are typically handled by separate stand-alone software applications to be modeled in a high-performance and extensible manner. Integrating models such as dynamic traffic assignment and disaggregate demand models has been a long-standing issue for transportation modelers; the integrated approach shows a possible way to resolve this difficulty. The simulation model built from the POLARIS framework is a single, shared-memory process for handling all aspects of the integrated urban simulation. The resulting gains in computational efficiency and performance allow planning models to be extended to include previously separate aspects of the urban system, enhancing the utility of such models from the planning perspective. Initial tests with case studies involving traffic management center impacts on various network events such as accidents, congestion and weather events show the potential of the system.
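At the heart of such a kit is a discrete-event engine. The toy version below (not POLARIS code) shows the basic mechanism: events sit in a priority queue ordered by simulation time and are dispatched in order.

```python
# A toy discrete-event engine, included only to illustrate the kind of core
# utility such a kit provides: events are kept in a priority queue ordered by
# simulation time and dispatched in order.
import heapq
import itertools

class EventEngine:
    def __init__(self):
        self._queue = []
        self._counter = itertools.count()   # tie-breaker for equal times

    def schedule(self, time, handler, *args):
        heapq.heappush(self._queue, (time, next(self._counter), handler, args))

    def run(self, until):
        while self._queue and self._queue[0][0] <= until:
            time, _, handler, args = heapq.heappop(self._queue)
            handler(time, *args)

engine = EventEngine()
engine.schedule(5.0, lambda t: print(f"{t}: traveler departs"))
engine.schedule(2.0, lambda t: print(f"{t}: incident on link 12"))
engine.run(until=10.0)
```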
A real-time MPEG software decoder using a portable message-passing library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kwong, Man Kam; Tang, P.T. Peter; Lin, Biquan
1995-12-31
We present a real-time MPEG software decoder that uses message-passing libraries such as MPL, p4 and MPI. The parallel MPEG decoder currently runs on the IBM SP system but can be easily ported to other parallel machines. This paper discusses our parallel MPEG decoding algorithm as well as the parallel programming environment under which it runs. Several technical issues are discussed, including balancing of decoding speed, memory limitations, I/O capacities, and optimization of MPEG decoding components. This project shows that a real-time portable software MPEG decoder is feasible on a general-purpose parallel machine.
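A hedged sketch of the master/worker style decomposition such a decoder can use is given below, written with mpi4py rather than MPL, p4, or MPI's C bindings. Here decode_frame is a placeholder for the actual MPEG decoding routine, and the static frame distribution ignores the load-balancing and I/O issues the paper discusses.

```python
# Hedged sketch of distributing frame decoding over MPI ranks with mpi4py.
# decode_frame is a placeholder; real decoders must also handle inter-frame
# dependencies, load balancing, and I/O, which are omitted here.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

def decode_frame(frame_id):
    return f"decoded frame {frame_id}"      # placeholder for real decoding

NUM_FRAMES = 32
# Simple static distribution: rank r decodes every size-th frame.
my_frames = range(rank, NUM_FRAMES, size)
results = [decode_frame(f) for f in my_frames]

# Gather decoded frames on rank 0 for reordering and display.
all_results = comm.gather(results, root=0)
if rank == 0:
    print(sum(len(r) for r in all_results), "frames decoded")
```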
Logic Model Checking of Time-Periodic Real-Time Systems
NASA Technical Reports Server (NTRS)
Florian, Mihai; Gamble, Ed; Holzmann, Gerard
2012-01-01
In this paper we report on the work we performed to extend the logic model checker SPIN with built-in support for the verification of periodic, real-time embedded software systems, as commonly used in aircraft, automobiles, and spacecraft. We first extended the SPIN verification algorithms to model priority based scheduling policies. Next, we added a library to support the modeling of periodic tasks. This library was used in a recent application of the SPIN model checker to verify the engine control software of an automobile, to study the feasibility of software triggers for unintended acceleration events.
ERIC Educational Resources Information Center
Romero, Juan Suarez
2010-01-01
Public libraries are redefining their roles in order to stay relevant to the needs of the communities they serve. Today, libraries are places where reading meets hands-on learning and where quietness coexists with voices and music. The latest advances in technology for children and teens, specifically, robotics sets and media-rich software, are…
The New York City Subways: The First Ten Years. A Library Research Exercise Using a Computer.
ERIC Educational Resources Information Center
Machalow, Robert
This document presents a library research exercise developed at York College which uses the Apple IIe microcomputer and word processing software--the Applewriter--to teach library research skills. Unlike some other library research exercises on disk, this program allows the student to decide on alternative approaches to solving the given problem:…
Nunez-Iglesias, Juan; Kennedy, Ryan; Plaza, Stephen M.; Chakraborty, Anirban; Katz, William T.
2014-01-01
The aim in high-resolution connectomics is to reconstruct complete neuronal connectivity in a tissue. Currently, the only technology capable of resolving the smallest neuronal processes is electron microscopy (EM). Thus, a common approach to network reconstruction is to perform (error-prone) automatic segmentation of EM images, followed by manual proofreading by experts to fix errors. We have developed an algorithm and software library to not only improve the accuracy of the initial automatic segmentation, but also point out the image coordinates where it is likely to have made errors. Our software, called gala (graph-based active learning of agglomeration), improves the state of the art in agglomerative image segmentation. It is implemented in Python and makes extensive use of the scientific Python stack (numpy, scipy, networkx, scikit-learn, scikit-image, and others). We present here the software architecture of the gala library, and discuss several designs that we consider would be generally useful for other segmentation packages. We also discuss the current limitations of the gala library and how we intend to address them. PMID:24772079
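To give a concrete sense of the pipeline, the snippet below produces the kind of initial oversegmentation that an agglomeration step such as gala's would then refine. It uses scikit-image and NumPy from the scientific Python stack mentioned above rather than gala's own API, and the random image stands in for an EM section.

```python
# Sketch of an initial oversegmentation that agglomeration would refine.
# Uses scikit-image, not gala's API; the random image stands in for EM data.
import numpy as np
from skimage.filters import sobel
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

image = np.random.rand(128, 128)             # stand-in for an EM section
edges = sobel(image)                         # boundary-strength proxy

# Seed one marker per local minimum of the boundary map, then flood.
seeds = np.zeros_like(image, dtype=int)
coords = peak_local_max(-edges, min_distance=5)
seeds[tuple(coords.T)] = np.arange(1, len(coords) + 1)
superpixels = watershed(edges, markers=seeds)

print(superpixels.max(), "fragments to be agglomerated")
```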
Using component technology to facilitate external software reuse in ground-based planning systems
NASA Technical Reports Server (NTRS)
Chase, A.
2003-01-01
APGEN (Activity Plan GENerator - 314), a multi-mission planning tool, must interface with external software to best serve its users. APGEN's original method for incorporating external software, the User-Defined Library mechanism, has been very successful in giving APGEN users access to external software functionality.
The tensor network theory library
NASA Astrophysics Data System (ADS)
Al-Assam, S.; Clark, S. R.; Jaksch, D.
2017-09-01
In this technical paper we introduce the tensor network theory (TNT) library—an open-source software project aimed at providing a platform for rapidly developing robust, easy to use and highly optimised code for TNT calculations. The objectives of this paper are (i) to give an overview of the structure of TNT library, and (ii) to help scientists decide whether to use the TNT library in their research. We show how to employ the TNT routines by giving examples of ground-state and dynamical calculations of one-dimensional bosonic lattice system. We also discuss different options for gaining access to the software available at www.tensornetworktheory.org.
A Security-façade Library for Virtual-observatory Software
NASA Astrophysics Data System (ADS)
Rixon, G.
2009-09-01
The security-façade library implements, for Java, IVOA's security standards. It supports the authentication mechanisms for SOAP and REST web-services, the sign-on mechanisms (with MyProxy, AstroGrid Accounts protocol or local credential-caches), the delegation protocol, and RFC3820-enabled HTTPS for Apache Tomcat. Using the façade, a developer who is not a security specialist can easily add access control to a virtual-observatory service and call secured services from an application. The library has been an internal part of AstroGrid software for some time and it is now offered for use by other developers.
Management Preparation and Training of Department Heads in ARL Libraries.
ERIC Educational Resources Information Center
Wittenbach, Stefanie A.; And Others
1992-01-01
A survey of cataloging and reference department heads in ARL (Association for Research Libraries) member libraries showed that both professional experience and management coursework play a part in a librarian's promotion to department head. It is suggested that library management will improve if libraries require formal management preparation and…
The Microcomputer in the Library: VI. Implementation and Future Development.
ERIC Educational Resources Information Center
Leggate, Peter; Dyer, Hilary
1986-01-01
This sixth article in a series discusses planning for the installation and implementation of automated systems in the library, workstation design and location, scheduling of software implementation, security, data input, staff and reader training, job design, impact of automation on library procedures, evaluation of system performance, and future…
101 Computer Projects for Libraries. 101 Micro Series.
ERIC Educational Resources Information Center
Dewey, Patrick R.
The projects collected in this book represent a wide cross section of the way microcomputers are used in libraries. Each project description includes organization and contact information, hardware and software used, cost and project length estimates, and Web or print references when available. Projects come from academic and public libraries,…
Library Users: How They Adapt to Changing Roles.
ERIC Educational Resources Information Center
Miido, Helis
Traditional library tasks, for example database searching, are increasingly performed by library users, forcing both the librarian and the user to assume at times dichotomous roles of teacher and student. Modern librarians install new software and guide organizations in multimedia applications. Librarians need to be cognizant of the human factor,…
Opening up Library Automation Software
ERIC Educational Resources Information Center
Breeding, Marshall
2009-01-01
Throughout the history of library automation, the author has seen a steady advancement toward more open systems. In the early days of library automation, when proprietary systems dominated, the need for standards was paramount since other means of inter-operability and data exchange weren't possible. Today's focus on Application Programming…
An Integrated Library System from Existing Microcomputer Programs.
ERIC Educational Resources Information Center
Kuntz, Lynda S.
1988-01-01
Demonstrates how three commercial microcomputer software packages--PC-Talk III, Wordstar, and dBase III--were combined to produce an integrated library system at the U.S. Army Concepts Analysis Agency library. The retrospective conversion process is discussed, and the four modules of the system are described: acquisitions/cataloging; online…
Mini-Micro CDS/ISIS in the Thailand Development Research Institute Library.
ERIC Educational Resources Information Center
Wongkoltoot, Poonsin; Indee, Somsak
1992-01-01
Describes the Thailand Development Research Institute Library's development of an integrated bibliographic system using UNESCO's Micro-ISIS software. Linkages between databases were made using an in-house application (TIBIS) written in CDS/ISIS Pascal. The library system is available on a local area network (LAN). (three references) (EA)
Washington Public Libraries Online: Collaborating in Cyberspace.
ERIC Educational Resources Information Center
Wildin, Nancy
1997-01-01
Discussion of public libraries, the Internet, and the World Wide Web focuses on development of a Web site in Washington. Highlights include access to the Internet through online public access catalogs; partnerships between various types of libraries; hardware and software; HTML training; content design; graphics design; marketing; evaluation; and…
Software To Go: A Catalog of Software Available for Loan.
ERIC Educational Resources Information Center
Kurlychek, Ken, Comp.
This catalog lists the holdings of the Software To Go software lending library and clearinghouse for programs and agencies serving students or clients who are deaf or hard of hearing. An introduction describes the clearinghouse and its collection of software, much of it commercial and copyrighted material, for Apple, Macintosh, and IBM (MS-DOS)…
Maintenance of Automated Library Systems.
ERIC Educational Resources Information Center
Epstein, Susan Baerg
1983-01-01
Discussion of the maintenance of both the software and hardware in an automated library system highlights maintenance by the vendor, contracts and costs, the maintenance log, downtime, and planning for trouble. (EJS)
Knowledge Management in Libraries in the 21st Century.
ERIC Educational Resources Information Center
Shanhong, Tang
This paper begins with a section that describes characteristics of knowledge management in libraries, including: human resource management is the core of knowledge management in libraries; the objective of knowledge management in libraries is to promote knowledge innovation; and information technology is a tool for knowledge management in…
An application of machine learning to the organization of institutional software repositories
NASA Technical Reports Server (NTRS)
Bailin, Sidney; Henderson, Scott; Truszkowski, Walt
1993-01-01
Software reuse has become a major goal in the development of space systems, as a recent NASA-wide workshop on the subject made clear. The Data Systems Technology Division of Goddard Space Flight Center has been working on tools and techniques for promoting reuse, in particular in the development of satellite ground support software. One of these tools is the Experiment in Libraries via Incremental Schemata and Cobweb (ElvisC). ElvisC applies machine learning to the problem of organizing a reusable software component library for efficient and reliable retrieval. In this paper we describe the background factors that have motivated this work, present the design of the system, and evaluate the results of its application.
Automated Reuse of Scientific Subroutine Libraries through Deductive Synthesis
NASA Technical Reports Server (NTRS)
Lowry, Michael R.; Pressburger, Thomas; VanBaalen, Jeffrey; Roach, Steven
1997-01-01
Systematic software construction offers the potential of elevating software engineering from an art form to an engineering discipline. The desired result is more predictable software development leading to better quality and more maintainable software. However, the overhead costs associated with the formalisms, mathematics, and methods of systematic software construction have largely precluded their adoption in real-world software development. In fact, many mainstream software development organizations, such as Microsoft, still maintain a predominantly oral culture for software development projects, which is far removed from a formalism-based culture for software development. An exception is the limited domain of safety-critical software, where the high assurance inherent in systematic software construction justifies the additional cost. We believe that systematic software construction will only be adopted by mainstream software development organizations when the overhead costs have been greatly reduced. Two approaches to cost mitigation are reuse (amortizing costs over many applications) and automation. For the last four years, NASA Ames has funded the Amphion project, whose objective is to automate software reuse through techniques from systematic software construction. In particular, deductive program synthesis (i.e., program extraction from proofs) is used to derive a composition of software components (e.g., subroutines) that correctly implements a specification. The construction of reuse libraries of software components is the standard software engineering solution for improving software development productivity and quality.
Feasibility study of an Integrated Program for Aerospace vehicle Design (IPAD). Volume 1A: Summary
NASA Technical Reports Server (NTRS)
Miller, R. E., Jr.; Redhed, D. D.; Kawaguchi, A. S.; Hansen, S. D.; Southall, J. W.
1973-01-01
IPAD was defined as a total system oriented to the product design process. This total system was designed to recognize the product design process, individuals and their design process tasks, and the computer-based IPAD System to aid product design. Principal elements of the IPAD System include the host computer and its interactive system software, new executive and data management software, and an open-ended IPAD library of technical programs to match the intended product design process. The basic goal of the IPAD total system is to increase the productivity of the product design organization. Increases in individual productivity were feasible through automation and computer support of routine information handling. Such proven automation can directly decrease cost and flowtime in the product design process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
SADE is a software package for rapidly assembling analytic pipelines to manipulate data. The package consists of the engine that manages the data and coordinates the movement of data between the tasks performing a function; a set of core libraries consisting of plugins that perform common tasks; and a framework to extend the system, supporting the development of new plugins. Currently, through configuration files, a pipeline can be defined that maps the routing of data through a series of plugins. Pipelines can be run in batch mode or can process streaming data; they can be executed from the command line or run through a Windows background service. There currently exist over a hundred plugins and over fifty pipeline configurations, and the software is now being used by about a half-dozen projects.
How to tap NASA-developed technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruzic, N.
The National Aeronautics and Space Administration (NASA) space program's contribution to technology and the transfer of its achievements to industrial and consumer products is unprecedented. The process of transferring new technology suffers, however, partly because managers tend to ignore new technological markets unless new products solve their specific problems and partly because managers may not know the technology is available. NASA's Technology Utilization Branch has learned to initiate transfer, using a network of centers to dispense information on applications. NASA also has a large software library and computer programs, as well as teams to make person-to-person contacts. Examples of successful transfers have affected energy sources, building construction, health, and safety. (DCK)
BASKET on-board software library
NASA Astrophysics Data System (ADS)
Luntzer, Armin; Ottensamer, Roland; Kerschbaum, Franz
2014-07-01
The University of Vienna is a provider of on-board data processing software with a focus on data compression, as used on board the highly successful Herschel/PACS instrument as well as in the small BRITE-Constellation fleet of cube-sats. Current contributions are made to CHEOPS, SAFARI and PLATO. An effort was made to review the various functions developed for Herschel and provide a consolidated software library to facilitate the work for future missions. This library is a shopping basket of algorithms. Its contents are separated into four classes: auxiliary functions (e.g. circular buffers), preprocessing functions (e.g. for calibration), lossless data compression (arithmetic or Rice coding) and lossy reduction steps (ramp fitting etc.). The "BASKET" has all the functionality that is needed to create an on-board data processing chain. All sources are written in C, supplemented by optimized versions in assembly targeting popular CPU architectures for space applications. BASKET is open source and constantly growing.
Software support for SBGN maps: SBGN-ML and LibSBGN.
van Iersel, Martijn P; Villéger, Alice C; Czauderna, Tobias; Boyd, Sarah E; Bergmann, Frank T; Luna, Augustin; Demir, Emek; Sorokin, Anatoly; Dogrusoz, Ugur; Matsuoka, Yukiko; Funahashi, Akira; Aladjem, Mirit I; Mi, Huaiyu; Moodie, Stuart L; Kitano, Hiroaki; Le Novère, Nicolas; Schreiber, Falk
2012-08-01
LibSBGN is a software library for reading, writing and manipulating Systems Biology Graphical Notation (SBGN) maps stored using the recently developed SBGN-ML file format. The library (available in C++ and Java) makes it easy for developers to add SBGN support to their tools, whereas the file format facilitates the exchange of maps between compatible software applications. The library also supports validation of maps, which simplifies the task of ensuring compliance with the detailed SBGN specifications. With this effort we hope to increase the adoption of SBGN in bioinformatics tools, ultimately enabling more researchers to visualize biological knowledge in a precise and unambiguous manner. Milestone 2 was released in December 2011. Source code, example files and binaries are freely available under the terms of either the LGPL v2.1+ or Apache v2.0 open source licenses from http://libsbgn.sourceforge.net. sbgn-libsbgn@lists.sourceforge.net.
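For orientation, an SBGN-ML document is an XML file of glyphs (nodes) and arcs (edges). The fragment below is illustrative only: the tag and attribute names follow published SBGN-ML examples, the namespace URI is an assumption, and the parsing is done with Python's standard library rather than LibSBGN itself (which targets C++ and Java).

```python
# Illustrative only: a tiny SBGN-ML-style document parsed with the Python
# standard library, to show the glyph/arc structure the format encodes.
# Tag names follow published SBGN-ML examples; the namespace is an assumption.
import xml.etree.ElementTree as ET

SBGN_ML = """<sbgn xmlns="http://sbgn.org/libsbgn/0.2">
  <map language="process description">
    <glyph id="g1" class="macromolecule">
      <label text="MAPK"/>
      <bbox x="10" y="10" w="60" h="30"/>
    </glyph>
    <glyph id="g2" class="simple chemical">
      <label text="ATP"/>
      <bbox x="100" y="10" w="30" h="30"/>
    </glyph>
    <arc id="a1" class="consumption" source="g2" target="g1"/>
  </map>
</sbgn>"""

ns = {"s": "http://sbgn.org/libsbgn/0.2"}
root = ET.fromstring(SBGN_ML)
for glyph in root.findall(".//s:glyph", ns):
    label = glyph.find("s:label", ns)
    print(glyph.get("id"), glyph.get("class"), label.get("text"))
```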
TimeBench: a data model and software library for visual analytics of time-oriented data.
Rind, Alexander; Lammarsch, Tim; Aigner, Wolfgang; Alsallakh, Bilal; Miksch, Silvia
2013-12-01
Time-oriented data play an essential role in many Visual Analytics scenarios such as extracting medical insights from collections of electronic health records or identifying emerging problems and vulnerabilities in network traffic. However, many software libraries for Visual Analytics treat time as a flat numerical data type and insufficiently tackle the complexity of the time domain such as calendar granularities and intervals. Therefore, developers of advanced Visual Analytics designs need to implement temporal foundations in their application code over and over again. We present TimeBench, a software library that provides foundational data structures and algorithms for time-oriented data in Visual Analytics. Its expressiveness and developer accessibility have been evaluated through application examples demonstrating a variety of challenges with time-oriented data and long-term developer studies conducted in the scope of research and student projects.
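The core complaint, that time is more than a flat number, can be made concrete with a tiny model: an interval that carries a calendar granularity alongside its endpoints. This is not TimeBench's API (TimeBench is a Java library); it is only a minimal illustration of the kind of data structure the library provides.

```python
# Minimal illustration (not TimeBench's API): an interval that carries a
# calendar granularity, plus an overlap test, instead of a flat timestamp.
from dataclasses import dataclass
from datetime import date

@dataclass
class Interval:
    start: date
    end: date
    granularity: str   # e.g. "day", "week", "month"

def overlaps(a: Interval, b: Interval) -> bool:
    # Closed-interval overlap test on calendar dates.
    return a.start <= b.end and b.start <= a.end

stay = Interval(date(2013, 3, 1), date(2013, 3, 5), "day")
week = Interval(date(2013, 3, 4), date(2013, 3, 10), "week")
print(overlaps(stay, week))   # True
```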
DICOM implementation on online tape library storage system
NASA Astrophysics Data System (ADS)
Komo, Darmadi; Dai, Hailei L.; Elghammer, David; Levine, Betty A.; Mun, Seong K.
1998-07-01
The main purpose of this project is to implement a Digital Imaging and Communications in Medicine (DICOM) compliant online tape library system over the Internet. Once finished, the system will be used to store medical exams generated by the U.S. Army Mobile Army Surgical Hospital (MASH) in Tuzla, Bosnia. A modified UC Davis implementation of the DICOM storage class is used for this project. A DICOM storage class user and provider are implemented as the system's interface to the Internet. The DICOM software provides flexible configuration options such as types of modalities and trusted remote DICOM hosts. Metadata is extracted from each exam and indexed in a relational database for query and retrieve purposes. The medical images are stored inside the Wolfcreek-9360 tape library system from StorageTek Corporation. The tape library system has nearline access to more than 1000 tapes. Each tape has a capacity of 800 megabytes, giving a total nearline capacity of around 1 terabyte. The tape library uses the Application Storage Manager (ASM), which provides cost-effective file management, storage, archival, and retrieval services. ASM automatically and transparently copies files from expensive magnetic disk to the less expensive nearline tape library, and restores the files when they are needed. ASM also provides a crash recovery tool, which enables an entire file system to be restored in a short time. A graphical user interface (GUI) is used to view the contents of the storage systems. The GUI also allows the user to retrieve the stored exams and send them anywhere on the Internet using DICOM protocols. With the integration of the different components of the system, we have implemented a high-capacity online tape library storage system that is flexible and easy to use. Using tape as an alternative storage medium to magnetic disk has great potential for cost savings in terms of dollars per megabyte of storage. As this system matures, Hospital Information Systems/Radiology Information Systems (HIS/RIS) or other components can be developed as interfaces to the outside world, thus widening the use of the tape library system.
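The metadata-extraction step can be illustrated with pydicom, a modern Python DICOM library that postdates the system described here; the attribute names are standard DICOM keywords, while the index record layout is hypothetical.

```python
# Sketch of the metadata-extraction step using pydicom (a modern library; the
# original system predates it). The index record layout is hypothetical.
import pydicom

def extract_index_record(path):
    # stop_before_pixels avoids reading bulk image data during indexing.
    ds = pydicom.dcmread(path, stop_before_pixels=True)
    return {
        "patient_id": ds.get("PatientID", ""),
        "study_uid": ds.get("StudyInstanceUID", ""),
        "series_uid": ds.get("SeriesInstanceUID", ""),
        "modality": ds.get("Modality", ""),
        "study_date": ds.get("StudyDate", ""),
        "storage_path": path,   # where the image lives in the tape library
    }

record = extract_index_record("exam_0001.dcm")   # placeholder file name
print(record)
# An INSERT into the relational query/retrieve index would follow here.
```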
Design and implementation of an institutional case report form library.
Nahm, Meredith; Shepherd, John; Buzenberg, Ann; Rostami, Reza; Corcoran, Andrew; McCall, Jonathan; Pietrobon, Ricardo
2011-02-01
Case report forms (CRFs) are used to collect data in clinical research. Case report form development represents a significant part of the clinical trial process and can affect study success. Libraries of CRFs can preserve the organizational knowledge and expertise invested in CRF development and expedite the sharing of such knowledge. Although CRF libraries have been advocated, there have been no published accounts reporting institutional experiences with creating and using them. We sought to enhance an existing institutional CRF library by improving information indexing and accessibility. We describe this CRF library and discuss challenges encountered in its development and implementation, as well as future directions for continued work in this area. We transformed an existing but underused and poorly accessible CRF library into a resource capable of supporting and expediting clinical and translational investigation at our institution by (1) expanding access to the entire institution; (2) adding more form attributes for improved information retrieval; and (3) creating a formal information curation and maintenance process. An open-source content management system, Plone (Plone.org), served as the platform for our CRF library. We report results from these three processes. Over the course of this project, the size of the CRF library increased from 160 CRFs comprising an estimated total of 17,000 pages, to 177 CRFs totaling 1.5 gigabytes. Eighty-two of these CRFs are now available to researchers across our institution; 95 CRFs remain within a contractual confidentiality window (usually 5 years from database lock) and are not available to users outside of the Duke Clinical Research Institute (DCRI). Conservative estimates suggest that the library supports an average of 37 investigators per month. The resources needed to curate and maintain the CRF library require less than 10% of the effort of one full-time equivalent employee. Although we succeeded in expanding use of the CRF library, creating awareness of such institutional resources among investigators and research teams remains challenging and requires additional efforts to overcome. Institutions that have not achieved a critical mass of attractive research resources or effective dissemination mechanisms may encounter persistent difficulty attracting researchers to use institutional resources. Further, a useful CRF library requires both an initial investment of resources for development, as well as ongoing maintenance once it is established. CRF libraries can be established and made broadly available to institutional researchers. Curation - that is, indexing newly added forms - is required. Such a resource provides knowledge management capacity for institutions until standards and software are available to support widespread exchange of data and form definitions.
NASA Astrophysics Data System (ADS)
Lisio, Giovanni; Candia, Sante; Campolo, Giovanni; Pascucci, Dario
2011-08-01
Thales Alenia Space Italy has carried out the definition of a configurable (on a mission basis) PUS ECSS-E_70-41A (see [3]) Centralised Services Layer, characterised by: a mission-independent set of 'classes' implementing the services logic, and a mission-dependent set of configuration data and selection flags. The software components belonging to this layer implement the PUS standard services of ECSS-E_70-41A and a set of mission-specific services. The design of this layer has been performed by separating the services mechanisms (mission-independent execution logic) from the services configuration information (mission-dependent data). Once instantiated for a specific mission, the PUS Centralised Services Layer offers a large set of capabilities to the CSCI's Applications Layer. This paper describes the building-block PUS architectural solution developed by Thales Alenia Space Italy, emphasizing the mechanisms which allow easy configuration of the Scalable PUS library to fulfill the requirements of different missions. The paper also describes the Thales Alenia Space solution for automatically generating the mission-specific "PUS Services" flight software from mission-specific requirements. Building the PUS services mechanisms so that they are configurable on a mission basis is part of the PRIMA (Multipurpose Spacecraft Bus) 'missionisation' process improvement. The PRIMA Platform Avionics Software (ASW) is continuously evolving to improve modularity and standardization of interfaces and of SW components (see references in [1]).
ERIC Educational Resources Information Center
Weber, Jonathan
2006-01-01
Creating a digital library might seem like a task best left to a large research collection with a vast staff and generous budget. However, tools for successfully creating digital libraries are getting easier to use all the time. The explosion of people creating content for the web has led to the availability of many high-quality applications and…
ERIC Educational Resources Information Center
California Univ., Los Angeles. Inst. of Library Research.
The general conclusions of the planning study on Mechanized Information Services in the University Library are that such services represent a desirable, even necessary, extension of the library's traditional functions. Preliminary specifications for such a library-based "Center for Information Services" (CIS) are presented in this report. Covered…
Parallel computation and the Basis system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, G.R.
1992-12-16
A software package has been written that can facilitate efforts to develop powerful, flexible, and easy-to-use programs that can run in single-processor, massively parallel, and distributed computing environments. Particular attention has been given to the difficulties posed by a program consisting of many science packages that represent subsystems of a complicated, coupled system. Methods have been found to maintain independence of the packages by hiding data structures without increasing the communication costs in a parallel computing environment. Concepts developed in this work are demonstrated by a prototype program that uses library routines from two existing software systems, Basis and Parallel Virtual Machine (PVM). Most of the details of these libraries have been encapsulated in routines and macros that could be rewritten for alternative libraries that possess certain minimum capabilities. The prototype software uses a flexible master-and-slaves paradigm for parallel computation and supports domain decomposition with message passing for partitioning work among slaves. Facilities are provided for accessing variables that are distributed among the memories of slaves assigned to subdomains. The software is named PROTOPAR.
Parallel computation and the basis system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, G.R.
1993-05-01
A software package has been written that can facilitate efforts to develop powerful, flexible, and easy-to-use programs that can run in single-processor, massively parallel, and distributed computing environments. Particular attention has been given to the difficulties posed by a program consisting of many science packages that represent subsystems of a complicated, coupled system. Methods have been found to maintain independence of the packages by hiding data structures without increasing the communications costs in a parallel computing environment. Concepts developed in this work are demonstrated by a prototype program that uses library routines from two existing software systems, Basis and Parallel Virtual Machine (PVM). Most of the details of these libraries have been encapsulated in routines and macros that could be rewritten for alternative libraries that possess certain minimum capabilities. The prototype software uses a flexible master-and-slaves paradigm for parallel computation and supports domain decomposition with message passing for partitioning work among slaves. Facilities are provided for accessing variables that are distributed among the memories of slaves assigned to subdomains. The software is named PROTOPAR.
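The master-and-slaves, domain-decomposition paradigm described in the two records above can be illustrated with a small sketch. This is not the PROTOPAR or Basis/PVM code; it only shows the general pattern, substituting Python's multiprocessing for PVM message passing.

```python
# Illustrative sketch of the master-and-slaves paradigm with domain
# decomposition, using Python's multiprocessing instead of Basis/PVM; it is
# not the PROTOPAR code, only the general pattern it describes.
from multiprocessing import Pool

def solve_subdomain(subdomain):
    """Each slave works on one subdomain and returns a partial result."""
    lo, hi = subdomain
    return sum(x * x for x in range(lo, hi))   # stand-in for real physics

def decompose(n_points, n_slaves):
    """Master partitions the global domain into contiguous subdomains."""
    step = n_points // n_slaves
    return [(i * step, (i + 1) * step if i < n_slaves - 1 else n_points)
            for i in range(n_slaves)]

if __name__ == "__main__":
    subdomains = decompose(n_points=1_000_000, n_slaves=4)
    with Pool(processes=4) as slaves:              # master spawns slaves
        partials = slaves.map(solve_subdomain, subdomains)
    print("global result:", sum(partials))         # master combines partial results
```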
The Electronic Hermit: Trends in Library Automation.
ERIC Educational Resources Information Center
LaRue, James
1988-01-01
Reviews trends in library software development including: (1) microcomputer applications; (2) CD-ROM; (3) desktop publishing; (4) public access microcomputers; (5) artificial intelligence; (6) mainframes and minicomputers; and (7) automated catalogs. (MES)
ERIC Educational Resources Information Center
Stephenson, Kimberley
2012-01-01
Cross-campus collaboration for library website design and management can be challenging, but the process can produce stronger, more attractive, and more usable library websites. Collaborative library website design and management can also lead to new avenues for marketing library tools and services; expert consultation for library technology…
OLS Client and OLS Dialog: Open Source Tools to Annotate Public Omics Datasets.
Perez-Riverol, Yasset; Ternent, Tobias; Koch, Maximilian; Barsnes, Harald; Vrousgou, Olga; Jupp, Simon; Vizcaíno, Juan Antonio
2017-10-01
The availability of user-friendly software to annotate biological datasets and experimental details is becoming essential in data management practices, both in local storage systems and in public databases. The Ontology Lookup Service (OLS, http://www.ebi.ac.uk/ols) is a popular centralized service to query, browse and navigate biomedical ontologies and controlled vocabularies. Recently, the OLS framework has been completely redeveloped (version 3.0), including enhancements in the data model, like the added support for Web Ontology Language based ontologies, among many other improvements. However, the new OLS is not backwards compatible and new software tools are needed to enable access to this widely used framework now that the previous version is no longer available. We here present the OLS Client as a free, open-source Java library to retrieve information from the new version of the OLS. It enables rapid tool creation by providing a robust, pluggable programming interface and common data model to programmatically access the OLS. The library has already been integrated and is routinely used by several bioinformatics resources and related data annotation tools. Secondly, we also introduce an updated version of the OLS Dialog (version 2.0), a Java graphical user interface that can be easily plugged into Java desktop applications to access the OLS. The software and related documentation are freely available at https://github.com/PRIDE-Utilities/ols-client and https://github.com/PRIDE-Toolsuite/ols-dialog. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
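The OLS Client itself is a Java library; purely for illustration, the sketch below queries the OLS REST interface directly from Python to show the kind of lookup such a client performs. The search endpoint and parameter names are assumptions based on the public OLS 3 REST API and may differ between OLS versions.

```python
# Hedged sketch: querying the Ontology Lookup Service REST interface directly.
# The OLS Client discussed above is a Java library; this Python snippet only
# illustrates the kind of lookup it performs. Endpoint and parameter names are
# assumptions based on the public OLS 3 REST API.
import requests

OLS_SEARCH = "https://www.ebi.ac.uk/ols/api/search"   # assumed search endpoint

def search_term(query, ontology="go", rows=5):
    """Return (label, obo_id) pairs matching `query` in the given ontology."""
    resp = requests.get(OLS_SEARCH,
                        params={"q": query, "ontology": ontology, "rows": rows},
                        timeout=30)
    resp.raise_for_status()
    docs = resp.json().get("response", {}).get("docs", [])
    return [(d.get("label"), d.get("obo_id")) for d in docs]

if __name__ == "__main__":
    for label, obo_id in search_term("apoptosis"):
        print(obo_id, label)
```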
Development of the FITS tools package for multiple software environments
NASA Technical Reports Server (NTRS)
Pence, W. D.; Blackburn, J. K.
1992-01-01
The HEASARC is developing a package of general purpose software for analyzing data files in FITS format. This paper describes the design philosophy which makes the software both machine-independent (it runs on VAXs, Suns, and DEC-stations) and software environment-independent. Currently the software can be compiled and linked to produce IRAF tasks, or alternatively, the same source code can be used to generate stand-alone tasks using one of two implementations of a user-parameter interface library. The machine independence of the software is achieved by writing the source code in ANSI standard Fortran or C, using the machine-independent FITSIO subroutine interface for all data file I/O, and using a standard user-parameter subroutine interface for all user I/O. The latter interface is based on the Fortran IRAF Parameter File interface developed at STScI. The IRAF tasks are built by linking to the IRAF implementation of this parameter interface library. Two other implementations of this parameter interface library, which have no IRAF dependencies, are now available which can be used to generate stand-alone executable tasks. These stand-alone tasks can simply be executed from the machine operating system prompt either by supplying all the task parameters on the command line or by entering the task name after which the user will be prompted for any required parameters. A first release of this FTOOLS package is now publicly available. The currently available tasks are described, along with instructions on how to obtain a copy of the software.
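The FTOOLS package does its file I/O through the Fortran/C FITSIO interface; as a rough illustration of the same kind of machine-independent FITS access, the sketch below uses the present-day astropy.io.fits Python package instead. The file name is a placeholder.

```python
# Illustration only: the FTOOLS package uses the Fortran/C FITSIO interface,
# so this sketch substitutes the present-day astropy.io.fits package to show
# the same kind of machine-independent FITS I/O. The file name is a placeholder.
from astropy.io import fits

with fits.open("events.fits") as hdul:
    hdul.info()                          # list the HDUs in the file
    primary = hdul[0]
    print("TELESCOP =", primary.header.get("TELESCOP", "unknown"))
    if len(hdul) > 1 and hdul[1].data is not None:
        table = hdul[1].data             # first extension, e.g. an event list
        print("rows in first extension:", len(table))
```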
ERIC Educational Resources Information Center
Davies, Denise M.
1985-01-01
Discusses design, development, and use of a database to provide organization and access to a computer software collection at the University of Hawaii School of Library Studies. Field specifications, samples of report forms, and a description of the physical organization of the software collection are included. (MBR)
NASA Technical Reports Server (NTRS)
Chien, Steve A.; Tran, Daniel Q.; Rabideau, Gregg R.; Schaffer, Steven R.
2011-01-01
Software has been designed to schedule remote sensing with the Earth Observing One spacecraft. The software attempts to satisfy as many observation requests as possible, considering each against spacecraft operational constraints such as data volume, thermal limits, pointing maneuvers, and others. More complex constraints such as temperature are approximated to enable efficient reasoning while keeping the spacecraft within safe limits. Other constraints are checked using an external software library. For example, an attitude control library is used to determine the feasibility of maneuvering between pairs of observations. This innovation can deal with a wide range of spacecraft constraints and solve large-scale scheduling problems involving hundreds of observations and thousands of combinations of observation sequences.
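The toy sketch below is not the EO-1 planning software; it only illustrates the basic idea of accepting observation requests greedily while checking each against a single resource constraint (here, onboard data volume), with invented request names and numbers.

```python
# Toy sketch of constraint-checked observation scheduling; it is not the EO-1
# planning software, only an illustration of accepting requests greedily while
# respecting a single resource constraint (onboard data volume).
def schedule(requests, capacity_gbits):
    """requests: list of (name, priority, data_volume_gbits)."""
    plan, used = [], 0.0
    for name, priority, volume in sorted(requests, key=lambda r: -r[1]):
        if used + volume <= capacity_gbits:          # constraint check
            plan.append(name)
            used += volume
    return plan, used

if __name__ == "__main__":
    reqs = [("volcano", 9, 12.0), ("flood", 8, 20.0),
            ("glacier", 5, 15.0), ("reef", 3, 10.0)]
    plan, used = schedule(reqs, capacity_gbits=40.0)
    print("scheduled:", plan, "using", used, "Gbit of 40.0")
```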
Cognitive Foundry v. 3.0 (OSS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Basilico, Justin; Dixon, Kevin; McClain, Jonathan
2009-11-18
The Cognitive Foundry is a unified collection of tools designed for research and applications that use cognitive modeling, machine learning, or pattern recognition. The software library contains design patterns, interface definitions, and default implementations of reusable software components and algorithms designed to support a wide variety of research and development needs. The library contains three main software packages: the Common package that contains basic utilities and linear algebraic methods, the Cognitive Framework package that contains tools to assist in implementing and analyzing theories of cognition, and the Machine Learning package that provides general algorithms and methods for populating Cognitive Framework components from domain-relevant data.
Advanced data acquisition and display techniques for laser velocimetry
NASA Technical Reports Server (NTRS)
Kjelgaard, Scott O.; Weston, Robert P.
1991-01-01
The Basic Aerodynamics Research Tunnel (BART) has been equipped with state-of-the-art instrumentation for acquiring the data needed for code validation. This paper describes the three-component LDV and the workstation-based data-acquisition system (DAS) which has been developed for the BART. The DAS allows the use of automation and the quick integration of advanced instrumentation, while minimizing the software development time required between investigations. The paper also includes a description of a graphics software library developed to support the windowing environment of the DAS. The real-time displays generated using the graphics library help the researcher ensure the test is proceeding properly. The graphics library also supports the requirements of posttest data analysis. The use of the DAS and graphics libraries is illustrated by presenting examples of the real-time and postprocessing display graphics for LDV investigations.
Using volunteers in Ontario hospital libraries: views of library managers.
McDiarmid, Mary; Auster, Ethel
2005-04-01
Volunteers have been a resource for all types of libraries for many years. Little research has been done to describe the attitudes librarians have toward library volunteers. More specifically, the attitudes of hospital librarians toward volunteers have never been studied. The objective was to explore and describe the extent of volunteer use and to determine library managers' attitudes toward volunteers. An anonymous, self-report 38-item questionnaire was mailed to the target population of 89 hospital library managers in Ontario. Seventy-nine useable questionnaires were analyzed from an adjusted sample of 86 eligible respondents, resulting in a response rate of 92%. SPSS 11.5 was used to analyze the data. The data revealed the attitudes of managers using volunteers did not differ significantly from the attitudes of managers not using volunteers. The findings showed that a majority of managers did not believe their libraries were adequately staffed with paid employees. Sufficient evidence was found of an association between a manager's belief in the adequacy of staffing in the library and the use of volunteers in the library (chi2(1, N=76)=4.11, P=0.043). Specifically, volunteers were more likely to be used by managers who did not believe their libraries were adequately staffed. The presence of a union in the library and the use of volunteers were also associated (chi2(1, N=77)=4.77, P=0.029). When unions were present in the library, volunteers were less likely to be used. This research has implications for hospital library managers in the management of volunteers. Volunteers should not be viewed as a quick fix or as a long-term solution for a library's understaffing problem.
Using volunteers in Ontario hospital libraries: views of library managers*
McDiarmid, Mary; Auster, Ethel
2005-01-01
Background: Volunteers have been a resource for all types of libraries for many years. Little research has been done to describe the attitudes librarians have toward library volunteers. More specifically, the attitudes of hospital librarians toward volunteers have never been studied. Objective: The objective was to explore and describe the extent of volunteer use and to determine library managers' attitudes toward volunteers. Design, Setting, and Participants: An anonymous, self-report 38-item questionnaire was mailed to the target population of 89 hospital library managers in Ontario. Seventy-nine useable questionnaires were analyzed from an adjusted sample of 86 eligible respondents, resulting in a response rate of 92%. SPSS 11.5 was used to analyze the data. Findings: The data revealed the attitudes of managers using volunteers did not differ significantly from the attitudes of managers not using volunteers. The findings showed that a majority of managers did not believe their libraries were adequately staffed with paid employees. Sufficient evidence was found of an association between a manager's belief in the adequacy of staffing in the library and the use of volunteers in the library (χ2(1, N = 76) = 4.11, P = 0.043). Specifically, volunteers were more likely to be used by managers who did not believe their libraries were adequately staffed. The presence of a union in the library and the use of volunteers were also associated (χ2(1, N = 77) = 4.77, P = 0.029). When unions were present in the library, volunteers were less likely to be used. Implications: This research has implications for hospital library managers in the management of volunteers. Volunteers should not be viewed as a quick fix or as a long-term solution for a library's understaffing problem. PMID:15858629
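The association tests reported in the two records above are Pearson chi-square tests on 2x2 tables with one degree of freedom. The sketch below shows how such a test is computed; the cell counts are hypothetical placeholders, not the study's raw data.

```python
# How an association test of the kind reported above is computed. The 2x2
# counts below are hypothetical placeholders, NOT the study's raw data; only
# the structure of the test (Pearson chi-square, df = 1) matches the abstract.
from scipy.stats import chi2_contingency

#            uses volunteers   no volunteers
table = [[          20,               18],     # believes staffing adequate (hypothetical)
         [          28,               10]]     # believes staffing inadequate (hypothetical)

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2({dof}) = {chi2:.2f}, P = {p:.3f}")
```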
Introduction of Electronic Book Ordering with EDIFACT in a Special Library: A Case Study.
ERIC Educational Resources Information Center
Stadler, Peter; Thomas, Martin; Mernke, Ernst
1999-01-01
Describes book ordering procedures at a German pharmaceutical library using the Electronic Data Interchange for Administration, Commerce and Transport (EDIFACT) guidelines developed by the United Nations. Topics include advantages for libraries and for the book trade, changes needed in software, productivity gains, and EDIFACT versus the Internet.…
A Survey of Popular R Packages for Cluster Analysis
ERIC Educational Resources Information Center
Flynt, Abby; Dean, Nema
2016-01-01
Cluster analysis is a set of statistical methods for discovering new group/class structure when exploring data sets. This article reviews the following popular libraries/commands in the R software language for applying different types of cluster analysis: from the stats library, the kmeans, and hclust functions; the mclust library; the poLCA…
Automation Challenges of the 80's: What to Do until Your Integrated Library System Arrives.
ERIC Educational Resources Information Center
Allan, Ferne C.; Shields, Joyce M.
1986-01-01
A medium-sized aerospace library has developed interim solutions to automation needs by using software and equipment that were available in-house in preparation for an expected integrated library system. Automated processes include authors' file of items authored by employees, journal routing (including routing slips), statistics, journal…
Two Ways to Set Up Wireless Hotspot: Comparing Apples and Oranges
ERIC Educational Resources Information Center
Mutch, Andrew; Ventura, Karen
2006-01-01
Every place has it--coffee shops, bookstores, universities ... and now libraries are offering wireless Internet access too. At the Waterford Township Public Library (Waterford, Michigan), Andrew Mutch created a wide-open wireless network using Public IP open source software. About 20 miles away, at the Novi Public Library (Novi, Michigan), Karen…
A Tour of the Stacks--HyperCard for Libraries.
ERIC Educational Resources Information Center
Ertel, Monica; Oros, Jane
1989-01-01
Description of HyperCard, a software package that runs on Macintosh microcomputers, focuses on its use in the Apple Computer, Inc., Library as a user guide to the library. Examples of screen displays are given, and a list of resources is included to help use and understand HyperCard more completely. (LRW)
The Philip Morris Information Network: A Library Database on an In-House Timesharing System.
ERIC Educational Resources Information Center
DeBardeleben, Marian Z.; And Others
1983-01-01
Outlines a database constructed at Philip Morris Research Center Library which encompasses holdings and circulation and acquisitions records for all items in the library. Host computer (DECSYSTEM-2060), software (BASIC), database design, search methodology, cataloging, and accessibility are noted; sample search, circ-in profile, end user profiles,…
Surviving the First Three Months as a Community College Library Director
ERIC Educational Resources Information Center
Kelly, Robert E., IV
2003-01-01
The experiences of a first-time community college library director during the initial three months on the job are detailed. The process of automation software migration is used as a vehicle for demonstrating activities performed to improve knowledge. Marketing, customer service, and identification of core library supporters are addressed. Concern…
ERIC Educational Resources Information Center
Rose, Phillip E.
1988-01-01
Describes the experiences of the AT&T Technical Library in installing a local area network (LAN) and bulletin board using the UNIX operating system. Reasons why a LAN was needed, how the system works, and hardware and software used are discussed. (1 reference) (MES)
Library and Classroom Use of Copyrighted Videotapes and Computer Software.
ERIC Educational Resources Information Center
Reed, Mary Hutchings; Stanek, Debra
Designed to provide guidance for librarians, this publication expresses the opinion of the legal counsel of the American Library Association (ALA) regarding library and classroom use of copyrighted videotapes and computer programs. A discussion of videotapes considers the impact of the Copyright Revision Act of 1976 on in-classroom use, in-library…
Visual programming for next-generation sequencing data analytics.
Milicchio, Franco; Rose, Rebecca; Bian, Jiang; Min, Jae; Prosperi, Mattia
2016-01-01
High-throughput or next-generation sequencing (NGS) technologies have become an established and affordable experimental framework in biological and medical sciences for all basic and translational research. Processing and analyzing NGS data is challenging. NGS data are big, heterogeneous, sparse, and error prone. Although a plethora of tools for NGS data analysis has emerged in the past decade, (i) software development is still lagging behind data generation capabilities, and (ii) there is a 'cultural' gap between the end user and the developer. Generic software template libraries specifically developed for NGS can help in dealing with the former problem, whilst coupling template libraries with visual programming may help with the latter. Here we scrutinize the state-of-the-art low-level software libraries implemented specifically for NGS and graphical tools for NGS analytics. An ideal developing environment for NGS should be modular (with a native library interface), scalable in computational methods (i.e. serial, multithread, distributed), transparent (platform-independent), interoperable (with external software interface), and usable (via an intuitive graphical user interface). These characteristics should facilitate both the run of standardized NGS pipelines and the development of new workflows based on technological advancements or users' needs. We discuss in detail the potential of a computational framework blending generic template programming and visual programming that addresses all of the current limitations. In the long term, a proper, well-developed (although not necessarily unique) software framework will bridge the current gap between data generation and hypothesis testing. This will eventually facilitate the development of novel diagnostic tools embedded in routine healthcare.
PEM public key certificate cache server
NASA Astrophysics Data System (ADS)
Cheung, T.
1993-12-01
Privacy Enhanced Mail (PEM) provides privacy enhancement services to users of Internet electronic mail. Confidentiality, authentication, message integrity, and non-repudiation of origin are provided by applying cryptographic measures to messages transferred between end systems by the Message Transfer System. PEM supports both symmetric and asymmetric key distribution. However, the prevalent implementation uses a public key certificate-based strategy, modeled after the X.509 directory authentication framework. This scheme provides an infrastructure compatible with X.509. According to RFC 1422, public key certificates can be stored in directory servers, transmitted via non-secure message exchanges, or distributed via other means. Directory services provide a specialized distributed database for OSI applications. The directory contains information about objects and then provides structured mechanisms for accessing that information. Since directory services are not widely available now, a good approach is to manage certificates in a centralized certificate server. This document describes the detailed design of a centralized certificate cache server. This server manages a cache of certificates and a cache of Certificate Revocation Lists (CRLs) for PEM applications. PEM applications contact the server to obtain and store certificates and CRLs. The server software is programmed in C and ELROS. To use this server, ISODE has to be configured and installed properly. The ISODE library 'libisode.a' has to be linked together with this library because ELROS uses the transport layer functions provided by 'libisode.a'. The X.500 DAP library that is included with the ELROS distribution also has to be linked in, since the server uses the DAP library functions to communicate with directory servers.
Development of a web application for water resources based on open source software
NASA Astrophysics Data System (ADS)
Delipetrev, Blagoj; Jonoski, Andreja; Solomatine, Dimitri P.
2014-01-01
This article presents research and development of a prototype web application for water resources using the latest advancements in Information and Communication Technologies (ICT), open source software and web GIS. The web application has three web services for: (1) managing, presenting and storing geospatial data, (2) support of water resources modeling, and (3) water resources optimization. The web application is developed using several programming languages (PHP, Ajax, JavaScript, Java), libraries (OpenLayers, JQuery) and open source software components (GeoServer, PostgreSQL, PostGIS). The presented web application has several main advantages: it is available all the time, it is accessible from everywhere, it creates a real-time multi-user collaboration platform, its code and components are interoperable and designed to work in a distributed computer environment, it is flexible in allowing additional components and services to be added, and it is scalable depending on the workload. The application was successfully tested on a case study with concurrent multi-user access.
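A stack like the one described (OpenLayers in the browser, GeoServer publishing data from PostGIS) exchanges standard OGC WMS requests. The sketch below issues a plain WMS GetMap request from Python; the server URL, layer name and bounding box are placeholders, not values from the paper.

```python
# Hedged sketch of the standard OGC WMS GetMap request that an OpenLayers /
# GeoServer stack like the one described exchanges. The server URL, layer name
# and bounding box are placeholders, not values from the paper.
import requests

WMS_URL = "http://example.org/geoserver/wms"    # placeholder GeoServer endpoint

params = {
    "service": "WMS", "version": "1.1.1", "request": "GetMap",
    "layers": "water:reservoirs",               # placeholder layer
    "bbox": "20.0,41.0,23.0,43.0",              # placeholder extent (lon/lat)
    "srs": "EPSG:4326",
    "width": 512, "height": 512,
    "format": "image/png",
}

resp = requests.get(WMS_URL, params=params, timeout=60)
resp.raise_for_status()
with open("map.png", "wb") as out:              # save the rendered map image
    out.write(resp.content)
```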
Procurement of Shared Data Instruments for Research Electronic Data Capture (REDCap)
Obeid, Jihad S; McGraw, Catherine A; Minor, Brenda L; Conde, José G; Pawluk, Robert; Lin, Michael; Wang, Janey; Banks, Sean R; Hemphill, Sheree A; Taylor, Rob; Harris, Paul A
2012-01-01
REDCap (Research Electronic Data Capture) is a web-based software solution and tool set that allows biomedical researchers to create secure online forms for data capture, management and analysis with minimal effort and training. The Shared Data Instrument Library (SDIL) is a relatively new component of REDCap that allows sharing of commonly used data collection instruments for immediate study use by research teams. Objectives of the SDIL project include: 1) facilitating reuse of data dictionaries and reducing duplication of effort; 2) promoting the use of validated data collection instruments, data standards and best practices; and 3) promoting research collaboration and data sharing. Instruments submitted to the library are reviewed by a library oversight committee, with rotating membership from multiple institutions, which ensures quality, relevance and legality of shared instruments. The design allows researchers to download the instruments in a consumable electronic format in the REDCap environment. At the time of this writing, the SDIL contains over 128 data collection instruments. Over 2500 instances of instruments have been downloaded by researchers at multiple institutions. In this paper we describe the library platform, provide detail about experience gained during the first 25 months of sharing public domain instruments and provide evidence of impact for the SDIL across the REDCap consortium research community. We postulate that the shared library of instruments reduces the burden of adhering to sound data collection principles while promoting best practices. PMID:23149159
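The SDIL sharing workflow happens inside the REDCap web interface; as a related illustration only, the sketch below exports a project's data dictionary (its instrument and field definitions) through the REDCap API. The API URL and token are placeholders, and the field names shown are the ones REDCap's metadata export is documented to return.

```python
# Sketch of exporting a REDCap project's data dictionary (instrument metadata)
# through the REDCap API. The URL and token are placeholders; the SDIL sharing
# described above is done in the REDCap web interface, so this only illustrates
# programmatic access to instrument definitions.
import requests

REDCAP_API_URL = "https://redcap.example.edu/api/"   # placeholder host
API_TOKEN = "REPLACE_WITH_PROJECT_TOKEN"             # placeholder token

def export_data_dictionary():
    resp = requests.post(REDCAP_API_URL,
                         data={"token": API_TOKEN,
                               "content": "metadata",
                               "format": "json"},
                         timeout=60)
    resp.raise_for_status()
    return resp.json()                               # list of field definitions

if __name__ == "__main__":
    for field in export_data_dictionary()[:10]:
        print(field["form_name"], field["field_name"], field["field_type"])
```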
Library Management Tips that Work
ERIC Educational Resources Information Center
Smallwood, Carol, Ed.
2011-01-01
There's no shortage of library management books out there--but how many of them actually tackle the little details of day-to-day management, the hard-to-categorize things that slip through the cracks of a larger handbook? "Library Management Tips that Work" does exactly that, addressing dozens of such issues facing library managers, including: (1)…
A design for a reusable Ada library
NASA Technical Reports Server (NTRS)
Litke, John D.
1986-01-01
A goal of the Ada language standardization effort is to promote reuse of software, implying the existence of substantial software libraries and the storage/retrieval mechanisms to support them. A searching/cataloging mechanism is proposed that permits full or partial distribution of the database, adapts to a variety of searching mechanisms, permits a changing taxonomy with minimal disruption, and minimizes the requirement for specialized cataloger/indexer skills. The important observation is that key words serve not only as an indexing mechanism, but also as an identification mechanism, especially via concatenation, and as support for a searching mechanism. By deliberately separating these multiple uses, the design achieves the modifiability and ease of growth that current libraries require.
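A minimal sketch of the idea that keywords double as index terms and, via concatenation, as component identifiers is given below. It is written in Python rather than Ada and all names are invented for illustration.

```python
# Minimal sketch (in Python rather than Ada) of the paper's idea that keywords
# serve both as index terms and, via concatenation, as component identifiers.
from collections import defaultdict

class ReuseCatalog:
    def __init__(self):
        self._index = defaultdict(set)     # keyword -> set of component ids

    def add(self, keywords):
        """Register a component; its id is the concatenation of its keywords."""
        component_id = "_".join(sorted(keywords))
        for kw in keywords:
            self._index[kw].add(component_id)
        return component_id

    def search(self, *keywords):
        """Return components matching all the given keywords."""
        sets = [self._index.get(kw, set()) for kw in keywords]
        return set.intersection(*sets) if sets else set()

catalog = ReuseCatalog()
catalog.add({"queue", "bounded", "generic"})
catalog.add({"stack", "generic"})
print(catalog.search("generic", "queue"))   # -> {'bounded_generic_queue'}
```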
Climate tools in mainstream Linux distributions
NASA Astrophysics Data System (ADS)
McKinstry, Alastair
2015-04-01
Debian/meteorology is a project to integrate climate tools and analysis software into the mainstream Debian/Ubuntu Linux distributions. This work describes lessons learnt and recommends practices for scientific software to be adopted and maintained in OS distributions. In addition to standard analysis tools (cdo, grads, ferret, metview, ncl, etc.), software used by the Earth System Grid Federation was chosen for integration, to enable ESGF portals to be built on this base; however, exposing scientific codes via web APIs exposes security weaknesses that could previously be ignored. How tools are hardened, and what changes are required to handle security upgrades, are described. Secondly, enabling libraries and components (e.g. Python modules) to be integrated requires planning by their authors: it is not sufficient to assume users can upgrade their code when you make incompatible changes. Here, practices are recommended to enable upgrades and co-installability of C, C++, Fortran and Python codes. Finally, software packages such as NetCDF and HDF5 can be built in multiple configurations. Tools may then expect incompatible versions of these libraries (e.g. serial and parallel) to be simultaneously available; how this was solved in Debian using "pkg-config" and shared library interfaces is described, and best practices for software writers to enable this are summarised.
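As a small illustration of the pkg-config approach mentioned above, the sketch below resolves compile and link flags for a library at build time instead of hard-coding paths. It assumes the NetCDF development package provides a "netcdf" pkg-config module, as it does on Debian-based systems.

```python
# Illustration of the pkg-config approach mentioned above: resolving compile
# and link flags for a library at build time instead of hard-coding paths.
# Assumes the NetCDF development package provides a "netcdf" pkg-config module,
# as it does on Debian-based systems.
import shlex
import subprocess

def pkg_config(module, *options):
    out = subprocess.run(["pkg-config", *options, module],
                         check=True, capture_output=True, text=True).stdout
    return shlex.split(out)

if __name__ == "__main__":
    cflags = pkg_config("netcdf", "--cflags")
    libs = pkg_config("netcdf", "--libs")
    print("compile flags:", cflags)
    print("link flags:   ", libs)
```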
Software platform for managing the classification of error- related potentials of observers
NASA Astrophysics Data System (ADS)
Asvestas, P.; Ventouras, E.-C.; Kostopoulos, S.; Sidiropoulos, K.; Korfiatis, V.; Korda, A.; Uzunolglu, A.; Karanasiou, I.; Kalatzis, I.; Matsopoulos, G.
2015-09-01
Human learning is partly based on observation. Electroencephalographic recordings of subjects who perform acts (actors) or observe actors (observers), contain a negative waveform in the Evoked Potentials (EPs) of the actors that commit errors and of observers who observe the error-committing actors. This waveform is called the Error-Related Negativity (ERN). Its detection has applications in the context of Brain-Computer Interfaces. The present work describes a software system developed for managing EPs of observers, with the aim of classifying them into observations of either correct or incorrect actions. It consists of an integrated platform for the storage, management, processing and classification of EPs recorded during error-observation experiments. The system was developed using C# and the following development tools and frameworks: MySQL, .NET Framework, Entity Framework and Emgu CV, for interfacing with the machine learning library of OpenCV. Up to six features can be computed per EP recording per electrode. The user can select among various feature selection algorithms and then proceed to train one of three types of classifiers: Artificial Neural Networks, Support Vector Machines, k-nearest neighbour. Next the classifier can be used for classifying any EP curve that has been inputted to the database.
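The platform described above is written in C# with Emgu CV; the sketch below only shows the same kind of k-nearest-neighbour training and classification through OpenCV's Python bindings, on synthetic feature vectors (six features per EP recording, as in the abstract). The data and labels are invented.

```python
# The platform described above is written in C# with Emgu CV; this sketch only
# shows the same kind of k-nearest-neighbour training and classification using
# OpenCV's Python bindings, on synthetic EP feature vectors (6 features per EP).
import cv2
import numpy as np

rng = np.random.default_rng(0)
# Synthetic training data: 40 EPs x 6 features, labelled 0 = correct action
# observed, 1 = erroneous action observed.
features = rng.normal(size=(40, 6)).astype(np.float32)
labels = rng.integers(0, 2, size=(40, 1)).astype(np.float32)

knn = cv2.ml.KNearest_create()
knn.train(features, cv2.ml.ROW_SAMPLE, labels)

new_ep = rng.normal(size=(1, 6)).astype(np.float32)   # one unseen EP recording
_, results, _, _ = knn.findNearest(new_ep, 5)
print("predicted class:", int(results[0][0]))
```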
Lemmon, Vance P; Jia, Yuanyuan; Shi, Yan; Holbrook, S Douglas; Bixby, John L; Buchser, William
2011-11-01
The Miami Project to Cure Paralysis, part of the University of Miami Miller School of Medicine, includes a laboratory devoted to High Content Analysis (HCA) of neurons. The goal of the laboratory is to uncover signaling pathways, genes, compounds, or drugs that can be used to promote nerve growth. HCA permits the quantification of neuronal morphology, including the lengths and numbers of axons. HCA of various libraries on primary neurons requires a team-based approach, a variety of process steps and complex manipulations of cells and libraries to obtain meaningful results. HCA itself produces vast amounts of information including images, well-based data and cell-based phenotypic measures. Documenting and integrating the experimental workflows, library data and extensive experimental results is challenging. For academic laboratories generating large data sets from experiments involving thousands of perturbagens, a Laboratory Information Management System (LIMS) is the data tracking solution of choice. With both productivity and efficiency as driving rationales, the Miami Project has equipped its HCA laboratory with an On Demand or Software As A Service (SaaS) LIMS to ensure the quality of its experiments and workflows. The article discusses how the system was selected and integrated into the laboratory. The advantages of a SaaS based LIMS over a client-server based system are described. © 2011 Bentham Science Publishers
Mathematical and Statistical Software Index. Final Report.
ERIC Educational Resources Information Center
Black, Doris E., Comp.
Brief descriptions are provided of general-purpose mathematical and statistical software, including 27 "stand-alone" programs, three subroutine systems, and two nationally recognized statistical packages, which are available in the Air Force Human Resources Laboratory (AFHRL) software library. This index was created to enable researchers…
Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O
2013-06-01
Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of the treatment and support the prediction of outcome. In this paper, we first identify functional, conceptual and general requirements for a software system for quantitative analysis of radiotherapy. Further, we present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptual problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via a dose iterator pattern; analysis database design). As a proof of concept we developed a software library, "RTToolbox", following the presented design principles. The RTToolbox is available as an open source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
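The "algorithmic decoupling via a dose iterator pattern" design principle can be illustrated with a short sketch. RTToolbox itself is a C++ library; the class and function names below are made up for the sketch and are not the RTToolbox API.

```python
# Illustration of the "algorithmic decoupling via dose iterator pattern" design
# principle. RTToolbox itself is a C++ library; class and function names here
# are made up for the sketch, not the RTToolbox API.
class DoseIterator:
    """Yields dose values (Gy) for the voxels inside a structure, hiding how
    the dose grid and the structure mask are stored."""
    def __init__(self, dose_grid, mask):
        self._dose_grid = dose_grid
        self._mask = mask

    def __iter__(self):
        for dose, inside in zip(self._dose_grid, self._mask):
            if inside:
                yield dose

def mean_dose(dose_iterator):
    """Dose statistic written only against the iterator interface."""
    doses = list(dose_iterator)
    return sum(doses) / len(doses) if doses else 0.0

grid = [0.0, 1.8, 2.0, 2.2, 0.1, 1.9]            # toy dose grid
ptv_mask = [False, True, True, True, False, True]
print("mean PTV dose:", mean_dose(DoseIterator(grid, ptv_mask)), "Gy")
```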
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habib, Salman; Roser, Robert
Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habib, Salman; Roser, Robert; LeCompte, Tom
2015-10-29
Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.
Installé, Arnaud Jf; Van den Bosch, Thierry; De Moor, Bart; Timmerman, Dirk
2014-10-20
Using machine-learning techniques, clinical diagnostic model research extracts diagnostic models from patient data. Traditionally, patient data are often collected using electronic Case Report Form (eCRF) systems, while mathematical software is used for analyzing these data using machine-learning techniques. Due to the lack of integration between eCRF systems and mathematical software, extracting diagnostic models is a complex, error-prone process. Moreover, due to the complexity of this process, it is usually only performed once, after a predetermined number of data points have been collected, without insight into the predictive performance of the resulting models. The objective of the Clinical Data Miner (CDM) software framework study is to offer an eCRF system with integrated data preprocessing and machine-learning libraries, improving the efficiency of the clinical diagnostic model research workflow, and to enable optimization of patient inclusion numbers through study performance monitoring. The CDM software framework was developed using a test-driven development (TDD) approach, to ensure high software quality. Architecturally, CDM's design is split over a number of modules, to ensure future extendability. The TDD approach has enabled us to deliver high software quality. CDM's eCRF Web interface is in active use by the studies of the International Endometrial Tumor Analysis consortium, with over 4000 enrolled patients, and more studies planned. Additionally, a derived user interface has been used in six separate interrater agreement studies. CDM's integrated data preprocessing and machine-learning libraries simplify some otherwise manual and error-prone steps in the clinical diagnostic model research workflow. Furthermore, CDM's libraries provide study coordinators with a method to monitor a study's predictive performance as patient inclusions increase. To our knowledge, CDM is the only eCRF system integrating data preprocessing and machine-learning libraries. This integration improves the efficiency of the clinical diagnostic model research workflow. Moreover, by simplifying the generation of learning curves, CDM enables study coordinators to assess more accurately when data collection can be terminated, resulting in better models or lower patient recruitment costs.
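CDM itself is an eCRF framework with integrated machine-learning libraries; the sketch below only illustrates the learning-curve idea it uses for monitoring predictive performance as patient inclusions increase, using scikit-learn on a bundled example dataset rather than CDM data.

```python
# CDM itself is an eCRF framework with integrated machine-learning libraries;
# this sketch only illustrates the learning-curve idea it uses for monitoring
# predictive performance as patient inclusions increase, using scikit-learn on
# a bundled example dataset.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import learning_curve

X, y = load_breast_cancer(return_X_y=True)
model = LogisticRegression(max_iter=5000)

train_sizes, train_scores, valid_scores = learning_curve(
    model, X, y, train_sizes=np.linspace(0.1, 1.0, 5), cv=5)

for n, score in zip(train_sizes, valid_scores.mean(axis=1)):
    print(f"{n:4d} patients included -> cross-validated accuracy {score:.3f}")
```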
Hopkins, Mark E; Summers-Ables, Joy E; Clifton, Shari C; Coffman, Michael A
2011-06-01
To make electronic resources available to library users while effectively harnessing intellectual capital within the library, ultimately fostering the library's use of technology to interact asynchronously with its patrons (users). The methods used in the project included: (1) developing a new library website to facilitate the creation, management, accessibility, maintenance and dissemination of library resources; and (2) establishing ownership by those who participated in the project, while creating effective work allocation strategies through the implementation of a content management system that allowed the library to manage cost, complexity and interoperability. Preliminary results indicate that contributors to the system benefit from an increased understanding of the library's resources and add content valuable to library patrons. These strategies have helped promote the manageable creation and maintenance of electronic content in accomplishing the library's goal of interacting with its patrons. Establishment of a contributive system for adding to the library's electronic resources and electronic content has been successful. Further work will look at improving asynchronous interaction, particularly highlighting accessibility of electronic content and resources. © 2010 The authors. Health Information and Libraries Journal © 2010 Health Libraries Group.
NASA Astrophysics Data System (ADS)
Houchin, J. S.
2014-09-01
A common problem for the off-line validation of calibration algorithms and algorithm coefficients is being able to run science data through the exact same software used for on-line calibration of those data. The Joint Polar Satellite System (JPSS) program solved part of this problem by making the Algorithm Development Library (ADL) available, which allows the operational algorithm code to be compiled and run on a desktop Linux workstation using flat file input and output. However, this solved only part of the problem, as the toolkit and methods to initiate the processing of data through the algorithms were geared specifically toward the algorithm developer, not the calibration analyst. In algorithm development mode, a limited number of test data sets are staged for the algorithm once, and then run through the algorithm over and over as the software is developed and debugged. In calibration analyst mode, we are continually running new data sets through the algorithm, which requires significant effort to stage each of those data sets for the algorithm without additional tools. AeroADL solves this second problem by providing a set of scripts that wrap the ADL tools, providing efficient means to stage and process an input data set, to override static calibration coefficient look-up tables (LUTs) with experimental versions of those tables, and to manage a library containing multiple versions of each of the static LUT files in such a way that the correct set of LUTs required for each algorithm is automatically provided to the algorithm without analyst effort. Using AeroADL, The Aerospace Corporation's analyst team has demonstrated the ability to quickly and efficiently perform analysis tasks for both the VIIRS and OMPS sensors with minimal training on the software tools.
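The AeroADL wrapper scripts themselves are described but not listed here, so everything in the following sketch is hypothetical: it only illustrates the idea of keeping several versions of each static LUT and automatically assembling the correct set for one algorithm run, with the option of overriding individual tables with experimental versions.

```python
# The AeroADL wrapper scripts are described but not listed in the abstract, so
# everything here is hypothetical: a tiny sketch of the idea of keeping several
# versions of each static LUT and assembling the correct set for an algorithm
# run, optionally overriding individual tables with experimental versions.
from pathlib import Path

LUT_LIBRARY = Path("lut_library")        # hypothetical layout: <name>/<version>/<name>.bin

def stage_luts(required_luts, overrides=None, version="operational"):
    """Return {lut_name: path} for one algorithm run."""
    overrides = overrides or {}
    staged = {}
    for name in required_luts:
        chosen = overrides.get(name, version)
        staged[name] = LUT_LIBRARY / name / chosen / f"{name}.bin"
    return staged

# Example with one experimental table swapped in (all names invented).
luts = stage_luts(["radiometric_gain", "solar_spectrum"],
                  overrides={"radiometric_gain": "experiment_042"})
for name, path in luts.items():
    print(name, "->", path)
```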
Choe, Sanggil; Kim, Suncheun; Choi, Hyeyoung; Choi, Hwakyoung; Chung, Heesun; Hwang, Bangyeon
2010-06-15
Agilent GC-MS MSD Chemstation offers automated library search reports for toxicological screening using the total ion chromatogram (TIC) and mass spectra in normal mode. Numerous peaks appear in the chromatogram of a biological specimen such as blood or urine, and large migrating peaks often obscure small target peaks; in addition, target peaks of low abundance regularly give wrong library search results or low matching scores. As a result, the retention time and mass spectrum of every peak in the chromatogram have to be checked to see if they are relevant. These repeated actions are very tedious and time-consuming for toxicologists. MSD Chemstation software operates using a number of macro files which give commands and instructions on how to work on and extract data from the chromatogram and spectra. These macro files are built with the software's own macro compiler. All the original macro files can be modified, and new macro files can be added to the original software by users. To get more accurate results with a more convenient method, and to save time in data analysis, we developed new macro files for report generation and inserted new menus in the Enhanced Data Analysis program. Toxicological screening reports generated by these new macro files are produced in text mode or graphic mode, and can be generated with three different automated subtraction options. Text reports have Brief and Full modes, and graphic reports can be produced with or without mass spectra. Matched mass spectra and matching scores for detected compounds are printed in the reports by modified library searching modules. We have also developed an independent application program named DrugMan. This program manages the drug groups, lists and parameters that are in use in MSD Chemstation. The incorporation of DrugMan with the modified macro modules provides a powerful tool for toxicological screening and saves a lot of valuable time in toxicological work. (c) 2010 Elsevier Ireland Ltd. All rights reserved.
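The reporting logic described above is implemented as Chemstation macros; the Python sketch below only illustrates the underlying idea of filtering library-search hits by matching score and flagging compounds from managed drug lists. The hits, drug groups and threshold are hypothetical, not Chemstation or DrugMan data.

```python
# The reporting logic described above is implemented as Chemstation macros;
# this Python sketch only illustrates the idea of filtering library-search hits
# by matching score and flagging compounds from managed drug lists. The hits,
# drug groups and threshold are hypothetical.
DRUG_GROUPS = {
    "opioids": {"morphine", "codeine"},
    "benzodiazepines": {"diazepam"},
}

def screen(hits, min_score=80):
    """hits: list of (retention_time_min, compound, matching_score)."""
    report = []
    for rt, compound, score in hits:
        if score < min_score:
            continue                                  # drop low-confidence matches
        group = next((g for g, members in DRUG_GROUPS.items()
                      if compound in members), "other")
        report.append((rt, compound, score, group))
    return report

hits = [(6.2, "caffeine", 95), (9.8, "morphine", 88), (11.3, "diazepam", 62)]
for rt, compound, score, group in screen(hits):
    print(f"RT {rt:5.1f} min  {compound:<10s} score {score:3d}  group: {group}")
```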
ATLAS offline software performance monitoring and optimization
NASA Astrophysics Data System (ADS)
Chauhan, N.; Kabra, G.; Kittelmann, T.; Langenberg, R.; Mandrysch, R.; Salzburger, A.; Seuster, R.; Ritsch, E.; Stewart, G.; van Eldik, N.; Vitillo, R.; Atlas Collaboration
2014-06-01
In a complex multi-developer, multi-package software environment, such as the ATLAS offline framework Athena, tracking the performance of the code can be a non-trivial task in itself. In this paper we describe improvements in the instrumentation of ATLAS offline software that have given considerable insight into the performance of the code and helped to guide the optimization work. The first tool we used to instrument the code is PAPI, which is a programming interface for accessing hardware performance counters. PAPI events can count floating point operations, cycles, instructions and cache accesses. Triggering PAPI to start/stop counting for each algorithm and processed event results in a good understanding of the algorithm-level performance of ATLAS code. Further data can be obtained using Pin, a dynamic binary instrumentation tool. Pin tools can be used to obtain similar statistics as PAPI, but advantageously without requiring recompilation of the code. Fine-grained routine and instruction level instrumentation is also possible. Pin tools can additionally interrogate the arguments to functions, like those in linear algebra libraries, so that a detailed usage profile can be obtained. These tools have characterized the extensive use of vector and matrix operations in ATLAS tracking. Currently, CLHEP is used here, which is not an optimal choice. To help evaluate replacement libraries a testbed has been set up allowing comparison of the performance of different linear algebra libraries (including CLHEP, Eigen and SMatrix/SVector). Results are then presented via the ATLAS Performance Management Board framework, which runs daily with the current development branch of the code and monitors reconstruction and Monte-Carlo jobs. This framework analyses the CPU and memory performance of algorithms, and an overview of the results is presented on a web page. These tools have provided the insight necessary to plan and implement performance enhancements in ATLAS code by identifying the most common operations, with the call parameters well understood, and allowing improvements to be quantified in detail.
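ATLAS instruments its C++ algorithms with PAPI hardware counters and Pin; the Python sketch below only illustrates the general pattern of starting and stopping per-algorithm counters around each processed event, substituting wall-clock time and call counts for hardware counters.

```python
# ATLAS instruments its C++ algorithms with PAPI hardware counters and Pin;
# this sketch only illustrates the pattern of starting/stopping per-algorithm
# counters around each processed event, using wall-clock time and call counts
# instead of hardware counters.
import time
from collections import defaultdict
from functools import wraps

_stats = defaultdict(lambda: {"calls": 0, "seconds": 0.0})

def instrumented(algorithm):
    @wraps(algorithm)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()            # "start counting" for this algorithm
        try:
            return algorithm(*args, **kwargs)
        finally:                               # "stop counting", accumulate per algorithm
            _stats[algorithm.__name__]["calls"] += 1
            _stats[algorithm.__name__]["seconds"] += time.perf_counter() - start
    return wrapper

@instrumented
def tracking(event):
    return sum(i * i for i in range(10_000 + event))

for event in range(100):
    tracking(event)

for name, s in _stats.items():
    print(f"{name}: {s['calls']} events, {s['seconds']:.4f} s total")
```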
Developing framework for agent- based diabetes disease management system: user perspective.
Mohammadzadeh, Niloofar; Safdari, Reza; Rahimi, Azin
2014-02-01
One of the characteristics of agents is mobility, which makes them very suitable for remote electronic health and telemedicine. The aim of this study is to develop a framework for agent-based diabetes information management at the national level by identifying the required agents. The main tool is a questionnaire designed in three sections, based on a study of library resources, the performance of major organizations in the field of diabetes inside and outside the country, and interviews with experts in the medical, health information management and software fields. Questionnaires based on the Delphi method were distributed among 20 experts. In order to design and identify the agents required in health information management for the prevention and appropriate, rapid treatment of diabetes, the results were analyzed using SPSS 17 and plotted with the FREEPLANE mind-map software. Access to data technology in the proposed framework, in order of priority, is: mobile (mean 1.80), SMS and email (mean 2.80), internet and web (mean 3.30), phone (mean 3.60), and WiFi (mean 4.60). In delivering health care to diabetic patients, considering social and human aspects is essential. Having a systematic view of the implementation of agent systems and paying attention to all aspects, such as feedback, user acceptance, budget, motivation, hierarchy, useful standards, affordability for individuals, and identifying barriers and opportunities, is necessary.
Management of Library Associations Section. Management and Technology Division. Papers.
ERIC Educational Resources Information Center
International Federation of Library Associations, The Hague (Netherlands).
Papers on the management of library associations, which were presented at the 1983 International Federation of Library Associations (IFLA) conference, include: (1) "The Management of Library Associations: Publishing--A Third World Perspective," in which Pearl Springer (Trinidad and Tobago) outlines the importance of publishing in…
Managing a Public Library. Parts I and II.
ERIC Educational Resources Information Center
Aronoff, Carol; Cart, Michael
1985-01-01
Discusses factors that contribute to Santa Monica Public Library's excellence, including supportive city government and citizenry, city council, library board, lean top management, and the basic management team. Specific management principles and techniques, public relations, and measures of service at Beverly Hills Public Library are also…
A Fault Oblivious Extreme-Scale Execution Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKie, Jim
The FOX project, funded under the ASCR X-stack I program, developed systems software and runtime libraries for a new approach to the data and work distribution for massively parallel, fault oblivious application execution. Our work was motivated by the premise that exascale computing systems will provide a thousand-fold increase in parallelism and a proportional increase in failure rate relative to today’s machines. To deliver the capability of exascale hardware, the systems software must provide the infrastructure to support existing applications while simultaneously enabling efficient execution of new programming models that naturally express dynamic, adaptive, irregular computation; coupled simulations; and massive data analysis in a highly unreliable hardware environment with billions of threads of execution. Our OS research has prototyped new methods to provide efficient resource sharing, synchronization, and protection in a many-core compute node. We have experimented with alternative task/dataflow programming models and shown scalability in some cases to hundreds of thousands of cores. Much of our software is in active development through open source projects. Concepts from FOX are being pursued in next generation exascale operating systems. Our OS work focused on adaptive, application tailored OS services optimized for multi → many core processors. We developed a new operating system NIX that supports role-based allocation of cores to processes which was released to open source. We contributed to the IBM FusedOS project, which promoted the concept of latency-optimized and throughput-optimized cores. We built a task queue library based on a distributed, fault tolerant key-value store and identified scaling issues. A second fault tolerant task parallel library was developed, based on the Linda tuple space model, that used low level interconnect primitives for optimized communication. We designed fault tolerance mechanisms for task parallel computations employing work stealing for load balancing that scaled to the largest existing supercomputers. Finally, we implemented the Elastic Building Blocks runtime, a library to manage object-oriented distributed software components. To support the research, we won two INCITE awards for time on Intrepid (BG/P) and Mira (BG/Q). Much of our work has had impact in the OS and runtime community through the ASCR Exascale OS/R workshop and report, leading to the research agenda of the Exascale OS/R program. Our project was, however, also affected by attrition of multiple PIs. While the PIs continued to participate and offer guidance as time permitted, losing these key individuals was unfortunate both for the project and for the DOE HPC community.
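The work-stealing load-balancing idea mentioned above can be illustrated with a toy single-process sketch; the real FOX libraries are distributed and fault tolerant, which the sketch does not attempt to show.

```python
# Toy single-process sketch of the work-stealing idea used by the FOX runtime
# libraries for load balancing; the real implementations are distributed and
# fault tolerant, which this does not attempt to show.
from collections import deque

def run(worker_queues):
    """Each worker pops work from the back of its own deque; when its deque is
    empty it steals from the front of the longest other deque."""
    done = [0] * len(worker_queues)
    while any(worker_queues):
        for wid, q in enumerate(worker_queues):
            if q:
                q.pop()                              # take local work from the back
            else:
                victim = max(worker_queues, key=len)
                if not victim:
                    continue                         # nothing left anywhere to steal
                victim.popleft()                     # steal from the front of the victim
            done[wid] += 1                           # stand-in for running the task
    return done

queues = [deque(range(16)), deque(range(4)), deque()]   # deliberately unbalanced
print("tasks completed per worker:", run(queues))
```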
An open-source java platform for automated reaction mapping.
Crabtree, John D; Mehta, Dinesh P; Kouri, Tina M
2010-09-27
This article presents software applications that have been built upon a modular, open-source, reaction mapping library that can be used in both cheminformatics and bioinformatics research. We first describe the theoretical underpinnings and modular architecture of the core software library. We then describe two applications that have been built upon that core. The first is a generic reaction viewer and mapper, and the second classifies reactions according to rules that can be modified by end users with little or no programming skills.
2012-09-28
spectral-geotechnical libraries and models developed during remote sensing and calibration/validation campaigns conducted by NRL and collaborating institutions in four…2010; Bachmann, Fry, et al., 2012a). The NRL HITT tool is a model for how we develop and validate software, and the future development of tools by…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, Eric M.
2004-05-20
The YAP software library computes (1) electromagnetic modes, (2) electrostatic fields, (3) magnetostatic fields and (4) particle trajectories in 2d and 3d models. The code employs finite element methods on unstructured grids of tetrahedral, hexahedral, prism and pyramid elements, with linear through cubic element shapes and basis functions to provide high accuracy. The novel particle tracker is robust, accurate and efficient, even on unstructured grids with discontinuous fields. This software library is a component of the MICHELLE 3d finite element gun code.
Image analysis library software development
NASA Technical Reports Server (NTRS)
Guseman, L. F., Jr.; Bryant, J.
1977-01-01
The Image Analysis Library consists of a collection of general purpose mathematical/statistical routines and special purpose data analysis/pattern recognition routines basic to the development of image analysis techniques for support of current and future Earth Resources Programs. Work was done to provide a collection of computer routines and associated documentation which form a part of the Image Analysis Library.
Expert Systems for Libraries at SCIL [Small Computers in Libraries]'88.
ERIC Educational Resources Information Center
Kochtanek, Thomas R.; And Others
1988-01-01
Six brief papers on expert systems for libraries cover (1) a knowledge-based approach to database design; (2) getting started in expert systems; (3) using public domain software to develop a business reference system; (4) a music cataloging inquiry system; (5) linguistic analysis of reference transactions; and (6) a model of a reference librarian.…
Who Moved My Intranet? The Human Side of Introducing Collaborative Technologies to Library Staff
ERIC Educational Resources Information Center
Jeffery, Keven; Dworak, Ellie
2010-01-01
Intranets can be crucial tools in fostering communication within an academic library. This article describes the successful implementation of an intranet wiki at the San Diego State University Library & Information Access. The steps involved with implementing, marketing, and supporting the MediaWiki software are described, and the results of a…
ERIC Educational Resources Information Center
Cubbage, Charlotte
2002-01-01
Discusses problems with patron Internet access in academic libraries and describes a study conducted at Northwestern University (Illinois) that used Internet tracking software to assess user Internet behavior. Topics include Internet use policies; pornography; and loss of control over library services and information content that is provided. (LRW)
ERIC Educational Resources Information Center
Krzywkowski, Valerie I., Ed.
Based on the conference theme, "Competencies for Librarians," papers presented at the 1985 meeting of the association include: (1) "Planning a Library-Based Public Access Microcomputer Facility" (Suzanne Kehm); (2) "Processing and Circulating Microcomputer Software in the Academic Library: A Sharing Session" (Jan…
Experiences Using an Open Source Software Library to Teach Computer Vision Subjects
ERIC Educational Resources Information Center
Cazorla, Miguel; Viejo, Diego
2015-01-01
Machine vision is an important subject in computer science and engineering degrees. For laboratory experimentation, it is desirable to have a complete and easy-to-use tool. In this work we present a Java library oriented to teaching computer vision. We have designed and built the library from scratch with an emphasis on readability and…
Special Features of the Advanced Loans Module of the ABCD Integrated Library System
ERIC Educational Resources Information Center
de Smet, Egbert
2011-01-01
Purpose: The "advanced loans" module of the relatively new library software, ABCD, is an addition to the normal loans module and it offers a "generic transaction decision-making engine" functionality. The module requires extra installation effort and parameterisation, so this article aims to explain to the many potentially interested libraries,…
ERIC Educational Resources Information Center
Betty, Paul
2009-01-01
Increasing use of screencast and Flash authoring software within libraries is resulting in "homegrown" library collections of digital learning objects and multimedia presentations. The author explores the use of Google Analytics to track usage statistics for interactive Shockwave Flash (.swf) files, the common file output for screencast and Flash…
Content and Workflow Management for Library Websites: Case Studies
ERIC Educational Resources Information Center
Yu, Holly, Ed.
2005-01-01
Using database-driven web pages or web content management (WCM) systems to manage increasingly diverse web content and to streamline workflows is a commonly practiced solution recognized in libraries today. However, limited library web content management models and funding constraints prevent many libraries from purchasing commercially available…
Neurology diagnostics security and terminal adaptation for PocketNeuro project.
Chemak, C; Bouhlel, M-S; Lapayre, J-C
2008-09-01
This paper presents new approaches to medical information security and terminal mobile phone adaptation for the PocketNeuro project. The latter term refers to a project created for the management of neurological diseases. It consists of transmitting information about patients ("desk of patients") to a doctor's mobile phone during a visit and examination of a patient. These new approaches for the PocketNeuro project were analyzed in terms of medical information security and adaptation of the diagnostic images to the doctor's mobile phone. Images were extracted from a DICOM library. Matlab and its library were used as software to test our approaches and to validate our results. Experiments performed on a database of 30 neuronal medical images of 256 x 256 pixels indicated that our new approaches for the PocketNeuro project are valid and support plans for large-scale studies between French and Swiss hospitals using secured connections.
Open Source Software and the Intellectual Commons.
ERIC Educational Resources Information Center
Dorman, David
2002-01-01
Discusses the Open Source Software method of software development and its relationship to control over information content. Topics include digital library resources; reference services; preservation; the legal and economic status of information; technical standards; access to digital data; control of information use; and copyright and patent laws.…
Perl Extension to the Bproc Library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grunau, Daryl W.
2004-06-07
The Beowulf Distributed Process Space (Bproc) software stack comprises UNIX/Linux kernel modifications and a support library by which a cluster of machines, each running its own private kernel, can present itself as a unified process space to the user. A Bproc cluster contains a single front-end machine and many back-end nodes which receive and run processes given to them by the front-end. Any process which is migrated to a back-end node is also visible as a ghost process on the front-end, and may be controlled there using traditional UNIX semantics (e.g. ps(1), kill(1), etc.). This software is a Perl extension to the Bproc library which enables the Perl programmer to make direct calls to functions within the Bproc library. See http://www.clustermatic.org, http://bproc.sourceforge.net, and http://www.perl.org
Ferraro Petrillo, Umberto; Roscigno, Gianluca; Cattaneo, Giuseppe; Giancarlo, Raffaele
2017-05-15
MapReduce Hadoop bioinformatics applications require the availability of special-purpose routines to manage the input of sequence files. Unfortunately, the Hadoop framework does not provide any built-in support for the most popular sequence file formats like FASTA or BAM. Moreover, the development of these routines is not easy, both because of the diversity of these formats and the need to manage efficiently sequence datasets that may count up to billions of characters. We present FASTdoop, a generic Hadoop library for the management of FASTA and FASTQ files. We show that, with respect to analogous input management routines that have appeared in the literature, it offers versatility and efficiency. That is, it can handle collections of reads, with or without quality scores, as well as long genomic sequences, while the existing routines concentrate mainly on NGS sequence data. Moreover, in the domain where a comparison is possible, the routines proposed here are faster than the available ones. In conclusion, FASTdoop is a much-needed addition to Hadoop-BAM. The software and the datasets are available at http://www.di.unisa.it/FASTdoop/. umberto.ferraro@uniroma1.it. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
ERIC Educational Resources Information Center
Webster, Duane E.
1974-01-01
The Management Review and Analysis Program, an assisted self-study strategy for large academic and research libraries, assists libraries in reviewing and analyzing their current management policies and practices, and provides guidelines for the application of contemporary principles of management for the improvement of library programs. (Author/LS)
Using Modern Technologies to Capture and Share Indigenous Astronomical Knowledge
NASA Astrophysics Data System (ADS)
Nakata, Martin; Hamacher, Duane W.; Warren, John; Byrne, Alex; Pagnucco, Maurice; Harley, Ross; Venugopal, Srikumar; Thorpe, Kirsten; Neville, Richard; Bolt, Reuben
2014-06-01
Indigenous Knowledge is important for Indigenous communities across the globe and for the advancement of our general scientific knowledge. In particular, Indigenous astronomical knowledge integrates many aspects of Indigenous Knowledge, including seasonal calendars, navigation, food economics, law, ceremony, and social structure. Capturing, managing, and disseminating this knowledge in the digital environment poses a number of challenges, which we aim to address using a collaborative project emerging between experts in the higher education, library, archive and industry sectors. Using Microsoft's WorldWide Telescope and Rich Interactive Narratives technologies, we propose to develop software, media design, and archival management solutions to allow Indigenous communities to share their astronomical knowledge with the world on their terms and in a culturally sensitive manner.
Management Techniques for Librarians.
ERIC Educational Resources Information Center
Evans, G. Edward
This textbook on library management techniques is concerned with basic management problems. Examples of problems in planning, organization, and coordination are drawn from situations in libraries or information centers. After an introduction to library management, the history of management is covered. Several styles of management and organization…
García-Jacas, César R; Marrero-Ponce, Yovani; Acevedo-Martínez, Liesner; Barigye, Stephen J; Valdés-Martiní, José R; Contreras-Torres, Ernesto
2014-07-05
The present report introduces the QuBiLS-MIDAS software belonging to the ToMoCoMD-CARDD suite for the calculation of three-dimensional molecular descriptors (MDs) based on the two-linear (bilinear), three-linear, and four-linear (multilinear or N-linear) algebraic forms. Thus, it is unique software that computes these tensor-based indices. These descriptors establish relations for two, three, and four atoms by using several (dis-)similarity metrics or multimetrics, matrix transformations, cutoffs, local calculations and aggregation operators. The theoretical background of these N-linear indices is also presented. The QuBiLS-MIDAS software was developed in the Java programming language and employs the Chemistry Development Kit library for the manipulation of the chemical structures and the calculation of the atomic properties. This software is composed of a user-friendly desktop interface and an Abstract Programming Interface library. The former was created to simplify the configuration of the different options of the MDs, whereas the library was designed to allow its easy integration into other software for chemoinformatics applications. This program provides functionalities for data cleaning tasks and for batch processing of the molecular indices. In addition, it offers parallel calculation of the MDs through the use of all available processors in current computers. Complexity studies of the main algorithms demonstrate that they were efficiently implemented with respect to their trivial implementations. Lastly, the performance tests reveal that this software behaves suitably as the number of processors is increased. Therefore, the QuBiLS-MIDAS software constitutes a useful application for the computation of molecular indices based on N-linear algebraic maps and it can be used freely to perform chemoinformatics studies. Copyright © 2014 Wiley Periodicals, Inc.
Algorithm-Based Fault Tolerance for Numerical Subroutines
NASA Technical Reports Server (NTRS)
Tumon, Michael; Granat, Robert; Lou, John
2007-01-01
A software library implements a new methodology of detecting faults in numerical subroutines, thus enabling application programs that contain the subroutines to recover transparently from single-event upsets. The software library in question is fault-detecting middleware that is wrapped around the numerical subroutines. Conventional serial versions (based on LAPACK and FFTW) and a parallel version (based on ScaLAPACK) exist. The source code of the application program that contains the numerical subroutines is not modified, and the middleware is transparent to the user. The methodology used is a type of algorithm-based fault tolerance (ABFT). In ABFT, a checksum is computed before a computation and compared with the checksum of the computational result; an error is declared if the difference between the checksums exceeds some threshold. Novel normalization methods are used in the checksum comparison to ensure correct fault detections independent of algorithm inputs. In tests of this software reported in the peer-reviewed literature, this library was shown to enable detection of 99.9 percent of significant faults while generating no false alarms.
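To illustrate the checksum idea behind ABFT described above, here is a minimal Python/NumPy sketch (not the library's own implementation); the tolerance value, matrix sizes, and normalization choice are illustrative assumptions.

```python
import numpy as np

def abft_matmul(A, B, tol=1e-8):
    """Compute A @ B and verify the result with a column-checksum test.

    An extra checksum row (the column sums of A) is carried through the
    multiplication; the column sums of the computed product must match it
    up to a normalized tolerance, otherwise a fault is declared.
    """
    A_aug = np.vstack([A, A.sum(axis=0)])      # append checksum row to A
    C_aug = A_aug @ B
    C, checksum_row = C_aug[:-1, :], C_aug[-1, :]
    # Normalize so the test is independent of the magnitude of the inputs.
    scale = np.maximum(np.abs(C).sum(axis=0), 1.0)
    if np.any(np.abs(C.sum(axis=0) - checksum_row) / scale > tol):
        raise RuntimeError("ABFT checksum mismatch: possible single-event upset")
    return C

# A fault-free multiplication passes the check.
rng = np.random.default_rng(0)
C = abft_matmul(rng.standard_normal((64, 32)), rng.standard_normal((32, 48)))
```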
FastaValidator: an open-source Java library to parse and validate FASTA formatted sequences.
Waldmann, Jost; Gerken, Jan; Hankeln, Wolfgang; Schweer, Timmy; Glöckner, Frank Oliver
2014-06-14
Advances in sequencing technologies challenge the efficient importing and validation of FASTA formatted sequence data, which is still a prerequisite for most bioinformatic tools and pipelines. Comparative analysis of commonly used Bio*-frameworks (BioPerl, BioJava and Biopython) shows that their scalability and accuracy are hampered. FastaValidator represents a platform-independent, standardized, light-weight software library written in the Java programming language. It targets computer scientists and bioinformaticians writing software that needs to parse large amounts of sequence data quickly and accurately. For end-users, FastaValidator includes an interactive out-of-the-box validation of FASTA formatted files, as well as a non-interactive mode designed for high-throughput validation in software pipelines. The accuracy and performance of the FastaValidator library qualifies it for large data sets such as those commonly produced by massively parallel (NGS) technologies. It offers scientists a fast, accurate and standardized method for parsing and validating FASTA formatted sequence data.
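FastaValidator itself is a Java library; purely to illustrate the validation task it addresses, the following minimal Python sketch (not the library's API) checks basic FASTA well-formedness. The alphabet and error messages are illustrative assumptions.

```python
def validate_fasta(path, alphabet=set("ACGTUNacgtun")):
    """Return True if the file looks like well-formed FASTA: every record
    starts with a '>' header and sequence lines use only the given alphabet."""
    seen_header = False
    with open(path) as handle:
        for lineno, line in enumerate(handle, start=1):
            line = line.rstrip("\n")
            if not line:
                continue                      # tolerate blank lines
            if line.startswith(">"):
                seen_header = True            # start of a new record
            elif not seen_header:
                raise ValueError(f"line {lineno}: sequence data before first header")
            elif set(line) - alphabet:
                raise ValueError(f"line {lineno}: unexpected characters {set(line) - alphabet}")
    return seen_header
```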
ENCoRE: an efficient software for CRISPR screens identifies new players in extrinsic apoptosis.
Trümbach, Dietrich; Pfeiffer, Susanne; Poppe, Manuel; Scherb, Hagen; Doll, Sebastian; Wurst, Wolfgang; Schick, Joel A
2017-11-25
As CRISPR/Cas9 mediated screens with pooled guide libraries in somatic cells become increasingly established, an unmet need for rapid and accurate companion informatics tools has emerged. We have developed a lightweight and efficient software to easily manipulate large raw next generation sequencing datasets derived from such screens into informative relational context with graphical support. The advantages of the software entitled ENCoRE (Easy NGS-to-Gene CRISPR REsults) include a simple graphical workflow, platform independence, local and fast multithreaded processing, data pre-processing and gene mapping with custom library import. We demonstrate the capabilities of ENCoRE to interrogate results from a pooled CRISPR cellular viability screen following Tumor Necrosis Factor-alpha challenge. The results not only identified stereotypical players in extrinsic apoptotic signaling but two as yet uncharacterized members of the extrinsic apoptotic cascade, Smg7 and Ces2a. We further validated and characterized cell lines containing mutations in these genes against a panel of cell death stimuli and involvement in p53 signaling. In summary, this software enables bench scientists with sensitive data or without access to informatic cores to rapidly interpret results from large scale experiments resulting from pooled CRISPR/Cas9 library screens.
Open source libraries and frameworks for biological data visualisation: a guide for developers.
Wang, Rui; Perez-Riverol, Yasset; Hermjakob, Henning; Vizcaíno, Juan Antonio
2015-04-01
Recent advances in high-throughput experimental techniques have led to an exponential increase in both the size and the complexity of the data sets commonly studied in biology. Data visualisation is increasingly used as the key to unlock this data, going from hypothesis generation to model evaluation and tool implementation. It is becoming more and more the heart of bioinformatics workflows, enabling scientists to reason and communicate more effectively. In parallel, there has been a corresponding trend towards the development of related software, which has triggered the maturation of different visualisation libraries and frameworks. For bioinformaticians, scientific programmers and software developers, the main challenge is to pick out the most fitting one(s) to create clear, meaningful and integrated data visualisation for their particular use cases. In this review, we introduce a collection of open source or free to use libraries and frameworks for creating data visualisation, covering the generation of a wide variety of charts and graphs. We will focus on software written in Java, JavaScript or Python. We truly believe this software offers the potential to turn tedious data into exciting visual stories. © 2014 The Authors. PROTEOMICS published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Software for Teaching Physiology and Biophysics.
ERIC Educational Resources Information Center
Weiss, Thomas F.; And Others
1992-01-01
Describes a software library developed to teach biophysics and physiology undergraduates that includes software on (1) the Hodgkin-Huxley model for excitation of action potentials in electrically excitable cells; (2) a random-walk model of diffusion; (3) single voltage-gated ion channels; (4) steady-state chemically mediated transport; and (5)…
A Public Domain Software Library for Reading and Language Arts.
ERIC Educational Resources Information Center
Balajthy, Ernest
A three-year project carried out by the Microcomputers and Reading Committee of the New Jersey Reading Association involved the collection, improvement, and distribution of free microcomputer software (public domain programs) designed to deal with reading and writing skills. Acknowledging that this free software is not without limitations (poor…
SeaDataNet Pan-European infrastructure for Ocean & Marine Data Management
NASA Astrophysics Data System (ADS)
Manzella, G. M.; Maillard, C.; Maudire, G.; Schaap, D.; Rickards, L.; Nast, F.; Balopoulos, E.; Mikhailov, N.; Vladymyrov, V.; Pissierssens, P.; Schlitzer, R.; Beckers, J. M.; Barale, V.
2007-12-01
SEADATANET is developing a Pan-European data management infrastructure to ensure access to a large number of marine environmental data (i.e. temperature, salinity, current, sea level, and chemical, physical and biological properties), as well as their safeguarding and long-term archiving. Data are derived from many different sensors installed on board research vessels, satellites and the various platforms of the marine observing system. SeaDataNet provides information on real-time and archived marine environmental data collected at a pan-European level, through directories on marine environmental data and projects. SeaDataNet allows access to the most comprehensive multidisciplinary sets of marine in-situ and remote sensing data, from about 40 laboratories, through user-friendly tools. Data selection and access are operated through the Common Data Index (CDI): XML files compliant with ISO standards and unified dictionaries. Technical developments carried out by SeaDataNet include: a library of standards - metadata standards, compliant with ISO 19115, for communication and interoperability between the data platforms; software for an interoperable online system - interconnection of distributed data centres by interfacing adapted communication technology tools; off-line data management software - software representing the minimum equipment of all the data centres, developed by AWI ("Ocean Data View" (ODV)); and training, education and capacity building - training 'on the job' is carried out by IOC-Unesco in Ostende. The SeaDataNet Virtual Educational Centre internet portal provides basic tools for informal education.
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Schrenkenghost, Debra K.
2001-01-01
The Adjustable Autonomy Testbed (AAT) is a simulation-based testbed located in the Intelligent Systems Laboratory in the Automation, Robotics and Simulation Division at NASA Johnson Space Center. The purpose of the testbed is to support evaluation and validation of prototypes of adjustable autonomous agent software for control and fault management for complex systems. The AAT project has developed prototype adjustable autonomous agent software and human interfaces for cooperative fault management. This software builds on current autonomous agent technology by altering the architecture, components and interfaces for effective teamwork between autonomous systems and human experts. Autonomous agents include a planner, flexible executive, low-level control and deductive model-based fault isolation. Adjustable autonomy is intended to increase the flexibility and effectiveness of fault management with an autonomous system. The test domain for this work is control of advanced life support systems for habitats for planetary exploration. The CONFIG hybrid discrete event simulation environment provides flexible and dynamically reconfigurable models of the behavior of components and fluids in the life support systems. Both discrete event and continuous (discrete time) simulation are supported, and flows and pressures are computed globally. This provides fast dynamic simulations of interacting hardware systems in closed loops that can be reconfigured during operations scenarios, producing complex cascading effects of operations and failures. Current object-oriented model libraries support modeling of fluid systems, and models have been developed of physico-chemical and biological subsystems for processing advanced life support gases. In FY01, water recovery system models will be developed.
Judicious use of custom development in an open source component architecture
NASA Astrophysics Data System (ADS)
Bristol, S.; Latysh, N.; Long, D.; Tekell, S.; Allen, J.
2014-12-01
Modern software engineering is not as much programming from scratch as innovative assembly of existing components. Seamlessly integrating disparate components into scalable, performant architecture requires sound engineering craftsmanship and can often result in increased cost efficiency and accelerated capabilities if software teams focus their creativity on the edges of the problem space. ScienceBase is part of the U.S. Geological Survey scientific cyberinfrastructure, providing data and information management, distribution services, and analysis capabilities in a way that strives to follow this pattern. ScienceBase leverages open source NoSQL and relational databases, search indexing technology, spatial service engines, numerous libraries, and one proprietary but necessary software component in its architecture. The primary engineering focus is cohesive component interaction, including construction of a seamless Application Programming Interface (API) across all elements. The API allows researchers and software developers alike to leverage the infrastructure in unique, creative ways. Scaling the ScienceBase architecture and core API with increasing data volume (more databases) and complexity (integrated science problems) is a primary challenge addressed by judicious use of custom development in the component architecture. Other data management and informatics activities in the earth sciences have independently resolved to a similar design of reusing and building upon established technology and are working through similar issues for managing and developing information (e.g., U.S. Geoscience Information Network; NASA's Earth Observing System Clearing House; GSToRE at the University of New Mexico). Recent discussions facilitated through the Earth Science Information Partners are exploring potential avenues to exploit the implicit relationships between similar projects for explicit gains in our ability to more rapidly advance global scientific cyberinfrastructure.
Zooming In on Copyright with Integrated Library Software Services
ERIC Educational Resources Information Center
Oye, Karen
2007-01-01
Over the last decade, many of Kelvin Smith Library's (KSL) content delivery services have gone digital, and some, such as enhanced course reserves products, are new to the market. The best digital library services have given KSL options and integrated solutions that allow it to do more than it thought possible just a few years ago. As with many…
Largenet2: an object-oriented programming library for simulating large adaptive networks.
Zschaler, Gerd; Gross, Thilo
2013-01-15
The largenet2 C++ library provides an infrastructure for the simulation of large dynamic and adaptive networks with discrete node and link states. The library is released as free software. It is available at http://biond.github.com/largenet2. Largenet2 is licensed under the Creative Commons Attribution-NonCommercial 3.0 Unported License. gerd@biond.org
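largenet2 is a C++ library; the short Python sketch below, using networkx, only illustrates the kind of adaptive-network step it describes (discrete node states and state-dependent rewiring of links) and is not largenet2's API. The state names, rewiring probability, and graph size are illustrative assumptions.

```python
import random
import networkx as nx

def rewire_discordant(G, states, p_rewire=0.3, rng=random.Random(1)):
    """One adaptive step: links between nodes in different states are rewired,
    with probability p_rewire, to a randomly chosen like-state node."""
    for u, v in list(G.edges()):
        if states[u] != states[v] and rng.random() < p_rewire:
            candidates = [w for w in G.nodes
                          if w != u and states[w] == states[u] and not G.has_edge(u, w)]
            if candidates:
                G.remove_edge(u, v)
                G.add_edge(u, rng.choice(candidates))

G = nx.erdos_renyi_graph(100, 0.05, seed=1)
states = {n: random.choice(("A", "B")) for n in G.nodes}   # discrete node states
rewire_discordant(G, states)
```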
1982-01-29
COMPUTER PROGRAM USER'S MANUAL FOR FIREFINDER DIGITAL TOPOGRAPHIC DATA VERIFICATION LIBRARY DUBBING SYSTEM, VOLUME II: DUBBING, 29 JANUARY 1982. Keywords: Software Library; FIREFINDER; Dubbing. This manual describes the computer…
Tools for Modeling & Simulation of Molecular and Nanoelectronics Devices
2012-06-14
implemented a prototype DFT simulation software using two different open source Finite Element (FE) libraries: Deal.II and FEniCS. These two libraries have been…ATK. In the first part of this Phase I project we investigated two different candidate finite element libraries, Deal.II and FEniCS. Although both…element libraries, Deal.II and FEniCS/dolfin, for use as back-ends to a finite element DFT in ATK, Quantum Insight and QuantumWise A/S, October 2011.
ERIC Educational Resources Information Center
International Federation of Library Associations and Institutions, London (England).
Eleven papers delivered at the annual meeting of the International Federation of Library Associations and Institutions for the Division of Management and Technology are presented. Some were presented at a roundtable on audiovisual media, and others are from sessions on library buildings and equipment, information management, and statistics in…
ERIC Educational Resources Information Center
Yoon, Ayoung; Schultz, Teresa
2017-01-01
Examining landscapes of research data management services in academic libraries is timely and significant for both those libraries on the front line and the libraries that are already ahead. While it provides overall understanding of where the research data management program is at and where it is going, it also provides understanding of current…
ERIC Educational Resources Information Center
Samuels, Alan R.; And Others
1987-01-01
These five papers by speakers at the Small Computers in Libraries 1987 conference include: "Acquiring and Using Shareware in Building Small Scale Automated Information systems" (Samuels); "A Software Lending Collection" (Talab); "Providing Subject Access to Microcomputer Software" (Mitchell); "Interfacing Vendor…
Prediction of Hydrolysis Products of Organic Chemicals under Environmental pH Conditions
Cheminformatics-based software tools can predict the molecular structure of transformation products using a library of transformation reaction schemes. This paper presents the development of such a library for abiotic hydrolysis of organic chemicals under environmentally relevant...
The Electronic Library Workstation--Today.
ERIC Educational Resources Information Center
Nolte, James
1990-01-01
Describes the components--hardware, software and applications, CD-ROM and online reference resources, and telecommunications links--of an electronic library workstation in use at Clarkson University (Potsdam, New York). Data manipulation, a hypothetical research scenario, and recommended workstation capabilities are also discussed. (MES)
DiffPy-CMI-Python libraries for Complex Modeling Initiative
DOE Office of Scientific and Technical Information (OSTI.GOV)
Billinge, Simon; Juhas, Pavol; Farrow, Christopher
2014-02-01
Software to manipulate and describe crystal and molecular structures and set up structural refinements from multiple experimental inputs. Calculation and simulation of structure derived physical quantities. Library for creating customized refinements of atomic structures from available experimental and theoretical inputs.
Videotex--The Library of the Future.
ERIC Educational Resources Information Center
Mischo, Lare; Hegarty, Kevin
1982-01-01
Discusses a presentation prepared by Boeing Computer Services in cooperation with the Tacoma Public Library staff, which demonstrates the potential of interactive cable systems based on the Canadian Telidon system. Features of this videotex system, software, and equipment, including microcomputers, are noted. (EJS)
Hufnagel, S; Harbison, K; Silva, J; Mettala, E
1994-01-01
This paper describes a new method for the evolutionary determination of user requirements and system specifications called the scenario-based engineering process (SEP). Health care professional workstations are critical components of large-scale health care system architectures. We suggest that domain-specific software architectures (DSSAs) be used to specify standard interfaces and protocols for reusable software components throughout those architectures, including workstations. We encourage the use of engineering principles and abstraction mechanisms. Engineering principles are flexible guidelines, adaptable to particular situations. Abstraction mechanisms are simplifications for management of complexity. We recommend object-oriented design principles, graphical structural specifications, and formal components' behavioral specifications. We give an ambulatory care scenario and associated models to demonstrate SEP. The scenario uses health care terminology and gives patients' and health care providers' system views. Our goal is to have a threefold benefit. (i) Scenario view abstractions provide consistent interdisciplinary communications. (ii) Hierarchical object-oriented structures provide useful abstractions for reuse, understandability, and long-term evolution. (iii) SEP and the health care DSSA are integrated into computer-aided software engineering (CASE) environments. These environments should support rapid construction and certification of individualized systems, from reuse libraries.
Risk Management and Disaster Recovery in Public Libraries in South Australia: A Pilot Study
ERIC Educational Resources Information Center
Velasquez, Diane L.; Evans, Nina; Kaeding, Joanne
2016-01-01
Introduction: This paper reports the findings of a study of risk management in public libraries. The focus of the research was to determine whether the libraries had a risk management and disaster plan for major disasters. Method: A qualitative study was done to investigate risk management and disaster recovery in public libraries in South…
Geospatial Data Stream Processing in Python Using FOSS4G Components
NASA Astrophysics Data System (ADS)
McFerren, G.; van Zyl, T.
2016-06-01
One viewpoint of current and future IT systems holds that there is an increase in the scale and velocity at which data are acquired and analysed from heterogeneous, dynamic sources. In the earth observation and geoinformatics domains, this process is driven by the increase in number and types of devices that report location and the proliferation of assorted sensors, from satellite constellations to oceanic buoy arrays. Much of these data will be encountered as self-contained messages on data streams - continuous, infinite flows of data. Spatial analytics over data streams concerns the search for spatial and spatio-temporal relationships within and amongst data "on the move". In spatial databases, queries can assess a store of data to unpack spatial relationships; this is not the case on streams, where spatial relationships need to be established with the incomplete data available. Methods for spatially-based indexing, filtering, joining and transforming of streaming data need to be established and implemented in software components. This article describes the usage patterns and performance metrics of a number of well known FOSS4G Python software libraries within the data stream processing paradigm. In particular, we consider the RTree library for spatial indexing, the Shapely library for geometric processing and transformation and the PyProj library for projection and geodesic calculations over streams of geospatial data. We introduce a message oriented Python-based geospatial data streaming framework called Swordfish, which provides data stream processing primitives, functions, transports and a common data model for describing messages, based on the Open Geospatial Consortium Observations and Measurements (O&M) and Unidata Common Data Model (CDM) standards. We illustrate how the geospatial software components are integrated with the Swordfish framework. Furthermore, we describe the tight temporal constraints under which geospatial functionality can be invoked when processing high velocity, potentially infinite geospatial data streams. The article discusses the performance of these libraries under simulated streaming loads (size, complexity and volume of messages) and how they can be deployed and utilised with Swordfish under real load scenarios, illustrated by a set of Vessel Automatic Identification System (AIS) use cases. We conclude that the described software libraries are able to perform adequately under geospatial data stream processing scenarios - many real application use cases will be handled sufficiently by the software.
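The abstract above names three concrete Python libraries (RTree, Shapely, PyProj); here is a minimal sketch of how they might be combined over a simulated stream of point messages. The message fields, polygon coordinates, and vessel identifiers are illustrative assumptions, and the Swordfish framework itself is not shown; the sketch uses the underlying libraries directly.

```python
from rtree import index                         # spatial indexing
from shapely.geometry import Point, Polygon     # geometric predicates
from pyproj import Geod                         # geodesic calculations

harbour = Polygon([(18.40, -34.00), (18.50, -34.00), (18.50, -33.90), (18.40, -33.90)])
geod = Geod(ellps="WGS84")
idx = index.Index()
last_position = {}

# Simulated AIS-like stream of messages: (vessel id, longitude, latitude)
stream = [("v1", 18.42, -33.95), ("v2", 18.70, -33.80), ("v1", 18.45, -33.93)]

for i, (vessel, lon, lat) in enumerate(stream):
    pt = Point(lon, lat)
    idx.insert(i, (lon, lat, lon, lat), obj=vessel)     # index each observation as it arrives
    if vessel in last_position:
        plon, plat = last_position[vessel]
        _, _, dist_m = geod.inv(plon, plat, lon, lat)   # metres travelled since the last fix
        print(vessel, f"moved {dist_m:.0f} m")
    if harbour.contains(pt):                            # spatial filter applied on the fly
        print(vessel, "is inside the harbour polygon")
    last_position[vessel] = (lon, lat)
```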
HPC Software Stack Testing Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garvey, Cormac
The HPC Software stack testing framework (hpcswtest) is used in the INL Scientific Computing Department to test the basic sanity and integrity of the HPC Software stack (Compilers, MPI, Numerical libraries and Applications) and to quickly discover hard failures, and as a by-product it will indirectly check the HPC infrastructure (network, PBS and licensing servers).
Surgical pathology report in the era of desktop publishing.
Pillarisetti, S G
1993-01-01
Since it is believed that "a picture is worth a thousand words," computer-generated line art was incorporated as an adjunct to gross description in surgical pathology reporting in selected cases. The lack of an integrated software program was overcome by using commercially available graphic and word processing software. A library of drawings was developed over the last few years. Most time-consuming is the development of templates and the graphic library. With some effort it is possible to integrate graphics of high quality into surgical pathology reports.
NASA Astrophysics Data System (ADS)
Biscarros, D.; Cantenot, C.; Séronie-Vivien, J.; Schmidt, G.
AstroBus on-board software is customisable software for ERC32-based avionics implementing standard ESA Packet Utilization Standard (PUS) functions. Its architecture, based on generic design templates and relying on a library providing standard PUS TC, TM and event services, enhances its reusability across programs. Finally, the AstroBus on-board software development and validation environment is based on latest-generation tools providing an optimised customisation environment.
Adaptive Integration of Nonsmooth Dynamical Systems
2017-10-11
controlled time stepping method to interactively design running robots. [1] John Shepherd, Samuel Zapolsky, and Evan M. Drumwright, "Fast multi-body…Started working in simulation after attempting to use software like this to test software running on my robots. The libraries that produce these beautiful results have failed at simulating robotic manipulation. Postulate: It is easier to…
Leveraging Metadata to Create Better Web Services
ERIC Educational Resources Information Center
Mitchell, Erik
2012-01-01
Libraries have been increasingly concerned with data creation, management, and publication. This increase is partly driven by shifting metadata standards in libraries and partly by the growth of data and metadata repositories being managed by libraries. In order to manage these data sets, libraries are looking for new preservation and discovery…
Collaborative Collections Management Programs in ARL Libraries. SPEC Kit 235.
ERIC Educational Resources Information Center
Soete, George J., Comp.
1998-01-01
This survey was conducted to discover how extensively ARL (Association of Research Libraries) libraries are involved in formal, active programs of collaboration for collections management. Seventy ARL libraries completed a questionnaire focusing on all collections formats and a number of related collections management activities. Results indicated…
Marketing and Library Management.
ERIC Educational Resources Information Center
Murphy, Kurt R.
1991-01-01
Examines the role of marketing in the management of libraries. The role of public relations (PR) in the total marketing concept is discussed, surveys that have explored PR efforts in academic and public libraries are described, and changes affecting libraries that marketing efforts could help to manage are discussed. (seven references) (LRW)
Starting a research data management program based in a university library.
Henderson, Margaret E; Knott, Teresa L
2015-01-01
As the need for research data management grows, many libraries are considering adding data services to help with the research mission of their institution. The Virginia Commonwealth University (VCU) Libraries created a position and hired a director of research data management in September 2013. The position was new to the libraries and the university. With the backing of the library administration, a plan for building relationships with VCU faculty, researchers, students, service and resource providers, including grant administrators, was developed to educate and engage the community in data management plan writing and research data management training.
Assessing the accuracy of software predictions of mammalian and microbial metabolites
New chemical development and hazard assessments benefit from accurate predictions of mammalian and microbial metabolites. Fourteen biotransformation libraries encoded in eight software packages that predict metabolite structures were assessed for their sensitivity (proportion of ...
ERIC Educational Resources Information Center
Krishnamurthy, M.
2008-01-01
Purpose: The purpose of this paper is to describe the open access and open source movement in the digital library world. Design/methodology/approach: A review of key developments in the open access and open source movement is provided. Findings: Open source software and open access to research findings are of great use to scholars in developing…
Technology and Microcomputers for an Information Centre/Special Library.
ERIC Educational Resources Information Center
Daehn, Ralph M.
1984-01-01
Discusses use of microcomputer hardware and software, telecommunications methods, and advanced library methods to create a specialized information center's database of literature relating to farm machinery and food processing. Systems and services (electronic messaging, serials control, database creation, cataloging, collections, circulation,…
75 FR 32692 - Schools and Libraries Universal Service Support Mechanism
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-09
..., wireless Internet access applications, and web hosting. We propose to revise the Commission's rules to.../anti-spam software, scheduling services, wireless Internet access applications, and web hosting should... schools and libraries may receive discounts for eligible telecommunications services, Internet access, and...
NASA Astrophysics Data System (ADS)
Hardyanto, W.; Purwinarko, A.; Adhi, M. A.
2018-03-01
The library, which is the gateway of the university, should be supported by an adequate information system in order to provide excellent and optimal service to every user. The library management system, in existence since 2009, needs to be re-evaluated so that it can meet the needs of operators and Unnes users in particular, and users from outside Unnes in general. This study aims to evaluate and improve the existing library management system to produce a system that is accountable and able to meet the needs of end users, as well as to produce an integrated Unnes library management system. The research is directed at producing an evaluation report based on the Technology Acceptance Model (TAM) approach and a library management system integrated with the national standard.
ERIC Educational Resources Information Center
Choudhury, Sayeed; Hobbs, Benjamin; Lorie, Mark; Flores, Nicholas; Coleman, Anita; Martin, Mairead; Kuhlman, David L.; McNair, John H.; Rhodes, William A.; Tipton, Ron; Agnew, Grace; Nicholson, Dennis; Macgregor, George
2002-01-01
Includes four articles that address issues related to digital libraries. Highlights include a framework for evaluating digital library services, particularly academic research libraries; interdisciplinary approaches to education about digital libraries that includes library and information science and computing; digital rights management; and the…
Implementing the HDF-EOS5 software library for data products in the UNAVCO InSAR archive
NASA Astrophysics Data System (ADS)
Baker, Scott; Meertens, Charles; Crosby, Christopher
2017-04-01
UNAVCO is a non-profit university-governed consortium that operates the U.S. National Science Foundation (NSF) Geodesy Advancing Geosciences and EarthScope (GAGE) facility and provides operational support to the Western North America InSAR Consortium (WInSAR). The seamless synthetic aperture radar archive (SSARA) is a seamless distributed access system for SAR data and higher-level data products. Under the NASA-funded SSARA project, a user-contributed InSAR archive for interferograms, time series, and other derived data products was developed at UNAVCO. The InSAR archive development has led to the adoption of the HDF-EOS5 data model, file format, and library. The HDF-EOS software library was designed to support NASA Earth Observation System (EOS) science data products and provides data structures for radar geometry (Swath) and geocoded (Grid) data based on the HDF5 data model and file format provided by the HDF Group. HDF-EOS5 inherits the benefits of HDF5 (open-source software support, internal compression, portability, support for structural data, self-describing file metadata, enhanced performance, and XML support) and provides a way to standardize InSAR data products. Instrument- and datatype-independent services, such as subsetting, can be applied to files across a wide variety of data products through the same library interface. The library allows integration with GIS software packages such as ArcGIS and GDAL, conversion to other data formats like NetCDF and GeoTIFF, and is extensible with new data structures to support future requirements. UNAVCO maintains a GitHub repository that provides example software for creating data products from popular InSAR processing software packages like GMT5SAR and ISCE as well as examples for reading and converting the data products into other formats. Digital object identifiers (DOI) have been incorporated into the InSAR archive allowing users to assign a permanent location for their processed result and easily reference the final data products. A metadata attribute is added to the HDF-EOS5 file when a DOI is minted for a data product. These data products are searchable through the SSARA federated query providing access to processed data for both expert and non-expert InSAR users. The archive facilitates timely distribution of processed data with particular importance for geohazards and event response.
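Since HDF-EOS5 files are ordinary HDF5 underneath, a minimal h5py sketch can illustrate the kind of product described above: a compressed geocoded grid plus a DOI metadata attribute. The group layout, dataset names, and the DOI value are hypothetical placeholders, not the archive's actual schema.

```python
import numpy as np
import h5py

unwrapped = np.zeros((1200, 1500), dtype=np.float32)   # placeholder interferogram grid

with h5py.File("insar_product.h5", "w") as f:
    # Hypothetical group layout loosely mirroring an HDF-EOS5 Grid structure.
    grid = f.create_group("HDFEOS/GRIDS/interferogram")
    dset = grid.create_dataset("unwrapped_phase", data=unwrapped,
                               compression="gzip", compression_opts=4)
    dset.attrs["units"] = "radians"
    # Placeholder DOI attribute, as mentioned in the abstract.
    f.attrs["digital_object_identifier"] = "10.xxxx/example"
```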
Implementation of a Space Communications Cognitive Engine
NASA Technical Reports Server (NTRS)
Hackett, Timothy M.; Bilen, Sven G.; Ferreira, Paulo Victor R.; Wyglinski, Alexander M.; Reinhart, Richard C.
2017-01-01
Although communications-based cognitive engines have been proposed, very few have been implemented in a full system, especially in a space communications system. In this paper, we detail the implementation of a multi-objective reinforcement-learning algorithm and deep artificial neural networks for use as a radio-resource-allocation controller. The modular software architecture presented encourages re-use and easy modification for trying different algorithms. Various trade studies involved with the system implementation and integration are discussed. These include the choice of software libraries that provide platform flexibility and promote reusability, choices regarding the deployment of this cognitive engine within a system architecture using the DVB-S2 standard and commercial hardware, and constraints placed on the cognitive engine caused by real-world radio constraints. The implemented radio-resource allocation-management controller was then integrated with the larger space-ground system developed by NASA Glenn Research Center (GRC).
StreamThermal: A software package for calculating thermal metrics from stream temperature data
Tsang, Yin-Phan; Infante, Dana M.; Stewart, Jana S.; Wang, Lizhu; Tingly, Ralph; Thornbrugh, Darren; Cooper, Arthur; Wesley, Daniel
2016-01-01
Improved quality and better availability of continuous stream temperature data allow natural resource managers, particularly in fisheries, to understand associations between different characteristics of stream thermal regimes and stream fishes. However, there is no convenient tool to efficiently characterize multiple metrics reflecting stream thermal regimes with the increasing amount of data. This article describes a software program packaged as a library in R to facilitate this process. With this freely-available package, users will be able to quickly summarize metrics that describe five categories of stream thermal regimes: magnitude, variability, frequency, timing, and rate of change. The installation and usage instructions for this package, the definitions of the calculated thermal metrics, and the output format of the package are described, along with an application showing the utility of multiple metrics. We believe this package can be widely utilized by interested stakeholders and greatly assist more studies in fisheries.
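StreamThermal itself is an R package; the Python/pandas sketch below only illustrates the five metric categories named above (magnitude, variability, frequency, timing, rate of change) on toy daily temperature data, and is not the package's API. The temperature values and the 10 °C threshold are illustrative assumptions.

```python
import pandas as pd

# Toy daily mean stream temperatures indexed by date.
temps = pd.Series(
    [4.0, 5.5, 7.2, 9.8, 12.1, 14.0, 13.2, 11.0],
    index=pd.date_range("2015-06-01", periods=8, freq="D"),
)

metrics = {
    "magnitude_mean": temps.mean(),                        # magnitude
    "variability_range": temps.max() - temps.min(),        # variability
    "frequency_days_above_10C": int((temps > 10.0).sum()), # frequency
    "timing_day_of_max": temps.idxmax().dayofyear,         # timing
    "rate_of_change_max_daily": temps.diff().abs().max(),  # rate of change
}
print(metrics)
```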
Menegidio, Fabiano B; Jabes, Daniela L; Costa de Oliveira, Regina; Nunes, Luiz R
2018-02-01
This manuscript introduces and describes Dugong, a Docker image based on Ubuntu 16.04, which automates installation of more than 3500 bioinformatics tools (along with their respective libraries and dependencies), in alternative computational environments. The software operates through a user-friendly XFCE4 graphic interface that allows software management and installation by users not fully familiarized with the Linux command line and provides the Jupyter Notebook to assist in the delivery and exchange of consistent and reproducible protocols and results across laboratories, assisting in the development of open science projects. Source code and instructions for local installation are available at https://github.com/DugongBioinformatics, under the MIT open source license. Luiz.nunes@ufabc.edu.br. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
DICOM static and dynamic representation through unified modeling language
NASA Astrophysics Data System (ADS)
Martinez-Martinez, Alfonso; Jimenez-Alaniz, Juan R.; Gonzalez-Marquez, A.; Chavez-Avelar, N.
2004-04-01
The DICOM standard, like all standards, specifies in a generic way the management of digital medical images and their related information in network and storage media environments. However, understanding the specifications for a particular implementation is not trivial work. Thus, this work is about understanding and modelling parts of the DICOM standard using object-oriented methodologies as part of software development processes. This has offered different static and dynamic views, in accordance with the standard specifications, and the resultant models have been represented through the Unified Modelling Language (UML). The modelled parts are related to the network conformance claim: Network Communication Support for Message Exchange, Message Exchange, Information Object Definitions, Service Class Specifications, Data Structures and Encoding, and the Data Dictionary. The resultant models have given a better understanding of the DICOM parts and have opened the possibility of creating a software library to develop DICOM-conformant PACS applications.
ERIC Educational Resources Information Center
International Federation of Library Associations and Institutions, The Hague (Netherlands).
Papers presented at a session on management of library associations at the 1986 International Federation of Library Associations (IFLA) conference include: (1) "Medical Library Association: Organizational Change 1898 to Present--Illustrations from Continuing Education" (Raymond A. Palmer and M. Kent Mayfield, United States); (2)…
ERIC Educational Resources Information Center
Penchansky, Mimi B.; And Others
Prepared for the 1981 Spring Institute of the Library Association of City University of New York (LACUNY), this bibliography lists sources on academic library management techniques. Its three sections encompass the following areas: (1) the individual's relationship to the library organization, (2) effective management of time, and (3) human…
Management Matters: The Library Media Specialist's Management Toolbox
ERIC Educational Resources Information Center
Pappas, Marjorie L.
2004-01-01
Library media specialists need tools to help them manage the school library media program. The Internet includes a vast array of tools that a library media specialist might find useful. The websites and electronic resources included in this article are only a representative sample and future columns may explore additional tools. All the tools are…
The Management of Technical Services--1980.
ERIC Educational Resources Information Center
Rohdy, Margaret A.
1981-01-01
Examines the literature of administration of library technical services from both a library and a management approach, and describes some of the activities of management sections of various library organizations. There are 27 references. (RAA)
A Manual for Small Libraries in Alaska. Second Edition.
ERIC Educational Resources Information Center
Kolb, Audrey
This manual is intended as a guide and a resource for staff and volunteers in small public libraries in Alaska. It is for libraries managed by local residents who have not had training in operating and managing a library. Some of the chapters may be useful to other types of libraries, such as a small school library or a church library. The guide…
Why Students Should Use Multimedia.
ERIC Educational Resources Information Center
Bucher, Katherine
1995-01-01
Examines benefits of multimedia presentation software for teachers, librarians, and elementary and secondary school students. Discusses an example of two school library media specialists working with an English class using the library's reference collection (print and CD-ROM) and HyperCard. Suggests presentation assignments for various subjects.…
Microcomputers and Workstations in Libraries: Trends and Opportunities.
ERIC Educational Resources Information Center
Welsch, Erwin K.
1990-01-01
Summarizes opinions of scholars in various disciplines on workstation history, definition, and functions. Networks and configurations for library workstations, including hardware and software recommendations, are described. The impact of workstations on the workplace, resulting in task, process, and institutional transformation, is also considered.…
Exploring Planet PDA: The Librarian as Astronaut, Innovator, and Expert.
ERIC Educational Resources Information Center
Galganski, Carol; Peters, Tom; Bell, Lori
2002-01-01
Describes the integration of personal digital assistants into a medical center library's services in Illinois. Discusses training for users; hardware selection; software selection and content; technical support; the role of libraries, including the creation of policies and procedures; and future challenges. (LRW)
The Future of Library Automation in Schools.
ERIC Educational Resources Information Center
Anderson, Elaine
2000-01-01
Addresses the future of library automation programs for schools. Discusses requirements of emerging OPACs and circulation systems; the Schools Interoperability Framework (SIF), an industry initiative to develop an open specification for ensuring that K-12 instructional and administrative software applications work together more effectively; home…
Providing Services to Virtual Patrons.
ERIC Educational Resources Information Center
Hulshof, Robert
1999-01-01
Discusses the types of services libraries need to support patrons who access the library via the Internet or e-mail. Highlights include issues in technical support; establishing policies and procedures; tools for technical support, including hardware and software; impacts of technical support on staff; and future possibilities. (LRW)
Protecting Public-Access Computers in Libraries.
ERIC Educational Resources Information Center
King, Monica
1999-01-01
Describes one public library's development of a computer-security plan, along with helpful products used. Discussion includes Internet policy, physical protection of hardware, basic protection of the operating system and software on the network, browser dilemmas and maintenance, creating a clear, intuitive interface, and administering fair use and…
Virtually Usable: A Test of the Information Gardens
ERIC Educational Resources Information Center
Johnson, Megan; Ochoa, Louise; Purpur, Geraldine
2007-01-01
This paper presents the results of a usability study conducted to determine the functionality of a desktop, three-dimensional virtual library designed and supported by the Appalachian State University Distance Learning Library Services team. Formative evaluations were performed with representative students utilizing Morae software. Results…
75 FR 25185 - Broadband Initiatives Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-07
..., excluding desktop or laptop computers, computer hardware and software (including anti-virus, anti-spyware, and other security software), audio or video equipment, computer network components... 10 desktop or laptop computers and individual workstations to be located within the rural library...
Open source clustering software.
de Hoon, M J L; Imoto, S; Nolan, J; Miyano, S
2004-06-12
We have implemented k-means clustering, hierarchical clustering and self-organizing maps in a single multipurpose open-source library of C routines, callable from other C and C++ programs. Using this library, we have created an improved version of Michael Eisen's well-known Cluster program for Windows, Mac OS X and Linux/Unix. In addition, we generated a Python and a Perl interface to the C Clustering Library, thereby combining the flexibility of a scripting language with the speed of C. The C Clustering Library and the corresponding Python C extension module Pycluster were released under the Python License, while the Perl module Algorithm::Cluster was released under the Artistic License. The GUI code Cluster 3.0 for Windows, Macintosh and Linux/Unix, as well as the corresponding command-line program, were released under the same license as the original Cluster code. The complete source code is available at http://bonsai.ims.u-tokyo.ac.jp/mdehoon/software/cluster. Alternatively, Algorithm::Cluster can be downloaded from CPAN, while Pycluster is also available as part of the Biopython distribution.
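A brief usage sketch of the k-means entry point in the Pycluster module described above; the argument names and return values follow my recollection of the interface and may differ slightly between released versions.

    # k-means clustering with Pycluster (interface details may vary by version).
    import numpy as np
    import Pycluster

    data = np.random.rand(50, 4)             # 50 items with 4 measurements each
    clusterid, error, nfound = Pycluster.kcluster(data, nclusters=3, npass=10)
    print(clusterid)                          # cluster index assigned to each item
    print(error, nfound)                      # within-cluster error; times the best solution was found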
ERIC Educational Resources Information Center
Turner, Richard; Matthews, Graham; Ashcroft, Linda; Farrow, Janet
2007-01-01
This paper investigates aspects of the management of independent secondary school libraries in England and Wales. It is based on a survey of 150 independent school library managers, with a response rate of 68.7 percent, which was carried out as part of an ongoing PhD research project. The paper considers a range of issues important to school…
ERIC Educational Resources Information Center
Kim, Yong-Mi; Abbas, June
2010-01-01
This study investigates the adoption of Library 2.0 functionalities by academic libraries and users through a knowledge management perspective. Based on 230 randomly selected academic library Web sites and 184 users, the authors found that RSS and blogs are widely adopted by academic libraries, while users widely utilized the bookmark function.…
Self-Managed Teams for Library Management: Increasing Employee Participation via Empowerment.
ERIC Educational Resources Information Center
Poon-Richards, Craig
1995-01-01
Investigates the growing prevalence of participatory management in libraries. The operation of self-managed teams is discussed both in theory and in practice, the latter with examples from Sterling Library at Yale University. Research is summarized that relates to management teams and how they create a sense of empowerment by building shared…
Providing Access to CD-ROM Databases in a Campus Setting. Part II: Networking CD-ROMs via a LAN.
ERIC Educational Resources Information Center
Koren, Judy
1992-01-01
The second part of a report on CD-ROM networking in libraries describes LAN (local area network) technology; networking software and towers; gateway software for connecting to campuswide networks; Macintosh LANs; and network licenses. Several product and software reviews are included, and a sidebar lists vendor addresses. (NRP)
Research flight software engineering and MUST, an integrated system of support tools
NASA Technical Reports Server (NTRS)
Straeter, T. A.; Foudriat, E. C.; Will, R. W.
1977-01-01
Consideration is given to software development to support NASA flight research. The Multipurpose User-Oriented Software Technology (MUST) program, designed to integrate digital systems into flight research, is discussed. Particular attention is given to the program's special interactive user interface, subroutine library, assemblers, compiler, automatic documentation tools, and test and simulation subsystems.
Wrapping Python around MODFLOW/MT3DMS based groundwater models
NASA Astrophysics Data System (ADS)
Post, V.
2008-12-01
Numerical models that simulate groundwater flow and solute transport require a great amount of input data that is often organized into different files. A large proportion of the input data consists of spatially-distributed model parameters. The model output consists of a variety of data such as heads, fluxes and concentrations. Typically all files have different formats. Consequently, preparing input and managing output is a complex and error-prone task. Proprietary software tools are available that facilitate the preparation of input files and analysis of model outcomes. The use of such software may be limited if it does not support all the features of the groundwater model or when the costs of such tools are prohibitive. Therefore a Python library was developed that contains routines to generate input files and process output files of MODFLOW/MT3DMS based models. The library is freely available and has an open structure so that the routines can be customized and linked into other scripts and libraries. The current set of functions supports the generation of input files for MODFLOW and MT3DMS, including the capability to read spatially-distributed input parameters (e.g. hydraulic conductivity) from PNG files. Both ASCII and binary output files can be read efficiently, allowing for visualization of, for example, solute concentration patterns in contour plots with superimposed flow vectors using matplotlib. Series of contour plots are then easily saved as an animation. The subroutines can also be used within scripts to calculate derived quantities such as the mass of a solute within a particular region of the model domain. Using Python as a wrapper around groundwater models provides an efficient and flexible way of processing input and output data, which is not constrained by limitations of third-party products.
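The abstract does not reproduce the library's routines, so the sketch below only illustrates the visualization step it mentions (contour plots with superimposed flow vectors in matplotlib) on synthetic arrays; the head field and unit conductivity are invented for the example.

    # Generic matplotlib sketch of the visualization step described above, using a
    # synthetic head field; the authors' MODFLOW/MT3DMS readers are not reproduced.
    import numpy as np
    import matplotlib.pyplot as plt

    x, y = np.meshgrid(np.linspace(0, 1000, 50), np.linspace(0, 1000, 50))
    head = 10.0 - 0.005 * x + 0.5 * np.sin(y / 200.0)      # synthetic heads [m]
    dhdy, dhdx = np.gradient(head, y[:, 0], x[0, :])        # head gradients
    qx, qy = -dhdx, -dhdy                                   # flow vectors (unit conductivity)

    cs = plt.contourf(x, y, head, 20, cmap="viridis")
    plt.colorbar(cs, label="head [m]")
    plt.quiver(x[::4, ::4], y[::4, ::4], qx[::4, ::4], qy[::4, ::4], color="white")
    plt.title("Synthetic head field with superimposed flow vectors")
    plt.show()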
ARC SDK: A toolbox for distributed computing and data applications
NASA Astrophysics Data System (ADS)
Skou Andersen, M.; Cameron, D.; Lindemann, J.
2014-06-01
Grid middleware suites provide tools to perform the basic tasks of job submission and retrieval and data access; however, these tools tend to be low-level, operating on individual jobs or files and lacking higher-level concepts. User communities therefore generally develop their own application-layer software, catering to their specific communities' needs, on top of the Grid middleware. It is thus important for the Grid middleware to provide a friendly, well-documented and simple-to-use interface for applications to build upon. The Advanced Resource Connector (ARC), developed by NorduGrid, provides a Software Development Kit (SDK) which enables applications to use the middleware for job and data management. This paper presents the architecture and functionality of the ARC SDK along with an example graphical application developed with the SDK. The SDK consists of a set of libraries accessible through Application Programming Interfaces (API) in several languages. It contains extensive documentation and example code and is available on multiple platforms. The libraries provide generic interfaces and rely on plugins to support a given technology or protocol, and this modular design makes it easy to add a new plugin if the application requires supporting additional technologies. The ARC Graphical Clients package is a graphical user interface built on top of the ARC SDK and the Qt toolkit, and it is presented here as a fully functional example of an application. It provides a graphical interface to enable job submission and management at the click of a button, and allows data on any Grid storage system to be manipulated using a visual file system hierarchy, as if it were a regular file system.
ERIC Educational Resources Information Center
Krzywkowski, Valerie I., Ed.
The 15 papers in this collection discuss various aspects of computer use in libraries and several other aspects of library service not directly related to computers. Following an introduction and a list of officers, the papers are: (1) "Criminal Justice and Related Databases" (Kate E. Adams); (2) "Software and Hard Thought:…
Putting the Library at Students' Fingertips
ERIC Educational Resources Information Center
Foley, Marianne
2012-01-01
The absence of a well-defined space for library resources within course-management systems has been well documented in library literature. Academic libraries have sought to remedy this deficiency in numerous ways. This article describes how a "library nugget," or module, was added to the Buffalo State College course-management system,…
Using geographic information systems to identify prospective marketing areas for a special library.
McConnaughy, Rozalynd P; Wilson, Steven P
2006-05-04
The Center for Disability Resources (CDR) Library is the largest collection of its kind in the Southeastern United States, consisting of over 5,200 books, videos/DVDs, brochures, and audiotapes covering a variety of disability-related topics, from autism to transition resources. The purpose of the library is to support the information needs of families, faculty, students, staff, and other professionals in South Carolina working with individuals with disabilities. The CDR Library is funded on a yearly basis; therefore, maintaining high usage is crucial. A variety of promotional efforts have been used to attract new patrons to the library. Anyone in South Carolina can check out materials from the library, and most of the patrons use the library remotely by requesting materials, which are then mailed to them. The goal of this project was to identify areas of low geographic usage as a means of identifying locations for future library marketing efforts. Nearly four years' worth of library statistics were compiled in a spreadsheet that provided information per county on the number of checkouts, the number of renewals, and the population. Five maps were created using ArcView GIS software to create visual representations of patron checkout and renewal behavior per county. Of the 46 counties in South Carolina, eight never checked out materials from the library. As expected, urban areas and counties near the library's physical location have high usage totals. The visual representation of the data made identification of low-usage regions easier than using a standalone database with no visual-spatial component. The low-usage counties will be the focus of future Center for Disability Resources Library marketing efforts. Because Geographic Information Systems produce visual-spatial representations that communicate information more efficiently than stand-alone database records can, librarians may benefit from using the software as a supplemental tool for tracking library usage and planning promotional efforts.
Software packager user's guide
NASA Technical Reports Server (NTRS)
Callahan, John R.
1995-01-01
Software integration is a growing area of concern for many programmers and software managers because the need to build new programs quickly from existing components is greater than ever. This includes building versions of software products for multiple hardware platforms and operating systems, building programs from components written in different languages, and building systems from components that must execute on different machines in a distributed network. The goal of software integration is to make building new programs from existing components more seamless -- programmers should pay minimal attention to the underlying configuration issues involved. Libraries of reusable components and classes are important tools but only partial solutions to software development problems. Even though software components may have compatible interfaces, there may be other reasons, such as differences between execution environments, why they cannot be integrated. Often, components must be adapted or reimplemented to fit into another application because of implementation differences -- they are implemented in different programming languages, dependent on different operating system resources, or must execute on different physical machines. The software packager is a tool that allows programmers to deal with interfaces between software components and ignore complex integration details. The packager takes modular descriptions of the structure of a software system written in the package specification language and produces an integration program in the form of a makefile. If complex integration tools are needed to integrate a set of components, such as remote procedure call stubs, their use is implied by the packager automatically and stub generation tools are invoked in the corresponding makefile. The programmer deals only with the components themselves and not the details of how to build the system on any given platform.
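The package specification language itself is not shown in the abstract; the sketch below only illustrates the underlying idea, turning a declarative component description into a makefile, with invented component names and rules rather than the tool's actual syntax or output.

    # Hypothetical sketch of the packager's core idea: generate a makefile from a
    # declarative description of components. Not the tool's actual specification
    # language or output.
    components = {
        "solver": {"sources": ["solver.c"], "language": "c"},
        "driver": {"sources": ["driver.py"], "language": "python", "uses": ["solver"]},
    }

    lines = ["all: " + " ".join(components), ""]
    for name, spec in components.items():
        deps = " ".join(spec["sources"] + spec.get("uses", []))
        lines.append(f"{name}: {deps}")
        if spec["language"] == "c":
            lines.append(f"\tcc -o {name} {' '.join(spec['sources'])}")
        else:
            lines.append("\ttouch $@")        # interpreted components need no build step
        lines.append("")

    with open("Makefile", "w") as fh:
        fh.write("\n".join(lines))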
Sand, Andreas; Kristiansen, Martin; Pedersen, Christian N S; Mailund, Thomas
2013-11-22
Hidden Markov models are widely used for genome analysis as they combine ease of modelling with efficient analysis algorithms. Calculating the likelihood of a model using the forward algorithm has worst case time complexity linear in the length of the sequence and quadratic in the number of states in the model. For genome analysis, however, the length runs to millions or billions of observations, and when maximising the likelihood hundreds of evaluations are often needed. A time efficient forward algorithm is therefore a key ingredient in an efficient hidden Markov model library. We have built a software library for efficiently computing the likelihood of a hidden Markov model. The library exploits commonly occurring substrings in the input to reuse computations in the forward algorithm. In a pre-processing step our library identifies common substrings and builds a structure over the computations in the forward algorithm which can be reused. This analysis can be saved between uses of the library and is independent of concrete hidden Markov models so one preprocessing can be used to run a number of different models.Using this library, we achieve up to 78 times shorter wall-clock time for realistic whole-genome analyses with a real and reasonably complex hidden Markov model. In one particular case the analysis was performed in less than 8 minutes compared to 9.6 hours for the previously fastest library. We have implemented the preprocessing procedure and forward algorithm as a C++ library, zipHMM, with Python bindings for use in scripts. The library is available at http://birc.au.dk/software/ziphmm/.
The JPL Library information retrieval system
NASA Technical Reports Server (NTRS)
Walsh, J.
1975-01-01
The development, capabilities, and products of the computer-based retrieval system of the Jet Propulsion Laboratory Library are described. The system handles books and documents, produces a book catalog, and provides a machine search capability. Programs and documentation are available to the public through NASA's computer software dissemination program.
Bibliographic Utilities and the Use of Microcomputers in Libraries: Current and Projected Practices.
ERIC Educational Resources Information Center
McAninch, Glen
1986-01-01
Bibliographic utilities are marketing specially designed hardware and software that permit libraries to patch together automated cataloging, acquisitions, serials control, reference, online public catalogs, circulation, interlibrary loan, and administrative functions. The increasing complexity of the technology is making it more difficult to…
Development and Evaluation of a Measure of Library Automation.
ERIC Educational Resources Information Center
Pungitore, Verna L.
1986-01-01
Construct validity and reliability estimates indicate that study designed to measure utilization of automation in public and academic libraries was successful in tentatively identifying and measuring three subdimensions of level of automation: quality of hardware, method of software development, and number of automation specialists. Questionnaire…
Public Access Workstations in the Library: New Trends.
ERIC Educational Resources Information Center
Beecher, Henry
1991-01-01
Discusses the use of microcomputer-based workstations that are provided for public access in libraries. Criteria for workstations are discussed, including standard hardware, open-design software, scalable interface, and connectivity options for networking; systems that provide full-text access are described; and the need for standards is…
Search without Boundaries Using Simple APIs
ERIC Educational Resources Information Center
Tong, Qi (Helen)
2009-01-01
The U.S. Geological Survey (USGS) Library, where the author serves as the digital services librarian, is increasingly challenged to make it easier for users to find information from many heterogeneous information sources. Information is scattered throughout different software applications (i.e., library catalog, federated search engine, link…
Ringling School of Art and Design Builds a CASTLE.
ERIC Educational Resources Information Center
Morse, Yvonne; Davis, Wendy
1984-01-01
Describes the development and installation of the Computer Automated Software for the Total Library Environment System (CASTLE), which uses a microcomputer to automate operations of small academic library in six main areas: circulation, online catalog, inventory and file maintenance, audiovisual equipment, accounting, and information and…
Imaging Technology in Libraries: Photo CD Offers New Possibilities.
ERIC Educational Resources Information Center
Beiser, Karl
1993-01-01
Describes Kodak's Photo CD technology, a format for the storage and retrieval of photographic images in electronic form. Highlights include current and future Photo CD formats; computer imaging technology; ownership issues; hardware for using Photo CD; software; library and information center applications, including image collections and…
The Quebec National Library on the Web.
ERIC Educational Resources Information Center
Kieran, Shirley; Sauve, Diane
1997-01-01
Provides an overview of the Quebec National Library (Bibliotheque Nationale du Quebec, or BNQ) Web site. Highlights include issues related to content, design, and technology; IRIS, the BNQ online public access catalog; development of the multimedia catalog; software; digitization of documents; links to bibliographic records; and future…
Going Prime Time with Live Chat Reference.
ERIC Educational Resources Information Center
Hoag, Tara J.; Cichanowicz, Edana McCaffrey
2001-01-01
Describes the development of the Suffolk Cooperative Library System's live, online chat reference service, a pilot project for public libraries in Suffolk County (New York). Topics include chat software selection; a virtual reference collection; marketing; funding; staffing; evaluation; expanded hours of service; email; and extracting data from…
Design of Inhouse Automated Library Systems.
ERIC Educational Resources Information Center
Cortez, Edwin M.
1984-01-01
Examines six steps inherent to development of in-house automated library system: (1) problem definition, (2) requirement specifications, (3) analysis of alternatives and solutions, (4, 5) design and implementation of hardware and software, and (6) evaluation. Practical method for comparing and weighting options is illustrated and explained. A…
Who Goes There? Measuring Library Web Site Usage.
ERIC Educational Resources Information Center
Bauer, Kathleen
2000-01-01
Discusses how libraries can gather data on the use of their Web sites. Highlights include Web server log files, including the common log file, referrer log file, and agent log file; log file limitations; privacy concerns; and choosing log analysis software, both free and commercial. (LRW)
ERIC Educational Resources Information Center
Pitkin, Gary M., Ed.
As a reference guide for library professionals, this volume helps librarians prepare for the future in the growing electronic environment by examining the historical and theoretical background of the National Electronic Library and assessing the role of libraries in the past, present, and future. The book is divided in two sections: "The…
Tips for Good Electronic Presentations.
ERIC Educational Resources Information Center
Strasser, Dennis
1996-01-01
Describes library uses of presentation graphics software and offers tips for creating electronic presentations. Tips include: audience retention; visual aid options; software package options; presentation planning; presentation showing; and use of text, colors, and graphics. Sidebars note common presentation errors and popular presentation…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, K.G.; Kosson, D.S.; Garrabrants, A.C.
2013-07-01
The Cementitious Barriers Partnership (CBP) Project is a multi-disciplinary, multi-institutional collaboration supported by the Office of Tank Waste Management within the Office of Environmental Management of U.S. Department of Energy (US DOE). The CBP program has developed a set of integrated tools (based on state-of-the-art models and leaching test methods) that improve understanding and predictions of the long-term hydraulic and chemical performance of cementitious barriers used in nuclear applications. Tools selected for and developed under this program are intended to evaluate and predict the behavior of cementitious barriers used in near-surface engineered waste disposal systems for periods of performance upmore » to or longer than 100 years for operating facilities and longer than 1,000 years for waste management purposes. CBP software tools were made available to selected DOE Office of Environmental Management and field site users for training and evaluation based on a set of important degradation scenarios, including sulfate ingress/attack and carbonation of cementitious materials. The tools were presented at two-day training workshops held at U.S. National Institute of Standards and Technology (NIST), Savannah River, and Hanford included LeachXS{sup TM}/ORCHESTRA, STADIUM{sup R}, and a CBP-developed GoldSim Dashboard interface. Collectively, these components form the CBP Software Toolbox. The new U.S. Environmental Protection Agency leaching test methods based on the Leaching Environmental Assessment Framework (LEAF) were also presented. The CBP Dashboard uses a custom Dynamic-link library developed by CBP to couple to the LeachXS{sup TM}/ORCHESTRA and STADIUM{sup R} codes to simulate reactive transport and degradation in cementitious materials for selected performance assessment scenarios. The first day of the workshop introduced participants to the software components via presentation materials, and the second day included hands-on tutorial exercises followed by discussions of enhancements desired by participants. Tools were revised based on feedback obtained during the workshops held from April through June 2012. The resulting improved CBP Software Toolbox, including evaluation versions of and LeachXS{sup TM}/ORCHESTRA and STADIUM{sup R} has been made available to workshop and selected other participants for further assessment. Inquiries about future workshops and requests for access to the Toolbox software can be made via the CBP web site [1]. (authors)« less
OSIRIX: open source multimodality image navigation software
NASA Astrophysics Data System (ADS)
Rosset, Antoine; Pysher, Lance; Spadola, Luca; Ratib, Osman
2005-04-01
The goal of our project is to develop a completely new software platform that will allow users to efficiently and conveniently navigate through large sets of multidimensional data without the need of high-end expensive hardware or software. We also elected to develop our system on new open source software libraries allowing other institutions and developers to contribute to this project. OsiriX is a free and open-source imaging software designed manipulate and visualize large sets of medical images: http://homepage.mac.com/rossetantoine/osirix/
CRISPR library designer (CLD): software for multispecies design of single guide RNA libraries.
Heigwer, Florian; Zhan, Tianzuo; Breinig, Marco; Winter, Jan; Brügemann, Dirk; Leible, Svenja; Boutros, Michael
2016-03-24
Genetic screens using CRISPR/Cas9 are a powerful method for the functional analysis of genomes. Here we describe CRISPR library designer (CLD), an integrated bioinformatics application for the design of custom single guide RNA (sgRNA) libraries for all organisms with annotated genomes. CLD is suitable for the design of libraries using modified CRISPR enzymes and targeting non-coding regions. To demonstrate its utility, we perform a pooled screen for modulators of the TNF-related apoptosis inducing ligand (TRAIL) pathway using a custom library of 12,471 sgRNAs. CLD predicts a high fraction of functional sgRNAs and is publicly available at https://github.com/boutroslab/cld.
NASA Technical Reports Server (NTRS)
Conrad, A. R.; Lupton, W. F.
1992-01-01
Each Keck instrument presents a consistent software view to the user interface programmer. The view consists of a small library of functions, which are identical for all instruments, and a large set of keywords, that vary from instrument to instrument. All knowledge of the underlying task structure is hidden from the application programmer by the keyword layer. Image capture software uses the same function library to collect data for the image header. Because the image capture software and the instrument control software are built on top of the same keyword layer, a given observation can be 'replayed' by extracting keyword-value pairs from the image header and passing them back to the control system. The keyword layer features non-blocking as well as blocking I/O. A non-blocking keyword write operation (such as setting a filter position) specifies a callback to be invoked when the operation is complete. A non-blocking keyword read operation specifies a callback to be invoked whenever the keyword changes state. The keyword-callback style meshes well with the widget-callback style commonly used in X window programs. The first keyword library was built for the two Keck optical instruments. More recently, keyword libraries have been developed for the infrared instruments and for telescope control. Although the underlying mechanisms used for inter-process communication by each of these systems vary widely (Lick MUSIC, Sun RPC, and direct socket I/O, respectively), a basic user interface has been written that can be used with any of these systems. Since the keyword libraries are bound to user interface programs dynamically at run time, only a single set of user interface executables is needed. For example, the same program, 'xshow', can be used to display continuously the telescope's position, the time left in an instrument's exposure, or both values simultaneously. Less generic tools that operate on specific keywords, for example an X display that controls optical instrument exposures, have also been written using the keyword layer.
The software and algorithms for hyperspectral data processing
NASA Astrophysics Data System (ADS)
Shyrayeva, Anhelina; Martinov, Anton; Ivanov, Victor; Katkovsky, Leonid
2017-04-01
Hyperspectral remote sensing technique is widely used for collecting and processing -information about the Earth's surface objects. Hyperspectral data are combined to form a three-dimensional (x, y, λ) data cube. Department of Aerospace Research of the Institute of Applied Physical Problems of the Belarusian State University presents a general model of the software for hyperspectral image data analysis and processing. The software runs in Windows XP/7/8/8.1/10 environment on any personal computer. This complex has been has been written in C++ language using QT framework and OpenGL for graphical data visualization. The software has flexible structure that consists of a set of independent plugins. Each plugin was compiled as Qt Plugin and represents Windows Dynamic library (dll). Plugins can be categorized in terms of data reading types, data visualization (3D, 2D, 1D) and data processing The software has various in-built functions for statistical and mathematical analysis, signal processing functions like direct smoothing function for moving average, Savitzky-Golay smoothing technique, RGB correction, histogram transformation, and atmospheric correction. The software provides two author's engineering techniques for the solution of atmospheric correction problem: iteration method of refinement of spectral albedo's parameters using Libradtran and analytical least square method. The main advantages of these methods are high rate of processing (several minutes for 1 GB data) and low relative error in albedo retrieval (less than 15%). Also, the software supports work with spectral libraries, region of interest (ROI) selection, spectral analysis such as cluster-type image classification and automatic hypercube spectrum comparison by similarity criterion with similar ones from spectral libraries, and vice versa. The software deals with different kinds of spectral information in order to identify and distinguish spectrally unique materials. Also, the following advantages should be noted: fast and low memory hypercube manipulation features, user-friendly interface, modularity, and expandability.
Knowledge Management and Academic Libraries.
ERIC Educational Resources Information Center
Townley, Charles T.
2001-01-01
The emerging field of knowledge management offers academic libraries the opportunity to improve effectiveness, both for themselves and their parent institutions. This article summarizes knowledge management theory. Current applications in academic libraries and higher education are described. Similarities and difficulties between knowledge…
Academic Information Services: A Library Management Perspective.
ERIC Educational Resources Information Center
Allen, Bryce
1995-01-01
Using networked information resources to communicate research results has great potential for academic libraries; this development will require collaboration among libraries, scholars, computing centers, and university presses. Library managers can help overcome collaboration barriers by developing appropriate organizational structures, selecting…
Beam Instrument Development System
DOE Office of Scientific and Technical Information (OSTI.GOV)
DOOLITTLE, LAWRENCE; HUANG, GANG; DU, QIANG
Beam Instrumentation Development System (BIDS) is a collection of common support libraries and modules developed during a series of Low-Level Radio Frequency (LLRF) control and timing/synchronization projects. BIDS includes a collection of Hardware Description Language (HDL) libraries and software libraries. The BIDS can be used for the development of any FPGA-based system, such as LLRF controllers. HDL code in this library is generic and supports common Digital Signal Processing (DSP) functions, FPGA-specific drivers (high-speed serial link wrappers, clock generation, etc.), ADC/DAC drivers, Ethernet MAC implementation, etc.
System and method for memory allocation in a multiclass memory system
Loh, Gabriel; Meswani, Mitesh; Ignatowski, Michael; Nutter, Mark
2016-06-28
A system for memory allocation in a multiclass memory system includes a processor coupleable to a plurality of memories sharing a unified memory address space, and a library store to store a library of software functions. The processor identifies a type of a data structure in response to a memory allocation function call to the library for allocating memory to the data structure. Using the library, the processor allocates portions of the data structure among multiple memories of the multiclass memory system based on the type of the data structure.
Analyzing rasters, vectors and time series using new Python interfaces in GRASS GIS 7
NASA Astrophysics Data System (ADS)
Petras, Vaclav; Petrasova, Anna; Chemin, Yann; Zambelli, Pietro; Landa, Martin; Gebbert, Sören; Neteler, Markus; Löwe, Peter
2015-04-01
GRASS GIS 7 is a free and open source GIS software developed and used by many scientists (Neteler et al., 2012). While some users of GRASS GIS prefer its graphical user interface, significant part of the scientific community takes advantage of various scripting and programing interfaces offered by GRASS GIS to develop new models and algorithms. Here we will present different interfaces added to GRASS GIS 7 and available in Python, a popular programming language and environment in geosciences. These Python interfaces are designed to satisfy the needs of scientists and programmers under various circumstances. PyGRASS (Zambelli et al., 2013) is a new object-oriented interface to GRASS GIS modules and libraries. The GRASS GIS libraries are implemented in C to ensure maximum performance and the PyGRASS interface provides an intuitive, pythonic access to their functionality. GRASS GIS Python scripting library is another way of accessing GRASS GIS modules. It combines the simplicity of Bash and the efficiency of the Python syntax. When full access to all low-level and advanced functions and structures from GRASS GIS library is required, Python programmers can use an interface based on the Python ctypes package. Ctypes interface provides complete, direct access to all functionality as it would be available to C programmers. GRASS GIS provides specialized Python library for managing and analyzing spatio-temporal data (Gebbert and Pebesma, 2014). The temporal library introduces space time datasets representing time series of raster, 3D raster or vector maps and allows users to combine various spatio-temporal operations including queries, aggregation, sampling or the analysis of spatio-temporal topology. We will also discuss the advantages of implementing scientific algorithm as a GRASS GIS module and we will show how to write such module in Python. To facilitate the development of the module, GRASS GIS provides a Python library for testing (Petras and Gebbert, 2014) which helps researchers to ensure the robustness of the algorithm, correctness of the results in edge cases as well as the detection of changes in results due to new development. For all modules GRASS GIS automatically creates standardized command line and graphical user interfaces and documentation. Finally, we will show how GRASS GIS can be used together with powerful Python tools such as the NumPy package and the IPython Notebook. References: Gebbert, S., Pebesma, E., 2014. A temporal GIS for field based environmental modeling. Environmental Modelling & Software 53, 1-12. Neteler, M., Bowman, M.H., Landa, M. and Metz, M., 2012. GRASS GIS: a multi-purpose Open Source GIS. Environmental Modelling & Software 31: 124-130. Petras, V., Gebbert, S., 2014. Testing framework for GRASS GIS: ensuring reproducibility of scientific geospatial computing. Poster presented at: AGU Fall Meeting, December 15-19, 2014, San Francisco, USA. Zambelli, P., Gebbert, S., Ciolli, M., 2013. Pygrass: An Object Oriented Python Application Programming Interface (API) for Geographic Resources Analysis Support System (GRASS) Geographic Information System (GIS). ISPRS International Journal of Geo-Information 2, 201-219.
Rapid identification of dairy lactic acid bacteria by M13-generated, RAPD-PCR fingerprint databases.
Rossetti, Lia; Giraffa, Giorgio
2005-11-01
About a thousand lactic acid bacteria (LAB) isolated from dairy products, especially cheeses, were identified and typed by species-specific PCR and RAPD-PCR, respectively. RAPD-PCR profiles, which were obtained by using the M13 sequence as a primer, allowed us to implement a large database of different fingerprints, which were analysed by BioNumerics software. Cluster analysis of the combined RAPD-PCR fingerprinting profiles enabled us to implement a library, which is a collection of library units, which in turn is a selection of representative database entries. A library unit, in this case, can be considered to be a definable taxon. The strains belonged to 11 main RAPD-PCR fingerprinting library units identified as Lactobacillus casei/paracasei, Lactobacillus plantarum, Lactobacillus rhamnosus, Lactobacillus helveticus, Lactobacillus delbrueckii, Lactobacillus fermentum, Lactobacillus brevis, Enterococcus faecium, Enterococcus faecalis, Streptococcus thermophilus and Lactococcus lactis. The possibility to routinely identify newly typed, bacterial isolates by consulting the library of the software was valued. The proposed method could be suggested to refine previous strain identifications, eliminate redundancy and dispose of a technologically useful LAB strain collection. The same approach could also be applied to identify LAB strains isolated from other food ecosystems.