Sample records for file management tool

  1. Distributed File System Utilities to Manage Large Datasets, Version 0.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-05-21

    FileUtils provides a suite of tools to manage large datasets typically created by large parallel MPI applications. They are written in C and use standard POSIX I/O calls. The current suite consists of tools to copy, compare, remove, and list. The tools provide dramatic speedup over existing Linux tools, which often run as a single process.
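
    The core idea behind such tools is to partition a list of files across many processes so that copies proceed in parallel. Below is a minimal sketch of that pattern using mpi4py (an assumption; FileUtils itself is written in C), with hypothetical directory names, and flattening directory structure for brevity:

        # Rank-parallel file copy: rank 0 lists the files, every rank copies
        # its share. Run with e.g.: mpiexec -n 8 python parallel_copy.py
        import shutil
        from pathlib import Path
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        def parallel_copy(src_dir: str, dst_dir: str) -> None:
            files = sorted(Path(src_dir).rglob("*")) if rank == 0 else None
            files = comm.bcast(files, root=0)       # same list on every rank
            Path(dst_dir).mkdir(parents=True, exist_ok=True)
            for f in files[rank::size]:             # round-robin partition
                if f.is_file():
                    shutil.copy2(f, Path(dst_dir) / f.name)
            comm.Barrier()                          # wait for all ranks

        if __name__ == "__main__":
            parallel_copy("dataset_in", "dataset_out")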

  2. Configuration Management File Manager Developed for Numerical Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.

    1997-01-01

    One of the objectives of the High Performance Computing and Communication Project's (HPCCP) Numerical Propulsion System Simulation (NPSS) is to provide a common and consistent way to manage applications, data, and engine simulations. The NPSS Configuration Management (CM) File Manager integrated with the Common Desktop Environment (CDE) window management system provides a common look and feel for the configuration management of data, applications, and engine simulations for U.S. engine companies. In addition, CM File Manager provides tools to manage a simulation. Features include managing input files, output files, textual notes, and any other material normally associated with simulation. The CM File Manager includes a generic configuration management Application Program Interface (API) that can be adapted for the configuration management repositories of any U.S. engine company.

  3. BamTools: a C++ API and toolkit for analyzing and managing BAM files.

    PubMed

    Barnett, Derek W; Garrison, Erik K; Quinlan, Aaron R; Strömberg, Michael P; Marth, Gabor T

    2011-06-15

    Analysis of genomic sequencing data requires efficient, easy-to-use access to alignment results and flexible data management tools (e.g. filtering, merging, sorting, etc.). However, the enormous amount of data produced by current sequencing technologies is typically stored in compressed, binary formats that are not easily handled by the text-based parsers commonly used in bioinformatics research. We introduce a software suite for programmers and end users that facilitates research analysis and data management using BAM files. BamTools provides both the first publicly available C++ API for BAM file support and a command-line toolkit. BamTools was written in C++, and is supported on Linux, Mac OSX and MS Windows. Source code and documentation are freely available at http://github.org/pezmaster31/bamtools.
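
    For illustration, the kind of filter-then-sort workflow this record describes can be sketched in Python with pysam (a different library; BamTools itself exposes a C++ API and a command-line toolkit). The file names and quality threshold are hypothetical:

        import pysam

        # Keep only mapped reads with mapping quality >= 30
        with pysam.AlignmentFile("sample.bam", "rb") as bam, \
             pysam.AlignmentFile("filtered.bam", "wb", template=bam) as out:
            for read in bam:
                if not read.is_unmapped and read.mapping_quality >= 30:
                    out.write(read)

        pysam.sort("-o", "filtered.sorted.bam", "filtered.bam")  # coordinate sort
        pysam.index("filtered.sorted.bam")                       # enable random access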

  4. Data Storage and Transfer | High-Performance Computing | NREL

    Science.gov Websites

    High-Performance Computing (HPC) systems. WinSCP for Windows File Transfers: use to transfer files from a local computer to a remote computer. Robinhood for File Management: use this tool to manage your data files on Peregrine. Best

  5. BamTools: a C++ API and toolkit for analyzing and managing BAM files

    PubMed Central

    Barnett, Derek W.; Garrison, Erik K.; Quinlan, Aaron R.; Strömberg, Michael P.; Marth, Gabor T.

    2011-01-01

    Motivation: Analysis of genomic sequencing data requires efficient, easy-to-use access to alignment results and flexible data management tools (e.g. filtering, merging, sorting, etc.). However, the enormous amount of data produced by current sequencing technologies is typically stored in compressed, binary formats that are not easily handled by the text-based parsers commonly used in bioinformatics research. Results: We introduce a software suite for programmers and end users that facilitates research analysis and data management using BAM files. BamTools provides both the first publicly available C++ API for BAM file support and a command-line toolkit. Availability: BamTools was written in C++, and is supported on Linux, Mac OSX and MS Windows. Source code and documentation are freely available at http://github.org/pezmaster31/bamtools. Contact: barnetde@bc.edu PMID:21493652

  6. 77 FR 72894 - Self-Regulatory Organizations; BATS Exchange, Inc.; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-06

    ... Change To Expand the Availability of Risk Management Tools November 30, 2012. Pursuant to Section 19(b)(1... Exchange's Risk Management Tool (the ``Tool'') to all Exchange Members.\\3\\ The Tool is currently available... Access Risk Management Tool.\\6\\ This optional service acts as a risk filter by causing the orders of...

  7. FITSManager: Management of Personal Astronomical Data

    NASA Astrophysics Data System (ADS)

    Cui, Chenzhou; Fan, Dongwei; Zhao, Yongheng; Kembhavi, Ajit; He, Boliang; Cao, Zihuang; Li, Jian; Nandrekar, Deoyani

    2011-07-01

    With the increase of personal storage capacity, it is easy to find hundreds to thousands of FITS files on the personal computer of an astrophysicist. Because the Flexible Image Transport System (FITS) is a professional data format initiated by astronomers and used mainly within that small community, data management toolkits for FITS files are scarce. Astronomers need a powerful tool to help them manage their local astronomical data. Although the Virtual Observatory (VO) is a network-oriented astronomical research environment, its applications and related technologies provide useful solutions to enhance the management and utilization of astronomical data hosted on an astronomer's personal computer. FITSManager is such a tool, providing astronomers with efficient management and utilization of their local data and bringing the VO to astronomers in a seamless and transparent way. FITSManager offers a rich set of functions for FITS file management, such as thumbnails, previews, type-dependent icons, header keyword indexing and search, and integration with other tools and online services. The development of FITSManager is an effort to fill the gap between the management and analysis of astronomical data.
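
    The header keyword indexing and search that FITSManager offers can be approximated in a few lines with astropy.io.fits (an assumption; FITSManager is its own tool). The folder and keyword choices here are illustrative:

        from pathlib import Path
        from astropy.io import fits

        def index_headers(folder, keys=("OBJECT", "DATE-OBS", "TELESCOP")):
            # Map each FITS file name to the header keywords of interest
            index = {}
            for path in Path(folder).glob("*.fits"):
                with fits.open(path) as hdul:
                    header = hdul[0].header
                    index[path.name] = {k: header.get(k) for k in keys}
            return index

        for name, meta in index_headers("data").items():
            print(name, meta)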

  8. 77 FR 72902 - Self-Regulatory Organizations; BATS Y-Exchange, Inc.; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-06

    ... Rule Change To Expand the Availability of Risk Management Tools November 30, 2012. Pursuant to Section... availability of a Risk Management Tool (the ``Tool'') currently made available in connection with sponsored... Sponsored Access Risk Management Tool.\\6\\ This optional service acts as a risk filter by causing the orders...

  9. Registered File Support for Critical Operations Files at (Space Infrared Telescope Facility) SIRTF

    NASA Technical Reports Server (NTRS)

    Turek, G.; Handley, Tom; Jacobson, J.; Rector, J.

    2001-01-01

    The SIRTF Science Center's (SSC) Science Operations System (SOS) has to contend with nearly one hundred critical operations files via comprehensive file management services. The management is accomplished via the registered file system (otherwise known as TFS), which manages these files in a registered file repository composed of a virtual file system, accessible via a TFS server, and a file registration database. The TFS server provides controlled, reliable, and secure file transfer and storage by registering all file transactions and metadata in the file registration database. An API is provided for application programs to communicate with TFS servers and the repository. A command-line client implementing this API has been developed as a client tool. This paper describes the architecture and current implementation and, more importantly, the evolution of these services based on evolving community use cases and emerging information system technology.

  10. QuakeSim Project Networking

    NASA Astrophysics Data System (ADS)

    Kong, D.; Donnellan, A.; Pierce, M. E.

    2012-12-01

    QuakeSim is an online computational framework focused on using remotely sensed geodetic imaging data to model and understand earthquakes. With the rise in online social networking over the last decade, many tools and concepts have been developed that are useful to research groups. In particular, QuakeSim is interested in the ability for researchers to post, share, and annotate files generated by modeling tools in order to facilitate collaboration. To accomplish this, features were added to the preexisting QuakeSim site that include single sign-on, automated saving of output from modeling tools, and a personal user space to manage sharing permissions on these saved files. These features implement OpenID and Lightweight Directory Access Protocol (LDAP) technologies to manage files across several different servers, including a web server running Drupal and other servers hosting the computational tools themselves.

  11. Design and development of an interactive medical teleconsultation system over the World Wide Web.

    PubMed

    Bai, J; Zhang, Y; Dai, B

    1998-06-01

    The objective of the medical teleconsultation system presented in this paper is to demonstrate the use of the World Wide Web (WWW) for telemedicine and interactive medical information exchange. The system, developed in Java, provides several basic Java tools to fulfill the requirements of medical applications, including a file manager, data tool, bulletin board, and digital audio tool. The digital audio tool uses a point-to-point structure to enable two physicians to communicate directly through voice; the others use a multipoint structure. The file manager manages the medical images stored in the WWW information server, which come from a hospital database. The data tool supports cooperative operations on the medical data between the participating physicians. The bulletin board enables the users to discuss special cases by writing text on the board, to send their personal or group diagnostic reports on the cases, and to reorganize the reports and store them in its report file for later use. The system provides a hardware-independent platform for physicians to interact with one another as well as to access medical information over the WWW.

  12. 78 FR 79044 - Self-Regulatory Organizations; New York Stock Exchange LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-27

    ... Proposed Rule Change to Offer Risk Management Tools Designed to Allow Member Organizations to Monitor and... of the Proposed Rule Change The Exchange proposes to offer risk management tools designed to allow... risk management tools designed to allow member organizations to monitor and address exposure to risk...

  13. Filtering NetCDF Files by Using the EverVIEW Slice and Dice Tool

    USGS Publications Warehouse

    Conzelmann, Craig; Romañach, Stephanie S.

    2010-01-01

    Network Common Data Form (NetCDF) is a self-describing, machine-independent file format for storing array-oriented scientific data. It was created to provide a common interface between applications and real-time meteorological and other scientific data. Over the past few years, there has been a growing movement within the community of natural resource managers in The Everglades, Fla., to use NetCDF as the standard data container for datasets based on multidimensional arrays. As a consequence, a need surfaced for additional tools to view and manipulate NetCDF datasets, specifically to filter the files by creating subsets of large NetCDF files. The U.S. Geological Survey (USGS) and the Joint Ecosystem Modeling (JEM) group are working to address these needs with applications like the EverVIEW Slice and Dice Tool, which allows users to filter grid-based NetCDF files, thus targeting the data most important to them. The major functions of this tool are as follows: (1) to create subsets of NetCDF files temporally, spatially, and by data value; (2) to view the NetCDF data in table form; and (3) to export the filtered data to a comma-separated value (CSV) file format. The USGS and JEM will continue to work with scientists and natural resource managers across The Everglades to solve complex restoration problems through technological advances.
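
    The three functions listed above map naturally onto an xarray workflow; here is a sketch under assumed file, variable, and coordinate names (the actual tool is a standalone application):

        import xarray as xr

        ds = xr.open_dataset("everglades_stage.nc")
        subset = ds["stage"].sel(
            time=slice("2009-01-01", "2009-12-31"),         # temporal subset
            lat=slice(25.0, 26.5), lon=slice(-81.5, -80.0)  # spatial subset
        )
        subset = subset.where(subset > 0.5)                 # filter by data value
        subset.to_dataframe().to_csv("stage_subset.csv")    # export to CSV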

  14. Tools for Requirements Management: A Comparison of Telelogic DOORS and the HiVe

    DTIC Science & Technology

    2006-07-01

    types DOORS deals with are text files, spreadsheets, FrameMaker, rich text, Microsoft Word and Microsoft Project. 2.5.1 Predefined file formats DOORS...during the export. DOORS exports FrameMaker files in an incomplete format, meaning DOORS exported files will have to be opened in FrameMaker and saved

  15. XTCE GOVSAT Tool Suite 1.0

    NASA Technical Reports Server (NTRS)

    Rice, J. Kevin

    2013-01-01

    The XTCE GOVSAT software suite contains three tools: validation, search, and reporting. The Extensible Markup Language (XML) Telemetric and Command Exchange (XTCE) GOVSAT Tool Suite is written in Java for manipulating XTCE XML files. XTCE is a Consultative Committee for Space Data Systems (CCSDS) and Object Management Group (OMG) specification for describing the format and information in telemetry and command packet streams. These descriptions are files that are used to configure real-time telemetry and command systems for mission operations. XTCE's purpose is to exchange database information between different systems. XTCE GOVSAT consists of rules for narrowing the use of XTCE for missions. The Validation Tool is used to syntax-check GOVSAT XML files. The Search Tool is used to search the GOVSAT XML files (i.e., for command and telemetry mnemonics) and view the results. Finally, the Reporting Tool is used to create command and telemetry reports. These reports can be displayed or printed for use by the operations team.
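
    Syntax checking an XML file against a schema, as the Validation Tool does, can be sketched with lxml; the schema and file names below are hypothetical, and XTCE GOVSAT adds mission-specific rules beyond plain schema validation:

        from lxml import etree

        schema = etree.XMLSchema(etree.parse("SpaceSystem.xsd"))
        doc = etree.parse("govsat_database.xml")
        if schema.validate(doc):
            print("XTCE file is valid")
        else:
            for err in schema.error_log:        # report each violation
                print(f"line {err.line}: {err.message}")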

  16. Users' Manual and Installation Guide for the EverVIEW Slice and Dice Tool (Version 1.0 Beta)

    USGS Publications Warehouse

    Roszell, Dustin; Conzelmann, Craig; Chimmula, Sumani; Chandrasekaran, Anuradha; Hunnicut, Christina

    2009-01-01

    Network Common Data Form (NetCDF) is a self-describing, machine-independent file format for storing array-oriented scientific data. Over the past few years, there has been a growing movement within the community of natural resource managers in The Everglades, Fla., to use NetCDF as the standard data container for datasets based on multidimensional arrays. As a consequence, a need arose for additional tools to view and manipulate NetCDF datasets, specifically to create subsets of large NetCDF files. To address this need, we created the EverVIEW Slice and Dice Tool to allow users to create subsets of grid-based NetCDF files. The major functions of this tool are (1) to subset NetCDF files both spatially and temporally; (2) to view the NetCDF data in table form; and (3) to export filtered data to a comma-separated value file format.

  17. The consistency service of the ATLAS Distributed Data Management system

    NASA Astrophysics Data System (ADS)

    Serfon, Cédric; Garonne, Vincent; ATLAS Collaboration

    2011-12-01

    With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data loss due to software and hardware failures is increasing. In order to ensure the consistency of all data produced by ATLAS, a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools, DQ2 site services, or by site administrators that report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.
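
    At its core, such a service compares each stored file against the checksum recorded in a catalogue and flags mismatches or missing replicas. Here is a toy sketch of that check (the catalogue layout is invented; DQ2 itself is far more elaborate):

        import hashlib
        from pathlib import Path

        def md5sum(path, chunk=1 << 20):
            h = hashlib.md5()
            with open(path, "rb") as f:
                while block := f.read(chunk):
                    h.update(block)
            return h.hexdigest()

        catalogue = {"data/run1.root": "9e107d9d372bb6826bd81d3542a419d6"}
        for name, expected in catalogue.items():
            p = Path(name)
            if not p.exists():
                print(f"LOST: {name}")          # irrecoverable unless replicated
            elif md5sum(p) != expected:
                print(f"CORRUPT: {name}")       # candidate for re-transfer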

  18. An Observation Analysis Tool for time-series analysis and sensor management in the FREEWAT GIS environment for water resources management

    NASA Astrophysics Data System (ADS)

    Cannata, Massimiliano; Neumann, Jakob; Cardoso, Mirko; Rossetto, Rudy; Foglia, Laura; Borsi, Iacopo

    2017-04-01

    In situ time-series are an important aspect of environmental modelling, especially with the advancement of numerical simulation techniques and increased model complexity. In order to make use of the increasing data available through the requirements of the EU Water Framework Directive, the FREEWAT GIS environment incorporates the newly developed Observation Analysis Tool (OAT) for time-series analysis. The tool is used to import time-series data into QGIS from local CSV files, online sensors using the istSOS service, or MODFLOW model result files, and it enables visualisation, pre-processing of data for model development, and post-processing of model results. OAT can be used as a pre-processor for calibration observations, integrating the creation of observations for calibration directly from sensor time-series. The tool consists of an expandable Python library of processing methods and an interface integrated in the QGIS FREEWAT plug-in, which includes a large number of modelling capabilities, data management tools, and calibration capacity.
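
    The CSV import and pre-processing step OAT performs can be sketched with pandas; the column names and the daily resampling rule are assumptions:

        import pandas as pd

        ts = pd.read_csv("well_levels.csv", parse_dates=["timestamp"],
                         index_col="timestamp")["level_m"]
        daily = ts.resample("D").mean()     # aggregate to model time steps
        print(daily.describe())             # quick look before calibration use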

  19. SPI/U3.1. Security Profile Inspector for UNIX Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartoletti, T.

    SPI/U3.1 consists of five tools used to assess and report the security posture of computers running the UNIX operating system. The tools are: Access Control Test: A rule-based system which identifies sequential dependencies in UNIX access controls. Binary Inspector Tool: Evaluates the release status of system binaries by comparing a crypto-checksum to provided table entries. Change Detection Tool: Maintains and applies a snapshot of critical system files and attributes for purposes of change detection. Configuration Query Language: Accepts CQL-based scripts (provided) to evaluate queries over the status of system files, configuration of services, and many other elements of UNIX system security. Password Security Inspector: Tests for weak or aged passwords. The tools are packaged with a forms-based user interface providing on-line context-sensitive help, job scheduling, parameter management, and output report management utilities. Tools may be run independently of the UI.

  20. SPI/U3.2. Security Profile Inspector for UNIX Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartoletti, Tony

    SPI/U3.2 consists of five tools used to assess and report the security posture of computers running the UNIX operating system. The tools are: Access Control Test: A rule-based system which identifies sequential dependencies in UNIX access controls. Binary Authentication Tool: Evaluates the release status of system binaries by comparing a crypto-checksum to provided table entries. Change Detection Tool: Maintains and applies a snapshot of critical system files and attributes for purposes of change detection. Configuration Query Language: Accepts CQL-based scripts (provided) to evaluate queries over the status of system files, configuration of services, and many other elements of UNIX system security. Password Security Inspector: Tests for weak or aged passwords. The tools are packaged with a forms-based user interface providing on-line context-sensitive help, job scheduling, parameter management, and output report management utilities. Tools may be run independently of the UI.

  1. SPI/U3.2. Security Profile Inspector for UNIX Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartoletti, A.

    1994-08-01

    SPI/U3.2 consists of five tools used to assess and report the security posture of computers running the UNIX operating system. The tools are: Access Control Test: A rule-based system which identifies sequential dependencies in UNIX access controls. Binary Authentication Tool: Evaluates the release status of system binaries by comparing a crypto-checksum to provided table entries. Change Detection Tool: Maintains and applies a snapshot of critical system files and attributes for purposes of change detection. Configuration Query Language: Accepts CQL-based scripts (provided) to evaluate queries over the status of system files, configuration of services, and many other elements of UNIX system security. Password Security Inspector: Tests for weak or aged passwords. The tools are packaged with a forms-based user interface providing on-line context-sensitive help, job scheduling, parameter management, and output report management utilities. Tools may be run independently of the UI.

  2. Open source tools for management and archiving of digital microscopy data to allow integration with patient pathology and treatment information.

    PubMed

    Khushi, Matloob; Edwards, Georgina; de Marcos, Diego Alonso; Carpenter, Jane E; Graham, J Dinny; Clarke, Christine L

    2013-02-12

    Virtual microscopy includes digitisation of histology slides and the use of computer technologies for complex investigation of diseases such as cancer. However, automated image analysis, or website publishing of such digital images, is hampered by their large file sizes. We have developed two Java-based open source tools: Snapshot Creator and NDPI-Splitter. Snapshot Creator converts a portion of a large digital slide into a JPEG image of the desired quality. The image is linked to the patient's clinical and treatment information in a customised open source cancer data management software (Caisis) in use at the Australian Breast Cancer Tissue Bank (ABCTB) and then published on the ABCTB website (http://www.abctb.org.au) using Deep Zoom open source technology. Using the ABCTB online search engine, digital images can be searched by defining various criteria such as cancer type, or biomarkers expressed. NDPI-Splitter splits a large image file into smaller sections of TIFF images so that they can be easily analysed by image analysis software such as Metamorph or Matlab. NDPI-Splitter also has the capacity to filter out empty images. Snapshot Creator and NDPI-Splitter are novel open source Java tools. They convert digital slides into files of smaller size for further processing. In conjunction with other open source tools such as Deep Zoom and Caisis, this suite of tools is used for the management and archiving of digital microscopy images, enabling digitised images to be explored and zoomed online. Our online image repository also has the capacity to be used as a teaching resource. These tools also enable large files to be sectioned for image analysis. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5330903258483934.

  3. Digital surveying and mapping of forest road network for development of a GIS tool for the effective protection and management of natural ecosystems

    NASA Astrophysics Data System (ADS)

    Drosos, Vasileios C.; Liampas, Sarantis-Aggelos G.; Doukas, Aristotelis-Kosmas G.

    2014-08-01

    Geographic Information Systems (GIS) have become important tools, not only in the geosciences and environmental sciences but in virtually all research that requires monitoring, planning, or land management. The purpose of this paper was to develop a planning and decision-making tool using AutoCAD Map, ArcGIS, and Google Earth, with emphasis on investigating the suitability of forest-road mapping and the range of its implementation in Greece at the prefecture level. Integrating spatial information into a database makes data available throughout the organization, improving quality, productivity, and data management. Working in such an environment, users can access and edit information, integrate and analyze data, and communicate effectively; they can also select desired information, such as the forest road network, at a very early stage in the planning of silviculture operations, for example before harvest planning is carried out. The software programs used were AutoCAD Map for exporting the GPS data to shapefiles, ArcGIS for shapefiles (ArcGlobe), and Google Earth with KML (Keyhole Markup Language) files, in order to better visualize and evaluate existing conditions, design in a real-world context, and exchange information with government agencies, utilities, and contractors in both CAD and GIS data formats. The automation of the updating procedure and the transfer of files between agencies and departments is one of the main tasks the integrated GIS tool should address.

  4. File format for normalizing radiological concentration, exposure rate, and dose rate data for the effects of radioactive decay and weathering processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kraus, Terrence D.

    2017-04-01

    This report specifies the electronic file format that was agreed upon for normalized radiological data produced by the software tool developed under this TI project. The NA-84 Technology Integration (TI) Program project (SNL17-CM-635, Normalizing Radiological Data for Analysis and Integration into Models) investigators held a teleconference on December 7, 2017 to discuss the tasks to be completed under the TI program project. During this teleconference, the TI project investigators determined that the comma-separated values (CSV) file format is the most suitable file format for the normalized radiological data that will be output from the normalizing tool developed under this TI project. The CSV file format was selected because it provides the requisite flexibility to manage different types of radiological data (i.e., activity concentration, exposure rate, dose rate) from other sources [e.g., Radiological Assessment and Monitoring System (RAMS), Aerial Measuring System (AMS), Monitoring and Sampling]. The CSV file format is also suitable for the normalized radiological data because the normalized data can then be ingested by other software [e.g., RAMS, Visual Sampling Plan (VSP)] used by NA-84's Consequence Management Program.
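
    Emitting such normalized records as CSV is straightforward with Python's csv module; the column set below is purely illustrative, not the project's agreed format:

        import csv

        rows = [
            {"latitude": 35.05, "longitude": -106.54, "quantity": "exposure_rate",
             "value": 1.2e-2, "units": "mR/h", "reference_time": "2017-04-01T00:00Z"},
        ]
        with open("normalized.csv", "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=rows[0].keys())
            writer.writeheader()
            writer.writerows(rows)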

  5. Interactive visualization tools for the structural biologist.

    PubMed

    Porebski, Benjamin T; Ho, Bosco K; Buckle, Ashley M

    2013-10-01

    In structural biology, management of a large number of Protein Data Bank (PDB) files and raw X-ray diffraction images often presents a major organizational problem. Existing software packages that manipulate these file types were not designed for these kinds of file-management tasks. This is typically encountered when browsing through a folder of hundreds of X-ray images, with the aim of rapidly inspecting the diffraction quality of a data set. To solve this problem, a useful functionality of the Macintosh operating system (OSX) has been exploited that allows custom visualization plugins to be attached to certain file types. Software plugins have been developed for diffraction images and PDB files, which in many scenarios can save considerable time and effort. The direct visualization of diffraction images and PDB structures in the file browser can be used to identify key files of interest simply by scrolling through a list of files.

  6. Data Management for Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Snyder, Joseph F.; Smyth, David E.

    2004-01-01

    Data Management for the Mars Exploration Rovers (MER) project is a comprehensive system addressing the needs of development, test, and operations phases of the mission. During development of flight software, including the science software, the data management system can be simulated using any POSIX file system. During testing, the on-board file system can be bit compared with files on the ground to verify proper behavior and end-to-end data flows. During mission operations, end-to-end accountability of data products is supported, from science observation concept to data products within the permanent ground repository. Automated and human-in-the-loop ground tools allow decisions regarding retransmitting, re-prioritizing, and deleting data products to be made using higher level information than is available to a protocol-stack approach such as the CCSDS File Delivery Protocol (CFDP).
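
    The bit-for-bit comparison of on-board files with ground copies can be illustrated with the Python standard library; the paths here are hypothetical:

        import filecmp

        # shallow=False forces a byte-by-byte comparison, not just stat() checks
        same = filecmp.cmp("onboard/image_001.dat", "ground/image_001.dat",
                           shallow=False)
        print("bit-identical" if same else "differs")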

  7. Integrated Baseline System (IBS) Data Management Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bower, J.C.; Burford, M.J.; Downing, T.R.

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool that is being developed under the direction of the US Army Nuclear and Chemical Agency (USANCA). The IBS Data Management Guide provides the background, as well as the operations and procedures, needed to generate and maintain a site-specific map database. Data and system managers use this guide to manage the data files and database that support the administrative, user-environment, database management, and operational capabilities of the IBS. This document provides a description of the data files and structures necessary for running the IBS software and using the site map database.

  8. Web Audio/Video Streaming Tool

    NASA Technical Reports Server (NTRS)

    Guruvadoo, Eranna K.

    2003-01-01

    In order to promote a NASA-wide educational outreach program to educate and inform the public about space exploration, NASA, at Kennedy Space Center, is seeking efficient ways to add more content to the web by streaming audio/video files. This project proposes a high-level overview of a framework for the creation, management, and scheduling of audio/video assets over the web. To support short-term goals, the prototype of a web-based tool was designed and demonstrated to automate the process of streaming audio/video files. The tool provides web-enabled user interfaces to manage video assets, create publishable schedules of video assets for streaming, and schedule the streaming events. These operations are performed on user-defined and system-derived metadata of audio/video assets stored in a relational database, while the assets reside in a separate repository. The prototype tool was designed using ColdFusion 5.0.

  9. Meteorological Sensor Array (MSA)-Phase I. Volume 2 (Data Management Tool: Proof of Concept)

    DTIC Science & Technology

    2014-10-01

    '*** directory of next hourly file to read ***
    utcString = CStr(CInt(utcString) + 1)
    utcString = String(2 - Len(utcString), "0") & utcString
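
    The reconstructed Visual Basic fragment above advances an hourly UTC label and zero-pads it to two digits to form the next file name; the same step in Python, for clarity:

        utc = "07"
        utc = str(int(utc) + 1).zfill(2)    # "08": label of the next hourly file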

  10. High-performance metadata indexing and search in petascale data storage systems

    NASA Astrophysics Data System (ADS)

    Leung, A. W.; Shao, M.; Bisson, T.; Pasupathy, S.; Miller, E. L.

    2008-07-01

    Large-scale storage systems used for scientific applications can store petabytes of data and billions of files, making the organization and management of data in these systems a difficult, time-consuming task. The ability to search file metadata in a storage system can address this problem by allowing scientists to quickly navigate experiment data and code while allowing storage administrators to gather the information they need to properly manage the system. In this paper, we present Spyglass, a file metadata search system that achieves scalability by exploiting storage system properties, providing the scalability that existing file metadata search tools lack. In doing so, Spyglass can achieve search performance up to several thousand times faster than existing database solutions. We show that Spyglass enables important functionality that can aid data management for scientists and storage administrators.
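
    The underlying idea, harvesting file metadata into an indexed store that can be queried, can be shown in miniature with sqlite3 (Spyglass itself uses purpose-built structures tuned for petascale systems):

        import sqlite3
        from pathlib import Path

        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE meta(path TEXT, size INTEGER, mtime REAL)")
        for p in Path(".").rglob("*"):
            if p.is_file():
                st = p.stat()
                db.execute("INSERT INTO meta VALUES (?, ?, ?)",
                           (str(p), st.st_size, st.st_mtime))
        # Example query: the five largest files over 1 MiB
        rows = db.execute("SELECT path, size FROM meta WHERE size > ? "
                          "ORDER BY size DESC LIMIT 5", (1 << 20,))
        for path, size in rows:
            print(f"{size:>12} {path}")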

  11. Open source tools for management and archiving of digital microscopy data to allow integration with patient pathology and treatment information

    PubMed Central

    2013-01-01

    Background Virtual microscopy includes digitisation of histology slides and the use of computer technologies for complex investigation of diseases such as cancer. However, automated image analysis, or website publishing of such digital images, is hampered by their large file sizes. Results We have developed two Java-based open source tools: Snapshot Creator and NDPI-Splitter. Snapshot Creator converts a portion of a large digital slide into a JPEG image of the desired quality. The image is linked to the patient's clinical and treatment information in a customised open source cancer data management software (Caisis) in use at the Australian Breast Cancer Tissue Bank (ABCTB) and then published on the ABCTB website (http://www.abctb.org.au) using Deep Zoom open source technology. Using the ABCTB online search engine, digital images can be searched by defining various criteria such as cancer type, or biomarkers expressed. NDPI-Splitter splits a large image file into smaller sections of TIFF images so that they can be easily analysed by image analysis software such as Metamorph or Matlab. NDPI-Splitter also has the capacity to filter out empty images. Conclusions Snapshot Creator and NDPI-Splitter are novel open source Java tools. They convert digital slides into files of smaller size for further processing. In conjunction with other open source tools such as Deep Zoom and Caisis, this suite of tools is used for the management and archiving of digital microscopy images, enabling digitised images to be explored and zoomed online. Our online image repository also has the capacity to be used as a teaching resource. These tools also enable large files to be sectioned for image analysis. Virtual Slides The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5330903258483934 PMID:23402499

  12. A Flexible Online Metadata Editing and Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aguilar, Raul; Pan, Jerry Yun; Gries, Corinna

    2010-01-01

    A metadata editing and management system is being developed employing state-of-the-art XML technologies. A modular and distributed design was chosen for scalability, flexibility, options for customization, and the possibility of adding more functionality at a later stage. The system consists of a desktop design tool, or schema walker, used to generate code for the actual online editor; a native XML database; and an online user access management application. The design tool is a Java Swing application that reads an XML schema and provides the designer with options to combine input fields into online forms and give the fields user-friendly tags. Based on design decisions, the tool generates code for the online metadata editor. The code generated is an implementation of the XForms standard using the Orbeon Framework. The design tool fulfills two requirements: first, data entry forms based on one schema may be customized at design time; and second, data entry applications may be generated for any valid XML schema without relying on custom information in the schema. However, the customized information generated at design time is saved in a configuration file which may be re-used and changed again in the design tool. Future developments will add functionality to the design tool to integrate help text, tool tips, project-specific keyword lists, and thesaurus services. Additional styling of the finished editor is accomplished via cascading style sheets, which may be further customized, and different look-and-feels may be accumulated through the community process. The customized editor produces XML files in compliance with the original schema; however, data from the current page are saved into a native XML database whenever the user moves to the next screen or pushes the save button, independently of validity. Currently the system uses the open source XML database eXist for storage and management, which comes with third-party online and desktop management tools. However, access to metadata files in the application introduced here is managed in a custom online module, using a MySQL backend accessed by a simple JavaServer Faces front end. A flexible system with three grouping options (organization, group, and single editing access) is provided. Three levels were chosen to distribute administrative responsibilities and handle the common situation of an information manager entering the bulk of the metadata while leaving specifics to the actual data provider.

  13. Knob manager (KM) operators guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1993-10-08

    KM, the Knob Manager, is a tool which enables the user to use the SUNDIALS knob box to adjust the settings of the control system. Features of KM include: dynamic knob assignment with a user-friendly interface; user-defined gain for each knob; graphical displays of the operating range and status of each assigned process variable; backup and restore of one or more process variables; and saving current settings to a file so that the settings can be recalled from that file in the future.
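
    The save-and-recall feature amounts to snapshotting process variable settings to a file; a toy sketch with invented process variable names:

        import json

        settings = {"QUAD1:current": 12.5, "BPM3:gain": 2.0}  # hypothetical PVs
        with open("knob_snapshot.json", "w") as f:
            json.dump(settings, f)                            # save settings
        with open("knob_snapshot.json") as f:
            restored = json.load(f)                           # recall later
        assert restored == settings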

  14. EVALUATING HYDROLOGICAL RESPONSE TO ...

    EPA Pesticide Factsheets

    Studies of future management and policy options based on different assumptions provide a mechanism to examine possible outcomes and especially their likely benefits or consequences. Planning and assessment in land and water resource management are evolving toward complex, spatially explicit regional assessments. These problems have to be addressed with distributed models that can compute runoff and erosion at different spatial and temporal scales. The extensive data requirements and the difficult task of building input parameter files, however, have long been an obstacle to the timely and cost-effective use of such complex models by resource managers. The U.S. EPA Landscape Ecology Branch, in collaboration with the USDA-ARS Southwest Watershed Research Center, has developed a geographic information system (GIS) tool to facilitate this process. A GIS provides the framework within which spatially distributed data are collected and used to prepare model input files, and model results are evaluated. The Automated Geospatial Watershed Assessment (AGWA) tool uses widely available standardized spatial datasets that can be obtained via the internet at no cost to the user. The data are used to develop input parameter files for KINEROS2 and SWAT, two watershed runoff and erosion simulation models that operate at different spatial and temporal scales. AGWA automates the process of transforming digital data into simulation model results and provides a visualization tool.

  15. Oracle Applications Patch Administration Tool (PAT) Beta Version

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2002-01-04

    PAT is a Patch Administration Tool that provides analysis, tracking, and management of Oracle Application patches. Patch data maintenance tracks which Oracle Application patches have been applied to which database instance and machine. Patch analysis captures text files (readme.txt and driver files); produces comparison detail for forms, reports, PL/SQL packages, SQL scripts, and JSP modules; parses and loads the current applptch.txt (10.7) or loads patch data from the Oracle Application database patch tables (11i); displays an analysis comparing the patch to be applied with the Appl_top code versions currently installed in the Oracle Application; and can analyze and display a single Oracle Application module patch. Patch management provides automatic queuing and execution of patches, with administration features for parameter maintenance (settings for the directory structure of the Oracle Application appl_top) and validation data maintenance (machine names and instances to patch). In operation, users can schedule a patch (queue for later execution), run a patch (queue for immediate execution), review the patch logs, and produce patch management reports.

  16. SIOExplorer: Modern IT Methods and Tools for Digital Library Management

    NASA Astrophysics Data System (ADS)

    Sutton, D. W.; Helly, J.; Miller, S.; Chase, A.; Clarck, D.

    2003-12-01

    With more geoscience disciplines becoming data-driven, it is increasingly important to utilize modern techniques for data, information, and knowledge management. SIOExplorer is a new digital library project with 2 terabytes of oceanographic data collected over the last 50 years on 700 cruises by the Scripps Institution of Oceanography. It is built using a suite of information technology tools and methods that allow for an efficient and effective digital library management system. The library consists of a number of independent collections, each with corresponding metadata formats. The system architecture allows each collection to be built and uploaded based on a collection-dependent metadata template file (MTF). This file is used to create the hierarchical structure of the collection, create metadata tables in a relational database, and populate object metadata files and the collection as a whole. Collections are comprised of arbitrary digital objects stored at the San Diego Supercomputer Center (SDSC) High Performance Storage System (HPSS) and managed using the Storage Resource Broker (SRB), data-handling middleware developed at SDSC. SIOExplorer interoperates with other collections as a data provider through the Open Archives Initiative (OAI) protocol. The user services for SIOExplorer are accessed from CruiseViewer, a Java application served using Java Web Start from the SIOExplorer home page. CruiseViewer is an advanced tool for data discovery and access. It implements general keyword and interactive geospatial search methods for the collections. It uses a basemap to georeference search results on user-selected basemaps such as global topography or crustal age. User services include metadata viewing, opening digital objects of selected MIME types (such as images, documents, and grid files), and downloading of objects (including the brokering of proprietary hold restrictions).

  17. The connectome viewer toolkit: an open source framework to manage, analyze, and visualize connectomes.

    PubMed

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit - a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/

  18. The Connectome Viewer Toolkit: An Open Source Framework to Manage, Analyze, and Visualize Connectomes

    PubMed Central

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit – a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/ PMID:21713110

  19. Managing Laboratory Data Using Cloud Computing as an Organizational Tool

    ERIC Educational Resources Information Center

    Bennett, Jacqueline; Pence, Harry E.

    2011-01-01

    One of the most significant difficulties encountered when directing undergraduate research and developing new laboratory experiments is how to efficiently manage the data generated by a number of students. Cloud computing, where both software and computer files reside online, offers a solution to this data-management problem and allows researchers…

  20. Construction of a Distributed-network Digital Watershed Management System with B/S Techniques

    NASA Astrophysics Data System (ADS)

    Zhang, W. C.; Liu, Y. M.; Fang, J.

    2017-07-01

    Integrated watershed assessment tools for supporting land management and hydrologic research are becoming established in both basic and applied research. The core of these tools is mainly spatially distributed hydrologic models, as they provide a mechanism for investigating interactions among climate, topography, vegetation, and soil. However, the extensive data requirements and the difficult task of building input parameter files for driving these distributed models have long been an obstacle to the timely and cost-effective use of such complex models by watershed managers and policy-makers. Recently, a web-based geographic information system (GIS) tool to facilitate this process has been developed for the large watersheds of the Jinghe and Weihe catchments, located in the loess plateau of the Huanghe River basin in north-western China. The web-based GIS provides the framework within which spatially distributed data are collected and used to prepare model input files for these two watersheds and to evaluate model results, as well as the various clients for watershed information inquiry, visualization, and assessment analysis. This Web-based Automated Geospatial Watershed Assessment GIS (WAGWA-GIS) tool uses widely available standardized spatial datasets, held in an Oracle databank designed in association with the MapGuide platform, to develop input parameter files for online simulation at different spatial and temporal scales with the Xing'anjiang and TOPMODEL models integrated into the web-based digital watershed. WAGWA-GIS automates the process of transforming digital data, including remote sensing data, DEMs, land use/cover and soil digital maps, meteorological and hydrological station geo-location maps, and text files containing meteorological and hydrological data obtained from stations of the watershed, into hydrological models for online simulation and geo-spatial analysis, and it provides a visualization tool to help the user interpret results. The utility of WAGWA-GIS in joint hydrologic and ecological investigations has been demonstrated on such diverse landscapes as the Jinghe and Weihe watersheds, and it will be extended to other watersheds in China step by step in coming years.

  1. An Analysis of the INGRES Database Management System Applications Program Development Tools and Programming Environment

    DTIC Science & Technology

    1986-12-01

    Position cursor over the name of a report, then use the appropriate menu item to perform an operation on that report. Name Owner RBF? Last changed...LANGUAGE-INDEPENDENT, PORTABLE FILE ACCESS SYSTEM A MODEL FOR AUTOMATIC FILE AND PROGRAM DESIGN IN BUSINESS APPLICATION SYSTEM GENERALLY APPLICABLE...Article Description Year: 1988 Title: FLASH: A LANGUAGE-INDEPENDENT, PORTABLE FILE ACCESS SYSTEM Authors: ALLCHIN, J.E., KELLER, A.M., WIEDERHOLD, G.

  2. Sequence History Update Tool

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; DelGuercio, Chris

    2008-01-01

    The Sequence History Update Tool performs Web-based sequence statistics archiving for Mars Reconnaissance Orbiter (MRO). Using a single UNIX command, the software takes advantage of sequencing conventions to automatically extract the needed statistics from multiple files. This information is then used to populate a PHP database, which is then seamlessly formatted into a dynamic Web page. This tool replaces a tedious and error-prone process of manually editing HTML code to construct a Web-based table. Because the tool manages all of the statistics gathering and file delivery to and from multiple data sources spread across multiple servers, there is also a considerable savings of time and effort. With the Sequence History Update Tool, what previously took minutes is now done in less than 30 seconds, and the tool provides a more accurate archival record of the sequence commanding for MRO.

  3. Students' Acceptance of File Sharing Systems as a Tool for Sharing Course Materials: The Case of Google Drive

    ERIC Educational Resources Information Center

    Sadik, Alaa

    2017-01-01

    Students' perceptions about both ease of use and usefulness are fundamental factors in determining their acceptance and successful use of technology in higher education. File sharing systems are one of these technologies and can be used to manage and deliver course materials and coordinate virtual teams. The aim of this study is to explore how…

  4. Records Inventory Data Collection Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, Brian A.

    1995-03-01

    DATALINK was created to provide an easy-to-use data collection program for records management software products. It provides several useful tools for capturing and validating record index data in the field. It also allows users to easily create a comma-delimited ASCII text file for data export into most records management software products.

  5. Hotspot Patterns: The Formal Definition and Automatic Detection of Architecture Smells

    DTIC Science & Technology

    2015-01-15

    serious question for a project manager or architect: how to determine which parts of the code base should be given higher priority for maintenance and...services framework; Hadoop is a tool for distributed processing of large data sets; HBase is the Hadoop database; Ivy is a dependency management tool...answer this question more rigorously, we conducted Pearson Correlation Analysis to test the dependency between the number of issues a file involves

  6. How strong are passwords used to protect personal health information in clinical trials?

    PubMed

    El Emam, Khaled; Moreau, Katherine; Jonker, Elizabeth

    2011-02-11

    Findings and statements about how securely personal health information is managed in clinical research are mixed. The objective of our study was to evaluate the security of practices used to transfer and share sensitive files in clinical trials. Two studies were performed. First, 15 password-protected files that were transmitted by email during regulated Canadian clinical trials were obtained. Commercial password recovery tools were used on these files to try to crack their passwords. Second, interviews with 20 study coordinators were conducted to understand file-sharing practices in clinical trials for files containing personal health information. We were able to crack the passwords for 93% of the files (14/15). Among these, 13 files contained thousands of records with sensitive health information on trial participants. The passwords tended to be relatively weak, using common names of locations, animals, car brands, and obvious numeric sequences. Patient information is commonly shared by email in the context of query resolution. Files containing personal health information are shared by email and by posting them on shared drives with common passwords to facilitate collaboration. If files containing sensitive patient information must be transferred by email, mechanisms to encrypt them and to ensure that password strength is high are necessary. More sophisticated collaboration tools are required to allow file sharing without password sharing. We provide recommendations to implement these practices.
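
    A minimal illustration of the kind of password-strength rule the authors recommend; the length and character-class thresholds below are arbitrary choices, not the paper's:

        import re

        def is_strong(pw):
            # Require length >= 12 and at least three character classes
            classes = sum(bool(re.search(p, pw))
                          for p in (r"[a-z]", r"[A-Z]", r"\d", r"\W"))
            return len(pw) >= 12 and classes >= 3

        print(is_strong("Ottawa2011"))      # False: short, city name plus year
        print(is_strong("t5!vX#9qLm2@wA"))  # True: long, four classes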

  7. How Strong are Passwords Used to Protect Personal Health Information in Clinical Trials?

    PubMed Central

    Moreau, Katherine; Jonker, Elizabeth

    2011-01-01

    Background Findings and statements about how securely personal health information is managed in clinical research are mixed. Objective The objective of our study was to evaluate the security of practices used to transfer and share sensitive files in clinical trials. Methods Two studies were performed. First, 15 password-protected files that were transmitted by email during regulated Canadian clinical trials were obtained. Commercial password recovery tools were used on these files to try to crack their passwords. Second, interviews with 20 study coordinators were conducted to understand file-sharing practices in clinical trials for files containing personal health information. Results We were able to crack the passwords for 93% of the files (14/15). Among these, 13 files contained thousands of records with sensitive health information on trial participants. The passwords tended to be relatively weak, using common names of locations, animals, car brands, and obvious numeric sequences. Patient information is commonly shared by email in the context of query resolution. Files containing personal health information are shared by email and by posting them on shared drives with common passwords to facilitate collaboration. Conclusion If files containing sensitive patient information must be transferred by email, mechanisms to encrypt them and to ensure that password strength is high are necessary. More sophisticated collaboration tools are required to allow file sharing without password sharing. We provide recommendations to implement these practices. PMID:21317106

  8. Extending DIRAC File Management with Erasure-Coding for efficient storage.

    NASA Astrophysics Data System (ADS)

    Cadellin Skipsey, Samuel; Todev, Paulin; Britton, David; Crooks, David; Roy, Gareth

    2015-12-01

    The state of the art in Grid-style data management is to achieve increased resilience of data via multiple complete replicas of data files across multiple storage endpoints. While this is effective, it is not the most space-efficient approach to resilience, especially when the reliability of individual storage endpoints is sufficiently high that only a few will be inactive at any point in time. We report on work performed as part of GridPP[1], extending the Dirac File Catalogue and file management interface to allow the placement of erasure-coded files: each file distributed as N identically-sized chunks of data striped across a vector of storage endpoints, encoded such that any M chunks can be lost and the original file can be reconstructed. The tools developed are transparent to the user and, as well as allowing uploading and downloading of data to Grid storage, provide the possibility of parallelising access across all of the distributed chunks at once, improving data transfer and IO performance. We expect this approach to be of most interest to smaller VOs, who have tighter bounds on the storage available to them, but larger (WLCG) VOs may be interested as their total data increases during Run 2. We provide an analysis of the costs and benefits of the approach, along with future development and implementation plans in this area. At present, the overhead of multiple file transfers is the largest obstacle to the competitiveness of this approach.
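
    The resilience property described, losing chunks without losing the file, can be demonstrated with the simplest erasure code: single XOR parity, which tolerates the loss of any one of the N+1 stored chunks. Production systems such as the one reported here use stronger codes tolerating M losses; this is only a toy sketch:

        def encode(data: bytes, n: int):
            size = -(-len(data) // n)             # ceil(len/n)
            chunks = [data[i*size:(i+1)*size].ljust(size, b"\0")
                      for i in range(n)]
            parity = bytearray(size)
            for c in chunks:
                for j, byte in enumerate(c):
                    parity[j] ^= byte
            return chunks + [bytes(parity)]       # n data chunks + 1 parity

        def recover(chunks, lost):
            # XOR of all surviving chunks rebuilds the missing one
            size = len(next(c for c in chunks if c is not None))
            acc = bytearray(size)
            for i, c in enumerate(chunks):
                if i != lost and c is not None:
                    for j, byte in enumerate(c):
                        acc[j] ^= byte
            return bytes(acc)

        stored = encode(b"important grid data", 4)
        original = stored[2]
        stored[2] = None                          # simulate a lost chunk
        assert recover(stored, 2) == original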

  9. Management Tools for the 80's.

    ERIC Educational Resources Information Center

    Seivert, Dick; Thomas, Frank B.

    Two functions necessary for managing a computer center (organizing and controlling) are discussed with a focus on a nomenclature that was found to be useful for identifying parts of a system and was compatible with the definitions of most operational systems. For example, it is necessary to identify files, reports, and programs used in daily…

  10. Generalized Data Management Systems--Some Perspectives.

    ERIC Educational Resources Information Center

    Minker, Jack

    A Generalized Data Management System (GDMS) is a software environment provided as a tool for analysts, administrators, and programmers who are responsible for the maintenance, query and analysis of a data base to permit the manipulation of newly defined files and data with the existing programs and system. Because the GDMS technology is believed…

  11. DATALINK. Records Inventory Data Collection Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, B.

    1995-03-01

    DATALINK was created to provide an easy-to-use data collection program for records management software products. It provides several useful tools for capturing and validating record index data in the field. It also allows users to easily create a comma-delimited, ASCII text file for data export into most records management software products.

  12. 78 FR 79053 - Self-Regulatory Organizations; NYSE MKT LLC; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-27

    ... discretion to the member organizations to define pre-set risk thresholds. The tools are designed to act as a... comments more efficiently, please use only one method. The Commission will post all comments on the... To Offer Risk Management Tools Designed To Allow Member Organizations To Monitor and Address Exposure...

  13. 77 FR 60166 - Agency Information Collection Activities: Requests for Comments; Clearance of Renewed Approval of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-02

    ... Management (OPM) online USAJOBS system and the FAA's Automated Vacancy Information Access Tool for Online Referral (AVIATOR) staffing tool. Type of Review: Renewal of an information collection. Background: Under... Business Services Division, AES-200. [FR Doc. 2012-24190 Filed 10-1-12; 8:45 am] BILLING CODE 4910-13-P ...

  14. T.I.M.S: TaqMan Information Management System, tools to organize data flow in a genotyping laboratory

    PubMed Central

    Monnier, Stéphanie; Cox, David G; Albion, Tim; Canzian, Federico

    2005-01-01

    Background Single Nucleotide Polymorphism (SNP) genotyping is a major activity in biomedical research. The TaqMan technology is one of the most commonly used approaches. It produces large amounts of data that are difficult to process by hand. Laboratories not equipped with a Laboratory Information Management System (LIMS) need tools to organize the data flow. Results We propose a package of Visual Basic programs focused on sample management and on the parsing of input and output TaqMan files. The code is written in Visual Basic, embedded in the Microsoft Office package, and it allows anyone to have access to those tools, without any programming skills and with basic computer requirements. Conclusion We have created useful tools focused on management of TaqMan genotyping data, a critical issue in genotyping laboratories without a more sophisticated and expensive system, such as a LIMS. PMID:16221298

  15. Docster: The Future of Document Delivery?

    ERIC Educational Resources Information Center

    Chudnov, Daniel

    2000-01-01

    Considers the possibility of a bibliographic management tool that combines file storage with a Napster-like communications protocol, called docster. Explains Napster and discusses copyright issues, interlibrary loans, infrastructure, security concerns, the library's role, and online publishing. (LRW)

  16. NIF Ignition Target 3D Point Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, O; Marinak, M; Milovich, J

    2008-11-05

    We have developed an input file for running 3D NIF hohlraums that is optimized such that it can be run in 1-2 days on parallel computers. We have incorporated increasing levels of automation into the 3D input file: (1) configuration-controlled input files; (2) a common file for 2D and 3D and for different types of capsules (symcap, etc.); and (3) the ability to obtain target dimensions, laser pulse, and diagnostics settings automatically from the NIF Campaign Management Tool. We are using 3D Hydra calculations to investigate different problems: (1) intrinsic 3D asymmetry; (2) tolerance to nonideal 3D effects (e.g. laser power balance, pointing errors); and (3) synthetic diagnostics.

  17. 78 FR 79046 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-27

    ... ETP Holder to define pre-set risk thresholds. The tools are designed to act as a backstop for ETP... one method. The Commission will post all comments on the Commission's Internet Web site ( http://www... Change To Offer Risk Management Tools Designed To Allow ETP Holders To Monitor and Address Exposure to...

  18. NSW Executive Enhancements

    DTIC Science & Technology

    1981-06-01

    independently on the same network. Given this reduction in scale, the projected impl... widely distributed, fully replicated, synchronized dat... design...Manager that "owns" other resources. This strategy requires minimum synchronization while providing advantages in reliability and robustness. 2 3...interactive tools on TENEX, transparent file motion and translation, and a primitive set of project management functions. This demonstration confirmed that

  19. Common Database Interface for Heterogeneous Software Engineering Tools.

    DTIC Science & Technology

    1987-12-01

    SUB-GROUP Database Management Systems; Programming (Computers); Computer Files; Information Transfer; Interfaces; 19. ABSTRACT (Continue on reverse...Air Force Institute of Technology Air University In Partial Fulfillment of the Requirements for the Degree of Master of Science in Information Systems...Literature ..... 8 System 690 Configuration ......... 8 Database Functions ............ 14 Software Engineering Environments ... 14 Data Manager

  20. Aided generation of search interfaces to astronomical archives

    NASA Astrophysics Data System (ADS)

    Zorba, Sonia; Bignamini, Andrea; Cepparo, Francesco; Knapic, Cristina; Molinaro, Marco; Smareglia, Riccardo

    2016-07-01

    Astrophysical data provider organizations that host web-based interfaces for access to data resources have to cope with changes in data management that imply partial rewrites of their web applications. To avoid doing this manually, it was decided to develop a dynamically configurable Java EE web application that sets itself up by reading the needed information from configuration files. The specification of what information the astronomical archive database has to expose is managed using the TAP_SCHEMA schema from the IVOA TAP recommendation, which can be edited using a graphical interface. When the configuration steps are complete, the tool builds a WAR file to allow easy deployment of the application.

  1. HDM/PASCAL Verification System User's Manual

    NASA Technical Reports Server (NTRS)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  2. OMERO and Bio-Formats 5: flexible access to large bioimaging datasets at scale

    NASA Astrophysics Data System (ADS)

    Moore, Josh; Linkert, Melissa; Blackburn, Colin; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gillen, Kenneth; Leigh, Roger; Li, Simon; Lindner, Dominik; Moore, William J.; Patterson, Andrew J.; Pindelski, Blazej; Ramalingam, Balaji; Rozbicki, Emil; Tarkowska, Aleksandra; Walczysko, Petr; Allan, Chris; Burel, Jean-Marie; Swedlow, Jason

    2015-03-01

    The Open Microscopy Environment (OME) has built and released Bio-Formats, a Java-based proprietary file format conversion tool, and OMERO, an enterprise data management platform, under open source licenses. In this report, we describe new versions of Bio-Formats and OMERO that are specifically designed to support large, multi-gigabyte or terabyte scale datasets that are routinely collected across most domains of biological and biomedical research. Bio-Formats reads image data directly from native proprietary formats, bypassing the need for conversion into a standard format. It implements the concept of a file set, a container that defines the contents of multi-dimensional data comprised of many files. OMERO uses Bio-Formats to read files natively, and provides a flexible access mechanism that supports several different storage and access strategies. These new capabilities of OMERO and Bio-Formats make them especially useful in imaging applications like digital pathology, high content screening and light sheet microscopy that routinely create large datasets that must be managed and analyzed.

  3. Scalable Unix tools on parallel processors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gropp, W.; Lusk, E.

    1994-12-31

    The introduction of parallel processors that run a separate copy of Unix on each processor has introduced new problems in managing the user's environment. This paper discusses some generalizations of common Unix commands for managing files (e.g. ls) and processes (e.g. ps) that are convenient and scalable. These basic tools, just like their Unix counterparts, are text-based. We also discuss a way to use these with a graphical user interface (GUI). Some notes on the implementation are provided. Prototypes of these commands are publicly available.

  4. In Internet-Based Visualization System Study about Breakthrough Applet Security Restrictions

    NASA Astrophysics Data System (ADS)

    Chen, Jie; Huang, Yan

    In realizing an Internet-based visualization system for protein molecules, the system needs to allow users to observe molecular structures stored on the local computer; that is, clients must be able to generate three-dimensional graphics from a PDB file on the client machine. This requires Applet access to local files, which runs up against the Applet security restrictions. This paper covers two implementation methods: 1. Use the signature tools, key management tools, and Policy Editor provided by the JDK to digitally sign and authenticate the Java Applet, breaking through certain security restrictions in the browser. 2. Use a Servlet agent to access data indirectly, breaking through the sandbox-model restrictions that the traditional Java Virtual Machine places on Applet capabilities. Both approaches can break through the Applet's security restrictions, but each has its own strengths.

  5. Radiology education 2.0--on the cusp of change: part 2. eBooks; file sharing and synchronization tools; websites/teaching files; reference management tools and note taking applications.

    PubMed

    Bhargava, Puneet; Dhand, Sabeen; Lackey, Amanda E; Pandey, Tarun; Moshiri, Mariam; Jambhekar, Kedar

    2013-03-01

    Increasing use of smartphones and handheld computers has been accompanied by rapid growth in related industries. Electronic books have revolutionized the centuries-old conventional book and magazine markets and have simplified publishing by reducing the cost and processing time required to create and distribute any given book. We are now able to read, review, store, and share various types of documents via several electronic tools, many of which are available free of charge. Additionally, this electronic revolution has resulted in an explosion of readily available Internet-based educational resources for residents and has paved the path for educators to reach out to a larger and more diverse student population. Published by Elsevier Inc.

  6. Managing the computational chemistry big data problem: the ioChem-BD platform.

    PubMed

    Álvarez-Moreno, M; de Graaf, C; López, N; Maseras, F; Poblet, J M; Bo, C

    2015-01-26

    We present the ioChem-BD platform ( www.iochem-bd.org ) as a multiheaded tool aimed at managing large volumes of quantum chemistry results from a diverse group of already common simulation packages. The platform has an extensible structure. The key modules manage the main tasks: (i) uploading output files from common computational chemistry packages, (ii) extracting meaningful data from the results, and (iii) generating output summaries in user-friendly formats. Heavy use of the Chemical Markup Language (CML) is made in the intermediate files used by ioChem-BD. From them, and using XSL techniques, we manipulate and transform such chemical data sets to fulfill researchers' needs in the form of HTML5 reports, supporting information, and other research media.

  7. UNIX: A Tool for Information Management.

    ERIC Educational Resources Information Center

    Frey, Dean

    1989-01-01

    Describes UNIX, a computer operating system that supports multi-task and multi-user operations. Characteristics that make it especially suitable for library applications are discussed, including a hierarchical file structure and utilities for text processing, database activities, and bibliographic work. Sources of information on hardware…

  8. Python Processing and Version Control using VisTrails for the Netherlands Hydrological Instrument (Invited)

    NASA Astrophysics Data System (ADS)

    Verkaik, J.

    2013-12-01

    The Netherlands Hydrological Instrument (NHI) model predicts water demands in periods of drought, supporting the Dutch decision makers in taking operational as well as long-term decisions with respect to the water supply. Other applications of NHI are predicting fresh-salt interaction, nutrient loadings, and agricultural change. The NHI model consists of several coupled models: a saturated groundwater model (MODFLOW), an unsaturated groundwater model (MetaSWAP), a sub-catchment surface water model (MOZART), and a distribution network of surface waters model (DM/SOBEK). Each of these models requires specific, usually large, input data that may be the result of sophisticated schematization workflows. Input data can also be dependent on each other; for example, the precipitation data is input for the unsaturated zone model (cells) as well as for the surface water models (polygons). For efficient data management, we developed several Python tools such that the modeler or stakeholder can use the model in a user-friendly manner, and data is managed in a consistent, transparent and reproducible way. Two open source Python tools are presented here: the data version control module for the workflow manager VisTrails called FileSync, and the NHI model control script that uses FileSync. VisTrails is an open-source scientific workflow and provenance management system that provides support for simulations, data exploration and visualization. Since VisTrails does not directly support version control, we developed a version control module called FileSync. With this generic module, the user can synchronize data from and to his workflow through a dialog window. The FileSync dialog calls the FileSync script, which is command-line based and performs the actual data synchronization. This script allows the user to easily create a model repository, upload and download data, create releases and define scenarios. The data synchronization approach applied here differs from systems such as Subversion or Git, since these systems do not perform well for large (binary) model data files. For this reason, a new concept of parameterization and data splitting has been implemented. Each file, or set of files, is uniquely labeled as a parameter, and for this parameter metadata is maintained by Subversion. The metadata contains file hashes to identify data content and the location where the actual bulk data are stored, which can be reached by FTP. The NHI model control script is a command-line driven Python script for pre-processing, running, and post-processing the NHI model and uses one single configuration file for all computational kernels. This configuration file is an easy-to-use, keyword-driven Windows INI file with separate sections for all the kernels. It also includes a FileSync data section where the user can specify version-controlled model data to be used as input. The NHI control script keeps all the data consistent during the pre-processing. Furthermore, this script handles the model state when the NHI model is used for ensemble forecasting.
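
    As a concrete illustration of such a keyword-driven INI configuration with per-kernel sections and a FileSync data section, the sketch below reads a hypothetical file with Python's standard configparser module. All file, section, and key names are invented; the real NHI control script defines its own schema.

```python
# Read a hypothetical NHI-style INI configuration: one section per
# computational kernel plus a FileSync section for version-controlled
# model data. Names here are illustrative, not the actual NHI schema.
import configparser

config = configparser.ConfigParser()
config.read("nhi_run.ini")  # assumed file name

# Per-kernel sections, e.g. input locations for two of the models:
modflow_dir = config.get("MODFLOW", "input_dir", fallback="./modflow")
metaswap_dir = config.get("MetaSWAP", "input_dir", fallback="./metaswap")

# FileSync section pointing at version-controlled input data:
repo = config.get("FileSync", "repository", fallback=None)
release = config.get("FileSync", "release", fallback="latest")
if repo:
    print(f"would sync release {release} of model data from {repo}")
print(f"MODFLOW input: {modflow_dir}, MetaSWAP input: {metaswap_dir}")
```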

  9. Recent evolution of the offline computing model of the NOvA experiment

    DOE PAGES

    Habig, Alec; Norman, A.; Group, Craig

    2015-12-23

    The NOvA experiment at Fermilab is a long-baseline neutrino experiment designed to study νe appearance in a νμ beam. Over the last few years there has been intense work to streamline the computing infrastructure in preparation for data, which started to flow in from the far detector in Fall 2013. Major accomplishments for this effort include migration to the use of off-site resources through the use of the Open Science Grid and upgrading the file-handling framework from simple disk storage to a tiered system using a comprehensive data management and delivery system to find and access files on either disk or tape storage. NOvA has already produced more than 6.5 million files and more than 1 PB of raw data and Monte Carlo simulation files which are managed under this model. In addition, the current system has demonstrated sustained rates of up to 1 TB/hour of file transfer by the data handling system. NOvA pioneered the use of new tools and this paved the way for their use by other Intensity Frontier experiments at Fermilab. Most importantly, the new framework places the experiment's infrastructure on a firm foundation, and is ready to produce the files needed for first physics.

  10. Recent Evolution of the Offline Computing Model of the NOvA Experiment

    NASA Astrophysics Data System (ADS)

    Habig, Alec; Norman, A.

    2015-12-01

    The NOvA experiment at Fermilab is a long-baseline neutrino experiment designed to study νe appearance in a νμ beam. Over the last few years there has been intense work to streamline the computing infrastructure in preparation for data, which started to flow in from the far detector in Fall 2013. Major accomplishments for this effort include migration to the use of off-site resources through the use of the Open Science Grid and upgrading the file-handling framework from simple disk storage to a tiered system using a comprehensive data management and delivery system to find and access files on either disk or tape storage. NOvA has already produced more than 6.5 million files and more than 1 PB of raw data and Monte Carlo simulation files which are managed under this model. The current system has demonstrated sustained rates of up to 1 TB/hour of file transfer by the data handling system. NOvA pioneered the use of new tools and this paved the way for their use by other Intensity Frontier experiments at Fermilab. Most importantly, the new framework places the experiment's infrastructure on a firm foundation, and is ready to produce the files needed for first physics.

  11. DATALINK: Records inventory data collection software. User`s guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, B.A.

    1995-03-01

    DATALINK was created to provide an easy-to-use data collection program for records management software products. It provides several useful tools for capturing and validating record index data in the field. It also allows users to easily create a comma-delimited, ASCII text file for data export into most records management software products. It runs on virtually any computer running MS-DOS.
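
    Since the record above centers on producing a comma-delimited ASCII export, here is a minimal sketch of that step in Python; the field names and example records are hypothetical.

```python
# Write record index data as a comma-delimited ASCII text file, the
# export format DATALINK targets. Field names are made up for the example.
import csv

records = [
    {"box": "B-001", "folder": "17", "title": "Quarterly safety reports"},
    {"box": "B-002", "folder": "03", "title": "Facility drawings"},
]

with open("export.txt", "w", newline="", encoding="ascii") as f:
    writer = csv.DictWriter(f, fieldnames=["box", "folder", "title"])
    writer.writeheader()
    writer.writerows(records)
```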

  12. Artificial Intelligent Platform as Decision Tool for Asset Management, Operations and Maintenance.

    PubMed

    2018-01-04

    An Artificial Intelligence (AI) system has been developed and implemented for water, wastewater and reuse plants to improve the management of sensors, short- and long-term maintenance plans, and asset and investment management plans. It is based on an integrated approach to capturing data from different computer systems and files. It adds a layer of intelligence to the data. It serves as a repository of key current and future operations and maintenance conditions that a plant needs to have knowledge of. With this information, it is able to simulate the configuration of processes and assets for those conditions to improve or optimize operations, maintenance and asset management, using the IViewOps (Intelligent View of Operations) model. Based on optimization through model runs, it is able to create output files that can feed data to other systems and inform the staff of optimal solutions to the conditions experienced or anticipated in the future.

  13. Family Day Homes: Get Organized with Information Systems.

    ERIC Educational Resources Information Center

    Dague, Mindy

    1999-01-01

    Notes that record keeping and management are critical aspects of home day care centers. Highlights options for tools, including calendars, loose-leaf notebooks, ledgers, computer spreadsheet software, and file boxes. Provides guidelines for organizing information as well as particular information necessary regarding provider, parent, and child.…

  14. Web Extensible Display Manager

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slominski, Ryan; Larrieu, Theodore L.

    Jefferson Lab's Web Extensible Display Manager (WEDM) allows staff to access EDM control system screens from a web browser in remote offices and from mobile devices. Native browser technologies are leveraged to avoid installing and managing software on remote clients such as browser plugins, tunnel applications, or an EDM environment. Since standard network ports are used, firewall exceptions are minimized. To avoid security concerns from remote users modifying a control system, WEDM exposes read-only access, and basic web authentication can be used to further restrict access. Updates of monitored EPICS channels are delivered via a Web Socket using a web gateway. The software translates EDM description files (denoted with the edl suffix) to HTML with Scalable Vector Graphics (SVG), following EDM's edl file vector drawing rules to create faithful screen renderings. The WEDM server parses edl files and creates the HTML equivalent in real time, allowing existing screens to work without modification. Alternatively, the familiar drag-and-drop EDM screen creation tool can be used to create optimized screens sized specifically for smart phones and then rendered by WEDM.

  15. Development of the geometry database for the CBM experiment

    NASA Astrophysics Data System (ADS)

    Akishina, E. P.; Alexandrov, E. I.; Alexandrov, I. N.; Filozova, I. A.; Friese, V.; Ivanov, V. V.

    2018-01-01

    The paper describes the current state of the Geometry Database (Geometry DB) for the CBM experiment. The main purpose of this database is to provide convenient tools for: (1) managing the geometry modules; (2) assembling various versions of the CBM setup as a combination of geometry modules and additional files. The CBM users of the Geometry DB may use both GUI (Graphical User Interface) and API (Application Programming Interface) tools for working with it.

  16. Task Report for Task Authorization 1 for: Technology Demonstration of the Joint Network Defence and Management System (JNDMS) Project

    DTIC Science & Technology

    2009-01-30

    tool written in Java to support the automated creation of simulated subnets. It can be run giving it a subnet, the number of hosts to create, the...network and can also be used to create subnets with specific profiles. Subnet Creator command line: > java -jar SubnetCreator.jar -j [path to client...command: > java -jar jss_client.jar com.mdacorporation.jndms.JSS.Client.JSSBatchClient [file] 5. Software: This is the output file that will store the

  17. Wikis and Collaborative Inquiry

    ERIC Educational Resources Information Center

    Lamb, Annette; Johnson, Larry

    2009-01-01

    Wikis are simply Web sites that provide easy-to-use tools for creating, editing, and sharing digital documents, images, and media files. Multiple participants can enter, submit, manage, and update a single Web workspace creating a community of authors and editors. Wiki projects help young people shift from being "consumers" of the Internet to…

  18. SpectraFox: A free open-source data management and analysis tool for scanning probe microscopy and spectroscopy

    NASA Astrophysics Data System (ADS)

    Ruby, Michael

    In the last decades, scanning probe microscopy and spectroscopy have become well-established tools in nanotechnology and surface science. This opened the market for many commercial manufacturers, each with different hardware and software standards. Besides the advantage of a wide variety of available hardware, this diversity can, on the software side, complicate data exchange between scientists and data analysis for groups working with hardware developed by different manufacturers. Not only does the file format differ between manufacturers, but the data also often require further numerical treatment before publication. SpectraFox is an open-source, manufacturer-independent tool which manages, processes, and evaluates scanning probe spectroscopy and microscopy data. It aims at simplifying the documentation in parallel to measurement, and it provides solid evaluation tools for large amounts of data.

  19. KeyWare: an open wireless distributed computing environment

    NASA Astrophysics Data System (ADS)

    Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir

    1995-12-01

    Deployment of distributed applications in the wireless domain lacks the equivalent tools, methodologies, architectures, and network management that exist in LAN-based applications. A wireless distributed computing environment (KeyWareTM) based on intelligent agents within a multiple client multiple server scheme was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput and transmission costs. A unified network management paradigm for both wireless and wireline facilitates seamless extensions of LAN-based management tools to include wireless nodes. A set of object-oriented tools and methodologies enables direct asynchronous invocation of agent-based services, supplemented by tool-sets matched to the supported KeyWare paradigms. The open architecture embodiment of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems and infrastructures while maintaining application portability.

  20. Hierarchical Data Format 5 v1.10

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    KOZIOL, QUINCEY

    2016-04-20

    HDF5 is a data model, library, and file format for storing and managing data. It supports an unlimited variety of datatypes, and is designed for flexible and efficient I/O and for high volume and complex data. HDF5 is portable and is extensible, allowing applications to evolve in their use of HDF5. The HDF5 Technology suite includes tools and applications for managing, manipulating, viewing, and analyzing data in the HDF5 format.
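
    As a brief illustration of the HDF5 data model of named, typed datasets with attached attributes, the sketch below uses the h5py Python bindings (one of several language bindings); the file and dataset names are arbitrary.

```python
# Store an array as a named, compressed HDF5 dataset with a metadata
# attribute, then read back only a slice (partial I/O).
import numpy as np
import h5py

data = np.random.rand(100, 100)

with h5py.File("example.h5", "w") as f:
    dset = f.create_dataset("temperature", data=data, compression="gzip")
    dset.attrs["units"] = "K"  # attributes attach metadata to a dataset

with h5py.File("example.h5", "r") as f:
    subset = f["temperature"][10:20, :]  # read only rows 10-19
    print(f["temperature"].attrs["units"], subset.shape)
```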

  1. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1994-01-01

    Envision is an interactive environment that provides researchers in the earth sciences convenient ways to manage, browse, and visualize large observed or model data sets. Its main features are support for the netCDF and HDF file formats, an easy to use X/Motif user interface, a client-server configuration, and portability to many UNIX workstations. The Envision package also provides new ways to view and change metadata in a set of data files. It permits a scientist to conveniently and efficiently manage large data sets consisting of many data files. It also provides links to popular visualization tools so that data can be quickly browsed. Envision is a public domain package, freely available to the scientific community. Envision software (binaries and source code) and documentation can be obtained from either of these servers: ftp://vista.atmos.uiuc.edu/pub/envision/ and ftp://csrp.tamu.edu/pub/envision/. Detailed descriptions of Envision capabilities and operations can be found in the User's Guide and Reference Manuals distributed with Envision software.
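
    For a flavor of the kind of netCDF browsing Envision supports, the sketch below lists the dimensions and variables of a file with the netCDF4 Python package and reads a single time slice. It is an independent illustration, not part of Envision, and the file and variable names are invented.

```python
# Browse a netCDF file's structure and read one time slice of a variable
# without loading the whole array into memory.
from netCDF4 import Dataset

with Dataset("model_output.nc", "r") as ds:
    print(list(ds.dimensions))          # e.g. time, lat, lon
    for name, var in ds.variables.items():
        print(name, var.dimensions, var.shape)
    t0 = ds.variables["temperature"][0, :, :]  # assumed variable name
    print(t0.shape)
```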

  2. Cause-and-effect analysis of risk management files to assess patient care in the emergency department.

    PubMed

    White, Andrew A; Wright, Seth W; Blanco, Roberto; Lemonds, Brent; Sisco, Janice; Bledsoe, Sandy; Irwin, Cindy; Isenhour, Jennifer; Pichert, James W

    2004-10-01

    Identifying the etiologies of adverse outcomes is an important first step in improving patient safety and reducing malpractice risks. However, relatively little is known about the causes of emergency department-related adverse outcomes. The objective was to describe a method for identification of common causes of adverse outcomes in an emergency department. This methodology potentially can suggest ways to improve care and might provide a model for identification of factors associated with adverse outcomes. This was a retrospective analysis of 74 consecutive files opened by a malpractice insurer between 1995 and 2000. Each risk-management file was analyzed to identify potential causes of adverse outcomes. The main outcomes were rater-assigned codes for alleged problems with care (e.g., failures of communication or problems related to diagnosis). About 50% of cases were related to injuries or abdominal complaints. A contributing cause was found in 92% of cases, and most had more than one contributing cause. The most frequent contributing categories included failure to diagnose (45%), supervision problems (31%), communication problems (30%), patient behavior (24%), administrative problems (20%), and documentation (20%). Specific relating factors within these categories, such as lack of timely resident supervision and failure to follow policies and procedures, were identified. This project documented that an aggregate analysis of risk-management files has the potential to identify shared causes related to real or perceived adverse outcomes. Several potentially correctable systems problems were identified using this methodology. These simple, descriptive management tools may be useful in identifying issues for problem solving and can be easily learned by physicians and managers.

  3. Wrapping Python around MODFLOW/MT3DMS based groundwater models

    NASA Astrophysics Data System (ADS)

    Post, V.

    2008-12-01

    Numerical models that simulate groundwater flow and solute transport require a great amount of input data that is often organized into different files. A large proportion of the input data consists of spatially-distributed model parameters. The model output consists of a variety of data such as heads, fluxes and concentrations. Typically all files have different formats. Consequently, preparing input and managing output is a complex and error-prone task. Proprietary software tools are available that facilitate the preparation of input files and analysis of model outcomes. The use of such software may be limited if it does not support all the features of the groundwater model or when the costs of such tools are prohibitive. Therefore a Python library was developed that contains routines to generate input files and process output files of MODFLOW/MT3DMS-based models. The library is freely available and has an open structure so that the routines can be customized and linked into other scripts and libraries. The current set of functions supports the generation of input files for MODFLOW and MT3DMS, including the capability to read spatially-distributed input parameters (e.g. hydraulic conductivity) from PNG files. Both ASCII and binary output files can be read efficiently, allowing for visualization of, for example, solute concentration patterns in contour plots with superimposed flow vectors using matplotlib. Series of contour plots are then easily saved as an animation. The subroutines can also be used within scripts to calculate derived quantities such as the mass of a solute within a particular region of the model domain. Using Python as a wrapper around groundwater models provides an efficient and flexible way of processing input and output data, which is not constrained by limitations of third-party products.
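
    In the spirit of the library described above, the sketch below reads a gridded array of heads from a binary output file and contours it with matplotlib. The flat float64 layout assumed here is a simplification; real MODFLOW binary head files carry per-layer header records, so treat this as an illustration only.

```python
# Read a (simplified) binary grid of heads and draw a labeled contour plot.
import numpy as np
import matplotlib.pyplot as plt

nrow, ncol = 100, 150  # assumed grid dimensions

heads = np.fromfile("heads.bin", dtype=np.float64, count=nrow * ncol)
heads = heads.reshape(nrow, ncol)

cs = plt.contour(heads, levels=10)
plt.clabel(cs, inline=True, fontsize=8)  # label the contour lines
plt.title("Simulated heads (example)")
plt.savefig("heads_contour.png", dpi=150)
```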

  4. Air Traffic Complexity Measurement Environment (ACME): Software User's Guide

    NASA Technical Reports Server (NTRS)

    1996-01-01

    A user's guide for the Air Traffic Complexity Measurement Environment (ACME) software is presented. The ACME consists of two major components, a complexity analysis tool and a user interface. The Complexity Analysis Tool (CAT) analyzes complexity off-line, producing data files which may be examined interactively via the Complexity Data Analysis Tool (CDAT). The Complexity Analysis Tool is composed of three independently executing processes that communicate via PVM (Parallel Virtual Machine) and Unix sockets. The Runtime Data Management and Control process (RUNDMC) extracts flight plan and track information from a SAR input file, and sends the information to GARP (Generate Aircraft Routes Process) and CAT (Complexity Analysis Task). GARP in turn generates aircraft trajectories, which are utilized by CAT to calculate sector complexity. CAT writes flight plan, track and complexity data to an output file, which can be examined interactively. The Complexity Data Analysis Tool (CDAT) provides an interactive graphic environment for examining the complexity data produced by the Complexity Analysis Tool (CAT). CDAT can also play back track data extracted from System Analysis Recording (SAR) tapes. The CDAT user interface consists of a primary window, a controls window, and miscellaneous pop-ups. Aircraft track and position data is displayed in the main viewing area of the primary window. The controls window contains miscellaneous control and display items. Complexity data is displayed in pop-up windows. CDAT plays back sector complexity and aircraft track and position data as a function of time. Controls are provided to start and stop playback, adjust the playback rate, and reposition the display to a specified time.

  5. Scalable Earth-observation Analytics for Geoscientists: Spacetime Extensions to the Array Database SciDB

    NASA Astrophysics Data System (ADS)

    Appel, Marius; Lahn, Florian; Pebesma, Edzer; Buytaert, Wouter; Moulds, Simon

    2016-04-01

    Today's amount of freely available data requires scientists to spend large parts of their work on data management. This is especially true in environmental sciences when working with large remote sensing datasets, such as obtained from earth-observation satellites like the Sentinel fleet. Many frameworks like SpatialHadoop or Apache Spark address the scalability but target programmers rather than data analysts, and are not dedicated to imagery or array data. In this work, we use the open-source data management and analytics system SciDB to bring large earth-observation datasets closer to analysts. Its underlying data representation as multidimensional arrays fits naturally to earth-observation datasets, distributes storage and computational load over multiple instances by multidimensional chunking, and also enables efficient time-series based analyses, which is usually difficult using file- or tile-based approaches. Existing interfaces to R and Python furthermore allow for scalable analytics with relatively little learning effort. However, interfacing SciDB and file-based earth-observation datasets that come as tiled temporal snapshots requires a lot of manual bookkeeping during ingestion, and SciDB natively only supports loading data from CSV-like and custom binary formatted files, which currently limits its practical use in earth-observation analytics. To make it easier to work with large multi-temporal datasets in SciDB, we developed software tools that enrich SciDB with earth observation metadata and allow working with commonly used file formats: (i) the SciDB extension library scidb4geo simplifies working with spatiotemporal arrays by adding relevant metadata to the database and (ii) the Geospatial Data Abstraction Library (GDAL) driver implementation scidb4gdal allows to ingest and export remote sensing imagery from and to a large number of file formats. Using added metadata on temporal resolution and coverage, the GDAL driver supports time-based ingestion of imagery to existing multi-temporal SciDB arrays. While our SciDB plugin works directly in the database, the GDAL driver has been specifically developed using a minimum amount of external dependencies (i.e. CURL). Source code for both tools is available from github [1]. We present these tools in a case-study that demonstrates the ingestion of multi-temporal tiled earth-observation data to SciDB, followed by a time-series analysis using R and SciDBR. Through the exclusive use of open-source software, our approach supports reproducibility in scalable large-scale earth-observation analytics. In the future, these tools can be used in an automated way to let scientists only work on ready-to-use SciDB arrays to significantly reduce the data management workload for domain scientists. [1] https://github.com/mappl/scidb4geo and https://github.com/mappl/scidb4gdal

  6. CycADS: an annotation database system to ease the development and update of BioCyc databases

    PubMed Central

    Vellozo, Augusto F.; Véron, Amélie S.; Baa-Puyoulet, Patrice; Huerta-Cepas, Jaime; Cottret, Ludovic; Febvay, Gérard; Calevro, Federica; Rahbé, Yvan; Douglas, Angela E.; Gabaldón, Toni; Sagot, Marie-France; Charles, Hubert; Colella, Stefano

    2011-01-01

    In recent years, genomes from an increasing number of organisms have been sequenced, but their annotation remains a time-consuming process. The BioCyc databases offer a framework for the integrated analysis of metabolic networks. The Pathway Tools software suite allows the automated construction of a database starting from an annotated genome, but it requires prior integration of all annotations into a specific summary file or into a GenBank file. To allow the easy creation and update of a BioCyc database starting from the multiple genome annotation resources available over time, we have developed an ad hoc data management system that we called Cyc Annotation Database System (CycADS). CycADS is centred on a specific database model and on a set of Java programs to import, filter and export relevant information. Data from GenBank and other annotation sources (including for example: KAAS, PRIAM, Blast2GO and PhylomeDB) are collected into a database to be subsequently filtered and extracted to generate a complete annotation file. This file is then used to build an enriched BioCyc database using the PathoLogic program of Pathway Tools. The CycADS pipeline for annotation management was used to build the AcypiCyc database for the pea aphid (Acyrthosiphon pisum) whose genome was recently sequenced. The AcypiCyc database webpage includes also, for comparative analyses, two other metabolic reconstruction BioCyc databases generated using CycADS: TricaCyc for Tribolium castaneum and DromeCyc for Drosophila melanogaster. Linked to its flexible design, CycADS offers a powerful software tool for the generation and regular updating of enriched BioCyc databases. The CycADS system is particularly suited for metabolic gene annotation and network reconstruction in newly sequenced genomes. Because of the uniform annotation used for metabolic network reconstruction, CycADS is particularly useful for comparative analysis of the metabolism of different organisms. Database URL: http://www.cycadsys.org PMID:21474551

  7. Visibility into the Work: TQM Work Process Analysis with HPT and ISD.

    ERIC Educational Resources Information Center

    Beagles, Charles A.; Griffin, Steven L.

    2003-01-01

    Discusses the use of total quality management (TQM), work process flow diagrams, and ISD (instructional systems development) tools with HPT (human performance technology) to address performance gaps in the Veterans Benefits Administration (VBA). Describes performance goals, which were to improve accuracy and reduce backlog of claim files. (LRW)

  8. Electronic Mail Is One High-Tech Management Tool that Really Delivers.

    ERIC Educational Resources Information Center

    Parker, Donald C.

    1987-01-01

    Describes an electronic mail system used by the Horseheads (New York) Central School District's eight schools and central office that saves time and enhances productivity. This software calls up information from the district's computer network and sends it to other users' special files--electronic "mailboxes" set aside for messages and…

  9. MISR ENVI Tool

    Atmospheric Science Data Center

    2013-03-20

    ... projection definitions are provided for augmenting the ENVI defined map projections file, and a sample ENVI menu file which adds a menu ... The misr_envi tool software can be downloaded as a tar file containing all twelve files, ...

  10. Challenges for data storage in medical imaging research.

    PubMed

    Langer, Steve G

    2011-04-01

    Researchers in medical imaging have multiple challenges for storing, indexing, maintaining viability, and sharing their data. Addressing all these concerns requires a constellation of tools, but not all of them need to be local to the site. In particular, the data storage challenges faced by researchers can begin to require professional information technology skills. With limited human resources and funds, the medical imaging researcher may be better served with an outsourcing strategy for some management aspects. This paper outlines an approach to manage the main objectives faced by medical imaging scientists whose work includes processing and data mining on non-standard file formats, and relating those files to their DICOM-standard descendants. The capacity of the approach scales as the researcher's need grows by leveraging the on-demand provisioning ability of cloud computing.

  11. Utilizing ORACLE tools within Unix

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferguson, R.

    1995-07-01

    Large databases, by their very nature, often serve as repositories of data which may be needed by other systems. The transmission of this data to other systems has in the past involved several layers of human intervention. The Integrated Cargo Data Base (ICDB) developed by Martin Marietta Energy Systems for the Military Traffic Management Command as part of the Worldwide Port System provides data integration and worldwide tracking of cargo that passes through common-user ocean cargo ports. One of the key functions of ICDB is data distribution of a variety of data files to a number of other systems. Development of automated data distribution procedures had to deal with the following constraints: (1) variable generation time for data files, (2) use of only current data for data files, (3) use of a minimum number of select statements, (4) creation of unique data files for multiple recipients, (5) automatic transmission of data files to recipients, and (6) avoidance of extensive and long-term data storage.

  12. A data base processor semantics specification package

    NASA Technical Reports Server (NTRS)

    Fishwick, P. A.

    1983-01-01

    A Semantics Specification Package (DBPSSP) for the Intel Data Base Processor (DBP) is defined. DBPSSP serves as a collection of cross assembly tools that allow the analyst to assemble request blocks on the host computer for passage to the DBP. The assembly tools discussed in this report may be effectively used in conjunction with a DBP compatible data communications protocol to form a query processor, precompiler, or file management system for the database processor. The source modules representing the components of DBPSSP are fully commented and included.

  13. A Web-based interactive diabetes registry for health care management and planning in Saudi Arabia.

    PubMed

    Al-Rubeaan, Khalid A; Youssef, Amira M; Subhani, Shazia N; Ahmad, Najlaa A; Al-Sharqawi, Ahmad H; Ibrahim, Heba M

    2013-09-09

    Worldwide, eHealth is a rapidly growing technology. It provides good quality health services at lower cost and increased availability. Diabetes has reached an epidemic stage in Saudi Arabia and has a medical and economic impact at a countrywide level. Data are greatly needed to better understand and plan to prevent and manage this medical problem. The Saudi National Diabetes Registry (SNDR) is an electronic medical file supported by clinical, investigational, and management data. It functions as a monitoring tool for medical, social, and cultural bases for primary and secondary prevention programs. Economic impact, in the form of direct or indirect cost, is part of the registry's scope. The registry's geographic information system (GIS) produces a variety of maps for diabetes and associated diseases. In addition to availability and distribution of health facilities in the Kingdom, GIS data provide health planners with the necessary information to make informed decisions. The electronic data bank serves as a research tool to help researchers for both prospective and retrospective studies. A Web-based interactive GIS system was designed to serve as an electronic medical file for diabetic patients retrieving data from medical files by trained registrars. Data was audited and cleaned before it was archived in the electronic filing system. It was then used to produce epidemiologic, economic, and geographic reports. A total of 84,942 patients were registered from 2000 to 2012, growing by 10% annually. The SNDR reporting system for epidemiology data gives better understanding of the disease pattern, types, and gender characteristics. Part of the reporting system is to assess quality of health care using different parameters, such as HbA1c, that gives an impression of good diabetes control for each institute. Economic reports give accurate cost estimation of different services given to diabetic patients, such as the annual insulin cost per patient for type 1, type 2, and gestational diabetes, which are 1155 SR (US $308), 1406 SR (US $375), and 1002 SR (US $267), respectively. Of this, 72.02% of the total insulin cost is spent on type 2 patients and 55.39% is in the form of premixed insulin. The SNDR can provide an accurate assessment of the services provided for research purposes. For example, only 27.00% of registered patients had an ophthalmic examination and only 71.10% of patients with proliferative retinopathy had laser therapy. The SNDR is an effective electronic medical file that can provide epidemiologic, economic, and geographic reports that can be used for disease management and health care planning. It is a useful tool for research and disease health care quality monitoring.

  14. A Web-Based Interactive Diabetes Registry for Health Care Management and Planning in Saudi Arabia

    PubMed Central

    Youssef, Amira M; Subhani, Shazia N; Ahmad, Najlaa A; Al-Sharqawi, Ahmad H; Ibrahim, Heba M

    2013-01-01

    Background Worldwide, eHealth is a rapidly growing technology. It provides good quality health services at lower cost and increased availability. Diabetes has reached an epidemic stage in Saudi Arabia and has a medical and economic impact at a countrywide level. Data are greatly needed to better understand and plan to prevent and manage this medical problem. Objective The Saudi National Diabetes Registry (SNDR) is an electronic medical file supported by clinical, investigational, and management data. It functions as a monitoring tool for medical, social, and cultural bases for primary and secondary prevention programs. Economic impact, in the form of direct or indirect cost, is part of the registry’s scope. The registry’s geographic information system (GIS) produces a variety of maps for diabetes and associated diseases. In addition to availability and distribution of health facilities in the Kingdom, GIS data provide health planners with the necessary information to make informed decisions. The electronic data bank serves as a research tool to help researchers for both prospective and retrospective studies. Methods A Web-based interactive GIS system was designed to serve as an electronic medical file for diabetic patients retrieving data from medical files by trained registrars. Data was audited and cleaned before it was archived in the electronic filing system. It was then used to produce epidemiologic, economic, and geographic reports. A total of 84,942 patients were registered from 2000 to 2012, growing by 10% annually. Results The SNDR reporting system for epidemiology data gives better understanding of the disease pattern, types, and gender characteristics. Part of the reporting system is to assess quality of health care using different parameters, such as HbA1c, that gives an impression of good diabetes control for each institute. Economic reports give accurate cost estimation of different services given to diabetic patients, such as the annual insulin cost per patient for type 1, type 2, and gestational diabetes, which are 1155 SR (US $308), 1406 SR (US $375), and 1002 SR (US $267), respectively. Of this, 72.02% of the total insulin cost is spent on type 2 patients and 55.39% is in the form of premixed insulin. The SNDR can provide an accurate assessment of the services provided for research purposes. For example, only 27.00% of registered patients had an ophthalmic examination and only 71.10% of patients with proliferative retinopathy had laser therapy. Conclusions The SNDR is an effective electronic medical file that can provide epidemiologic, economic, and geographic reports that can be used for disease management and health care planning. It is a useful tool for research and disease health care quality monitoring. PMID:24025198

  15. Workflow Management for Complex HEP Analyses

    NASA Astrophysics Data System (ADS)

    Erdmann, M.; Fischer, R.; Rieger, M.; von Cube, R. F.

    2017-10-01

    We present the novel Analysis Workflow Management (AWM) that provides users with the tools and competences of professional large scale workflow systems, e.g. Apache's Airavata [1]. The approach presents a paradigm shift from executing parts of the analysis to defining the analysis. Within AWM an analysis consists of steps. For example, a step might define running a certain executable for multiple files of an input data collection. Each call to the executable for one of those input files can be submitted to the desired run location, which could be the local computer or a remote batch system. An integrated software manager enables automated user installation of dependencies in the working directory at the run location. Each execution of a step item creates one report for bookkeeping purposes containing error codes and output data or file references. Required files, e.g. created by previous steps, are retrieved automatically. Since data storage and run locations are exchangeable from the step's perspective, computing resources can be used opportunistically. A visualization of the workflow as a graph of the steps in the web browser provides a high-level view on the analysis. The workflow system is developed and tested alongside a ttbb cross section measurement where, for instance, the event selection is represented by one step and a Bayesian statistical inference is performed by another. The clear interfaces and dependencies between steps enable a make-like execution of the whole analysis.
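
    The step abstraction can be captured in a few lines: each step knows its requirements, runs them first if needed, and leaves a report behind. The sketch below is an illustration of that make-like idea only, not the actual AWM API.

```python
# Minimal make-like step execution: run required steps first, record a
# report per step for bookkeeping.
class Step:
    def __init__(self, name, action, requires=()):
        self.name, self.action, self.requires = name, action, list(requires)
        self.report = None  # filled in after the step has run

    def run(self):
        for dep in self.requires:      # ensure dependencies ran first
            if dep.report is None:
                dep.run()
        if self.report is None:        # run each step at most once
            self.report = {"step": self.name, "status": self.action()}

select = Step("event_selection", lambda: "ok")
infer = Step("statistical_inference", lambda: "ok", requires=[select])
infer.run()  # transitively runs event_selection first
print([s.report for s in (select, infer)])
```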

  16. Transported Geothermal Energy Technoeconomic Screening Tool - Calculation Engine

    DOE Data Explorer

    Liu, Xiaobing

    2016-09-21

    This calculation engine estimates the technoeconomic feasibility of transported geothermal energy projects. The TGE screening tool (geotool.exe) takes input from an input file (input.txt) and lists results in an output file (output.txt). Both the input and output files are in the same folder as geotool.exe. To use the tool, the input file containing adequate information about the case should be prepared in the format explained below, and the input file should be put into the same folder as geotool.exe. Then geotool.exe can be executed, which will generate an output.txt file in the same folder containing all key calculation results. The format and content of the output file are explained below as well.
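
    Driving the input.txt / geotool.exe / output.txt cycle described above is straightforward to script; the sketch below does so with Python's subprocess module. The actual content of input.txt is documented with the tool and not reproduced in this record, so the written text is only a placeholder.

```python
# Prepare an input file, run the screening tool, and read the results.
import subprocess
from pathlib import Path

workdir = Path(".")  # geotool.exe expects its files in its own folder

(workdir / "input.txt").write_text("# case description goes here\n")
subprocess.run([str(workdir / "geotool.exe")], cwd=workdir, check=True)
print((workdir / "output.txt").read_text())
```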

  17. A Tutorial on Creating a Grid Cell Land Cover Data File from Remote Sensing Data.

    DTIC Science & Technology

    1985-06-01

    Creating a Grid Cell Land Cover Data File from Remote Sensing Data. Gary E. Ford, Doreen L. Meyer, and V. Ralph Algazi, Signal and Image Processing Laboratory... 1. INTRODUCTION Spatial data management systems, also known as geographic information systems, provide powerful, practical tools for the...erosion [8].

  18. SciDB versus Spark: A Preliminary Comparison Based on an Earth Science Use Case

    NASA Astrophysics Data System (ADS)

    Clune, T.; Kuo, K. S.; Doan, K.; Oloso, A.

    2015-12-01

    We compare two Big Data technologies, SciDB and Spark, for performance, usability, and extensibility, when applied to a representative Earth science use case. SciDB is a new-generation parallel distributed database management system (DBMS) based on the array data model that is capable of handling multidimensional arrays efficiently but requires lengthy data ingest prior to analysis, whereas Spark is a fast and general engine for large scale data processing that can immediately process raw data files and thereby avoid the ingest process. Once data have been ingested, SciDB is very efficient in database operations such as subsetting. Spark, on the other hand, provides greater flexibility by supporting a wide variety of high-level tools including DBMS's. For the performance aspect of this preliminary comparison, we configure Spark to operate directly on text or binary data files and thereby limit the need for additional tools. Arguably, a more appropriate comparison would involve exploring other configurations of Spark that exploit supported high-level tools, but that is beyond our current resources. To make the comparison as "fair" as possible, we export the arrays produced by SciDB into text files (or convert them to binary files) for intake by Spark and thereby avoid any additional file processing penalties. The Earth science use case selected for this comparison is the identification and tracking of snowstorms in the NASA Modern-Era Retrospective analysis for Research and Applications (MERRA) reanalysis data. The identification portion of the use case is to flag all grid cells of the MERRA high-resolution hourly data that satisfy our criteria for a snowstorm, whereas the tracking portion connects flagged cells adjacent in time and space to form a snowstorm episode. We will report the results of our comparisons in this presentation.
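
    For a sense of how the Spark side of such a comparison looks when reading exported text files directly, the sketch below flags grid cells with PySpark; the column names and threshold values are invented, and the real MERRA-based snowstorm criteria are more involved.

```python
# Flag grid cells in a CSV export that meet a (made-up) snowstorm
# criterion, using Spark's DataFrame API.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("snowstorm-flagging").getOrCreate()

cells = spark.read.csv("merra_cells.csv", header=True, inferSchema=True)
flagged = cells.filter(
    (F.col("precip_snow") > 1.0) & (F.col("temp_2m") < 273.15)
)
flagged.write.mode("overwrite").csv("flagged_cells")
spark.stop()
```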

  19. NELS 2.0 - A general system for enterprise wide information management

    NASA Technical Reports Server (NTRS)

    Smith, Stephanie L.

    1993-01-01

    NELS, the NASA Electronic Library System, is an information management tool for creating distributed repositories of documents, drawings, and code for use and reuse by the aerospace community. The NELS retrieval engine can load metadata and source files of full-text objects, perform natural language queries to retrieve ranked objects, and create links to connect user interfaces. For flexibility, the NELS architecture has layered interfaces between the application program and the stored library information. The session manager provides the interface functions for development of NELS applications. The data manager is an interface between the session manager and the structured data system. At the center of the structured data system is the Wide Area Information Server. This system architecture provides access to information across heterogeneous platforms in a distributed environment. There are presently three user interfaces that connect to the NELS engine: an X Windows interface, an ASCII interface, and the Spatial Data Management System. This paper describes the design and operation of NELS as an information management tool and repository.

  20. Agile Task Tracking Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duke, Roger T.; Crump, Thomas Vu

    The work was created to provide a tool for improving the management of tasks associated with Agile projects. Agile projects are typically completed in an iterative manner, with many short-duration tasks performed as part of iterations generally referred to as sprints. The objective of this work is to create a single tool that enables sprint teams to manage all of their tasks across multiple sprints and automatically produce all standard sprint performance charts with minimum effort. The format of the printed work is designed to mimic a standard Kanban board. The work is developed as a single Excel file with worksheets capable of managing up to five concurrent sprints and up to one hundred tasks. It also includes a summary worksheet providing performance information from all active sprints. There are many commercial project management systems, typically designed with features desired by larger organizations with many resources managing multiple programs and projects. The audience for this work is the small organizations and Agile project teams desiring an inexpensive, simple, user-friendly task management tool. This work uses standard, readily available software (Excel), requiring minimum data entry and automatically creating summary charts and performance data. It is formatted to print out and resemble standard flip charts and provide the visuals associated with this type of work.

  1. GCE Data Toolbox for MATLAB - a software framework for automating environmental data processing, quality control and documentation

    NASA Astrophysics Data System (ADS)

    Sheldon, W.; Chamblee, J.; Cary, R. H.

    2013-12-01

    Environmental scientists are under increasing pressure from funding agencies and journal publishers to release quality-controlled data in a timely manner, as well as to produce comprehensive metadata for submitting data to long-term archives (e.g. DataONE, Dryad and BCO-DMO). At the same time, the volume of digital data that researchers collect and manage is increasing rapidly due to advances in high-frequency electronic data collection from flux towers, instrumented moorings and sensor networks. However, few pre-built software tools are available to meet these data management needs, and those tools that do exist typically focus on one part of the data management lifecycle or one class of data. The GCE Data Toolbox has proven to be both a generalized and an effective software solution for environmental data management in the Long Term Ecological Research Network (LTER). This open source MATLAB software library, developed by the Georgia Coastal Ecosystems LTER program, integrates metadata capture, creation and management with data processing, quality control and analysis to support the entire data lifecycle. Raw data can be imported directly from common data logger formats (e.g. SeaBird, Campbell Scientific, YSI, Hobo), as well as delimited text files, MATLAB files and relational database queries. Basic metadata are derived from the data source itself (e.g. parsed from file headers) and by value inspection, and then augmented using editable metadata templates containing boilerplate documentation, attribute descriptors, code definitions and quality control rules. Data and metadata content, quality control rules and qualifier flags are then managed together in a robust data structure that supports database functionality and ensures data validity throughout processing. A growing suite of metadata-aware editing, quality control, analysis and synthesis tools is provided with the software to support managing data using graphical forms and command-line functions, as well as developing automated workflows for unattended processing. Finalized data and structured metadata can be exported in a wide variety of text and MATLAB formats or uploaded to a relational database for long-term archiving and distribution. The GCE Data Toolbox can be used as a complete, lightweight solution for environmental data and metadata management, but it can also be used in conjunction with other cyberinfrastructure to provide a more comprehensive solution. For example, newly acquired data can be retrieved from a Data Turbine or Campbell LoggerNet Database server for quality control and processing, then transformed to CUAHSI Observations Data Model format and uploaded to a HydroServer for distribution through the CUAHSI Hydrologic Information System. The GCE Data Toolbox can also be leveraged in analytical workflows developed using Kepler or other systems that support MATLAB integration or tool chaining. This software can therefore be leveraged in many ways to help researchers manage, analyze and distribute the data they collect.
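
    To illustrate the template-driven quality control pattern described above (in Python for brevity; the toolbox itself is MATLAB), the sketch below applies invented range and missing-value rules from a metadata template and records qualifier flags alongside the data.

        # Illustration of template-driven QC flagging: rules from a
        # metadata template assign qualifier flags that travel with the
        # data. The rule syntax and values are invented for this sketch.
        import numpy as np

        template = {"salinity": [("range", 0.0, 40.0), ("missing", -999.0)]}

        def apply_rules(name, values, template):
            flags = np.full(values.shape, "", dtype=object)
            for rule in template.get(name, []):
                if rule[0] == "range":
                    flags[(values < rule[1]) | (values > rule[2])] = "Q"  # questionable
                elif rule[0] == "missing":
                    flags[values == rule[1]] = "M"                        # missing
            return flags

        vals = np.array([12.5, -999.0, 55.2, 33.1])
        print(apply_rules("salinity", vals, template))   # ['' 'M' 'Q' '']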

  2. Outside the Framework of Thinkable Thought: The Modern Orchestration Project

    ERIC Educational Resources Information Center

    Gattegno, Eliot Aron

    2010-01-01

    In today's world of too much information, context--not content--is king. This proposal is for the development of an unparalleled sonic analysis tool that converts audio files into musical score notation, and a Web site (API) to collect, manage, and preserve information about the musical sounds analyzed, as well as music scores, videos, and articles…

  3. 75 FR 81689 - Self-Regulatory Organizations; Notice of Filing of Proposed Rule Change by NASDAQ OMX PHLX LLC To...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-28

    ... closely tailor their investment and risk management strategies and decisions. Furthermore, the Exchange... powerful tool for hedging a market sector, and that various strategies that the investor put into play were... to provide investors with additional short term option classes for investment, trading, and risk...

  4. Accessing and distributing EMBL data using CORBA (common object request broker architecture).

    PubMed

    Wang, L; Rodriguez-Tomé, P; Redaschi, N; McNeil, P; Robinson, A; Lijnzaad, P

    2000-01-01

    The EMBL Nucleotide Sequence Database is a comprehensive database of DNA and RNA sequences and related information traditionally made available in flat-file format. Queries through tools such as SRS (Sequence Retrieval System) also return data in flat-file format. Flat files have a number of shortcomings, however, and the resources therefore currently lack a flexible environment to meet individual researchers' needs. The Object Management Group's common object request broker architecture (CORBA) is an industry standard that provides platform-independent programming interfaces and models for portable distributed object-oriented computing applications. Its independence from programming languages, computing platforms and network protocols makes it attractive for developing new applications for querying and distributing biological data. A CORBA infrastructure developed by EMBL-EBI provides an efficient means of accessing and distributing EMBL data. The EMBL object model is defined such that it provides a basis for specifying interfaces in interface definition language (IDL) and thus for developing the CORBA servers. The mapping from the object model to the relational schema in the underlying Oracle database uses the facilities provided by PersistenceTM, an object/relational tool. The techniques of developing loaders and 'live object caching' with persistent objects achieve a smart live object cache where objects are created on demand. The objects are managed by an evictor pattern mechanism. The CORBA interfaces to the EMBL database address some of the problems of traditional flat-file formats and provide an efficient means for accessing and distributing EMBL data. CORBA also provides a flexible environment for users to develop their applications by building clients to our CORBA servers, which can be integrated into existing systems.

  5. Accessing and distributing EMBL data using CORBA (common object request broker architecture)

    PubMed Central

    Wang, Lichun; Rodriguez-Tomé, Patricia; Redaschi, Nicole; McNeil, Phil; Robinson, Alan; Lijnzaad, Philip

    2000-01-01

    Background: The EMBL Nucleotide Sequence Database is a comprehensive database of DNA and RNA sequences and related information traditionally made available in flat-file format. Queries through tools such as SRS (Sequence Retrieval System) also return data in flat-file format. Flat files have a number of shortcomings, however, and the resources therefore currently lack a flexible environment to meet individual researchers' needs. The Object Management Group's common object request broker architecture (CORBA) is an industry standard that provides platform-independent programming interfaces and models for portable distributed object-oriented computing applications. Its independence from programming languages, computing platforms and network protocols makes it attractive for developing new applications for querying and distributing biological data. Results: A CORBA infrastructure developed by EMBL-EBI provides an efficient means of accessing and distributing EMBL data. The EMBL object model is defined such that it provides a basis for specifying interfaces in interface definition language (IDL) and thus for developing the CORBA servers. The mapping from the object model to the relational schema in the underlying Oracle database uses the facilities provided by PersistenceTM, an object/relational tool. The techniques of developing loaders and 'live object caching' with persistent objects achieve a smart live object cache where objects are created on demand. The objects are managed by an evictor pattern mechanism. Conclusions: The CORBA interfaces to the EMBL database address some of the problems of traditional flat-file formats and provide an efficient means for accessing and distributing EMBL data. CORBA also provides a flexible environment for users to develop their applications by building clients to our CORBA servers, which can be integrated into existing systems. PMID:11178259

  6. Integration experiences and performance studies of a COTS parallel archive system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Hsing-bung; Scott, Cody; Grider, Gary

    2010-01-01

    Current and future Archive Storage Systems have been asked to (a) scale to very high bandwidths, (b) scale in metadata performance, (c) support policy-based hierarchical storage management capability, (d) scale in supporting changing needs of very large data sets, (e) support standard interfaces, and (f) utilize commercial-off-the-shelf (COTS) hardware. Parallel file systems have been asked to do the same thing but at one or more orders of magnitude faster in performance. Archive systems continue to move closer to file systems in their design due to the need for speed and bandwidth, especially metadata searching speeds, adopting more caching and less robust semantics. Currently the number of extremely scalable parallel archive solutions is very small, especially those that will move a single large striped parallel disk file onto many tapes in parallel. We believe that a hybrid storage approach of using COTS components and innovative software technology can bring new capabilities into a production environment for the HPC community much faster than the approach of creating and maintaining a complete end-to-end unique parallel archive software solution. In this paper, we relay our experience of integrating a global parallel file system and a standard backup/archive product with a very small amount of additional code to provide a scalable, parallel archive. Our solution has a high degree of overlap with current parallel archive products, including (a) doing parallel movement to/from tape for a single large parallel file, (b) hierarchical storage management, (c) ILM features, (d) high volume (non-single parallel file) archives for backup/archive/content management, and (e) leveraging all free file movement tools in Linux such as copy, move, ls, tar, etc. We have successfully applied our working COTS Parallel Archive System to the current world's first petaflop/s computing system, LANL's Roadrunner, and demonstrated its capability to address requirements of future archival storage systems.

  7. Integration experiments and performance studies of a COTS parallel archive system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Hsing-bung; Scott, Cody; Grider, Gary

    2010-06-16

    Current and future Archive Storage Systems have been asked to (a) scale to very high bandwidths, (b) scale in metadata performance, (c) support policy-based hierarchical storage management capability, (d) scale in supporting changing needs of very large data sets, (e) support standard interfaces, and (f) utilize commercial-off-the-shelf (COTS) hardware. Parallel file systems have been asked to do the same thing but at one or more orders of magnitude faster in performance. Archive systems continue to move closer to file systems in their design due to the need for speed and bandwidth, especially metadata searching speeds, adopting more caching and less robust semantics. Currently the number of extremely scalable parallel archive solutions is very small, especially those that will move a single large striped parallel disk file onto many tapes in parallel. We believe that a hybrid storage approach of using COTS components and innovative software technology can bring new capabilities into a production environment for the HPC community much faster than the approach of creating and maintaining a complete end-to-end unique parallel archive software solution. In this paper, we relay our experience of integrating a global parallel file system and a standard backup/archive product with a very small amount of additional code to provide a scalable, parallel archive. Our solution has a high degree of overlap with current parallel archive products, including (a) doing parallel movement to/from tape for a single large parallel file, (b) hierarchical storage management, (c) ILM features, (d) high volume (non-single parallel file) archives for backup/archive/content management, and (e) leveraging all free file movement tools in Linux such as copy, move, ls, tar, etc. We have successfully applied our working COTS Parallel Archive System to the current world's first petaflop/s computing system, LANL's Roadrunner machine, and demonstrated its capability to address requirements of future archival storage systems.

  8. Electronic Handbooks Simplify Process Management

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Getting a multitude of people to work together to manage processes across many organizations (for example, flight projects, research, technologies, or data centers) is not an easy task. Just ask Dr. Barry E. Jacobs, a research computer scientist at Goddard Space Flight Center. He helped NASA develop a process management solution that provided documenting tools for process developers and participants to help them quickly learn, adapt, test, and teach their views. Some of these tools included editable files for subprocess descriptions, document descriptions, role guidelines, manager worksheets, and references. First utilized for NASA's Headquarters Directives Management process, the approach led to the invention of a concept called the Electronic Handbook (EHB). This EHB concept was successfully applied to NASA's Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) programs, among other NASA programs. Several Federal agencies showed interest in the concept, so Jacobs and his team visited these agencies to show them how their specific processes could be managed by the methodology, as well as to create mockup versions of the EHBs.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Edward J., Jr.; Henry, Karen Lynne

    Sandia National Laboratories develops technologies to (1) sustain, modernize, and protect our nuclear arsenal; (2) prevent the spread of weapons of mass destruction; (3) provide new capabilities to our armed forces; (4) protect our national infrastructure; (5) ensure the stability of our nation's energy and water supplies; and (6) defend our nation against terrorist threats. We identified the need for a single overarching Integrated Workplace Management System (IWMS) that would enable us to focus on customer missions and improve FMOC processes. Our team selected highly configurable commercial-off-the-shelf (COTS) software with out-of-the-box workflow processes that integrates strategic planning, project management, facility assessments, and space management, and can interface with existing systems such as Oracle, PeopleSoft, Maximo, Bentley, and FileNet. We selected the Integrated Workplace Management System (IWMS) from Tririga, Inc. The benefits of the Facility Management System (FMS) are to (1) create a single reliable source for facility data; (2) improve transparency with oversight organizations; (3) streamline FMOC business processes with a single, integrated facility-management tool; (4) give customers simple tools and real-time information; (5) reduce indirect costs; (6) replace approximately 30 FMOC systems and 60 homegrown tools (such as Microsoft Access databases); and (7) integrate with FIMS.

  10. Installation and management of the SPS and LEP control system computers

    NASA Astrophysics Data System (ADS)

    Bland, Alastair

    1994-12-01

    Control of the CERN SPS and LEP accelerators and service equipment on the two CERN main sites is performed via workstations, file servers, Process Control Assemblies (PCAs) and Device Stub Controllers (DSCs). This paper describes the methods and tools that have been developed to manage the file servers, PCAs and DSCs since the LEP startup in 1989. There are five operational DECstation 5000s used as file servers and boot servers for the PCAs and DSCs. The PCAs consist of 90 SCO Xenix 386 PCs, 40 LynxOS 486 PCs and more than 40 older NORD 100s. The DSCs consist of 90 OS-9 68030 VME crates and 10 LynxOS 68030 VME crates. In addition there are over 100 development systems. The controls group is responsible for installing the computers, starting all the user processes and ensuring that the computers and the processes run correctly. The operators in the SPS/LEP control room and the Services control room have a Motif-based X window program which gives them, in real time, the state of all the computers and allows them to solve problems or reboot them.

  11. ABM Drag_Pass Report Generator

    NASA Technical Reports Server (NTRS)

    Fisher, Forest; Gladden, Roy; Khanampornpan, Teerapat

    2008-01-01

    dragREPORT software was developed in parallel with abmREPORT, which is described in the preceding article; both programs were built on the capabilities created during that process. This tool generates a drag_pass report that summarizes vital information from the MRO aerobraking drag_pass build process, both to facilitate sequence reviews and to provide a high-level summarization of the sequence for mission management. The script extracts information from the ENV, SSF, FRF, SCMFmax, and OPTG files, presenting it in a single, easy-to-check report that provides the majority of the parameters needed for cross-check and verification as part of the sequence review process. Prior to dragREPORT, all the needed information was spread across a number of different files, each in a different format. This software is a Perl script that extracts vital summarization information and build-process details from a number of source files into a single, concise report format used to aid the MPST sequence review process and to provide a high-level summarization of the sequence for mission management reference. This software could be adapted for future aerobraking missions to provide similar report, review, and summarization information.
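
    The consolidation pattern described above (pulling named parameters out of several differently formatted source files into one concise report) can be sketched as follows. The file names, the "KEY = VALUE" format, and the parameter names are invented for the sketch; the real ENV, SSF, FRF, SCMFmax, and OPTG formats differ.

        # Sketch of consolidating parameters scattered across several
        # source files into one report. Inputs are invented placeholders.
        from pathlib import Path

        Path("env.txt").write_text("PERIAPSIS_ALT = 102.3 km\n")
        Path("optg.txt").write_text("DRAG_PASS_START = 2007-123T04:05:06\n")

        def parse_key_value(path):
            """Toy parser: one 'KEY = VALUE' pair per line."""
            pairs = {}
            for line in Path(path).read_text().splitlines():
                if "=" in line:
                    key, _, value = line.partition("=")
                    pairs[key.strip()] = value.strip()
            return pairs

        report = {}
        for src in ["env.txt", "optg.txt"]:
            report.update(parse_key_value(src))

        print("\n".join(f"{k}: {v}" for k, v in sorted(report.items())))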

  12. Management and display of four-dimensional environmental data sets using McIDAS

    NASA Technical Reports Server (NTRS)

    Hibbard, William L.; Santek, David; Suomi, Verner E.

    1990-01-01

    Over the past four years, great strides have been made in the areas of data management and display of 4-D meteorological data sets. A survey was conducted of available and planned 4-D meteorological data sources. The data types were evaluated for their impact on the data management and display system. The requirements were analyzed for data base management generated by the 4-D data display system. The suitability of the existing data base management procedures and file structure were evaluated in light of the new requirements. Where needed, new data base management tools and file procedures were designed and implemented. The quality of the basic 4-D data sets was assured. The interpolation and extrapolation techniques of the 4-D data were investigated. The 4-D data from various sources were combined to make a uniform and consistent data set for display purposes. Data display software was designed to create abstract line graphic 3-D displays. Realistic shaded 3-D displays were created. Animation routines for these displays were developed in order to produce a dynamic 4-D presentation. A prototype dynamic color stereo workstation was implemented. A computer functional design specification was produced based on interactive studies and user feedback.

  13. Momentum Management Tool for Low-Thrust Missions

    NASA Technical Reports Server (NTRS)

    Swenka, Edward R.; Smith, Brett A.; Vanelli, Charles A.

    2010-01-01

    A momentum management tool was designed for the Dawn low-thrust interplanetary spacecraft, en route to the asteroids Vesta and Ceres in an effort to better understand the early creation of the solar system. Momentum must be managed to ensure the spacecraft has enough control authority to perform necessary turns and hold a fixed inertial attitude against external torques. Along with torques from solar pressure and gravity gradients, ion-propulsion engines produce a torque about the thrust axis that must be countered by the four reaction wheel assemblies (RWAs). MomProf is a ground operations tool built to address these concerns. The momentum management tool was developed during initial checkout and early cruise, and has been refined to accommodate a wide range of momentum-management issues. With every activity or sequence, wheel speeds and the momentum state must be checked to avoid undesirable conditions and use of consumables. MomProf was developed to operate in the MATLAB environment. All data are loaded into MATLAB as a structure to provide consistent access to all inputs by individual functions within the tool. Used in its most basic application, the Dawn momentum tool applies the principle of angular momentum conservation, computing momentum in the body frame, and RWA wheel speeds, for all given orientations in the input file. MomProf was designed specifically to handle the changing external torques and frequent desaturations. Incorporating significant external torques adds complexity, since various external torques act under different operational modes.
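
    The basic bookkeeping described above reduces to a few lines of linear algebra: the stored body-frame momentum is the wheel spin-axis matrix applied to the individual wheel momenta, and minimum-norm wheel speeds for a target momentum follow from the pseudo-inverse. The sketch below is in Python rather than MATLAB, and the wheel geometry and inertia are invented for illustration; they do not reflect Dawn's actual hardware.

        # Body-frame momentum from four reaction wheels, and the inverse
        # (minimum-norm) problem. Geometry and inertia are invented.
        import numpy as np

        J_w = 0.16                       # wheel spin inertia, kg m^2 (assumed)
        # unit spin axes of the 4 wheels as columns of a 3x4 matrix (assumed)
        A = np.array([[1, 0, 0, 0.577],
                      [0, 1, 0, 0.577],
                      [0, 0, 1, 0.577]])
        w = np.array([120.0, -80.0, 45.0, 10.0])   # wheel speeds, rad/s

        h_body = A @ (J_w * w)           # stored momentum in the body frame, N m s
        print(h_body)

        # Wheel speeds that store a desired momentum; 4 wheels give one
        # redundant degree of freedom, so take the minimum-norm solution.
        h_target = np.array([5.0, -2.0, 1.0])
        w_cmd = np.linalg.pinv(A) @ h_target / J_w
        print(w_cmd)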

  14. Management and development of local area network upgrade prototype

    NASA Technical Reports Server (NTRS)

    Fouser, T. J.

    1981-01-01

    Given management and development users who access a central computing facility and who also need local computation and storage, a commercially available networking system such as CP/NET from Digital Research provides the building blocks for connecting intelligent microsystems to file and print services. The major problems to be overcome in implementing such a network are the dearth of intelligent communication front-ends for the microcomputers and the lack of a rich set of management and software development tools.

  15. Computer assisted audit techniques for UNIX (UNIX-CAATS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polk, W.T.

    1991-12-31

    Federal and DOE regulations impose specific requirements for internal controls of computer systems. These controls include adequate separation of duties and sufficient controls for access to systems and data. The DOE Inspector General's Office has the responsibility to examine internal controls, as well as efficient use of computer system resources. As a result, DOE supported NIST development of computer assisted audit techniques to examine BSD UNIX computers (UNIX-CAATS). These systems were selected due to the increasing number of UNIX workstations in use within DOE. This paper describes the design and development of these techniques, as well as the results of testing at NIST and the first audit at a DOE site. UNIX-CAATS consists of tools which examine the security of passwords, file systems, and network access. In addition, a tool was developed to examine the efficiency of disk utilization. Test results at NIST indicated inadequate password management, as well as weak network resource controls. File system security was considered adequate. Audit results at a DOE site indicated weak password management and inefficient disk utilization. During the audit, we also found that improvements to UNIX-CAATS were needed when applied to large systems. NIST plans to enhance the techniques developed for DOE/IG in future work. This future work would leverage currently available tools, along with needed enhancements. These enhancements would enable DOE/IG to audit large systems, such as supercomputers.

  16. Computer assisted audit techniques for UNIX (UNIX-CAATS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polk, W.T.

    1991-01-01

    Federal and DOE regulations impose specific requirements for internal controls of computer systems. These controls include adequate separation of duties and sufficient controls for access to systems and data. The DOE Inspector General's Office has the responsibility to examine internal controls, as well as efficient use of computer system resources. As a result, DOE supported NIST development of computer assisted audit techniques to examine BSD UNIX computers (UNIX-CAATS). These systems were selected due to the increasing number of UNIX workstations in use within DOE. This paper describes the design and development of these techniques, as well as the results of testing at NIST and the first audit at a DOE site. UNIX-CAATS consists of tools which examine the security of passwords, file systems, and network access. In addition, a tool was developed to examine the efficiency of disk utilization. Test results at NIST indicated inadequate password management, as well as weak network resource controls. File system security was considered adequate. Audit results at a DOE site indicated weak password management and inefficient disk utilization. During the audit, we also found that improvements to UNIX-CAATS were needed when applied to large systems. NIST plans to enhance the techniques developed for DOE/IG in future work. This future work would leverage currently available tools, along with needed enhancements. These enhancements would enable DOE/IG to audit large systems, such as supercomputers.
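
    One file-system check of the kind these audit tools perform is easy to sketch: walk a directory tree and report world-writable files. The following Python fragment illustrates the idea only and is not part of UNIX-CAATS.

        # Walk a tree and report world-writable files, one common
        # file-system security check.
        import os
        import stat

        def world_writable(root="."):
            for dirpath, _, filenames in os.walk(root):
                for name in filenames:
                    path = os.path.join(dirpath, name)
                    try:
                        mode = os.stat(path).st_mode
                    except OSError:
                        continue                  # unreadable entry; skip it
                    if mode & stat.S_IWOTH:
                        yield path

        for path in world_writable("/tmp"):
            print("world-writable:", path)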

  17. Visualizing Economic Development with ArcGIS Explorer

    ERIC Educational Resources Information Center

    Webster, Megan L.; Milson, Andrew J.

    2011-01-01

    Numerous educators have noted that Geographic Information Systems (GIS) is a powerful tool for social studies teaching and learning. Yet the use of GIS has been hampered by issues such as the cost of the software and the management of large spatial data files. One trend that shows great promise for GIS in education is the move to cloud computing.…

  18. Digital aerial sketchmapping and downlink communications: a new tool for fire managers

    Treesearch

    Everett Hinkley; Tom Zajkowski; Charlie Schrader-Patton

    2010-01-01

    Aerial sketchmapping is the geolocating of features that are seen on the ground below an aircraft and the subsequent recording of those features. Traditional aerial sketchmapping methods required hand-sketching on hardcopy maps or photos and the translation of that information to a digital file. In 1996, the U.S. Department of Agriculture (USDA) Forest Service embarked...

  19. Ontology for Vector Surveillance and Management

    PubMed Central

    LOZANO-FUENTES, SAUL; BANDYOPADHYAY, ARITRA; COWELL, LINDSAY G.; GOLDFAIN, ALBERT; EISEN, LARS

    2013-01-01

    Ontologies, which are made up by standardized and defined controlled vocabulary terms and their interrelationships, are comprehensive and readily searchable repositories for knowledge in a given domain. The Open Biomedical Ontologies (OBO) Foundry was initiated in 2001 with the aims of becoming an “umbrella” for life-science ontologies and promoting the use of ontology development best practices. A software application (OBO-Edit; *.obo file format) was developed to facilitate ontology development and editing. The OBO Foundry now comprises over 100 ontologies and candidate ontologies, including the NCBI organismal classification ontology (NCBITaxon), the Mosquito Insecticide Resistance Ontology (MIRO), the Infectious Disease Ontology (IDO), the IDOMAL malaria ontology, and ontologies for mosquito gross anatomy and tick gross anatomy. We previously developed a disease data management system for dengue and malaria control programs, which incorporated a set of information trees built upon ontological principles, including a “term tree” to promote the use of standardized terms. In the course of doing so, we realized that there were substantial gaps in existing ontologies with regards to concepts, processes, and, especially, physical entities (e.g., vector species, pathogen species, and vector surveillance and management equipment) in the domain of surveillance and management of vectors and vector-borne pathogens. We therefore produced an ontology for vector surveillance and management, focusing on arthropod vectors and vector-borne pathogens with relevance to humans or domestic animals, and with special emphasis on content to support operational activities through inclusion in databases, data management systems, or decision support systems. The Vector Surveillance and Management Ontology (VSMO) includes >2,200 unique terms, of which the vast majority (>80%) were newly generated during the development of this ontology. One core feature of the VSMO is the linkage, through the has_vector relation, of arthropod species to the pathogenic microorganisms for which they serve as biological vectors. We also recognized and addressed a potential roadblock for use of the VSMO by the vector-borne disease community: the difficulty in extracting information from OBO-Edit ontology files (*.obo files) and exporting the information to other file formats. A novel ontology explorer tool was developed to facilitate extraction and export of information from the VSMO *.obo file into lists of terms and their associated unique IDs in *.txt or *.csv file formats. These lists can then be imported into a database or data management system for use as select lists with predefined terms. This is an important step to ensure that the knowledge contained in our ontology can be put into practical use. PMID:23427646

  20. Ontology for vector surveillance and management.

    PubMed

    Lozano-Fuentes, Saul; Bandyopadhyay, Aritra; Cowell, Lindsay G; Goldfain, Albert; Eisen, Lars

    2013-01-01

    Ontologies, which are made up by standardized and defined controlled vocabulary terms and their interrelationships, are comprehensive and readily searchable repositories for knowledge in a given domain. The Open Biomedical Ontologies (OBO) Foundry was initiated in 2001 with the aims of becoming an "umbrella" for life-science ontologies and promoting the use of ontology development best practices. A software application (OBO-Edit; *.obo file format) was developed to facilitate ontology development and editing. The OBO Foundry now comprises over 100 ontologies and candidate ontologies, including the NCBI organismal classification ontology (NCBITaxon), the Mosquito Insecticide Resistance Ontology (MIRO), the Infectious Disease Ontology (IDO), the IDOMAL malaria ontology, and ontologies for mosquito gross anatomy and tick gross anatomy. We previously developed a disease data management system for dengue and malaria control programs, which incorporated a set of information trees built upon ontological principles, including a "term tree" to promote the use of standardized terms. In the course of doing so, we realized that there were substantial gaps in existing ontologies with regards to concepts, processes, and, especially, physical entities (e.g., vector species, pathogen species, and vector surveillance and management equipment) in the domain of surveillance and management of vectors and vector-borne pathogens. We therefore produced an ontology for vector surveillance and management, focusing on arthropod vectors and vector-borne pathogens with relevance to humans or domestic animals, and with special emphasis on content to support operational activities through inclusion in databases, data management systems, or decision support systems. The Vector Surveillance and Management Ontology (VSMO) includes >2,200 unique terms, of which the vast majority (>80%) were newly generated during the development of this ontology. One core feature of the VSMO is the linkage, through the has_vector relation, of arthropod species to the pathogenic microorganisms for which they serve as biological vectors. We also recognized and addressed a potential roadblock for use of the VSMO by the vector-borne disease community: the difficulty in extracting information from OBO-Edit ontology files (*.obo files) and exporting the information to other file formats. A novel ontology explorer tool was developed to facilitate extraction and export of information from the VSMO *.obo file into lists of terms and their associated unique IDs in *.txt or *.csv file formats. These lists can then be imported into a database or data management system for use as select lists with predefined terms. This is an important step to ensure that the knowledge contained in our ontology can be put into practical use.
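
    The export step described in both records above (extracting term IDs and names from the [Term] stanzas of an *.obo file into a *.csv select list) can be sketched as follows. This Python fragment illustrates the idea and is not the ontology explorer tool itself; the stand-in *.obo content is invented.

        # Extract (id, name) pairs from [Term] stanzas and write a CSV.
        import csv
        from pathlib import Path

        # tiny stand-in ontology file so the demo runs end to end
        Path("vsmo.obo").write_text(
            "[Term]\nid: VSMO:0000001\nname: mosquito trap\n\n"
            "[Term]\nid: VSMO:0000002\nname: insecticide\n")

        def obo_terms(path):
            """Yield (id, name) pairs from the [Term] stanzas of an .obo file."""
            term_id = None
            with open(path, encoding="utf-8") as fh:
                for line in fh:
                    line = line.strip()
                    if line == "[Term]":
                        term_id = None
                    elif line.startswith("id:"):
                        term_id = line[3:].strip()
                    elif line.startswith("name:") and term_id:
                        yield term_id, line[5:].strip()

        with open("vsmo_terms.csv", "w", newline="") as out:
            writer = csv.writer(out)
            writer.writerow(["id", "term"])
            writer.writerows(obo_terms("vsmo.obo"))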

  1. Aerobraking Maneuver (ABM) Report Generator

    NASA Technical Reports Server (NTRS)

    Fisher, Forrest; Gladden, Roy; Khanampornpan, Teerapat

    2008-01-01

    abmREPORT Version 3.1 is a Perl script that extracts vital summarization information from the Mars Reconnaissance Orbiter (MRO) aerobraking ABM build process. This information facilitates sequence reviews and provides a high-level summarization of the sequence for mission management. The script extracts information from the ENV, SSF, FRF, SCMFmax, and OPTG files and from burn-magnitude configuration files, and presents it in a single, easy-to-check report that provides the majority of the parameters necessary for cross-check and verification during the sequence review process. This means that needed information, formerly spread across a number of different files, each in a different format, is all available in this one application. This program is built on the capabilities developed in dragREPORT, and the scripts then evolved as the two tools continued to be developed in parallel.

  2. ATLAS software configuration and build tool optimisation

    NASA Astrophysics Data System (ADS)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    The ATLAS software code base comprises over 6 million lines of code organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers, and is used by more than 2500 physicists from over 200 universities and laboratories on six continents. To meet the challenge of configuring and building this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and from project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build-time performance, which was optimised through several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at the package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of the CMT commands used for the build; and introduction of package-level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on optimisation of CMT commands in general, which made them approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CernVM-FS. The use of parallelism, caching and code optimisation significantly (by several times) reduced software build time and environment setup time, increased the efficiency of multi-core computing resource utilisation, and considerably improved the software developer and user experience.

  3. Object-oriented approach to fast display of electrophysiological data under MS-windows.

    PubMed

    Marion-Poll, F

    1995-12-01

    Microcomputers provide neuroscientists an alternative to a host of laboratory equipment for recording and analyzing electrophysiological data. Object-oriented programming tools bring an essential link between custom needs for data acquisition and analysis and general software packages. In this paper, we outline the layout of basic objects that display and manipulate electrophysiological data files. Visual inspection of the recordings is a basic requirement of any data analysis software. We present an approach that allows flexible and fast display of large data sets. This approach involves constructing an intermediate representation of the data in order to lower the number of points actually displayed while preserving the appearance of the data. The second group of objects is related to the management of lists of data files. Typical experiments designed to test the biological activity of pharmacological products include scores of files. Data manipulation and analysis are facilitated by creating multi-document objects that include the names of all experiment files. Implementation steps for both objects are described for an MS-Windows-hosted application.
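
    A standard way to build the kind of intermediate representation described above is min-max decimation: keep each bucket's extremes so that a long trace retains its visual envelope with far fewer plotted points. The Python sketch below (rather than the paper's MS-Windows objects) illustrates the technique.

        # Min-max decimation: preserve a long trace's visual envelope
        # with far fewer points to draw.
        import numpy as np

        def minmax_decimate(samples, n_buckets):
            """Return ~2*n_buckets points preserving the trace's envelope."""
            buckets = np.array_split(samples, n_buckets)
            out = []
            for b in buckets:
                out.extend((b.min(), b.max()))
            return np.array(out)

        signal = np.random.randn(1_000_000)          # a long recording
        display = minmax_decimate(signal, 2000)      # ~4000 points to draw
        print(display.shape)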

  4. On-Board File Management and Its Application in Flight Operations

    NASA Technical Reports Server (NTRS)

    Kuo, N.

    1998-01-01

    In this paper, the author presents the minimum functions required for an on-board file management system. We explore file manipulation processes and demonstrate how the file transfer along with the file management system will be utilized to support flight operations and data delivery.

  5. Defining Geodetic Reference Frame using Matlab®: PlatEMotion 2.0

    NASA Astrophysics Data System (ADS)

    Cannavò, Flavio; Palano, Mimmo

    2016-03-01

    We describe the main features of the developed software tool, namely PlatE-Motion 2.0 (PEM2), which allows inferring the Euler pole parameters by inverting the observed velocities at a set of sites located on a rigid block (the inverse problem). PEM2 also allows calculating the expected velocity for any point located on the Earth, given an Euler pole (the direct problem). PEM2 is the updated version of a previous software tool initially developed for easy-to-use file exchange with the GAMIT/GLOBK software package. The tool is developed in the Matlab® framework and, like the previous version, includes a set of MATLAB functions (m-files), GUIs (fig-files), map data files (mat-files) and a user's manual, as well as some example input files. Changes in PEM2 include (1) bug fixes, (2) improvements in the code, (3) improvements in the statistical analysis, and (4) new input/output file formats. In addition, PEM2 can now be run under the majority of operating systems. The tool is open source and freely available to the scientific community.
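
    The direct problem described above has a compact form: the velocity of a site on a rigid plate is v = ω × r, where ω is the rotation vector built from the Euler pole latitude, longitude and rotation rate, and r is the site position vector. The following Python sketch assumes a spherical Earth; the pole and site coordinates are illustrative numbers only, and PEM2 itself is a MATLAB tool.

        # Direct problem: plate velocity v = omega x r on a spherical Earth.
        import numpy as np

        R = 6371e3                                   # mean Earth radius, m

        def unit_vector(lat_deg, lon_deg):
            lat, lon = np.radians([lat_deg, lon_deg])
            return np.array([np.cos(lat) * np.cos(lon),
                             np.cos(lat) * np.sin(lon),
                             np.sin(lat)])

        def plate_velocity(pole_lat, pole_lon, rate_deg_per_myr,
                           site_lat, site_lon):
            omega = (np.radians(rate_deg_per_myr) / 1e6
                     * unit_vector(pole_lat, pole_lon))   # rad/yr
            r = R * unit_vector(site_lat, site_lon)
            return np.cross(omega, r)                     # m/yr, ECEF frame

        v = plate_velocity(48.8, -106.5, 0.21, 37.7, 15.0)  # illustrative only
        print(v * 1000, "mm/yr")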

  6. DATA-MEAns: an open source tool for the classification and management of neural ensemble recordings.

    PubMed

    Bonomini, María P; Ferrandez, José M; Bolea, Jose Angel; Fernandez, Eduardo

    2005-10-30

    The number of laboratories using techniques that acquire simultaneous recordings from as many units as possible is increasing considerably. However, the development of tools used to analyse this multi-neuronal activity generally lags behind the development of the tools used to acquire the data. Moreover, data exchange between research groups using different multielectrode acquisition systems is hindered by commercial constraints such as proprietary file structures, high-priced licenses and strict intellectual-property policies. This paper presents free, open-source software for the classification and management of neural ensemble data. The main goal is to provide a graphical user interface that links the experimental data to a basic set of routines for analysis, visualization and classification in a consistent framework. To facilitate adaptation and extension, as well as the addition of new routines, tools and algorithms for data analysis, the source code and documentation are freely available.

  7. ISTP CDF Skeleton Editor

    NASA Technical Reports Server (NTRS)

    Chimiak, Reine; Harris, Bernard; Williams, Phillip

    2013-01-01

    Basic Common Data Format (CDF) tools (e.g., cdfedit and skeletoncdf) provide no specific support for creating International Solar-Terrestrial Physics/Space Physics Data Facility (ISTP/SPDF) standard files. While it is possible for someone who is familiar with the ISTP/SPDF metadata guidelines to create compliant files using just the basic tools, the process is error-prone and unreasonable for someone without ISTP/SPDF expertise. The key problem is the lack of a tool with specific support for creating files that comply with the ISTP/SPDF guidelines. The SPDF ISTP CDF skeleton editor addresses this: it is a cross-platform, Java-based GUI editor program that allows someone with only a basic understanding of the ISTP/SPDF guidelines to easily create compliant files. The editor is a simple graphical user interface (GUI) application for creating and editing ISTP/SPDF guideline-compliant skeleton CDF files. It consists of the following components: a Swing-based Java GUI program, a JavaHelp-based manual/tutorial, image/icon files, and an HTML Web page for distribution. The editor is available as a traditional Java desktop application as well as a Java Network Launching Protocol (JNLP) application. Once started, it functions like a typical Java GUI file editor application for creating/editing application-unique files.

  8. Modeling Tools for Propulsion Analysis and Computational Fluid Dynamics on the Internet

    NASA Technical Reports Server (NTRS)

    Muss, J. A.; Johnson, C. W.; Gotchy, M. B.

    2000-01-01

    The existing RocketWeb™ Internet Analysis System (http://www.johnsonrockets.com/rocketweb) provides an integrated set of advanced analysis tools that can be securely accessed over the Internet. Since these tools consist of both batch and interactive analysis codes, the system includes convenient methods for creating input files and evaluating the resulting data. The RocketWeb™ system also contains many features that permit data sharing which, when further developed, will facilitate real-time, geographically diverse, collaborative engineering within a designated work group. Adding work-group management functionality while simultaneously extending and integrating the system's set of design and analysis tools will create a system providing rigorous, controlled design development, reducing design cycle time and cost.

  9. The ATLAS TAGS database distribution and management - Operational challenges of a multi-terabyte distributed database

    NASA Astrophysics Data System (ADS)

    Viegas, F.; Malon, D.; Cranshaw, J.; Dimitrov, G.; Nowak, M.; Nairz, A.; Goossens, L.; Gallas, E.; Gamboa, C.; Wong, A.; Vinek, E.

    2010-04-01

    The TAG files store summary event quantities that allow a quick selection of interesting events. These data are produced at a nominal rate of 200 Hz and uploaded into a relational database for access from websites and other tools. The estimated database volume is 6 TB per year, making it the largest application running on the ATLAS relational databases, at CERN and at other voluntary sites. The sheer volume and high rate of production make this application a challenge to data and resource management in many aspects. This paper focuses on the operational challenges of this system. These include: uploading the data from files to CERN's and remote sites' databases; distributing the TAG metadata that is essential to guide the user through event selection; and controlling resource usage of the database, from the user query load to the strategy for cleaning and archiving old TAG data.

  10. FLASH Interface; a GUI for managing runtime parameters in FLASH simulations

    NASA Astrophysics Data System (ADS)

    Walker, Christopher; Tzeferacos, Petros; Weide, Klaus; Lamb, Donald; Flocke, Norbert; Feister, Scott

    2017-10-01

    We present FLASH Interface, a novel graphical user interface (GUI) for managing runtime parameters in simulations performed with the FLASH code. FLASH Interface supports full text search of available parameters; provides descriptions of each parameter's role and function; allows for the filtering of parameters based on categories; performs input validation; and maintains all comments and non-parameter information already present in existing parameter files. The GUI can be used to edit existing parameter files or generate new ones. FLASH Interface is open source and was implemented with the Electron framework, making it available on Mac OSX, Windows, and Linux operating systems. The new interface lowers the entry barrier for new FLASH users and provides an easy-to-use tool for experienced FLASH simulators. U.S. Department of Energy (DOE), NNSA ASC/Alliances Center for Astrophysical Thermonuclear Flashes, U.S. DOE NNSA ASC through the Argonne Institute for Computing in Science, U.S. National Science Foundation.
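
    One task such a GUI must perform is reading a runtime parameter file of "name = value" lines while preserving comments and non-parameter lines, so that edits can be written back without losing anything. The Python sketch below illustrates that round-trip idea under the assumption of a simple "name = value" grammar with "#" comments; the actual FLASH parameter-file grammar may differ in detail.

        # Read a "name = value" parameter file, keeping comments and
        # unrecognized lines intact so a file can be written back whole.
        def read_par(path):
            entries = []                  # (kind, payload) preserves file order
            with open(path) as fh:
                for raw in fh:
                    line = raw.rstrip("\n")
                    stripped = line.split("#", 1)[0].strip()
                    if "=" in stripped:
                        name, _, value = stripped.partition("=")
                        entries.append(("param",
                                        (name.strip(), value.strip(), line)))
                    else:
                        entries.append(("other", line))   # comment or blank
            return entries

        def write_par(path, entries):
            with open(path, "w") as fh:
                for kind, payload in entries:
                    fh.write((payload[2] if kind == "param" else payload) + "\n")

        with open("demo.par", "w") as fh:
            fh.write("# runtime parameters\nxmax = 1.0   # domain size\n")
        write_par("demo_copy.par", read_par("demo.par"))   # round trip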

  11. Software Management for the NOνA Experiment

    NASA Astrophysics Data System (ADS)

    Davies, G. S.; Davies, J. P.; C Group; Rebel, B.; Sachdev, K.; Zirnstein, J.

    2015-12-01

    The NOνA software (NOνASoft) is written in C++ and built on the Fermilab Computing Division's art framework, which uses ROOT analysis software. NOνASoft makes use of more than 50 external software packages, is developed by more than 50 developers, and is used by more than 100 physicists from over 30 universities and laboratories on 3 continents. The software builds are handled by Fermilab's custom version of Software Release Tools (SRT), a UNIX-based software management system for large, collaborative projects that is used by several experiments at Fermilab. The system provides software version control with SVN configured in a client-server mode and is based on the code originally developed by the BaBar collaboration. In this paper, we present efforts towards distributing the NOνA software via the CernVM File System, a distributed file system. We also describe our recent work to use a CMake build system and Jenkins, the open source continuous integration system, for NOνASoft.

  12. Toward information management in corporations (2)

    NASA Astrophysics Data System (ADS)

    Shibata, Mitsuru

    If the construction of in-house information management systems in an advanced information society is to be positioned alongside information management in society at large, laying the groundwork begins with a review of current paper filing systems. Since the problems inherent in in-house information management systems that use office-automation equipment are also inherent in paper filing systems, the first step toward full-scale in-house information management is to identify and solve the fundamental problems in current filing systems. This paper describes an analysis of the fundamental problems in filing systems, the creation of new types of offices, an analysis of the need for improvements in filing systems, and some points to consider when improving them.

  13. The collaboratory for MS3D: a new cyberinfrastructure for the structural elucidation of biological macromolecules and their assemblies using mass spectrometry-based approaches.

    PubMed

    Yu, Eizadora T; Hawkins, Arie; Kuntz, Irwin D; Rahn, Larry A; Rothfuss, Andrew; Sale, Kenneth; Young, Malin M; Yang, Christine L; Pancerella, Carmen M; Fabris, Daniele

    2008-11-01

    Modern biomedical research is evolving with the rapid growth of diverse data types, biophysical characterization methods, computational tools and extensive collaboration among researchers spanning various communities and having complementary backgrounds and expertise. Collaborating researchers are increasingly dependent on shared data and tools made available by other investigators with common interests, thus forming communities that transcend the traditional boundaries of the single research laboratory or institution. Barriers, however, remain to the formation of these virtual communities, usually due to the steep learning curve associated with becoming familiar with new tools, or to the difficulties associated with transferring data between tools. Recognizing the need for shared reference data and analysis tools, we are developing an integrated knowledge environment that supports productive interactions among researchers. Here we report on our current collaborative environment, which focuses on bringing together structural biologists working in the area of mass spectrometry-based methods for the analysis of tertiary and quaternary macromolecular structures (MS3D), called the Collaboratory for MS3D (C-MS3D). C-MS3D is a Web portal designed to provide collaborators with a shared work environment that integrates data storage and management with data analysis tools. Files are stored and archived along with pertinent metadata in such a way as to allow file handling to be tracked (data provenance) and data files to be searched using keywords and modification dates. While at this time the portal is designed around a specific application, the shared work environment is a general approach to building collaborative work groups. The goal is not only to provide a common data sharing and archiving system, but also to assist in the building of new collaborations and to spur the development of new tools and technologies.

  14. Preliminary ISIS users manual

    NASA Technical Reports Server (NTRS)

    Grantham, C.

    1979-01-01

    The Interactive Software Invocation System (ISIS), an interactive data management system, was developed to act as a buffer between the user and the host computer system. ISIS provides the user with a powerful system for developing software or systems in an interactive environment. The user is protected from the idiosyncrasies of the host computer system by a range of capabilities so complete that the user should have no need for direct access to the host computer. These capabilities are divided into four areas: desk-top calculator, data editor, file manager, and tool invoker.

  15. Predicting Operator Execution Times Using CogTool

    NASA Technical Reports Server (NTRS)

    Santiago-Espada, Yamira; Latorella, Kara A.

    2013-01-01

    Researchers and developers of NextGen systems can use predictive human performance modeling tools as an initial approach to obtaining skilled-user performance times analytically, before system testing with users. This paper describes the CogTool models for a two-pilot crew executing two different types of datalink clearance acceptance tasks on two different simulation platforms. The CogTool time estimates for accepting and executing Required Time of Arrival and Interval Management clearances were compared to empirical data observed on videotapes and recorded in simulation files. Results indicate no statistically significant difference between the empirical data and the CogTool predictions. A population comparison test found no significant differences between the CogTool estimates and the empirical execution times for any of the four test conditions. We discuss modeling caveats and considerations for applying CogTool to crew performance modeling in advanced cockpit environments.

  16. ArrayInitiative - a tool that simplifies creating custom Affymetrix CDFs

    PubMed Central

    2011-01-01

    Background: Probes on a microarray represent a frozen view of a genome and are quickly outdated when new sequencing studies extend our knowledge, resulting in significant measurement error when analyzing any microarray experiment. There are several bioinformatics approaches to improve probe assignments, but without in-house programming expertise, standardizing these custom array specifications as a usable file (e.g. as Affymetrix CDFs) is difficult, owing mostly to the complexity of the specification file format. However, without correctly standardized files there is a significant barrier for testing competing analysis approaches since this file is one of the required inputs for many commonly used algorithms. The need to test combinations of probe assignments and analysis algorithms led us to develop ArrayInitiative, a tool for creating and managing custom array specifications. Results: ArrayInitiative is a standalone, cross-platform, rich client desktop application for creating correctly formatted, custom versions of manufacturer-provided (default) array specifications, requiring only minimal knowledge of the array specification rules and file formats. Users can import default array specifications, import probe sequences for a default array specification, design and import a custom array specification, export any array specification to multiple output formats, export the probe sequences for any array specification and browse high-level information about the microarray, such as version and number of probes. The initial release of ArrayInitiative supports the Affymetrix 3' IVT expression arrays we currently analyze, but as an open source application, we hope that others will contribute modules for other platforms. Conclusions: ArrayInitiative allows researchers to create new array specifications, in a standard format, based upon their own requirements. This makes it easier to test competing design and analysis strategies that depend on probe definitions. Since the custom array specifications are easily exported to the manufacturer's standard format, researchers can analyze these customized microarray experiments using established software tools, such as those available in Bioconductor. PMID:21548938

  17. CMS Configuration Editor: GUI based application for user analysis job

    NASA Astrophysics Data System (ADS)

    de Cosa, A.

    2011-12-01

    We present the user interface and the software architecture of the Configuration Editor for the CMS experiment. The analysis workflow is organized in a modular way within the CMS framework, which organizes user analysis code flexibly. The Python scripting language is adopted to define the job configuration that drives the analysis workflow. Developing analysis jobs and managing the configuration of the many required modules can be a challenging task for users, especially newcomers. For this reason a graphical tool has been conceived to edit and inspect configuration files. A set of common analysis tools defined in the CMS Physics Analysis Toolkit (PAT) can be steered and configured using the Config Editor. A user-defined analysis workflow can be produced starting from a standard configuration file, applying and configuring PAT tools according to the specific user requirements. CMS users can adopt this tool, the Config Editor, to create their analyses while visualizing the effects of their actions in real time. They can visualize the structure of their configuration, inspect the modules included in the workflow, examine the dependencies among the modules, and check the data flow. They can see the values to which parameters are set and change them as required by their analysis task. Integrating common tools into the GUI required adopting an object-oriented structure in the Python definition of the PAT tools and defining a layer of abstraction from which all PAT tools inherit.
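
    The editor itself is CMS-specific, but the underlying pattern is general: each tool is an object that exposes its parameters to the GUI and knows how to apply itself to a job configuration. The sketch below illustrates that pattern in plain Python; the class names and fields are illustrative assumptions, not the actual PAT tool interfaces.

      # Generic sketch of a GUI-steerable configuration-tool layer
      # (illustrative names, not the CMS PAT API).

      class ConfigTool:
          """Base class: every tool can describe its parameters to a GUI
          and apply itself to a job configuration."""

          def __init__(self, **parameters):
              self.parameters = parameters

          def describe(self):
              return {type(self).__name__: self.parameters}

          def apply(self, config):
              raise NotImplementedError

      class AddJetCollection(ConfigTool):
          """Hypothetical tool that appends a module to the workflow."""

          def apply(self, config):
              config.setdefault("modules", []).append(
                  ("jets", dict(self.parameters)))

      config = {"source": "input.root", "modules": []}
      tool = AddJetCollection(algo="ak7")
      print(tool.describe())   # what the GUI would render as editable fields
      tool.apply(config)       # what happens when the user confirms
      print(config)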

  18. Divergence Measures Tool:An Introduction with Brief Tutorial

    DTIC Science & Technology

    2014-03-01

    in detecting differences across a wide range of Arabic-language text files (they varied by genre, domain, spelling variation, size, etc.), our...other. These measures have been put to many uses in natural language processing (NLP). In the evaluation of machine translation (MT...files uploaded into the tool must be .txt files in ASCII or UTF-8 format. • This tool has been tested on English and Arabic script, but should

  19. Preliminary digital map of cryptocrystalline occurrences in northern Nevada

    USGS Publications Warehouse

    Moyer, Lorre A.

    1999-01-01

    The purpose was to identify potential cryptocrystalline material sources for tools used by indigenous people of the northern Nevada portion of the Great Basin. Cryptocrystalline occurrence data combed from the U.S. Geological Survey's Mineral Resources Data System (MRDS, 1995) were combined with sites described in Nevada rockhound guides and entered into a geographic information system (GIS). The map area encompasses northern Nevada (fig. 1). This open-file report describes the methods used to convert cryptocrystalline occurrence data into a digital format, documents the file structures, and explains how to download the digital files from the U.S. Geological Survey's World Wide Web site. Uses of the spatial dataset include, but are not limited to, natural and cultural resource management, interdisciplinary activities, recreational rockhounding, and gold exploration. It is important to note that the accuracy of the spatial data varies widely, and for some purposes, field checks are advised.

  20. An easy and effective approach to manage radiologic portable document format (PDF) files using iTunes.

    PubMed

    Qian, Li Jun; Zhou, Mi; Xu, Jian Rong

    2008-07-01

    The objective of this article is to explain an easy and effective approach for managing radiologic files in portable document format (PDF) using iTunes. PDF files are widely used as a standard file format for electronic publications as well as for medical online documents. Unfortunately, there is a lack of powerful software to manage numerous PDF documents. In this article, we explain how to use the hidden function of iTunes (Apple Computer) to manage PDF documents as easily as managing music files.

  1. The IEO Data Center Management System: Tools for quality control, analysis and access marine data

    NASA Astrophysics Data System (ADS)

    Casas, Antonia; Garcia, Maria Jesus; Nikouline, Andrei

    2010-05-01

    Since 1994 the Data Centre of the Spanish Oceanographic Institute (IEO) has developed systems for archiving and quality control of oceanographic data. The work started in the frame of the European Marine Science & Technology Programme (MAST), when a consortium of several Mediterranean data centres began to work on the MEDATLAS project. Over the years, old software modules for MS-DOS were rewritten, improved and migrated to the Windows environment. Oceanographic quality control now covers not only vertical profiles (mainly CTD and bottle observations) but also time series of currents and sea-level observations. New powerful routines for analysis and graphic visualization were added. Data presented originally in ASCII format were recently organized in an open source MySQL database. Nowadays the IEO, as part of the SeaDataNet infrastructure, has designed and developed a new information system, consistent with the ISO 19115 and SeaDataNet standards, to manage the large and diverse marine data and information originated in Spain by different sources, and to interoperate with SeaDataNet. The system works with data stored in ASCII files (MEDATLAS, ODV) as well as data stored within the relational database. The components of the system are: 1. MEDATLAS format and quality control: QCDAMAR (Quality Control of Marine Data), the main set of tools for working with data presented as text files, including extended quality control (searching for duplicated cruises and profiles; checking date, position, ship velocity, constant profiles, spikes, density inversion, sounding, acceptable data, impossible regional values) and input/output filters; and QCMareas, a set of procedures for the quality control of tide gauge data according to the standard international Sea Level Observing System, including checks for unexpected anomalies in the time series, interpolation, filtering, and computation of basic statistics and residuals. 2. DAMAR: a relational database (MySQL) designed to manage the wide variety of marine information, such as common vocabularies, catalogues (CSR & EDIOS), data and metadata. 3. Other tools for analysis and data management: Import_DB, a script to import data and metadata from the MEDATLAS ASCII files into the database; SelDamar/Selavi, an interface to the database for local and web access, which allows selective retrievals applying criteria introduced by the user (geographical bounds, data responsible, cruises, platform, time periods, etc.) and also includes calculation of statistical reference values and plotting of original and mean profiles together with vertical interpolation; and ExtractDAMAR, a script to extract data archived in ASCII files that meet the criteria of a user request through the SelDamar interface and export them in ODV format, also performing unit conversion.
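
    The extended quality-control checks lend themselves to simple per-profile routines. Below is a minimal, illustrative Python sketch of two such checks (impossible regional values and spike detection) on a temperature profile; the thresholds and flag codes are assumptions for demonstration, not the values used by QCDAMAR.

      # Illustrative profile QC in the spirit of QCDAMAR; thresholds and
      # flag codes (1 = good, 3 = suspect, 4 = bad) are assumed values.

      def qc_profile(temps, lat, lon, spike_threshold=2.0):
          """Return one quality flag per temperature level."""
          # Impossible position: reject the whole profile.
          if not (-90 <= lat <= 90 and -180 <= lon <= 180):
              return [4] * len(temps)

          flags = [1] * len(temps)

          # Impossible regional value (illustrative global bounds, deg C).
          for i, t in enumerate(temps):
              if not (-2.5 <= t <= 40.0):
                  flags[i] = 4

          # Spike test: a level far from the mean of its neighbours.
          for i in range(1, len(temps) - 1):
              neighbour_mean = (temps[i - 1] + temps[i + 1]) / 2.0
              if flags[i] == 1 and abs(temps[i] - neighbour_mean) > spike_threshold:
                  flags[i] = 3

          # Constant profile: no variation at all is suspicious.
          if len(set(temps)) == 1:
              flags = [3] * len(temps)

          return flags

      # The naive test flags the spike and its neighbours: [1, 3, 3, 3, 1].
      print(qc_profile([15.1, 15.0, 25.0, 14.8, 14.7], lat=40.0, lon=-3.7))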

  2. User’s manual for the Automated Data Assurance and Management application developed for quality control of Everglades Depth Estimation Network water-level data

    USGS Publications Warehouse

    Petkewich, Matthew D.; Daamen, Ruby C.; Roehl, Edwin A.; Conrads, Paul

    2016-09-29

    The generation of Everglades Depth Estimation Network (EDEN) daily water-level and water-depth maps is dependent on high-quality real-time data from over 240 water-level stations. To increase the accuracy of the daily water-surface maps, the Automated Data Assurance and Management (ADAM) tool was created by the U.S. Geological Survey as part of Greater Everglades Priority Ecosystems Science. The ADAM tool provides accurate quality-assurance review of the real-time data from the EDEN network and allows estimation or replacement of missing or erroneous data. This user's manual describes how to install and operate the ADAM software. The file structure and operation of the ADAM software are explained using examples.

  3. Analysis of ETMS Data Quality for Traffic Flow Management Decisions

    NASA Technical Reports Server (NTRS)

    Chatterji, Gano B.; Sridhar, Banavar; Kim, Douglas

    2003-01-01

    The data needed for air traffic flow management decision support tools is provided by the Enhanced Traffic Management System (ETMS). This includes both the tools that are in current use and the ones being developed for future deployment. Since the quality of decision support provided by all these tools will be influenced by the quality of the input ETMS data, an assessment of ETMS data quality is needed. Motivated by this need, ETMS data quality is examined in this paper in terms of the unavailability of flight plans, deviation from the filed flight plans, departure delays, altitude errors and track data drops. Although many of these data quality issues are not new, little is known about their extent. A goal of this paper is to document the magnitude of data quality issues, supported by numerical analysis of ETMS data. Guided by this goal, ETMS data for a 24-hour period were processed to determine the number of aircraft with missing flight plan messages at any given instant of time. Results are presented for aircraft above 18,000 feet altitude and also for all altitudes. Since deviation from the filed flight plan is also a major cause of trajectory-modeling errors, statistics of deviations are presented. Errors in proposed departure times and ETMS-generated vertical profiles are also shown. A method for conditioning the vertical profiles to improve demand prediction accuracy is described. Graphs of actual sector counts obtained using these vertical profiles are compared with those obtained using the Host data for sectors in the Fort Worth Center to demonstrate the benefit of preprocessing. Finally, results are presented to quantify the extent of data drops. A method for propagating track positions during ETMS data drops is also described.
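
    The paper's exact propagation method is not reproduced here, but the idea of carrying a track forward through a data drop can be pictured as simple dead reckoning from the last reported state. The sketch below is a minimal, assumed implementation using a constant course, constant speed, and a flat-earth approximation; it is not the authors' algorithm.

      # Dead-reckoning sketch for filling track gaps (illustrative only).

      import math

      EARTH_RADIUS_NM = 3440.1  # mean Earth radius in nautical miles

      def propagate(lat_deg, lon_deg, ground_speed_kts, course_deg, dt_hr):
          """Advance a position along a constant course for dt_hr hours."""
          dist_nm = ground_speed_kts * dt_hr
          course = math.radians(course_deg)
          dlat = dist_nm * math.cos(course) / EARTH_RADIUS_NM
          dlon = dist_nm * math.sin(course) / (
              EARTH_RADIUS_NM * math.cos(math.radians(lat_deg)))
          return lat_deg + math.degrees(dlat), lon_deg + math.degrees(dlon)

      # Fill a 3-minute gap for a flight at 450 kts heading due east.
      print(propagate(32.9, -97.0, 450.0, 90.0, dt_hr=3 / 60))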

  4. CGO: utilizing and integrating gene expression microarray data in clinical research and data management.

    PubMed

    Bumm, Klaus; Zheng, Mingzhong; Bailey, Clyde; Zhan, Fenghuang; Chiriva-Internati, M; Eddlemon, Paul; Terry, Julian; Barlogie, Bart; Shaughnessy, John D

    2002-02-01

    Clinical GeneOrganizer (CGO) is a novel Windows-based archiving, organization and data mining software package for the integration of gene expression profiling in clinical medicine. The program implements various user-friendly tools and extracts data for further statistical analysis. The software was written for Affymetrix GeneChip *.txt files, but can also be used with any other microarray-derived data. The MS-SQL server version acts as a data mart and links microarray data with clinical parameters of any other existing database, and therefore represents a valuable tool for combining gene expression analysis with clinical disease characteristics.

  5. Harmonize Pipeline and Archiving System: PESSTO@IA2 Use Case

    NASA Astrophysics Data System (ADS)

    Smareglia, R.; Knapic, C.; Molinaro, M.; Young, D.; Valenti, S.

    2013-10-01

    Italian Astronomical Archives Center (IA2) is a research infrastructure project that aims at coordinating different national and international initiatives to improve the quality of astrophysical data services. IA2 is now also involved in the PESSTO (Public ESO Spectroscopic Survey of Transient Objects) collaboration, developing a complete archiving system to store calibrated post-processed data (including sensitive intermediate products), a user interface to access private data, and Virtual Observatory (VO) compliant web services to access public fast-reduction data via VO tools. The archive system relies on the PESSTO Marshall to provide file data and the associated metadata output by the PESSTO data-reduction pipeline. To harmonize the object repository, data handling and archiving system, new tools are under development. These systems must interact closely without increasing the complexity of any single task, in order to improve the performance of the whole system, and must have a sturdy logic in order to perform all operations in coordination with the other PESSTO tools. MySQL replication technology and triggers are used to synchronize new data in an efficient, fault-tolerant manner. A general-purpose library is under development to manage data from raw observations to final calibrated ones, open to overriding of different sources, formats, management fields, storage and publication policies. Configurations for all the systems are stored in a dedicated schema (no configuration files), but can be easily updated by a planned Archiving System Configuration Interface (ASCI).

  6. Dynamo Catalogue: Geometrical tools and data management for particle picking in subtomogram averaging of cryo-electron tomograms.

    PubMed

    Castaño-Díez, Daniel; Kudryashev, Mikhail; Stahlberg, Henning

    2017-02-01

    Cryo-electron tomography allows macromolecular complexes within vitrified, intact, thin cells or sections thereof to be visualized, and structural analysis to be performed in situ by averaging over multiple copies of the same molecules. Image processing for subtomogram averaging is specific and cumbersome, due to the large amount of data and its three-dimensional nature and anisotropic resolution. Here, we streamline data processing for subtomogram averaging by introducing an archiving system, Dynamo Catalogue. This system manages tomographic data from multiple tomograms and allows visual feedback during all processing steps, including particle picking, extraction, alignment and classification. The file structure of a processing project includes logfiles of performed operations, and can be backed up and shared between users. Command-line commands, database queries and a set of GUIs give the user versatile control over the process. Here, we introduce a set of geometric tools that streamline particle picking from simple (filaments, spheres, tubes, vesicles) and complex geometries (arbitrary 2D surfaces, rare instances on proteins with geometric restrictions, and 2D and 3D crystals). Advanced functionality, such as manual alignment and subboxing, is useful when initial templates are generated for alignment and for project customization. Dynamo Catalogue is part of the open source package Dynamo and includes tools to ensure format compatibility with the subtomogram averaging functionalities of other packages, such as Jsubtomo, PyTom, PEET, EMAN2, XMIPP and Relion.

  7. Tools to Ease Your Internet Adventures: Part I.

    ERIC Educational Resources Information Center

    Descy, Don E.

    1993-01-01

    This first of a two-part series highlights three tools that improve accessibility to Internet resources: (1) Alex, a database that accesses files in FTP (file transfer protocol) sites; (2) Archie, software that searches for file names with a user's search term; and (3) Gopher, a menu-driven program to access Internet sites. (LRW)

  8. Popularity Prediction Tool for ATLAS Distributed Data Management

    NASA Astrophysics Data System (ADS)

    Beermann, T.; Maettig, P.; Stewart, G.; Lassnig, M.; Garonne, V.; Barisits, M.; Vigne, R.; Serfon, C.; Goossens, L.; Nairz, A.; Molfetas, A.; Atlas Collaboration

    2014-06-01

    This paper describes a popularity prediction tool for data-intensive data management systems, such as ATLAS distributed data management (DDM). It is fed by the DDM popularity system, which produces historical reports about ATLAS data usage, providing information about files, datasets, users and sites where data was accessed. The tool described in this contribution uses this historical information to make a prediction about the future popularity of data. It finds trends in the usage of data using a set of neural networks and a set of input parameters, and predicts the number of accesses in the near future. This information can then be used in a second step to improve the distribution of replicas at sites, weighing the cost of creating new replicas (bandwidth and load on the storage system) against the gain of having them (faster access to data for analysis). To evaluate the benefit of the redistribution, a grid simulator is introduced that is able to replay real workloads on different data distributions. This article describes the popularity prediction method and the simulator that is used to evaluate the redistribution.

  9. pcircle - A Suite of Scalable Parallel File System Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    WANG, FEIYI

    2015-10-01

    Most software for file systems is written for conventional local file systems; it runs serially and cannot exploit a large-scale parallel file system. pcircle builds on MPI, which is ubiquitous in cluster computing environments, and a work-stealing pattern to provide a scalable, high-performance suite of file system tools. In particular, it implements parallel data copy and parallel data checksumming, with advanced features such as asynchronous progress reporting, checkpoint and restart, and integrity checking.
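
    As a rough illustration of the approach, the sketch below checksums a directory tree in parallel with mpi4py. It uses a simple static partition of the file list rather than pcircle's work-stealing scheme, so it shows only the overall shape, not the dynamic load balancing that makes pcircle fast.

      # Parallel checksumming sketch with mpi4py (static partition, not
      # work-stealing). Run with: mpirun -np 4 python checksum.py DIR

      import hashlib
      import os
      import sys

      from mpi4py import MPI

      def sha1_of(path, block=1 << 20):
          h = hashlib.sha1()
          with open(path, "rb") as f:
              for chunk in iter(lambda: f.read(block), b""):
                  h.update(chunk)
          return h.hexdigest()

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      # Rank 0 walks the tree and broadcasts the file list to all ranks.
      files = None
      if rank == 0:
          files = [os.path.join(d, name)
                   for d, _, names in os.walk(sys.argv[1]) for name in names]
      files = comm.bcast(files, root=0)

      # Each rank checksums every size-th file, then rank 0 merges the parts.
      local = {p: sha1_of(p) for p in files[rank::size]}
      parts = comm.gather(local, root=0)
      if rank == 0:
          for path, digest in sorted(kv for part in parts for kv in part.items()):
              print(digest, path)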

  10. The Lake Tahoe Basin Land Use Simulation Model

    USGS Publications Warehouse

    Forney, William M.; Oldham, I. Benson

    2011-01-01

    This U.S. Geological Survey Open-File Report describes the final modeling product for the Tahoe Decision Support System project for the Lake Tahoe Basin funded by the Southern Nevada Public Land Management Act and the U.S. Geological Survey's Geographic Analysis and Monitoring Program. This research was conducted by the U.S. Geological Survey Western Geographic Science Center. The purpose of this report is to describe the basic elements of the novel Lake Tahoe Basin Land Use Simulation Model, publish samples of the data inputs, basic outputs of the model, and the details of the Python code. The results of this report include a basic description of the Land Use Simulation Model, descriptions and summary statistics of model inputs, two figures showing the graphical user interface from the web-based tool, samples of the two input files, seven tables of basic output results from the web-based tool and descriptions of their parameters, and the fully functional Python code.

  11. Software Development Of XML Parser Based On Algebraic Tools

    NASA Astrophysics Data System (ADS)

    Georgiev, Bozhidar; Georgieva, Adriana

    2011-12-01

    This paper presents the development and implementation of an algebraic method for XML data processing that accelerates XML parsing. The nontraditional approach proposed here for fast XML navigation with algebraic tools contributes to ongoing efforts toward an easier, user-friendly API for XML transformations. The proposed software for processing XML documents (the parser) is easy to use and can manage files with a strictly defined data structure. The purpose of the presented algorithm is to offer a new approach for searching and restructuring hierarchical XML data. This approach permits fast processing of XML documents, using an algebraic model developed in detail in previous works by the same authors. The parsing mechanism is easily accessible to web consumers, who can control XML file processing, search for different elements (tags), delete them, and add new XML content. Various tests show higher speed and lower resource consumption in comparison with some existing commercial parsers.
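
    The user-visible operations described (searching for tags, deleting elements, adding new content) can be pictured with Python's standard xml.etree as a stand-in; the sketch below does not reproduce the algebraic internals of the proposed parser, only the operations it exposes.

      # The three user-visible operations, shown with xml.etree as a
      # stand-in for the algebraic parser described in the paper.

      import xml.etree.ElementTree as ET

      doc = ET.fromstring(
          "<catalog>"
          "<book id='1'><title>XML Basics</title></book>"
          "<book id='2'><title>Algebraic Parsing</title></book>"
          "</catalog>")

      # Search: find every <title> element anywhere in the tree.
      for title in doc.iter("title"):
          print(title.text)

      # Delete: remove the book with id="1".
      for book in doc.findall("book"):
          if book.get("id") == "1":
              doc.remove(book)

      # Add: append a new book element.
      new = ET.SubElement(doc, "book", id="3")
      ET.SubElement(new, "title").text = "New Content"

      print(ET.tostring(doc, encoding="unicode"))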

  12. An expert system shell for inferring vegetation characteristics: Prototype help system (Task 1)

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The NASA Vegetation Workbench (VEG) is a knowledge-based system that infers vegetation characteristics from reflectance data. A prototype of the VEG subgoal HELP.SYSTEM has been completed and the Help System has been added to the VEG system. It is loaded when the user first clicks on the HELP.SYSTEM option in the Tool Box Menu. The Help System gives users the information they need, and also provides interactive tools the scientist may use to develop new help messages and to modify existing help messages attached to VEG screens. The system automatically manages the system and file operations needed to preserve new or modified help messages. The Help System was tested both as a tool for developing help content and as a tool for users seeking help.

  13. Extended Maintenance Downtime 12/14 - 12/18

    Atmospheric Science Data Center

    2015-12-07

    ... am - 12/18 @ 5 pm EST. Event Impact: File System Maintenance will be performed on a number of the large file systems ... and Customization Tool - AMAPS, CALIPSO, CERES, MOPITT, TES and TAD Search and Subset Tools ... While some sites and tools may ...

  14. Migration of the digital interactive breast-imaging teaching file

    NASA Astrophysics Data System (ADS)

    Cao, Fei; Sickles, Edward A.; Huang, H. K.; Zhou, Xiaoqiang

    1998-06-01

    The digital breast imaging teaching file developed during the last two years in our laboratory has been used successfully at UCSF (University of California, San Francisco) as a routine teaching tool for training radiology residents and fellows in mammography. Building on this success, we have ported the teaching file from an old Pixar imaging/Sun SPARC 470 display system to our newly designed telemammography display workstation (Ultra SPARC 2 platform with two DOME Md5/SBX display boards). The old Pixar/Sun 470 system, although adequate for fast and high-resolution image display, is 4-year-old technology, expensive to maintain and difficult to upgrade. The new display workstation is more cost-effective and is also compatible with the digital image format from a full-field direct digital mammography system. The digital teaching file is built on a sophisticated computer-aided instruction (CAI) model, which simulates the management sequences used in imaging interpretation and work-up. Each user can be prompted to respond by making his/her own observations, assessments, and work-up decisions as well as marking image abnormalities. This effectively replaces the traditional 'show-and-tell' teaching file experience with an interactive, response-driven type of instruction.

  15. The prevalence of encoded digital trace evidence in the nonfile space of computer media(,) (.).

    PubMed

    Garfinkel, Simson L

    2014-09-01

    Forensically significant digital trace evidence is frequently present in sectors of digital media not associated with allocated or deleted files. Modern digital forensic tools generally do not decompress such data unless a specific file with a recognized file type is first identified, potentially resulting in missed evidence. Email addresses are encoded differently in different file formats. As a result, trace evidence can be categorized as Plain in File (PF), Encoded in File (EF), Plain Not in File (PNF), or Encoded Not in File (ENF). The tool bulk_extractor finds all of these formats, but other forensic tools do not. A study of 961 storage devices purchased on the secondary market shows that 474 contained encoded email addresses that were not in files (ENF). Different encoding formats are the result of different application programs that processed different kinds of digital trace evidence. Specific encoding formats explored include BASE64, GZIP, PDF, HIBER, and ZIP.
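
    The PF/EF/PNF/ENF distinction comes down to scanning raw media both directly and after trial decoding. The sketch below is a toy illustration of that idea for BASE64 runs and GZIP streams; the regular expressions and minimum-run heuristics are assumptions, and it is in no way a substitute for bulk_extractor.

      # Toy scanner for plain and encoded email addresses in raw bytes.
      # Heuristics (regexes, minimum run length) are illustrative only.

      import base64
      import gzip
      import re

      EMAIL = re.compile(rb"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")
      B64RUN = re.compile(rb"[A-Za-z0-9+/=]{24,}")
      GZIP_MAGIC = re.compile(rb"\x1f\x8b\x08")

      def scan(buf):
          hits = {"plain": set(EMAIL.findall(buf)), "encoded": set()}

          # Trial-decode BASE64 runs and rescan the decoded bytes.
          for run in B64RUN.findall(buf):
              try:
                  decoded = base64.b64decode(run, validate=True)
              except Exception:
                  continue
              hits["encoded"].update(EMAIL.findall(decoded))

          # Trial-decompress candidate GZIP streams found by magic bytes.
          for m in GZIP_MAGIC.finditer(buf):
              try:
                  decoded = gzip.decompress(buf[m.start():])
              except Exception:
                  continue
              hits["encoded"].update(EMAIL.findall(decoded))
          return hits

      sample = (b"free sectors... alice@example.com ..." +
                base64.b64encode(b"hidden: bob@example.org, padding"))
      print(scan(sample))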

  16. JNDMS Task Authorization 2 Report

    DTIC Science & Technology

    2013-10-01

    uses Barnyard to store alarms from all DREnet Snort sensors in a MySQL database. Barnyard is an open source tool designed to work with Snort to take... Technology; ITI: Information Technology Infrastructure; J2EE: Java 2 Enterprise Edition; JAR: Java Archive, an archive file format defined by Java ... standards; JDBC: Java Database Connectivity; JDW: JNDMS Data Warehouse; JNDMS: Joint Network and Defence Management System; JNDMS: Joint Network Defence and

  17. EMERALD: A Flexible Framework for Managing Seismic Data

    NASA Astrophysics Data System (ADS)

    West, J. D.; Fouch, M. J.; Arrowsmith, R.

    2010-12-01

    The seismological community is challenged by the vast quantity of new broadband seismic data provided by large-scale seismic arrays such as EarthScope’s USArray. While this bonanza of new data enables transformative scientific studies of the Earth’s interior, it also illuminates limitations in the methods used to prepare and preprocess those data. At a recent seismic data processing focus group workshop, many participants expressed the need for better systems to minimize the time and tedium spent on data preparation in order to increase the efficiency of scientific research. Another challenge related to data from all large-scale transportable seismic experiments is that there currently exists no system for discovering and tracking changes in station metadata. This critical information, such as station location, sensor orientation, instrument response, and clock timing data, may change over the life of an experiment and/or be subject to post-experiment correction. Yet nearly all researchers utilize metadata acquired with the downloaded data, even though subsequent metadata updates might alter or invalidate results produced with older metadata. A third long-standing issue for the seismic community is the lack of easily exchangeable seismic processing codes. This problem stems directly from the storage of seismic data as individual time series files, and the history of each researcher developing his or her preferred data file naming convention and directory organization. Because most processing codes rely on the underlying data organization structure, such codes are not easily exchanged between investigators. To address these issues, we are developing EMERALD (Explore, Manage, Edit, Reduce, & Analyze Large Datasets). The goal of the EMERALD project is to provide seismic researchers with a unified, user-friendly, extensible system for managing seismic event data, thereby increasing the efficiency of scientific enquiry. EMERALD stores seismic data and metadata in a state-of-the-art open source relational database (PostgreSQL), and can, on a timed basis or on demand, download the most recent metadata, compare it with previously acquired values, and alert the user to changes. The backend relational database is capable of easily storing and managing many millions of records. The extensible, plug-in architecture of the EMERALD system allows any researcher to contribute new visualization and processing methods written in any of 12 programming languages, and a central Internet-enabled repository for such methods provides users with the opportunity to download, use, and modify new processing methods on demand. EMERALD includes data acquisition tools allowing direct importation of seismic data, and also imports data from a number of existing seismic file formats. Pre-processed clean sets of data can be exported as standard sac files with user-defined file naming and directory organization, for use with existing processing codes. The EMERALD system incorporates existing acquisition and processing tools, including SOD, TauP, GMT, and FISSURES/DHI, making much of the functionality of those tools available in a unified system with a user-friendly web browser interface. EMERALD is now in beta test. See emerald.asu.edu or contact john.d.west@asu.edu for more details.

  18. MyGeoHub: A Collaborative Geospatial Research and Education Platform

    NASA Astrophysics Data System (ADS)

    Kalyanam, R.; Zhao, L.; Biehl, L. L.; Song, C. X.; Merwade, V.; Villoria, N.

    2017-12-01

    Scientific research is increasingly collaborative and globally distributed; research groups now rely on web-based scientific tools and data management systems to simplify their day-to-day collaborative workflows. However, such tools often lack seamless interfaces, requiring researchers to contend with manual data transfers, annotation and sharing. MyGeoHub is a web platform that supports out-of-the-box, seamless workflows involving data ingestion, metadata extraction, analysis, sharing and publication. MyGeoHub is built on the HUBzero cyberinfrastructure platform and adds general-purpose software building blocks (GABBs) for geospatial data management, visualization and analysis. A data management building block, iData, processes geospatial files, extracting metadata for keyword and map-based search while enabling quick previews. iData is pervasive, allowing access through a web interface, scientific tools on MyGeoHub, or even mobile field devices via a data service API. GABBs includes a Python map library as well as map widgets that, in a few lines of code, generate complete geospatial visualization web interfaces for scientific tools. GABBs also includes powerful tools that can be used with no programming effort. The GeoBuilder tool provides an intuitive wizard for importing multi-variable, geo-located time series data (typical of sensor readings and GPS trackers) to build visualizations supporting data filtering and plotting. MyGeoHub has been used in tutorials at scientific conferences and educational activities for K-12 students. MyGeoHub is also constantly evolving; the recent addition of Jupyter and R Shiny notebook environments enables reproducible, richly interactive geospatial analyses and applications ranging from simple pre-processing to published tools. MyGeoHub is not a monolithic geospatial science gateway; instead it supports diverse needs ranging from a feature-rich data management system to complex scientific tools and workflows.

  19. AgMIP Training in Multiple Crop Models and Tools

    NASA Technical Reports Server (NTRS)

    Boote, Kenneth J.; Porter, Cheryl H.; Hargreaves, John; Hoogenboom, Gerrit; Thornburn, Peter; Mutter, Carolyn

    2015-01-01

    The Agricultural Model Intercomparison and Improvement Project (AgMIP) has the goal of using multiple crop models to evaluate climate impacts on agricultural production and food security in developed and developing countries. There are several major limitations that must be overcome to achieve this goal, including the need to train AgMIP regional research team (RRT) crop modelers to use models other than the ones they are currently familiar with, plus the need to harmonize and interconvert the disparate input file formats used by the various models. Two activities were undertaken to address these shortcomings and enable the RRTs to use multiple models to evaluate climate impacts on crop production and food security. We designed and conducted courses in which participants trained on two different sets of crop models, with emphasis on the model with which they had the least experience. In a second activity, the AgMIP IT group created templates for entering data on soils, management, weather, and crops into AgMIP harmonized databases, and developed translation tools for converting the harmonized data into files that are ready for multiple crop model simulations. The strategies for creating and conducting the multi-model course and developing entry and translation tools are reviewed in this chapter.
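
    The translation step can be pictured as reading harmonized records and re-emitting them in each model's native layout. The sketch below is a generic, assumed example: the column names and the fixed-width output format are placeholders, not the actual AgMIP templates.

      # Generic sketch of a harmonized-to-model-input weather translator.
      # Column names and the output layout are illustrative assumptions.

      import csv

      def translate_weather(harmonized_csv, model_file):
          """Rewrite harmonized daily weather as a fixed-width model input."""
          with open(harmonized_csv, newline="") as src, \
               open(model_file, "w") as dst:
              dst.write("DATE     SRAD  TMAX  TMIN  RAIN\n")
              for row in csv.DictReader(src):
                  dst.write("%-8s %5.1f %5.1f %5.1f %5.1f\n" % (
                      row["date"], float(row["srad"]),
                      float(row["tmax"]), float(row["tmin"]),
                      float(row["rain"])))

      with open("weather.csv", "w") as f:
          f.write("date,srad,tmax,tmin,rain\n1995001,18.2,31.0,21.5,0.0\n")
      translate_weather("weather.csv", "weather.wth")
      print(open("weather.wth").read())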

  20. 75 FR 66787 - Notice of Filing of Plats of Survey, Wyoming

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-29

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLWY-957400-11-L14200000-BJ0000] Notice of Filing of Plats of Survey, Wyoming AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey, Wyoming. SUMMARY: The Bureau of Land Management (BLM) has filed the plats of...

  1. 78 FR 79005 - Notice of Filing of Plats of Survey; North Dakota

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-27

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-L13100000-EI0000] Notice of Filing of Plats of Survey; North Dakota AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of survey of...

  2. 78 FR 66379 - Notice of Filing of Plats of Survey; Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-05

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMTL07000-L1420000-BJ0000-LXSIHRRB0000] Notice of Filing of Plats of Survey; Montana AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of...

  3. 77 FR 13620 - Notice of Filing of Plats of Survey; Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-07

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-L14200000-BJ0000] Notice of Filing of Plats of Survey; Montana AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of survey of the...

  4. 78 FR 64531 - Notice of Filing of Plats of Survey; Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-29

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-L1430000-EU0000] Notice of Filing of Plats of Survey; Montana AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of survey of the lands...

  5. 77 FR 13621 - Notice of Filing of Plats of Survey; Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-07

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-L14200000-BJ0000] Notice of Filing of Plats of Survey; Montana AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of survey of the...

  6. 76 FR 72970 - Notice of Filing of Plats of Survey; Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-28

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-L98200000-BJ0000-LXCSMT010000] Notice of Filing of Plats of Survey; Montana AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of...

  7. 76 FR 70163 - Notice of Filing of Plats of Survey; Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-10

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-L14200000-BJ0000] Notice of Filing of Plats of Survey; Montana AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of survey of the...

  8. 77 FR 12075 - Notice of Filing of Plats of Survey; Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-28

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-L19100000-BJ0000-LRCS42800800] Notice of Filing of Plats of Survey; Montana AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of...

  9. 77 FR 35423 - Notice of Filing of Plats of Survey; Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-13

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-L19100000-BJ0000-LRCS42800800] Notice of Filing of Plats of Survey; Montana AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of...

  10. 77 FR 12075 - Notice of Filing of Plats of Survey; Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-28

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-L19100000-BJ0000-LRCME1R02060] Notice of Filing of Plats of Survey; Montana AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of...

  11. 77 FR 46109 - Notice of Filing of Plats of Survey; Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-02

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-L19100000-BJ0000-LRCME1R05173] Notice of Filing of Plats of Survey; Montana AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of...

  12. 76 FR 63952 - Notice of Filing of Plats of Survey; Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-14

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-L19100000-BJ0000-LRCME0R04772] Notice of Filing of Plats of Survey; Montana AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of...

  13. 77 FR 34402 - Notice of Filing of Plats of Survey; Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-11

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-L14200000-BJ0000] Notice of Filing of Plats of Survey; Montana AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of survey of the...

  14. 78 FR 76176 - Notice of Filing of Plats of Survey; North Dakota

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-16

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-L13100000-EI0000] Notice of Filing of Plats of Survey; North Dakota AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of survey of...

  15. 77 FR 38320 - Notice of Filing of Plats of Survey; Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-27

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-L19100000-BJ0000-LRCME1R05174] Notice of Filing of Plats of Survey; Montana AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of...

  16. 76 FR 2919 - Notice of Filing of Plats of Survey; Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-18

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-10-L98200000-BJ0000-LXCSMT010000] Notice of Filing of Plats of Survey; Montana AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of...

  17. 76 FR 9049 - Notice of Filing of Plats of Survey; Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-16

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-11-L14200000-BJ0000] Notice of Filing of Plats of Survey; Montana AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of survey of the...

  18. 78 FR 64531 - Notice of Filing of Plats of Survey; Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-29

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-L14200000-BJ0000] Notice of Filing of Plats of Survey; Montana AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of survey of the...

  19. iPat: intelligent prediction and association tool for genomic research.

    PubMed

    Chen, Chunpeng James; Zhang, Zhiwu

    2018-06-01

    The ultimate goal of genomic research is to effectively predict phenotypes from genotypes so that medical management can improve human health and molecular breeding can increase agricultural production. Genomic prediction or selection (GS) plays a complementary role to genome-wide association studies (GWAS), which are the primary method to identify genes underlying phenotypes. Unfortunately, most computing tools cannot perform data analyses for both GWAS and GS. Furthermore, the majority of these tools are executed through a command-line interface (CLI), which requires programming skills. Non-programmers struggle to use them efficiently because of the steep learning curves and zero tolerance for incorrect data formats and mistakes when inputting keywords and parameters. To address these problems, this study developed a software package, named the Intelligent Prediction and Association Tool (iPat), with a user-friendly graphical user interface. With iPat, GWAS or GS can be performed using a pointing device to simply drag and/or click on graphical elements to specify input data files, choose input parameters and select analytical models. Models available to users include those implemented in third-party CLI packages such as GAPIT, PLINK, FarmCPU, BLINK, rrBLUP and BGLR. Users can choose any data format and conduct analyses with any of these packages. File conversions are automatically conducted for specified input data and selected packages. A GWAS-assisted genomic prediction method was implemented to perform genomic prediction using any GWAS method such as FarmCPU. iPat was written in Java for adaptation to multiple operating systems including Windows, Mac and Linux. The iPat executable file, user manual, tutorials and example datasets are freely available at http://zzlab.net/iPat. zhiwu.zhang@wsu.edu.

  20. Personnel Evaluation: Noncommissioned Officer Evaluation Reporting System

    DTIC Science & Technology

    2002-05-15

    Maintenance System), paper copies will be maintained in state, command, or local career management individual files (CMIF) such as AGR management... Routine use: DA Form 2166-8 will be maintained in the rated NCO's official military personnel file (OMPF) and career management individual file (CMIF). A... CAR: Chief, Army Reserve; CDR: commander; CE: commander's evaluation; CG: commanding general; CMIF: career management individual file; CNGB: Chief, National Guard

  1. The Particle Physics Data Grid. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Livny, Miron

    2002-08-16

    The main objective of the Particle Physics Data Grid (PPDG) project has been to implement and evaluate distributed (Grid-enabled) data access and management technology for current and future particle and nuclear physics experiments. The specific goals of PPDG have been to design, implement, and deploy a Grid-based software infrastructure capable of supporting the data generation, processing and analysis needs common to the physics experiments represented by the participants, and to adapt experiment-specific software to operate in the Grid environment and to exploit this infrastructure. To accomplish these goals, the PPDG focused on the implementation and deployment of several critical services: reliable and efficient file replication service, high-speed data transfer services, multisite file caching and staging service, and reliable and recoverable job management services. The focus of the activity was the job management services and the interplay between these services and distributed data access in a Grid environment. Software was developed to study the interaction between HENP applications and distributed data storage fabric. One key conclusion was the need for a reliable and recoverable tool for managing large collections of interdependent jobs. An attached document provides an overview of the current status of the Directed Acyclic Graph Manager (DAGMan) with its main features and capabilities.
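
    DAGMan itself is part of HTCondor; the sketch below only illustrates the core idea of a recoverable manager for interdependent jobs (run jobs in dependency order, retry failures) using Python's standard graphlib. The job bodies and retry policy are illustrative assumptions.

      # Minimal DAGMan-style runner: topological order plus retries.
      # graphlib requires Python 3.9+; jobs and policy are illustrative.

      from graphlib import TopologicalSorter

      def run_dag(jobs, deps, retries=2):
          """jobs: name -> callable; deps: name -> set of prerequisites."""
          for name in TopologicalSorter(deps).static_order():
              for attempt in range(retries + 1):
                  try:
                      jobs[name]()
                      break
                  except Exception as exc:
                      print(f"{name} failed (attempt {attempt + 1}): {exc}")
              else:
                  raise RuntimeError(f"{name} exhausted retries; halting DAG")

      # B and C depend on A; D depends on both B and C.
      jobs = {n: (lambda n=n: print("running", n)) for n in "ABCD"}
      deps = {"A": set(), "B": {"A"}, "C": {"A"}, "D": {"B", "C"}}
      run_dag(jobs, deps)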

  2. Finding a Needle in a PCAP

    DTIC Science & Technology

    2015-01-27

    Separate from analysis. Indexing: • timestamp files • BPF filters • GUI tools • Splunk. YAF PCAP features: rolling PCAP dump • rotates files using time... PCAP file for each flow • use with BPF filters. Gh0st RAT investigation: Gh0st is a Chinese remote access Trojan with freely available source code that is easy to modify... Merge PCAP files with mergecap; write a BPF filter that will return the session; separate flows with TCPDUMP and YAF. CERT NetSA tools website

  3. Pgltools: a genomic arithmetic tool suite for manipulation of Hi-C peak and other chromatin interaction data.

    PubMed

    Greenwald, William W; Li, He; Smith, Erin N; Benaglio, Paola; Nariai, Naoki; Frazer, Kelly A

    2017-04-07

    Genomic interaction studies use next-generation sequencing (NGS) to examine the interactions between two loci on the genome, with subsequent bioinformatics analyses typically including annotation, intersection, and merging of data from multiple experiments. While many file types and analysis tools exist for storing and manipulating single-locus NGS data, there is currently no file standard or analysis tool suite for manipulating and storing paired-genomic-loci data: the data type resulting from genomic interaction studies. As genomic interaction sequencing data become prevalent, a standard file format and tools for working with these data conveniently and efficiently are needed. This article details a file standard and a novel software tool suite for working with paired-genomic-loci data. We present the paired-genomic-loci (PGL) file standard for genomic-interactions data, and the accompanying analysis tool suite "pgltools": a cross-platform, PyPy-compatible Python package available both as an easy-to-use UNIX package and as a Python module, for integration into pipelines of paired-genomic-loci analyses. Pgltools is a freely available, open source tool suite for manipulating paired-genomic-loci data. Source code, an in-depth manual, and a tutorial are available publicly at www.github.com/billgreenwald/pgltools, and a Python module of the operations can be installed from PyPI via the PyGLtools module.
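
    A PGL record pairs two genomic intervals on one line. The sketch below illustrates the data model with a naive two-anchor intersection, assuming a simplified whitespace-delimited six-column layout (chrA, startA, endA, chrB, startB, endB); for real analyses pgltools itself should be used.

      # Naive paired-loci intersection over a simplified PGL-like layout
      # (chrA startA endA chrB startB endB); illustrative, not pgltools.

      def read_pgl(path):
          with open(path) as f:
              for line in f:
                  if line.startswith("#") or not line.strip():
                      continue
                  c1, s1, e1, c2, s2, e2 = line.split()[:6]
                  yield (c1, int(s1), int(e1)), (c2, int(s2), int(e2))

      def overlaps(a, b):
          return a[0] == b[0] and a[1] < b[2] and b[1] < a[2]

      def intersect2d(pgl_a, pgl_b):
          """Yield records from A whose anchors both overlap a record in B."""
          b_records = list(pgl_b)
          for left, right in pgl_a:
              if any(overlaps(left, bl) and overlaps(right, br)
                     for bl, br in b_records):
                  yield left, right

      hits = intersect2d(read_pgl("loops_a.pgl"), read_pgl("loops_b.pgl"))
      for left, right in hits:
          print(left, right)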

  4. Interoperability format translation and transformation between IFC architectural design file and simulation file formats

    DOEpatents

    Chao, Tian-Jy; Kim, Younghun

    2015-02-03

    Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.
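
    The claimed flow (extract data into interoperability objects, map them through a Model View Definition, transform to the values the target format needs, and write the simulation file) can be pictured with a small, assumed example. Everything below is illustrative; parsing real IFC files requires a dedicated library.

      # Illustrative extract-map-transform-emit pipeline; all names and
      # the output format are assumptions, not the patented implementation.

      from dataclasses import dataclass

      @dataclass
      class InteropObject:          # "interoperability data object"
          name: str
          area_m2: float
          height_m: float

      def extract(building_model):
          """Pull the data the target simulation tool needs."""
          return [InteropObject(s["name"], s["area"], s["height"])
                  for s in building_model["spaces"]]

      def transform(obj):
          """Mapping defined by a (hypothetical) Model View Definition:
          derive the zone volume the simulation format expects."""
          return {"zone": obj.name, "volume_m3": obj.area_m2 * obj.height_m}

      def write_simulation_file(objects, path):
          with open(path, "w") as out:
              for rec in map(transform, objects):
                  out.write("ZONE,%(zone)s,VOLUME,%(volume_m3).2f\n" % rec)

      model = {"spaces": [{"name": "Lobby", "area": 120.0, "height": 3.5}]}
      write_simulation_file(extract(model), "zones.sim")
      print(open("zones.sim").read())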

  5. Software to Compare NPP HDF5 Data Files

    NASA Technical Reports Server (NTRS)

    Wiegand, Chiu P.; LeMoigne-Stewart, Jacqueline; Ruley, LaMont T.

    2013-01-01

    This software was developed for the NPOESS (National Polar-orbiting Operational Environmental Satellite System) Preparatory Project (NPP) Science Data Segment. The purpose of this software is to compare HDF5 (Hierarchical Data Format) files specific to NPP and report whether the HDF5 files are identical. If the HDF5 files differ, users have the option of printing out the list of differences. The user provides paths to two directories containing HDF5 files to compare. The tool selects matching HDF5 file names from the two directories and runs the comparison on each pair. The user can also select from three levels of detail. Level 0 is the basic level, which simply states whether the files match. Level 1 is the intermediate level, which lists the differences between the files. Level 2 lists all the details of the comparison, such as which objects were compared, and how and where they differ. The HDF5 tool is written specifically for the NPP project. As such, it ignores certain attributes (such as creation_date, creation_time, etc.) in the HDF5 files, because even though two HDF5 files could represent exactly the same granule, if they are created at different times, the creation date and time would differ. The tool is smart enough to ignore differences that are not relevant to NPP users.
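
    The comparison strategy is straightforward to sketch with h5py: walk both files, compare datasets and attributes, and skip attributes that legitimately differ between otherwise identical granules. The sketch below is an assumed simplification, producing one flat difference list rather than the tool's three detail levels.

      # HDF5 comparison sketch with h5py; the ignored-attribute list comes
      # from the description above, the reporting is simplified.

      import h5py
      import numpy as np

      IGNORED_ATTRS = {"creation_date", "creation_time"}

      def compare_h5(path_a, path_b):
          diffs = []
          with h5py.File(path_a, "r") as fa, h5py.File(path_b, "r") as fb:
              names_a, names_b = set(), set()
              fa.visit(names_a.add)
              fb.visit(names_b.add)
              for name in sorted(names_a ^ names_b):
                  diffs.append("only in one file: " + name)
              for name in sorted(names_a & names_b):
                  a, b = fa[name], fb[name]
                  if isinstance(a, h5py.Dataset) and isinstance(b, h5py.Dataset):
                      if a.shape != b.shape or not np.array_equal(a[()], b[()]):
                          diffs.append("dataset differs: " + name)
                  for key in (set(a.attrs) | set(b.attrs)) - IGNORED_ATTRS:
                      if (key not in a.attrs or key not in b.attrs or
                              not np.array_equal(a.attrs[key], b.attrs[key])):
                          diffs.append("attribute differs: %s@%s" % (name, key))
          return diffs  # an empty list means the files match

      print(compare_h5("granule_a.h5", "granule_b.h5") or "files match")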

  6. BatTool: an R package with GUI for assessing the effect of White-nose syndrome and other take events on Myotis spp. of bats

    PubMed Central

    2014-01-01

    Background Myotis species of bats such as the Indiana Bat and Little Brown Bat are facing population declines because of White-nose syndrome (WNS). These species also face threats from anthropogenic activities such as wind energy development. Population models may be used to provide insights into threats facing these species. We developed a population model, BatTool, as an R package to help decision makers and natural resource managers examine factors influencing the dynamics of these species. The R package includes two components: 1) a deterministic and stochastic model that are accessible from the command line and 2) a graphical user interface (GUI). Results BatTool is an R package allowing natural resource managers and decision makers to understand Myotis spp. population dynamics. Through the use of a GUI, the model allows users to understand how WNS and other take events may affect the population. The results are saved both graphically and as data files. Additionally, R-savvy users may access the population functions through the command line and reuse the code as part of future research. This R package could also be used as part of a population dynamics or wildlife management course. Conclusions BatTool provides access to a Myotis spp. population model. This tool can help natural resource managers and decision makers with the Endangered Species Act deliberations for these species and with issuing take permits as part of regulatory decision making. The tool is available online as part of this publication. PMID:24955110

  7. BatTool: an R package with GUI for assessing the effect of White-nose syndrome and other take events on Myotis spp. of bats

    USGS Publications Warehouse

    Erickson, Richard A.; Thogmartin, Wayne E.; Szymanski, Jennifer A.

    2014-01-01

    Background: Myotis species of bats such as the Indiana Bat and Little Brown Bat are facing population declines because of White-nose syndrome (WNS). These species also face threats from anthropogenic activities such as wind energy development. Population models may be used to provide insights into threats facing these species. We developed a population model, BatTool, as an R package to help decision makers and natural resource managers examine factors influencing the dynamics of these species. The R package includes two components: 1) a deterministic and stochastic model that are accessible from the command line and 2) a graphical user interface (GUI). Results: BatTool is an R package allowing natural resource managers and decision makers to understand Myotis spp. population dynamics. Through the use of a GUI, the model allows users to understand how WNS and other take events may affect the population. The results are saved both graphically and as data files. Additionally, R-savvy users may access the population functions through the command line and reuse the code as part of future research. This R package could also be used as part of a population dynamics or wildlife management course. Conclusions: BatTool provides access to a Myotis spp. population model. This tool can help natural resource managers and decision makers with the Endangered Species Act deliberations for these species and with issuing take permits as part of regulatory decision making. The tool is available online as part of this publication.

  8. BatTool: an R package with GUI for assessing the effect of White-nose syndrome and other take events on Myotis spp. of bats.

    PubMed

    Erickson, Richard A; Thogmartin, Wayne E; Szymanski, Jennifer A

    2014-01-01

    Myotis species of bats such as the Indiana Bat and Little Brown Bat are facing population declines because of White-nose syndrome (WNS). These species also face threats from anthropogenic activities such as wind energy development. Population models may be used to provide insights into threats facing these species. We developed a population model, BatTool, as an R package to help decision makers and natural resource managers examine factors influencing the dynamics of these species. The R package includes two components: 1) a deterministic and stochastic model that are accessible from the command line and 2) a graphical user interface (GUI). BatTool is an R package allowing natural resource managers and decision makers to understand Myotis spp. population dynamics. Through the use of a GUI, the model allows users to understand how WNS and other take events may affect the population. The results are saved both graphically and as data files. Additionally, R-savvy users may access the population functions through the command line and reuse the code as part of future research. This R package could also be used as part of a population dynamics or wildlife management course. BatTool provides access to a Myotis spp. population model. This tool can help natural resource managers and decision makers with the Endangered Species Act deliberations for these species and with issuing take permits as part of regulatory decision making. The tool is available online as part of this publication.
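
    The package itself is written in R; as a language-neutral illustration of the deterministic/stochastic split it exposes, the sketch below projects a colony forward with optional random variation in survival. All demographic values here are placeholders, not BatTool parameters.

      # Illustrative projection with optional stochastic survival; the
      # demographic rates are placeholders, not BatTool's parameters.

      import random

      def project(n0, years, survival, fecundity,
                  wns_extra_mortality=0.0, stochastic=False, seed=42):
          rng = random.Random(seed)
          n, trajectory = float(n0), [float(n0)]
          for _ in range(years):
              s = survival - wns_extra_mortality
              if stochastic:  # jitter survival to mimic environmental noise
                  s = min(max(rng.gauss(s, 0.05), 0.0), 1.0)
              n = n * s * (1.0 + fecundity)  # survivors plus their recruits
              trajectory.append(n)
          return trajectory

      # Example: WNS adds 30% annual mortality to a colony of 10,000 bats.
      print(project(10000, 10, survival=0.85, fecundity=0.4,
                    wns_extra_mortality=0.30, stochastic=True))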

  9. 41 CFR 105-64.407 - How do I file a Statement of Disagreement?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Statement of Disagreement? You may file a Statement of Disagreement with the system manager within 30 days... inaccurate, irrelevant, untimely, or incomplete. The system manager will file the statement with your record...

  10. 77 FR 22610 - Notice of Filing of Plats of Survey; South Dakota

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-16

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-L19100000-BJ0000-LRCME1G05120] Notice of Filing of Plats of Survey; South Dakota AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of...

  11. 76 FR 64967 - Notice of Filing of Plats of Survey; South Dakota

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-19

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-L19100000-BJ0000-LRCME0G01253] Notice of Filing of Plats of Survey; South Dakota AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of...

  12. 76 FR 5397 - Notice of Filing of Plats of Survey; Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-31

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-11-L19100000-BJ0000-LRCME0R04758] Notice of Filing of Plats of Survey; Montana AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of...

  13. 75 FR 57287 - Notice of Filing of Plats of Survey; Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-20

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-10-L19100000-BJ0000-LRCM08RS4649] Notice of Filing of Plats of Survey; Montana AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of...

  14. 75 FR 31812 - Notice of Filing of Plats of Survey; Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-04

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-10-L19100000-BJ0000-LRCM07RE4030] Notice of Filing of Plats of Survey; Montana AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of...

  15. 76 FR 29006 - Notice of Filing of Plats of Survey; Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-19

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-11-L19100000-BJ0000-LRCME0R04043] Notice of Filing of Plats of Survey; Montana AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plat of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of...

  16. 77 FR 22610 - Notice of Filing of Plats of Survey; South Dakota

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-16

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-L19100000-BJ0000-LRCME1G04814] Notice of Filing of Plats of Survey; South Dakota AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of...

  17. 76 FR 58533 - Notice of Filing of Plats of Survey; North Dakota

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-21

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-L19100000-BJ0000-LRCME0G03224] Notice of Filing of Plats of Survey; North Dakota AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of...

  18. 76 FR 64969 - Notice of Filing of Plats of Survey; South Dakota

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-19

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-L19100000-BJ0000-LRCME0G04815] Notice of Filing of Plats of Survey; South Dakota AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of...

  19. 76 FR 41821 - Notice of Filing of Plats of Survey; Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-15

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-11-L19100000-BJ0000-LRCME0G03219] Notice of Filing of Plats of Survey; Montana AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of...

  20. 77 FR 38321 - Notice of Filing of Plats of Survey; South Dakota

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-27

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-L19100000-BJ0000-LRCME1G04810] Notice of Filing of Plats of Survey; South Dakota AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of...

  1. 76 FR 44946 - Notice of Filing of Plats of Survey; North Dakota

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-27

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000-L19100000-BJ0000-LRCME0G03219] Notice of Filing of Plats of Survey; North Dakota AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey. SUMMARY: The Bureau of Land Management (BLM) will file the plat of...

  2. 41 CFR 102-117.190 - Where do I file a claim for loss or damage to property?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 41 Public Contracts and Property Management 3 2012-01-01 2012-01-01 false Where do I file a claim... Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION TRANSPORTATION 117-TRANSPORTATION MANAGEMENT Shipping Freight § 102-117.190 Where do I file a claim for loss or...

  3. A Proof-Carrying File System

    DTIC Science & Technology

    2009-06-06

    written in Standard ML, and comprises nearly 7,000 lines of code. OpenSSL is used for all cryptographic operations. Because the front end tools are used...be managed. Macrobenchmarks. To understand the performance of PCFS in practice, we also ran two simple macrobenchmarks. The first (called OpenSSL in...the table below), untars the OpenSSL source code, compiles it and deletes it. The other (called Fuse in the table below), performs similar operations

  4. morphforge: a toolbox for simulating small networks of biologically detailed neurons in Python

    PubMed Central

    Hull, Michael J.; Willshaw, David J.

    2014-01-01

    The broad structure of a modeling study can often be explained over a cup of coffee, but converting this high-level conceptual idea into graphs of the final simulation results may require many weeks of sitting at a computer. Although models themselves can be complex, often many mental resources are wasted working around complexities of the software ecosystem such as fighting to manage files, interfacing between tools and data formats, finding mistakes in code or working out the units of variables. morphforge is a high-level, Python toolbox for building and managing simulations of small populations of multicompartmental biophysical model neurons. An entire in silico experiment, including the definition of neuronal morphologies, channel descriptions, stimuli, visualization and analysis of results can be written within a single short Python script using high-level objects. Multiple independent simulations can be created and run from a single script, allowing parameter spaces to be investigated. Consideration has been given to the reuse of both algorithmic and parameterizable components to allow both specific and stochastic parameter variations. Some other features of the toolbox include: the automatic generation of human-readable documentation (e.g., PDF files) about a simulation; the transparent handling of different biophysical units; a novel mechanism for plotting simulation results based on a system of tags; and an architecture that supports both the use of established formats for defining channels and synapses (e.g., MODL files), and the possibility to support other libraries and standards easily. We hope that this toolbox will allow scientists to quickly build simulations of multicompartmental model neurons for research and serve as a platform for further tool development. PMID:24478690
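    As a rough illustration of the tag-based plotting mechanism mentioned above, the toy sketch below selects result traces by tag; the Trace class and tag names are hypothetical, not morphforge's actual objects.

        from dataclasses import dataclass, field

        @dataclass
        class Trace:
            name: str
            tags: set = field(default_factory=set)

        traces = [
            Trace("cell1.V", {"voltage", "soma"}),
            Trace("cell1.gNa", {"conductance", "soma"}),
            Trace("cell2.V", {"voltage", "dendrite"}),
        ]

        def select(traces, required):
            """Return traces carrying every requested tag, e.g. for one plot panel."""
            return [t for t in traces if required <= t.tags]

        for t in select(traces, {"voltage"}):
            print("plot on voltage panel:", t.name)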

  5. ARC SDK: A toolbox for distributed computing and data applications

    NASA Astrophysics Data System (ADS)

    Skou Andersen, M.; Cameron, D.; Lindemann, J.

    2014-06-01

    Grid middleware suites provide tools to perform the basic tasks of job submission and retrieval and data access, however these tools tend to be low-level, operating on individual jobs or files and lacking in higher-level concepts. User communities therefore generally develop their own application-layer software catering to their specific communities' needs on top of the Grid middleware. It is thus important for the Grid middleware to provide a friendly, well documented and simple to use interface for the applications to build upon. The Advanced Resource Connector (ARC), developed by NorduGrid, provides a Software Development Kit (SDK) which enables applications to use the middleware for job and data management. This paper presents the architecture and functionality of the ARC SDK along with an example graphical application developed with the SDK. The SDK consists of a set of libraries accessible through Application Programming Interfaces (API) in several languages. It contains extensive documentation and example code and is available on multiple platforms. The libraries provide generic interfaces and rely on plugins to support a given technology or protocol and this modular design makes it easy to add a new plugin if the application requires supporting additional technologies. The ARC Graphical Clients package is a graphical user interface built on top of the ARC SDK and the Qt toolkit and it is presented here as a fully functional example of an application. It provides a graphical interface to enable job submission and management at the click of a button, and allows data on any Grid storage system to be manipulated using a visual file system hierarchy, as if it were a regular file system.
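    The plugin-based modular design described above can be sketched generically: a generic interface resolves to a protocol-specific plugin at runtime. The class and scheme names below are illustrative assumptions and do not reflect the ARC SDK's actual classes.

        from abc import ABC, abstractmethod

        class DataPoint(ABC):
            registry = {}                      # URL scheme -> plugin class

            @classmethod
            def register(cls, scheme):
                def deco(plugin):
                    cls.registry[scheme] = plugin
                    return plugin
                return deco

            @classmethod
            def create(cls, url):
                scheme = url.split("://", 1)[0]
                return cls.registry[scheme](url)   # pick the plugin for this protocol

            @abstractmethod
            def transfer(self): ...

        @DataPoint.register("gridftp")
        class GridFTPPoint(DataPoint):
            def __init__(self, url): self.url = url
            def transfer(self): print("gridftp transfer of", self.url)

        DataPoint.create("gridftp://example.org/file").transfer()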

  6. An Open-source Meteorological Operational System and its Installation in Portuguese- speaking Countries

    NASA Astrophysics Data System (ADS)

    Almeida, W. G.; Ferreira, A. L.; Mendes, M. V.; Ribeiro, A.; Yoksas, T.

    2007-05-01

    CPTEC, a division of Brazil’s INPE, has been using several open-source software packages for a variety of tasks in its Data Division. Among these tools are ones traditionally used in research and educational communities, such as GrADs (Grid Analysis and Display System from the Center for Ocean-Land-Atmosphere Studies (COLA)), the Local Data Manager (LDM) and GEMPAK (from Unidata), and operational tools such as the Automatic File Distributor (AFD) that are popular among National Meteorological Services. In addition, some tools developed locally at CPTEC are also being made available as open-source packages. One package is being used to manage the data from Automatic Weather Stations that INPE operates. This system uses only open-source tools such as the MySQL database, PERL scripts and Java programs for web access, and Unidata’s Internet Data Distribution (IDD) system and AFD for data delivery. All of these packages are bundled into a low-cost, easy-to-install package called the Meteorological Data Operational System. Recently, in cooperation with the SICLIMAD project, this system has been modified for use by Portuguese-speaking countries in Africa to manage data from many Automatic Weather Stations that are being installed in these countries under SICLIMAD sponsorship. In this presentation we describe the tools included in, and the architecture of, the Meteorological Data Operational System.

  7. Using Hierarchical Folders and Tags for File Management

    ERIC Educational Resources Information Center

    Ma, Shanshan

    2010-01-01

    Hierarchical folders have been widely used for managing digital files. A well constructed hierarchical structure can keep files organized. A parent folder can have several subfolders and one subfolder can only reside in one parent folder. Files are stored in folders or subfolders. Files can be found by traversing a given path, going through…
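    A toy sketch of the contrast the article draws, with invented data: a file resides under exactly one folder path, while tags form a many-to-many mapping that supports retrieval without knowing the path.

        folders = {"/taxes/2010/return.pdf": b"..."}           # one parent path per file
        tags = {"return.pdf": {"taxes", "2010", "important"}}  # many tags per file

        def find_by_path(path):
            """Hierarchy: traverse the single given path."""
            return folders.get(path)

        def find_by_tags(wanted):
            """Tags: any file carrying all wanted tags, regardless of location."""
            return [f for f, ts in tags.items() if wanted <= ts]

        print(find_by_tags({"taxes", "important"}))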

  8. Empowering file-based radio production through media asset management systems

    NASA Astrophysics Data System (ADS)

    Muylaert, Bjorn; Beckers, Tom

    2006-10-01

    In recent years, IT-based production and archiving of media has matured to a level which enables broadcasters to switch over from tape- or CD-based to file-based workflows for the production of their radio and television programs. This technology is essential for the future of broadcasters as it provides the flexibility and speed of execution the customer demands by enabling, among others, concurrent access and production, faster than real-time ingest, edit during ingest, centrally managed annotation and quality preservation of media. In terms of automation of program production, the radio department is the most advanced within the VRT, the Flemish broadcaster. For the past several years, the radio department has been working with digital equipment and producing its programs mainly on standard IT equipment. Historically, the shift from analogue to digital-based production has been a step-by-step process initiated and coordinated by each radio station separately, resulting in a multitude of tools and metadata collections, some of them developed in-house, lacking integration. To make matters worse, each of those stations adopted a slightly different production methodology. The planned introduction of a company-wide Media Asset Management System allows a coordinated overhaul to a unified production architecture. Benefits include the centralized ingest and annotation of audio material and the uniform, integrated (in terms of IT infrastructure) workflow model. Needless to say, the ingest strategy, metadata management and integration with radio production systems play a major role in the level of success of any improvement effort. This paper presents a data model for audio-specific concepts relevant to radio production. It includes an investigation of ingest techniques and strategies. Cooperation with external, professional production tools is demonstrated through a use-case scenario: the integration of an existing, multi-track editing tool with a commercially available Media Asset Management System. This will enable an uncomplicated production chain, with a recognizable look and feel for all system users, regardless of their affiliated radio station, as well as central retrieval and storage of information and metadata.

  9. Adding Data Management Services to Parallel File Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brandt, Scott

    2015-03-04

    The objective of this project, called DAMASC for “Data Management in Scientific Computing”, is to coalesce data management with parallel file system management to present a declarative interface to scientists for managing, querying, and analyzing extremely large data sets efficiently and predictably. Managing extremely large data sets is a key challenge of exascale computing. The overhead, energy, and cost of moving massive volumes of data demand designs where computation is close to storage. In current architectures, compute/analysis clusters access data in a physically separate parallel file system and largely leave it to the scientist to reduce data movement. Over the past decades the high-end computing community has adopted middleware with multiple layers of abstractions and specialized file formats such as NetCDF-4 and HDF5. These abstractions provide a limited set of high-level data processing functions, but have inherent functionality and performance limitations: middleware that provides access to the highly structured contents of scientific data files stored in the (unstructured) file systems can only optimize to the extent that file system interfaces permit; the highly structured formats of these files often impede native file system performance optimizations. We are developing Damasc, an enhanced high-performance file system with native rich data management services. Damasc will enable efficient queries and updates over files stored in their native byte-stream format while retaining the inherent performance of file system data storage via declarative queries and updates over views of underlying files. Damasc has four key benefits for the development of data-intensive scientific code: (1) applications can use important data-management services, such as declarative queries, views, and provenance tracking, that are currently available only within database systems; (2) the use of these services becomes easier, as they are provided within a familiar file-based ecosystem; (3) common optimizations, e.g., indexing and caching, are readily supported across several file formats, avoiding effort duplication; and (4) performance improves significantly, as data processing is integrated more tightly with data storage. Our key contributions are: SciHadoop, which explores changes to MapReduce's assumptions by taking advantage of the semantics of structured data while preserving MapReduce’s failure and resource management; DataMods, which extends common abstractions of parallel file systems so they become programmable such that they can be extended to natively support a variety of data models and can be hooked into emerging distributed runtimes such as Stanford’s Legion; and Miso, which combines Hadoop and relational data warehousing to minimize time to insight, taking into account the overhead of ingesting data into data warehousing.
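    As a loose illustration of declarative queries over views of underlying data, the sketch below uses SQLite purely as a stand-in; Damasc itself implements such views inside the file system over native byte-stream files rather than in a database.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("CREATE TABLE samples (t REAL, temp REAL)")   # stands in for file contents
        con.executemany("INSERT INTO samples VALUES (?, ?)",
                        [(0.0, 14.1), (1.0, 14.9), (2.0, 15.6)])
        con.execute("CREATE VIEW warm AS SELECT t, temp FROM samples WHERE temp > 14.5")
        print(con.execute("SELECT * FROM warm").fetchall())       # query the view, not the raw data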

  10. Interoperability format translation and transformation between IFC architectural design file and simulation file formats

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chao, Tian-Jy; Kim, Younghun

    Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.
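    The extract/transform/generate flow described in this record can be sketched as follows; the mapping table, unit conversion, and file names are hypothetical stand-ins, not the patented implementation.

        def extract(architecture_file):
            # stand-in for pulling the data and metadata a target simulation tool needs
            return {"wall_height_mm": 3000, "name": "Wall-01"}

        MAPPING = {"wall_height_mm": ("height_m", lambda v: v / 1000.0)}  # MVD -> transform

        def transform(data):
            out = {"name": data["name"]}
            for src, (dst, fn) in MAPPING.items():
                out[dst] = fn(data[src])   # convert to the geometry the simulator expects
            return out

        def generate(sim_data):
            return "\n".join(f"{k} = {v}" for k, v in sim_data.items())

        print(generate(transform(extract("building.ifc"))))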

  11. A design for a new catalog manager and associated file management for the Land Analysis System (LAS)

    NASA Technical Reports Server (NTRS)

    Greenhagen, Cheryl

    1986-01-01

    Due to the large number of different types of files used in an image processing system, a mechanism for file management beyond the bounds of typical operating systems is necessary. The Transportable Applications Executive (TAE) Catalog Manager was written to meet this need. Land Analysis System (LAS) users at the EROS Data Center (EDC) encountered some problems in using the TAE catalog manager, including catalog corruption, networking difficulties, and lack of a reliable tape storage and retrieval capability. These problems, coupled with the complexity of the TAE catalog manager, led to the decision to design a new file management system for LAS, tailored to the needs of the EDC user community. This design effort, which addressed catalog management, label services, associated data management, and enhancements to LAS applications, is described. The new file management design will provide many benefits including improved system integration, increased flexibility, enhanced reliability, enhanced portability, improved performance, and improved maintainability.

  12. Design and Implementation of a Web-Based Reporting and Benchmarking Center for Inpatient Glucometrics

    PubMed Central

    Schnipper, Jeffrey Lawrence; Messler, Jordan; Ramos, Pedro; Kulasa, Kristen; Nolan, Ann; Rogers, Kendall

    2014-01-01

    Background: Insulin is a top source of adverse drug events in the hospital, and glycemic control is a focus of improvement efforts across the country. Yet, the majority of hospitals have no data to gauge their performance on glycemic control, hypoglycemia rates, or hypoglycemic management. Current tools to outsource glucometrics reports are limited in availability or function. Methods: Society of Hospital Medicine (SHM) faculty designed and implemented a web-based data and reporting center that calculates glucometrics on blood glucose data files securely uploaded by users. Unit labels, care type (critical care, non–critical care), and unit type (eg, medical, surgical, mixed, pediatrics) are defined on upload allowing for robust, flexible reporting. Reports for any date range, care type, unit type, or any combination of units are available on demand for review or downloading into a variety of file formats. Four reports with supporting graphics depict glycemic control, hypoglycemia, and hypoglycemia management by patient day or patient stay. Benchmarking and performance ranking reports are generated periodically for all hospitals in the database. Results: In all, 76 hospitals have uploaded at least 12 months of data for non–critical care areas and 67 sites have uploaded critical care data. Critical care benchmarking reveals wide variability in performance. Some hospitals achieve top quartile performance in both glycemic control and hypoglycemia parameters. Conclusions: This new web-based glucometrics data and reporting tool allows hospitals to track their performance with a flexible reporting system, and provides them with external benchmarking. Tools like this help to establish standardized glucometrics and performance standards. PMID:24876426

  13. Design and implementation of a web-based reporting and benchmarking center for inpatient glucometrics.

    PubMed

    Maynard, Greg; Schnipper, Jeffrey Lawrence; Messler, Jordan; Ramos, Pedro; Kulasa, Kristen; Nolan, Ann; Rogers, Kendall

    2014-07-01

    Insulin is a top source of adverse drug events in the hospital, and glycemic control is a focus of improvement efforts across the country. Yet, the majority of hospitals have no data to gauge their performance on glycemic control, hypoglycemia rates, or hypoglycemic management. Current tools to outsource glucometrics reports are limited in availability or function. Society of Hospital Medicine (SHM) faculty designed and implemented a web-based data and reporting center that calculates glucometrics on blood glucose data files securely uploaded by users. Unit labels, care type (critical care, non-critical care), and unit type (eg, medical, surgical, mixed, pediatrics) are defined on upload allowing for robust, flexible reporting. Reports for any date range, care type, unit type, or any combination of units are available on demand for review or downloading into a variety of file formats. Four reports with supporting graphics depict glycemic control, hypoglycemia, and hypoglycemia management by patient day or patient stay. Benchmarking and performance ranking reports are generated periodically for all hospitals in the database. In all, 76 hospitals have uploaded at least 12 months of data for non-critical care areas and 67 sites have uploaded critical care data. Critical care benchmarking reveals wide variability in performance. Some hospitals achieve top quartile performance in both glycemic control and hypoglycemia parameters. This new web-based glucometrics data and reporting tool allows hospitals to track their performance with a flexible reporting system, and provides them with external benchmarking. Tools like this help to establish standardized glucometrics and performance standards. © 2014 Diabetes Technology Society.
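    A back-of-envelope sketch of one glucometric of the kind the reports above compute: hypoglycemic patient-days as a share of monitored patient-days. The threshold and data are invented for illustration.

        from collections import defaultdict

        readings = [  # (patient, day, glucose in mg/dL)
            ("A", 1, 65), ("A", 1, 110), ("A", 2, 140),
            ("B", 1, 95), ("B", 2, 55),
        ]

        days = defaultdict(list)
        for patient, day, glucose in readings:
            days[(patient, day)].append(glucose)

        # a patient-day is hypoglycemic if any reading that day is below 70 mg/dL
        hypo_days = sum(1 for vals in days.values() if min(vals) < 70)
        print(f"hypoglycemic patient-days: {100 * hypo_days / len(days):.0f}%")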

  14. Planetary Image Geometry Library

    NASA Technical Reports Server (NTRS)

    Deen, Robert C.; Pariser, Oleg

    2010-01-01

    The Planetary Image Geometry (PIG) library is a multi-mission library used for projecting images (EDRs, or Experiment Data Records) and managing their geometry for in-situ missions. A collection of models describes cameras and their articulation, allowing application programs such as mosaickers, terrain generators, and pointing correction tools to be written in a multi-mission manner, without any knowledge of parameters specific to the supported missions. Camera model objects allow transformation of image coordinates to and from view vectors in XYZ space. Pointing models, specific to each mission, describe how to orient the camera models based on telemetry or other information. Surface models describe the surface in general terms. Coordinate system objects manage the various coordinate systems involved in most missions. File objects manage access to metadata (labels, including telemetry information) in the input EDRs and RDRs (Reduced Data Records). Label models manage metadata information in output files. Site objects keep track of different locations where the spacecraft might be at a given time. Radiometry models allow correction of radiometry for an image. Mission objects contain basic mission parameters. Pointing adjustment ("nav") files allow pointing to be corrected. The object-oriented structure (C++) makes it easy to subclass just the pieces of the library that are truly mission-specific. Typically, this involves just the pointing model and coordinate systems, and parts of the file model. Once the library was developed (initially for Mars Polar Lander, MPL), adding new missions ranged from two days to a few months, resulting in significant cost savings as compared to rewriting all the application programs for each mission. Currently supported missions include Mars Pathfinder (MPF), MPL, Mars Exploration Rover (MER), Phoenix, and Mars Science Lab (MSL). Applications based on this library create the majority of operational image RDRs for those missions. A Java wrapper around the library allows parts of it to be used from Java code (via a native JNI interface). Future conversions of all or part of the library to Java are contemplated.
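    A toy pinhole model illustrates the image-coordinate-to-view-vector transformation that camera model objects perform; PIG's actual multi-mission camera models are far richer, and the parameters below are invented.

        import numpy as np

        def image_to_view(u, v, fx, fy, cx, cy):
            """Unit view vector in camera XYZ for pixel (u, v) of a pinhole camera."""
            d = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
            return d / np.linalg.norm(d)

        print(image_to_view(512, 384, fx=800.0, fy=800.0, cx=512.0, cy=384.0))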

  15. MolabIS--an integrated information system for storing and managing molecular genetics data.

    PubMed

    Truong, Cong V C; Groeneveld, Linn F; Morgenstern, Burkhard; Groeneveld, Eildert

    2011-10-31

    Long-term sample storage, tracing of data flow and data export for subsequent analyses are of great importance in genetics studies. Therefore, molecular labs do need a proper information system to handle an increasing amount of data from different projects. We have developed a molecular labs information management system (MolabIS). It was implemented as a web-based system allowing the users to capture original data at each step of their workflow. MolabIS provides essential functionality for managing information on individuals, tracking samples and storage locations, capturing raw files, importing final data from external files, searching results, accessing and modifying data. Further important features are options to generate ready-to-print reports and convert sequence and microsatellite data into various data formats, which can be used as input files in subsequent analyses. Moreover, MolabIS also provides a tool for data migration. MolabIS is designed for small-to-medium sized labs conducting Sanger sequencing and microsatellite genotyping to store and efficiently handle a relatively large amount of data. MolabIS not only helps to avoid time consuming tasks but also ensures the availability of data for further analyses. The software is packaged as a virtual appliance which can run on different platforms (e.g. Linux, Windows). MolabIS can be distributed to a wide range of molecular genetics labs since it was developed according to a general data model. Released under GPL, MolabIS is freely available at http://www.molabis.org.

  16. MolabIS - An integrated information system for storing and managing molecular genetics data

    PubMed Central

    2011-01-01

    Background Long-term sample storage, tracing of data flow and data export for subsequent analyses are of great importance in genetics studies. Therefore, molecular labs do need a proper information system to handle an increasing amount of data from different projects. Results We have developed a molecular labs information management system (MolabIS). It was implemented as a web-based system allowing the users to capture original data at each step of their workflow. MolabIS provides essential functionality for managing information on individuals, tracking samples and storage locations, capturing raw files, importing final data from external files, searching results, accessing and modifying data. Further important features are options to generate ready-to-print reports and convert sequence and microsatellite data into various data formats, which can be used as input files in subsequent analyses. Moreover, MolabIS also provides a tool for data migration. Conclusions MolabIS is designed for small-to-medium sized labs conducting Sanger sequencing and microsatellite genotyping to store and efficiently handle a relatively large amount of data. MolabIS not only helps to avoid time consuming tasks but also ensures the availability of data for further analyses. The software is packaged as a virtual appliance which can run on different platforms (e.g. Linux, Windows). MolabIS can be distributed to a wide range of molecular genetics labs since it was developed according to a general data model. Released under GPL, MolabIS is freely available at http://www.molabis.org. PMID:22040322

  17. Software for Managing Personal Files.

    ERIC Educational Resources Information Center

    Lundeen, Gerald

    1989-01-01

    Discusses the special characteristics of personal file management software and compares four microcomputer software packages: Notebook II with Bibliography and Convert, Pro-Cite with Biblio-Links, askSam, and Reference Manager. Each package is evaluated in terms of the user interface, file maintenance, retrieval capabilities, output, and…

  18. 14 CFR 406.113 - Filing documents with the Docket Management System (DMS) and sending documents to the...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Filing documents with the Docket Management... Management System (DMS) and sending documents to the administrative law judge and Assistant Chief Counsel for Litigation. (a) The Federal Docket Management System (FDMS). (1) Documents filed in a civil penalty...

  19. 14 CFR 406.113 - Filing documents with the Docket Management System (DMS) and sending documents to the...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Filing documents with the Docket Management... Management System (DMS) and sending documents to the administrative law judge and Assistant Chief Counsel for Litigation. (a) The Federal Docket Management System (FDMS). (1) Documents filed in a civil penalty...

  20. 14 CFR 406.113 - Filing documents with the Docket Management System (DMS) and sending documents to the...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Filing documents with the Docket Management... Management System (DMS) and sending documents to the administrative law judge and Assistant Chief Counsel for Litigation. (a) The Federal Docket Management System (FDMS). (1) Documents filed in a civil penalty...

  1. 41 CFR 105-54.203-2 - Active charters file.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 41 Public Contracts and Property Management 3 2012-01-01 2012-01-01 false Active charters file... Regulations System (Continued) GENERAL SERVICES ADMINISTRATION Regional Offices-General Services... charters file. The GSA Committee Management Officer retains each original signed charter in a file of...

  2. Space Communications Emulation Facility

    NASA Technical Reports Server (NTRS)

    Hill, Chante A.

    2004-01-01

    Establishing space communication between ground facilities and other satellites is a painstaking task that requires many precise calculations dealing with relay time, atmospheric conditions, and satellite positions, to name a few. The Space Communications Emulation Facility (SCEF) team here at NASA is developing a facility that will approximately emulate the conditions in space that impact space communication. The emulation facility comprises a 32-node distributed cluster of computers; each node represents a satellite or ground station. The objective of the satellites is to observe the topography of the Earth (water, vegetation, land, and ice) and relay this information back to the ground stations. Software originally designed by the University of Kansas, labeled the Emulation Manager, controls the interaction of the satellites and ground stations, as well as handling the recording of data. The Emulation Manager is installed on a Linux Operating System, employing both Java and C++ programming codes. The emulation scenarios are written in eXtensible Markup Language (XML). XML documents are designed to store, carry, and exchange data. With XML documents, data can be exchanged between incompatible systems, which makes XML ideal for this project because Linux, Mac and Windows operating systems are all used. Unfortunately, XML documents cannot display data like HTML documents. Therefore, the SCEF team uses XML Schema Definition (XSD), or just schema, to describe the structure of an XML document. Schemas are very important because they have the capability to validate the correctness of data, define restrictions on data, define data formats, and convert data between different data types, among other things. At this time, in order for the Emulation Manager to open and run an XML emulation scenario file, the user must first establish a link between the schema file and the directory under which the XML scenario files are saved. This procedure takes place on the command line on the Linux Operating System. Once this link has been established the Emulation Manager validates all the XML files in that directory against the schema file, before the actual scenario is run. Using sophisticated commercial software called the Satellite Tool Kit (STK) installed on the Linux box, the Emulation Manager is able to display the data and graphics generated by the execution of an XML emulation scenario file. The Emulation Manager software is written in Java. Since the SCEF project is in the developmental stage, the source code for this type of software is being modified to better fit the requirements of the SCEF project. Some parameters for the emulation are hard-coded, set at fixed values. Members of the SCEF team are altering the code to allow the user to choose the values of these hard-coded parameters by inserting a toolbar onto the preexisting GUI.
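    The schema-validation step described above can be sketched with a standard XML library; this uses lxml for illustration and is not SCEF's own code, and the scenario schema is an invented miniature.

        from lxml import etree

        xsd = etree.XMLSchema(etree.fromstring(b"""
        <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
          <xs:element name="scenario">
            <xs:complexType>
              <xs:attribute name="nodes" type="xs:positiveInteger" use="required"/>
            </xs:complexType>
          </xs:element>
        </xs:schema>"""))

        doc = etree.fromstring(b'<scenario nodes="32"/>')
        print(xsd.validate(doc))   # True; invalid scenario files would be rejected before a run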

  3. bioalcidae, samjs and vcffilterjs: object-oriented formatters and filters for bioinformatics files.

    PubMed

    Lindenbaum, Pierre; Redon, Richard

    2018-04-01

    Reformatting and filtering bioinformatics files are common tasks for bioinformaticians. Standard Linux tools and specific programs are usually used to perform such tasks but there is still a gap between using these tools and the programming interface of some existing libraries. In this study, we developed a set of tools, namely bioalcidae, samjs and vcffilterjs, that reformat or filter files using a JavaScript engine or a pure Java expression, taking advantage of the Java API for high-throughput sequencing data (htsjdk). https://github.com/lindenb/jvarkit. pierre.lindenbaum@univ-nantes.fr.
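    A generic sketch of the expression-filter idea: keep only records for which a user-supplied expression evaluates true. The real tools evaluate JavaScript or Java expressions against htsjdk objects; this stand-in uses Python and invented record fields.

        variants = [{"chrom": "1", "pos": 1200, "qual": 55.0},
                    {"chrom": "2", "pos": 900,  "qual": 12.0}]

        expression = "qual >= 30 and chrom == '1'"   # as supplied on the command line

        # evaluate the expression with each record's fields as the only visible names
        passed = [v for v in variants
                  if eval(expression, {"__builtins__": {}}, v)]
        print(passed)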

  4. IDG - INTERACTIVE DIF GENERATOR

    NASA Technical Reports Server (NTRS)

    Preheim, L. E.

    1994-01-01

    The Interactive DIF Generator (IDG) utility is a tool used to generate and manipulate Directory Interchange Format files (DIF). Its purpose as a specialized text editor is to create and update DIF files which can be sent to NASA's Master Directory, also referred to as the International Global Change Directory at Goddard. Many government and university data systems use the Master Directory to advertise the availability of research data. The IDG interface consists of a set of four windows: (1) the IDG main window; (2) a text editing window; (3) a text formatting and validation window; and (4) a file viewing window. The IDG main window starts up the other windows and contains a list of valid keywords. The keywords are loaded from a user-designated file and selected keywords can be copied into any active editing window. Once activated, the editing window designates the file to be edited. Upon switching from the editing window to the formatting and validation window, the user has options for making simple changes to one or more files such as inserting tabs, aligning fields, and indenting groups. The viewing window is a scrollable read-only window that allows fast viewing of any text file. IDG is an interactive tool and requires a mouse or a trackball to operate. IDG uses the X Window System to build and manage its interactive forms, and also uses the Motif widget set and runs under Sun UNIX. IDG is written in C-language for Sun computers running SunOS. This package requires the X Window System, Version 11 Revision 4, with OSF/Motif 1.1. IDG requires 1.8Mb of hard disk space. The standard distribution medium for IDG is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. The program was developed in 1991 and is a copyrighted work with all copyright vested in NASA. SunOS is a trademark of Sun Microsystems, Inc. X Window System is a trademark of Massachusetts Institute of Technology. OSF/Motif is a trademark of the Open Software Foundation, Inc. UNIX is a trademark of Bell Laboratories.

  5. 76 FR 21779 - Filing of Plats of Survey: California

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-18

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLCA 942000, L57000000.BX0000] Filing of Plats of Survey: California AGENCY: Bureau of Land Management, Interior. ACTION: Notice. SUMMARY: The plats of survey of lands described below were officially filed in the Bureau of Land Management...

  6. Introduction of a Data System at the Universite Paul Sabatier, Toulouse (France). Programme on Institutional Management in Higher Education.

    ERIC Educational Resources Information Center

    Prineau, J. P.

    The data system and its branches, computerized in 1970, provide information from the following: student records file, accountancy file, an experimental-stage personnel file, and a planning-stage facilities file. The files not only cope with the university's daily management duties but also supply the French Ministry with statistics. Two types of…

  7. 43 CFR 1822.13 - May I file electronically?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... MANAGEMENT, DEPARTMENT OF THE INTERIOR GENERAL MANAGEMENT (1000) APPLICATION PROCEDURES Filing a Document... electronic filing if an original signature is not required. If BLM requires your signature, you must file your application or document by delivery or by mailing. If you have any questions regarding which types...

  8. 43 CFR 1822.13 - May I file electronically?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... MANAGEMENT, DEPARTMENT OF THE INTERIOR GENERAL MANAGEMENT (1000) APPLICATION PROCEDURES Filing a Document... electronic filing if an original signature is not required. If BLM requires your signature, you must file your application or document by delivery or by mailing. If you have any questions regarding which types...

  9. 43 CFR 1822.13 - May I file electronically?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... MANAGEMENT, DEPARTMENT OF THE INTERIOR GENERAL MANAGEMENT (1000) APPLICATION PROCEDURES Filing a Document... electronic filing if an original signature is not required. If BLM requires your signature, you must file your application or document by delivery or by mailing. If you have any questions regarding which types...

  10. 43 CFR 1822.13 - May I file electronically?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... MANAGEMENT, DEPARTMENT OF THE INTERIOR GENERAL MANAGEMENT (1000) APPLICATION PROCEDURES Filing a Document... electronic filing if an original signature is not required. If BLM requires your signature, you must file your application or document by delivery or by mailing. If you have any questions regarding which types...

  11. Mash-up of techniques between data crawling/transfer, data preservation/stewardship and data processing/visualization technologies on a science cloud system designed for Earth and space science: a report of successful operation and science projects of the NICT Science Cloud

    NASA Astrophysics Data System (ADS)

    Murata, K. T.

    2014-12-01

    Data-intensive or data-centric science is the 4th paradigm, after observational and/or experimental science (the 1st paradigm), theoretical science (the 2nd paradigm) and numerical science (the 3rd paradigm). A science cloud is an infrastructure for this 4th methodology. The NICT Science Cloud is designed for big-data sciences of Earth, space and other fields based on modern informatics and information technologies [1]. Data flow on the cloud with the following three techniques: (1) data crawling and transfer, (2) data preservation and stewardship, and (3) data processing and visualization. Original tools and applications for these techniques have been designed and implemented, and we mash up these tools and applications on the NICT Science Cloud to build customized systems for each project. In this paper, we discuss science data processing through these three steps. For big-data science, data file deployment on a distributed storage system should be well designed in order to save storage cost and transfer time. We developed a high-bandwidth virtual remote storage system (HbVRS) and data crawling tools, NICTY/DLA and the Wide-area Observation Network Monitoring (WONM) system, respectively. Data files are saved on the cloud storage system according to both the data preservation policy and the data processing plan. The storage system is built on distributed file system middleware (Gfarm: GRID datafarm). It is effective since disaster recovery (DR) and parallel data processing are carried out simultaneously without moving these big data from storage to storage. Data files are managed by our Web application, WSDBank (World Science Data Bank). The big data on the cloud are processed via Pwrake, a workflow tool with high I/O bandwidth. There are several visualization tools on the cloud: VirtualAurora for the magnetosphere and ionosphere, VDVGE for Google Earth, STICKER for urban environment data and STARStouch for multi-disciplinary data. There are 30 projects running on the NICT Science Cloud for Earth and space science; in 2013, 56 refereed papers were published. Finally, we introduce a couple of successful Earth and space science results obtained using these three techniques on the NICT Science Cloud. [1] http://sc-web.nict.go.jp

  12. Development of a user-centered radiology teaching file system

    NASA Astrophysics Data System (ADS)

    dos Santos, Marcelo; Fujino, Asa

    2011-03-01

    Learning radiology requires systematic and comprehensive study of a large knowledge base of medical images. In this work we present the development of a digital radiology teaching file system. The proposed system has been created in order to offer a set of customized services tailored to users' contexts and their informational needs. This has been done by means of an electronic infrastructure that provides easy and integrated access to all relevant patient data at the time of image interpretation, so that radiologists and researchers can examine all available data to reach well-informed conclusions, while protecting patient data privacy and security. The system is presented as an environment that implements a distributed clinical database, including medical images, authoring tools, a repository for multimedia documents, and a peer-review model that assures dataset quality. The current implementation has shown that creating clinical data repositories in networked computer environments is a good solution for reviewing information management practices in electronic environments and for creating customized, context-based tools for users connected to the system through electronic interfaces.

  13. UAEMIAAE

    Atmospheric Science Data Center

    2013-12-19

    UAEMIAAE Aerosol product. (File version details) File version F07_0015 has better ... properties. File version F08_0016 has an improved cloud screening procedure resulting in better aerosol optical depth. ... Coverage: August - October 2004. File Format: HDF-EOS. Tools: FTP Access: Data Pool ...

  14. 75 FR 1406 - Idaho: Filing of Plats of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-11

    ... Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of surveys. SUMMARY: The Bureau of Land Management (BLM) has officially filed the plats of survey of the lands described... INFORMATION: These surveys were executed at the request of the Bureau of Land Management to meet their...

  15. 78 FR 65705 - Notice of Filing of Plats of Survey; Colorado

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-01

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLCO956000 L14200000.BJ0000] Notice of Filing of Plats of Survey; Colorado AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plats of Survey; Colorado. SUMMARY: On Thursday, October 6, 2011, the Bureau of Land Management...

  16. 77 FR 30314 - Notice of Filing of Plats

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-22

    ... DEPARTMENT OF INTERIOR Bureau of Land Management [LLCO956000 L14200000.BJ0000] Notice of Filing of Plats AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats; Colorado. SUMMARY: The Bureau of Land Management (BLM) Colorado State Office is publishing this notice to inform the...

  17. 75 FR 77659 - Notice of Stay of Filing of Plat

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-13

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLCO956000.L14200000 BJ0000] Notice of Stay of Filing of Plat AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Stay of Filing of Plat. SUMMARY: On Wednesday, November 3, 2010, the Bureau of Land Management, [[Page 77660

  18. 76 FR 74073 - Notice of Filing of Plats

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-30

    ... DEPARTMENT OF INTERIOR Bureau of Land Management [LLCO956000.L14200000 BJ0000] Notice of Filing of Plats AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plats; Colorado SUMMARY: The Bureau of Land Management (BLM) Colorado State Office is publishing this notice to inform the...

  19. 77 FR 17499 - Notice of Filing of Plats

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-26

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLCO956000.L14200000 BJ0000] Notice of Filing of Plats AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plats; Colorado. SUMMARY: The Bureau of Land Management (BLM) Colorado State Office is publishing this notice to inform the...

  20. 77 FR 30550 - Notice of Filing of Plats

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-23

    ... DEPARTMENT OF INTERIOR Bureau of Land Management [LLCO956000 L14200000.BJ0000] Notice of Filing of Plats AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plats; Colorado. SUMMARY: The Bureau of Land Management (BLM) Colorado State Office is publishing this notice to inform the...

  1. 75 FR 4412 - Eastern States: Filing of Plat of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-27

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLES956000-L14200000-BJ0000-LXSITRST0000] Eastern States: Filing of Plat of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey; Minnesota and Wisconsin. SUMMARY: The Bureau of Land Management (BLM) will...

  2. 78 FR 12348 - Notice of Filing of Plats of Survey; Colorado

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-22

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLCO956000 L14200000.BJ0000] Notice of Filing of Plats of Survey; Colorado AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plats of Survey; Colorado. SUMMARY: The Bureau of Land Management (BLM) Colorado State Office is...

  3. 77 FR 35422 - Filing of Plats of Survey, Wyoming

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-13

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLWY-957400-12-L14200000-BJ0000] Filing of Plats of Survey, Wyoming AGENCY: Bureau of Land Management, Interior. ACTION: Notice. SUMMARY: The Bureau of Land Management (BLM) has filed the plats of survey of the lands described below in the BLM...

  4. 77 FR 17499 - Notice of Filing of Plats

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-26

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLCO956000.L14200000 BJ0000] Notice of Filing of Plats AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats; Colorado. SUMMARY: The Bureau of Land Management (BLM) Colorado State Office is publishing this notice to inform the...

  5. 78 FR 58343 - Notice of Filing of Plats of Survey; Colorado

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-23

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLCO956000 L14200000.BJ0000] Notice of Filing of Plats of Survey; Colorado AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plats of Survey; Colorado SUMMARY: The Bureau of Land Management (BLM) Colorado State Office is...

  6. 76 FR 38415 - Notice of Filing of Plats

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-30

    ... DEPARTMENT OF INTERIOR Bureau of Land Management [LLCO956000.L14200000 BJ0000] Notice of Filing of Plats AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats. SUMMARY: The Bureau of Land Management (BLM) Colorado State Office is publishing this notice to inform the public of...

  7. 77 FR 66630 - Notice of Filing of Plats of Survey; Colorado

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-06

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLCO956000 L14200000.BJ0000] Notice of Filing of Plats of Survey; Colorado AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey; Colorado. SUMMARY: The Bureau of Land Management (BLM) Colorado State Office is...

  8. 75 FR 31811 - Notice of filing of plats

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-04

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLCO956000.L14200000 BJ0000] Notice of filing of plats AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats. SUMMARY: The Bureau of Land Management (BLM) is publishing this notice to inform the public of the intent to...

  9. 78 FR 12349 - Notice of Filing of Plats of Survey; Colorado

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-22

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLCO956000 L14200000.BJ0000] Notice of Filing of Plats of Survey; Colorado AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plats of Survey; Colorado. SUMMARY: The Bureau of Land Management (BLM) Colorado State Office is...

  10. 77 FR 45651 - Notice of Filing of Plats

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-01

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLCO956000 L14200000.BJ0000] Notice of Filing of Plats AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plats; Colorado. SUMMARY: The Bureau of Land Management (BLM) Colorado State Office is publishing this notice to inform the...

  11. 77 FR 50712 - Notice of Filing of Plats; Colorado.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLCO956000 L14200000.BJ0000] Notice of Filing of Plats; Colorado. AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats; Colorado. SUMMARY: The Bureau of Land Management (BLM) Colorado State Office is publishing this notice to...

  12. 77 FR 58862 - Notice of Filing of Plats of Survey; Colorado

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-24

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLCO956000 L14200000.BJ0000] Notice of Filing of Plats of Survey; Colorado AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plats of Survey; Colorado SUMMARY: The Bureau of Land Management (BLM) Colorado State Office is...

  13. 75 FR 67766 - Notice of Filing of Plats

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-03

    ... DEPARTMENT OF INTERIOR Bureau of Land Management Notice of Filing of Plats [LLCO956000.L14200000 BJ0000] AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats. SUMMARY: The Bureau of Land Management (BLM) Colorado State Office is publishing this notice to inform the public of...

  14. 78 FR 74160 - Notice of Filing of Plats of Survey; Colorado

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-10

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLCO956000 L14200000.BJ0000] Notice of Filing of Plats of Survey; Colorado AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey; Colorado. SUMMARY: The Bureau of Land Management (BLM) Colorado State Office is...

  15. 78 FR 32439 - Notice of Filing of Plats of Survey; Colorado

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-30

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLCO956000 L14200000.BJ0000] Notice of Filing of Plats of Survey; Colorado AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plats of Survey; Colorado SUMMARY: The Bureau of Land Management (BLM) Colorado State Office is...

  16. 76 FR 79707 - Notice of Filing of Plats

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-22

    ... DEPARTMENT OF INTERIOR Bureau of Land Management [LLCO956000.L14200000 BJ0000] Notice of Filing of Plats AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plats; Colorado. SUMMARY: The Bureau of Land Management (BLM) Colorado State Office is publishing this notice to inform the...

  17. 78 FR 20943 - Notice of Filing of Plats of Survey; Colorado

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-08

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLCO956000 L14200000.BJ0000] Notice of Filing of Plats of Survey; Colorado AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plats of Survey; Colorado SUMMARY: The Bureau of Land Management (BLM) Colorado State Office is...

  18. 77 FR 36573 - Notice of Filing of Plats

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-19

    ... DEPARTMENT OF INTERIOR Bureau of Land Management [LLCO956000 L14200000.BJ0000] Notice of Filing of Plats AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plats; Colorado. SUMMARY: The Bureau of Land Management (BLM) Colorado State Office is publishing this notice to inform the...

  19. 78 FR 42799 - Notice of Filing of Plats of Survey; Colorado.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-17

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLCO956000 L14200000.BJ0000] Notice of Filing of Plats of Survey; Colorado. AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plats of Survey; Colorado SUMMARY: The Bureau of Land Management (BLM) Colorado State Office is...

  20. Innovative Stormwater Quality Tools by SARA for Holistic Watershed Master Planning

    NASA Astrophysics Data System (ADS)

    Thomas, S. M.; Su, Y. C.; Hummel, P. R.

    2016-12-01

    Stormwater management strategies such as Best Management Practices (BMP) and Low-Impact Development (LID) have increasingly gained attention in urban runoff control, becoming vital to holistic watershed master plans. These strategies can help address existing water quality impairments and support regulatory compliance, as well as guide planning and management of future development when substantial population growth and urbanization are projected to occur. However, past efforts have been limited to qualitative planning due to the lack of suitable tools to conduct quantitative assessment. The San Antonio River Authority (SARA), with the assistance of Lockwood, Andrews & Newnam, Inc. (LAN) and AQUA TERRA Consultants (a division of RESPEC), developed comprehensive hydrodynamic and water quality models using the Hydrological Simulation Program-FORTRAN (HSPF) for several urban watersheds in the San Antonio River Basin. These models enabled watershed managers to look at water quality issues on a more refined temporal and spatial scale than the limited monitoring data allowed. They also provided a means to locate and quantify potential water quality impairments and evaluate the effects of mitigation measures. To support the models, a suite of software tools was developed, including: 1) SARA Timeseries Utility Tool for managing and processing large model timeseries files, 2) SARA Load Reduction Tool to determine load reductions needed to achieve screening levels for each modeled constituent on a sub-basin basis, and 3) SARA Enhanced BMP Tool to determine the optimal combination of BMP types and units needed to achieve the required load reductions. Using these SARA models and tools, water quality agencies and stormwater professionals can determine the optimal combinations of BMP/LID to accomplish their goals and save substantial stormwater infrastructure and management costs. The tools can also help regulators and permittees evaluate the feasibility of achieving compliance using BMP/LID. The project has gained national attention, being showcased in multiple newsletters, professional magazines, and conference presentations. The project also won the Texas American Council of Engineering Companies (ACEC) Gold Medal Award and the ACEC National Recognition Award in 2016.
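    As a back-of-envelope illustration of what a load-reduction tool computes per sub-basin, the sketch below derives the fractional reduction needed to bring a modeled load down to a screening level; the function name and numbers are invented.

        def required_reduction(modeled_load, screening_level):
            """Fractional load reduction needed; zero if already compliant."""
            if modeled_load <= screening_level:
                return 0.0
            return 1.0 - screening_level / modeled_load

        print(f"{required_reduction(120.0, 90.0):.0%}")   # 25% reduction needed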

  1. Rotorcraft Optimization Tools: Incorporating Rotorcraft Design Codes into Multi-Disciplinary Design, Analysis, and Optimization

    NASA Technical Reports Server (NTRS)

    Meyn, Larry A.

    2018-01-01

    One of the goals of NASA's Revolutionary Vertical Lift Technology Project (RVLT) is to provide validated tools for multidisciplinary design, analysis and optimization (MDAO) of vertical lift vehicles. As part of this effort, the software package RotorCraft Optimization Tools (RCOTOOLS) is being developed to facilitate incorporating key rotorcraft conceptual design codes into optimizations using the OpenMDAO multidisciplinary optimization framework written in Python. RCOTOOLS, also written in Python, currently supports the incorporation of the NASA Design and Analysis of RotorCraft (NDARC) vehicle sizing tool and the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics II (CAMRAD II) analysis tool into OpenMDAO-driven optimizations. Both of these tools use detailed, file-based inputs and outputs, so RCOTOOLS provides software wrappers to update input files with new design variable values, execute these codes and then extract specific response variable values from the file outputs. These wrappers are designed to be flexible and easy to use. RCOTOOLS also provides several utilities to aid in optimization model development, including Graphical User Interface (GUI) tools for browsing input and output files to locate the text strings that mark specific variables as optimization inputs and responses. This paper provides an overview of RCOTOOLS and its use.
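
    The wrapper pattern described above (rewrite a templated input file, run the code, scrape a response from the text output) is easy to sketch. The example below is a generic stand-in, not RCOTOOLS itself; the placeholder syntax, the file names, and the "GROSS WEIGHT" output label are all hypothetical.

        import re, subprocess

        # Hypothetical file-based wrapper (not RCOTOOLS): substitute design
        # variables into an input template, run the analysis code, and pull a
        # response value back out of the text output.
        def run_case(template_path, exe, design_vars):
            text = open(template_path).read()
            for name, value in design_vars.items():
                text = text.replace("{%s}" % name, repr(value))  # e.g. {rotor_radius}
            with open("case.inp", "w") as f:
                f.write(text)
            subprocess.run([exe, "case.inp"], check=True)
            out = open("case.out").read()
            match = re.search(r"GROSS WEIGHT\s*=\s*([-\d.Ee+]+)", out)
            if match is None:
                raise RuntimeError("response variable not found in case.out")
            return float(match.group(1))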

  2. 76 FR 6816 - Eastern States: Filing of Plat of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-08

    ...] Eastern States: Filing of Plat of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of... in the Federal Register, Volume 76, Number 8, on page 2133 a notice entitled ``Eastern States: Filing... 1, 2011 is officially filed. FOR FURTHER INFORMATION CONTACT: Bureau of Land Management--Eastern...

  3. 17 CFR 202.3 - Processing of filings.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... of 1939, which are routed to the Division of Investment Management, and filings under the Public Utility Holding Company Act of 1935 which are also routed to the Division of Investment Management. A... respect to filings under the Investment Company Act of 1940 and certain filings relating to investment...

  4. 30 CFR 212.51 - Records and files maintenance.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false Records and files maintenance. 212.51 Section 212.51 Mineral Resources MINERALS MANAGEMENT SERVICE, DEPARTMENT OF THE INTERIOR MINERALS REVENUE MANAGEMENT RECORDS AND FILES MAINTENANCE Oil, Gas, and OCS Sulphur-General § 212.51 Records and files...

  5. 29 CFR 409.4 - Personal responsibility for filing of reports.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 2 2011-07-01 2011-07-01 false Personal responsibility for filing of reports. 409.4... LABOR LABOR-MANAGEMENT STANDARDS REPORTS BY SURETY COMPANIES § 409.4 Personal responsibility for filing of reports. Each individual required to file a report under section 211 of the Labor-Management...

  6. 29 CFR 409.4 - Personal responsibility for filing of reports.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 2 2010-07-01 2010-07-01 false Personal responsibility for filing of reports. 409.4... LABOR LABOR-MANAGEMENT STANDARDS REPORTS BY SURETY COMPANIES § 409.4 Personal responsibility for filing of reports. Each individual required to file a report under section 211 of the Labor-Management...

  7. Automated clustering-based workload characterization

    NASA Technical Reports Server (NTRS)

    Pentakalos, Odysseas I.; Menasce, Daniel A.; Yesha, Yelena

    1996-01-01

    The demands placed on the mass storage systems at various federal agencies and national laboratories are continuously increasing in intensity. This forces system managers to constantly monitor the system, evaluate the demand placed on it, and tune it appropriately using either heuristics based on experience or analytic models. Performance models require an accurate workload characterization, which can be a laborious and time-consuming process. It became evident from our experience that a tool is necessary to automate the workload characterization process. This paper presents the design and discusses the implementation of a tool for workload characterization of mass storage systems. The main features of the tool discussed here are: (1) Automatic support for peak-period determination. Histograms of system activity are generated and presented to the user for peak-period determination; (2) Automatic clustering analysis. The data collected from the mass storage system logs are clustered using clustering algorithms and tightness measures to limit the number of generated clusters; (3) Reporting of varied file statistics. The tool computes several statistics on file sizes such as average, standard deviation, minimum, maximum, frequency, as well as average transfer time. These statistics are given on a per-cluster basis; (4) Portability. The tool can easily be used to characterize the workload in mass storage systems of different vendors. The user needs to specify, through a simple log description language, how a specific log should be interpreted. The rest of this paper is organized as follows. Section two presents basic concepts in workload characterization as they apply to mass storage systems. Section three describes clustering algorithms and tightness measures. The following section presents the architecture of the tool. Section five presents some results of workload characterization using the tool. Finally, section six presents some concluding remarks.
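
    The clustering-plus-statistics core of such a tool can be sketched compactly. The example below is a simplified stand-in using a one-dimensional k-means on file sizes; the paper's own clustering algorithms and tightness measures are not reproduced here.

        import numpy as np

        # Simplified illustration (not the paper's tool): cluster file sizes
        # with 1-D k-means, then report per-cluster statistics as described.
        def characterize(sizes, k=3, iters=50, seed=0):
            x = np.asarray(sizes, float)
            rng = np.random.default_rng(seed)
            centers = rng.choice(x, k, replace=False)
            for _ in range(iters):
                labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
                centers = np.array([x[labels == j].mean() if np.any(labels == j)
                                    else centers[j] for j in range(k)])
            for j in range(k):
                c = x[labels == j]
                if c.size:
                    print(f"cluster {j}: n={c.size} mean={c.mean():.0f} "
                          f"std={c.std():.0f} min={c.min():.0f} max={c.max():.0f}")

        characterize([3e3, 5e3, 4e6, 6e6, 2e9], k=3)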

  8. The Shoreline Management Tool - an ArcMap tool for analyzing water depth, inundated area, volume, and selected habitats, with an example for the lower Wood River Valley, Oregon

    USGS Publications Warehouse

    Snyder, Daniel T.; Haluska, Tana L.; Respini-Irwin, Darius

    2013-01-01

    The Shoreline Management Tool is a geographic information system (GIS) based program developed to assist water- and land-resource managers in assessing the benefits and effects of changes in surface-water stage on water depth, inundated area, and water volume. Additionally, the Shoreline Management Tool can be used to identify aquatic or terrestrial habitat areas where conditions may be suitable for specific plants or animals as defined by user-specified criteria including water depth, land-surface slope, and land-surface aspect. The tool can also be used to delineate areas for use in determining a variety of hydrologic budget components such as surface-water storage, precipitation, runoff, or evapotranspiration. The Shoreline Management Tool consists of two parts, a graphical user interface for use with Esri™ ArcMap™ GIS software to interact with the user to define scenarios and map results, and a spreadsheet in Microsoft® Excel® developed to display tables and graphs of the results. The graphical user interface allows the user to define a scenario consisting of an inundation level (stage), land areas (parcels), and habitats (areas meeting user-specified conditions) based on water depth, slope, and aspect criteria. The tool uses data consisting of land-surface elevation, tables of stage/volume and stage/area, and delineated parcel boundaries to produce maps (data layers) of inundated areas and areas that meet the habitat criteria. The tool can be run in a Single-Time Scenario mode or in a Time-Series Scenario mode, which uses an input file of dates and associated stages. The spreadsheet part of the tool uses a macro to process the results from the graphical user interface to create tables and graphs of inundated water volume, inundated area, dry area, and mean water depth for each land parcel based on the user-specified stage. The macro also creates tables and graphs of the area, perimeter, and number of polygons comprising the user-specified habitat areas within each parcel. The Shoreline Management Tool is highly transferable, using easily generated or readily available data. The capabilities of the tool are demonstrated using data from the lower Wood River Valley adjacent to Upper Klamath and Agency Lakes in southern Oregon.
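
    The stage-to-depth, area, and volume arithmetic at the heart of the tool is simple to state. Below is a minimal numpy sketch of that calculation, assuming a uniform cell size; it is an illustration, not the ArcMap tool's code.

        import numpy as np

        # Minimal sketch (not the Shoreline Management Tool itself): depth is
        # stage minus land-surface elevation wherever the difference is positive.
        def inundation(elevation, stage, cell_area):
            depth = np.clip(stage - elevation, 0.0, None)
            wet = depth > 0.0
            area = wet.sum() * cell_area          # inundated area
            volume = depth.sum() * cell_area      # inundated water volume
            mean_depth = depth[wet].mean() if wet.any() else 0.0
            return area, volume, mean_depth

        dem = np.array([[1.0, 2.0], [3.0, 4.0]])             # toy elevation grid
        print(inundation(dem, stage=2.5, cell_area=100.0))   # (200.0, 200.0, 1.0)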

  9. DSSTOX MASTER STRUCTURE-INDEX FILE: SDF FILE AND ...

    EPA Pesticide Factsheets

    The DSSTox Master Structure-Index File serves to consolidate, manage, and ensure quality and uniformity of the chemical and substance information spanning all DSSTox Structure Data Files, including those in development but not yet published separately on this website.

  10. FMC: a one-liner Python program to manage, classify and plot focal mechanisms

    NASA Astrophysics Data System (ADS)

    Álvarez-Gómez, José A.

    2014-05-01

    The analysis of earthquake focal mechanisms (or Seismic Moment Tensor, SMT) is a key tool in seismotectonics research. Each focal mechanism is characterized by several location parameters of the earthquake hypocenter, the earthquake size (magnitude and scalar moment tensor) and some geometrical characteristics of the rupture (nodal plane orientations, SMT components and/or SMT main axes orientations). The aim of FMC is to provide a simple but powerful tool to manage focal mechanism data. The data should be input to the program formatted as one of two of the focal mechanism formatting options of the GMT (Generic Mapping Tools) package (Wessel and Smith, 1998): the Harvard CMT convention and the single nodal plane Aki and Richards (1980) convention. The former is an SMT format that can be downloaded directly from the Global CMT site (http://www.globalcmt.org/), while the latter is the simplest way to describe earthquake rupture data. FMC is programmed in the Python language, which is distributed as Open Source and GPL-compatible, and can therefore be used to develop Free Software. Python runs on almost any machine and has wide support and presence in any operating system. The program has been conceived with the modularity and versatility of the classical UNIX-like tools. It is called from the command line and can be easily integrated into shell scripts (*NIX systems) or batch files (DOS/Windows systems). The program input and output can be done by means of ASCII files or using standard input (or redirection "<"), standard output (screen or redirection ">") and pipes ("|"). By default FMC will read the input and write the output as a Harvard CMT (psmeca formatted) ASCII file, although other formats can be used. Optionally FMC will produce a classification diagram representing the rupture type of the focal mechanisms processed. In order to provide a detailed classification of the focal mechanisms, I decided to classify them into a series of fields that include the oblique slip regimes. This approximation is similar to the Johnston et al. (1994) classification, with 7 classes of earthquakes: 1) Normal; 2) Normal - Strike-slip; 3) Strike-slip - Normal; 4) Strike-slip; 5) Strike-slip - Reverse; 6) Reverse - Strike-slip and 7) Reverse. FMC uses this classification by default in the resulting diagram, based on the Kaverina et al. (1996) projection, which improves the Frohlich and Apperson (1992) ternary diagram.
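
    FMC's UNIX-filter design (read psmeca-formatted text on standard input, write to standard output, compose with pipes) can be illustrated with a small companion filter. This is not part of FMC; it assumes, per the psmeca convention, that depth is the third whitespace-separated column.

        import sys

        # Illustrative stdin-to-stdout filter in the same UNIX-pipe style as
        # FMC (not FMC itself): keep psmeca-formatted events shallower than 35 km.
        for line in sys.stdin:
            fields = line.split()
            if len(fields) > 2:
                try:
                    depth = float(fields[2])
                except ValueError:
                    continue
                if depth < 35.0:
                    sys.stdout.write(line)

    Saved as filter_shallow.py, such a filter would slot into a pipeline along the lines of "python filter_shallow.py < catalog.psmeca | gmt psmeca ..." (a hypothetical invocation).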

  11. TCGA Expedition: A Data Acquisition and Management System for TCGA Data

    PubMed Central

    Chandran, Uma R.; Medvedeva, Olga P.; Barmada, M. Michael; Blood, Philip D.; Chakka, Anish; Luthra, Soumya; Ferreira, Antonio; Wong, Kim F.; Lee, Adrian V.; Zhang, Zhihui; Budden, Robert; Scott, J. Ray; Berndt, Annerose; Berg, Jeremy M.; Jacobson, Rebecca S.

    2016-01-01

    Background The Cancer Genome Atlas Project (TCGA) is a National Cancer Institute effort to profile at least 500 cases of 20 different tumor types using genomic platforms and to make these data, both raw and processed, available to all researchers. TCGA data are currently over 1.2 petabytes in size and include whole genome sequence (WGS), whole exome sequence, methylation, RNA expression, proteomic, and clinical datasets. Publicly accessible TCGA data are released through public portals, but many challenges exist in navigating and using data obtained from these sites. We developed TCGA Expedition to support the research community focused on computational methods for cancer research. Data obtained, versioned, and archived using TCGA Expedition support command line access at high-performance computing facilities as well as some functionality with third-party tools. For a subset of TCGA data collected at the University of Pittsburgh, we also re-associate TCGA data with de-identified data from the electronic health records. Here we describe the software as well as the architecture of our repository, methods for loading TCGA data onto multiple platforms, and security and regulatory controls that conform to federal best practices. Results TCGA Expedition software consists of a set of scripts written in Bash, Python and Java that download, extract, harmonize, version and store all TCGA data and metadata. The software generates a versioned, participant- and sample-centered, local TCGA data directory with metadata structures that directly reference the local data files as well as the original data files. The software supports flexible searches of the data via a web portal, user-centric data tracking tools, and data provenance tools. Using this software, we created a collaborative repository, the Pittsburgh Genome Resource Repository (PGRR), that enabled investigators at our institution to work with all TCGA data formats and to interrogate these data with analysis pipelines and associated tools. WGS data are especially challenging for individual investigators to use, due to issues with downloading, storage, and processing; having locally accessible WGS BAM files has proven invaluable. Conclusion Our open-source, freely available TCGA Expedition software can be used to create a local collaborative infrastructure for acquiring, managing, and analyzing TCGA data and other large public datasets. PMID:27788220

  12. NASA Uniform Files Index

    NASA Technical Reports Server (NTRS)

    1987-01-01

    This handbook is a guide for the use of all personnel engaged in handling NASA files. It is issued in accordance with the regulations of the National Archives and Records Administration, in the Code of Federal Regulations Title 36, Part 1224, Files Management; and the Federal Information Resources Management Regulation, Subpart 201-45.108, Files Management. It is intended to provide a standardized classification and filing scheme to achieve maximum uniformity and ease in maintaining and using agency records. It is a framework for consistent organization of information in an arrangement that will be useful to current and future researchers. The NASA Uniform Files Index coding structure is composed of the subject classification table used for NASA management directives and the subject groups in the NASA scientific and technical information system. It is designed to correlate files throughout NASA, and it is anticipated that it may be useful with automated filing systems. It is expected that in the conversion of current files to this arrangement it will be necessary to add tertiary subjects and make further subdivisions under the existing categories. Established primary and secondary subject categories may not be changed arbitrarily. Proposals for additional subject categories of NASA-wide applicability, and suggestions for improvement in this handbook, should be addressed to the Records Program Manager at the pertinent installation, who will forward them to the NASA Records Management Office, Code NTR, for approval. This handbook is issued in loose-leaf form and will be revised by page changes.

  13. What software tools can I use to view ERBE HDF data products?

    Atmospheric Science Data Center

    2014-12-08

    Visualize ERBE data with view_hdf, a visualization and analysis tool for accessing data stored in Hierarchical Data Format (HDF) and HDF-EOS. Alternatively, to view a file in HDFView: start HDFView, select File, select Open, and select the file to be viewed.

  14. LCFM - LIVING COLOR FRAME MAKER: PC GRAPHICS GENERATION AND MANAGEMENT TOOL FOR REAL-TIME APPLICATIONS

    NASA Technical Reports Server (NTRS)

    Truong, L. V.

    1994-01-01

    Computer graphics are often applied for better understanding and interpretation of data under observation. These graphics become more complicated when animation is required during "run-time", as found in many typical modern artificial intelligence and expert systems. Living Color Frame Maker is a solution to many of these real-time graphics problems. Living Color Frame Maker (LCFM) is a graphics generation and management tool for IBM or IBM compatible personal computers. To eliminate graphics programming, the graphic designer can use LCFM to generate computer graphics frames. The graphical frames are then saved as text files, in a readable and disclosed format, which can be easily accessed and manipulated by user programs for a wide range of "real-time" visual information applications. For example, LCFM can be implemented in a frame-based expert system for visual aids in the management of systems. For monitoring, diagnosis, and/or control purposes, circuit or system diagrams can be brought to "life" by using designated video colors and intensities to symbolize the status of hardware components (via real-time feedback from sensors). Thus the status of the system itself can be displayed. The Living Color Frame Maker is user-friendly, with graphical interfaces and on-line help instructions. All options are executed using mouse commands and are displayed on a single menu for fast and easy operation. LCFM is written in C++ using the Borland C++ 2.0 compiler for IBM PC series computers and compatible computers running MS-DOS. The program requires a mouse and an EGA/VGA display. A minimum of 77K of RAM is also required for execution. The documentation is provided in electronic form on the distribution medium in WordPerfect format. A sample MS-DOS executable is provided on the distribution medium. The standard distribution medium for this program is one 5.25 inch 360K MS-DOS format diskette. The contents of the diskette are compressed using the PKWARE archiving tools. The utility to unarchive the files, PKUNZIP.EXE, is included. The Living Color Frame Maker tool was developed in 1992.

  15. Conversion of HSPF Legacy Model to a Platform-Independent, Open-Source Language

    NASA Astrophysics Data System (ADS)

    Heaphy, R. T.; Burke, M. P.; Love, J. T.

    2015-12-01

    Since its initial development over 30 years ago, the Hydrological Simulation Program - FORTRAN (HSPF) model has been used worldwide to support water quality planning and management. In the United States, HSPF receives widespread endorsement as a regulatory tool at all levels of government and is a core component of the EPA's Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) system, which was developed to support nationwide Total Maximum Daily Load (TMDL) analysis. However, the model's legacy code and data management systems are limited in their ability to integrate with modern software and hardware and to leverage parallel computing, which has left voids in optimization, pre-, and post-processing tools. Advances in technology and in our scientific understanding of environmental processes over the last 30 years mandate that upgrades be made to HSPF to allow it to evolve and continue to be a premiere tool for water resource planners. This work aims to mitigate the challenges currently facing HSPF through two primary tasks: (1) convert the code to a modern, widely accepted, open-source, high-performance language; and (2) convert the model input and output files to a modern, widely accepted, open-source data model, library, and binary file format. Python was chosen as the new language for the code conversion. It is an interpreted, object-oriented language with dynamic semantics that has become one of the most popular open-source languages. While Python code execution can be slow compared to compiled, statically typed programming languages such as C and FORTRAN, the integration of Numba (a just-in-time specializing compiler) has allowed this challenge to be overcome. For the legacy model data management conversion, HDF5 was chosen to store the model input and output. The code conversion for HSPF's hydrologic and hydraulic modules has been completed. The converted code has been tested against HSPF's suite of "test" runs and has shown good agreement and similar execution times while using the Numba compiler. Continued verification of the accuracy of the converted code against more complex legacy applications, and improvement of execution times by incorporating an intelligent network change detection tool, is currently underway, and preliminary results will be presented.
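
    The Numba pattern the abstract relies on (compile a sequential, loop-heavy routine just-in-time) is easy to demonstrate. The toy routine below is not HSPF code; it is a generic linear-reservoir loop chosen only to show the pattern.

        import numpy as np
        from numba import njit

        # Toy example of the Numba just-in-time pattern described above (not
        # HSPF): a sequential storage-routing loop that resists vectorization.
        @njit
        def route(inflow, k):
            storage = np.empty_like(inflow)
            s = 0.0
            for t in range(inflow.size):
                s += inflow[t] - k * s   # simple linear-reservoir water balance
                storage[t] = s
            return storage

        print(route(np.ones(5), 0.5))   # first call compiles, later calls are fast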

  16. A software platform for statistical evaluation of patient respiratory patterns in radiation therapy.

    PubMed

    Dunn, Leon; Kenny, John

    2017-10-01

    The aim of this work was to design and evaluate a software tool for analysis of a patient's respiration, with the goal of optimizing the effectiveness of motion management techniques during radiotherapy imaging and treatment. A software tool, called RespAnalysis, was developed in MATLAB to analyse patient respiratory data files (.vxp) created by the Varian Real-Time Position Management (RPM) system. It provides four modules, one each for determining respiration characteristics, providing breathing coaching (biofeedback training), comparing pre- and post-training characteristics, and performing a fraction-by-fraction assessment. The modules analyse respiratory traces to determine signal characteristics and specifically use a Sample Entropy algorithm as the key means to quantify breathing irregularity. Simulated respiratory signals, as well as 91 patient RPM traces, were analysed with RespAnalysis to test the viability of using Sample Entropy for predicting breathing regularity. Retrospective assessment of patient data demonstrated that the Sample Entropy metric was a predictor of periodic irregularity in respiration data; however, it was found to be insensitive to amplitude variation. Additional waveform statistics assessing the distribution of signal amplitudes over time, coupled with the Sample Entropy method, were found to be useful in assessing breathing regularity. The RespAnalysis software tool presented in this work uses the Sample Entropy method to analyse patient respiratory data recorded for motion management purposes in radiation therapy. This is applicable during treatment simulation and during subsequent treatment fractions, providing a way to quantify breathing irregularity as well as assess the need for breathing coaching. It was demonstrated that the Sample Entropy metric was correlated to the irregularity of the patient's respiratory motion in terms of periodicity, whilst other metrics, such as percentage deviation of inhale/exhale peak positions, provided insight into respiratory amplitude regularity.
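
    Sample Entropy itself is a published, standard statistic, so its usual textbook form can be shown directly. The implementation below is a generic one, not the RespAnalysis MATLAB source; the default m and r follow common practice.

        import numpy as np

        # Standard Sample Entropy: the negative log of the conditional
        # probability that sequences matching for m points (within tolerance r)
        # also match for m+1 points. Lower values indicate more regularity.
        def sample_entropy(signal, m=2, r=0.2):
            x = np.asarray(signal, float)
            tol = r * x.std()
            def match_count(length):
                n = len(x) - length + 1
                t = np.array([x[i:i + length] for i in range(n)])
                count = 0
                for i in range(n - 1):
                    dist = np.max(np.abs(t[i + 1:] - t[i]), axis=1)
                    count += int(np.sum(dist <= tol))
                return count
            b, a = match_count(m), match_count(m + 1)
            return -np.log(a / b) if a and b else float("inf")

        breaths = np.sin(np.linspace(0, 20 * np.pi, 400))   # regular breathing
        print(sample_entropy(breaths))                      # small value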

  17. Working More Productively: Tools for Administrative Data

    PubMed Central

    Roos, Leslie L; Soodeen, Ruth-Ann; Bond, Ruth; Burchill, Charles

    2003-01-01

    Objective This paper describes a web-based resource that contains a series of tools for working with administrative data. This work in knowledge management represents an effort to document, find, and transfer concepts and techniques, both within the local research group and to a more broadly defined user community. Concepts and associated computer programs are made as “modular” as possible to facilitate easy transfer from one project to another. Study Setting/Data Sources Tools to work with a registry, longitudinal administrative data, and special files (survey and clinical) from the Province of Manitoba, Canada, in the 1990–2003 period. Data Collection Literature review and analyses of web site utilization were used to generate the findings. Principal Findings The Internet-based Concept Dictionary and SAS macros developed in Manitoba are being used in a growing number of research centers. Nearly 32,000 hits from more than 10,200 hosts in a recent month demonstrate broad interest in the Concept Dictionary. Conclusions The tools, taken together, make up a knowledge repository and research production system that aid local work and have great potential internationally. Modular software provides considerable efficiency. The merging of documentation and researcher-to-researcher dissemination keeps costs manageable. PMID:14596394

  18. Mars Reconnaissance Orbiter Uplink Analysis Tool

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; Hwang, Pauline

    2008-01-01

    This software analyzes Mars Reconnaissance Orbiter (MRO) orbital geometry with respect to Mars Exploration Rover (MER) contact windows, and is the first tool of its kind designed specifically to support MRO-MER interface coordination. Prior to this automated tool, the analysis was done manually with Excel and the UNIX command line, and the process took approximately 30 minutes for each analysis. The current automated analysis takes less than 30 seconds. This tool resides on the flight machine and uses a PHP interface that performs the entire analysis of the input files, taking into account one-way light time from another input file. Input files are copied over to the proper directories and are dynamically read into the tool's interface. The user can then choose the corresponding input files based on the time frame desired for analysis. After submission of the Web form, the tool merges the two files into a single, time-ordered listing of events for both spacecraft. The times are converted to the same reference time (Earth Transmit Time) by reading in a light time file and performing the calculations necessary to shift the time formats. The program also has the ability to vary the size of the keep-out window on the main page of the analysis tool by inputting a custom time for padding each MRO event time. The parameters on the form are read in and passed to the second page for analysis. Everything is fully coded in PHP and can be accessed by anyone with access to the machine via Web page. This uplink tool will continue to be used for the duration of the MER mission's needs for X-band uplinks. Future missions can also use the tool to check overflight times as well as potential site observation times. Adaptation of the input files to the proper format, and of the window keep-out times, would allow for other analyses. Any operations task that uses the idea of keep-out windows will have a use for this program.
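
    The central merge step (shift one spacecraft's event times to a common reference, then produce a single time-ordered listing) is sketched below. This is an illustration in Python rather than the tool's PHP, and the event tuples and one-way-light-time handling are simplified assumptions.

        import heapq

        # Simplified sketch of the merge described above (not the JPL tool):
        # events are (time_seconds, spacecraft, label); MER times are shifted
        # by a one-way light time so both lists share the same reference time.
        def merge_events(mro_events, mer_events, owlt):
            shifted = [(t + owlt, sc, label) for t, sc, label in mer_events]
            return list(heapq.merge(sorted(mro_events), sorted(shifted)))

        mro = [(10.0, "MRO", "pass start"), (40.0, "MRO", "pass end")]
        mer = [(5.0, "MER", "uplink window")]
        print(merge_events(mro, mer, owlt=12.0))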

  19. 76 FR 9596 - Notice of Stay of Filing of Plat; Colorado

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-18

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLCO956000.L14200000 BJ0000] Notice of Stay of Filing of Plat; Colorado AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Stay of Filing of Plat. SUMMARY: On Monday, December 13, 2010, the Bureau of Land Management (BLM), published a...

  20. 76 FR 70482 - Notice of Filing of Plats of Survey, New Mexico

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-14

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLNM940000. L1420000.BJ0000] Notice of Filing of Plats of Survey, New Mexico AGENCY: Bureau of Land Management, Interior. ACTION: Notice of... filed in the New Mexico State Office, Bureau of Land Management, Santa Fe, New Mexico, thirty (30...

  1. 76 FR 48174 - Notice of Stay of Filing of Plat; Colorado

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-08

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLCO956000.L14200000 BJ0000] Notice of Stay of Filing of Plat; Colorado AGENCY: Bureau of Land Management, Interior. ACTION: Notice. SUMMARY: On Friday, February 18, 2011, the Bureau of Land Management, (BLM) published a Notice of Stay of Filing of...

  2. 75 FR 6219 - Filing of Plats of Survey, WY

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-08

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLWY-957400-10-L14200000-BJ0000] Filing of Plats of Survey, WY AGENCY: Bureau of Land Management, Interior. ACTION: Notice. SUMMARY: The Bureau of Land Management (BLM) has filed the plats of survey of the lands described below in the BLM Wyoming...

  3. 76 FR 13659 - Filing of Plats of Survey, Wyoming and Nebraska

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-14

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLWY-957400-11-L14200000-BJ0000] Filing of Plats of Survey, Wyoming and Nebraska AGENCY: Bureau of Land Management, Interior. ACTION: Notice. SUMMARY: The Bureau of Land Management (BLM) has filed the plats of survey of the lands described below in...

  4. 77 FR 20842 - Notice of Filing of Plats of Survey, New Mexico

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-06

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLNM940000. L1420000.BJ0000] Notice of Filing of Plats of Survey, New Mexico AGENCY: Bureau of Land Management, Interior. ACTION: Notice of... filed in the New Mexico State Office, Bureau of Land Management, Santa Fe, New Mexico, thirty (30...

  5. 78 FR 36238 - Filing of Plats of Survey, Wyoming and Nebraska

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-17

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLWY-957400-13-L16100000-BJ0000] Filing of Plats of Survey, Wyoming and Nebraska AGENCY: Bureau of Land Management, Interior. ACTION: Notice. SUMMARY: The Bureau of Land Management (BLM) has filed the plats of survey of the lands described below in...

  6. 76 FR 55700 - Filing of Plats of Survey, Wyoming and Nebraska

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-08

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLWY-957400-11-L14200000-BJ0000] Filing of Plats of Survey, Wyoming and Nebraska AGENCY: Bureau of Land Management, Interior. ACTION: Notice. SUMMARY: The Bureau of Land Management (BLM) has filed the plats of survey of the lands described below in...

  7. 76 FR 62088 - Notice of Stay of Filing of Plat; Colorado

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-06

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLCO956000.L14200000 BJ0000] Notice of Stay of Filing of Plat; Colorado AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Stay of Filing of Plat. SUMMARY: On Monday, August 8, 2011, the Bureau of Land Management (BLM), Colorado...

  8. 76 FR 26766 - Notice of Filing of Plats of Survey, New Mexico

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-09

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLNM940000 L1420000.BJ0000] Notice of Filing of Plats of Survey, New Mexico AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing... in the New Mexico State Office, Bureau of Land Management, Santa Fe, New Mexico, thirty (30) calendar...

  9. DECOMP: a PDB decomposition tool on the web.

    PubMed

    Ordog, Rafael; Szabadka, Zoltán; Grolmusz, Vince

    2009-07-27

    The Protein Data Bank (PDB) contains high-quality structural data for computational structural biology investigations. We have earlier described a fast tool (the decomp_pdb tool) for identifying and marking missing atoms and residues in PDB files. The tool also automatically decomposes PDB entries into separate files describing ligands and polypeptide chains. Here, we describe a web interface named DECOMP for the tool. Our program correctly identifies multi-monomer ligands, and the server also offers the preprocessed ligand-protein decomposition of the complete PDB for downloading (up to 5 GB in size). Availability: http://decomp.pitgroup.org.
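
    Because the PDB format has a fixed column layout, the chain/ligand decomposition the abstract describes can be sketched in a few lines. This is a simplified illustration, not the decomp_pdb program, and it ignores the missing-atom bookkeeping entirely.

        # Simplified sketch of PDB decomposition (not decomp_pdb itself): route
        # ATOM records to one file per chain and non-water HETATM records
        # (ligands) to a ligand file, using the fixed PDB column positions.
        def decompose(pdb_path):
            chain_files = {}
            with open(pdb_path) as pdb, open("ligands.pdb", "w") as lig:
                for line in pdb:
                    if line.startswith("ATOM"):
                        chain = line[21]                 # chain identifier column
                        if chain not in chain_files:
                            chain_files[chain] = open(f"chain_{chain}.pdb", "w")
                        chain_files[chain].write(line)
                    elif line.startswith("HETATM") and line[17:20] != "HOH":
                        lig.write(line)                  # ligand atoms, minus water
            for f in chain_files.values():
                f.close()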

  10. Checkpoint-Restart in User Space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    CRUISE implements a user-space file system that stores data in main memory and transparently spills over to other storage, like local flash memory or the parallel file system, as needed. CRUISE also exposes file contents for remote direct memory access, allowing external tools to copy files to the parallel file system in the background with reduced CPU interruption.

  11. 76 FR 55700 - Eastern States: Filing of Plat of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-08

    ...] Eastern States: Filing of Plat of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plat of Survey; Louisiana. SUMMARY: The Bureau of Land Management (BLM) will file the plat of survey of the lands described below in the BLM--Eastern States office in Springfield, Virginia, 30...

  12. 76 FR 45293 - Eastern States: Filing of Plat of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-28

    ...] Eastern States: Filing of Plat of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plat of Survey; Wisconsin. SUMMARY: The Bureau of Land Management (BLM) will file the plat of survey of the lands described below in the BLM-Eastern States office in Springfield, Virginia, 30...

  13. 76 FR 65533 - Eastern States: Filing of Plat of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-21

    ...] Eastern States: Filing of Plat of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plat of survey; North Carolina. SUMMARY: The Bureau of Land Management (BLM) will file the plat of survey of the land described below in the BLM-Eastern States office in Springfield, Virginia, 30...

  14. 75 FR 65028 - Eastern States: Filing of Plat of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-21

    ...] Eastern States: Filing of Plat of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plat of Survey; North Carolina. SUMMARY: The Bureau of Land Management (BLM) will file the plat of survey of the lands described below in the BLM-Eastern States office in Springfield, Virginia, 30...

  15. 75 FR 13302 - Eastern States: Filing of Plat of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-19

    ...] Eastern States: Filing of Plat of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of survey; North Carolina and Wisconsin. SUMMARY: The Bureau of Land Management (BLM) will file the plat of survey of the lands described below in the BLM-Eastern States office in...

  16. 76 FR 55700 - Eastern States: Filing of Plat of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-08

    ...] Eastern States: Filing of Plat of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plat of Survey; Minnesota. SUMMARY: The Bureau of Land Management (BLM) will file the plat of survey of the lands described below in the BLM-Eastern States office in Springfield, Virginia, 30...

  17. 76 FR 45292 - Eastern States: Filing of Plats of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-28

    ...] Eastern States: Filing of Plats of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plats of Survey; Alabama and Wisconsin. SUMMARY: The Bureau of Land Management (BLM) will file the plats of survey of the lands described below in the BLM-Eastern States office in Springfield...

  18. 76 FR 48882 - Eastern States: Filing of Plat of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-09

    ...] Eastern States: Filing of Plat of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice Of Filing Of Plat Of Survey; Wisconsin. SUMMARY: The Bureau of Land Management (BLM) will file the plat of survey of the lands described below in the BLM-Eastern States office in Springfield, Virginia, 30...

  19. 78 FR 48900 - Eastern States: Filing of Plat of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-12

    ...] Eastern States: Filing of Plat of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plat of survey; New York. SUMMARY: The Bureau of Land Management (BLM) will file the plat of survey of the lands described below in the BLM-Eastern States office in Springfield, Virginia, 30...

  20. Distributed data analysis in ATLAS

    NASA Astrophysics Data System (ADS)

    Nilsson, Paul; Atlas Collaboration

    2012-12-01

    Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on these data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interface to the multitude of execution backends (local, batch, and grid). The PanDA workload management system provides a set of utilities called PanDA Client; with these tools users can easily submit Athena analysis jobs to the PanDA-managed resources. Distributed data are managed by Don Quijote 2 (DQ2), a system developed by ATLAS; DQ2 is used to replicate datasets according to the data distribution policies and maintains a central catalog of file locations. The operation of the grid resources is continually monitored by the Ganga Robot functional testing system, and infrequent site stress tests are performed using the HammerCloud system. In addition, the DAST shift team is a group of power users who take shifts to provide distributed analysis user support; this team has effectively relieved the burden of support from the developers.

  1. HDF-EOS Dump Tools

    NASA Astrophysics Data System (ADS)

    Prasad, U.; Rahabi, A.

    2001-05-01

    The following utilities, developed for dumping HDF-EOS format data, are of special use for Earth science data from NASA's Earth Observing System (EOS). This poster demonstrates their use and application. The first four tools take HDF-EOS data files as input. HDF-EOS Metadata Dumper - metadmp: extracts metadata from EOS data granules. It operates by simply copying blocks of metadata from the file to the standard output; it does not process the metadata in any way. Since all metadata in EOS granules are encoded in the Object Description Language (ODL), the output of metadmp will be in the form of complete ODL statements. EOS data granules may contain up to three different sets of metadata (Core, Archive, and Structural Metadata). HDF-EOS Contents Dumper - heosls: displays the contents of HDF-EOS files. This utility provides detailed information on the POINT, SWATH, and GRID data sets in the files; for example, it will list the geolocation fields, data fields, and objects. HDF-EOS ASCII Dumper - asciidmp: extracts fields from EOS data granules into plain ASCII text. The output from asciidmp should be easily human-readable, and with minor editing it can be made ingestible by any application with ASCII import capabilities. HDF-EOS Binary Dumper - bindmp: dumps HDF-EOS objects in binary format. This is useful for feeding its output into existing programs that do not understand HDF, for example custom software and COTS products. HDF-EOS User Friendly Metadata - UFM: useful for viewing ECS metadata. UFM takes an EOSDIS ODL metadata file and produces an HTML report of the metadata for display using a web browser. HDF-EOS METCHECK: can be invoked from either a Unix or DOS environment with a set of command-line options that direct the tool's inputs and output. METCHECK validates the inventory metadata (in a .met file) using the descriptor file (.desc) as the reference; it takes the .desc file and the .met ODL file as inputs and generates a simple output file containing the results of the checking process.

  2. The inadvertent disclosure of personal health information through peer-to-peer file sharing programs

    PubMed Central

    Neri, Emilio; Jonker, Elizabeth; Sokolova, Marina; Peyton, Liam; Neisa, Angelica; Scassa, Teresa

    2010-01-01

    Objective There has been a consistent concern about the inadvertent disclosure of personal information through peer-to-peer file sharing applications, such as Limewire and Morpheus. Examples of personal health and financial information being exposed have been published. We wanted to estimate the extent to which personal health information (PHI) is being disclosed in this way, and compare that to the extent of disclosure of personal financial information (PFI). Design After careful review and approval of our protocol by our institutional research ethics board, files were downloaded from peer-to-peer file sharing networks and manually analyzed for the presence of PHI and PFI. The geographic region of the IP addresses was determined, and classified as either USA or Canada. Measurement We estimated the proportion of files that contain personal health and financial information for each region. We also estimated the proportion of search terms that return files with personal health and financial information. We ascertained and discuss the ethical issues related to this study. Results Approximately 0.4% of Canadian IP addresses had PHI, as did 0.5% of US IP addresses. There was more disclosure of financial information, at 1.7% of Canadian IP addresses and 4.7% of US IP addresses. An analysis of search terms used in these file sharing networks showed that a small percentage of the terms would return PHI and PFI files (ie, there are people successfully searching for PFI and PHI on the peer-to-peer file sharing networks). Conclusion There is a real risk of inadvertent disclosure of PHI through peer-to-peer file sharing networks, although the risk is not as large as for PFI. Anyone keeping PHI on their computers should avoid installing file sharing applications on their computers, or if they have to use such tools, actively manage the risks of inadvertent disclosure of their, their family's, their clients', or patients' PHI. PMID:20190057

  3. Decision & Management Tools for DNAPL Sites: Optimization of Chlorinated Solvent Source and Plume Remediation Considering Uncertainty

    DTIC Science & Technology

    2010-09-01

    differentiated between source codes and input/output files. The text makes references to a REMChlor-GoldSim model. The text also refers to the REMChlor... To the extent possible, the instructions should be accurate and precise. The documentation should differentiate between describing what is actually... Windows XP operating system. Model Input Parameters: The input parameters were identical to those utilized and reported by CDM (See Table I from...

  4. Analysis and Comparison of Various Requirements Management Tools for Use in the Shipbuilding Industry

    DTIC Science & Technology

    2006-09-01

    such products as MS Word, MS Excel, MS PowerPoint, Adobe Acrobat, Adobe FrameMaker, Claris FileMaker, Adobe PhotoShop and Adobe Illustrator, it is easy... Adobe FrameMaker, etc. Information can be exported out in the same formats as above plus HTML, MS PowerPoint, and MS Outlook. DOORS is very user... including PostScript, RTF (for PowerPoint), HTML, Interleaf, SVG, FrameMaker, HP LaserJet, HPGL, and EPS. Examples of such charts produced by DOORS

  5. [The Internet and shared decision-making between patients and healthcare providers].

    PubMed

    Silber, Denise

    2009-10-01

    Insurance companies like Kaiser Permanente in the United States remunerate physicians for their email correspondence with patients, increasing the efficiency of office visits. A survey by the French National Board of Physicians regarding the computerization of medical practices in April 2009, confirms that both physicians and patients in France are very favorable to the development of these tools. When patients can manage and/or access their medical files and determine which providers can access them, they become a true partner.

  6. Fast probabilistic file fingerprinting for big data

    PubMed Central

    2013-01-01

    Background Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. Results We present an efficient method for calculating file uniqueness for large scientific data files, that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Conclusions Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff. PMID:23445565
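
    The core idea (hash a deterministic pseudo-random sample of the file instead of reading it in full) fits in a few lines. The sketch below is a simplification for illustration, not the pfff implementation; the chunk count, chunk size, and seeding scheme are arbitrary choices.

        import hashlib, os, random

        # Illustrative sampled fingerprint (not pfff itself): hash fixed-size
        # chunks at deterministic pseudo-random offsets, so the cost is flat in
        # file size and two files collide only if they agree at every sample.
        def sampled_fingerprint(path, n_samples=64, chunk=64, seed=0):
            size = os.path.getsize(path)
            rng = random.Random(seed + size)     # same seed+size -> same offsets
            h = hashlib.sha256(str(size).encode())
            with open(path, "rb") as f:
                for _ in range(n_samples):
                    f.seek(rng.randrange(max(size - chunk, 1)))
                    h.update(f.read(chunk))
            return h.hexdigest()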

  7. The Open Microscopy Environment: open image informatics for the biological sciences

    NASA Astrophysics Data System (ADS)

    Blackburn, Colin; Allan, Chris; Besson, Sébastien; Burel, Jean-Marie; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gault, David; Gillen, Kenneth; Leigh, Roger; Leo, Simone; Li, Simon; Lindner, Dominik; Linkert, Melissa; Moore, Josh; Moore, William J.; Ramalingam, Balaji; Rozbicki, Emil; Rustici, Gabriella; Tarkowska, Aleksandra; Walczysko, Petr; Williams, Eleanor; Swedlow, Jason R.

    2016-07-01

    Despite significant advances in biological imaging and analysis, major informatics challenges remain unsolved: file formats are proprietary, storage and analysis facilities are lacking, as are standards for sharing image data and results. While the open FITS file format is ubiquitous in astronomy, astronomical imaging shares many challenges with biological imaging, including the need to share large image sets using secure, cross-platform APIs, and the need for scalable applications for processing and visualization. The Open Microscopy Environment (OME) is an open-source software framework developed to address these challenges. OME tools include: an open data model for multidimensional imaging (OME Data Model); an open file format (OME-TIFF) and library (Bio-Formats) enabling free access to images (5D+) written in more than 145 formats from many imaging domains, including FITS; and a data management server (OMERO). The Java-based OMERO client-server platform comprises an image metadata store, an image repository, visualization and analysis by remote access, allowing sharing and publishing of image data. OMERO provides a means to manage the data through a multi-platform API. OMERO's model-based architecture has enabled its extension into a range of imaging domains, including light and electron microscopy, high content screening, digital pathology and recently into applications using non-image data from clinical and genomic studies. This is made possible using the Bio-Formats library. The current release includes a single mechanism for accessing image data of all types, regardless of original file format, via Java, C/C++ and Python and a variety of applications and environments (e.g. ImageJ, Matlab and R).

  8. Ground Data System Analysis Tools to Track Flight System State Parameters for the Mars Science Laboratory (MSL) and Beyond

    NASA Technical Reports Server (NTRS)

    Allard, Dan; Deforrest, Lloyd

    2014-01-01

    Flight software parameters give space mission operators fine-tuned control over flight system configurations, enabling rapid and dynamic changes to ongoing science activities in a much more flexible manner than can be accomplished with (otherwise broadly used) configuration file based approaches. The Mars Science Laboratory (MSL) rover, Curiosity, makes extensive use of parameters to support complex, daily activities via commanded changes to those parameters in memory. However, as the loss of Mars Global Surveyor (MGS) in 2006 demonstrated, flight system management by parameters brings with it risks, including the possibility of losing track of the flight system configuration and the threat of invalid command executions. To mitigate this risk, a growing number of missions, including MSL and the Soil Moisture Active Passive (SMAP) mission, have funded efforts to implement parameter-state tracking software tools and services. This paper will discuss the engineering challenges and resulting software architecture of MSL's onboard-parameter state tracking software and discuss the road forward to make parameter management tools suitable for use on multiple missions.
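
    A ground-side ledger of commanded parameter values, diffed against values reported in telemetry, captures the essence of such tracking. The class below is entirely hypothetical (the names, fields, and telemetry dictionary are invented for illustration); it is not the MSL or SMAP design.

        # Hypothetical ground-side parameter ledger (not the MSL/SMAP tools):
        # record each commanded update and flag mismatches against telemetry.
        class ParameterLedger:
            def __init__(self):
                self.expected = {}   # name -> last commanded value
                self.history = []    # (sol, name, value) audit trail
            def command(self, name, value, sol):
                self.expected[name] = value
                self.history.append((sol, name, value))
            def audit(self, telemetry):
                """telemetry: {name: value reported by the spacecraft};
                   returns {name: (expected, reported)} for every mismatch."""
                return {n: (v, telemetry.get(n))
                        for n, v in self.expected.items()
                        if telemetry.get(n) != v}

        ledger = ParameterLedger()
        ledger.command("DRIVE_SPEED_LIMIT", 40, sol=1234)
        print(ledger.audit({"DRIVE_SPEED_LIMIT": 25}))   # flags the stale value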

  9. SUSHI: an exquisite recipe for fully documented, reproducible and reusable NGS data analysis.

    PubMed

    Hatakeyama, Masaomi; Opitz, Lennart; Russo, Giancarlo; Qi, Weihong; Schlapbach, Ralph; Rehrauer, Hubert

    2016-06-02

    Next generation sequencing (NGS) produces massive datasets consisting of billions of reads and up to thousands of samples. Subsequent bioinformatic analysis is typically done with the help of open source tools, where each application performs a single step towards the final result. This situation leaves bioinformaticians with the tasks of combining the tools, managing the data files and meta-information, documenting the analysis, and ensuring reproducibility. We present SUSHI, an agile data analysis framework that relieves bioinformaticians of the administrative challenges of their data analysis. SUSHI lets users build reproducible data analysis workflows from individual applications and manages the input data, the parameters, meta-information with user-driven semantics, and the job scripts. As distinguishing features, SUSHI provides an expert command line interface as well as a convenient web interface to run bioinformatics tools. SUSHI datasets are self-contained and self-documented on the file system. This makes them fully reproducible and ready to be shared. With the associated meta-information being formatted as plain text tables, the datasets can be readily further analyzed and interpreted outside SUSHI. SUSHI provides an exquisite recipe for analysing NGS data. By following the SUSHI recipe, SUSHI makes data analysis straightforward and takes care of documentation and administration tasks. Thus, users can fully dedicate their time to the analysis itself. SUSHI is suitable for use by bioinformaticians as well as life science researchers. It is targeted at, but by no means constrained to, NGS data analysis. Our SUSHI instance is in productive use and has served as the data analysis interface for more than 1000 data analysis projects. SUSHI source code as well as a demo server are freely available.

  10. 41 CFR 102-117.195 - Are there time limits affecting filing of a claim?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Are there time limits affecting filing of a claim? 102-117.195 Section 102-117.195 Public Contracts and Property Management... 117-TRANSPORTATION MANAGEMENT Shipping Freight § 102-117.195 Are there time limits affecting filing of...

  11. Using electronic document management systems to manage highway project files.

    DOT National Transportation Integrated Search

    2011-12-12

    "WisDOTs Bureau of Technical Services is interested in learning about the practices of other state departments of : transportation in developing and implementing an electronic document management system to manage highway : project files"

  12. SiLK: A Tool Suite for Unsampled Network Flow Analysis at Scale

    DTIC Science & Technology

    2014-06-01

    file format,” [Accessed: Feb 9, 2014]. [Online]. Available: https://tools.netsa.cert.org/silk/faq.html#file-formats [12] “2012 data breach investigations... report (DBIR),” Verizon, Tech. Rep., 2012. [Online]. Available: http://www.verizonenterprise.com/DBIR/2012/ [13] “2013 data breach investigations

  13. A Pipeline Tool for CCD Image Processing

    NASA Astrophysics Data System (ADS)

    Bell, Jon F.; Young, Peter J.; Roberts, William H.; Sebo, Kim M.

    MSSSO is part of a collaboration developing a wide field imaging CCD mosaic (WFI). As part of this project, we have developed a GUI based pipeline tool that is an integrated part of MSSSO's CICADA data acquisition environment and processes CCD FITS images as they are acquired. The tool is also designed to run as a stand alone program to process previously acquired data. IRAF tasks are used as the central engine, including the new NOAO mscred package for processing multi-extension FITS files. The STScI OPUS pipeline environment may be used to manage data and process scheduling. The Motif GUI was developed using SUN Visual Workshop. C++ classes were written to facilitate launching of IRAF and OPUS tasks. While this first version implements calibration processing up to and including flat field corrections, there is scope to extend it to other processing.

  14. Myths and realities: Defining re-engineering for a large organization

    NASA Technical Reports Server (NTRS)

    Yin, Sandra; Mccreary, Julia

    1992-01-01

    This paper describes the background and results of three studies concerning software reverse engineering, re-engineering, and reuse (R3) hosted by the Internal Revenue Service in 1991 and 1992. The situation at the Internal Revenue Service--aging, piecemeal computer systems and outdated technology maintained by a large staff--is familiar to many institutions, especially among management information systems. The IRS is distinctive for the sheer magnitude and diversity of its problems; the country's tax records are processed using assembly language and COBOL and spread across tape and network DBMS files. How do we proceed with replacing legacy systems? The three software re-engineering studies looked at methods, CASE tool support, and performed a prototype project using re-engineering methods and tools. During the course of these projects, we discovered critical issues broader than the mechanical definitions of methods and tool technology.

  15. Creating databases for biological information: an introduction.

    PubMed

    Stein, Lincoln

    2002-08-01

    The essence of bioinformatics is dealing with large quantities of information. Whether it be sequencing data, microarray data files, mass spectrometric data (e.g., fingerprints), the catalog of strains arising from an insertional mutagenesis project, or even large numbers of PDF files, there inevitably comes a time when the information can simply no longer be managed with files and directories. This is where databases come into play. This unit briefly reviews the characteristics of several database management systems, including flat file, indexed file, and relational databases, as well as ACeDB. It compares their strengths and weaknesses and offers some general guidelines for selecting an appropriate database management system.

  16. Grid data access on widely distributed worker nodes using scalla and SRM

    NASA Astrophysics Data System (ADS)

    Jakl, P.; Lauret, J.; Hanushevsky, A.; Shoshani, A.; Sim, A.; Gu, J.

    2008-07-01

    Facing the reality of storage economics, NP experiments such as RHIC/STAR have been engaged in a shift of the analysis model, and now rely heavily on cheap disks attached to processing nodes, as such a model is extremely beneficial compared with expensive centralized storage. However, exploiting storage aggregates with enhanced distributed computing capabilities such as dynamic space allocation (lifetime of spaces), file management on shared storage (lifetime of files, file pinning), storage policies, or uniform access to heterogeneous storage solutions is not an easy task. The Xrootd/Scalla system allows for storage aggregation. We will present an overview of the largest deployment of Scalla (Structured Cluster Architecture for Low Latency Access) in the world, spanning over 1000 CPUs co-sharing 350 TB of Storage Elements, and our experience of making such a model work in the RHIC/STAR standard analysis framework. We will explain the key features and our approach to making access to mass storage (HPSS) possible in such a large deployment context. Furthermore, we will give an overview of a fully 'gridified' solution using the plug-and-play features of the Scalla architecture, replacing standard storage access with the grid middleware SRM (Storage Resource Manager) components designed for space management, and will compare this solution with the standard Scalla approach in use in STAR for the past two years. Integration details, future plans, and status of development will be explained in the areas of best transfer strategy between multiple data pools and best placement with respect to load balancing and interoperability with other SRM-aware tools or implementations.

  17. GIDEP Batching Tool

    NASA Technical Reports Server (NTRS)

    Fong, Danny; Odell, Dorice; Barry, Peter; Abrahamian, Tomik

    2008-01-01

    This software provides internal, automated search mechanics for GIDEP (Government-Industry Data Exchange Program) Alert data imported from the GIDEP government Web site. The batching tool allows the import of a single parts list in tab-delimited text format into the local JPL GIDEP database. Delimiters are removed from every part number, and both the original part numbers with delimiters and the newly generated list without delimiters are run against the GIDEP imports, with any matches output. This feature only works with Netscape 2.0 or greater, or Internet Explorer 4.0 or greater. The user selects the browser button to choose a text file to import. When the submit button is pressed, the script imports alerts from the text file into the local JPL GIDEP database. The batch tool provides complete in-house control over exported material and data, with automated batch-match abilities. It can match the parts list against the database tables and yields results that aid further research and analysis. This provides more control over GIDEP information for metrics, and reports information not provided by the government site. The software yields results quickly and gives more control over external data from the government site, so that other reports not available from the external source can be generated. There is enough space to store years of data. The program relates to risk identification and management with regard to projects and GIDEP alert information encompassing flight parts for space exploration.
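
    The delimiter-stripping and two-list matching described above amounts to a simple normalization-and-lookup step. A hedged sketch in Python, with all function and field names invented for illustration:

        import re

        def strip_delims(part: str) -> str:
            """Remove common delimiter characters from a part number."""
            return re.sub(r"[-._/ ]", "", part)

        def match_alerts(parts, alert_parts):
            """Return the parts that match a GIDEP alert, comparing both the
            original part numbers and their delimiter-free forms."""
            alerts = set(alert_parts) | {strip_delims(a) for a in alert_parts}
            return [p for p in parts if p in alerts or strip_delims(p) in alerts]

        print(match_alerts(["AB-123/X", "CD 456"], ["AB123X"]))  # -> ['AB-123/X']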

  18. Use of Semi-Autonomous Tools for ISS Commanding and Monitoring

    NASA Technical Reports Server (NTRS)

    Brzezinski, Amy S.

    2014-01-01

    As the International Space Station (ISS) has moved into a utilization phase, operations have shifted to become more ground-based with fewer mission control personnel monitoring and commanding multiple ISS systems. This shift to fewer people monitoring more systems has prompted use of semi-autonomous console tools in the ISS Mission Control Center (MCC) to help flight controllers command and monitor the ISS. These console tools perform routine operational procedures while keeping the human operator "in the loop" to monitor and intervene when off-nominal events arise. Two such tools, the Pre-positioned Load (PPL) Loader and Automatic Operators Recorder Manager (AutoORM), are used by the ISS Communications RF Onboard Networks Utilization Specialist (CRONUS) flight control position. CRONUS is responsible for simultaneously commanding and monitoring the ISS Command & Data Handling (C&DH) and Communications and Tracking (C&T) systems. PPL Loader is used to uplink small pieces of frequently changed software data tables, called PPLs, to ISS computers to support different ISS operations. In order to uplink a PPL, a data load command must be built that contains multiple user-input fields. Next, a multiple step commanding and verification procedure must be performed to enable an onboard computer for software uplink, uplink the PPL, verify the PPL has incorporated correctly, and disable the computer for software uplink. PPL Loader provides different levels of automation in both building and uplinking these commands. In its manual mode, PPL Loader automatically builds the PPL data load commands but allows the flight controller to verify and save the commands for future uplink. In its auto mode, PPL Loader automatically builds the PPL data load commands for flight controller verification, but automatically performs the PPL uplink procedure by sending commands and performing verification checks while notifying CRONUS of procedure step completion. If an off-nominal condition occurs during procedure execution, PPL Loader notifies CRONUS through popup messages, allowing CRONUS to examine the situation and choose how PPL Loader should proceed with the procedure. The use of PPL Loader to perform frequent, routine PPL uplinks offloads CRONUS to better monitor two ISS systems. It also reduces procedure performance time and decreases risk of command errors. AutoORM identifies ISS communication outage periods and builds commands to lock, playback, and unlock ISS Operations Recorder files. Operations Recorder files are circular buffer files of continually recorded ISS telemetry data. Sections of these files can be locked from further writing, be played back to capture telemetry data that occurred during an ISS loss of signal (LOS) period, and then be unlocked for future recording use. Downlinked Operations Recorder files are used by mission support teams for data analysis, especially if failures occur during LOS. The commands to lock, playback, and unlock Operations Recorder files are encompassed in three different operational procedures and contain multiple user-input fields. AutoORM provides different levels of automation for building and uplinking the commands to lock, playback, and unlock Operations Recorder files. In its automatic mode, AutoORM automatically detects ISS LOS periods, then generates and uplinks the commands to lock, playback, and unlock Operations Recorder files when MCC regains signal with ISS. AutoORM also features semi-autonomous and manual modes which integrate CRONUS more into the command verification and uplink process. AutoORM's ability to automatically detect ISS LOS periods and build the necessary commands to preserve, playback, and release recorded telemetry data greatly offloads CRONUS to perform more high-level cognitive tasks, such as mission planning and anomaly troubleshooting. Additionally, since Operations Recorder commands contain numerical time input fields which are tedious for a human to manually build, AutoORM's ability to automatically build commands reduces operational command errors. PPL Loader and AutoORM demonstrate principles of semi-autonomous operational tools that will benefit future space mission operations. Both tools employ different levels of automation to perform simple and routine procedures, thereby offloading human operators to perform higher-level cognitive tasks. Because both tools provide procedure execution status and highlight off-nominal indications, the flight controller is able to intervene during procedure execution if needed. Semi-autonomous tools and systems that can perform routine procedures, yet keep human operators informed of execution, will be essential in future long-duration missions where the onboard crew will be solely responsible for spacecraft monitoring and control.
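
    The LOS-window detection and command building that AutoORM automates can be pictured with a short sketch; the sample format and command strings below are invented for illustration and are not actual MCC tooling:

        def los_periods(samples):
            """Given (timestamp, has_signal) samples in time order, return the
            (start, end) intervals during which signal with ISS was lost.
            Timestamps are assumed to be datetime objects."""
            periods, start = [], None
            for t, has_signal in samples:
                if not has_signal and start is None:
                    start = t
                elif has_signal and start is not None:
                    periods.append((start, t))
                    start = None
            if start is not None:  # still LOS at the end of the data
                periods.append((start, samples[-1][0]))
            return periods

        def recorder_commands(start, end):
            """Lock/playback/unlock sequence for one LOS period
            (command strings are illustrative only)."""
            window = f"{start:%H:%M:%S}-{end:%H:%M:%S}"
            return [f"LOCK {window}", f"PLAYBACK {window}", f"UNLOCK {window}"]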

  19. An OpenEarth Framework (OEF) for Integrating and Visualizing Earth Science Data

    NASA Astrophysics Data System (ADS)

    Moreland, J. L.; Nadeau, D. R.; Baru, C.; Crosby, C. J.

    2009-12-01

    The integration of data is essential to make transformative progress in understanding the complex processes operating at the Earth’s surface and within its interior. While our current ability to collect massive amounts of data, develop structural models, and generate high-resolution dynamics models is well developed, our ability to quantitatively integrate these data and models into holistic interpretations of Earth systems is poorly developed. We lack the basic tools to realize a first-order goal in Earth science of developing integrated 4D models of Earth structure and processes using a complete range of available constraints, at a time when the research agenda of major efforts such as EarthScope demands such a capability. Among the challenges to 3D data integration are data that may be in different coordinate spaces, units, value ranges, file formats, and data structures. While several file format standards exist, they are infrequently or incorrectly used. Metadata are often missing, misleading, or relegated to README text files alongside the data. This leaves much of the data-integration work bogged down in simple data management tasks. The OpenEarth Framework (OEF) being developed by GEON addresses these data management difficulties. The software incorporates file format parsers, data interpretation heuristics, user interfaces to prompt for missing information, and visualization techniques to merge data into a common visual model. The OEF’s data access libraries parse formal and de facto standard file formats and map their data into a common data model. The software handles file format quirks, storage details, caching, local and remote file access, and web service protocol handling. Heuristics are used to determine coordinate spaces, units, and other key data features. Where multiple data structure, naming, and file organization conventions exist, those heuristics check for each convention’s use to find a high-confidence interpretation of the data. When no convention or embedded data yields a suitable answer, the user is prompted to fill in the blanks. The OEF’s interaction libraries assist in the construction of user interfaces for data management. These libraries support data import, data prompting, data introspection, the management of the contents of a common data model, and the creation of derived data to support visualization. Finally, visualization libraries provide interactive visualization using an extended version of NASA WorldWind. The OEF viewer supports visualization of terrains, point clouds, 3D volumes, imagery, cutting planes, isosurfaces, and more. Data may be color-coded, shaded, and displayed above or below the terrain, always registered into a common coordinate space. The OEF architecture is open, and cross-platform software libraries are available separately for use with other software projects, while modules from other projects may be integrated into the OEF to extend its features. The OEF is currently being used to visualize data from EarthScope-related research in the western US.
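
    The convention-checking heuristics described above can be sketched as a scoring loop: each known convention is tested against a dataset's variable names and attributes, and the best match wins only if it clears a threshold; otherwise the user is prompted. The interface and threshold below are illustrative assumptions, not OEF's actual API:

        THRESHOLD = 0.8  # assumed confidence cutoff

        def interpret(dataset, conventions):
            """Return the highest-scoring convention, or None to signal that
            the user should be prompted for the missing information."""
            best, best_score = None, 0.0
            for conv in conventions:
                score = conv.match_score(dataset)  # fraction of expected cues found
                if score > best_score:
                    best, best_score = conv, score
            return best if best_score >= THRESHOLD else None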

  20. TabSQL: a MySQL tool to facilitate mapping user data to public databases.

    PubMed

    Xia, Xiao-Qin; McClelland, Michael; Wang, Yipeng

    2010-06-23

    With advances in high-throughput genomics and proteomics, it is challenging for biologists to deal with large data files and to map their data to annotations in public databases. We developed TabSQL, a MySQL-based application tool, for viewing, filtering and querying data files with large numbers of rows. TabSQL provides functions for downloading and installing table files from public databases including the Gene Ontology database (GO), the Ensembl databases, and genome databases from the UCSC genome bioinformatics site. Any other database that provides tab-delimited flat files can also be imported. The downloaded gene annotation tables can be queried together with users' data in TabSQL using either a graphic interface or command line. TabSQL allows queries across the user's data and public databases without programming. It is a convenient tool for biologists to annotate and enrich their data.
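
    TabSQL itself is MySQL-based with its own graphic and command-line interfaces; the sketch below reproduces only the underlying idea, a cross-table query joining a user's data with a downloaded annotation table, here using Python's built-in sqlite3 and made-up table and column names:

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("CREATE TABLE user_data (gene_id TEXT, fold_change REAL)")
        con.execute("CREATE TABLE go_annotation (gene_id TEXT, term TEXT)")
        con.execute("INSERT INTO user_data VALUES ('BRCA1', 2.4)")
        con.execute("INSERT INTO go_annotation VALUES ('BRCA1', 'DNA repair')")
        # The kind of query TabSQL makes available without programming:
        for row in con.execute(
                "SELECT u.gene_id, u.fold_change, g.term "
                "FROM user_data u JOIN go_annotation g USING (gene_id)"):
            print(row)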

  1. TabSQL: a MySQL tool to facilitate mapping user data to public databases

    PubMed Central

    2010-01-01

    Background With advances in high-throughput genomics and proteomics, it is challenging for biologists to deal with large data files and to map their data to annotations in public databases. Results We developed TabSQL, a MySQL-based application tool, for viewing, filtering and querying data files with large numbers of rows. TabSQL provides functions for downloading and installing table files from public databases including the Gene Ontology database (GO), the Ensembl databases, and genome databases from the UCSC genome bioinformatics site. Any other database that provides tab-delimited flat files can also be imported. The downloaded gene annotation tables can be queried together with users' data in TabSQL using either a graphic interface or command line. Conclusions TabSQL allows queries across the user's data and public databases without programming. It is a convenient tool for biologists to annotate and enrich their data. PMID:20573251

  2. A qualitative study on personal information management (PIM) in clinical and basic sciences faculty members of a medical university in Iran

    PubMed Central

    Sedghi, Shahram; Abdolahi, Nida; Azimi, Ali; Tahamtan, Iman; Abdollahi, Leila

    2015-01-01

    Background: Personal Information Management (PIM) refers to the tools and activities used to save and retrieve personal information for future use. This study examined the PIM activities of faculty members of Iran University of Medical Sciences (IUMS) regarding their preferred PIM tools and four aspects of acquiring, organizing, storing and retrieving personal information. Methods: The qualitative design was based on a phenomenological approach, and we carried out 37 interviews with clinical and basic sciences faculty members of IUMS in 2014. The participants were selected using a random sampling method. All interviews were recorded with a digital voice recorder and then transcribed, coded, and analyzed using NVivo 8 software. Results: The use of PIM electronic tools (e-tools) was below expectation among the studied sample, and just 37% had reasonable knowledge of PIM e-tools such as external hard drives, flash memories, etc. However, all participants used both paper and electronic devices to store and access information. Internal mass storage (in laptops) and flash memories were the e-tools most used to save information. Most participants used "subject" (41.0%) and "file name" (33.7%) to save, organize and retrieve their stored information. Most users preferred paper-based rather than electronic tools to keep their personal information. Conclusion: Faculty members had little knowledge of PIM techniques and tools. Those who organized personal information could more easily retrieve the stored information for future use. Greater familiarity with PIM tools, and training courses on PIM tools and techniques, are suggested. PMID:26793648

  3. Exploiting volatile opportunistic computing resources with Lobster

    NASA Astrophysics Data System (ADS)

    Woodard, Anna; Wolf, Matthias; Mueller, Charles; Tovar, Ben; Donnelly, Patrick; Hurtado Anampa, Kenyi; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas

    2015-12-01

    Analysis of high energy physics experiments using the Compact Muon Solenoid (CMS) at the Large Hadron Collider (LHC) can be limited by availability of computing resources. As a joint effort involving computer scientists and CMS physicists at Notre Dame, we have developed an opportunistic workflow management tool, Lobster, to harvest available cycles from university campus computing pools. Lobster consists of a management server, file server, and worker processes which can be submitted to any available computing resource without requiring root access. Lobster makes use of the Work Queue system to perform task management, while the CMS specific software environment is provided via CVMFS and Parrot. Data is handled via Chirp and Hadoop for local data storage and XrootD for access to the CMS wide-area data federation. An extensive set of monitoring and diagnostic tools have been developed to facilitate system optimisation. We have tested Lobster using the 20 000-core cluster at Notre Dame, achieving approximately 8-10k tasks running simultaneously, sustaining approximately 9 Gbit/s of input data and 340 Mbit/s of output data.
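
    A minimal sketch of task submission with the Work Queue system that Lobster builds on, assuming the cctools 'work_queue' Python bindings; the command and file names are illustrative:

        import work_queue as wq

        q = wq.WorkQueue(port=9123)  # workers connect to this port
        t = wq.Task("./analyze.sh events.root out.root")
        t.specify_input_file("events.root")
        t.specify_output_file("out.root")
        q.submit(t)

        while not q.empty():
            done = q.wait(5)  # poll for finished tasks
            if done:
                print("task", done.id, "returned", done.return_status)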

  4. DataUp 2.0: Improving On a Tool For Helping Researchers Archive, Manage, and Share Their Tabular Data

    NASA Astrophysics Data System (ADS)

    Strasser, C.; Borda, S.; Cruse, P.; Kunze, J.

    2013-12-01

    There are many barriers to data management and sharing among earth and environmental scientists; among the most significant are a lack of knowledge about best practices for data management, metadata standards, or appropriate data repositories for archiving and sharing data. Last year we developed an open source web application, DataUp, to help researchers overcome these barriers. DataUp helps scientists to (1) determine whether their file is CSV compatible, (2) generate metadata in a standard format, (3) retrieve an identifier to facilitate data citation, and (4) deposit their data into a repository. With funding from the NSF via a supplemental grant to the DataONE project, we are working to improve upon DataUp. Our main goal for DataUp 2.0 is to ensure organizations and repositories are able to adopt and adapt DataUp to meet their unique needs, including connecting to analytical tools, adding new metadata schema, and expanding the list of connected data repositories. DataUp is a collaborative project between the California Digital Library, DataONE, the San Diego Supercomputing Center, and Microsoft Research Connections.
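
    Step (1) above, the CSV compatibility check, can be pictured as a small heuristic: does the file parse as CSV, and does every row have the same number of fields? This is an illustration only, not DataUp's actual rule set:

        import csv

        def is_csv_compatible(path):
            """Heuristic check: parseable as CSV with a consistent row width."""
            try:
                with open(path, newline="") as fh:
                    rows = list(csv.reader(fh))
            except (OSError, csv.Error):
                return False
            return bool(rows) and all(len(r) == len(rows[0]) for r in rows)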

  5. 36 CFR 1222.20 - How are personal files defined and managed?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 36 Parks, Forests, and Public Property 3 2011-07-01 2011-07-01 false How are personal files... Records § 1222.20 How are personal files defined and managed? (a) Personal files are defined in § 1220.18... Presidential Records Act of 1978 (44 U.S.C. 2201-2207) (see 36 CFR part 1270 of this chapter). (b) Personal...

  6. The HDF Product Designer - Interoperability in the First Mile

    NASA Astrophysics Data System (ADS)

    Lee, H.; Jelenak, A.; Habermann, T.

    2014-12-01

    Interoperable data have been a long-time goal in many scientific communities. The recent growth in analysis, visualization and mash-up applications that expect data stored in a standardized manner has brought the interoperability issue to the fore. On the other hand, producing interoperable data is often regarded as a sideline task in a typical research team, for which resources are not readily available. The HDF Group is developing a software tool aimed at lessening the burden of creating data in standards-compliant, interoperable HDF5 files. The tool, named HDF Product Designer, lowers the threshold needed to design such files by providing a user interface that combines the rich HDF5 feature set with applicable metadata conventions. Users can quickly devise new HDF5 files while at the same time seamlessly incorporating the latest best practices and conventions from their community. That is what the term interoperability in the first mile means: enabling generation of interoperable data in HDF5 files from the onset of their production. The tool also incorporates collaborative features, allowing a team approach to file design as well as easy transfer of best practices as they are developed. The current state of the tool and the plans for future development will be presented. Constructive input from interested parties is always welcome.
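
    A minimal sketch of the kind of standards-compliant file the Designer helps produce, written here with h5py; the convention name and the CF-style 'units' and 'long_name' attributes are illustrative choices, not output of the tool itself:

        import h5py
        import numpy as np

        with h5py.File("product.h5", "w") as f:
            f.attrs["Conventions"] = "CF-1.6"  # declare the convention up front
            dset = f.create_dataset("sea_surface_temp", data=np.zeros((180, 360)))
            dset.attrs["units"] = "kelvin"
            dset.attrs["long_name"] = "sea surface temperature"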

  7. deepTools: a flexible platform for exploring deep-sequencing data.

    PubMed

    Ramírez, Fidel; Dündar, Friederike; Diehl, Sarah; Grüning, Björn A; Manke, Thomas

    2014-07-01

    We present a Galaxy-based web server for processing and visualizing deeply sequenced data. The web server's core functionality consists of a suite of newly developed tools, called deepTools, that enable users with little bioinformatic background to explore the results of their sequencing experiments in a standardized setting. Users can upload pre-processed files with continuous data in standard formats and generate heatmaps and summary plots in a straightforward, yet highly customizable manner. In addition, we offer several tools for the analysis of files containing aligned reads and enable efficient and reproducible generation of normalized coverage files. As a modular and open-source platform, deepTools can easily be expanded and customized to future demands and developments. The deepTools web server is freely available at http://deeptools.ie-freiburg.mpg.de and is accompanied by extensive documentation and tutorials aimed at conveying the principles of deep-sequencing data analysis. The web server can be used without registration. deepTools can be installed locally either stand-alone or as part of Galaxy. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
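
    As a usage illustration, coverage files of the kind mentioned above are typically produced with the suite's bamCoverage command; the sketch below drives it from Python (file names assumed; bamCoverage must be installed and the BAM file indexed, and default parameters are used rather than any particular normalization):

        import subprocess

        # Produce a bigWig coverage track from an indexed BAM file.
        subprocess.run(
            ["bamCoverage", "-b", "aligned.bam", "-o", "coverage.bw"],
            check=True,
        )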

  8. Implementation of structure-mapping inference by event-file binding and action planning: a model of tool-improvisation analogies.

    PubMed

    Fields, Chris

    2011-03-01

    Structure-mapping inferences are generally regarded as dependent upon relational concepts that are understood and expressible in language by subjects capable of analogical reasoning. However, tool-improvisation inferences are executed by members of a variety of non-human primate and other species. Tool improvisation requires correctly inferring the motion and force-transfer affordances of an object; hence tool improvisation requires structure mapping driven by relational properties. Observational and experimental evidence can be interpreted to indicate that structure-mapping analogies in tool improvisation are implemented by multi-step manipulation of event files by binding and action-planning mechanisms that act in a language-independent manner. A functional model of language-independent event-file manipulations that implement structure mapping in the tool-improvisation domain is developed. This model provides a mechanism by which motion and force representations commonly employed in tool-improvisation structure mappings may be sufficiently reinforced to be available to inwardly directed attention and hence conceptualization. Predictions and potential experimental tests of this model are outlined.

  9. Converting Inhouse Subject Card Files to Electronic Keyword Files.

    ERIC Educational Resources Information Center

    Culmer, Carita M.

    The library at Phoenix College developed the Controversial Issues Files (CIF), a "home made" card file containing references pertinent to specific ongoing assignments. Although the CIF had proven itself to be an excellent resource tool for beginning researchers, it was cumbersome to maintain in the card format, and was limited to very…

  10. PDBToSDF: Create ligand structure files from PDB file.

    PubMed

    Muppalaneni, Naresh Babu; Rao, Allam Appa

    2011-01-01

    A Protein Data Bank (PDB) file contains atomic data for the protein and ligand in a protein-ligand complex. A structure data file (SDF) contains the atoms, bonds, connectivity, and coordinates of a ligand molecule. We describe PDBToSDF, a tool that separates the ligand data from a PDB file so that ligand properties such as molecular weight and the numbers of hydrogen bond acceptors and donors can be calculated easily.
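
    The separation step itself is simple in outline: ligand atoms appear as HETATM records in the PDB fixed-width format. A sketch of that step (not the tool's actual code; the column slice follows the published PDB format):

        def extract_ligand_atoms(pdb_path, ligand_id):
            """Return the HETATM records for one ligand in a PDB file."""
            atoms = []
            with open(pdb_path) as fh:
                for line in fh:
                    # Columns 18-20 (0-based slice 17:20) hold the residue name.
                    if line.startswith("HETATM") and line[17:20].strip() == ligand_id:
                        atoms.append(line.rstrip())
            return atoms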

  11. 76 FR 39263 - Antidumping and Countervailing Duty Proceedings: Electronic Filing Procedures; Administrative...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-06

    ... resulting from the Department's implementation of an electronic filing and documents management program... regulations that is entitled ``IA ACCESS Handbook On Electronic Filing Procedures'' (``IA ACCESS Handbook... management program named Import Administration Antidumping and Countervailing Duty Centralized Electronic...

  12. 5 CFR 1201.4 - General definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... electronic submission. (m) Electronic filing (e-filing). Filing and receiving documents in electronic form in... took the action. Appeals of Office of Personnel Management reconsideration decisions concerning... proceeding. This term applies to the Office of Personnel Management and to the Office of Special Counsel when...

  13. RELEASE NOTES FOR MODELS-3 VERSION 4.1 PATCH: SMOKE TOOL AND FILE CONVERTER

    EPA Science Inventory

    This software patch to the Models-3 system corrects minor errors in the Models-3 framework, provides substantial improvements in the ASCII to I/O API format conversion of the File Converter utility, and new functionalities for the SMOKE Tool. Version 4.1 of the Models-3 system...

  14. Creating Interactive Graphical Overlays in the Advanced Weather Interactive Processing System (AWIPS) Using Shapefiles and DGM Files

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Lafosse, Richard; Hood, Doris; Hoeth, Brian

    2007-01-01

    Graphical overlays can be created in real-time in the Advanced Weather Interactive Processing System (AWIPS) using shapefiles or DARE Graphics Metafile (DGM) files. This presentation describes how to create graphical overlays on-the-fly for AWIPS, by using two examples of AWIPS applications that were created by the Applied Meteorology Unit (AMU). The first example is the Anvil Threat Corridor Forecast Tool, which produces a shapefile that depicts a graphical threat corridor of the forecast movement of thunderstorm anvil clouds, based on the observed or forecast upper-level winds. This tool is used by the Spaceflight Meteorology Group (SMG) and 45th Weather Squadron (45 WS) to analyze the threat of natural or space vehicle-triggered lightning over a location. The second example is a launch and landing trajectory tool that produces a DGM file that plots the ground track of space vehicles during launch or landing. The trajectory tool can be used by SMG and the 45 WS forecasters to analyze weather radar imagery along a launch or landing trajectory. Advantages of both file types will be listed.
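
    A sketch of writing a polygon shapefile like the Anvil Threat Corridor product, using the pyshp package; the coordinates are made up, and the real tool derives the corridor from observed or forecast upper-level winds:

        import shapefile  # the pyshp package

        w = shapefile.Writer("anvil_corridor", shapeType=shapefile.POLYGON)
        w.field("NAME", "C")
        w.poly([[(-80.6, 28.5), (-80.2, 28.9), (-79.9, 28.7),
                 (-80.3, 28.3), (-80.6, 28.5)]])  # one closed ring
        w.record("anvil threat corridor")
        w.close()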

  15. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept, in which a master data-verification program uses multiple special-purpose subroutines and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
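
    The master-program-plus-screen-file idea reduces to a generic routine applying externally stored criteria to each record. A sketch with an illustrative screen-file format and field names:

        # Verification criteria as they might be read from a "screen file".
        SCREEN = {"stage_ft": (0.0, 50.0), "discharge_cfs": (0.0, 200000.0)}

        def verify(record):
            """Flag any value that is missing or outside its screening range."""
            flags = []
            for field, (lo, hi) in SCREEN.items():
                value = record.get(field)
                if value is None or not lo <= value <= hi:
                    flags.append(field)
            return flags  # an empty list means the record passed

        print(verify({"stage_ft": 12.3, "discharge_cfs": 999999.0}))  # ['discharge_cfs']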

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Enders, Alexander L.; Lousteau, Angela L.

    The Desktop Analysis Reporting Tool (DART) is a software package that allows users to easily view and analyze daily files that span long periods. DART gives users the capability to quickly determine the state of health of a radiation portal monitor (RPM), troubleshoot and diagnose problems, and view data in various time frames to perform trend analysis. In short, it converts the data strings written in the daily files into meaningful tables and plots. The standalone version of DART (“soloDART”) utilizes a database engine that is included with the application; no additional installations are necessary. There is also a networked version of DART (“polyDART”) that is designed to maximize the benefit of a centralized data repository while distributing the workload to individual desktop machines. This networked approach requires a more complex database manager, Microsoft SQL Server; however, SQL Server is not currently provided with DART. Regardless of which version is used, DART will import daily files from RPMs, store the relevant data in its database, and produce reports for status, trend analysis, and reporting purposes.

  17. Supporting geoscience with graphical-user-interface Internet tools for the Macintosh

    NASA Astrophysics Data System (ADS)

    Robin, Bernard

    1995-07-01

    This paper describes a suite of Macintosh graphical-user-interface (GUI) software programs that can be used in conjunction with the Internet to support geoscience education. These software programs allow science educators to access and retrieve a large body of resources from an increasing number of network sites, taking advantage of the intuitive, simple-to-use Macintosh operating system. With these tools, educators easily can locate, download, and exchange not only text files but also sound resources, video movie clips, and software application files from their desktop computers. Another major advantage of these software tools is that they are available at no cost and may be distributed freely. The following GUI software tools are described including examples of how they can be used in an educational setting: ∗ Eudora—an e-mail program ∗ NewsWatcher—a newsreader ∗ TurboGopher—a Gopher program ∗ Fetch—a software application for easy File Transfer Protocol (FTP) ∗ NCSA Mosaic—a worldwide hypertext browsing program. An explosive growth of online archives currently is underway as new electronic sites are being added continuously to the Internet. Many of these resources may be of interest to science educators who learn they can share not only ASCII text files, but also graphic image files, sound resources, QuickTime movie clips, and hypermedia projects with colleagues from locations around the world. These powerful, yet simple to learn GUI software tools are providing a revolution in how knowledge can be accessed, retrieved, and shared.

  18. Creating databases for biological information: an introduction.

    PubMed

    Stein, Lincoln

    2013-06-01

    The essence of bioinformatics is dealing with large quantities of information. Whether it be sequencing data, microarray data files, mass spectrometric data (e.g., fingerprints), the catalog of strains arising from an insertional mutagenesis project, or even large numbers of PDF files, there inevitably comes a time when the information can simply no longer be managed with files and directories. This is where databases come into play. This unit briefly reviews the characteristics of several database management systems, including flat file, indexed file, relational databases, and NoSQL databases. It compares their strengths and weaknesses and offers some general guidelines for selecting an appropriate database management system. Copyright 2013 by John Wiley & Sons, Inc.

  19. Files synchronization from a large number of insertions and deletions

    NASA Astrophysics Data System (ADS)

    Ellappan, Vijayan; Kumari, Savera

    2017-11-01

    Synchronization between different versions of files is becoming a major issue that most applications face. To make such applications more efficient, an economical algorithm is developed from the previously used File Loading Algorithm. We extend this algorithm in three ways: first, it deals with non-binary files; second, a backup is generated for uploaded files; and third, files are synchronized across insertions and deletions. A user can reconstruct a file from the former file while minimizing error, and interactive communication is provided without disturbance. The drawback of the previous system is overcome by using synchronization, in which multiple copies of each file/record are created, stored in a backup database, and efficiently restored in case of any unwanted deletion or loss of data. That is, we introduce a protocol that user B may use to reconstruct file X from file Y with suitably low probability of error. Synchronization algorithms find numerous areas of use, including data storage, file sharing, source code control systems, and cloud applications. For example, cloud storage services such as Dropbox synchronize between local copies and cloud backups each time users make changes to local versions. Similarly, synchronization tools are necessary on mobile devices. Specialized synchronization algorithms are used for video and sound editing. Synchronization tools are also capable of performing data duplication.
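
    A hedged sketch of the reconstruct-X-from-Y idea using the standard library: the sender computes an edit script of insertions and deletions, and the receiver applies it to its copy of Y. difflib opcodes stand in for the paper's protocol messages; this is an illustration, not the authors' algorithm.

        import difflib

        def make_patch(y_lines, x_lines):
            """Edit operations that turn Y into X (sent instead of X itself)."""
            sm = difflib.SequenceMatcher(a=y_lines, b=x_lines)
            return [(op, i1, i2, x_lines[j1:j2])
                    for op, i1, i2, j1, j2 in sm.get_opcodes() if op != "equal"]

        def apply_patch(y_lines, patch):
            """Rebuild X from Y plus the edit operations."""
            out, pos = [], 0
            for op, i1, i2, new in patch:
                out.extend(y_lines[pos:i1])  # copy the unchanged run
                if op in ("replace", "insert"):
                    out.extend(new)
                pos = i2
            out.extend(y_lines[pos:])
            return out

        y = ["a", "b", "c"]
        x = ["a", "x", "c", "d"]
        assert apply_patch(y, make_patch(y, x)) == x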

  20. Streamlining Metadata and Data Management for Evolving Digital Libraries

    NASA Astrophysics Data System (ADS)

    Clark, D.; Miller, S. P.; Peckman, U.; Smith, J.; Aerni, S.; Helly, J.; Sutton, D.; Chase, A.

    2003-12-01

    What began two years ago as an effort to stabilize the Scripps Institution of Oceanography (SIO) data archives from more than 700 cruises going back 50 years, has now become the operational fully-searchable "SIOExplorer" digital library, complete with thousands of historic photographs, images, maps, full text documents, binary data files, and 3D visualization experiences, totaling nearly 2 terabytes of digital content. Coping with data diversity and complexity has proven to be more challenging than dealing with large volumes of digital data. SIOExplorer has been built with scalability in mind, so that the addition of new data types and entire new collections may be accomplished with ease. It is a federated system, currently interoperating with three independent data-publishing authorities, each responsible for their own quality control, metadata specifications, and content selection. The IT architecture implemented at the San Diego Supercomputer Center (SDSC) streamlines the integration of additional projects in other disciplines with a suite of metadata management and collection building tools for "arbitrary digital objects." Metadata are automatically harvested from data files into domain-specific metadata blocks, and mapped into various specification standards as needed. Metadata can be browsed and objects can be viewed onscreen or downloaded for further analysis, with automatic proprietary-hold request management.

  1. Enabling policy planning and innovation management through patent information and co-authorship network analyses: a study of tuberculosis in Brazil.

    PubMed

    Vasconcellos, Alexandre Guimarães; Morel, Carlos Medicis

    2012-01-01

    New tools and approaches are necessary to facilitate public policy planning and foster the management of innovation in countries' public health systems. To this end, an understanding of the integrated way in which the various actors who produce scientific knowledge and inventions in technological areas of interest operate, where they are located, and how they relate to one another is of great relevance. Tuberculosis has been chosen as a model for the present study as it is a current challenge for Brazilian research and innovation. Publications about tuberculosis written by Brazilian authors were retrieved from international databases, analyzed, and processed with text-searching tools, and networks of coauthors were constructed and visualized. Patent applications about tuberculosis in Brazil were retrieved from the Brazilian National Institute of Industrial Property (INPI) and the European Patent Office databases through the use of the International Patent Classification and keywords, and then categorized and analyzed. Brazilian authorship of articles about tuberculosis jumped from 1% in 1995 to 5% in 2010. Article production and patent filings of national origin have been concentrated in public universities and research institutions, while the participation of private industry in the filing of Brazilian patents has remained limited. The goals of national patenting efforts have still not been reached, as up to the present none of the applications filed have been granted a patent. The analysis of these data on TB publishing and patents clearly demonstrates the importance of maintaining the continuity of Brazil's production development policies, as well as government support for infrastructure projects, in transforming the potential of research. Such a policy, which already exists, promotes new products and processes that, in addition to bringing diverse economic benefits to the country, will also contribute to dealing effectively with public health problems affecting Brazil and the world.

  2. Enabling Policy Planning and Innovation Management through Patent Information and Co-Authorship Network Analyses: A Study of Tuberculosis in Brazil

    PubMed Central

    Vasconcellos, Alexandre Guimarães; Morel, Carlos Medicis

    2012-01-01

    Introduction New tools and approaches are necessary to facilitate public policy planning and foster the management of innovation in countries' public health systems. To this end, an understanding of the integrated way in which the various actors who produce scientific knowledge and inventions in technological areas of interest operate, where they are located, and how they relate to one another is of great relevance. Tuberculosis has been chosen as a model for the present study as it is a current challenge for Brazilian research and innovation. Methodology Publications about tuberculosis written by Brazilian authors were retrieved from international databases, analyzed, and processed with text-searching tools, and networks of coauthors were constructed and visualized. Patent applications about tuberculosis in Brazil were retrieved from the Brazilian National Institute of Industrial Property (INPI) and the European Patent Office databases through the use of the International Patent Classification and keywords, and then categorized and analyzed. Results/Conclusions Brazilian authorship of articles about tuberculosis jumped from 1% in 1995 to 5% in 2010. Article production and patent filings of national origin have been concentrated in public universities and research institutions, while the participation of private industry in the filing of Brazilian patents has remained limited. The goals of national patenting efforts have still not been reached, as up to the present none of the applications filed have been granted a patent. The analysis of these data on TB publishing and patents clearly demonstrates the importance of maintaining the continuity of Brazil's production development policies, as well as government support for infrastructure projects, in transforming the potential of research. Such a policy, which already exists, promotes new products and processes that, in addition to bringing diverse economic benefits to the country, will also contribute to dealing effectively with public health problems affecting Brazil and the world. PMID:23056208

  3. Gee Fu: a sequence version and web-services database tool for genomic assembly, genome feature and NGS data.

    PubMed

    Ramirez-Gonzalez, Ricardo; Caccamo, Mario; MacLean, Daniel

    2011-10-01

    Scientists now use high-throughput sequencing technologies and short-read assembly methods to create draft genome assemblies in just days. Tools and pipelines, such as assemblers and workflow management environments, make it easy for a non-specialist to implement complicated pipelines to produce genome assemblies and annotations very quickly. Such accessibility results in a proliferation of assemblies and associated files, often for many organisms. These assemblies are used as a working reference by many different workers, from a bioinformatician doing gene prediction to a bench scientist designing primers for PCR. Here we describe Gee Fu, a database tool for genomic assembly and feature data, including next-generation sequence alignments. Gee Fu is a Ruby-on-Rails web application on a feature database that provides web and console interfaces for input, visualization of feature data via AnnoJ, access to data through a web-service interface, an API for direct data access by Ruby scripts, and access to feature data stored in BAM files. Gee Fu provides a platform for storing and sharing different versions of an assembly and associated features that can be accessed and updated by bench biologists and bioinformaticians in ways that are easy and useful for each. http://tinyurl.com/geefu dan.maclean@tsl.ac.uk.
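
    A hypothetical use of the web-service interface: fetch the features on an assembly region as JSON. The endpoint path and parameters below are invented for illustration; consult the Gee Fu documentation for the actual routes:

        import json
        import urllib.request

        # Assumed local Gee Fu instance and route; not the documented API.
        url = ("http://localhost:3000/features.json"
               "?reference=chr1&start=1000&end=2000")
        with urllib.request.urlopen(url) as resp:
            for feature in json.load(resp):
                print(feature)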

  4. Automated DICOM metadata and volumetric anatomical information extraction for radiation dosimetry

    NASA Astrophysics Data System (ADS)

    Papamichail, D.; Ploussi, A.; Kordolaimi, S.; Karavasilis, E.; Papadimitroulas, P.; Syrgiamiotis, V.; Efstathopoulos, E.

    2015-09-01

    Patient-specific dosimetry calculations based on simulation techniques have as a prerequisite the modeling of the modality system and the creation of voxelized phantoms. This procedure requires knowledge of the scanning parameters and patient information included in a DICOM file, as well as image segmentation. However, the extraction of this information is complicated and time-consuming. The objective of this study was to develop a simple graphical user interface (GUI) to (i) automatically extract metadata from every slice image of a DICOM file in a single query and (ii) interactively specify the regions of interest (ROI) without explicit access to the radiology information system. The user-friendly application was developed in the Matlab environment. The user can select a series of DICOM files and manage their text and graphical data. The metadata are automatically formatted and presented to the user as a Microsoft Excel file. The volumetric maps are formed by interactively specifying the ROIs and assigning a specific value to every ROI. The result is stored in DICOM format for data and trend analysis. The developed GUI is easy to use, fast, and constitutes a very useful tool for individualized dosimetry. One future goal is to incorporate remote access to a PACS server.
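
    The batch metadata-extraction step can be pictured with pydicom (the paper's GUI was built in Matlab; this Python sketch shows the same idea, and the tag selection is an illustrative assumption):

        import pydicom

        def slice_metadata(paths):
            """Collect a few header fields from every slice, skipping pixel data."""
            rows = []
            for p in paths:
                ds = pydicom.dcmread(p, stop_before_pixels=True)
                rows.append({
                    "file": p,
                    "patient_id": ds.get("PatientID", ""),
                    "modality": ds.get("Modality", ""),
                    "slice_thickness": ds.get("SliceThickness", ""),
                })
            return rows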

  5. The European Southern Observatory-MIDAS table file system

    NASA Technical Reports Server (NTRS)

    Peron, M.; Grosbol, P.

    1992-01-01

    The new and substantially upgraded version of the Table File System (TFS) in MIDAS is presented as a scientific database system. MIDAS applications for performing database operations on tables are discussed, for instance, the exchange of data to and from the TFS, the selection of objects, joins across tables, and the graphical representation of data. This upgraded version of the TFS is a full implementation of the binary table extension of the FITS format; in addition, it supports arrays of strings. Different storage strategies for optimal access to very large data sets are implemented and addressed in detail. As a simple relational database, the TFS may be used for the management of personal data files. This opens the way to intelligent pipeline processing of large amounts of data. One of the key features of the Table File System is to provide an extensive set of tools for the analysis of the final results of a reduction process. Column operations using standard and special mathematical functions as well as statistical distributions can be carried out; commands for linear regression and model fitting using nonlinear least-squares methods and user-defined functions are available. Finally, statistical hypothesis tests and multivariate methods can also operate on tables.
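
    Since the upgraded TFS implements the FITS binary table extension, such tables can also be read outside MIDAS; a sketch with astropy (the file and column names are illustrative):

        from astropy.io import fits

        with fits.open("catalog.fits") as hdul:
            table = hdul[1].data               # first binary-table extension
            bright = table[table["MAG"] < 18]  # object selection on a column
            print(bright["RA"], bright["DEC"])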

  6. 78 FR 19521 - Filing of Plats of Survey, Nebraska

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-01

    ...] Filing of Plats of Survey, Nebraska AGENCY: Bureau of Land Management, Interior. ACTION: Notice. SUMMARY: The Bureau of Land Management (BLM) is scheduled to file the plats of survey of the lands described.... Box 1828, Cheyenne, Wyoming 82003. SUPPLEMENTARY INFORMATION: These surveys were executed at the...

  7. Integrated Autonomous Network Management (IANM) Multi-Topology Route Manager and Analyzer

    DTIC Science & Technology

    2008-02-01

    Excerpt from the report's internal software organization (Figure 6-2): components include zebra, tmg, mtrcli, xinetd (tftp), and mysql; configuration files mtrrm.conf and mtrrmAggregator.properties; tftp files under /tftpboot; NetFlow PDUs; configuration upload/download via snmp and telnet; OSPFv2; and a user interface.

  8. Archive Inventory Management System (AIMS) — A Fast, Metrics Gathering Framework for Validating and Gaining Insight from Large File-Based Data Archives

    NASA Astrophysics Data System (ADS)

    Verma, R. V.

    2018-04-01

    The Archive Inventory Management System (AIMS) is a software package for understanding the distribution, characteristics, integrity, and nuances of files and directories in large file-based data archives on a continuous basis.

  9. Career Activity File: Counseling Tools for a Guidance Program, K-12.

    ERIC Educational Resources Information Center

    Oklahoma State Dept. of Career and Technology Education, Stillwater.

    This career activity file provides career information resources and tools to support a guidance program. Section 1 is a school guidance program plan designed to assist school counselors in strengthening their current program or in designing a new one. The information can be used to assist schools in meeting the requirements of Standard VI,…

  10. Social Influences on User Behavior in Group Information Repositories

    ERIC Educational Resources Information Center

    Rader, Emilee Jeanne

    2009-01-01

    Group information repositories are systems for organizing and sharing files kept in a central location that all group members can access. These systems are often assumed to be tools for storage and control of files and their metadata, not tools for communication. The purpose of this research is to better understand user behavior in group…

  11. 41 CFR 101-26.308 - Obtaining filing cabinets.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 2 2010-07-01 2010-07-01 true Obtaining filing cabinets. 101-26.308 Section 101-26.308 Public Contracts and Property Management Federal Property Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 26-PROCUREMENT SOURCES AND...

  12. 43 CFR 3900.30 - Filing documents.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 2 2013-10-01 2013-10-01 false Filing documents. 3900.30 Section 3900.30 Public Lands: Interior Regulations Relating to Public Lands (Continued) BUREAU OF LAND MANAGEMENT, DEPARTMENT OF THE INTERIOR MINERALS MANAGEMENT (3000) OIL SHALE MANAGEMENT-GENERAL Oil Shale Management...

  13. 43 CFR 3900.30 - Filing documents.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 43 Public Lands: Interior 2 2011-10-01 2011-10-01 false Filing documents. 3900.30 Section 3900.30 Public Lands: Interior Regulations Relating to Public Lands (Continued) BUREAU OF LAND MANAGEMENT, DEPARTMENT OF THE INTERIOR RANGE MANAGEMENT (4000) OIL SHALE MANAGEMENT-GENERAL Oil Shale Management...

  14. 43 CFR 3900.30 - Filing documents.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 2 2012-10-01 2012-10-01 false Filing documents. 3900.30 Section 3900.30 Public Lands: Interior Regulations Relating to Public Lands (Continued) BUREAU OF LAND MANAGEMENT, DEPARTMENT OF THE INTERIOR MINERALS MANAGEMENT (3000) OIL SHALE MANAGEMENT-GENERAL Oil Shale Management...

  15. 43 CFR 3900.30 - Filing documents.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 43 Public Lands: Interior 2 2014-10-01 2014-10-01 false Filing documents. 3900.30 Section 3900.30 Public Lands: Interior Regulations Relating to Public Lands (Continued) BUREAU OF LAND MANAGEMENT, DEPARTMENT OF THE INTERIOR MINERALS MANAGEMENT (3000) OIL SHALE MANAGEMENT-GENERAL Oil Shale Management...

  16. Extended Testability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  17. 77 FR 27479 - Filing of Plats of Survey: Oregon/Washington

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-10

    ...] Filing of Plats of Survey: Oregon/Washington AGENCY: Bureau of Land Management, Interior. ACTION: Notice... officially filed in the Bureau of Land Management Oregon/ Washington State Office, Portland, Oregon, 30 days from the date of this publication. Willamette Meridian Oregon T. 15 S., R. 2 W., accepted April 20...

  18. 75 FR 54910 - Eastern States: Filing of Plats of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-09

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLES956000-L14200000-BJ0000-LXSITRST0000] Eastern States: Filing of Plats of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of... Federal Register, Volume 75, Number 131, on page 39579 a notice entitled ``Eastern States: Filing of Plats...

  19. 75 FR 72837 - Eastern States: Filing of Plats of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-26

    ...] Eastern States: Filing of Plats of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of... (BLM) will file the plats of survey of the lands described below in the BLM-Eastern States office in... INFORMATION CONTACT: Bureau of Land Management-Eastern States, 7450 Boston Boulevard, Springfield, Virginia...

  20. 78 FR 23952 - Eastern States: Filing of Plats of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-23

    ...] Eastern States: Filing of Plats of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of... file the plats of survey of the lands described below in the BLM-Eastern States office in Springfield... CONTACT: Bureau of Land Management-Eastern States, 7450 Boston Boulevard, Springfield, Virginia 22153...

  1. 75 FR 18234 - Eastern States: Filing of Plat of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-09

    ...] Eastern States: Filing of Plat of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of...) will file the plats of survey of the lands described below in the BLM-Eastern States office in... INFORMATION CONTACT: Bureau of Land Management--Eastern States, 7450 Boston Boulevard, Springfield, Virginia...

  2. 75 FR 39579 - Eastern States: Filing of Plats of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-09

    ...] Eastern States: Filing of Plats of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of... file the plats of survey of the lands described below in the BLM-Eastern States office in Springfield... CONTACT: Bureau of Land Management-Eastern States, 7450 Boston Boulevard, Springfield, Virginia 22153...

  3. 77 FR 60719 - Filing of Plats of Survey, Wyoming

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-04

    ...] Filing of Plats of Survey, Wyoming AGENCY: Bureau of Land Management, Interior. ACTION: Notice. SUMMARY: The Bureau of Land Management (BLM) is scheduled to file the plats of survey of the lands described.... Box 1828, Cheyenne, Wyoming 82003. SUPPLEMENTARY INFORMATION: This survey was executed at the request...

  4. 78 FR 19521 - Filing of Plats of Survey: Oregon/Washington

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-01

    ...: HAG13-0161] Filing of Plats of Survey: Oregon/Washington AGENCY: Bureau of Land Management, Interior. ACTION: Notice. SUMMARY: The plats of survey of the following described lands are scheduled to be... survey must file a written notice with the Oregon State Director, Bureau of Land Management, stating that...

  5. 78 FR 64530 - Idaho: Filing of Plats of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-29

    ... of Plats of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of surveys. SUMMARY: The Bureau of Land Management (BLM) has officially filed the plats of survey of the..., 83709-1657. SUPPLEMENTARY INFORMATION: These surveys were executed at the request of the Bureau of Land...

  6. 77 FR 42759 - IDAHO: Filing of Plats of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-20

    ... of Plats of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of surveys. SUMMARY: The Bureau of Land Management (BLM) has officially filed the plats of survey of the..., 83709-1657. SUPPLEMENTARY INFORMATION: These surveys were executed at the request of the Bureau of Land...

  7. 77 FR 30314 - Eastern States: Filing of Plat of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-22

    ...] Eastern States: Filing of Plat of Survey AGENCY: Bureau of Land Management, Interior ACTION: Notice. SUMMARY: The Bureau of Land Management (BLM) will file the plat of survey of the lands described below in..., 7450 Boston Boulevard, Springfield, Virginia 22153. Attn: Cadastral Survey. Persons who use a...

  8. 76 FR 23333 - Idaho: Filing of Plats of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-26

    ... of Plats of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plats of Surveys. SUMMARY: The Bureau of Land Management (BLM) has officially filed the plats of survey of the... 83709-1657. SUPPLEMENTARY INFORMATION: These surveys were executed at the request of the Bureau of Land...

  9. 78 FR 45955 - IDAHO: Filing of Plats of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-30

    ... of Plats of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of surveys. SUMMARY: The Bureau of Land Management (BLM) has officially filed the plats of survey of the... 83709-1657. SUPPLEMENTARY INFORMATION: These surveys were executed at the request of the Bureau of Land...

  10. 77 FR 66477 - Filing of Plats of Survey: Oregon/Washington

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-05

    ...: HAG13-0040] Filing of Plats of Survey: Oregon/Washington AGENCY: Bureau of Land Management, Interior. ACTION: Notice. SUMMARY: The plats of survey of the following described lands are scheduled to be... survey must file a written notice with the Oregon State Director, Bureau of Land Management, stating that...

  11. 77 FR 21805 - Idaho: Filing of Plats of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-11

    ... of Plats of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plats of Surveys. SUMMARY: The Bureau of Land Management (BLM) has officially filed the plats of survey of the... 83709-1657. SUPPLEMENTARY INFORMATION: These surveys were executed at the request of the Bureau of Land...

  12. 76 FR 71070 - Filing of Plats of Survey, Nebraska

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-16

    ...] Filing of Plats of Survey, Nebraska AGENCY: Bureau of Land Management, Interior. ACTION: Notice. SUMMARY: The Bureau of Land Management (BLM) is scheduled to file the plats of survey of the lands described.... Box 1828, Cheyenne, Wyoming 82003. SUPPLEMENTARY INFORMATION: This survey was executed at the request...

  13. 77 FR 3791 - Idaho: Filing of Plats of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-25

    ... of Plats of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of surveys. SUMMARY: The Bureau of Land Management (BLM) has officially filed the plats of survey of the..., 83709-1657. SUPPLEMENTARY INFORMATION: These surveys were executed at the request of the Bureau of Land...

  14. Information retrieval and display system

    NASA Technical Reports Server (NTRS)

    Groover, J. L.; King, W. L.

    1977-01-01

    Versatile command-driven data management system offers users, through simplified command language, a means of storing and searching data files, sorting data files into specified orders, performing simple or complex computations, effecting file updates, and printing or displaying output data. Commands are simple to use and flexible enough to meet most data management requirements.
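
    A command language like the one described reduces, at its core, to a dispatcher from verbs to operations on stored records. A minimal sketch in Python; the command names STORE, SEARCH, and SORT are invented for illustration and are not the system's actual command language:

        # Toy command-driven data manager; command names are illustrative only.
        records = []

        def store(args):
            records.append(args)                       # add a record

        def search(args):
            return [r for r in records if args in r]   # naive substring match

        def sort_all(args):
            records.sort()                             # order the stored records

        COMMANDS = {"STORE": store, "SEARCH": search, "SORT": sort_all}

        def execute(line):
            verb, _, rest = line.partition(" ")
            return COMMANDS[verb.upper()](rest)

        execute("STORE alpha 42")
        execute("STORE beta 7")
        print(execute("SEARCH alpha"))                 # -> ['alpha 42']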

  15. 76 FR 52012 - Notice of Filing of Plats of Survey, New Mexico

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-19

    ... Filing of Plats of Survey, New Mexico AGENCY: Bureau of Land Management, Interior. ACTION: Notice of... filed in the New Mexico State Office, Bureau of Land Management, Santa Fe, New Mexico, thirty (30) calendar days from the date of this publication. SUPPLEMENTARY INFORMATION: New Mexico Principal Meridian...

  16. 77 FR 72341 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-05

    .... Applicants: Capital Research and Management Company. Description: Request for Amended Order Under Section 203 of the Federal Power Act of Capital Research and Management Company, et al. Filed Date: 11/28/12... Authorization Under Section 203 of the Federal Power Act and Request for Expedited Consideration. Filed Date: 11...

  17. 76 FR 77551 - Notice of Filing of Plats of Survey, New Mexico

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-13

    ... Filing of Plats of Survey, New Mexico AGENCY: Bureau of Land Management, Interior. ACTION: Notice of... filed in the New Mexico State Office, Bureau of Land Management, Santa Fe, New Mexico, thirty (30) calendar days from the date of this publication. SUPPLEMENTARY INFORMATION: New Mexico Principal Meridian...

  18. 77 FR 17092 - Notice of Filing of Plats of Survey, New Mexico

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-23

    ... Filing of Plats of Survey, New Mexico AGENCY: Bureau of Land Management, Interior. ACTION: Notice of... filed in the New Mexico State Office, Bureau of Land Management, Santa Fe, New Mexico, thirty (30) calendar days from the date of this publication. SUPPLEMENTARY INFORMATION: New Mexico Principal Meridian...

  19. Severe community-acquired pneumonia: timely management measures in the first 24 hours.

    PubMed

    Phua, Jason; Dean, Nathan C; Guo, Qi; Kuan, Win Sen; Lim, Hui Fang; Lim, Tow Keang

    2016-08-28

    Mortality rates for severe community-acquired pneumonia (CAP) range from 17 to 48% in published studies. In this review, we searched PubMed for relevant papers published between 1981 and June 2016 and relevant files. We explored how early and aggressive management measures, implemented within 24 hours of recognition of severe CAP and carried out both in the emergency department and in the ICU, decrease mortality in severe CAP. These measures begin with the use of severity assessment tools and the application of care bundles via clinical decision support tools. The bundles include early guideline-concordant antibiotics including macrolides, early haemodynamic support (lactate measurement, intravenous fluids, and vasopressors), and early respiratory support (high-flow nasal cannulae, lung-protective ventilation, prone positioning, and neuromuscular blockade for acute respiratory distress syndrome). While the proposed interventions appear straightforward, multiple barriers to their implementation exist. To successfully decrease mortality for severe CAP, early and close collaboration between emergency medicine and respiratory and critical care medicine teams is required. We propose a workflow incorporating these interventions.

  20. Master Middle Ware: A Tool to Integrate Water Resources and Fish Population Dynamics Models

    NASA Astrophysics Data System (ADS)

    Yi, S.; Sandoval Solis, S.; Thompson, L. C.; Kilduff, D. P.

    2017-12-01

    Linking models that investigate separate components of ecosystem processes has the potential to unify messages regarding management decisions by evaluating potential trade-offs in a cohesive framework. This project aimed to improve the ability of riparian resource managers to forecast future water availability conditions and resultant fish habitat suitability, in order to better inform their management decisions. To accomplish this goal, we developed a middleware tool that is capable of linking and overseeing the operations of two existing models, a water resource planning tool Water Evaluation and Planning (WEAP) model and a habitat-based fish population dynamics model (WEAPhish). First, we designed the Master Middle Ware (MMW) software in Visual Basic for Applications® in one Excel® file that provided a familiar framework for both data input and output. Second, MMW was used to link and jointly operate WEAP and WEAPhish, using Visual Basic for Applications (VBA) macros to implement system-level calls to run the models. To demonstrate the utility of this approach, hydrological, biological, and middleware model components were developed for the Butte Creek basin. This tributary of the Sacramento River, California, is managed for both hydropower and the persistence of a threatened population of spring-run Chinook salmon (Oncorhynchus tshawytscha). While we have demonstrated the use of MMW for a particular watershed and fish population, MMW can be customized for use with different rivers and fish populations, assuming basic data requirements are met. This model integration improves on ad hoc linkages for managing data transfer between software programs by providing a consistent, user-friendly, and familiar interface across different model implementations. Furthermore, the data-viewing capabilities of MMW facilitate the rapid interpretation of model results by hydrologists, fisheries biologists, and resource managers, in order to accelerate learning and management decision making.
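
    The orchestration pattern described above (run the water model, hand its output to the fish model, collect results) can be sketched compactly. A minimal sketch in Python rather than VBA, with hypothetical executable names, flags, and file names standing in for the actual WEAP/WEAPhish interfaces:

        import subprocess

        # Illustrative middleware loop in the spirit of MMW: run the water model,
        # hand its output to the fish model, and return the result file. All
        # command names and paths below are hypothetical placeholders.
        def run_step(cmd):
            subprocess.run(cmd, check=True)  # system-level call, as MMW does via VBA

        def run_scenario(scenario):
            run_step(["weap_model", "--scenario", scenario, "--out", "flows.csv"])
            run_step(["weaphish", "--flows", "flows.csv", "--out", "habitat.csv"])
            return "habitat.csv"

        if __name__ == "__main__":
            print("results in", run_scenario("butte_creek_baseline"))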

  1. Enhancing Cassini Operations & Science Planning Tools

    NASA Technical Reports Server (NTRS)

    Castello, Jonathan

    2012-01-01

    The Cassini team uses a variety of software utilities as they manage and coordinate their mission to Saturn. Most of these tools have been unchanged for many years, and although stability is a virtue for long-lived space missions, there are some less-fragile tools that could greatly benefit from modern improvements. This report shall describe three such upgrades, including their architectural differences and their overall impact. Emphasis is placed on the motivation and rationale behind architectural choices rather than the final product, so as to illuminate the lessons learned and discoveries made. These three enhancements included developing a strategy for migrating Science Planning utilities to a new execution model, rewriting the team's internal portal for ease of use and maintenance, and developing a web-based agenda application for tracking the sequence of files being transmitted to the Cassini spacecraft. Of this set, the first two have been fully completed, while the agenda application is currently in the early prototype stage.

  2. 29 CFR 402.7 - Effect of acknowledgment and filing by the Office of Labor-Management Standards.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...-Management Standards. 402.7 Section 402.7 Labor Regulations Relating to Labor OFFICE OF LABOR-MANAGEMENT STANDARDS, DEPARTMENT OF LABOR LABOR-MANAGEMENT STANDARDS LABOR ORGANIZATION INFORMATION REPORTS § 402.7 Effect of acknowledgment and filing by the Office of Labor-Management Standards. Acknowledgment by the...

  3. 29 CFR 402.7 - Effect of acknowledgment and filing by the Office of Labor-Management Standards.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...-Management Standards. 402.7 Section 402.7 Labor Regulations Relating to Labor OFFICE OF LABOR-MANAGEMENT STANDARDS, DEPARTMENT OF LABOR LABOR-MANAGEMENT STANDARDS LABOR ORGANIZATION INFORMATION REPORTS § 402.7 Effect of acknowledgment and filing by the Office of Labor-Management Standards. Acknowledgment by the...

  4. 75 FR 6195 - Combined Notice of Filings # 1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-08

    ... Management Company, LLC. Description: Thornwood Management Co submits revisions to its market-based rate... Management Company, LLC. Description: Thornwood Management Co, LLC submits the Updated Market Power Analysis...: Algonquin Tinker Gen Co et al submits a Notice of Name Change and Succession. Filed Date: 01/27/2010...

  5. New orthopaedic implant management tool for computer-assisted planning, navigation, and simulation: from implant CAD files to a standardized XML-based implant database.

    PubMed

    Sagbo, S; Blochaou, F; Langlotz, F; Vangenot, C; Nolte, L-P; Zheng, G

    2005-01-01

    Computer-Assisted Orthopaedic Surgery (CAOS) has made much progress over the last 10 years. Navigation systems have been recognized as important tools that help surgeons, and various such systems have been developed. A disadvantage of these systems is that they use non-standard formalisms and techniques. As a result, there are no standard concepts for implant and tool management or data formats to store information for use in 3D planning and navigation. We addressed these limitations and developed a practical and generic solution that offers benefits for surgeons, implant manufacturers, and CAS application developers. We developed a virtual implant database containing geometrical as well as calibration information for orthopedic implants and instruments, with a focus on trauma. This database has been successfully tested for various applications in the client/server mode. The implant information is not static, however, because manufacturers periodically revise their implants, resulting in the deletion of some implants and the introduction of new ones. Tracking these continuous changes and keeping CAS systems up to date is a tedious task if done manually. This leads to additional costs for system development, and some errors are inevitably generated due to the huge amount of information that has to be processed. To ease management with respect to implant life cycle, we developed a tool to assist end-users (surgeons, hospitals, CAS system providers, and implant manufacturers) in managing their implants. Our system can be used for pre-operative planning and intra-operative navigation, and also for any surgical simulation involving orthopedic implants. Currently, this tool allows addition of new implants, modification of existing ones, deletion of obsolete implants, export of a given implant, and also creation of backups. Our implant management system has been successfully tested in the laboratory with very promising results. It makes it possible to fill the current gap that exists between the CAS system and implant manufacturers, hospitals, and surgeons.
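
    The life-cycle operations listed above (add, modify, delete, export, back up) map naturally onto a small record store. A minimal sketch in Python using an XML tree, with invented element and attribute names; the paper's actual XML schema is not reproduced here:

        import xml.etree.ElementTree as ET

        # Toy XML implant database supporting the life-cycle operations described
        # above. Element and attribute names are illustrative, not the real schema.
        db = ET.Element("implants")

        def add_implant(name, length_mm):
            imp = ET.SubElement(db, "implant", attrib={"name": name})
            ET.SubElement(imp, "length_mm").text = str(length_mm)

        def delete_implant(name):
            for imp in db.findall("implant"):
                if imp.get("name") == name:
                    db.remove(imp)  # retire an obsolete implant

        def export(path):
            ET.ElementTree(db).write(path, encoding="utf-8")  # export / backup

        add_implant("tibial_nail_9mm", 300)
        add_implant("obsolete_plate", 120)
        delete_implant("obsolete_plate")
        export("implants.xml")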

  6. Raster Files for Utah Play Fairway Analysis

    DOE Data Explorer

    Wannamaker, Phil

    2017-06-16

    This submission contains raster files associated with several datasets that include earthquake density, Na/K geothermometers, fault density, heat flow, and gravity. Integrated together using spatial modeler tools in ArcGIS, these files can be used for play fairway analysis in regard to geothermal exploration.
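
    As a rough sketch of how such layers combine in a play fairway analysis, favorability can be computed as a weighted overlay of co-registered rasters. The snippet below uses NumPy (assumed available) with invented layer names and weights; it is a stand-in for the ArcGIS spatial modeler step, not a reproduction of it:

        import numpy as np

        # Weighted overlay of co-registered raster layers; the layer names and
        # weights are illustrative placeholders.
        rng = np.random.default_rng(0)
        layers = {
            "earthquake_density": rng.random((100, 100)),
            "fault_density": rng.random((100, 100)),
            "heat_flow": rng.random((100, 100)),
        }
        weights = {"earthquake_density": 0.3, "fault_density": 0.3, "heat_flow": 0.4}

        favorability = sum(w * layers[name] for name, w in weights.items())
        print("peak favorability:", float(favorability.max()))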

  7. Improving management decision processes through centralized communication linkages

    NASA Technical Reports Server (NTRS)

    Simanton, D. F.; Garman, J. R.

    1985-01-01

    Information flow is a critical element to intelligent and timely decision-making. At NASA's Johnson Space Center the flow of information is being automated through the use of a centralized backbone network. The theoretical basis of this network, its implications to the horizontal and vertical flow of information, and the technical challenges involved in its implementation are the focus of this paper. The importance of the use of common tools among programs and some future concerns related to file transfer, graphics transfer, and merging of voice and data are also discussed.

  8. The distributed production system of the SuperB project: description and results

    NASA Astrophysics Data System (ADS)

    Brown, D.; Corvo, M.; Di Simone, A.; Fella, A.; Luppi, E.; Paoloni, E.; Stroili, R.; Tomassetti, L.

    2011-12-01

    The SuperB experiment needs large samples of Monte Carlo simulated events in order to finalize the detector design and to estimate the data analysis performance. The requirements are beyond the capabilities of a single computing farm, so a distributed production model capable of exploiting the existing HEP worldwide distributed computing infrastructure is needed. In this paper we describe the set of tools that have been developed to manage the production of the required simulated events. The production of events follows three main phases: distribution of input data files to the remote site Storage Elements (SE); job submission, via the SuperB GANGA interface, to all available remote sites; and output file transfer to the CNAF repository. The job workflow includes procedures for consistency checking, monitoring, data handling and bookkeeping. A replication mechanism allows storing the job output on the local site SE. Results from 2010 official productions are reported.
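
    The three production phases described above can be outlined as a simple driver. A sketch with placeholder functions and site names; the actual grid transfer and GANGA submission commands are not shown:

        # Skeleton of the three production phases: distribute inputs, submit jobs,
        # collect outputs. All names below are placeholders, not SuperB tooling.
        def distribute_inputs(files, storage_elements):
            for se in storage_elements:
                for f in files:
                    print(f"copy {f} -> {se}")  # stand-in for a grid file transfer

        def submit_jobs(sites, n_events):
            # stand-in for submission through the GANGA interface
            return [{"site": s, "events": n_events, "status": "done"} for s in sites]

        def collect_outputs(jobs, repository):
            for job in jobs:
                if job["status"] == "done":  # consistency check before transfer
                    print(f"fetch output from {job['site']} -> {repository}")

        distribute_inputs(["geometry.db"], ["se_a", "se_b"])
        jobs = submit_jobs(["site_a", "site_b"], 10000)
        collect_outputs(jobs, "cnaf_repository")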

  9. cadcVOFS: A FUSE Based File System Layer for VOSpace

    NASA Astrophysics Data System (ADS)

    Kavelaars, J.; Dowler, P.; Jenkins, D.; Hill, N.; Damian, A.

    2012-09-01

    The CADC is now making extensive use of the VOSpace protocol for user-managed storage. The VOSpace standard allows a diverse set of rich data services to be delivered to users via a simple protocol. We have recently developed cadcVOFS, a FUSE-based file-system layer for VOSpace. cadcVOFS provides a filesystem layer on top of VOSpace so that standard Unix tools (such as 'find', 'emacs', 'awk', etc.) can be used directly on the data objects stored in VOSpace. Once mounted, the VOSpace appears as a network storage volume inside the operating system. Within the CADC Cloud Computing project (CANFAR) we have used VOSpace as the method for retrieving and storing processing inputs and products. The abstraction of storage is an important component of Cloud Computing and the high use level of our VOSpace service reflects this.
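
    Once the FUSE layer is mounted, ordinary file APIs apply to VOSpace objects. A small sketch of a 'find'-style scan in Python, with a hypothetical mount point:

        import os

        # Walk a mounted VOSpace as if it were local disk; the mount point below
        # is a hypothetical example.
        MOUNT = "/mnt/vospace"

        for dirpath, dirnames, filenames in os.walk(MOUNT):
            for name in filenames:
                path = os.path.join(dirpath, name)
                if name.endswith(".fits"):
                    print(path, os.path.getsize(path), "bytes")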

  10. Using GTO-Velo to Facilitate Communication and Sharing of Simulation Results in Support of the Geothermal Technologies Office Code Comparison Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Signe K.; Purohit, Sumit; Boyd, Lauren W.

    The Geothermal Technologies Office Code Comparison Study (GTO-CCS) aims to support the DOE Geothermal Technologies Office in organizing and executing a model comparison activity. This project is directed at testing, diagnosing differences, and demonstrating modeling capabilities of a worldwide collection of numerical simulators for evaluating geothermal technologies. Teams of researchers are collaborating in this code comparison effort, and it is important to be able to share results in a forum where technical discussions can easily take place without requiring teams to travel to a common location. Pacific Northwest National Laboratory has developed an open-source, flexible framework called Velo that provides a knowledge management infrastructure and tools to support modeling and simulation for a variety of types of projects in a number of scientific domains. GTO-Velo is a customized version of the Velo Framework that is being used as the collaborative tool in support of the GTO-CCS project. Velo is designed around a novel integration of a collaborative Web-based environment and a scalable enterprise Content Management System (CMS). The underlying framework provides a flexible and unstructured data storage system that allows for easy upload of files that can be in any format. Data files are organized in hierarchical folders, and each folder and each file has a corresponding wiki page for metadata. The user interacts with Velo through web-browser-based wiki technology, providing the benefit of familiarity and ease of use. High-level folders have been defined in GTO-Velo for the benchmark problem descriptions, descriptions of simulator/code capabilities, a project notebook, and folders for participating teams. Each team has a subfolder with write access limited only to the team members, where they can upload their simulation results. The GTO-CCS participants are charged with defining the benchmark problems for the study, and as each GTO-CCS benchmark problem is defined, the problem creator can provide a description using a template on the metadata page corresponding to the benchmark problem folder. Project documents, references and videos of the weekly online meetings are shared via GTO-Velo. A results comparison tool allows users to plot their uploaded simulation results on the fly, along with those of other teams, to facilitate weekly discussions of the benchmark problem results being generated by the teams. GTO-Velo is an invaluable tool providing the project coordinators and team members with a framework for collaboration among geographically dispersed organizations.

  11. CalSimHydro Tool - A Web-based interactive tool for the CalSim 3.0 Hydrology Preprocessor

    NASA Astrophysics Data System (ADS)

    Li, P.; Stough, T.; Vu, Q.; Granger, S. L.; Jones, D. J.; Ferreira, I.; Chen, Z.

    2011-12-01

    CalSimHydro, the CalSim 3.0 Hydrology Preprocessor, is an application designed to automate the various steps in the computation of hydrologic inputs for CalSim 3.0, a water resources planning model developed jointly by the California State Department of Water Resources and the United States Bureau of Reclamation, Mid-Pacific Region. CalSimHydro consists of a five-step FORTRAN-based program that runs the individual models in succession, passing information from one model to the next and aggregating data as required by each model. The final product of CalSimHydro is an updated CalSim 3.0 state variable (SV) DSS input file. CalSimHydro consists of (1) a Rainfall-Runoff Model to compute monthly infiltration, (2) a soil moisture and demand calculator (IDC) that estimates surface runoff, deep percolation, and water demands for natural vegetation cover and various crops other than rice, (3) a Rice Water Use Model to compute the water demands, deep percolation, irrigation return flow, and runoff from precipitation for the rice fields, (4) a Refuge Water Use Model that simulates the ponding operations for managed wetlands, and (5) a Data Aggregation and Transfer Module to aggregate the outputs from the above modules and transfer them to the CalSim SV input file. In this presentation, we describe a web-based user interface for CalSimHydro using the Google Earth Plug-In. The CalSimHydro tool allows users to:
    - interact with geo-referenced layers of the Water Budget Areas (WBA) and Demand Units (DU) displayed over the Sacramento Valley,
    - view the input parameters of the hydrology preprocessor for a selected WBA or DU in a time series plot or a tabular form,
    - edit the values of the input parameters in the table or by downloading a spreadsheet of the selected parameter in a selected time range,
    - run the CalSimHydro modules on the backend server and be notified when the job is done,
    - visualize the model output and compare it with a base run result,
    - download the output SV file to be used to run CalSim 3.0.
    The CalSimHydro tool streamlines the complicated steps to configure and run the hydrology preprocessor by providing a user-friendly visual interface and back-end services to validate user inputs and manage the model execution. It is a powerful addition to the new CalSim 3.0 system.
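
    The back-end flow is a classic pipeline: each module consumes the previous module's output, ending in the updated SV file. A sketch of that chaining in Python, with hypothetical executable and file names standing in for the real FORTRAN modules:

        import subprocess

        # Run five CalSimHydro-style modules in succession, passing each module's
        # output to the next. All names are illustrative placeholders.
        PIPELINE = [
            ("rainfall_runoff", "infiltration.dss"),
            ("idc", "demands.dss"),
            ("rice_water_use", "rice.dss"),
            ("refuge_water_use", "refuge.dss"),
            ("aggregate_transfer", "calsim_sv.dss"),  # final SV input file
        ]

        prev_output = "precipitation.dss"
        for exe, output in PIPELINE:
            subprocess.run([exe, "--in", prev_output, "--out", output], check=True)
            prev_output = output
        print("updated SV input file:", prev_output)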

  12. Application Program Interface for the Orion Aerodynamics Database

    NASA Technical Reports Server (NTRS)

    Robinson, Philip E.; Thompson, James

    2013-01-01

    The Application Programming Interface (API) for the Crew Exploration Vehicle (CEV) Aerodynamic Database has been developed to provide the developers of software an easily implemented, fully self-contained method of accessing the CEV Aerodynamic Database for use in their analysis and simulation tools. The API is programmed in C and provides a series of functions to interact with the database, such as initialization, selecting various options, and calculating the aerodynamic data. No special functions (file read/write, table lookup) are required on the host system other than those included with a standard ANSI C installation. It reads one or more files of aero data tables. Previous releases of aerodynamic databases for space vehicles have only included data tables and a document of the algorithm and equations to combine them for the total aerodynamic forces and moments. This process required each software tool to have a unique implementation of the database code. Errors or omissions in the documentation, or errors in the implementation, led to a lengthy and burdensome process of having to debug each instance of the code. Additionally, input file formats differ for each space vehicle simulation tool, requiring the aero database tables to be reformatted to meet the tool's input file structure requirements. Finally, the capabilities for built-in table lookup routines vary for each simulation tool. Implementation of a new database may require an update to and verification of the table lookup routines. This may be required if the number of dimensions of a data table exceeds the capability of the simulation tool's built-in lookup routines. A single software solution was created to provide an aerodynamics software model that could be integrated into other simulation and analysis tools. The highly complex Orion aerodynamics model can then be quickly included in a wide variety of tools. The API code is written in ANSI C for ease of portability to a wide variety of systems. The input data files are in standard formatted ASCII, also for improved portability. The API contains its own implementation of multidimensional table reading and lookup routines. The same aerodynamics input file can be used without modification on all implementations. The turnaround time from aerodynamics model release to a working implementation is significantly reduced.
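
    Because the database ships as a C library, host tools in other languages can bind to it directly. A hedged sketch of such a binding from Python via ctypes; the library name and function signatures below are hypothetical, not the actual CEV API symbols:

        import ctypes

        # Hypothetical binding to a self-contained aero-database C library.
        lib = ctypes.CDLL("./libaerodb.so")  # placeholder library name

        lib.aerodb_init.argtypes = [ctypes.c_char_p]
        lib.aerodb_coeff.argtypes = [ctypes.c_char_p, ctypes.c_double]
        lib.aerodb_coeff.restype = ctypes.c_double

        lib.aerodb_init(b"aero_tables.dat")  # read the aero data tables
        cl = lib.aerodb_coeff(b"CL", 5.0)    # lift coefficient at 5 deg AoA
        print("CL =", cl)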

  13. 77 FR 45385 - Capital Research and Management Company, et al.; Notice of Application

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-31

    ...] Capital Research and Management Company, et al.; Notice of Application July 25, 2012. AGENCY: Securities... Management Company (``CRMC''). Filing Dates: The application was filed on December 19, 2008, and amended on... NE., Washington, DC 20549-1090. Applicants, Capital Research and Management Company, 333 South Hope...

  14. Sandia Advanced MEMS Design Tools v. 3.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yarberry, Victor R.; Allen, James J.; Lantz, Jeffrey W.

    This is a major revision to the Sandia Advanced MEMS Design Tools. It replaces all previous versions. New in this version: revised to support AutoCAD 2014 and 2015. This CD contains an integrated set of electronic files that: a) describe the SUMMiT V fabrication process; b) provide enabling educational information (including pictures, videos, technical information); c) facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library); d) facilitate the process of having MEMS fabricated at Sandia National Laboratories; and e) facilitate the process of having post-fabrication services performed. While there exist some files on the CD that are used in conjunction with the software package AutoCAD, these files are not intended for use independent of the CD. Note that the customer must purchase his/her own copy of AutoCAD to use with these files.

  15. PH5 for integrating and archiving different data types

    NASA Astrophysics Data System (ADS)

    Azevedo, Steve; Hess, Derick; Beaudoin, Bruce

    2016-04-01

    PH5 is IRIS PASSCAL's file organization of HDF5 used for seismic data. The extensibility and portability of HDF5 allow the PH5 format to evolve and operate on a variety of platforms and interfaces. To make PH5 even more flexible, the seismic metadata is separated from the time series data in order to achieve gains in performance as well as ease of use and to simplify user interaction. This separation affords easy updates to metadata after the data are archived, without having to access waveform data. To date, PH5 has been used for integrating and archiving active source, passive source, and onshore-offshore seismic data sets with the IRIS Data Management Center (DMC). Active development to make PH5 fully compatible with FDSN web services and deliver StationXML is near completion. We are also exploring the feasibility of utilizing QuakeML for active seismic source representation. The PH5 software suite, PIC KITCHEN, comprises in-field tools that include data ingestion (e.g., RefTek format, SEG-Y, and SEG-D), metadata management tools including QC, and a waveform review tool. These tools enable building archive-ready data in the field during active source experiments, greatly decreasing the time to produce research-ready data sets. Once archived, our online request page generates a unique web form and pre-populates much of it based on the metadata provided to it from the PH5 file. The data requester can then intuitively select the extraction parameters as well as the data subsets they wish to receive (current output formats include SEG-Y, SAC, mseed). The web interface then passes this on to the PH5 processing tools to generate the requested seismic data and e-mail the requester a link to the data set automatically as soon as the data are ready. The PH5 file organization was originally designed to hold seismic time series data and metadata from controlled source experiments using RefTek data loggers. The flexibility of HDF5 has enabled us to extend the use of PH5 in several areas, one of which is handling very large data sets. PH5 is also good at integrating data from various types of seismic experiments such as OBS, onshore-offshore, controlled source, and passive recording. HDF5 is capable of holding practically any type of digital data, so integrating GPS data with seismic data is possible. Since PH5 is a common format and data contained in HDF5 is randomly accessible, it has been easy to extend PH5 to include new input and output data formats as community needs arise.
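
    The core idea, metadata kept apart from waveforms so either can be served or revised independently, is easy to demonstrate with HDF5 directly. A sketch using h5py and NumPy (assumed available); the group and attribute names are invented, not the actual PH5 layout:

        import h5py
        import numpy as np

        # Store metadata and waveforms in separate groups; illustrative names only.
        with h5py.File("experiment.ph5", "w") as f:
            meta = f.create_group("metadata")
            meta.attrs["station"] = "STA01"
            meta.attrs["sample_rate_hz"] = 250

            wave = f.create_group("waveforms")
            wave.create_dataset("das_0001", data=np.zeros(250 * 60, dtype="int32"))

        # Revise metadata later without rewriting the waveform dataset.
        with h5py.File("experiment.ph5", "r+") as f:
            f["metadata"].attrs["station"] = "STA01A"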

  16. Grid Data Access on Widely Distributed Worker Nodes Using Scalla and SRM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jakl, Pavel; /Prague, Inst. Phys.; Lauret, Jerome

    2011-11-10

    Facing the reality of storage economics, NP experiments such as RHIC/STAR have been engaged in a shift of the analysis model, and now heavily rely on using cheap disks attached to processing nodes, as such a model is extremely beneficial over expensive centralized storage. Additionally, exploiting storage aggregates with enhanced distributed computing capabilities such as dynamic space allocation (lifetime of spaces), file management on shared storages (lifetime of files, pinning of files), storage policies or a uniform access to heterogeneous storage solutions is not an easy task. The Xrootd/Scalla system allows for storage aggregation. We will present an overview of the largest deployment of Scalla (Structured Cluster Architecture for Low Latency Access) in the world, spanning over 1000 CPUs co-sharing the 350 TB Storage Elements, and the experience on how to make such a model work in the RHIC/STAR standard analysis framework. We will explain the key features and approach on how to make access to mass storage (HPSS) possible in such a large deployment context. Furthermore, we will give an overview of a fully 'gridified' solution using the plug-and-play features of the Scalla architecture, replacing standard storage access with grid middleware SRM (Storage Resource Manager) components designed for space management, and will compare the solution with the standard Scalla approach in use in STAR for the past 2 years. Integration details, future plans and status of development will be explained in the area of best transfer strategy between multiple-choice data pools and best placement with respect to load balancing and interoperability with other SRM-aware tools or implementations.

  17. VizieR Online Data Catalog: CANDID code for interferometric observations (Gallenne+, 2015)

    NASA Astrophysics Data System (ADS)

    Gallenne, A.; Merand, A.; Kervella, P.; Monnier, J. D.; Schaefer, G. H.; Baron, F.; Breitfelder, J.; Le Bouquin, J. B.; Roettenbacher, R. M.; Gieren, W.; Pietrzynski, G.; McAlister, H.; Ten Brummelaar, T.; Sturmann, J.; Sturmann, L.; Turner, N.; Ridgway, S.; Kraus, S.

    2015-07-01

    This is a suite of Python 2.7 tools to find faint companions around stars in interferometric data in the OIFITS format. The tool makes it possible to systematically search for faint companions in OIFITS data and, if none is found, to estimate the detection limit. All files are also available at https://github.com/amerand/CANDID . (3 data files).

  18. 78 FR 44964 - Filing of Plats of Survey: Oregon/Washington

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-25

    ...: HAG13-0251] Filing of Plats of Survey: Oregon/Washington AGENCY: Bureau of Land Management, Interior... officially filed in the Bureau of Land Management, Oregon State Office, Portland, Oregon, 30 days from the date of this publication. Willamette Meridian Oregon T. 40 S., R. 12 E., accepted June 28, 2013 T. 19 S...

  19. 78 FR 5488 - Filing of Plats of Survey: Oregon/Washington

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-25

    ...: HAG13-0093] Filing of Plats of Survey: Oregon/Washington AGENCY: Bureau of Land Management, Interior... officially filed in the Bureau of Land Management, Oregon State Office, Portland, Oregon, 30 days from the date of this publication. Willamette Meridian Oregon T. 17 S., R. 17 E., accepted January 7, 2013 T. 20...

  20. 76 FR 50492 - Idaho: Filing of Plats of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-15

    ... of Plats of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of surveys. SUMMARY: The Bureau of Land Management (BLM) has officially accepted the plat of survey of the... 83709-1657. SUPPLEMENTARY INFORMATION: The BLM will file the plat of survey of the lands described below...

  1. 77 FR 37919 - Eastern States: Filing of Plat of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-25

    ...: Filing of Plat of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice. SUMMARY: The Bureau of Land Management (BLM) will file the plat of survey of the lands described below in the BLM-Eastern..., Springfield, Virginia 22153. Attn: Cadastral Survey. Persons who use a telecommunications device for the deaf...

  2. 78 FR 16294 - Eastern States: Filing of Plat of Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-14

    ...: Filing of Plat of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice. SUMMARY: The Bureau of Land Management (BLM) will file the plat of survey of the lands described below in the BLM-Eastern..., Springfield, Virginia 22153. Attn: Cadastral Survey. Persons who use a telecommunications device for the deaf...

  3. 14 CFR 11.45 - Where and when do I file my comments?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... do I file my comments? (a) Send your comments to the location specified in the rulemaking document on which you are commenting. If you are asked to send your comments to the Federal Document Management... you do not follow the electronic filing instructions at the Federal Docket Management System Web site...

  4. 14 CFR 11.45 - Where and when do I file my comments?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... do I file my comments? (a) Send your comments to the location specified in the rulemaking document on which you are commenting. If you are asked to send your comments to the Federal Document Management... you do not follow the electronic filing instructions at the Federal Docket Management System Web site...

  5. 14 CFR 11.45 - Where and when do I file my comments?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... do I file my comments? (a) Send your comments to the location specified in the rulemaking document on which you are commenting. If you are asked to send your comments to the Federal Document Management... you do not follow the electronic filing instructions at the Federal Docket Management System Web site...

  6. 14 CFR 11.45 - Where and when do I file my comments?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... do I file my comments? (a) Send your comments to the location specified in the rulemaking document on which you are commenting. If you are asked to send your comments to the Federal Document Management... you do not follow the electronic filing instructions at the Federal Docket Management System Web site...

  7. 78 FR 70545 - KEI (Maine) Power Management (I) LLC, KEI (Maine) Power Management (II) LLC, KEI (Maine) Power...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-26

    ... Application for Amendment of Licenses and Soliciting Comments, Motions To Intervene, and Protests Take notice..., or [email protected] . j. Deadline for filing comments, motions to intervene, and protests... electronic filing. Please file any motion to intervene, protest, comments, and/or recommendations using the...

  8. NASA Standard for Airborne Data: ICARTT Format ESDS-RFC-019

    NASA Astrophysics Data System (ADS)

    Thornhill, A.; Brown, C.; Aknan, A.; Crawford, J. H.; Chen, G.; Williams, E. J.

    2011-12-01

    Airborne field studies generate a plethora of data products in the effort to study atmospheric composition and processes. Data file formats for airborne field campaigns are designed to present data in an understandable and organized way to support collaboration and to document relevant and important metadata. The ICARTT file format was created to facilitate data management during the International Consortium for Atmospheric Research on Transport and Transformation (ICARTT) campaign in 2004, which involved government agencies and university participants from five countries. Since this mission, the ICARTT format has been used in subsequent field campaigns such as the Polar Study Using Aircraft Remote Sensing, Surface Measurements and Models of Climates, Chemistry, Aerosols, and Transport (POLARCAT) and the first phase of Deriving Information on Surface Conditions from COlumn and VERtically Resolved Observations Relevant to Air Quality (DISCOVER-AQ). The ICARTT file format was endorsed in 2010 as a standard format for airborne data by the Standard Process Group (SPG), one of the Earth Science Data Systems Working Groups (ESDSWG). The detailed description of the ICARTT format can be found at http://www-air.larc.nasa.gov/missions/etc/ESDS-RFC-019-v1.00.pdf. The ICARTT data format is an ASCII, comma-delimited format that was based on the NASA Ames and GTE file formats. The file header is detailed enough to fully describe the data for users outside of the instrument group and includes a description of the metadata. The ICARTT scanning tools, format structure, implementations, and examples will be presented.
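
    Because the format is plain ASCII with a self-describing header, a reader needs very little machinery. A simplified sketch, assuming (as in ICARTT's NASA Ames heritage) that the first header line gives the header line count, followed by comma-delimited data records; this is an illustration, not a complete parser for the standard:

        import csv

        def read_icartt(path):
            """Skip the self-describing header, return the data records."""
            with open(path) as fh:
                first = fh.readline().split(",")
                n_header = int(first[0])        # header length, incl. this line
                for _ in range(n_header - 1):
                    fh.readline()
                return [row for row in csv.reader(fh) if row]

        # rows = read_icartt("MISSION-INSTRUMENT_20110701_R0.ict")  # hypothetical file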

  9. 30 CFR 250.1402 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Definitions. Terms used in this subpart have the following meaning: Case file means an MMS document file... fine. It is an MMS regulatory enforcement tool used in addition to Notices of Incidents of... employee assigned to review case files and assess civil penalties. Violation means failure to comply with...

  10. A parallel calibration utility for WRF-Hydro on high performance computers

    NASA Astrophysics Data System (ADS)

    Wang, J.; Wang, C.; Kotamarthi, V. R.

    2017-12-01

    Successful modeling of complex hydrological processes comprises establishing an integrated hydrological model that simulates the hydrological processes in each water regime, calibrating and validating the model performance based on observation data, and estimating the uncertainties from different sources, especially those associated with parameters. Such a model system requires large computing resources and often has to be run on High Performance Computers (HPC). The recently developed WRF-Hydro modeling system provides a significant advancement in the capability to simulate regional water cycles more completely. The WRF-Hydro model has a large range of parameters, such as those in the input table files — GENPARM.TBL, SOILPARM.TBL and CHANPARM.TBL — and several distributed scaling factors such as OVROUGHRTFAC. These parameters affect the behavior and outputs of the model and thus may need to be calibrated against observations in order to obtain good modeling performance. A parameter calibration tool designed specifically for automated calibration and uncertainty estimation of the WRF-Hydro model can provide significant convenience for the modeling community. In this study, we developed a customized tool using the parallel version of the model-independent parameter estimation and uncertainty analysis tool, PEST, and enabled it to run on HPC systems with the PBS and SLURM workload managers and job schedulers. We also developed a series of PEST input file templates that are specifically for WRF-Hydro model calibration and uncertainty analysis. Here we will present a flood case study that occurred in April 2013 over the Midwest. The sensitivity and uncertainties are analyzed using the customized PEST tool we developed.
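
    PEST-style calibration revolves around template files: marked parameter slots in copies of model inputs (here, a SOILPARM.TBL-like fragment) are filled with trial values on each run. A sketch of that substitution step; the marker syntax and parameter names are illustrative, not PEST's actual template grammar:

        import re

        # Fill marked parameter slots in a model input template with trial values.
        def fill_template(template_text, params):
            def sub(match):
                return "{:10.4f}".format(params[match.group(1)])
            return re.sub(r"@([A-Za-z_]+)@", sub, template_text)

        template = "SMCMAX  @smcmax@\nBB      @bb@\n"
        trial = {"smcmax": 0.4387, "bb": 4.74}
        print(fill_template(template, trial))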

  11. Launch Control System Software Development System Automation Testing

    NASA Technical Reports Server (NTRS)

    Hwang, Andrew

    2017-01-01

    The Spaceport Command and Control System (SCCS) is the National Aeronautics and Space Administration's (NASA) launch control system for the Orion capsule and Space Launch System, the next-generation manned rocket currently in development. This system requires high-quality testing that will measure and test the capabilities of the system. For the past two years, the Exploration and Operations Division at Kennedy Space Center (KSC) has assigned a group including interns and full-time engineers to develop automated tests to save the project time and money. The team worked on automating the testing process for the SCCS GUI, which would use streamed simulated data from the testing servers to produce data, plots, statuses, etc. in the GUI. The software used to develop the automated tests included an automated testing framework and an automation library. The automated testing framework has a tabular-style syntax, which means the functionality of a line of code must have the appropriate number of tabs for the line to function as intended. The header section contains either paths to custom resources or the names of libraries being used. The automation library contains functionality to automate anything that appears on a desired screen, with the use of image recognition software to detect and control GUI components. The data section contains any data values strictly created for the current testing file. The body section holds the tests that are being run. The function section can include any number of functions that may be used by the current testing file or any other file that resources it. The resources and body sections are required for all test files; the data and function sections can be left empty if the data values and functions being used are from a resourced library or another file. To help equip the automation team with better tools, the Project Lead of the Automated Testing Team, Jason Kapusta, assigned the task of installing and training an optical character recognition (OCR) tool to Brandon Echols, a fellow intern, and me. The purpose of the OCR tool is to analyze an image and find the coordinates of any group of text. Some issues that arose while installing the OCR tool included the absence of certain libraries needed to train the tool and an outdated software version. We eventually resolved the issues and successfully installed the OCR tool. Training the tool required many images and different fonts and sizes, but in the end the tool learned to accurately decipher the text in the images and their coordinates. The OCR tool produced a file that contained significant metadata for each section of text, but only the text and the coordinates of the text were required for our purpose. The team made a script to parse the information we wanted from the OCR file into a different file that would be used by automation functions within the automated framework. Since a majority of the development and testing of the automated test cases for the GUI in question has been done using live simulated data on the workstations at the Launch Control Center (LCC), a large amount of progress has been made. As of this writing, about 60% of all automated testing has been implemented. Additionally, the OCR tool will help make our automated tests more robust, because the tool's text recognition scales well to different text fonts and sizes. Soon we will have the whole test system automated, allowing more full-time engineers to work on development projects.
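
    The parsing step described above (keep only the recognized text and its coordinates from a verbose OCR dump) is a small filter. A sketch assuming a tab-separated OCR output with named columns; the column names and file names are hypothetical:

        import csv

        def extract_text_coords(ocr_path, out_path):
            """Keep only text and coordinates from a verbose OCR dump."""
            with open(ocr_path) as src, open(out_path, "w", newline="") as dst:
                reader = csv.DictReader(src, delimiter="\t")
                writer = csv.writer(dst)
                for row in reader:
                    if (row.get("text") or "").strip():
                        writer.writerow([row["text"], row["left"], row["top"]])

        # extract_text_coords("ocr_dump.tsv", "text_coords.csv")  # hypothetical files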

  12. Dynamic Non-Hierarchical File Systems for Exascale Storage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, Darrell E.; Miller, Ethan L

    This constitutes the final report for “Dynamic Non-Hierarchical File Systems for Exascale Storage”. The ultimate goal of this project was to improve data management in scientific computing and high-end computing (HEC) applications, and to achieve this goal we proposed: to develop the first HEC-targeted file system featuring rich metadata and provenance collection, extreme scalability, and future storage hardware integration as core design goals, and to evaluate and develop a flexible non-hierarchical file system interface suitable for providing more powerful and intuitive data management interfaces to HEC and scientific computing users. Data management is swiftly becoming a serious problem in the scientific community – while copious amounts of data are good for obtaining results, finding the right data is often daunting and sometimes impossible. Scientists participating in a Department of Energy workshop noted that most of their time was spent “...finding, processing, organizing, and moving data and it’s going to get much worse”. Scientists should not be forced to become data mining experts in order to retrieve the data they want, nor should they be expected to remember the naming convention they used several years ago for a set of experiments they now wish to revisit. Ideally, locating the data you need would be as easy as browsing the web. Unfortunately, existing data management approaches are usually based on hierarchical naming, a 40-year-old technology designed to manage thousands of files, not exabytes of data. Today’s systems do not take advantage of the rich array of metadata that current high-end computing (HEC) file systems can gather, including content-based metadata and provenance information. As a result, current metadata search approaches are typically ad hoc and often work by providing a parallel management system to the “main” file system, as is done in Linux (the locate utility), personal computers, and enterprise search appliances. These search applications are often optimized for a single file system, making it difficult to move files and their metadata between file systems. Users have tried to solve this problem in several ways, including the use of separate databases to index file properties, the encoding of file properties into file names, and separately gathering and managing provenance data, but none of these approaches has worked well, either due to limited usefulness or scalability, or both. Our research addressed several key issues: High-performance, real-time metadata harvesting: extracting important attributes from files dynamically and immediately updating indexes used to improve search; Transparent, automatic, and secure provenance capture: recording the data inputs and processing steps used in the production of each file in the system; Scalable indexing: indexes that are optimized for integration with the file system; Dynamic file system structure: our approach provides dynamic directories similar to those in semantic file systems, but these are the native organization rather than a feature grafted onto a conventional system. In addition to these goals, our research effort included evaluating the impact of new storage technologies on the file system design and performance. In particular, the indexing and metadata harvesting functions can potentially benefit from the performance improvements promised by new storage class memories.
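
    A toy version of the metadata-harvesting idea, walking a tree, extracting a few attributes per file, and maintaining an inverted index from attribute values to paths, is sketched below; real systems do this at the file-system level, in real time, and at vastly larger scale:

        import os
        from collections import defaultdict

        # Inverted index from (attribute, value) pairs to file paths.
        index = defaultdict(set)

        def harvest(root):
            for dirpath, _, filenames in os.walk(root):
                for name in filenames:
                    path = os.path.join(dirpath, name)
                    ext = os.path.splitext(name)[1].lstrip(".") or "none"
                    index[("ext", ext)].add(path)
                    index[("owner", str(os.stat(path).st_uid))].add(path)

        harvest(".")
        for path in sorted(index.get(("ext", "py"), set()))[:5]:
            print(path)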

  13. The Western Aeronautical Test Range. Chapter 10 Tools

    NASA Technical Reports Server (NTRS)

    Knudtson, Kevin; Park, Alice; Downing, Robert; Sheldon, Jack; Harvey, Robert; Norcross, April

    2011-01-01

    The Western Aeronautical Test Range (WATR) staff at the NASA Dryden Flight Research Center is developing translation software called Chapter 10 Tools in response to challenges posed by post-flight processing of data files originating from various on-board digital recorders that follow the Range Commanders Council Inter-Range Instrumentation Group (IRIG) 106 Chapter 10 Digital Recording Standard but use differing interpretations of the Standard. The software will read the data files regardless of the vendor implementation of the source recorder, displaying data, identifying and correcting errors, and producing a data file that can be successfully processed post-flight.

  14. UNICON: A Powerful and Easy-to-Use Compound Library Converter.

    PubMed

    Sommer, Kai; Friedrich, Nils-Ole; Bietz, Stefan; Hilbig, Matthias; Inhester, Therese; Rarey, Matthias

    2016-06-27

    The accurate handling of different chemical file formats and the consistent conversion between them play important roles in calculations in complex cheminformatics workflows. Working with different cheminformatic tools often makes the conversion between file formats a mandatory step. Such a conversion might become a difficult task in cases where the information content substantially differs. This paper describes UNICON, an easy-to-use software tool for this task. The functionality of UNICON ranges from file conversion between the standard formats SDF, MOL2, SMILES, PDB, and PDBx/mmCIF, via the generation of 2D structure coordinates and 3D structures, to the enumeration of tautomeric forms, protonation states, and conformer ensembles. For this purpose, UNICON bundles the key elements of the previously described NAOMI library in a single, easy-to-use command line tool.

  15. DockoMatic 2.0: high throughput inverse virtual screening and homology modeling.

    PubMed

    Bullock, Casey; Cornia, Nic; Jacob, Reed; Remm, Andrew; Peavey, Thomas; Weekes, Ken; Mallory, Chris; Oxford, Julia T; McDougal, Owen M; Andersen, Timothy L

    2013-08-26

    DockoMatic is a free and open source application that unifies a suite of software programs within a user-friendly graphical user interface (GUI) to facilitate molecular docking experiments. Here we describe the release of DockoMatic 2.0; significant software advances include the ability to (1) conduct high throughput inverse virtual screening (IVS); (2) construct 3D homology models; and (3) customize the user interface. Users can now efficiently set up, start, and manage IVS experiments through the DockoMatic GUI by specifying receptor(s), ligand(s), grid parameter file(s), and docking engine (either AutoDock or AutoDock Vina). DockoMatic automatically generates the needed experiment input files and output directories and allows the user to manage and monitor job progress. Upon job completion, a summary of results is generated by DockoMatic to facilitate interpretation by the user. DockoMatic functionality has also been expanded to facilitate the construction of 3D protein homology models using the Timely Integrated Modeler (TIM) wizard. The TIM wizard provides an interface that accesses the basic local alignment search tool (BLAST) and MODELER programs and guides the user through the necessary steps to easily and efficiently create 3D homology models for biomacromolecular structures. The DockoMatic GUI can be customized by the user, and the software design makes it relatively easy to integrate additional docking engines, scoring functions, or third party programs. DockoMatic is a free comprehensive molecular docking software program for all levels of scientists in both research and education.

  16. Production data in media systems and press front ends: capture, formats and database methods

    NASA Astrophysics Data System (ADS)

    Karttunen, Simo

    1997-02-01

    The nature, purpose and data presentation features of media jobs are analyzed in relation to the content, document, process and resource management in media production. Formats are the natural way of presenting, collecting and storing information, contents, document components and final documents. The state of the art and the trends in media formats and production data are reviewed. The types and the amount of production data are listed, e.g. events, schedules, product descriptions, reports, visual support, quality, process states and color data. The data exchange must be vendor-neutral. Adequate infrastructure and system architecture are defined for production and media data. The roles of open servers and intranets are evaluated, and their potential as future solutions is anticipated. The press front end is the part of print media production where large files dominate. The new output alternatives, i.e. film recorders, direct plate output (CTP and CTP-on-press) and digital, plateless printing lines, need new workflow tools and very efficient file and format management. The paper analyzes the capture, formatting and storing of job files and the respective production data, such as the event logs of the processes. Intranets, browsers, Java applets and open web servers will be used to capture production data, especially where intranets are used anyhow, or where several companies are networked to plan, design and use documents and printed products. The user aspects of installing intranets are stressed, since there are numerous more traditional and more dedicated networking solutions on the market.

  17. Information integration for a sky survey by data warehousing

    NASA Astrophysics Data System (ADS)

    Luo, A.; Zhang, Y.; Zhao, Y.

    The virtualization service of the data system for the sky survey LAMOST is very important for astronomers. The service needs to integrate information from data collections, catalogs and references, and to support simple federation of a set of distributed files and associated metadata. Data warehousing has been in existence for several years and has demonstrated superiority over traditional relational database management systems by providing novel indexing schemes that support efficient on-line analytical processing (OLAP) of large databases. Now relational database systems such as Oracle support the warehouse capability, which includes extensions to the SQL language to support OLAP operations, and a number of metadata management tools have been created. The information integration of LAMOST by applying data warehousing is intended to effectively provide data and knowledge on-line.

  18. MISR HDF-to-Binary Converter and Radiance/BRF Calculation Tools

    Atmospheric Science Data Center

    2013-04-01

    ... to have the HDF and HDF-EOS libraries for the target computer. The HDF libraries are available from The HDF Group (THG). The ... and the HDF-EOS include and library files on the target computer. The following files are included in the distribution tar file for ...

  19. SAM-FS: LSC's New Solaris-Based Storage Management Product

    NASA Technical Reports Server (NTRS)

    Angell, Kent

    1996-01-01

    SAM-FS is a full-featured hierarchical storage management (HSM) product that operates as a file system on Solaris-based machines. The SAM-FS file system provides the user with all of the standard UNIX system utilities and calls, and adds some new commands, i.e., archive, release, stage, sls, sfind, and a family of maintenance commands. The system also offers enhancements such as high-performance virtual disk read and write, control of the disk through an extent array, and the ability to dynamically allocate block size. SAM-FS provides 'archive sets', which are groupings of data to be copied to secondary storage. In practice, as soon as a file is written to disk, SAM-FS will make copies onto secondary media. SAM-FS is a scalable storage management system. The system can manage millions of files per system, though this is limited today by the speed of UNIX and its utilities. In the future, a new search algorithm will be implemented that will remove logical and performance restrictions on the number of files managed.

  20. 47 CFR 1.1159 - Filing locations and receipts for regulatory fees.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... payment should be filed with the Secretary, Federal Communications Commission, Attention: Managing... the Secretary, Federal Communications Commission, Attention: Managing Director, Washington, D.C. 20554... sufficient size to contain the receipt document. (e) The Managing Director may issue annually, at his...

  1. 41 CFR 102-118.450 - Can a TSP file a transportation claim against my agency?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Can a TSP file a transportation claim against my agency? 102-118.450 Section 102-118.450 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION TRANSPORTATION...

  2. 77 FR 8252 - The International Consortium of Energy Managers; Notice of Preliminary Permit Application...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-14

    ... International Consortium of Energy Managers; Notice of Preliminary Permit Application Accepted for Filing and... Consortium of Energy Managers filed an application, pursuant to section 4(f) of the Federal Power Act (FPA...: Rexford Wait, International Consortium of Energy Managers, 2416 Cades Way, Vista, CA 92083; (760) 599-0086...

  3. 78 FR 34408 - Notice of Applications for Deregistration Under Section 8(f) of the Investment Company Act of 1940

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-07

    ... reorganization were paid by Stadion Money Management, LLC, investment adviser to the acquiring fund. Filing Date... Investment Management, Exemptive Applications Office, 100 F Street NE., Washington, DC 20549-8010... Money Market [File No. 811-2910] Madison Mosaic Tax-Free Trust [File No. 811-3486] Madison Mosaic Income...

  4. 20 CFR 658.411 - Filing and assignment of JS-related complaints.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... this section shall be handled by the local office manager or assigned by the local office manager to a...-related complaints may be filed in any office of the State job service agency. (b) Assignment of complaints to local office personnel shall be as follows: (1) All JS-related complaints filed with a local...

  5. 20 CFR 658.411 - Filing and assignment of JS-related complaints.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... this section shall be handled by the local office manager or assigned by the local office manager to a... complaints may be filed in any office of the State job service agency. (b) Assignment of complaints to local office personnel shall be as follows: (1) All JS-related complaints filed with a local office, and...

  6. 20 CFR 658.411 - Filing and assignment of JS-related complaints.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... this section shall be handled by the local office manager or assigned by the local office manager to a...-related complaints may be filed in any office of the State job service agency. (b) Assignment of complaints to local office personnel shall be as follows: (1) All JS-related complaints filed with a local...

  7. 20 CFR 658.411 - Filing and assignment of JS-related complaints.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... this section shall be handled by the local office manager or assigned by the local office manager to a...-related complaints may be filed in any office of the State job service agency. (b) Assignment of complaints to local office personnel shall be as follows: (1) All JS-related complaints filed with a local...

  8. Response, Emergency Staging, Communications, Uniform Management, and Evacuation (R.E.S.C.U.M.E.) : Concept of Operations. [supporting datasets

    DOT National Transportation Integrated Search

    2012-10-31

    This zip file contains 45 files of data to support FHWA-JPO-13-063 Response, Emergency Staging, Communications, Uniform Management, and Evacuation (R.E.S.C.U.M.E.) : Concept of Operations. Zip size is 9.9 MB. The files have been uploaded as-is; no fu...

  9. BioPig: Developing Cloud Computing Applications for Next-Generation Sequence Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhatia, Karan; Wang, Zhong

    Next Generation sequencing is producing ever larger data sizes, with a growth rate outpacing Moore's Law. The data deluge has made many of the current sequence analysis tools obsolete because they do not scale with data. Here we present BioPig, a collection of cloud computing tools to scale data analysis and management. Pig is a flexible data scripting language that uses Apache's Hadoop data structure and MapReduce framework to process very large data files in parallel and combine the results. BioPig extends Pig with sequence analysis capability. We show the performance of BioPig on a variety of bioinformatics tasks, including screening sequence contaminants, Illumina QA/QC, and gene discovery from metagenome data sets, using the Rumen metagenome as an example.
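
    As a toy illustration of the MapReduce pattern BioPig builds on, the sketch below counts k-mers with a map step per read and a single reduce step; on Hadoop the same two steps would run distributed over file chunks. The function names and data are invented for illustration.

        from collections import Counter
        from functools import reduce

        def mapper(read, k=4):
            # Map step: emit counts of every k-mer in one read.
            return Counter(read[i:i + k] for i in range(len(read) - k + 1))

        def reducer(acc, part):
            # Reduce step: merge partial k-mer counts.
            acc.update(part)
            return acc

        reads = ["GATTACAGATTACA", "ACAGATT", "TTACAGA"]  # toy input
        total = reduce(reducer, map(mapper, reads), Counter())
        print(total.most_common(3))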

  10. A tool for optimization of the production and user analysis on the Grid, C. Grigoras for the ALICE Collaboration

    NASA Astrophysics Data System (ADS)

    Grigoras, Costin; Carminati, Federico; Vladimirovna Datskova, Olga; Schreiner, Steffen; Lee, Sehoon; Zhu, Jianlin; Gheata, Mihaela; Gheata, Andrei; Saiz, Pablo; Betev, Latchezar; Furano, Fabrizio; Mendez Lorenzo, Patricia; Grigoras, Alina Gabriela; Bagnasco, Stefano; Peters, Andreas Joachim; Saiz Santos, Maria Dolores

    2011-12-01

    With the LHC and ALICE entering full operation and production modes, the number of simulation, RAW data processing and end-user analysis computational tasks is increasing. The efficient management of all these tasks, which differ widely in lifecycle, amount of processed data and methods to analyze the end result, required the development and deployment of new tools in addition to the already existing Grid infrastructure. To facilitate the management of the large-scale simulation and raw data reconstruction tasks, ALICE has developed a production framework called the Lightweight Production Manager (LPM). LPM automatically submits jobs to the Grid based on triggers and conditions, for example after completion of a physics run. It follows the evolution of each job and publishes the results on the web for worldwide access by ALICE physicists. This framework is tightly integrated with the ALICE Grid framework AliEn. In addition to publishing job status, LPM provides a fully authenticated interface to the AliEn Grid catalogue, to browse and download files, and in the near future will provide simple types of data analysis through ROOT plugins. The framework is also being extended to allow management of end-user jobs.

  11. Maintaining Research Documents with Database Management Software.

    ERIC Educational Resources Information Center

    Harrington, Stuart A.

    1999-01-01

    Discusses taking notes for research projects and organizing them into card files; reviews the literature on personal filing systems; introduces the basic process of database management; and offers a plan for managing research notes. Describes field groups and field definitions, data entry, and creating reports. (LRW)

  12. LOGISTIC MANAGEMENT INFORMATION SYSTEM - MANUAL DATA STORAGE AND RETRIEVAL SYSTEM.

    DTIC Science & Technology

    Logistics Management Information System . The procedures are applicable to manual storage and retrieval of all data used in the Logistics Management ... Information System (LMIS) and include the following: (1) Action Officer data source file. (2) Action Officer presentation format file. (3) LMI Coordination

  13. 75 FR 63853 - Filing of Plats of Survey: California

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-18

    ... Management California State Office, Sacramento, California, on the next business day following the plat... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLCA 942000 L57000000 BX0000] Filing of Plats of Survey: California AGENCY: Bureau of Land Management, Interior. ACTION: Notice. SUMMARY: The...

  14. Pizza.py Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plimpton, Steve; Jones, Matt; Crozier, Paul

    2006-01-01

    Pizza.py is a loosely integrated collection of tools, many of which provide support for the LAMMPS molecular dynamics and ChemCell cell modeling packages. There are tools to create input files, convert between file formats, process log and dump files, create plots, and visualize and animate simulation snapshots. Software packages that are wrapped by Pizza.py, so they can be invoked from within Python, include GnuPlot, MatLab, Raster3D, and RasMol. Pizza.py is written in Python and runs on any platform that supports Python. Pizza.py enhances the standard Python interpreter in a few simple ways. Its tools are Python modules which can be invoked interactively, from scripts, or from GUIs when appropriate. Some of the tools require additional Python packages to be installed as part of the user's Python. Others are wrappers on software packages (as listed above) which must be available on the user's system. It is easy to modify or extend Pizza.py with new functionality or new tools, which need not have anything to do with LAMMPS or ChemCell.
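
    A minimal sketch of the tool-as-module pattern Pizza.py uses: each tool is an importable class that can be driven interactively, from scripts, or from a GUI callback. The class name, file format, and usage below are invented for illustration, not Pizza.py's actual tools.

        # toolkit.py -- a Pizza.py-style tool module.
        class LogTool:
            """Parse a log file assumed to hold a header line of column
            names followed by whitespace-delimited numeric rows."""
            def __init__(self, path):
                with open(path) as fh:
                    rows = [line.split() for line in fh if line.strip()]
                self.names = rows[0]
                self.data = [[float(v) for v in r] for r in rows[1:]]

            def get(self, name):
                col = self.names.index(name)
                return [row[col] for row in self.data]

        # Interactive-style usage:
        #   >>> from toolkit import LogTool
        #   >>> lg = LogTool("run.log"); lg.get("Temp")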

  15. CCDST: A free Canadian climate data scraping tool

    NASA Astrophysics Data System (ADS)

    Bonifacio, Charmaine; Barchyn, Thomas E.; Hugenholtz, Chris H.; Kienzle, Stefan W.

    2015-02-01

    In this paper we present a new software tool that automatically fetches, downloads and consolidates climate data from a Web database in which the data are spread across multiple Web pages. The tool is called the Canadian Climate Data Scraping Tool (CCDST) and was developed to enhance access to, and simplify analysis of, climate data from Canada's National Climate Data and Information Archive (NCDIA). The CCDST deconstructs the URL for a particular climate station in the NCDIA and then iteratively modifies the date parameters to download large volumes of data, remove individual file headers, and merge the data files into one output file. This automated sequence enhances access to climate data by substantially reducing the time needed to manually download data from multiple Web pages. To this end, we present a case study of the temporal dynamics of blowing snow events in which the tool saved ~3.1 weeks of manual downloading time. Without the CCDST, the time involved in manually downloading climate data limits access and restrains researchers and students from exploring climate trends. The tool is coded as a Microsoft Excel macro and is available to researchers and students for free. The main concept and structure of the tool can be modified for other Web databases hosting geophysical data.
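
    A minimal sketch of the scrape-and-merge pattern the CCDST automates: rewrite the date parameters in a station URL, download each page, keep only the first header, and concatenate the rows. The endpoint and parameter names below are placeholders; the real NCDIA URL differs.

        import csv, io, urllib.request

        # Placeholder endpoint: the real NCDIA URL and parameters differ.
        BASE = ("http://climate.example.ca/data?stationID={sid}"
                "&Year={year}&Month={month}&format=csv")

        def fetch_station(sid, years, months=range(1, 13)):
            header, rows = None, []
            for year in years:
                for month in months:
                    url = BASE.format(sid=sid, year=year, month=month)
                    with urllib.request.urlopen(url) as resp:
                        page = list(csv.reader(io.TextIOWrapper(resp, "utf-8")))
                    if header is None:
                        header = page[0]      # keep the first header only
                    rows.extend(page[1:])     # strip repeated per-file headers
            return [header] + rows

        # merged = fetch_station(2205, years=range(2000, 2005))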

  16. Social Networking Adapted for Distributed Scientific Collaboration

    NASA Technical Reports Server (NTRS)

    Karimabadi, Homa

    2012-01-01

    Sci-Share is a social networking site with novel, specially designed feature sets to enable simultaneous remote collaboration and sharing of large data sets among scientists. The site includes not only the standard features found on popular consumer-oriented social networking sites such as Facebook and Myspace, but also a number of powerful tools that extend its functionality to a science collaboration site. A Virtual Observatory is a promising technology for making data accessible from various missions and instruments through a Web browser. Sci-Share augments services provided by Virtual Observatories by enabling distributed collaboration and sharing of downloaded and/or processed data among scientists. This will, in turn, increase science returns from NASA missions. Sci-Share also enables better utilization of NASA's high-performance computing resources by providing an easy and central mechanism to access and share large files in users' space or those saved on mass storage. The most common means of remote scientific collaboration today remains the trio of e-mail for electronic communication, FTP for file sharing, and personalized Web sites for dissemination of papers and research results. Each of these tools has well-known limitations. Sci-Share transforms the social networking paradigm into a scientific collaboration environment by offering powerful tools for cooperative discourse and digital content sharing. Sci-Share differentiates itself by serving as an online repository for users' digital content with the following unique features: a) sharing of any file type, any size, from anywhere; b) creation of projects and groups for controlled sharing; c) a module for sharing files on HPC (High Performance Computing) sites; d) universal accessibility of staged files as embedded links on other sites (e.g. Facebook) and tools (e.g. e-mail); e) drag-and-drop transfer of large files, replacing awkward e-mail attachments (and file size limitations); f) enterprise-level data and messaging encryption; and g) an easy-to-use, intuitive workflow.

  17. Framework for Integrating Science Data Processing Algorithms Into Process Control Systems

    NASA Technical Reports Server (NTRS)

    Mattmann, Chris A.; Crichton, Daniel J.; Chang, Albert Y.; Foster, Brian M.; Freeborn, Dana J.; Woollard, David M.; Ramirez, Paul M.

    2011-01-01

    A software framework called PCS Task Wrapper is responsible for standardizing the setup, process initiation, execution, and file management tasks surrounding the execution of science data algorithms, which are referred to by NASA as Product Generation Executives (PGEs). PGEs codify a scientific algorithm, some step in the overall scientific process involved in a mission science workflow. The PCS Task Wrapper provides a stable operating environment to the underlying PGE during its execution lifecycle. If the PGE requires a file, or metadata regarding the file, the PCS Task Wrapper is responsible for delivering that information to the PGE in a manner that meets its requirements. If the PGE requires knowledge of upstream or downstream PGEs in a sequence of executions, that information is also made available. Finally, if information regarding disk space, or node information such as CPU availability, is required, the PCS Task Wrapper provides it to the underlying PGE. After this information is collected, the PGE is executed, and its output Product file and Metadata generation are managed via the PCS Task Wrapper framework. The innovation is responsible for marshalling output Products and Metadata back to a PCS File Management component for use in downstream data processing and pedigree. In support of this, the PCS Task Wrapper leverages the PCS Crawler Framework to ingest (during pipeline processing) the output Product files and Metadata produced by the PGE. The architectural components of the PCS Task Wrapper framework include PGE Task Instance, PGE Config File Builder, Config File Property Adder, Science PGE Config File Writer, and PCS Met file Writer. This framework is the unifying bridge between the execution of a step in the overall processing pipeline and the available PCS component services, as well as the information that they collectively manage.
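
    The stage-configure-execute-crawl lifecycle described here is a general wrapper pattern; a compact sketch under invented names (the command, file names, and output convention are placeholders, not the PCS Task Wrapper's actual interfaces):

        import json, pathlib, shutil, subprocess

        def run_pge(pge_cmd, input_files, workdir, metadata):
            # Stage inputs, build a config file, execute the algorithm,
            # then "crawl" the work area for products to ingest.
            work = pathlib.Path(workdir)
            work.mkdir(parents=True, exist_ok=True)
            for f in input_files:                    # file staging step
                shutil.copy(f, work)
            cfg = work / "pge_config.json"           # config-file builder step
            cfg.write_text(json.dumps(metadata))
            subprocess.run(list(pge_cmd) + [str(cfg)], cwd=work, check=True)
            return sorted(work.glob("output_*"))     # crawl products

        # run_pge(["./my_pge"], ["granule.dat"], "work/job1", {"orbit": 42})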

  18. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    PubMed

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    The C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture system data. However, few software packages can visualize and modify the entirety of the data in a C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis system using the standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through a C++ API, bindings for high-level languages (Matlab, Octave, and Python), and a standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
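
    A minimal sketch of reading and rewriting a C3D file through BTK's Python bindings, following the reader/writer pipeline pattern the toolkit exposes; the method names should be verified against the current BTK documentation, and the file names are placeholders.

        import btk  # BTK's Python bindings

        # Read an acquisition from a C3D file.
        reader = btk.btkAcquisitionFileReader()
        reader.SetFilename("trial.c3d")
        reader.Update()
        acq = reader.GetOutput()
        print(acq.GetPointFrequency(), "Hz,",
              acq.GetPointFrameNumber(), "frames")

        # Write it back out (the format is picked from the extension).
        writer = btk.btkAcquisitionFileWriter()
        writer.SetInput(acq)
        writer.SetFilename("trial_copy.c3d")
        writer.Update()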

  19. 75 FR 74038 - Twin Eagle Resource Management, LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-30

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER11-2154-000] Twin Eagle Resource Management, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for... proceeding, of Twin Eagle Resource Management, LLC's [[Page 74039

  20. Active Management of Integrated Geothermal-CO2 Storage Reservoirs in Sedimentary Formations

    DOE Data Explorer

    Buscheck, Thomas A.

    2012-01-01

    Active Management of Integrated Geothermal–CO2 Storage Reservoirs in Sedimentary Formations: An Approach to Improve Energy Recovery and Mitigate Risk: FY1 Final Report. The purpose of phase 1 is to determine the feasibility of integrating geologic CO2 storage (GCS) with geothermal energy production. Phase 1 includes reservoir analyses to determine injector/producer well schemes that balance the generation of economically useful flow rates at the producers with the need to manage reservoir overpressure, reducing the risks associated with overpressure, such as induced seismicity and CO2 leakage to overlying aquifers. This submittal contains the input and output files of the reservoir model analyses. A reservoir-model index HTML file was sent in a previous submittal to organize the reservoir-model input and output files according to the sections of the FY1 Final Report to which they pertain. The recipient should save the file Reservoir-models-inputs-outputs-index.html in the same directory as the Section2.1.*.tar.gz files.

  1. Active Management of Integrated Geothermal-CO2 Storage Reservoirs in Sedimentary Formations

    DOE Data Explorer

    Buscheck, Thomas A.

    2000-01-01

    Active Management of Integrated Geothermal–CO2 Storage Reservoirs in Sedimentary Formations: An Approach to Improve Energy Recovery and Mitigate Risk: FY1 Final Report. The purpose of phase 1 is to determine the feasibility of integrating geologic CO2 storage (GCS) with geothermal energy production. Phase 1 includes reservoir analyses to determine injector/producer well schemes that balance the generation of economically useful flow rates at the producers with the need to manage reservoir overpressure, reducing the risks associated with overpressure, such as induced seismicity and CO2 leakage to overlying aquifers. This submittal contains the input and output files of the reservoir model analyses. A reservoir-model index HTML file was sent in a previous submittal to organize the reservoir-model input and output files according to the sections of the FY1 Final Report to which they pertain. The recipient should save the file Reservoir-models-inputs-outputs-index.html in the same directory as the Section2.1.*.tar.gz files.

  2. GTZ: a fast compression and cloud transmission tool optimized for FASTQ files.

    PubMed

    Xing, Yuting; Li, Gen; Wang, Zhenguo; Feng, Bolun; Song, Zhuo; Wu, Chengkun

    2017-12-28

    The dramatic development of DNA sequencing technology is generating real big data, craving more storage and bandwidth. To speed up data sharing and bring data to computing resources faster and cheaper, it is necessary to develop a compression tool that can support efficient compression and transmission of sequencing data onto cloud storage. This paper presents GTZ, a compression and transmission tool optimized for FASTQ files. As a reference-free lossless FASTQ compressor, GTZ treats the different lines of FASTQ separately, utilizes adaptive context modelling to estimate their characteristic probabilities, and compresses data blocks with arithmetic coding. GTZ can also be used to compress multiple files or directories at once. Furthermore, as a tool for the cloud computing era, it can save compressed data locally or transmit it directly into the cloud, by choice. We evaluated the performance of GTZ on several diverse FASTQ benchmarks. Results show that in most cases it outperforms many other tools in terms of compression ratio, speed and stability. GTZ is a tool that enables efficient lossless FASTQ data compression and simultaneous data transmission onto the cloud. It emerges as a useful tool for NGS data storage and transmission in the cloud environment. GTZ is freely available online at: https://github.com/Genetalks/gtz .
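
    The stream separation GTZ applies is easy to sketch: a FASTQ record is four lines (header, sequence, '+', qualities), and each of the four streams compresses better when modelled on its own. The toy below uses zlib as a stand-in for GTZ's context models and arithmetic coder.

        import zlib

        def split_streams(fastq_lines):
            # Route line i of each 4-line record to stream i mod 4:
            # headers, sequences, separators, qualities.
            streams = [[], [], [], []]
            for i, line in enumerate(fastq_lines):
                streams[i % 4].append(line)
            return ["\n".join(s).encode() for s in streams]

        fastq = ["@r1", "GATTACA", "+", "IIIIIII",
                 "@r2", "GATTACC", "+", "IIIIIIH"]
        whole = len(zlib.compress("\n".join(fastq).encode()))
        split = sum(len(zlib.compress(s)) for s in split_streams(fastq))
        # On real data the per-stream total is usually the smaller one.
        print(whole, split)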

  3. Personal File Management for the Health Sciences.

    ERIC Educational Resources Information Center

    Apostle, Lynne

    Written as an introduction to the concepts of creating a personal or reprint file, this workbook discusses both manual and computerized systems, with emphasis on the preliminary groundwork that needs to be done before starting any filing system. A file assessment worksheet is provided; considerations in developing a personal filing system are…

  4. Automating linear accelerator quality assurance.

    PubMed

    Eckhause, Tobias; Al-Hallaq, Hania; Ritter, Timothy; DeMarco, John; Farrey, Karl; Pawlicki, Todd; Kim, Gwe-Ya; Popple, Richard; Sharma, Vijeshwar; Perez, Mario; Park, SungYong; Booth, Jeremy T; Thorwarth, Ryan; Moran, Jean M

    2015-10-01

    The purpose of this study was twofold. One purpose was to develop an automated, streamlined quality assurance (QA) program for use by multiple centers. The second was to evaluate machine performance over time for multiple centers using linear accelerator (Linac) log files and electronic portal images. The authors sought to evaluate variations in Linac performance to establish a reference for other centers. The authors developed analytical software tools for a QA program using both log files and electronic portal imaging device (EPID) measurements. The first tool is a general analysis tool which can read and visually represent data in the log file. This tool, which can be used to automatically analyze patient treatment or QA log files, examines the files for Linac deviations which exceed thresholds. The second set of tools consists of a test suite of QA fields, a standard phantom, and software to collect information from the log files on deviations from the expected values. The test suite was designed to focus on the mechanical tests of the Linac, including jaw, MLC, and collimator positions during static, IMRT, and volumetric modulated arc therapy delivery. A consortium of eight institutions delivered the test suite at monthly or weekly intervals on each Linac using a standard phantom. The behavior of various components was analyzed for eight TrueBeam Linacs. For the EPID and trajectory log file analysis, all observed deviations which exceeded established thresholds for Linac behavior resulted in a beam hold-off. In the absence of an interlock-triggering event, the maximum observed log file deviations between the expected and actual component positions (such as MLC leaves) varied from less than 1% to 26% of published tolerance thresholds. The maximum and standard deviations of the variations due to gantry sag, collimator angle, jaw position, and MLC positions are presented. Gantry sag among Linacs was 0.336 ± 0.072 mm. The standard deviation in MLC position, as determined by EPID measurements, across the consortium was 0.33 mm for IMRT fields. With respect to the log files, the deviations between expected and actual positions for parameters were small (<0.12 mm) for all Linacs. Considering both log files and EPID measurements, all parameters were well within published tolerance values. Variations in collimator angle, MLC position, and gantry sag were also evaluated for all Linacs. The performance of the TrueBeam Linac model was shown to be consistent based on automated analysis of trajectory log files and EPID images acquired during delivery of a standardized test suite. The results can be compared directly to tolerance thresholds. In addition, sharing of results from standard tests across institutions can facilitate the identification of QA process and Linac changes. These reference values are presented along with the standard deviations for common tests so that the test suite can be used by other centers to evaluate their Linac performance against those in this consortium.
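
    A minimal sketch of the kind of automated threshold check such a QA tool performs on trajectory-log records; the record format and tolerance values below are invented for illustration, not the study's actual thresholds.

        # Invented trajectory-log records: (axis, expected_mm, actual_mm).
        TOLERANCE_MM = {"mlc": 1.0, "jaw": 1.0, "gantry": 0.5}

        def flag_deviations(records, tol=TOLERANCE_MM):
            """Return every expected-vs-actual deviation over its threshold."""
            return [(axis, abs(exp - act))
                    for axis, exp, act in records
                    if abs(exp - act) > tol[axis]]

        log = [("mlc", 10.00, 10.05), ("jaw", 50.0, 51.2), ("gantry", 0.0, 0.1)]
        print(flag_deviations(log))   # -> [('jaw', 1.2...)]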

  5. An overview of the catalog manager

    NASA Technical Reports Server (NTRS)

    Irani, Frederick M.

    1986-01-01

    The Catalog Manager (CM) is being used at the Goddard Space Flight Center in conjunction with the Land Analysis System (LAS) running under the Transportable Applications Executive (TAE). CM maintains a catalog of file names for all users of the LAS system. The catalog provides a cross-reference between TAE user file names and fully qualified host-file names. It also maintains information about the content and status of each file. A brief history of CM development is given and a description of naming conventions, catalog structure and file attributes, and archive/retrieve capabilities is presented. General user operation and the LAS user scenario are also discussed.
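
    The cross-reference CM maintains can be pictured as a mapping from TAE user file names to fully qualified host file names plus content and status metadata; a toy sketch with invented names:

        # Toy cross-reference in the spirit of CM: TAE user file names
        # mapped to fully qualified host file names plus status metadata.
        catalog = {
            "SCENE1": {"host_path": "/las/data/u042/scene1.img",
                       "status": "online", "content": "TM image"},
        }

        def host_name(user_name):
            # Resolve a TAE user file name to its host file name.
            return catalog[user_name.upper()]["host_path"]

        print(host_name("scene1"))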

  6. LipidMiner: A Software for Automated Identification and Quantification of Lipids from Multiple Liquid Chromatography-Mass Spectrometry Data Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meng, Da; Zhang, Qibin; Gao, Xiaoli

    2014-04-30

    We have developed a tool for automated, high-throughput analysis of LC-MS/MS data files, which greatly simplifies LC-MS based lipidomics analysis. Our results showed that LipidMiner is accurate and comprehensive in the identification and quantification of lipid molecular species. In addition, the workflow implemented in LipidMiner is not limited to the identification and quantification of lipids. If a suitable metabolite library is implemented in the library matching module, LipidMiner could be reconfigured as a tool for general metabolomics data analysis. Note that LipidMiner is currently limited to singly charged ions, although this is adequate for the purpose of lipidomics, since lipids are rarely multiply charged [14], even the polyphosphoinositides. LipidMiner also only processes the file format generated by mass spectrometers from Thermo, i.e. the .RAW format. In the future, we plan to accommodate file formats generated by mass spectrometers from other predominant instrument vendors to make this tool more universal.

  7. ColorTree: a batch customization tool for phylogenic trees

    PubMed Central

    Chen, Wei-Hua; Lercher, Martin J

    2009-01-01

    Background: Genome sequencing projects and comparative genomics studies typically aim to trace the evolutionary history of large gene sets, often requiring human inspection of hundreds of phylogenetic trees. If trees are checked for compatibility with an explicit null hypothesis (e.g., the monophyly of certain groups), this daunting task is greatly facilitated by an appropriate coloring scheme. Findings: In this note, we introduce ColorTree, a simple yet powerful batch customization tool for phylogenic trees. Based on pattern matching rules, ColorTree applies a set of customizations to an input tree file, e.g., coloring labels or branches. The customized trees are saved to an output file, which can then be viewed and further edited by Dendroscope (a freely available tree viewer). ColorTree runs on any Perl installation as a stand-alone command line tool, and its application can thus be easily automated. This way, hundreds of phylogenic trees can be customized for easy visual inspection in a matter of minutes. Conclusion: ColorTree allows efficient and flexible visual customization of large tree sets through the application of a user-supplied configuration file to multiple tree files. PMID:19646243

  8. ColorTree: a batch customization tool for phylogenic trees.

    PubMed

    Chen, Wei-Hua; Lercher, Martin J

    2009-07-31

    Genome sequencing projects and comparative genomics studies typically aim to trace the evolutionary history of large gene sets, often requiring human inspection of hundreds of phylogenetic trees. If trees are checked for compatibility with an explicit null hypothesis (e.g., the monophyly of certain groups), this daunting task is greatly facilitated by an appropriate coloring scheme. In this note, we introduce ColorTree, a simple yet powerful batch customization tool for phylogenic trees. Based on pattern matching rules, ColorTree applies a set of customizations to an input tree file, e.g., coloring labels or branches. The customized trees are saved to an output file, which can then be viewed and further edited by Dendroscope (a freely available tree viewer). ColorTree runs on any Perl installation as a stand-alone command line tool, and its application can thus be easily automated. This way, hundreds of phylogenic trees can be customized for easy visual inspection in a matter of minutes. ColorTree allows efficient and flexible visual customization of large tree sets through the application of a user-supplied configuration file to multiple tree files.
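
    A toy version of the pattern-matching idea: scan leaf labels in a Newick string and attach a color annotation when a rule matches. The rule format and the FigTree-style '[&!color=...]' tag are illustrative assumptions, not ColorTree's actual configuration syntax.

        import re

        # Invented rule format: regex -> color, applied to Newick leaf labels.
        rules = [(re.compile(r"^Homo_"), "#ff0000"),
                 (re.compile(r"^Pan_"),  "#0000ff")]

        def colorize(newick):
            def paint(match):
                label = match.group(0)
                for pattern, color in rules:
                    if pattern.search(label):
                        return f"{label}[&!color={color}]"
                return label
            return re.sub(r"[A-Za-z_]\w*", paint, newick)

        print(colorize("(Homo_sapiens,(Pan_troglodytes,Mus_musculus));"))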

  9. CONNJUR spectrum translator: an open source application for reformatting NMR spectral data.

    PubMed

    Nowling, Ronald J; Vyas, Jay; Weatherby, Gerard; Fenwick, Matthew W; Ellis, Heidi J C; Gryk, Michael R

    2011-05-01

    NMR spectroscopists are hindered by the lack of standardization for spectral data among the file formats of the various NMR data processing tools. This lack of standardization is cumbersome, as researchers must perform their own file conversion in order to switch between processing tools, and it also restricts the combination of tools employed if no conversion option is available. The CONNJUR Spectrum Translator introduces a new, extensible architecture for spectrum translation and introduces two key algorithmic improvements. The first is translation of NMR spectral data (time and frequency domain) to a single in-memory data model, which allows a new file format to be added with two converter modules, a reader and a writer, instead of a separate converter to each existing format. Second, the use of layout descriptors allows a single FID data translation engine to be used for all formats. For the end user, sophisticated metadata readers allow conversion of the majority of files with minimal user configuration. The open source code is freely available at http://connjur.sourceforge.net for inspection and extension.
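
    The architectural point, that one in-memory model turns N-squared pairwise converters into N readers plus N writers, can be sketched in a few lines; the format name and model fields are invented for illustration, not CONNJUR's actual classes.

        # One in-memory model: N formats need N readers + N writers,
        # not N*N pairwise converters.
        class Spectrum:
            def __init__(self, points, layout):
                self.points, self.layout = points, layout

        READERS, WRITERS = {}, {}

        def reader(fmt):
            def register(fn):
                READERS[fmt] = fn
                return fn
            return register

        def writer(fmt):
            def register(fn):
                WRITERS[fmt] = fn
                return fn
            return register

        @reader("toy-text")
        def read_toy(path):
            with open(path) as fh:
                return Spectrum([float(x) for x in fh.read().split()], "1D")

        @writer("toy-text")
        def write_toy(spec, path):
            with open(path, "w") as fh:
                fh.write(" ".join(map(str, spec.points)))

        def translate(src_path, src_fmt, dst_path, dst_fmt):
            WRITERS[dst_fmt](READERS[src_fmt](src_path), dst_path)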

  10. Exploration Supply Chain Simulation

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The Exploration Supply Chain Simulation project was chartered by the NASA Exploration Systems Mission Directorate to develop a software tool, with proper data, to quantitatively analyze supply chains for future program planning. This tool is a discrete-event simulation that uses the basic supply chain concepts of planning, sourcing, making, delivering, and returning. This supply chain perspective is combined with other discrete or continuous simulation factors. Discrete resource events (such as launch or delivery reviews) are represented as organizational functional units. Continuous resources (such as civil service or contractor program functions) are defined as enabling functional units. Concepts of fixed and variable costs are included in the model to allow the discrete events to interact with cost calculations. The definition file is intrinsic to the model, but a blank start can be initiated at any time. The current definition file is an Orion Ares I crew launch vehicle. Parameters stretch from Kennedy Space Center across and into other program entities (Michoud Assembly Facility, Alliant Techsystems, Stennis Space Center, Johnson Space Center, etc.), though these will only gain detail as the file continues to evolve. The Orion Ares I file definition in the tool continues to evolve, and analysis from this tool is expected in 2008. This is the first application of such business-driven modeling to a NASA/government-aerospace contractor endeavor.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Temple, Brian Allen; Armstrong, Jerawan Chudoung

    This document is a mid-year report on a deliverable for the PYTHON Radiography Analysis Tool (PyRAT) for project LANL12-RS-107J in FY15. The deliverable is number 2 in the work package and is titled "Add the ability to read in more types of image file formats in PyRAT". Currently, PyRAT can only read uncompressed TIFF files. It is planned to expand the file formats that PyRAT can read, making it easier to use in more situations. The added file formats include JPEG (.jpg/.jpeg), PNG, and formatted ASCII files.
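
    A minimal sketch of such a multi-format reader, assuming Pillow and NumPy are available; the function name and the ASCII-grid convention are illustrative, not PyRAT's actual code.

        import numpy as np
        from PIL import Image

        def load_radiograph(path):
            """Load TIFF/JPEG/PNG via Pillow, or a formatted ASCII grid
            via numpy, into one ndarray for analysis."""
            if path.lower().endswith((".txt", ".asc", ".dat")):
                return np.loadtxt(path)
            return np.asarray(Image.open(path))

        # img = load_radiograph("shot42.png")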

  12. 17 CFR 230.486 - Effective date of post-effective amendments and registration statements filed by certain closed...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...-effective amendments and registration statements filed by certain closed-end management investment companies...-end management investment companies. (a) Automatic effectiveness. Except as otherwise provided in this... management investment company or business development company which makes periodic repurchase offers under...

  13. 36 CFR 1222.20 - How are personal files defined and managed?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 36 Parks, Forests, and Public Property 3 2014-07-01 2014-07-01 false How are personal files defined and managed? 1222.20 Section 1222.20 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT CREATION AND MAINTENANCE OF FEDERAL RECORDS Identifying Federal...

  14. 36 CFR 1222.20 - How are personal files defined and managed?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 36 Parks, Forests, and Public Property 3 2012-07-01 2012-07-01 false How are personal files defined and managed? 1222.20 Section 1222.20 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT CREATION AND MAINTENANCE OF FEDERAL RECORDS Identifying Federal...

  15. 48 CFR 833.103 - Protests to VA.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... review by filing a protest with the Deputy Assistant Secretary for Acquisition and Materiel Management... Management. A protest filed with the DAS for A&MM or the Director, Office of Construction and Facilities Management, will not be considered if the interested party has a protest on the same or similar issues...

  16. 77 FR 32980 - Notice of Correction to Filing of Plats, Colorado

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-04

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLCO956000 L14200000.BJ0000] Notice of Correction to Filing of Plats, Colorado AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Correction, Colorado. SUMMARY: On May 23, 2012, the Bureau of Land Management (BLM) published a Notice of...

  17. Creating Interactive Graphical Overlays in the Advanced Weather Interactive Processing System Using Shapefiles and DGM Files

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Lafosse, Richard; Hood, Doris; Hoeth, Brian

    2007-01-01

    Graphical overlays can be created in real-time in the Advanced Weather Interactive Processing System (AWIPS) using shapefiles or Denver AWIPS Risk Reduction and Requirements Evaluation (DARE) Graphics Metafile (DGM) files. This presentation describes how to create graphical overlays on-the-fly for AWIPS, by using two examples of AWIPS applications that were created by the Applied Meteorology Unit (AMU) located at Cape Canaveral Air Force Station (CCAFS), Florida. The first example is the Anvil Threat Corridor Forecast Tool, which produces a shapefile that depicts a graphical threat corridor of the forecast movement of thunderstorm anvil clouds, based on the observed or forecast upper-level winds. This tool is used by the Spaceflight Meteorology Group (SMG) at Johnson Space Center, Texas and 45th Weather Squadron (45 WS) at CCAFS to analyze the threat of natural or space vehicle-triggered lightning over a location. The second example is a launch and landing trajectory tool that produces a DGM file that plots the ground track of space vehicles during launch or landing. The trajectory tool can be used by SMG and the 45 WS forecasters to analyze weather radar imagery along a launch or landing trajectory. The presentation will list the advantages and disadvantages of both file types for creating interactive graphical overlays in future AWIPS applications. Shapefiles are a popular format used extensively in Geographical Information Systems. They are usually used in AWIPS to depict static map backgrounds. A shapefile stores the geometry and attribute information of spatial features in a dataset (ESRI 1998). Shapefiles can contain point, line, and polygon features. Each shapefile contains a main file, index file, and a dBASE table. The main file contains a record for each spatial feature, which describes the feature with a list of its vertices. The index file contains the offset of each record from the beginning of the main file. The dBASE table contains records for each attribute. Attributes are commonly used to label spatial features. Shapefiles can be viewed, but not created in AWIPS. As a result, either third-party software can be installed on an AWIPS workstation, or new software must be written to create shapefiles in the correct format.
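
    A short sketch of writing such an overlay shapefile with the pyshp package, which produces the main (.shp), index (.shx), and dBASE (.dbf) parts together; the coordinates and attribute name are invented for illustration.

        import shapefile  # the pyshp package

        w = shapefile.Writer("anvil_corridor", shapeType=shapefile.POLYGON)
        w.field("LABEL", "C", size=40)          # dBASE attribute column
        w.poly([[(-80.6, 28.5), (-80.2, 28.9),  # one closed exterior ring
                 (-79.9, 28.6), (-80.6, 28.5)]])
        w.record("anvil threat corridor")
        w.close()
        # Produces anvil_corridor.shp/.shx/.dbf.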

  18. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data

    PubMed Central

    2011-01-01

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper. PMID:21410968

  19. HEP Computing Tools, Grid and Supercomputers for Genome Sequencing Studies

    NASA Astrophysics Data System (ADS)

    De, K.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Novikov, A.; Poyda, A.; Tertychnyy, I.; Wenaus, T.

    2017-10-01

    PanDA, the Production and Distributed Analysis Workload Management System, has been developed to address the data processing and analysis challenges of the ATLAS experiment at the LHC. Recently PanDA has been extended to run HEP scientific applications on Leadership Class Facilities and supercomputers. The success of the projects using PanDA beyond HEP and the Grid has drawn attention from other compute-intensive sciences such as bioinformatics. Recent advances in Next Generation Genome Sequencing (NGS) technology have led to increasing streams of sequencing data that need to be processed, analysed and made available for bioinformaticians worldwide. Analysis of genome sequencing data using the popular software pipeline PALEOMIX can take a month even on a powerful computing resource. In this paper we describe the adaptation of the PALEOMIX pipeline to run in a distributed computing environment powered by PanDA. To run the pipeline we split the input files into chunks which are processed separately on different nodes as separate PALEOMIX inputs, and finally merge the output files; this is very similar to how ATLAS processes and simulates data. We dramatically decreased the total walltime through automated job (re)submission and brokering within PanDA. Using software tools developed initially for HEP and the Grid reduced payload execution time for mammoth DNA samples from weeks to days.
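
    The split/run/merge workflow described here is a plain scatter-gather pattern; a compact sketch in which the 'pipeline' command stands in for the real PALEOMIX invocation and the line-based chunking is a simplification of format-aware splitting:

        import concurrent.futures, pathlib, subprocess

        def split(path, n_chunks):
            lines = pathlib.Path(path).read_text().splitlines(keepends=True)
            # Round the step to a multiple of 4 so FASTQ records stay intact.
            step = max(4, (len(lines) // n_chunks) // 4 * 4)
            for i in range(0, len(lines), step):
                part = pathlib.Path(f"{path}.part{i // step}")
                part.write_text("".join(lines[i:i + step]))
                yield part

        def run_chunk(chunk):
            # 'pipeline' is a placeholder for the real PALEOMIX command.
            subprocess.run(["pipeline", str(chunk), "-o", f"{chunk}.out"],
                           check=True)
            return f"{chunk}.out"

        def scatter_gather(path, n_chunks=8, merged="merged.out"):
            with concurrent.futures.ProcessPoolExecutor() as pool:
                outputs = list(pool.map(run_chunk, split(path, n_chunks)))
            with open(merged, "w") as out:
                for part in outputs:
                    out.write(pathlib.Path(part).read_text())

        # scatter_gather("samples.fastq")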

  20. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data.

    PubMed

    Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin

    2011-03-16

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.

  1. The “NetBoard”: Network Monitoring Tools Integration for INFN Tier-1 Data Center

    NASA Astrophysics Data System (ADS)

    De Girolamo, D.; dell'Agnello and, L.; Zani, S.

    2012-12-01

    The monitoring and alert system is fundamental for the management and operation of the network in a large data center such as an LHC Tier-1. The network of the INFN Tier-1 at CNAF is a multi-vendor environment: for its management and monitoring several tools have been adopted and different sensors have been developed. In this paper, after an overview of the different aspects to be monitored and the tools used for them (i.e. MRTG, Nagios, Arpwatch, NetFlow, Syslog, etc.), we describe the "NetBoard", a monitoring toolkit developed at the INFN Tier-1. NetBoard, developed for a multi-vendor network, is able to install and auto-configure all the tools needed for its monitoring, via a network-device discovery mechanism, a configuration file, or a wizard. In this way, we are also able to activate different types of sensors and Nagios checks according to the equipment vendor's specifications. Moreover, when a new device is connected to the LAN, NetBoard can detect where it is plugged in. Finally, the NetBoard web interface shows the overall status of the entire network "at a glance": both local and geographical (including the LHCOPN and the LHCONE) link utilization, the health status of network devices (with active alerts), and flow analysis.

  2. 77 FR 46109 - Notice of Filing of Plats of Survey; Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-02

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT926000- L19100000-BJ0000-LRCS42800800] Notice of Filing of Plats of Survey; Montana AGENCY: Bureau of Land Management, Interior. [[Page 46110

  3. The Cheetah Data Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kunz, P.F.; Word, G.B.

    1991-03-01

    Cheetah is a data management system based on the C programming language. The premise of Cheetah is that the 'banks' of FORTRAN-based systems should be 'structures' as defined by the C language. Cheetah is a system to manage these structures, while preserving the use of the C language in its native form. For C structures managed by Cheetah, the user can use Cheetah utilities such as reading and writing, in a machine-independent form, both binary and text files to disk or over a network. Files written by Cheetah also contain a dictionary describing in detail the data contained in the file. Such information is intended to be used by interactive programs for presenting the contents of the file. Cheetah has been ported to many different operating systems with no operating-system-dependent switches.
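
    The machine-independent binary I/O plus self-describing header that Cheetah provides for C structures can be mimicked in a few lines with Python's struct module; this is a loose analogy under invented names, not Cheetah's actual file layout.

        import struct

        # A C struct {int run; double energy;} written in network byte
        # order, so any machine can read it back.
        RECORD = struct.Struct("!id")

        def write_records(path, records):
            with open(path, "wb") as fh:
                fh.write(b"CHEETAH-LIKE v0\n")   # a tiny 'dictionary' header
                for run, energy in records:
                    fh.write(RECORD.pack(run, energy))

        def read_records(path):
            with open(path, "rb") as fh:
                fh.readline()                    # skip the header line
                data = fh.read()
            return [RECORD.unpack_from(data, off)
                    for off in range(0, len(data), RECORD.size)]

        write_records("runs.bin", [(1, 13.6), (2, 27.2)])
        print(read_records("runs.bin"))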

  4. DSSTOX MASTER STRUCTURE-INDEX FILE: SDF FILE AND DOCUMENTATION

    EPA Science Inventory

    The DSSTox Master Structure-Index File serves to consolidate, manage, and ensure quality and uniformity of the chemical and substance information spanning all DSSTox Structure Data Files, including those in development but not yet published separately on this website.

  5. 5 CFR 1201.4 - General definitions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... documents associated with electronic filings under paragraph (h) of § 1201.14, on the MSPB. (l) Date of... date of electronic submission. (m) Electronic filing (e-filing). Filing and receiving documents in... of Personnel Management reconsideration decisions concerning retirement benefits, and appeals of...

  6. 5 CFR 1201.4 - General definitions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... documents associated with electronic filings under paragraph (h) of § 1201.14, on the MSPB. (l) Date of... date of electronic submission. (m) Electronic filing (e-filing). Filing and receiving documents in... of Personnel Management reconsideration decisions concerning retirement benefits, and appeals of...

  7. KERNELHR: A program for estimating animal home ranges

    USGS Publications Warehouse

    Seaman, D.E.; Griffith, B.; Powell, R.A.

    1998-01-01

    Kernel methods are state of the art for estimating animal home-range area and utilization distribution (UD). The KERNELHR program was developed to provide researchers and managers a tool to implement this extremely flexible set of methods with many variants. KERNELHR runs interactively or from the command line on any personal computer (PC) running DOS. KERNELHR provides output of fixed and adaptive kernel home-range estimates, as well as density values in a format suitable for in-depth statistical and spatial analyses. An additional package of programs creates contour files for plotting in geographic information systems (GIS) and estimates core areas of ranges.
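
    A minimal sketch of a fixed-kernel home-range estimate of the kind KERNELHR reports, using SciPy's Gaussian KDE on toy relocation data and a grid-based 95% utilization-distribution area; the data, grid, and bandwidth choice are invented for illustration.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(1)
        fixes = rng.normal(0, 100, size=(2, 200))   # 200 toy fixes, metres

        kde = gaussian_kde(fixes)                   # fixed-kernel UD estimate
        xs, ys = np.mgrid[-400:400:200j, -400:400:200j]
        dens = kde(np.vstack([xs.ravel(), ys.ravel()]))

        # 95% utilization distribution: density threshold enclosing 95% of mass.
        order = np.sort(dens)[::-1]
        cdf = np.cumsum(order) / order.sum()
        thresh = order[np.searchsorted(cdf, 0.95)]
        cell = (800 / 200) ** 2                     # approximate cell area, m^2
        print("95% home-range area (m^2):", (dens >= thresh).sum() * cell)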

  8. A Simplified Shuttle Payload Thermal Analyzer /SSPTA/ program

    NASA Technical Reports Server (NTRS)

    Bartoszek, J. T.; Huckins, B.; Coyle, M.

    1979-01-01

    A simple thermal analysis program for Space Shuttle payloads has been developed to accommodate the user who requires an easily understood but dependable analytical tool. The thermal analysis program includes several thermal subprograms traditionally employed in spacecraft thermal studies, a data management system for data generated by the subprograms, and a master program to coordinate the data files and thermal subprograms. The language and logic used to run the thermal analysis program are designed for the small user. In addition, analytical and storage techniques which conserve computer time and minimize core requirements are incorporated into the program.

  9. Multiplexer/Demultiplexer Loading Tool (MDMLT)

    NASA Technical Reports Server (NTRS)

    Brewer, Lenox Allen; Hale, Elizabeth; Martella, Robert; Gyorfi, Ryan

    2012-01-01

    The purpose of the MDMLT is to improve the reliability and speed of loading multiplexers/demultiplexers (MDMs) in the Software Development and Integration Laboratory (SDIL) by automating the configuration management (CM) of the loads in the MDMs, automating the loading procedure, and providing the capability to load multiple or all MDMs concurrently. Loading may be accomplished in parallel or for single (remote) MDMs. The MDMLT is a Web-based tool that is capable of loading the entire International Space Station (ISS) MDM configuration in parallel. It is able to load Flight Equivalent Units (FEUs), enhanced, standard, and prototype MDMs, as well as both EEPROM (Electrically Erasable Programmable Read-Only Memory) and SSMMU (Solid State Mass Memory Unit) mass memory. This software has extensive configuration management to track loading history, and the performance improvement means the entire ISS MDM configuration of 49 MDMs can be loaded in approximately 30 minutes, as opposed to the 36 hours previously required using the flight method of S-band uplink. The laptop version recently added to the MDMLT suite allows remote lab loading, with the CM information entered into a common database when it is reconnected to the network. This allows the program to reconfigure the test rigs quickly between shifts, allowing the lab to support a variety of onboard configurations during a single day, based on upcoming or current missions. The MDMLT Computer Software Configuration Item (CSCI) supports a Web-based command and control interface to the user. An interface to the SDIL File Transfer Protocol (FTP) server is supported to import Integrated Flight Loads (IFLs) and Internal Product Release Notes (IPRNs) into the database. An interface to the Monitor and Control System (MCS) is supported to control the power state, and to enable or disable the debug port of the MDMs to be loaded. Two direct interfaces to the MDM are supported: a serial interface (debug port) to receive MDM memory dump data and the calculated checksum, and the Small Computer System Interface (SCSI) to transfer load files to MDMs with hard disks. File transfer from the MDM Loading Tool to EEPROM within the MDM is performed via the MIL-STD-1553 bus, making use of the Real-Time Input/Output Processors (RTIOP) when using the rig-based MDMLT, and via a bus box when using the laptop MDMLT. The bus box is a cost-effective alternative to PC-1553 cards for the laptop. It is noted that this system can be modified and adapted to any avionics laboratory for spacecraft computer loading, ship avionics, or aircraft avionics where multiple configurations and strong configuration management of software/firmware loads are required.

  10. 75 FR 4582 - Filing of Plats of Survey; Nevada

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-28

    ... 4500011812; TAS: 14X1109] Filing of Plats of Survey; Nevada AGENCY: Bureau of Land Management, Interior... local government officials of the filing of Plats of Survey in Nevada. DATES: Effective Dates: Filing is... Survey of the following described lands was officially filed at the Nevada State Office, Reno, Nevada, on...

  11. Why can't I manage my digital images like MP3s? The evolution and intent of multimedia metadata

    NASA Astrophysics Data System (ADS)

    Goodrum, Abby; Howison, James

    2005-01-01

    This paper considers the deceptively simple question: Why can't digital images be managed in the simple and effective manner in which digital music files are managed? We make the case that the answer is different treatments of metadata in different domains with different goals. A central difference between the two formats stems from the fact that digital music metadata lookup services are collaborative and automate the movement from a digital file to the appropriate metadata, while image metadata services do not. To understand why this difference exists we examine the divergent evolution of metadata standards for digital music and digital images and observed that the processes differ in interesting ways according to their intent. Specifically music metadata was developed primarily for personal file management and community resource sharing, while the focus of image metadata has largely been on information retrieval. We argue that lessons from MP3 metadata can assist individuals facing their growing personal image management challenges. Our focus therefore is not on metadata for cultural heritage institutions or the publishing industry, it is limited to the personal libraries growing on our hard-drives. This bottom-up approach to file management combined with p2p distribution radically altered the music landscape. Might such an approach have a similar impact on image publishing? This paper outlines plans for improving the personal management of digital images, doing image metadata and file management the MP3 way, and considers the likelihood of success.

  12. Why can't I manage my digital images like MP3s? The evolution and intent of multimedia metadata

    NASA Astrophysics Data System (ADS)

    Goodrum, Abby; Howison, James

    2004-12-01

    This paper considers the deceptively simple question: Why can't digital images be managed in the simple and effective manner in which digital music files are managed? We make the case that the answer is different treatments of metadata in different domains with different goals. A central difference between the two formats stems from the fact that digital music metadata lookup services are collaborative and automate the movement from a digital file to the appropriate metadata, while image metadata services do not. To understand why this difference exists we examine the divergent evolution of metadata standards for digital music and digital images and observed that the processes differ in interesting ways according to their intent. Specifically music metadata was developed primarily for personal file management and community resource sharing, while the focus of image metadata has largely been on information retrieval. We argue that lessons from MP3 metadata can assist individuals facing their growing personal image management challenges. Our focus therefore is not on metadata for cultural heritage institutions or the publishing industry, it is limited to the personal libraries growing on our hard-drives. This bottom-up approach to file management combined with p2p distribution radically altered the music landscape. Might such an approach have a similar impact on image publishing? This paper outlines plans for improving the personal management of digital images, doing image metadata and file management the MP3 way, and considers the likelihood of success.

  13. DataSync - sharing data via filesystem

    NASA Astrophysics Data System (ADS)

    Ulbricht, Damian; Klump, Jens

    2014-05-01

    Research work is usually a cycle of hypothesizing, collecting data, corroborating the hypothesis, and finally publishing the results. At several points in this sequence it is possible to base one's own work on the work of others. Perhaps candidate physical samples are already listed in the IGSN registry, so there is no need to go on an excursion to acquire them. Perhaps the DataCite catalogue already lists metadata of datasets that meet the constraints of the hypothesis and are open for reappraisal. Working with the measured data to corroborate the hypothesis involves new methods and proven methods, as well as different software tools. A cohort of intermediate data is created that can be shared with colleagues to discuss the research progress and receive a first evaluation. Consequently, the intermediate data should be versioned, to make it easy to get back to the last valid state when you notice you are on the wrong track. Things are different for project managers: they want to know what is currently being done, what has been done, and what the last valid data is if somebody has to continue the work. To make the life of members of small science projects easier, we developed DataSync [1], software for sharing and versioning data. DataSync is designed to synchronize directory trees between different computers of a research team over the internet. The software is a JAVA application that watches a local directory tree for changes, which are replicated as eSciDoc-objects into an eSciDoc infrastructure [2] using the eSciDoc REST API. Modifications to the local filesystem automatically create a new version of an eSciDoc-object inside the eSciDoc infrastructure. This way individual folders can be shared between team members, while project managers can get a general idea of the current status by synchronizing whole project inventories. Additionally, XML metadata from separate files can be managed together with data files inside the eSciDoc-objects. While DataSync's major task is to distribute directory trees, we complement its functionality with the PHP-based application panMetaDocs [3]. panMetaDocs is the successor to panMetaWorks [4] and inherits most of its functionality. Through an internet browser, panMetaDocs provides a web-based overview of the datasets inside the eSciDoc infrastructure. The software allows users to upload further data, to add and edit metadata using the metadata editor, and it disseminates metadata through various channels. In addition, previous versions of a file can be downloaded, and access rights can be defined on files and folders to control the visibility of files for users of both panMetaDocs and DataSync. panMetaDocs serves as a publication agent for datasets and as a registration agent for dataset DOIs. The application stack presented here allows sharing, versioning, and central storage of data from the very beginning of project activities by using the file synchronization service DataSync. The web application panMetaDocs complements the functionality of DataSync by providing a dataset publication agent and other tools to handle administrative tasks on the data. [1] http://github.com/ulbricht/datasync [2] http://github.com/escidoc [3] http://panmetadocs.sf.net [4] http://metaworks.pangaea.de
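
    The watch-and-replicate pattern DataSync implements can be sketched with the Python watchdog package standing in for the JAVA file watcher; the upload target below is a placeholder, not the eSciDoc REST API.

        import time

        import requests
        from watchdog.events import FileSystemEventHandler
        from watchdog.observers import Observer

        SERVER = "https://escidoc.example.org/objects"  # placeholder URL

        class SyncHandler(FileSystemEventHandler):
            def on_modified(self, event):
                # Each local change becomes a new version on the server.
                if not event.is_directory:
                    with open(event.src_path, "rb") as fh:
                        requests.put(f"{SERVER}/{event.src_path}", data=fh)

        observer = Observer()
        observer.schedule(SyncHandler(), "/data/project", recursive=True)
        observer.start()
        try:
            while True:
                time.sleep(1)      # keep watching until interrupted
        except KeyboardInterrupt:
            observer.stop()
        observer.join()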

  14. I-Maculaweb: A Tool to Support Data Reuse in Ophthalmology

    PubMed Central

    Bonetto, Monica; Nicolò, Massimo; Gazzarata, Roberta; Fraccaro, Paolo; Rosa, Raffaella; Musetti, Donatella; Musolino, Maria; Traverso, Carlo E.

    2016-01-01

    This paper presents a Web-based application to collect and manage clinical data and clinical trials together in a single tool. I-maculaweb is a user-friendly Web application designed to manage, share, and analyze clinical data from patients affected by degenerative and vascular diseases of the macula. The innovative scientific and technological element of this project is the integration of individual and population data relevant to degenerative and vascular diseases of the macula. Clinical records can also be extracted for statistical purposes and used in clinical decision support systems. I-maculaweb is based on an existing multilevel and multiscale data management model, which includes general principles suitable for several different clinical domains. The database structure has been specifically built to respect laterality, a key aspect in ophthalmology. Users can add and manage patient records, follow-up visits, treatments, diagnoses, and clinical history. There are two modes of record extraction: one for the patient's own center, in which personal details are shown, and one for statistical purposes, in which anonymized data from all centers are visible. The Web platform allows effective management, sharing, and reuse of information within primary care and clinical research. Clear and precise clinical data will improve understanding of the real-life management of degenerative and vascular diseases of the macula and increase the precision of epidemiologic and statistical data. Furthermore, this Web-based application can easily be employed as an electronic clinical research file in clinical studies. PMID:27170913
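
    Because the abstract highlights laterality as a key design constraint, here is a minimal, hypothetical sketch of what a laterality-aware clinical record might look like; all names are invented for illustration and are not I-maculaweb's actual data model.

    from dataclasses import dataclass
    from datetime import date
    from enum import Enum

    class Eye(Enum):
        OD = "right"  # oculus dexter
        OS = "left"   # oculus sinister

    @dataclass
    class MacularFinding:
        patient_id: str
        eye: Eye          # every finding is tied to exactly one eye
        diagnosis: str
        visit_date: date

    # The same patient on the same day keeps distinct per-eye records:
    left = MacularFinding("P001", Eye.OS, "neovascular AMD", date(2016, 1, 12))
    right = MacularFinding("P001", Eye.OD, "drusen", date(2016, 1, 12))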

  15. PIMS-Universal Payload Information Management

    NASA Technical Reports Server (NTRS)

    Elmore, Ralph; McNair, Ann R. (Technical Monitor)

    2002-01-01

    As the overall manager and integrator of International Space Station (ISS) science payloads and experiments, the Payload Operations Integration Center (POIC) at Marshall Space Flight Center had a critical need for an information management system for the exchange and management of ISS payload files, as well as for coordinating ISS payload-related operational changes. The POIC's information management system has a fundamental requirement to provide secure operational access not only to users physically located at the POIC, but also collaborative access to remote experimenters and International Partners. The Payload Information Management System (PIMS) is a ground-based electronic document configuration management and workflow system built to serve that need. Functionally, PIMS provides the following document management capabilities: 1. File access control, storage, and retrieval from a central repository vault. 2. Collection of supplemental data about files in the vault. 3. File exchange with a PIMS GUI client or any FTP connection. 4. Placement of files into an FTP-accessible dropbox for pickup by interfacing facilities, including files transmitted for spacecraft uplink. 5. Transmission of email messages notifying users of new version availability. 6. Polling of intermediate facility dropboxes for files to be processed automatically by PIMS, as sketched below. 7. An API that allows other POIC applications to access PIMS information. PIMS also provides the following Change Request processing capabilities: 1. The ability to create, view, manipulate, and query information about Operations Change Requests (OCRs). 2. An adaptable workflow for OCR approval, with routing through developers, facility leads, POIC leads, reviewers, and implementers; email messages can be sent to users either to involve them in the workflow or simply to notify them of OCR approval progress. All PIMS document management and OCR workflow controls are coordinated through, and routed to, tasks on individual users' "to do" lists. A user is given a task when it is their turn to perform some action relating to the approval of a document or OCR, and the user's available actions are restricted to the functions of the assigned task. Certain actions, such as review or action implementation by non-PIMS users, can also be coordinated through automated emails.
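
    Capability 6 above (dropbox polling) is the most mechanical item on the list, so a brief sketch may help. The following Python loop is an illustrative assumption of how such polling could work; the directory paths and the notification hook are hypothetical, not PIMS internals.

    import shutil
    import time
    from pathlib import Path

    DROPBOX = Path("/pims/dropbox/incoming")  # hypothetical facility dropbox
    VAULT = Path("/pims/vault")               # hypothetical central repository

    def notify_users(path: Path) -> None:
        """Stand-in for the email notification of new version availability."""
        print(f"New version available: {path.name}")

    def poll_dropbox(interval: float = 30.0) -> None:
        """Periodically sweep the dropbox and ingest any files found there."""
        while True:
            for f in DROPBOX.glob("*"):
                if f.is_file():
                    dest = VAULT / f.name
                    shutil.move(str(f), str(dest))  # ingest into the vault
                    notify_users(dest)
            time.sleep(interval)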

  16. A mass spectrometry proteomics data management platform.

    PubMed

    Sharma, Vagisha; Eng, Jimmy K; Maccoss, Michael J; Riffle, Michael

    2012-09-01

    Mass spectrometry-based proteomics is increasingly being used in biomedical research. These experiments typically generate a large volume of highly complex data, and the volume and complexity are only increasing with time. Many software pipelines exist for analyzing these data, each typically with its own file formats, and as technology improves, these file formats change and new formats are developed. Files produced by these myriad software programs may accumulate on hard disks or tape drives over time, with older files rendered progressively more obsolete and unusable by each successive technical advancement and data format change. Although initiatives exist to standardize the file formats used in proteomics, they do not address the core failings of a file-based data management system: (1) files are typically poorly annotated experimentally, (2) files are "organically" distributed across laboratory file systems in an ad hoc manner, (3) file formats become obsolete, and (4) searching the data and comparing and contrasting results across separate experiments is very inefficient (if possible at all). Here we present a relational database architecture and accompanying web application, dubbed the Mass Spectrometry Data Platform, designed to address the failings of the file-based mass spectrometry data management approach. The database is designed so that the output of disparate software pipelines may be imported into a core set of unified tables, with these core tables extended to support data generated by specific pipelines; a sketch of this core-plus-extension design follows below. Because the data are unified, they may be queried, viewed, and compared across multiple experiments through a common web interface. Mass Spectrometry Data Platform is open source and freely available at http://code.google.com/p/msdapl/.
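
    The core-plus-extension table design is the heart of the architecture described above. The following sketch illustrates the idea with SQLite in Python; the table and column names (including the SEQUEST-style extension) are assumptions for illustration, not MSDaPl's actual schema.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE experiment (            -- core: shared by every pipeline
        id INTEGER PRIMARY KEY,
        description TEXT,
        upload_date TEXT
    );
    CREATE TABLE search_result (         -- core: unified search output
        id INTEGER PRIMARY KEY,
        experiment_id INTEGER REFERENCES experiment(id),
        peptide TEXT,
        charge INTEGER
    );
    CREATE TABLE sequest_result (        -- extension: pipeline-specific scores
        result_id INTEGER PRIMARY KEY REFERENCES search_result(id),
        xcorr REAL,
        delta_cn REAL
    );
    """)
    # Because the core tables are unified, one query spans all pipelines:
    rows = conn.execute(
        "SELECT e.description, s.peptide FROM experiment e "
        "JOIN search_result s ON s.experiment_id = e.id").fetchall()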

  17. Tripal v1.1: a standards-based toolkit for construction of online genetic and genomic databases.

    PubMed

    Sanderson, Lacey-Anne; Ficklin, Stephen P; Cheng, Chun-Huai; Jung, Sook; Feltus, Frank A; Bett, Kirstin E; Main, Dorrie

    2013-01-01

    Tripal is an open-source freely available toolkit for construction of online genomic and genetic databases. It aims to facilitate development of community-driven biological websites by integrating the GMOD Chado database schema with Drupal, a popular website creation and content management software. Tripal provides a suite of tools for interaction with a Chado database and display of content therein. The tools are designed to be generic to support the various ways in which data may be stored in Chado. Previous releases of Tripal have supported organisms, genomic libraries, biological stocks, stock collections and genomic features, their alignments and annotations. Also, Tripal and its extension modules provided loaders for commonly used file formats such as FASTA, GFF, OBO, GAF, BLAST XML, KEGG heir files and InterProScan XML. Default generic templates were provided for common views of biological data, which could be customized using an open Application Programming Interface to change the way data are displayed. Here, we report additional tools and functionality that are part of release v1.1 of Tripal. These include (i) a new bulk loader that allows a site curator to import data stored in a custom tab delimited format; (ii) full support of every Chado table for Drupal Views (a powerful tool allowing site developers to construct novel displays and search pages); (iii) new modules including 'Feature Map', 'Genetic', 'Publication', 'Project', 'Contact' and the 'Natural Diversity' modules. Tutorials, mailing lists, download and set-up instructions, extension modules and other documentation can be found at the Tripal website located at http://tripal.info. DATABASE URL: http://tripal.info/.
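
    The bulk loader in item (i) above ingests data in a curator-defined tab-delimited format. As a rough illustration of that idea (Tripal itself is written in PHP for Drupal), a Python sketch of parsing such a file might look as follows; the file name and column template are hypothetical.

    import csv

    def load_records(path: str, columns: list[str]) -> list[dict]:
        """Read a tab-delimited file, mapping each row onto the
        curator-defined column template."""
        with open(path, newline="") as fh:
            reader = csv.reader(fh, delimiter="\t")
            return [dict(zip(columns, row)) for row in reader]

    # Hypothetical input file and column template:
    for rec in load_records("features.tsv", ["name", "type", "organism"]):
        # In Tripal, rows like these would be inserted into Chado tables.
        print(rec["name"], rec["type"], rec["organism"])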

  18. Tripal v1.1: a standards-based toolkit for construction of online genetic and genomic databases

    PubMed Central

    Sanderson, Lacey-Anne; Ficklin, Stephen P.; Cheng, Chun-Huai; Jung, Sook; Feltus, Frank A.; Bett, Kirstin E.; Main, Dorrie

    2013-01-01

    Tripal is an open-source freely available toolkit for construction of online genomic and genetic databases. It aims to facilitate development of community-driven biological websites by integrating the GMOD Chado database schema with Drupal, a popular website creation and content management software. Tripal provides a suite of tools for interaction with a Chado database and display of content therein. The tools are designed to be generic to support the various ways in which data may be stored in Chado. Previous releases of Tripal have supported organisms, genomic libraries, biological stocks, stock collections and genomic features, their alignments and annotations. Also, Tripal and its extension modules provided loaders for commonly used file formats such as FASTA, GFF, OBO, GAF, BLAST XML, KEGG heir files and InterProScan XML. Default generic templates were provided for common views of biological data, which could be customized using an open Application Programming Interface to change the way data are displayed. Here, we report additional tools and functionality that are part of release v1.1 of Tripal. These include (i) a new bulk loader that allows a site curator to import data stored in a custom tab delimited format; (ii) full support of every Chado table for Drupal Views (a powerful tool allowing site developers to construct novel displays and search pages); (iii) new modules including ‘Feature Map’, ‘Genetic’, ‘Publication’, ‘Project’, ‘Contact’ and the ‘Natural Diversity’ modules. Tutorials, mailing lists, download and set-up instructions, extension modules and other documentation can be found at the Tripal website located at http://tripal.info. Database URL: http://tripal.info/ PMID:24163125

  19. 77 FR 48513 - Water Asset Management, Inc.; Notice of Preliminary Permit Application Accepted for Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-14

    ... Humboldt River near the town of Paradise Valley, Humboldt County, Nevada. The project would affect federal... filing, documents may also be paper-filed. To paper-file, mail an original and seven copies to: Kimberly...

  20. 77 FR 48514 - Water Asset Management, Inc.; Notice of Preliminary Permit Application Accepted for Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-14

    ... Humboldt River near the town of Paradise Valley, Humboldt County, Nevada. The project would affect... filing, documents may also be paper-filed. To paper-file, mail an original and seven copies to: Kimberly...

  1. Using Robinhood to Manage Files on Peregrine | High-Performance Computing |

    Science.gov Websites

    $ module load robinhood. Usage examples: to find all the files on /projects that you own, ... scratch.conf -u $USER /scratch. File purging on /scratch uses last access time to determine which files are purged.

  2. 41 CFR 105-54.203-2 - Active charters file.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false Active charters file... Administration 54-ADVISORY COMMITTEE MANAGEMENT 54.2-Establishment of Advisory Committees § 105-54.203-2 Active... active charters. ...

  3. 41 CFR 105-54.203-2 - Active charters file.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 41 Public Contracts and Property Management 3 2014-01-01 2014-01-01 false Active charters file... Administration 54-ADVISORY COMMITTEE MANAGEMENT 54.2-Establishment of Advisory Committees § 105-54.203-2 Active... active charters. ...

  4. 41 CFR 105-54.203-2 - Active charters file.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 41 Public Contracts and Property Management 3 2013-07-01 2013-07-01 false Active charters file... Administration 54-ADVISORY COMMITTEE MANAGEMENT 54.2-Establishment of Advisory Committees § 105-54.203-2 Active... active charters. ...

  5. 41 CFR 105-54.203-2 - Active charters file.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Active charters file... Administration 54-ADVISORY COMMITTEE MANAGEMENT 54.2-Establishment of Advisory Committees § 105-54.203-2 Active... active charters. ...

  6. A cascading failure analysis tool for post processing TRANSCARE simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    This is a MATLAB-based tool to post-process simulation results from the EPRI software TRANSCARE for massive cascading failure analysis following severe disturbances. The tool provides several key modules: 1. automatic creation of a contingency list for TRANSCARE simulations, including substation outages above a certain kV threshold and N-k (k = 1, 2, or 3) generator and branch outages; 2. reading and analysis of a CKO file of PCG definitions, an initiating event list, and a CDN file; 3. post-processing of all simulation results saved in a CDN file, with critical event corridor analysis; 4. a summary of TRANSCARE simulations; 5. identification of the most frequently occurring event corridors in the system, as sketched below; and 6. ranking of contingencies by a user-defined security index that quantifies consequences in terms of total load loss, total number of cascades, etc.
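
    Module 5 above, identifying the most frequently occurring event corridors, amounts to counting recurring event sequences across simulated cascades. A small Python sketch of that counting step follows; the tool itself is MATLAB-based, and the data shapes and the pairwise definition of a corridor here are assumptions for illustration.

    from collections import Counter

    # Hypothetical data: each simulated cascade is an ordered list of outages.
    cascades = [
        ["line_12", "line_47", "gen_3"],
        ["line_12", "line_47", "line_88"],
        ["line_47", "gen_3"],
    ]

    # Treat each pair of consecutive events as a corridor and count recurrences.
    corridors = Counter()
    for cascade in cascades:
        corridors.update(zip(cascade, cascade[1:]))

    for (a, b), count in corridors.most_common(3):
        print(f"{a} -> {b}: seen in {count} cascades")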

  7. 75 FR 51507 - WisdomTree Asset Management, Inc., and WisdomTree Trust; Notice of Application

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-20

    ...] WisdomTree Asset Management, Inc., and WisdomTree Trust; Notice of Application August 13, 2010. AGENCY... subadvisory agreements without shareholder approval. Applicants: WisdomTree Asset Management, Inc (``WTAM'' or ``Adviser'') and WisdomTree Trust (``Trust''). Filing Dates: The application was filed on December 23, 2009...

  8. 76 FR 15994 - Notice of Filing of plats of survey, New Mexico

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-22

    ... of plats of survey, New Mexico AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing... in the New Mexico State Office, Bureau of Land Management, Santa Fe, New Mexico, thirty (30) calendar... for inspection in the New Mexico State Office, Bureau of Land Management, 301 Dinosaur Trail, Santa Fe...

  9. 76 FR 4372 - Notice of Filing of Plats of Survey, New Mexico

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-25

    ... of Plats of Survey, New Mexico AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing... in the New Mexico State Office, Bureau of Land Management, Santa Fe, New Mexico, thirty (30) calendar... for inspection in the New Mexico State Office, Bureau of Land Management, 301 Dinosaur Trail, Santa Fe...

  10. 75 FR 29577 - Notice of Filing of Plats of Survey, New Mexico

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-26

    ... of Plats of Survey, New Mexico AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing... in the New Mexico State Office, Bureau of Land Management (BLM), Santa Fe, New Mexico, thirty (30... available for inspection in the New Mexico State Office, Bureau of Land Management, 301 Dinosaur Trail...

  11. 75 FR 25234 - EquiPower Resources Management, LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-07

    ..., please e-mail [email protected] or call (866) 208-3676 (toll free). For TTY, call (202) 502... Resources Management, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for... proceeding of EquiPower Resources Management, LLC's application for market-based rate authority, with an...

  12. 78 FR 26091 - Notice of Applications for Deregistration Under the Investment Company Act of 1940

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-03

    ... FURTHER INFORMATION CONTACT: Diane L. Titus at (202) 551-6810, SEC, Division of Investment Management... application was filed on March 22, 2013. Applicant's Address: c/o Morgan Stanley Investment Management Inc... Investment Management Inc., 522 Fifth Ave., New York, NY 10036. Morgan Stanley International Fund [File No...

  13. 78 FR 61407 - Notice of Applications for Deregistration Under Section 8(f) of the Investment Company Act of 1940

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-03

    ...) 551-6810, SEC, Division of Investment Management, Exemptive Applications Office, 100 F Street NE..., Columbia Management Investment Advisers, LLC. Filing Dates: The application was filed on March 8, 2013, and... with the reorganization were paid by applicants and Columbia Management Investment Advisers, LLC...

  14. 76 FR 32245 - Notice of Applications for Deregistration Under Section 8(f) of the Investment Company Act of 1940

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-03

    ... Investment Management, Office of Investment Company Regulation, 100 F Street, NE., Washington, DC 20549-8010... Investors Fund Management LLC, applicant's investment adviser. Filing Date: The application was filed on..., New York 10105. For the Commission, by the Division of Investment Management, pursuant to delegated...

  15. 36 CFR § 1222.20 - How are personal files defined and managed?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 36 Parks, Forests, and Public Property 3 2013-07-01 2012-07-01 true How are personal files defined and managed? § 1222.20 Section § 1222.20 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT CREATION AND MAINTENANCE OF FEDERAL RECORDS Identifying Federal...

  16. 41 CFR 105-70.036 - Post-hearing briefs.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Post-hearing briefs. 105-70.036 Section 105-70.036 Public Contracts and Property Management Federal Property Management.... The ALJ may require the parties to file post-hearing briefs. In any event, any party may file a post...

  17. 78 FR 20174 - Proposed Collection; Comment Request for Information Collection Tools

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-03

    ... Form 8879-EO, IRS e-file Signature Authorization for an Exempt Organization. DATES: Written comments... Annual Burden Hours: 140,300. (5) Title: IRS e-file Signature Authorization for an Exempt Organization...

  18. a Hadoop-Based Distributed Framework for Efficient Managing and Processing Big Remote Sensing Images

    NASA Astrophysics Data System (ADS)

    Wang, C.; Hu, F.; Hu, X.; Zhao, S.; Wen, W.; Yang, C.

    2015-07-01

    Various sensors on airborne and satellite platforms produce large volumes of remote sensing images for mapping, environmental monitoring, disaster management, military intelligence, and other applications. However, it is challenging to efficiently store, query, and process such big data because of its data- and computing-intensive nature. In this paper, a Hadoop-based framework is proposed to manage and process big remote sensing data in a distributed and parallel manner. In particular, remote sensing data can be fetched directly from other data platforms into the Hadoop Distributed File System (HDFS). The Orfeo toolbox, a ready-to-use tool for large image processing, is integrated into MapReduce to provide a rich set of image processing operations. With the integration of HDFS, the Orfeo toolbox, and MapReduce, remote sensing images can be processed in parallel in a scalable computing environment. Experimental results show that the proposed framework can efficiently manage and process such big remote sensing data.
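
    The integration pattern described above, invoking image processing inside MapReduce tasks over data in HDFS, can be illustrated with a Hadoop Streaming-style mapper. The sketch below is an assumption for illustration, not the paper's implementation; the Orfeo command shown is a stand-in for whatever OTB application a job would run.

    #!/usr/bin/env python3
    import subprocess
    import sys

    # Hadoop Streaming mapper: one HDFS tile path per input line on stdin.
    for line in sys.stdin:
        tile_path = line.strip()
        if not tile_path:
            continue
        # Stand-in Orfeo Toolbox call; a real job would run whatever OTB
        # application the processing chain requires.
        result = subprocess.run(
            ["otbcli_ReadImageInfo", "-in", tile_path],
            capture_output=True, text=True)
        status = "ok" if result.returncode == 0 else "error"
        # Emit key<TAB>value pairs, the Hadoop Streaming output contract.
        print(f"{tile_path}\t{status}")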

  19. Integrating risk management data in quality improvement initiatives within an academic neurosurgery department.

    PubMed

    McLaughlin, Nancy; Garrett, Matthew C; Emami, Leila; Foss, Sarah K; Klohn, Johanna L; Martin, Neil A

    2016-01-01

    OBJECT While malpractice litigation has had many negative impacts on health care delivery systems, information extracted from lawsuits could point toward avenues for improving care. The authors present a comprehensive review of lawsuits within a tertiary academic neurosurgical department and report institutional and departmental strategies to mitigate liability by integrating risk management data with quality improvement initiatives. METHODS The Comprehensive Risk Intelligence Tool database was interrogated to extract abstracts of claims and suits concerning neurosurgical cases closed from January 2008 to December 2012. Variables included demographics of the claimant, type of procedure performed (if any), claim description, insured information, case outcome, clinical summary, contributing factors and subfactors, amount incurred for indemnity and expenses, and independent expert opinion on whether the standard of care was met. RESULTS During the study period, the Department of Neurosurgery received the most lawsuits of all surgical specialties (30 of 172), with a total incurred payment of $4,949,867. Of these lawsuits, 21 involved spinal pathologies and 9 cranial pathologies. The largest group of suits came from patients with challenging medical conditions who underwent uneventful surgeries and postoperative courses but filed lawsuits when they did not see the benefits for which they were hoping; 85% of these claims were withdrawn by the plaintiffs. The most commonly cited contributing factors included clinical judgment (20 of 30), technical skill (19 of 30), and communication (6 of 30). CONCLUSIONS While all medical and surgical subspecialties must deal with malpractice and liability, neurosurgery is the most affected, both in the number of suits filed and in the monetary amounts awarded. To use these suits as learning tools for faculty and residents and to minimize the associated costs, quality initiatives addressing the most frequent contributing factors should be incorporated into care redesign strategies, enabling strategic alignment of quality improvement and risk management efforts.

  20. doGlycans-Tools for Preparing Carbohydrate Structures for Atomistic Simulations of Glycoproteins, Glycolipids, and Carbohydrate Polymers for GROMACS.

    PubMed

    Danne, Reinis; Poojari, Chetan; Martinez-Seara, Hector; Rissanen, Sami; Lolicato, Fabio; Róg, Tomasz; Vattulainen, Ilpo

    2017-10-23

    Carbohydrates constitute a structurally and functionally diverse group of biological molecules and macromolecules. In cells they are involved in, e.g., energy storage, signaling, and cell-cell recognition. All of these phenomena take place in atomistic scales, thus atomistic simulation would be the method of choice to explore how carbohydrates function. However, the progress in the field is limited by the lack of appropriate tools for preparing carbohydrate structures and related topology files for the simulation models. Here we present tools that fill this gap. Applications where the tools discussed in this paper are particularly useful include, among others, the preparation of structures for glycolipids, nanocellulose, and glycans linked to glycoproteins. The molecular structures and simulation files generated by the tools are compatible with GROMACS.
