Sample records for capture file capability

  1. 4DCAPTURE: a general purpose software package for capturing and analyzing two- and three-dimensional motion data acquired from video sequences

    NASA Astrophysics Data System (ADS)

    Walton, James S.; Hodgson, Peter; Hallamasek, Karen; Palmer, Jake

    2003-07-01

    4DVideo is creating a general purpose capability for capturing and analyzing kinematic data from video sequences in near real-time. The core element of this capability is a software package designed for the PC platform. The software ("4DCapture") is designed to capture and manipulate customized AVI files that can contain a variety of synchronized data streams -- including audio, video, centroid locations -- and signals acquired from more traditional sources (such as accelerometers and strain gauges). The code includes simultaneous capture or playback of multiple video streams, and linear editing of the images (together with the ancillary data embedded in the files). Corresponding landmarks seen from two or more views are matched automatically, and photogrammetric algorithms permit multiple landmarks to be tracked in two and three dimensions -- with or without lens calibrations. Trajectory data can be processed within the main application or they can be exported to a spreadsheet where they can be processed or passed along to a more sophisticated, stand-alone data analysis application. Previous attempts to develop such applications for high-speed imaging have been limited in their scope, or by the complexity of the application itself. 4DVideo has devised a friendly ("FlowStack") user interface that assists the end-user to capture and treat image sequences in a natural progression. 4DCapture employs the AVI 2.0 standard and DirectX technology, which effectively eliminates the file size limitations found in older applications. In early tests, 4DVideo has streamed three RS-170 video sources to disk for more than an hour without loss of data. At this time, the software can acquire video sequences in three ways: (1) directly, from up to three hard-wired cameras supplying RS-170 (monochrome) signals; (2) directly, from a single camera or video recorder supplying an NTSC (color) signal; and (3) by importing existing video streams in the AVI 1.0 or AVI 2.0 formats. The latter is particularly useful for high-speed applications where the raw images are often captured and stored by the camera before being downloaded. Provision has been made to synchronize data acquired from any combination of these video sources using audio and visual "tags." Additional "front-ends," designed for digital cameras, are anticipated.

  2. jmzTab: a java interface to the mzTab data standard.

    PubMed

    Xu, Qing-Wei; Griss, Johannes; Wang, Rui; Jones, Andrew R; Hermjakob, Henning; Vizcaíno, Juan Antonio

    2014-06-01

    mzTab is the most recent standard format developed by the Proteomics Standards Initiative. mzTab is a flexible tab-delimited file that can capture identification and quantification results coming from MS-based proteomics and metabolomics approaches. We here present an open-source Java application programming interface for mzTab called jmzTab. The software allows the efficient processing of mzTab files, providing read and write capabilities, and is designed to be embedded in other software packages. The second key feature of the jmzTab model is that it provides a flexible framework to maintain the logical integrity between the metadata and the table-based sections in the mzTab files. In this article, as two example implementations, we also describe two stand-alone tools that can be used to validate mzTab files and to convert PRIDE XML files to mzTab. The library is freely available at http://mztab.googlecode.com. © 2014 The Authors PROTEOMICS Published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. A preliminary architecture for building communication software from traffic captures

    NASA Astrophysics Data System (ADS)

    Acosta, Jaime C.; Estrada, Pedro

    2017-05-01

    Security analysts are tasked with identifying and mitigating network service vulnerabilities. A common problem associated with in-depth testing of network protocols is the availability of software that communicates across disparate protocols. Many times, the software required to communicate with these services is not publicly available. Developing this software is a time-consuming undertaking that requires expertise and understanding of the protocol specification. The work described in this paper aims at developing a software package that is capable of automatically creating communication clients by using packet capture (pcap) and TShark dissectors. Currently, our focus is on simple protocols with fixed fields. The methodologies developed as part of this work will extend to other, more complex protocols such as the Gateway Load Balancing Protocol (GLBP), Port Aggregation Protocol (PAgP), and Open Shortest Path First (OSPF). Thus far, we have architected a modular pipeline for an automatic traffic-based software generator. We start the transformation of captured network traffic by employing TShark to convert packets into a Packet Details Markup Language (PDML) file. The PDML file contains a parsed, textual representation of the packet data. Then, we extract field data and types, along with inter- and intra-packet dependencies. This information is then utilized to construct an XML file that encompasses the protocol state machine and field vocabulary. Finally, this XML is converted into executable code. Using our methodology, and as a starting point, we have succeeded in automatically generating software that communicates with other hosts using an automatically generated Internet Control Message Protocol (ICMP) client program.
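
    A minimal sketch of the first two stages described above -- converting a capture to PDML with TShark and pulling per-field names, sizes, and raw values back out -- is shown below. It assumes tshark is installed and on the PATH; the file names and the choice of the ICMP layer are placeholders, not details from the paper.

```python
# Sketch of the first stages of the pipeline described above: convert a pcap
# to PDML with TShark, then pull out per-packet field names and raw values.
# Assumes tshark is installed and on PATH; file names are placeholders.
import subprocess
import xml.etree.ElementTree as ET

def pcap_to_pdml(pcap_path, pdml_path):
    """Run TShark to produce a PDML (XML) dissection of every packet."""
    with open(pdml_path, "w") as out:
        subprocess.run(["tshark", "-r", pcap_path, "-T", "pdml"],
                       stdout=out, check=True)

def extract_fields(pdml_path, protocol="icmp"):
    """Collect (field name, size, raw value) tuples for one protocol layer."""
    records = []
    root = ET.parse(pdml_path).getroot()
    for packet in root.iter("packet"):
        for proto in packet.iter("proto"):
            if proto.get("name") != protocol:
                continue
            fields = [(f.get("name"), f.get("size"), f.get("value"))
                      for f in proto.iter("field")]
            records.append(fields)
    return records

if __name__ == "__main__":
    pcap_to_pdml("session.pcap", "session.pdml")   # hypothetical capture file
    for pkt in extract_fields("session.pdml")[:3]:
        print(pkt)
```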

  4. Designing for Peta-Scale in the LSST Database

    NASA Astrophysics Data System (ADS)

    Kantor, J.; Axelrod, T.; Becla, J.; Cook, K.; Nikolaev, S.; Gray, J.; Plante, R.; Nieto-Santisteban, M.; Szalay, A.; Thakar, A.

    2007-10-01

    The Large Synoptic Survey Telescope (LSST), a proposed ground-based 8.4 m telescope with a 10 deg^2 field of view, will generate 15 TB of raw images every observing night. When calibration and processed data are added, the image archive, catalogs, and meta-data will grow 15 PB yr^{-1} on average. The LSST Data Management System (DMS) must capture, process, store, index, replicate, and provide open access to this data. Alerts must be triggered within 30 s of data acquisition. To do this in real-time at these data volumes will require advances in data management, database, and file system techniques. This paper describes the design of the LSST DMS and emphasizes features for peta-scale data. The LSST DMS will employ a combination of distributed database and file systems, with schema, partitioning, and indexing oriented for parallel operations. Image files are stored in a distributed file system with references to, and meta-data from, each file stored in the databases. The schema design supports pipeline processing, rapid ingest, and efficient query. Vertical partitioning reduces disk input/output requirements, while horizontal partitioning allows parallel data access using arrays of servers and disks. Indexing is extensive, utilizing both conventional RAM-resident indexes and column-narrow, row-deep tag tables/covering indices that are extracted from tables that contain many more attributes. The DMS Data Access Framework is encapsulated in a middleware framework to provide a uniform service interface to all framework capabilities. This framework will provide the automated work-flow, replication, and data analysis capabilities necessary to make data processing and data quality analysis feasible at this scale.

  5. Sandbox for Mac Malware v 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walkup, Elizabeth

    This software is an analyzer for automated sandbox analysis of malware on the OS X operating system. It runs inside an OS X virtual machine to collect data about what happens when a given file is opened or run. As of August 2014, there was no sandbox software for Mac OS X malware, as it requires different methods from those used on the Windows OS (which most sandboxes are written for). This software adds OS X analysis capabilities to an existing open-source sandbox, Cuckoo Sandbox (http://cuckoosandbox.org/), which previously only worked for Windows. The analyzer itself can take many different types of files as input: the traditional Mach-O and FAT executables, .app files, zip files, Python scripts, Java archives, and web pages, as well as PDFs and other documents. While the file is running, the analyzer also simulates rudimentary human interaction with clicks and mouse movements in order to bypass the tests some malware use to see if they are being analyzed. The analyzer outputs several different kinds of data: function call traces, network captures, screenshots, and all created and modified files. This work also includes a static analysis Cuckoo module for Mach-O binary files. It extracts file structures, code library imports and exports, and signatures. This data can be used along with the analyzer results to create signatures for malware.
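
    As an illustration of the kind of static triage the Mach-O module performs, the sketch below classifies a submitted sample by its Mach-O/FAT magic number before any deeper analysis. This is not the Cuckoo module itself; the input path is a placeholder.

```python
# Minimal sketch of static triage: classify a submitted sample by its
# Mach-O / FAT magic number before deeper analysis.
# Not the Cuckoo module itself; the path below is a placeholder.
import struct

MAGICS = {
    0xFEEDFACE: "Mach-O 32-bit",
    0xFEEDFACF: "Mach-O 64-bit",
    0xCEFAEDFE: "Mach-O 32-bit (byte-swapped)",
    0xCFFAEDFE: "Mach-O 64-bit (byte-swapped)",
    0xCAFEBABE: "FAT (universal) binary",   # note: also the Java class-file magic
    0xBEBAFECA: "FAT binary (byte-swapped)",
}

def identify(path):
    with open(path, "rb") as fh:
        magic = struct.unpack(">I", fh.read(4))[0]   # read first 4 bytes big-endian
    return MAGICS.get(magic, "not a Mach-O/FAT file")

if __name__ == "__main__":
    print(identify("sample.bin"))   # hypothetical input file
```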

  6. The jmzQuantML programming interface and validator for the mzQuantML data standard.

    PubMed

    Qi, Da; Krishna, Ritesh; Jones, Andrew R

    2014-03-01

    The mzQuantML standard from the HUPO Proteomics Standards Initiative has recently been released, capturing quantitative data about peptides and proteins, following analysis of MS data. We present a Java application programming interface (API) for mzQuantML called jmzQuantML. The API provides robust bridges between Java classes and elements in mzQuantML files and allows random access to any part of the file. The API provides read and write capabilities, and is designed to be embedded in other software packages, enabling mzQuantML support to be added to proteomics software tools (http://code.google.com/p/jmzquantml/). The mzQuantML standard is designed around a multilevel validation system to ensure that files are structurally and semantically correct for different proteomics quantitative techniques. In this article, we also describe a Java software tool (http://code.google.com/p/mzquantml-validator/) for validating mzQuantML files, which is a formal part of the data standard. © 2014 The Authors. Proteomics published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Oracle Applications Patch Administration Tool (PAT) Beta Version

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2002-01-04

    PAT is a Patch Administration Tool that provides analysis, tracking, and management of Oracle Application patches. Its capabilities are outlined below:
    Patch Analysis & Management Tool -- outline of capabilities:
      Administration
        Patch Data Maintenance -- track which Oracle Application patches have been applied to which database instance and machine
      Patch Analysis
        Capture text files (readme.txt and driver files)
        Form comparison detail
        Report comparison detail
        PL/SQL package comparison detail
        SQL scripts detail
        JSP module comparison detail
        Parse and load the current applptch.txt (10.7) or load patch data from Oracle Application database patch tables (11i)
      Display Analysis -- compare the patch to be applied with the currently installed Oracle Application Appl_top code versions
        Patch detail
        Module comparison detail
        Analyze and display one Oracle Application module patch
      Patch Management -- automatic queue and execution of patches
        Administration
          Parameter maintenance -- settings for the directory structure of the Oracle Application appl_top
          Validation data maintenance -- machine names and instances to patch
        Operation
          Patch Data Maintenance
          Schedule a patch (queue for later execution)
          Run a patch (queue for immediate execution)
          Review the patch logs
        Patch Management Reports

  8. Effectiveness of source documents for identifying fatal occupational injuries: a synthesis of studies.

    PubMed

    Stout, N; Bell, C

    1991-06-01

    The complete and accurate identification of fatal occupational injuries among the US work force is an important first step in developing work injury prevention efforts. Numerous sources of information, such as death certificates, Workers' Compensation files, Occupational Safety and Health Administration (OSHA) files, medical examiner records, state health and labor department reports, and various combinations of these, have been used to identify cases of work-related fatal injuries. Recent studies have questioned the effectiveness of these sources for identifying such cases. At least 10 studies have used multiple sources to define the universe of fatal work injuries within a state and to determine the capture rates, or proportion of the universe identified, by each source. Results of these studies, which are not all available in published literature, are summarized here in a format that allows researchers to readily compare the ascertainment capabilities of the sources. The overall average capture rates of sources were as follows: death certificates, 81%; medical examiner records, 61%; Workers' Compensation reports, 57%; and OSHA reports, 32%. Variations by state and value added through the use of multiple sources are presented and discussed. This meta-analysis of 10 state-based studies summarizes the effectiveness of various source documents for capturing cases of fatal occupational injuries to help researchers make informed decisions when designing occupational injury surveillance systems.

  9. Effectiveness of source documents for identifying fatal occupational injuries: a synthesis of studies.

    PubMed Central

    Stout, N; Bell, C

    1991-01-01

    BACKGROUND: The complete and accurate identification of fatal occupational injuries among the US work force is an important first step in developing work injury prevention efforts. Numerous sources of information, such as death certificates, Workers' Compensation files, Occupational Safety and Health Administration (OSHA) files, medical examiner records, state health and labor department reports, and various combinations of these, have been used to identify cases of work-related fatal injuries. Recent studies have questioned the effectiveness of these sources for identifying such cases. METHODS: At least 10 studies have used multiple sources to define the universe of fatal work injuries within a state and to determine the capture rates, or proportion of the universe identified, by each source. Results of these studies, which are not all available in published literature, are summarized here in a format that allows researchers to readily compare the ascertainment capabilities of the sources. RESULTS: The overall average capture rates of sources were as follows: death certificates, 81%; medical examiner records, 61%; Workers' Compensation reports, 57%; and OSHA reports, 32%. Variations by state and value added through the use of multiple sources are presented and discussed. CONCLUSIONS: This meta-analysis of 10 state-based studies summarizes the effectiveness of various source documents for capturing cases of fatal occupational injuries to help researchers make informed decisions when designing occupational injury surveillance systems. PMID:1827569

  10. High throughput imaging cytometer with acoustic focussing

    PubMed Central

    Zmijan, Robert; Jonnalagadda, Umesh S.; Carugo, Dario; Kochi, Yu; Lemm, Elizabeth; Packham, Graham; Hill, Martyn

    2015-01-01

    We demonstrate an imaging flow cytometer that uses acoustic levitation to assemble cells and other particles into a sheet structure. This technique enables a high-resolution, low-noise CMOS camera to capture images of thousands of cells with each frame. While ultrasonic focussing has previously been demonstrated for 1D cytometry systems, extending the technology to a planar, much higher throughput format and integrating imaging is non-trivial, and represents a significant jump forward in capability, leading to diagnostic possibilities not achievable with current systems. A galvo mirror is used to track the images of the moving cells, permitting exposure times of 10 ms at frame rates of 50 fps with motion blur of only a few pixels. At 80 fps, we demonstrate a throughput of 208 000 beads per second. We investigate the factors affecting motion blur and throughput, and demonstrate the system with fluorescent beads, leukaemia cells and a chondrocyte cell line. Cells require more time to reach the acoustic focus than beads, resulting in lower throughputs; however, a longer device would remove this constraint. PMID:29456838

  11. Low-Speed Fingerprint Image Capture System User's Guide, June 1, 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitus, B.R.; Goddard, J.S.; Jatko, W.B.

    1993-06-01

    The Low-Speed Fingerprint Image Capture System (LS-FICS) uses a Sun workstation controlling a Lenzar ElectroOptics Opacity 1000 imaging system to digitize fingerprint card images to support the Federal Bureau of Investigation's (FBI's) Automated Fingerprint Identification System (AFIS) program. The system also supports the operations performed by the Oak Ridge National Laboratory- (ORNL-) developed Image Transmission Network (ITN) prototype card scanning system. The input to the system is a single FBI fingerprint card of the agreed-upon standard format and a user-specified identification number. The output is a file formatted to be compatible with the National Institute of Standards and Technology (NIST) draft standard for fingerprint data exchange dated June 10, 1992. These NIST compatible files contain the required print and text images. The LS-FICS is designed to provide the FBI with the capability of scanning fingerprint cards into a digital format. The FBI will replicate the system to generate a data base of test images. The Host Workstation contains the image data paths and the compression algorithm. A local area network interface, disk storage, and tape drive are used for the image storage and retrieval, and the Lenzar Opacity 1000 scanner is used to acquire the image. The scanner is capable of resolving 500 pixels/in. in both x and y directions. The print images are maintained in full 8-bit gray scale and compressed with an FBI-approved wavelet-based compression algorithm. The text fields are downsampled to 250 pixels/in. and 2-bit gray scale. The text images are then compressed using a lossless Huffman coding scheme. The text fields retrieved from the output files are easily interpreted when displayed on the screen. Detailed procedures are provided for system calibration and operation. Software tools are provided to verify proper system operation.
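
    A hedged illustration of the text-field processing described above (halving the resolution from 500 to 250 pixels/in. and quantizing to 2-bit gray scale) follows, using Pillow and NumPy as stand-ins; it is not the LS-FICS implementation, and the file names are placeholders.

```python
# Illustrative sketch of the text-field processing described above: halve the
# resolution (500 -> 250 pixels/in.) and quantize to 2-bit (4-level) gray scale.
# Uses Pillow/NumPy as stand-ins; not the original LS-FICS implementation.
import numpy as np
from PIL import Image

def downsample_and_quantize(in_path, out_path):
    img = Image.open(in_path).convert("L")                 # 8-bit gray scale
    img = img.resize((img.width // 2, img.height // 2))    # 500 -> 250 pixels/in.
    pixels = np.asarray(img)
    levels = pixels // 64                                   # 0..255 -> 0..3 (2 bits)
    restored = (levels * 85).astype(np.uint8)               # spread back out for viewing
    Image.fromarray(restored).save(out_path)

if __name__ == "__main__":
    downsample_and_quantize("text_field.png", "text_field_2bit.png")  # placeholder names
```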

  12. A Cost-Effective Method for Crack Detection and Measurement on Concrete Surface

    NASA Astrophysics Data System (ADS)

    Sarker, M. M.; Ali, T. A.; Abdelfatah, A.; Yehia, S.; Elaksher, A.

    2017-11-01

    Crack detection and measurement in the surface of concrete structures is currently carried out manually or through Non-Destructive Testing (NDT) such as imaging or scanning. The recent developments in depth (stereo) cameras have presented an opportunity for cost-effective, reliable crack detection and measurement. This study aimed at evaluating the feasibility of the new inexpensive depth camera (ZED) for crack detection and measurement. This depth camera, with its lightweight and portable nature, produces a 3D data file of the imaged surface. The ZED camera was utilized to image a concrete surface and the 3D file was processed to detect and analyse cracks. This article describes the outcome of the experiment carried out with the ZED camera as well as the processing tools used for crack detection and analysis. Crack properties that were also of interest were length, orientation, and width. The use of the ZED camera allowed for distinction between surface and concrete cracks. The ZED's high-resolution capability and point cloud capture technology helped in generating dense 3D data in low-lighting conditions. The results showed the ability of the ZED camera to capture the crack depth changes between surface (render) cracks and cracks that form in the concrete itself.
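
    The sketch below shows one way the reported crack properties (length, orientation, width) could be computed with OpenCV once crack pixels have been segmented into a binary mask; the segmentation of the ZED depth/image data is assumed to happen upstream and is not part of this sketch.

```python
# Hedged sketch of the crack measurements mentioned above (length, orientation,
# width) computed from an already-segmented binary crack mask with OpenCV.
# Segmentation of the ZED data is assumed to have been done upstream.
import cv2

def crack_properties(mask):
    """mask: uint8 image, 255 on crack pixels, 0 elsewhere."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    results = []
    for c in contours:
        (cx, cy), (w, h), angle = cv2.minAreaRect(c)   # oriented bounding box
        length, width = max(w, h), min(w, h)
        results.append({"length_px": length, "width_px": width,
                        "orientation_deg": angle, "area_px": cv2.contourArea(c)})
    return results

if __name__ == "__main__":
    crack_mask = cv2.imread("crack_mask.png", cv2.IMREAD_GRAYSCALE)  # placeholder file
    for props in crack_properties(crack_mask):
        print(props)
```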

  13. Neutron Capture Energies for Flux Normalization and Approximate Model for Gamma-Smeared Power

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Kang Seog; Clarno, Kevin T.; Liu, Yuxuan

    The Consortium for Advanced Simulation of Light Water Reactors (CASL) Virtual Environment for Reactor Applications (VERA) neutronics simulator MPACT has used a single recoverable fission energy for each fissionable nuclide, assuming that all recoverable energies come only from the fission reaction, for which capture energy is merged with fission energy. This approach includes approximations and requires improvement by separating capture energy from the merged effective recoverable energy. This report documents the procedure to generate recoverable neutron capture energies and the development of a program called CapKappa to generate capture energies. Recoverable neutron capture energies have been generated by using CapKappa with the evaluated nuclear data file (ENDF)/B-7.0 and 7.1 cross section and decay libraries. The new capture kappas were compared to the current SCALE-6.2 and the CASMO-5 capture kappas. These new capture kappas have been incorporated into the Simplified AMPX 51- and 252-group libraries, and they can be used for the AMPX multigroup (MG) libraries and the SCALE code package. The CASL VERA neutronics simulator MPACT does not include a gamma transport capability, which prevents it from explicitly estimating local energy deposition from fission, neutron, and gamma slowing down and capture. Since the mean free path of gamma rays is typically much longer than that of neutrons, and the total gamma energy is about 10% of the total energy, the gamma-smeared power distribution is different from the fission power distribution. Explicit local energy deposition through neutron and gamma transport calculation is significantly important in multi-physics whole-core simulation with thermal-hydraulic feedback. Therefore, the gamma transport capability should be incorporated into the CASL neutronics simulator MPACT. However, this task will be time-consuming in developing the neutron-induced gamma production and gamma cross section libraries. This study investigates an approximate model to estimate gamma-smeared power distribution without performing any gamma transport calculation. A simple approximate gamma smearing model has been investigated based on the facts that pin-wise gamma energy depositions are almost flat over a fuel assembly, and assembly-wise gamma energy deposition is proportional to kappa-fission energy deposition. The approximate gamma smearing model works well for single assembly cases, and can partly improve the gamma-smeared power distribution for the whole-core model. Although the power distributions can be improved by the approximate gamma smearing model, there is still an issue in explicitly obtaining local energy deposition. A new simple approach or a gamma transport/diffusion capability may need to be incorporated into MPACT to estimate local energy deposition for more robust multi-physics simulation.
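
    A small numerical sketch of the approximate smearing model as described -- a fixed gamma fraction of the pin-wise kappa-fission power redistributed flat over each assembly, with the remainder kept local -- is given below. The 10% fraction and the toy power map are illustrative assumptions, not values from the report.

```python
# Numerical sketch of the approximate gamma-smearing model described above:
# take a fixed gamma fraction of the pin-wise kappa-fission power and
# redistribute it uniformly ("flat") over each assembly, keeping the remaining
# fraction local. The 10% fraction and the toy power map are illustrative only.
import numpy as np

def smear_gamma(pin_power, assembly_ids, gamma_fraction=0.10):
    """pin_power: 1D array of kappa-fission power per pin.
    assembly_ids: integer assembly index for each pin."""
    smeared = (1.0 - gamma_fraction) * pin_power
    for a in np.unique(assembly_ids):
        pins = assembly_ids == a
        gamma_energy = gamma_fraction * pin_power[pins].sum()
        smeared[pins] += gamma_energy / pins.sum()   # spread flat over the assembly
    return smeared

if __name__ == "__main__":
    power = np.array([1.2, 1.0, 0.8, 0.9, 1.1, 1.0])   # toy pin powers
    assemblies = np.array([0, 0, 0, 1, 1, 1])
    print(smear_gamma(power, assemblies))
```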

  14. Electronic Photography at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Holm, Jack; Judge, Nancianne

    1995-01-01

    An electronic photography facility has been established in the Imaging & Photographic Technology Section, Visual Imaging Branch, at the NASA Langley Research Center (LaRC). The purpose of this facility is to provide the LaRC community with access to digital imaging technology. In particular, capabilities have been established for image scanning, direct image capture, optimized image processing for storage, image enhancement, and optimized device dependent image processing for output. Unique approaches include: evaluation and extraction of the entire film information content through scanning; standardization of image file tone reproduction characteristics for optimal bit utilization and viewing; education of digital imaging personnel on the effects of sampling and quantization to minimize image processing related information loss; investigation of the use of small kernel optimal filters for image restoration; characterization of a large array of output devices and development of image processing protocols for standardized output. Currently, the laboratory has a large collection of digital image files which contain essentially all the information present on the original films. These files are stored at 8-bits per color, but the initial image processing was done at higher bit depths and/or resolutions so that the full 8-bits are used in the stored files. The tone reproduction of these files has also been optimized so the available levels are distributed according to visual perceptibility. Look up tables are available which modify these files for standardized output on various devices, although color reproduction has been allowed to float to some extent to allow for full utilization of output device gamut.

  15. 77 FR 73988 - Marine Mammals; File No. 17152

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-12

    ... authorized to capture, mark, weigh, and sample (swabs and blood) northern elephant seals (Mirounga angustirostris); and incidentally harass elephant seals during captures and ground monitoring/photo...

  16. Converting from XML to HDF-EOS

    NASA Technical Reports Server (NTRS)

    Ullman, Richard; Bane, Bob; Yang, Jingli

    2008-01-01

    A computer program recreates an HDF-EOS file from an Extensible Markup Language (XML) representation of the contents of that file. This program is one of two programs written to enable testing of the schemas described in the immediately preceding article to determine whether the schemas capture all details of HDF-EOS files.
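
    The program itself targets HDF-EOS, but the general idea can be illustrated with a hedged sketch that reads dataset names, attributes, and values from an XML description and writes them into an HDF5 file with h5py; the XML layout used here is an assumption, not the schema from the article.

```python
# Hedged illustration of the general idea (the real program targets HDF-EOS):
# read dataset names, attributes, and values from an XML description and write
# them into an HDF5 file. The XML layout shown here is an assumption.
import xml.etree.ElementTree as ET
import h5py
import numpy as np

def xml_to_hdf5(xml_path, h5_path):
    root = ET.parse(xml_path).getroot()
    with h5py.File(h5_path, "w") as h5:
        for ds in root.iter("dataset"):                        # <dataset name="...">
            values = np.array([float(v) for v in ds.findtext("values").split()])
            dset = h5.create_dataset(ds.get("name"), data=values)
            for attr in ds.iter("attribute"):                  # <attribute name="..." value="..."/>
                dset.attrs[attr.get("name")] = attr.get("value")

if __name__ == "__main__":
    xml_to_hdf5("granule.xml", "granule.h5")                   # placeholder file names
```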

  17. Co-PylotDB - A Python-Based Single-Window User Interface for Transmitting Information to a Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnette, Daniel W.

    2012-01-05

    Co-PylotDB, written completely in Python, provides a user interface (UI) with which to select user and data file(s), directories, and file content, and provide or capture various other information for sending data collected from running any computer program to a pre-formatted database table for persistent storage. The interface allows the user to select input, output, make, source, executable, and qsub files. It also provides fields for specifying the machine name on which the software was run, capturing compile and execution lines, and listing relevant user comments. Data automatically captured by Co-PylotDB and sent to the database are user, current directory, local hostname, current date, and time of send. The UI provides fields for logging into a local or remote database server, specifying a database and a table, and sending the information to the selected database table. If a server is not available, the UI provides for saving the command that would have saved the information to a database table for either later submission or for sending via email to a collaborator who has access to the desired database.
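
    A sketch of the automatic-capture step described above -- gathering user, current directory, hostname, and timestamp and inserting them into a database table -- is shown below, with sqlite3 standing in for the remote server; the table and column names are assumptions.

```python
# Sketch of the automatic-capture step described above: gather user, current
# directory, hostname, and timestamp, then insert them into a database table.
# sqlite3 stands in for the remote server; table/column names are assumptions.
import getpass
import os
import socket
import sqlite3
from datetime import datetime

def capture_and_send(db_path="runs.db", comment=""):
    record = (
        getpass.getuser(),
        os.getcwd(),
        socket.gethostname(),
        datetime.now().isoformat(timespec="seconds"),
        comment,
    )
    with sqlite3.connect(db_path) as conn:
        conn.execute("""CREATE TABLE IF NOT EXISTS run_data
                        (user TEXT, directory TEXT, host TEXT,
                         sent_at TEXT, comment TEXT)""")
        conn.execute("INSERT INTO run_data VALUES (?, ?, ?, ?, ?)", record)

if __name__ == "__main__":
    capture_and_send(comment="example run")
```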

  18. 77 FR 48130 - Marine Mammals; File No. 17152

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-13

    ... harbor seals are requested over the duration of the permit. Each year, up to 2,500 northern elephant seals (Mirounga angustirostris) will be handled for marking without capture; up to 100 elephant seals will be handled for swab sampling without capture; up to 150 elephant seals will be captured, marked...

  19. An analysis of the low-earth-orbit communications environment

    NASA Astrophysics Data System (ADS)

    Diersing, Robert Joseph

    Advances in microprocessor technology and availability of launch opportunities have caused interest in low-earth-orbit satellite based communications systems to increase dramatically during the past several years. In this research the capabilities of two low-cost, store-and-forward LEO communications satellites operating in the public domain are examined--PACSAT-1 (operated by the Radio Amateur Satellite Corporation) and UoSAT-3 (operated by the University of Surrey, England, Electrical Engineering Department). The file broadcasting and file transfer facilities are examined in detail, and a simulation model of the downlink traffic pattern is developed. The simulator will aid the assessment of changes in design and implementation for other systems. The development of the downlink traffic simulator is based on three major parts. First is a characterization of the low-earth-orbit operating environment along with preliminary measurements of the PACSAT-1 and UoSAT-3 systems, including: satellite visibility constraints on communications, monitoring equipment configuration, link margin computations, determination of block and bit error rates, and establishing typical data capture rates for ground stations using computer-pointed directional antennas and fixed omni-directional antennas. Second, arrival rates for successful and unsuccessful file server connections are established along with transaction service times. Downlink traffic has been further characterized by measuring: frame and byte counts for all data-link layer traffic; 30-second interval average response time for all traffic and for file server traffic only; file server response time on a per-connection basis; and retry rates for information and supervisory frames. Finally, the model is verified by comparison with measurements of actual traffic not previously used in the model building process. The simulator is then used to predict operation of the PACSAT-1 satellite with modifications to the original design.

  20. Computational provenance in hydrologic science: a snow mapping example.

    PubMed

    Dozier, Jeff; Frew, James

    2009-03-13

    Computational provenance--a record of the antecedents and processing history of digital information--is key to properly documenting computer-based scientific research. To support investigations in hydrologic science, we produce the daily fractional snow-covered area from NASA's moderate-resolution imaging spectroradiometer (MODIS). From the MODIS reflectance data in seven wavelengths, we estimate the fraction of each 500 m pixel that snow covers. The daily products have data gaps and errors because of cloud cover and sensor viewing geometry, so we interpolate and smooth to produce our best estimate of the daily snow cover. To manage the data, we have developed the Earth System Science Server (ES3), a software environment for data-intensive Earth science, with unique capabilities for automatically and transparently capturing and managing the provenance of arbitrary computations. Transparent acquisition avoids the scientists having to express their computations in specific languages or schemas in order for provenance to be acquired and maintained. ES3 models provenance as relationships between processes and their input and output files. It is particularly suited to capturing the provenance of an evolving algorithm whose components span multiple languages and execution environments.
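
    The provenance model described (processes related to their input and output files) can be illustrated with the tiny in-memory sketch below; it is not the ES3 implementation, and the process and file names are invented for the example.

```python
# Tiny in-memory sketch of the provenance model described above: processes are
# linked to their input and output files, and lineage is recovered by walking
# the graph backwards. Illustrative only, not the ES3 implementation.
from collections import namedtuple

Process = namedtuple("Process", ["name", "inputs", "outputs"])

def lineage(target, processes):
    """Return the set of files and processes upstream of a target file."""
    upstream = set()
    frontier = [target]
    while frontier:
        f = frontier.pop()
        for p in processes:
            if f in p.outputs and p.name not in upstream:
                upstream.add(p.name)
                upstream.update(p.inputs)
                frontier.extend(p.inputs)
    return upstream

if __name__ == "__main__":
    runs = [
        Process("interpolate", ["mod09_day.hdf"], ["snow_gapfilled.nc"]),   # invented names
        Process("smooth",      ["snow_gapfilled.nc"], ["snow_final.nc"]),
    ]
    print(lineage("snow_final.nc", runs))
```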

  1. Parallax Player: a stereoscopic format converter

    NASA Astrophysics Data System (ADS)

    Feldman, Mark H.; Lipton, Lenny

    2003-05-01

    The Parallax Player is a software application that is, in essence, a stereoscopic format converter: various formats may be input and output. In addition to being able to take any one of a wide variety of different formats and play them back on many different kinds of PCs and display screens, the Parallax Player has built into it the capability to produce ersatz stereo from a planar still or movie image. The player handles two basic forms of digital content - still images, and movies. It is assumed that all data is digital, either created by means of a photographic film process and later digitized, or directly captured or authored in a digital form. In its current implementation, running on a number of Windows Operating Systems, the Parallax Player reads in a broad selection of contemporary file formats.

  2. A program for the conversion of The National Map data from proprietary format to resource description framework (RDF)

    USGS Publications Warehouse

    Bulen, Andrew; Carter, Jonathan J.; Varanka, Dalia E.

    2011-01-01

    To expand data functionality and capabilities for users of The National Map of the U.S. Geological Survey, data sets for six watersheds and three urban areas were converted from the Best Practices vector data model formats to Semantic Web data formats. This report describes and documents the conversion process. The report begins with an introduction to basic Semantic Web standards and the background of The National Map. Data were converted from a proprietary format to Geography Markup Language to capture the geometric footprint of topographic data features. Configuration files were designed to eliminate redundancy and make the conversion more efficient. A SPARQL endpoint was established for data validation and queries. The report concludes by describing the results of the conversion.
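
    A hedged sketch of querying the converted features with SPARQL follows, using rdflib locally rather than the project's SPARQL endpoint; the file name, serialization format, and predicate IRI are assumptions made for illustration.

```python
# Hedged sketch of querying converted features with SPARQL, using rdflib
# locally instead of the project's SPARQL endpoint. The file name, serialization
# format, and predicate IRI are assumptions for illustration.
from rdflib import Graph

g = Graph()
g.parse("national_map_features.ttl", format="turtle")   # hypothetical converted data

query = """
SELECT ?feature ?name WHERE {
    ?feature <http://example.org/hasName> ?name .
} LIMIT 10
"""
for feature, name in g.query(query):
    print(feature, name)
```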

  3. ORPC RivGen controller performance raw data - Igiugig 2015

    DOE Data Explorer

    McEntee, Jarlath

    2015-12-18

    Contains raw data for operations of the Ocean Renewable Power Company (ORPC) RivGen Power System at Igiugig in 2015, in MATLAB data file format. Two data files capture the measurements and their timestamps, including power in, voltage, rotation rate, and velocity.
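
    Reading the MATLAB-format files with SciPy might look like the sketch below; the variable names inside the files are not given in the record, so the keys shown are placeholders rather than the actual ORPC field names.

```python
# Sketch of reading the MATLAB-format data files with SciPy. The variable
# names inside the files are not given in the record, so the keys below are
# placeholders, not the actual ORPC field names.
from scipy.io import loadmat

data = loadmat("rivgen_igiugig_2015.mat")        # hypothetical file name
print([k for k in data.keys() if not k.startswith("__")])   # list stored variables

# e.g. timestamps = data["timestamp"]; power_in = data["power_in"]   # assumed names
```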

  4. Manual for Getdata Version 3.1: a FORTRAN Utility Program for Time History Data

    NASA Technical Reports Server (NTRS)

    Maine, Richard E.

    1987-01-01

    This report documents version 3.1 of the GetData computer program. GetData is a utility program for manipulating files of time history data, i.e., data giving the values of parameters as functions of time. The most fundamental capability of GetData is extracting selected signals and time segments from an input file and writing the selected data to an output file. Other capabilities include converting file formats, merging data from several input files, time skewing, interpolating to common output times, and generating calculated output signals as functions of the input signals. This report also documents the interface standards for the subroutines used by GetData to read and write the time history files. All interface to the data files is through these subroutines, keeping the main body of GetData independent of the precise details of the file formats. Different file formats can be supported by changes restricted to these subroutines. Other computer programs conforming to the interface standards can call the same subroutines to read and write files in compatible formats.
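
    Two of the described operations -- extracting a time segment for selected signals and interpolating them to common output times -- are illustrated by the NumPy sketch below; it is a Python illustration only, not the FORTRAN utility.

```python
# NumPy sketch of two GetData-style operations described above: extract a time
# segment for selected signals and interpolate them to common output times.
# This is an illustration in Python, not the FORTRAN utility itself.
import numpy as np

def extract_segment(time, signals, t_start, t_end):
    """signals: dict of name -> array sampled at 'time'."""
    mask = (time >= t_start) & (time <= t_end)
    return time[mask], {name: values[mask] for name, values in signals.items()}

def resample(time, signals, output_times):
    """Linearly interpolate every signal onto a common output time base."""
    return {name: np.interp(output_times, time, values)
            for name, values in signals.items()}

if __name__ == "__main__":
    t = np.linspace(0.0, 10.0, 101)
    data = {"altitude": np.sin(t), "airspeed": np.cos(t)}   # toy signals
    t_seg, seg = extract_segment(t, data, 2.0, 4.0)
    print(resample(t_seg, seg, np.arange(2.0, 4.0, 0.5)))
```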

  5. Design and Implementation of a Computation Server for Optimization with Application to the Analysis of Critical Infrastructure

    DTIC Science & Technology

    2013-06-01

    for crop irrigation. The disruptions also idled key industries, led to billions of dollars of lost productivity, and stressed the entire Western... modify the super-system, and to resume the super-system run. 2.2 Requirements: An important step in the software development life cycle is to capture... detects the .gms file and associated files in the remote directory that is allocated to the user. 4. If all of the files are present, the files are

  6. Methods and apparatus for capture and storage of semantic information with sub-files in a parallel computing system

    DOEpatents

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-02-03

    Techniques are provided for storing files in a parallel computing system using sub-files with semantically meaningful boundaries. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a plurality of sub-files. The method comprises the steps of obtaining a user specification of semantic information related to the file; providing the semantic information as a data structure description to a data formatting library write function; and storing the semantic information related to the file with one or more of the sub-files in one or more storage nodes of the parallel computing system. The semantic information provides a description of data in the file. The sub-files can be replicated based on semantically meaningful boundaries.

  7. Information retrieval system

    NASA Technical Reports Server (NTRS)

    Berg, R. F.; Holcomb, J. E.; Kelroy, E. A.; Levine, D. A.; Mee, C., III

    1970-01-01

    Generalized information storage and retrieval system capable of generating and maintaining a file, gathering statistics, sorting output, and generating final reports for output is reviewed. File generation and file maintenance programs written for the system are general purpose routines.

  8. Automated observatory in Antarctica: real-time data transfer on constrained networks in practice

    NASA Astrophysics Data System (ADS)

    Bracke, Stephan; Gonsette, Alexandre; Rasson, Jean; Poncelet, Antoine; Hendrickx, Olivier

    2017-08-01

    In 2013 a project was started by the geophysical centre in Dourbes to install a fully automated magnetic observatory in Antarctica. This isolated place comes with specific requirements: an unmanned station for 6 months, low temperatures with extreme values down to -50 °C, minimum power consumption and satellite bandwidth limited to 56 kbit s^{-1}. The ultimate aim is to transfer real-time magnetic data every second: vector data from a LEMI-25 vector magnetometer, absolute F measurements from a GEM Systems scalar proton magnetometer and absolute magnetic inclination-declination (DI) measurements (five times a day) with an automated DI-fluxgate magnetometer. Traditional file transfer protocols (for instance File Transfer Protocol (FTP), email, rsync) show severe limitations when it comes to real-time capability. After evaluating the pros and cons of the available real-time Internet of things (IoT) protocols and seismic software solutions, we chose to use Message Queuing Telemetry Transport (MQTT) and receive the 1 s data with a negligible latency cost and no loss of data. Each individual instrument sends the magnetic data immediately after capturing, and the data arrive approximately 300 ms after being sent, which corresponds with the normal satellite latency.
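
    A minimal sketch of publishing one-second magnetic samples over MQTT with the paho-mqtt client (1.x-style constructor) is shown below; the broker address, topic name, and payload layout are assumptions, since the observatory's actual settings are not given in the abstract.

```python
# Sketch of publishing one-second magnetic samples over MQTT using the
# paho-mqtt client (1.x-style constructor). Broker address, topic name, and
# payload layout are assumptions, not the observatory's actual settings.
import json
import time
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("broker.example.org", 1883, keepalive=60)   # hypothetical broker
client.loop_start()

for _ in range(5):                                          # one sample per second
    sample = {"t": time.time(), "x_nT": 18941.2, "y_nT": -3512.7, "z_nT": -52110.4}
    client.publish("observatory/antarctica/vector", json.dumps(sample), qos=1)
    time.sleep(1)

client.loop_stop()
client.disconnect()
```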

  9. Project Integration Architecture (PIA) and Computational Analysis Programming Interface (CAPRI) for Accessing Geometry Data from CAD Files

    NASA Technical Reports Server (NTRS)

    Benyo, Theresa L.

    2002-01-01

    Integration of a supersonic inlet simulation with a computer aided design (CAD) system is demonstrated. The integration is performed using the Project Integration Architecture (PIA). PIA provides a common environment for wrapping many types of applications. Accessing geometry data from CAD files is accomplished by incorporating appropriate function calls from the Computational Analysis Programming Interface (CAPRI). CAPRI is a CAD vendor neutral programming interface that aids in acquiring geometry data directly from CAD files. The benefits of wrapping a supersonic inlet simulation into PIA using CAPRI are: direct access to geometry data, accurate capture of geometry data, automatic conversion of data units, CAD vendor neutral operation, and on-line interactive history capture. This paper describes the PIA and the CAPRI wrapper and details the supersonic inlet simulation demonstration.

  10. Life and dynamic capacity modeling for aircraft transmissions

    NASA Technical Reports Server (NTRS)

    Savage, Michael

    1991-01-01

    A computer program to simulate the dynamic capacity and life of parallel shaft aircraft transmissions is presented. Five basic configurations can be analyzed: single mesh, compound, parallel, reverted, and single plane reductions. In execution, the program prompts the user for the data file prefix name, takes input from an ASCII file, and writes its output to a second ASCII file with the same prefix name. The input data file includes the transmission configuration, the input shaft torque and speed, and descriptions of the transmission geometry and the component gears and bearings. The program output file describes the transmission, its components, their capabilities, locations, and loads. It also lists the dynamic capability, ninety percent reliability, and mean life of each component and the transmission as a system. Here, the program, its input and output files, and the theory behind the operation of the program are described.

  11. 78 FR 74126 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-10

    ... Installed Capacity Requirement, Hydro Quebec Interconnection Capability Credits and Related Values for the.... Take notice that the Commission received the following electric reliability filings: Docket Numbers: RR13-3-001. Applicants: North American Electric Reliability Corporation. Description: Compliance Filing...

  12. Hybrid cryptosystem for image file using elgamal and double playfair cipher algorithm

    NASA Astrophysics Data System (ADS)

    Hardi, S. M.; Tarigan, J. T.; Safrina, N.

    2018-03-01

    In this paper, we present an implementation of an image file encryption using hybrid cryptography. We chose the ElGamal algorithm to perform asymmetric encryption and Double Playfair for the symmetric encryption. Our objective is to show that these algorithms are capable of encrypting an image file with an acceptable running time and encrypted file size while maintaining the level of security. The application was built using the C# programming language and ran as a stand-alone desktop application under the Windows Operating System. Our test shows that the system is capable of encrypting an image with a resolution of 500×500 to a size of 976 kilobytes with an acceptable running time.

  13. The Future of the Andrew File System

    ScienceCinema

    Brashear, Derrick; Altman, Jeffrey

    2018-05-25

    The talk will discuss the ten operational capabilities that have made AFS unique in the distributed file system space and how these capabilities are being expanded upon to meet the needs of the 21st century. Derrick Brashear and Jeffrey Altman will present a technical road map of new features and technical innovations that are under development by the OpenAFS community and Your File System, Inc., funded by a U.S. Department of Energy Small Business Innovative Research grant. The talk will end with a comparison of AFS to its modern-day competitors.

  14. DSN command system Mark III-78. [data processing

    NASA Technical Reports Server (NTRS)

    Stinnett, W. G.

    1978-01-01

    The Deep Space Network command Mark III-78 data processing system includes a capability for a store-and-forward handling method. The functions of (1) storing the command files at a Deep Space station; (2) attaching the files to a queue; and (3) radiating the commands to the spacecraft are straightforward. However, the total data processing capability is a result of assuming worst case, failure-recovery, or nonnominal operating conditions. Optional data processing functions include: file erase, clearing the queue, suspend radiation, command abort, resume command radiation, and close window time override.

  15. The Future of the Andrew File System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brashear, Derrick; Altman, Jeffrey

    2011-02-23

    The talk will discuss the ten operational capabilities that have made AFS unique in the distributed file system space and how these capabilities are being expanded upon to meet the needs of the 21st century. Derrick Brashear and Jeffrey Altman will present a technical road map of new features and technical innovations that are under development by the OpenAFS community and Your File System, Inc., funded by a U.S. Department of Energy Small Business Innovative Research grant. The talk will end with a comparison of AFS to its modern-day competitors.

  16. Provenance Datasets Highlighting Capture Disparities

    DTIC Science & Technology

    2014-01-01

    Vistrails [20], Taverna [21] or Kepler [6], and an OS-observing system like PASS [18]. In less granular workflow systems, the data files, scripts...run, etc. are capturable as long as they are executed within the workflow system. In more granular OS-observing systems, the actual reads, writes..."rolling up" very granular information to less granular information. OS-level capture knows that a socket was opened and that data was sent to a foreign

  17. TADPLOT program, version 2.0: User's guide

    NASA Technical Reports Server (NTRS)

    Hammond, Dana P.

    1991-01-01

    The TADPLOT Program, Version 2.0 is described. The TADPLOT program is a software package coordinated by a single, easy-to-use interface, enabling the researcher to access several standard file formats, selectively collect specific subsets of data, and create full-featured publication and viewgraph quality plots. The user-interface was designed to be independent from any file format, yet provide capabilities to accommodate highly specialized data queries. Integrated with an applications software network, data can be assessed, collected, and viewed quickly and easily. Since the commands are data independent, subsequent modifications to the file format will be transparent, while additional file formats can be integrated with minimal impact on the user-interface. The graphical capabilities are independent of the method of data collection; thus, the data specification and subsequent plotting can be modified and upgraded as separate functional components. The graphics kernel selected adheres to the full functional specifications of the CORE standard. Both interface and postprocessing capabilities are fully integrated into TADPLOT.

  18. Historic Methods for Capturing Magnetic Field Images

    ERIC Educational Resources Information Center

    Kwan, Alistair

    2016-01-01

    I investigated two late 19th-century methods for capturing magnetic field images from iron filings for historical insight into the pedagogy of hands-on physics education methods, and to flesh out teaching and learning practicalities tacit in the historical record. Both methods offer opportunities for close sensory engagement in data-collection…

  19. 47 CFR 43.21 - Transactions with affiliates.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... file, by April 1 of each year, a report designed to capture trends in service quality under price cap... report designed to capture trends in service quality under price cap regulation. The report shall contain...) REPORTS OF COMMUNICATION COMMON CARRIERS AND CERTAIN AFFILIATES § 43.21 Transactions with affiliates. (a...

  20. 47 CFR 43.21 - Transactions with affiliates.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... file, by April 1 of each year, a report designed to capture trends in service quality under price cap... report designed to capture trends in service quality under price cap regulation. The report shall contain...) REPORTS OF COMMUNICATION COMMON CARRIERS AND CERTAIN AFFILIATES § 43.21 Transactions with affiliates. (a...

  1. 77 FR 35376 - San Antonio Water System; Notice of Petition for Declaratory Order and Soliciting Comments...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-13

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. DI12-7-000] San Antonio.... To paper-file, an original and seven copies should be filed with: Secretary, Federal Energy... System. The LucidPipe Power System is an in-conduit hydropower device that captures excess head pressure...

  2. An Analysis of the Use of Graphical Representation in Participants' Solutions

    ERIC Educational Resources Information Center

    Bleich, Laurel; Ledford, Sarah; Orrill, Chandra Hawley; Polly, Drew

    2006-01-01

    InterMath participants spend time in workshops exploring technology-rich mathematical investigations and completing write-ups. These write-ups include a written explanation of their problem solving process, screen captures of files that they generated while completing the investigation and links to these files. This paper examines the use of…

  3. 78 FR 67396 - Agency Information Collection Activities: Proposed collection; comments requested: Amendment to...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-12

    ... capabilities, and the FARA eFile system in operation since March 1, 2011, permits registrants to file their registration forms electronically to the FARA Registration Unit, 24 hours a day, seven days a week. FARA eFile...

  4. 78 FR 67398 - Agency Information Collection Activities: Proposed Collection; Comments Requested: Exhibit A to...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-12

    ... capabilities, and the FARA eFile system in operation since March 1, 2011, permits registrants to file their registration forms electronically to the FARA Registration Unit, 24 hours a day, seven days a week. FARA eFile...

  5. 78 FR 67394 - Agency Information Collection Activities: Proposed Collection; Comments Requested: Exhibit B to...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-12

    ... capabilities, and the FARA eFile system in operation since March 1, 2011, permits registrants to file their registration forms electronically to the FARA Registration Unit, 24 hours a day, seven days a week. FARA eFile...

  6. A new microfluidic approach for the one-step capture, amplification and label-free quantification of bacteria from raw samples

    PubMed Central

    Pereiro, Iago; Bendali, Amel; Tabnaoui, Sanae; Alexandre, Lucile; Srbova, Jana; Bilkova, Zuzana; Deegan, Shane; Joshi, Lokesh; Viovy, Jean-Louis; Malaquin, Laurent

    2017-01-01

    A microfluidic method to specifically capture and detect infectious bacteria based on immunorecognition and proliferative power is presented. It involves a microscale fluidized bed in which magnetic and drag forces are balanced to retain antibody-functionalized superparamagnetic beads in a chamber during sample perfusion. Captured cells are then cultivated in situ by infusing nutritionally-rich medium. The system was validated by the direct one-step detection of Salmonella Typhimurium in undiluted unskimmed milk, without pre-treatment. The growth of bacteria induces an expansion of the fluidized bed, mainly due to the volume occupied by the newly formed bacteria. This expansion can be observed with the naked eye, providing simple low-cost detection of only a few bacteria in a few hours. The time to expansion can also be measured with a low-cost camera, allowing quantitative detection down to 4 cfu (colony forming units), with a dynamic range of 100 to 10^7 cfu ml^{-1} in 2 to 8 hours, depending on the initial concentration. This mode of operation is an equivalent of quantitative PCR, with which it shares a high dynamic range and outstanding sensitivity and specificity, operating at the live cell rather than DNA level. Specificity was demonstrated by controls performed in the presence of a 500× excess of non-pathogenic Lactococcus lactis. The system's versatility was demonstrated by its successful application to the detection and quantitation of Escherichia coli O157:H15 and Enterobacter cloacae. This new technology allows fast, low-cost, portable and automated bacteria detection for various applications in food, environment, security and clinics. PMID:28626552

  7. Challenges and Successes Managing Airborne Science Data for CARVE

    NASA Astrophysics Data System (ADS)

    Hardman, S. H.; Dinardo, S. J.; Lee, E. C.

    2014-12-01

    The Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) mission collects detailed measurements of important greenhouse gases on local to regional scales in the Alaskan Arctic and demonstrates new remote sensing and improved modeling capabilities to quantify Arctic carbon fluxes and carbon cycle-climate processes. Airborne missions offer a number of challenges when it comes to collecting and processing the science data, and CARVE is no different. The biggest challenge relates to the flexibility of the instrument payload. Within the life of the mission, instruments may be removed from or added to the payload, or even reconfigured on a yearly, monthly or daily basis. Although modification of the instrument payload provides a distinct advantage for airborne missions compared to spaceborne missions, it does tend to wreak havoc on the underlying data system when introducing changes to existing data inputs or new data inputs that require modifications to the pipeline for processing the data. In addition to payload flexibility, it is not uncommon to find unsupported files in the field data submission. In the case of CARVE, these include video files, photographs taken during the flight and screen shots from terminal displays. These need to be captured, saved and somehow integrated into the data system. The CARVE data system was built on a multi-mission data system infrastructure for airborne instruments called the Airborne Cloud Computing Environment (ACCE). ACCE encompasses the end-to-end lifecycle covering planning, provisioning of data system capabilities, and support for scientific analysis in order to improve the quality, cost effectiveness, and capabilities to enable new scientific discovery and research in earth observation. This well-tested and proven infrastructure allows the CARVE data system to be easily adapted in order to handle the challenges posed by the CARVE mission and to successfully process, manage and distribute the mission's science data. This research was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.

  8. Software for Managing Personal Files.

    ERIC Educational Resources Information Center

    Lundeen, Gerald

    1989-01-01

    Discusses the special characteristics of personal file management software and compares four microcomputer software packages: Notebook II with Bibliography and Convert, Pro-Cite with Biblio-Links, askSam, and Reference Manager. Each package is evaluated in terms of the user interface, file maintenance, retrieval capabilities, output, and…

  9. mzML2ISA & nmrML2ISA: generating enriched ISA-Tab metadata files from metabolomics XML data

    PubMed Central

    Larralde, Martin; Lawson, Thomas N.; Weber, Ralf J. M.; Moreno, Pablo; Haug, Kenneth; Rocca-Serra, Philippe; Viant, Mark R.; Steinbeck, Christoph; Salek, Reza M.

    2017-01-01

    Summary: Submission to the MetaboLights repository for metabolomics data currently places the burden of reporting instrument and acquisition parameters in ISA-Tab format on users, who have to do it manually, a process that is time consuming and prone to user input error. Since the large majority of these parameters are embedded in instrument raw data files, an opportunity exists to capture this metadata more accurately. Here we report a set of Python packages that can automatically generate ISA-Tab metadata file stubs from raw XML metabolomics data files. The parsing packages are separated into mzML2ISA (encompassing mzML and imzML formats) and nmrML2ISA (nmrML format only). Overall, the use of mzML2ISA & nmrML2ISA reduces the time needed to capture metadata substantially (capturing 90% of metadata on assay and sample levels), is much less prone to user input errors, improves compliance with minimum information reporting guidelines and facilitates more finely grained data exploration and querying of datasets. Availability and implementation: mzML2ISA & nmrML2ISA are available under version 3 of the GNU General Public Licence at https://github.com/ISA-tools. Documentation is available from http://2isa.readthedocs.io/en/latest/. Contact: reza.salek@ebi.ac.uk or isatools@googlegroups.com. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28402395

  10. mzML2ISA & nmrML2ISA: generating enriched ISA-Tab metadata files from metabolomics XML data.

    PubMed

    Larralde, Martin; Lawson, Thomas N; Weber, Ralf J M; Moreno, Pablo; Haug, Kenneth; Rocca-Serra, Philippe; Viant, Mark R; Steinbeck, Christoph; Salek, Reza M

    2017-08-15

    Submission to the MetaboLights repository for metabolomics data currently places the burden of reporting instrument and acquisition parameters in ISA-Tab format on users, who have to do it manually, a process that is time consuming and prone to user input error. Since the large majority of these parameters are embedded in instrument raw data files, an opportunity exists to capture this metadata more accurately. Here we report a set of Python packages that can automatically generate ISA-Tab metadata file stubs from raw XML metabolomics data files. The parsing packages are separated into mzML2ISA (encompassing mzML and imzML formats) and nmrML2ISA (nmrML format only). Overall, the use of mzML2ISA & nmrML2ISA reduces the time needed to capture metadata substantially (capturing 90% of metadata on assay and sample levels), is much less prone to user input errors, improves compliance with minimum information reporting guidelines and facilitates more finely grained data exploration and querying of datasets. mzML2ISA & nmrML2ISA are available under version 3 of the GNU General Public Licence at https://github.com/ISA-tools. Documentation is available from http://2isa.readthedocs.io/en/latest/. reza.salek@ebi.ac.uk or isatools@googlegroups.com. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  11. 77 FR 68761 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-16

    ... Energy Marketing LLC, Champion Energy Services, LLC, Champion Energy, LLC. Description: Supplement to Updated Market Power Analysis for the Central Region of Champion Energy Marketing LLC, et al. Filed Date... and Related Values for the 2016/2017 Capability Year. Filed Date: 11/6/12. Accession Number: 20121106...

  12. 75 FR 80296 - Extension of Filing Accommodation for Static Pool Information in Filings With Respect to Asset...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-22

    ... Systems in 1993 for document exchange. PDF captures formatting information from a variety of desktop publishing applications, making it possible to send formatted documents and have them appear on the recipient... Administrative Procedure Act generally requires that an agency publish an adopted rule in the Federal Register 30...

  13. Sparse Coding for N-Gram Feature Extraction and Training for File Fragment Classification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Felix; Quach, Tu-Thach; Wheeler, Jason

    File fragment classification is an important step in the task of file carving in digital forensics. In file carving, files must be reconstructed based on their content as a result of their fragmented storage on disk or in memory. Existing methods for classification of file fragments typically use hand-engineered features such as byte histograms or entropy measures. In this paper, we propose an approach using sparse coding that enables automated feature extraction. Sparse coding, or sparse dictionary learning, is an unsupervised learning algorithm, and is capable of extracting features based simply on how well those features can be used to reconstruct the original data. With respect to file fragments, we learn sparse dictionaries for n-grams, continuous sequences of bytes, of different sizes. These dictionaries may then be used to estimate n-gram frequencies for a given file fragment, but for significantly larger n-gram sizes than are typically found in existing methods which suffer from combinatorial explosion. To demonstrate the capability of our sparse coding approach, we used the resulting features to train standard classifiers such as support vector machines (SVMs) over multiple file types. Experimentally, we achieved significantly better classification results with respect to existing methods, especially when the features were used in supplement to existing hand-engineered features.
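
    The abstract describes the pipeline clearly enough to sketch it with off-the-shelf tools. The following is not the authors' implementation; it is a minimal illustration using scikit-learn's MiniBatchDictionaryLearning and LinearSVC, with the n-gram size, dictionary size, and feature pooling chosen arbitrarily:

    import numpy as np
    from sklearn.decomposition import MiniBatchDictionaryLearning
    from sklearn.svm import LinearSVC

    N = 8          # n-gram size (illustrative choice)
    ATOMS = 64     # number of dictionary atoms (illustrative choice)

    def ngrams(fragment: bytes, n: int = N) -> np.ndarray:
        """Slide a length-n window over the fragment; each n-gram is one sample."""
        arr = np.frombuffer(fragment, dtype=np.uint8).astype(float) / 255.0
        return np.lib.stride_tricks.sliding_window_view(arr, n)

    def fit_dictionary(fragments):
        """Learn a sparse dictionary from n-grams pooled across training fragments."""
        X = np.vstack([ngrams(f) for f in fragments])
        dico = MiniBatchDictionaryLearning(n_components=ATOMS, alpha=1.0,
                                           transform_algorithm="lasso_lars")
        return dico.fit(X)

    def features(dico, fragment):
        """One feature vector per fragment: mean absolute activation of each atom."""
        codes = dico.transform(ngrams(fragment))
        return np.abs(codes).mean(axis=0)

    def train_classifier(dico, fragments, labels):
        """Train a linear SVM on the sparse-coding features."""
        X = np.array([features(dico, f) for f in fragments])
        return LinearSVC().fit(X, labels)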

  14. Sparse Coding for N-Gram Feature Extraction and Training for File Fragment Classification

    DOE PAGES

    Wang, Felix; Quach, Tu-Thach; Wheeler, Jason; ...

    2018-04-05

    File fragment classification is an important step in the task of file carving in digital forensics. In file carving, files must be reconstructed based on their content as a result of their fragmented storage on disk or in memory. Existing methods for classification of file fragments typically use hand-engineered features such as byte histograms or entropy measures. In this paper, we propose an approach using sparse coding that enables automated feature extraction. Sparse coding, or sparse dictionary learning, is an unsupervised learning algorithm, and is capable of extracting features based simply on how well those features can be used to reconstruct the original data. With respect to file fragments, we learn sparse dictionaries for n-grams, continuous sequences of bytes, of different sizes. These dictionaries may then be used to estimate n-gram frequencies for a given file fragment, but for significantly larger n-gram sizes than are typically found in existing methods which suffer from combinatorial explosion. To demonstrate the capability of our sparse coding approach, we used the resulting features to train standard classifiers such as support vector machines (SVMs) over multiple file types. Experimentally, we achieved significantly better classification results with respect to existing methods, especially when the features were used in supplement to existing hand-engineered features.

  15. Preliminary system design of a Three Arm Capture Mechanism (TACM) flight demonstration article

    NASA Technical Reports Server (NTRS)

    Schaefer, Otto; Stasi, Bill

    1993-01-01

    The overall objective of the Three Arm Capture Mechanism (TACM) is to serve as a demonstration of capability for capture of objects in space. These objects could be satellites, expended boosters, pieces of debris, etc.; anything of significant size. With this capability we can significantly diminish the danger of major collisions of debris with valuable space assets and with each other, which would otherwise produce many smaller, high-velocity pieces of debris that also become concerns. The captured objects would be jettisoned into the atmosphere, relocated in 'parking' orbits, or recovered for disposition or refurbishment. The dollar value of satellites launched into space continues to grow along with the cost of insurance; having a capture capability takes a positive step towards diminishing this added cost. The effort covered is a planning step towards a flight demonstration of the satellite capture capability. Based on the requirement to capture a communication-class satellite, its associated booster, or both, a preliminary system definition of a retrieval kit is presented. The objective of the flight demonstration is to demonstrate the techniques proposed to perform the mission and to obtain data on technical issues requiring an in situ space environment. The former especially includes issues such as automated image recognition techniques and control strategies that enable an unmanned vehicle to rendezvous and capture a satellite, contact dynamics between the two bodies, and the flight segment level of automation required to support the mission. A development plan for the operational retrieval capability includes analysis work, computer and ground test simulations, and finally a flight demonstration. A concept to perform a selected mission capturing a precessing communications satellite is described. Further development efforts using analytical tools and laboratory facilities are required prior to reaching the point at which a full commitment to the flight demonstration design can be made.

  16. Center-TRACON Automation System (CTAS) En Route Trajectory Predictor Requirements and Capabilities

    NASA Technical Reports Server (NTRS)

    Vivona, Robert; Cate, Karen Tung

    2013-01-01

    This requirements framework document is designed to support the capture of requirements and capabilities for state-of-the-art trajectory predictors (TPs). This framework has been developed to assist TP experts in capturing a clear, consistent, and cross-comparable set of requirements and capabilities. The goal is to capture capabilities (types of trajectories that can be built), functional requirements (including inputs and outputs), non-functional requirements (including prediction accuracy and computational performance), approaches for constraint relaxation, and input uncertainties. The sections of this framework are based on the Common Trajectory Predictor structure developed by the FAA/Eurocontrol Cooperative R&D Action Plan 16 Committee on Common Trajectory Prediction. It is assumed that the reader is familiar with the Common TP Structure. This initial draft is intended as a first-cut capture of the En Route TS Capabilities and Requirements. As such, it contains many annotations indicating possible logic errors in the CTAS code or in the description provided. The details of these annotations are to be worked out with NASA, and this document will be updated at a later time.

  17. A JPEG backward-compatible HDR image compression

    NASA Astrophysics Data System (ADS)

    Korshunov, Pavel; Ebrahimi, Touradj

    2012-10-01

    High Dynamic Range (HDR) imaging is expected to become one of the technologies that could shape the next generation of consumer digital photography. Manufacturers are rolling out cameras and displays capable of capturing and rendering HDR images. The popularity and full public adoption of HDR content is, however, hindered by the lack of standards for quality evaluation, file formats, and compression, as well as the large legacy base of Low Dynamic Range (LDR) displays that are unable to render HDR. To facilitate widespread HDR usage, the backward compatibility of HDR technology with commonly used legacy image storage, rendering, and compression is necessary. Although many tone-mapping algorithms were developed for generating viewable LDR images from HDR content, there is no consensus on which algorithm to use and under which conditions. This paper, via a series of subjective evaluations, demonstrates the dependency of perceived quality of the tone-mapped LDR images on environmental parameters and image content. Based on the results of subjective tests, it proposes to extend the JPEG file format, as the most popular image format, in a backward compatible manner to also deal with HDR pictures. To this end, the paper provides an architecture to achieve such backward compatibility with JPEG and demonstrates the efficiency of a simple implementation of this framework compared to state-of-the-art HDR image compression.
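
    The backward-compatible idea can be illustrated with a toy split into a legacy-viewable 8-bit layer and an HDR residual. This is not the paper's codec; the global Reinhard operator and log-ratio residual below are stand-ins for whichever tone-mapping and residual encoding a real implementation would use:

    import numpy as np

    def reinhard_tonemap(hdr):
        """Global Reinhard operator: compress HDR values into [0, 1) so the
        result can be quantized to 8 bits and stored as an ordinary JPEG."""
        return hdr / (1.0 + hdr)

    def split_backward_compatible(hdr):
        """Split an HDR image (float array, linear radiance) into an 8-bit LDR
        layer for legacy decoders and a residual that an HDR-aware decoder
        could use to approximately undo the tone mapping. The residual would
        travel in an auxiliary JPEG application segment."""
        ldr8 = np.clip(np.round(reinhard_tonemap(hdr) * 255.0), 0, 255).astype(np.uint8)
        ldr = ldr8.astype(np.float64) / 255.0
        hdr_from_ldr = ldr / np.maximum(1.0 - ldr, 1e-3)   # invert the operator
        residual = np.log1p(hdr) - np.log1p(hdr_from_ldr)
        return ldr8, residual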

  18. Thermal-neutron capture for A=26-35

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chunmei, Z.; Firestone, R.B.

    2001-06-01

    The prompt gamma-ray data of thermal-neutron capture for nuclear mass number A=26-35 had been evaluated and published in "ATOMIC DATA AND NUCLEAR DATA TABLES, 26, 511 (1981)". Since that time, many experimental data on thermal-neutron capture have been measured and published. An update of the evaluated prompt gamma-ray data is needed for use in PGAA (high-resolution analytical prompt gamma-ray spectroscopy). The evaluation is also needed for the Evaluated Nuclear Structure Data File, ENSDF, because ENSDF contains no prompt gamma-ray data. The levels, prompt gamma-rays, and decay schemes of thermal-neutron capture for nuclides (26Mg, 27Al, 28Si, 29Si, 30Si, 31P, 32S, 33S, 34S, and 35Cl) with nuclear mass number A=26-35 have been evaluated on the basis of all experimental data. The normalization factors, from which absolute prompt gamma-ray intensities can be obtained, and necessary comments are given in the text. The ENSDF format has been adopted in this evaluation. A physical check (intensity balance and energy balance) of the evaluated thermal-neutron capture data has been performed. The evaluated data have been put into the Evaluated Nuclear Structure Data File, ENSDF. This evaluation may be considered an update of the prompt gamma-ray thermal-neutron capture data tables published in "ATOMIC DATA AND NUCLEAR DATA TABLES, 26, 511 (1981)".

  19. Thermal-neutron capture for A=36-44

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chunmei, Z.; Firestone, R.B.

    2003-01-01

    The prompt gamma-ray data of thermal-neutron capture for nuclear mass number A=26-35 had been evaluated and published in "ATOMIC DATA AND NUCLEAR DATA TABLES, 26, 511 (1981)". Since that time, many experimental data on thermal-neutron capture have been measured and published. An update of the evaluated prompt gamma-ray data is needed for use in PGAA (high-resolution analytical prompt gamma-ray spectroscopy). The evaluation is also needed for the Evaluated Nuclear Structure Data File, ENSDF, because ENSDF contains no prompt gamma-ray data. The levels, prompt gamma-rays, and decay schemes of thermal-neutron capture for nuclides (26Mg, 27Al, 28Si, 29Si, 30Si, 31P, 32S, 33S, 34S, and 35Cl) with nuclear mass number A=26-35 have been evaluated on the basis of all experimental data. The normalization factors, from which absolute prompt gamma-ray intensities can be obtained, and necessary comments are given in the text. The ENSDF format has been adopted in this evaluation. A physical check (intensity balance and energy balance) of the evaluated thermal-neutron capture data has been performed. The evaluated data have been put into the Evaluated Nuclear Structure Data File, ENSDF. This evaluation may be considered an update of the prompt gamma-ray thermal-neutron capture data tables published in "ATOMIC DATA AND NUCLEAR DATA TABLES, 26, 511 (1981)".

  20. 78 FR 67395 - Agency Information Collection Activities: Proposed Collection; Comments Requested: Registration...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-12

    ..., contain fillable-fileable, and E-signature capabilities, and the FARA eFile system in operation since March 1, 2011, permits registrants to file their registration forms electronically to the FARA Registration Unit, 24 hours a day, seven days a week. FARA eFile is accessed via the FARA public Web site...

  1. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    PubMed

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture system data. However, few software packages can visualize and modify the entirety of the data in the C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis system using standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through a C++ API, bindings for high-level languages (Matlab, Octave, and Python), and a standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
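
    For orientation, the snippet below follows the typical read pattern of BTK's Python bindings as shown in the project documentation; the file name and marker label are placeholders, and the exact method names should be checked against the installed BTK version:

    # Sketch of reading a C3D acquisition with BTK's Python bindings.
    import btk

    reader = btk.btkAcquisitionFileReader()
    reader.SetFilename("trial01.c3d")      # placeholder file name
    reader.Update()
    acq = reader.GetOutput()

    print("Point frequency (Hz):", acq.GetPointFrequency())
    print("Number of frames:", acq.GetPointFrameNumber())

    marker = acq.GetPoint("LASI")          # placeholder marker label
    trajectory = marker.GetValues()        # (n_frames x 3) array of coordinates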

  2. Long wavelength propagation capability, version 1.1 (computer diskette)

    NASA Astrophysics Data System (ADS)

    1994-05-01

    File Characteristics: software and data file (72 files); ASCII character set. Physical Description: 2 computer diskettes; 3 1/2 in.; high density; 1.44 MB. System Requirements: PC compatible; Digital Equipment Corp. VMS; PKZIP (included on diskette). This report describes a revision of the Naval Command, Control and Ocean Surveillance Center RDT&E Division's Long Wavelength Propagation Capability (LWPC). The first version of this capability was a collection of separate FORTRAN programs linked together in operation by a command procedure written in an operating system unique to the Digital Equipment Corporation (Ferguson & Snyder, 1989a, b). A FORTRAN computer program named Long Wavelength Propagation Model (LWPM) was developed to replace the VMS control system (Ferguson & Snyder, 1990; Ferguson, 1990). This was designated version 1 (LWPC-1). This program implemented all the features of the original VMS-based version plus a number of auxiliary programs that provided summaries of the files and graphical displays of the output files. This report describes a revision of the LWPC, designated version 1.1 (LWPC-1.1).

  3. The Design and Usage of the New Data Management Features in NASTRAN

    NASA Technical Reports Server (NTRS)

    Pamidi, P. R.; Brown, W. K.

    1984-01-01

    Two new data management features are installed in the April 1984 release of NASTRAN. These two features are the Rigid Format Data Base and the READFILE capability. The Rigid Format Data Base is stored on external files in card image format and can be easily maintained and expanded by the use of standard text editors. This data base provides the user and the NASTRAN maintenance contractor with an easy means for making changes to a Rigid Format or for generating new Rigid Formats without unnecessary compilations and link editing of NASTRAN. Each Rigid Format entry in the data base contains the Direct Matrix Abstraction Program (DMAP), along with the associated restart, DMAP sequence subset and substructure control flags. The READFILE capability allows a user to reference an external secondary file from the NASTRAN primary input file and to read data from this secondary file. There is no limit to the number of external secondary files that may be referenced and read.

  4. Unified Simulation and Analysis Framework for Deep Space Navigation Design

    NASA Technical Reports Server (NTRS)

    Anzalone, Evan; Chuang, Jason; Olsen, Carrie

    2013-01-01

    As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such a capability continue to grow, there is a clear need for a modular, expandable simulation framework. This tool's purpose is to address multiple measurement and information sources in order to capture system capability. This is needed to analyze the capability of competing navigation systems as well as to develop system requirements, in order to determine its effect on the sizing of the integrated vehicle. The development of such a framework is built upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and possible state measurements and observations to feed into the simulation implementation structure. These models also allow a common environment for the capture of an increasingly complex operational architecture, involving multiple spacecraft, ground stations, and communication networks. In order to address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions amongst spacecraft. This paper describes the development of this framework, and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. This developed package shows the capability of the modeling framework, including its modularity, analysis capabilities, and its unification back to the overall system requirements and definition.

  5. Purple L1 Milestone Review Panel GPFS Functionality and Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loewe, W E

    2006-12-01

    The GPFS deliverable for the Purple system requires the functionality and performance necessary for ASC I/O needs. The functionality includes POSIX and MPIIO compatibility, and multi-TB file capability across the entire machine. The bandwidth performance required is 122.15 GB/s, as necessary for productive and defensive I/O requirements, and the metadata performance requirement is 5,000 file stats per second. To determine success for this deliverable, several tools are employed. For functionality testing of POSIX, 10TB-files, and high-node-count capability, the parallel file system bandwidth performance test IOR is used. IOR is an MPI-coordinated application that can write to and then read from a single shared file or an individual file per process and check the data integrity of the file(s). The MPIIO functionality is tested with the MPIIO test suite from the MPICH library. Bandwidth performance is tested using IOR for the required 122.15 GB/s sustained write. All IOR tests are performed with data checking enabled. Metadata performance is tested after "aging" the file system with 80% data block usage and 20% inode usage. The fdtree metadata test is expected to create/remove a large directory/file structure in under 20 minutes, akin to interactive metadata usage. Multiple (10) instances of "ls -lR", each performing over 100K stats, are run concurrently in different large directories to demonstrate 5,000 stats/sec.
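
    The metadata requirement (5,000 file stats per second) can be probed with a few lines of Python that mimic the "ls -lR" stat workload described above; the directory path is a placeholder and this is only an approximation of the actual acceptance test:

    import os
    import time

    def stats_per_second(root):
        """Walk a directory tree, stat() every entry, and report the achieved
        metadata rate, a rough analogue of an 'ls -lR' stat workload."""
        count = 0
        start = time.perf_counter()
        for dirpath, dirnames, filenames in os.walk(root):
            for name in dirnames + filenames:
                os.lstat(os.path.join(dirpath, name))
                count += 1
        elapsed = time.perf_counter() - start
        return count, count / elapsed if elapsed > 0 else float("inf")

    # Example (placeholder path): n, rate = stats_per_second("/p/gpfs/testdir")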

  6. 76 FR 799 - Publication of Year 2010 Form M-1 With Electronic Filing Option, Notice

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-06

    ...-EBSA (3272). Questions on completing the form are being directed to the EBSA help desk at (202) 693-8360. For questions regarding the electronic filing capability, contact the EBSA computer help desk at... working together with administrators to help them comply with this filing requirement. Copies of the Form...

  7. 78 FR 67396 - Agency Information Collection Activities: Proposed collection; comments requested: Supplemental...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-12

    ...- fileable, and E-signature capabilities, and the FARA eFile system in operation since March 1 2011, permits registrants to file their registration forms electronically to the FARA Registration Unit, 24 hours a day, seven days a week. FARA eFile is accessed via the FARA public Web site located at http://www.fara.gov...

  8. Interoperability Using Lightweight Metadata Standards: Service & Data Casting, OpenSearch, OPM Provenance, and Shared SciFlo Workflows

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Hua, H.; Fetzer, E.

    2011-12-01

    Under several NASA grants, we are generating multi-sensor merged atmospheric datasets to enable the detection of instrument biases and studies of climate trends over decades of data. For example, under a NASA MEASURES grant we are producing a water vapor climatology from the A-Train instruments, stratified by the Cloudsat cloud classification for each geophysical scene. The generation and proper use of such multi-sensor climate data records (CDR's) requires a high level of openness, transparency, and traceability. To make the datasets self-documenting and provide access to full metadata and traceability, we have implemented a set of capabilities and services using known, interoperable protocols. These protocols include OpenSearch, OPeNDAP, Open Provenance Model, service & data casting technologies using Atom feeds, and REST-callable analysis workflows implemented as SciFlo (XML) documents. We advocate that our approach can serve as a blueprint for how to openly "document and serve" complex, multi-sensor CDR's with full traceability. The capabilities and services provided include: - Discovery of the collections by keyword search, exposed using OpenSearch protocol; - Space/time query across the CDR's granules and all of the input datasets via OpenSearch; - User-level configuration of the production workflows so that scientists can select additional physical variables from the A-Train to add to the next iteration of the merged datasets; - Efficient data merging using on-the-fly OPeNDAP variable slicing & spatial subsetting of data out of input netCDF and HDF files (without moving the entire files); - Self-documenting CDR's published in a highly usable netCDF4 format with groups used to organize the variables, CF-style attributes for each variable, numeric array compression, & links to OPM provenance; - Recording of processing provenance and data lineage into a query-able provenance trail in Open Provenance Model (OPM) format, auto-captured by the workflow engine; - Open Publishing of all of the workflows used to generate products as machine-callable REST web services, using the capabilities of the SciFlo workflow engine; - Advertising of the metadata (e.g. physical variables provided, space/time bounding box, etc.) for our prepared datasets as "datacasts" using the Atom feed format; - Publishing of all datasets via our "DataDrop" service, which exploits the WebDAV protocol to enable scientists to access remote data directories as local files on their laptops; - Rich "web browse" of the CDR's with full metadata and the provenance trail one click away; - Advertising of all services as Google-discoverable "service casts" using the Atom format. The presentation will describe our use of the interoperable protocols and demonstrate the capabilities and service GUI's.
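
    As a sketch of the space/time granule query described above, the following issues an OpenSearch request and walks the Atom response with the Python standard library; the endpoint and query parameter names are placeholders standing in for whatever the actual service advertises in its OpenSearch description document:

    import urllib.parse
    import urllib.request
    import xml.etree.ElementTree as ET

    ATOM = {"atom": "http://www.w3.org/2005/Atom"}

    def opensearch_granules(endpoint, bbox, start, end):
        """Issue a space/time OpenSearch query and return (title, link) pairs
        from the Atom response. Parameter names are illustrative only."""
        query = urllib.parse.urlencode({
            "bbox": ",".join(str(v) for v in bbox),   # west,south,east,north
            "startTime": start,
            "endTime": end,
        })
        with urllib.request.urlopen(f"{endpoint}?{query}") as resp:
            feed = ET.parse(resp).getroot()
        results = []
        for entry in feed.findall("atom:entry", ATOM):
            title = entry.findtext("atom:title", default="", namespaces=ATOM)
            link = entry.find("atom:link", ATOM)
            results.append((title, link.get("href") if link is not None else ""))
        return results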

  9. Fallon, Nevada FORGE Distinct Element Reservoir Modeling

    DOE Data Explorer

    Blankenship, Doug; Pettitt, Will; Riahi, Azadeh; Hazzard, Jim; Blanksma, Derrick

    2018-03-12

    Archive containing input/output data for distinct element reservoir modeling for Fallon FORGE. Models created using 3DEC, InSite, and in-house Python algorithms (ITASCA). List of archived files follows; please see 'Modeling Metadata.pdf' (included as a resource below) for additional file descriptions. Data sources include regional geochemical model, well positions and geometry, principal stress field, capability for hydraulic fractures, capability for hydro-shearing, reservoir geomechanical model-stimulation into multiple zones, modeled thermal behavior during circulation, and microseismicity.

  10. A Brief Introduction to Web-Based Note Capture

    ERIC Educational Resources Information Center

    Ovadia, Steven

    2012-01-01

    While physical notebooks and locally saved electronic files are certainly helpful, there are a number of web-based solutions that might be useful to someone conducting research online, or looking to hold their notes in a web-based environment. The main advantage of a web-based note capture tool is that one is able to access it from just about…

  11. Health Capability: Conceptualization and Operationalization

    PubMed Central

    2010-01-01

    Current theoretical approaches to bioethics and public health ethics propose varied justifications as the basis for health care and public health, yet none captures a fundamental reality: people seek good health and the ability to pursue it. Existing models do not effectively address these twin goals. The approach I espouse captures both of these orientations through a concept here called health capability. Conceptually, health capability illuminates the conditions that affect health and one's ability to make health choices. By respecting the health consequences individuals face and their health agency, health capability offers promise for finding a balance between paternalism and autonomy. I offer a conceptual model of health capability and present a health capability profile to identify and address health capability gaps. PMID:19965570

  12. Trapping self-propelled micromotors with microfabricated chevron and heart-shaped chips

    PubMed Central

    Restrepo-Pérez, Laura; Soler, Lluís; Martínez-Cisneros, Cynthia S.; Schmidt, Oliver G.

    2014-01-01

    We demonstrate that catalytic micromotors can be trapped in microfluidic chips containing chevron and heart-shaped structures. Despite the challenge presented by the reduced size of the traps, microfluidic chips with different trapping geometries can be fabricated via replica moulding. We prove that these microfluidic chips can capture micromotors without the need for any external mechanism to control their motion. PMID:24643940

  13. Data::Downloader

    NASA Technical Reports Server (NTRS)

    Duggan, Brian

    2012-01-01

    Downloading and organizing large numbers of files is challenging, and is often done using ad hoc methods. This software is capable of downloading and organizing files as an OpenSearch client. It can subscribe to RSS (Really Simple Syndication) feeds and Atom feeds containing arbitrary metadata, and maintains a local content addressable data store. It uses existing standards for obtaining the files, and uses efficient techniques for storing the files. Novel features include symbolic links to maintain a sane directory structure, checksums for validating file integrity during transfer and storage, and flexible use of server-provided metadata.
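
    The content-addressable store with symlinks and checksums described above can be sketched in a few lines; this is an illustration of the idea in Python, not the Perl module's actual behavior, and the directory layout is an arbitrary choice:

    import hashlib
    import shutil
    from pathlib import Path

    def store(src, data_root, link_root):
        """Copy a downloaded file into a content-addressable store keyed by its
        SHA-256 digest, then expose it under a human-readable path via a symlink.
        Digest-based storage keeps identical files only once, and the checksum
        doubles as an integrity check after transfer."""
        src = Path(src)
        digest = hashlib.sha256(src.read_bytes()).hexdigest()
        blob = Path(data_root) / digest[:2] / digest
        blob.parent.mkdir(parents=True, exist_ok=True)
        if not blob.exists():
            shutil.copy2(src, blob)
        link = Path(link_root) / src.name
        link.parent.mkdir(parents=True, exist_ok=True)
        if link.is_symlink() or link.exists():
            link.unlink()
        link.symlink_to(blob)
        return digest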

  14. Electronic photography at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Holm, Jack M.

    1994-01-01

    The field of photography began a metamorphosis several years ago which promises to fundamentally change how images are captured, transmitted, and output. At this time the metamorphosis is still in the early stages, but already new processes, hardware, and software are allowing many individuals and organizations to explore the entry of imaging into the information revolution. Exploration at this time is prerequisite to leading expertise in the future, and a number of branches at LaRC have ventured into electronic and digital imaging. Their progress until recently has been limited by two factors: the lack of an integrated approach and the lack of an electronic photographic capability. The purpose of the research conducted was to address these two items. In some respects, the lack of electronic photographs has prevented application of an integrated imaging approach. Since everything could not be electronic, the tendency was to work with hard copy. Over the summer, the Photographics Section has set up an Electronic Photography Laboratory. This laboratory now has the capability to scan film images, process the images, and output the images in a variety of forms. Future plans also include electronic capture capability. The current forms of image processing available include sharpening, noise reduction, dust removal, tone correction, color balancing, image editing, cropping, electronic separations, and halftoning. Output choices include customer specified electronic file formats which can be output on magnetic or optical disks or over the network, 4400 line photographic quality prints and transparencies to 8.5 by 11 inches, and 8000 line film negatives and transparencies to 4 by 5 inches. The problem of integrated imaging involves a number of branches at LaRC including Visual Imaging, Research Printing and Publishing, Data Visualization and Animation, Advanced Computing, and various research groups. These units must work together to develop common approaches to image processing and archiving. The ultimate goal is to be able to search for images using an on-line database and image catalog. These images could then be retrieved over the network as needed, along with information on the acquisition and processing prior to storage. For this goal to be realized, a number of standard processing protocols must be developed to allow the classification of images into categories. Standard series of processing algorithms can then be applied to each category (although many of these may be adaptive between images). Since the archived image files would be standardized, it should also be possible to develop standard output processing protocols for a number of output devices. If LaRC continues the research effort begun this summer, it may be one of the first organizations to develop an integrated approach to imaging. As such, it could serve as a model for other organizations in government and the private sector.

  15. New research discovery may mean less radioactive contamination, safer nuclear power plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murph, S.

    Murph has now made another nanoparticle breakthrough that could benefit various work environments such as nuclear power plants. Murph and her team have created nanoparticle-treated stainless steel filters that are capable of capturing radioactive vapor materials. Just like air filters capture dust and dirt, these filters are capable of capturing large amounts of radioactive vapors. The new research may one day mean that nuclear power plant workers, and other workers in related fields, will have a safer working environment.

  16. Scanning electron microscope automatic defect classification of process induced defects

    NASA Astrophysics Data System (ADS)

    Wolfe, Scott; McGarvey, Steve

    2017-03-01

    With the integration of high speed Scanning Electron Microscope (SEM) based Automated Defect Redetection (ADR) in both high volume semiconductor manufacturing and Research and Development (R and D), the need for reliable SEM Automated Defect Classification (ADC) has grown tremendously in the past few years. In many high volume manufacturing facilities and R and D operations, defect inspection is performed on EBeam (EB), Bright Field (BF) or Dark Field (DF) defect inspection equipment. A comma separated value (CSV) file is created by both the patterned and non-patterned defect inspection tools. The defect inspection result file contains a list of the inspection anomalies detected during the inspection tools' examination of each structure, or the examination of an entire wafer's surface for non-patterned applications. This file is imported into the Defect Review Scanning Electron Microscope (DRSEM). Following the defect inspection result file import, the DRSEM automatically moves the wafer to each defect coordinate and performs ADR. During ADR the DRSEM operates in a reference mode, capturing a SEM image at the exact position of each anomaly's coordinates and capturing a SEM image of a reference location in the center of the wafer. A defect reference image is created by subtracting the defect image from the reference image. The exact coordinates of the defect are calculated from the computed defect position and the stage coordinates recorded when the high magnification SEM defect image is captured. The captured SEM image is processed through either DRSEM ADC binning, export to a Yield Analysis System (YAS), or a combination of both. Process Engineers, Yield Analysis Engineers or Failure Analysis Engineers manually review the captured images to ensure that either the YAS defect binning or the DRSEM defect binning is accurately classifying the defects. This paper explores the feasibility of using a Hitachi RS4000 Defect Review SEM to perform Automatic Defect Classification, with the objective of achieving total automated classification accuracy greater than human-based defect classification binning when the defects do not require knowledge of multiple process steps for accurate classification. The implementation of DRSEM ADC has the potential to improve the response time between defect detection and defect classification. Faster defect classification will allow for rapid response to the yield anomalies that would otherwise reduce wafer and/or die yield.
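
    The reference-mode redetection step, comparing the defect-site image against a defect-free reference image, can be illustrated with a simple difference-and-centroid calculation; the threshold and data types below are arbitrary choices for the sketch, not the DRSEM's actual algorithm:

    import numpy as np

    def locate_defect(defect_img, reference_img, threshold=30):
        """Subtract a defect-free reference image from the defect-site image,
        threshold the absolute difference, and return the centroid of the
        anomalous pixels (row, col), or None if nothing exceeds the threshold.
        Inputs are assumed to be aligned 8-bit grayscale arrays."""
        diff = np.abs(defect_img.astype(np.int16) - reference_img.astype(np.int16))
        mask = diff > threshold
        if not mask.any():
            return None
        rows, cols = np.nonzero(mask)
        return rows.mean(), cols.mean()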

  17. Efficient, quality-assured data capture in operational research through innovative use of open-access technology

    PubMed Central

    Naik, B.; Guddemane, D. K.; Bhat, P.; Wilson, N.; Sreenivas, A. N.; Lauritsen, J. M.; Rieder, H. L.

    2013-01-01

    Ensuring quality of data during electronic data capture has been one of the most neglected components of operational research. Multicentre studies are also challenged with issues about logistics of travel, training, supervision, monitoring and troubleshooting support. Allocating resources to these issues can pose a significant bottleneck for operational research in resource-limited settings. In this article, we describe an innovative and efficient way of coordinating data capture in multicentre operational research using a combination of three open access technologies—EpiData for data capture, Dropbox for sharing files and TeamViewer for providing remote support. PMID:26392997

  18. Operating CFDP in the Interplanetary Internet

    NASA Technical Reports Server (NTRS)

    Burleigh, S.

    2002-01-01

    This paper examines the design elements of CCSDS File Delivery Protocol and Interplanetary Internet technologies that will simplify their integration and discusses the resulting new capabilities, such as efficient transmission of large files via multiple relay satellites operating in parallel.

  19. Trick Simulation Environment 07

    NASA Technical Reports Server (NTRS)

    Lin, Alexander S.; Penn, John M.

    2012-01-01

    The Trick Simulation Environment is a generic simulation toolkit used for constructing and running simulations. This release includes a Monte Carlo analysis simulation framework and a data analysis package. It produces all auto documentation in XML. Also, the software is capable of inserting a malfunction at any point during the simulation. Trick 07 adds variable server output options and error messaging, and is capable of using and manipulating wide characters for international support. Wide character strings are available as a fundamental type for variables processed by Trick. A Trick Monte Carlo simulation uses a statistically generated, or predetermined, set of inputs to iteratively drive the simulation. Also, there is a framework in place for optimization and solution finding where developers may iteratively modify the inputs per run based on some analysis of the outputs. The data analysis package is capable of reading data from external simulation packages such as MATLAB and Octave, as well as the common comma-separated values (CSV) format used by Excel, without the use of external converters. The file formats for MATLAB and Octave were obtained from their documentation sets, and Trick maintains generic file readers for each format. XML tags store the fields in the Trick header comments. For header files, XML tags for structures and enumerations, and the members within, are stored in the auto documentation. For source code files, XML tags for each function and the calling arguments are stored in the auto documentation. When a simulation is built, a top level XML file, which includes all of the header and source code XML auto documentation files, is created in the simulation directory. Trick 07 provides an XML to TeX converter. The converter reads in header and source code XML documentation files and converts the data to TeX labels and tables suitable for inclusion in TeX documents. A malfunction insertion capability allows users to override the value of any simulation variable, or call a malfunction job, at any time during the simulation. Users may specify conditions, use the return value of a malfunction trigger job, or manually activate a malfunction. The malfunction action may consist of executing a block of input file statements in an action block, setting simulation variable values, calling a malfunction job, or turning simulation jobs on or off.

  20. 76 FR 51002 - Marine Mammals; File No. 16553

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-17

    ... californianus), northern elephant seals (Mirounga angustirostris), and harbor seals (Phoca vitulina). DATES.... California sea lions, northern elephant seals, and harbor seals would be captured and sampled at several...

  1. 77 FR 5493 - Marine Mammals; File No. 15802

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-03

    ... applicant also seeks authorization to capture green (Chelonia mydas), hawksbill (Eretmochelys imbricata... caretta) sea turtles. Sea turtles would be measured, photographed, and released. On December 14, 2011 (76...

  2. JSATS Detector Field Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, Eric Y.; Flory, Adam E.; Lamarche, Brian L.

    2014-06-01

    The Juvenile Salmon Acoustic Telemetry System (JSATS) Detector is a software and hardware system that captures JSATS Acoustic Micro Transmitter (AMT) signals. The system uses hydrophones to capture acoustic signals in the water. This analog signal is then amplified and processed by the Analog to Digital Converter (ADC) and Digital Signal Processor (DSP) board in the computer. This board digitizes and processes the acoustic signal to determine if a possible JSATS tag is present. When a tag is detected, the data are saved to the computer for further analysis. This document details the features and functionality of the JSATS Detector software. It covers how to install, set up, and run the detector software. The document also describes the raw binary waveform file format and the CSV files containing RMS values.
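
    As an illustration of the kind of post-processing the detector output supports, the sketch below computes windowed RMS values from a raw waveform file; the 16-bit sample format and window length are assumptions for the example, since the actual file layout is defined by the JSATS Detector software:

    import numpy as np

    def windowed_rms(waveform_path, window=1024, dtype=np.int16):
        """Compute RMS values over fixed-size windows of a raw waveform file.
        Sample format and window length are illustrative assumptions."""
        samples = np.fromfile(waveform_path, dtype=dtype).astype(np.float64)
        n = (samples.size // window) * window
        blocks = samples[:n].reshape(-1, window)
        return np.sqrt((blocks ** 2).mean(axis=1))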

  3. 76 FR 68719 - Marine Mammals; File No. 16553

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-07

    ...), northern elephant seals (Mirounga angustirostris), and harbor seals (Phoca vitulina). ADDRESSES: The permit... species. California sea lions, northern elephant seals, and harbor seals may be captured and sampled at...

  4. Recent Developments in Toxico-Cheminformatics: A New ...

    EPA Pesticide Factsheets

    Efforts to improve public access to chemical toxicity information resources, coupled with new high-throughput screening (HTS) data and efforts to systematize legacy toxicity studies, have the potential to significantly improve predictive capabilities in toxicology. Important recent developments include: 1) large and growing public resources that link chemical structures to biological activity and toxicity data in searchable format, and that offer more nuanced and varied representations of activity; 2) standardized relational data models that capture relevant details of chemical treatment and effects of published in vivo experiments; and 3) the generation of large amounts of new data from public efforts that are employing HTS technologies to probe a wide range of bioactivity and cellular processes across large swaths of chemical space. Most recently, EPA’s DSSTox project has published several new EPA chemical data inventories (IRIS, HPV, ToxCast) and added an on-line capability for structure (substructure or similarity)-searching through all or parts of the published DSSTox data files. These efforts are, for the first time in many cases, opening up a structure-paved two-way highway between previously inaccessible or isolated public chemical data repositories and large public resources, such as PubChem. In addition, public initiatives (such as ToxML) are developing systematized data models of toxicity study areas, and introducing standardized templates, contr

  5. Crew procedures and workload of retrofit concepts for microwave landing system

    NASA Technical Reports Server (NTRS)

    Summers, Leland G.; Jonsson, Jon E.

    1989-01-01

    Crew procedures and workload for Microwave Landing Systems (MLS) that could be retrofitted into existing transport aircraft were evaluated. Two MLS receiver concepts were developed. One is capable of capturing a runway centerline and the other is capable of capturing a segmented approach path. Crew procedures were identified and crew task analyses were performed using each concept. Crew workload comparisons were made between the MLS concepts and an ILS baseline using a task-timeline workload model. Workload indexes were obtained for each scenario. The results showed that workload was comparable to the ILS baseline for the MLS centerline capture concept, but significantly higher for the segmented path capture concept.

  6. Creating and Searching a Local Inventory for Data Granules in a Remote Archive

    NASA Astrophysics Data System (ADS)

    Cornillon, P. C.

    2016-12-01

    More often than not, search capabilities for network-accessible data do not exist or do not meet the requirements of the user. For large archives this can make finding data of interest tedious at best. This summer, the author encountered such a problem with regard to the two existing archives of VIIRS L2 sea surface temperature (SST) fields obtained with the new ACSPO retrieval algorithm; one at the Jet Propulsion Laboratory's PO-DAAC and the other at NOAA's National Centers for Environmental Information (NCEI). In both cases the data were available via FTP and OPeNDAP, but there was no search capability at the PO-DAAC and the NCEI archive was incomplete. Furthermore, in order to meet the needs of a broad range of datasets and users, the beta version of the search engine at NCEI was cumbersome for the searches of interest. Although some of these problems have been resolved since (and may be described in other posters/presentations at this meeting), the solution described in this presentation offers the user the ability to develop a search capability for archives lacking a search capability and/or to configure searches more to his or her preferences than the generic searches offered by the data provider. The solution, a Matlab script, used HTML access to the PO-DAAC web site to locate all VIIRS 10 minute granules and OPeNDAP access to acquire the bounding box for each granule from the metadata bound to the file. This task required several hours of wall time to acquire the data and to write the bounding boxes to a local file with the associated FTP and OPeNDAP URLs for the 110,000+ granule archive. A second Matlab script searched the local archive, in seconds, for granules falling in a user-defined space-time window, and an ASCII file of wget commands associated with these granules was generated. This file was then executed to acquire the data of interest. The wget commands can be configured to acquire the entire files via FTP or a subset of each file via OPeNDAP. Furthermore, the search capability, based on bounding boxes and rectangular regions, could easily be modified to further refine the search. Finally, the script that builds the inventory has been designed to update the local inventory, taking minutes per month rather than hours.
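
    The original inventory and search were written in Matlab; the sketch below expresses the same search step in Python against an assumed CSV inventory layout (per-granule start/end time, bounding box, and URL), emitting one wget command per matching granule. Column names, times, and the example region are placeholders:

    import csv

    def search_inventory(inventory_csv, bbox, t0, t1, out_script="get_granules.sh"):
        """Scan a local inventory of granule bounding boxes (assumed columns:
        start_time, end_time, west, south, east, north, url) and write a wget
        command for every granule intersecting the requested space-time window.
        Times are compared as ISO-8601 strings; longitude wrap is not handled."""
        west, south, east, north = bbox
        with open(inventory_csv, newline="") as inv, open(out_script, "w") as out:
            for row in csv.DictReader(inv):
                if row["end_time"] < t0 or row["start_time"] > t1:
                    continue                      # outside the time window
                if float(row["east"]) < west or float(row["west"]) > east:
                    continue                      # outside in longitude
                if float(row["north"]) < south or float(row["south"]) > north:
                    continue                      # outside in latitude
                out.write(f"wget {row['url']}\n")

    # Example (assumed file and window):
    # search_inventory("viirs_inventory.csv", (-72, 35, -60, 45),
    #                  "2016-07-01T00:00:00Z", "2016-07-08T00:00:00Z")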

  7. Thermal-neutron capture gamma-rays. Volume 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuli, J.K.

    1997-05-01

    The energy and photon intensity of gamma rays as seen in thermal-neutron capture are presented ordered by Z, A of target nuclei. All gamma-rays with intensity of ≥2% of the strongest transition are included. The strongest transition is indicated in each case. Where the target nuclide mass number is indicated as 'nat', the natural target was used. The gamma energies given are in keV. The gamma intensities given are relative to 100 for the strongest transition. All data for A > 44 are taken from Evaluated Nuclear Structure Data File (4/97), a computer file of evaluated nuclear structure data maintained by the National Nuclear Data Center, Brookhaven National Laboratory, on behalf of the Nuclear Structure and Decay and Decay Data network, coordinated by the International Atomic Energy Agency, Vienna. These data are published in Nuclear Data Sheets, Academic Press, San Diego, CA. The data for A ≤ 44 are taken from "Prompt Gamma Rays from Thermal-Neutron Capture," M.A. Lone, R.A. Leavitt, D.A. Harrison, Atomic Data and Nuclear Data Tables 26, 511 (1981).

  8. Thermal-neutron capture gamma-rays. Volume 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuli, J.K.

    1997-05-01

    The energy and photon intensity of gamma rays as seen in thermal-neutron capture are presented in ascending order of gamma energy. All those gamma-rays with intensity of ≥2% of the strongest transition are included. The two strongest transitions seen for the target nuclide are indicated in each case. Where the target nuclide mass number is indicated as 'nat', the natural target was used. The gamma energies given are in keV. The gamma intensities given are relative to 100 for the strongest transition. All data for A > 44 are taken from Evaluated Nuclear Structure Data File (4/97), a computer file of evaluated nuclear structure data maintained by the National Nuclear Data Center, Brookhaven National Laboratory, on behalf of the Nuclear Structure and Decay and Decay Data network, coordinated by the International Atomic Energy Agency, Vienna. These data are published in Nuclear Data Sheets, Academic Press, San Diego, CA. The data for A ≤ 44 are taken from "Prompt Gamma Rays from Thermal-Neutron Capture," M.A. Lone, R.A. Leavitt, D.A. Harrison, Atomic Data and Nuclear Data Tables 26, 511 (1981).

  9. ModelMate - A graphical user interface for model analysis

    USGS Publications Warehouse

    Banta, Edward R.

    2011-01-01

    ModelMate is a graphical user interface designed to facilitate use of model-analysis programs with models. This initial version of ModelMate supports one model-analysis program, UCODE_2005, and one model software program, MODFLOW-2005. ModelMate can be used to prepare input files for UCODE_2005, run UCODE_2005, and display analysis results. A link to the GW_Chart graphing program facilitates visual interpretation of results. ModelMate includes capabilities for organizing directories used with the parallel-processing capabilities of UCODE_2005 and for maintaining files in those directories to be identical to a set of files in a master directory. ModelMate can be used on its own or in conjunction with ModelMuse, a graphical user interface for MODFLOW-2005 and PHAST.

  10. 75 FR 81601 - North American Electric Reliability Corporation; Notice of Compliance Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-28

    ... Electric Reliability Corporation; Notice of Compliance Filing December 20, 2010. Take notice that on December 1, 2010, North American Electric Reliability Corporation, in response to Paragraph 274 of the... Transfer Capability Reliability Standards. \\1\\ Mandatory Reliability Standards for the Calculation of...

  11. Network issues for large mass storage requirements

    NASA Technical Reports Server (NTRS)

    Perdue, James

    1992-01-01

    File Servers and Supercomputing environments need high performance networks to balance the I/O requirements seen in today's demanding computing scenarios. UltraNet is one solution which permits both high aggregate transfer rates and high task-to-task transfer rates as demonstrated in actual tests. UltraNet provides this capability as both a Server-to-Server and Server-to-Client access network, giving the supercomputing center the following advantages: highest-performance Transport Level connections (up to 40 MBytes/sec effective rates); matches the throughput of the emerging high performance disk technologies, such as RAID, parallel head transfer devices and software striping; supports standard network and file system applications using a sockets-based application program interface such as FTP, rcp, rdump, etc.; supports access to the Network File System (NFS) and large aggregate bandwidth for large NFS usage; provides access to a distributed, hierarchical data server capability using the DISCOS UniTree product; supports file server solutions available from multiple vendors, including Cray, Convex, Alliant, FPS, IBM, and others.

  12. Production data in media systems and press front ends: capture, formats and database methods

    NASA Astrophysics Data System (ADS)

    Karttunen, Simo

    1997-02-01

    The nature, purpose and data presentation features of media jobs are analyzed in relation to the content, document, process and resource management in media production. Formats are the natural way of presenting, collecting and storing information, contents, document components and final documents. The state of the art and the trends in the media formats and production data are reviewed. The types and the amount of production data are listed, e.g. events, schedules, product descriptions, reports, visual support, quality, process states and color data. The data exchange must be vendor-neutral. Adequate infrastructure and system architecture are defined for production and media data. The roles of open servers and intranets are evaluated and their potential roles as future solutions are anticipated. The press frontend is the part of print media production where large files dominate. The new output alternatives, i.e. film recorders, direct plate output (CTP and CTP-on-press) and digital, plateless printing lines, need new workflow tools and very efficient file and format management. The paper analyzes the capture, formatting and storing of job files and respective production data, such as the event logs of the processes. Intranets, browsers, Java applets and open web servers will be used to capture production data, especially where intranets are used anyhow, or where several companies are networked to plan, design and use documents and printed products. The user aspects of installing intranets are stressed since there are numerous more traditional and more dedicated networking solutions on the market.

  13. Digital photography for the light microscope: results with a gated, video-rate CCD camera and NIH-image software.

    PubMed

    Shaw, S L; Salmon, E D; Quatrano, R S

    1995-12-01

    In this report, we describe a relatively inexpensive method for acquiring, storing and processing light microscope images that combines the advantages of video technology with the powerful medium now termed digital photography. Digital photography refers to the recording of images as digital files that are stored, manipulated and displayed using a computer. This report details the use of a gated video-rate charge-coupled device (CCD) camera and a frame grabber board for capturing 256 gray-level digital images from the light microscope. This camera gives high-resolution bright-field, phase contrast and differential interference contrast (DIC) images but, also, with gated on-chip integration, has the capability to record low-light level fluorescent images. The basic components of the digital photography system are described, and examples are presented of fluorescence and bright-field micrographs. Digital processing of images to remove noise, to enhance contrast and to prepare figures for printing is discussed.
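
    The benefit of gated on-chip integration for low-light fluorescence can be imitated in software by averaging frames of a static scene, which reduces uncorrelated sensor noise by roughly the square root of the number of frames; a minimal sketch, not the system described in the report:

    import numpy as np

    def integrate_frames(frames):
        """Average a stack of 8-bit video frames of a static, low-light scene.
        Averaging N frames reduces uncorrelated sensor noise by about sqrt(N),
        the same benefit gated on-chip integration provides in hardware."""
        stack = np.stack([f.astype(np.float64) for f in frames])
        return np.clip(stack.mean(axis=0), 0, 255).astype(np.uint8)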

  14. Integration experiences and performance studies of A COTS parallel archive systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Hsing-bung; Scott, Cody; Grider, Gary

    2010-01-01

    Current and future Archive Storage Systems have been asked to (a) scale to very high bandwidths, (b) scale in metadata performance, (c) support policy-based hierarchical storage management capability, (d) scale in supporting changing needs of very large data sets, (e) support standard interface, and (f) utilize commercial-off-the-shelf (COTS) hardware. Parallel file systems have been asked to do the same thing but at one or more orders of magnitude faster in performance. Archive systems continue to move closer to file systems in their design due to the need for speed and bandwidth, especially metadata searching speeds such as more caching and less robust semantics. Currently the number of extreme highly scalable parallel archive solutions is very small especially those that will move a single large striped parallel disk file onto many tapes in parallel. We believe that a hybrid storage approach of using COTS components and innovative software technology can bring new capabilities into a production environment for the HPC community much faster than the approach of creating and maintaining a complete end-to-end unique parallel archive software solution. In this paper, we relay our experience of integrating a global parallel file system and a standard backup/archive product with a very small amount of additional code to provide a scalable, parallel archive. Our solution has a high degree of overlap with current parallel archive products including (a) doing parallel movement to/from tape for a single large parallel file, (b) hierarchical storage management, (c) ILM features, (d) high volume (non-single parallel file) archives for backup/archive/content management, and (e) leveraging all free file movement tools in Linux such as copy, move, ls, tar, etc. We have successfully applied our working COTS Parallel Archive System to the current world's first petaflop/s computing system, LANL's Roadrunner, and demonstrated its capability to address requirements of future archival storage systems.
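
    The core trick, moving one large striped file through several movers at once, can be illustrated with a toy striping copy; the stripe size and destination layout are arbitrary choices, standing in for the real parallel file system and tape movers:

    import concurrent.futures
    import os

    STRIPE = 64 * 1024 * 1024   # 64 MiB stripes (illustrative)

    def copy_stripe(src, dst_dir, index, offset, length):
        """Copy one stripe of a large file into its own object in dst_dir."""
        dst = os.path.join(dst_dir, f"stripe_{index:06d}")
        with open(src, "rb") as fin, open(dst, "wb") as fout:
            fin.seek(offset)
            fout.write(fin.read(length))
        return dst

    def parallel_archive(src, dst_dirs, workers=8):
        """Move a single large file to several destinations in parallel by
        striping it, a toy version of driving multiple tape movers at once."""
        size = os.path.getsize(src)
        jobs = []
        with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
            for i, offset in enumerate(range(0, size, STRIPE)):
                dst_dir = dst_dirs[i % len(dst_dirs)]
                length = min(STRIPE, size - offset)
                jobs.append(pool.submit(copy_stripe, src, dst_dir, i, offset, length))
        return [j.result() for j in jobs]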

  15. Integration experiments and performance studies of a COTS parallel archive system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Hsing-bung; Scott, Cody; Grider, Gary

    2010-06-16

    Current and future Archive Storage Systems have been asked to (a) scale to very high bandwidths, (b) scale in metadata performance, (c) support policy-based hierarchical storage management capability, (d) scale in supporting changing needs of very large data sets, (e) support standard interfaces, and (f) utilize commercial-off-the-shelf (COTS) hardware. Parallel file systems have been asked to do the same thing but at one or more orders of magnitude faster in performance. Archive systems continue to move closer to file systems in their design due to the need for speed and bandwidth, especially metadata searching speeds, such as more caching and less robust semantics. Currently the number of extremely scalable parallel archive solutions is very small, especially those that will move a single large striped parallel disk file onto many tapes in parallel. We believe that a hybrid storage approach of using COTS components and innovative software technology can bring new capabilities into a production environment for the HPC community much faster than the approach of creating and maintaining a complete end-to-end unique parallel archive software solution. In this paper, we relay our experience of integrating a global parallel file system and a standard backup/archive product with a very small amount of additional code to provide a scalable, parallel archive. Our solution has a high degree of overlap with current parallel archive products, including (a) doing parallel movement to/from tape for a single large parallel file, (b) hierarchical storage management, (c) ILM features, (d) high volume (non-single parallel file) archives for backup/archive/content management, and (e) leveraging all free file movement tools in Linux such as copy, move, ls, tar, etc. We have successfully applied our working COTS Parallel Archive System to the current world's first petaflop/s computing system, LANL's Roadrunner machine, and demonstrated its capability to address requirements of future archival storage systems.

  16. Method and system for capturing carbon dioxide and/or sulfur dioxide from gas stream

    DOEpatents

    Chang, Shih-Ger; Li, Yang; Zhao, Xinglei

    2014-07-08

    The present invention provides a system for capturing CO2 and/or SO2, comprising: (a) a CO2 and/or SO2 absorber comprising an amine and/or amino acid salt capable of absorbing the CO2 and/or SO2 to produce a CO2- and/or SO2-containing solution; (b) an amine regenerator to regenerate the amine and/or amino acid salt; and, when the system captures CO2, (c) an alkali metal carbonate regenerator comprising an ammonium catalyst capable of catalyzing the conversion of aqueous alkali metal bicarbonate into the alkali metal carbonate and CO2 gas. The present invention also provides for a system for capturing SO2, comprising: (a) an SO2 absorber comprising aqueous alkali metal carbonate, wherein the alkali metal carbonate is capable of absorbing the SO2 to produce an alkali metal sulfite/sulfate precipitate and CO2.

  17. Studies of an extensively axisymmetric rocket based combined cycle (RBCC) engine powered single-stage-to-orbit (SSTO) vehicle

    NASA Technical Reports Server (NTRS)

    Foster, Richard W.; Escher, William J. D.; Robinson, John W.

    1989-01-01

    The present comparative performance study has established that rocket-based combined cycle (RBCC) propulsion systems, when incorporated by essentially axisymmetric SSTO launch vehicle configurations whose conical forebody maximizes both capture-area ratio and total capture area, are capable of furnishing payload-delivery capabilities superior to those of most multistage, all-rocket launchers. Airbreathing thrust augmentation in the rocket-ejector mode of an RBCC powerplant is noted to make a major contribution to final payload capability, by comparison to nonair-augmented rocket engine propulsion systems.

  18. An overview of the catalog manager

    NASA Technical Reports Server (NTRS)

    Irani, Frederick M.

    1986-01-01

    The Catalog Manager (CM) is being used at the Goddard Space Flight Center in conjunction with the Land Analysis System (LAS) running under the Transportable Applications Executive (TAE). CM maintains a catalog of file names for all users of the LAS system. The catalog provides a cross-reference between TAE user file names and fully qualified host-file names. It also maintains information about the content and status of each file. A brief history of CM development is given and a description of naming conventions, catalog structure and file attributes, and archive/retrieve capabilities is presented. General user operation and the LAS user scenario are also discussed.

  19. Demonstration of New OLAF Capabilities and Technologies

    NASA Astrophysics Data System (ADS)

    Kingston, C.; Palmer, E.; Stone, J.; Neese, C.; Mueller, B.

    2017-06-01

    Upgrades to the On-Line Archiving Facility (OLAF) PDS tool are leading to improved usability and additional functionality by integration of JavaScript web app frameworks. Also included is the capability to upload tabular data as CSV files.

  20. Securing Sensitive Flight and Engine Simulation Data Using Smart Card Technology

    NASA Technical Reports Server (NTRS)

    Blaser, Tammy M.

    2003-01-01

    NASA Glenn Research Center has developed a smart card prototype capable of encrypting and decrypting disk files required to run a distributed aerospace propulsion simulation. Triple Data Encryption Standard (3DES) encryption is used to secure the sensitive intellectual property on disk before, during, and after simulation execution. The prototype operates as a secure system and maintains its authorized state by safely storing and permanently retaining the encryption keys only on the smart card. The prototype is capable of authenticating a single smart card user and includes pre-simulation and post-simulation tools for analysis and training purposes. The prototype's design is highly generic and can be used to protect any sensitive disk files, with growth capability to run multiple simulations. The NASA computer engineer developed the prototype in an interoperable programming environment to enable porting to other Numerical Propulsion System Simulation (NPSS) capable operating system environments.
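
    A minimal sketch of 3DES file encryption of the kind described, using the pycryptodome library; in the prototype the key never leaves the smart card, whereas here key handling is only indicated in a comment, and the output file naming is an arbitrary choice.

        from Crypto.Cipher import DES3
        from Crypto.Random import get_random_bytes
        from Crypto.Util.Padding import pad, unpad

        def encrypt_file(path, key):
            # Encrypt a disk file with 3DES in CBC mode; the IV is prepended.
            iv = get_random_bytes(8)
            cipher = DES3.new(key, DES3.MODE_CBC, iv)
            with open(path, "rb") as f:
                data = f.read()
            with open(path + ".3des", "wb") as f:
                f.write(iv + cipher.encrypt(pad(data, DES3.block_size)))

        def decrypt_file(path, key):
            # Recover the plaintext from a file produced by encrypt_file.
            with open(path, "rb") as f:
                iv, data = f.read(8), f.read()
            cipher = DES3.new(key, DES3.MODE_CBC, iv)
            return unpad(cipher.decrypt(data), DES3.block_size)

        # In the prototype the 24-byte key would come from the smart card;
        # a random key stands in here for illustration only.
        key = DES3.adjust_key_parity(get_random_bytes(24))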

  1. HDF4 Maps: For Now and For the Future

    NASA Astrophysics Data System (ADS)

    Plutchak, J.; Aydt, R.; Folk, M. J.

    2013-12-01

    Data formats and access tools necessarily change as technology improves to address emerging requirements with new capabilities. This on-going process inevitably leaves behind significant data collections in legacy formats that are difficult to support and sustain. NASA ESDIS and The HDF Group currently face this problem with large and growing archives of data in HDF4, an older version of the HDF format. Indefinitely guaranteeing the ability to read these data with multi-platform libraries in many languages is very difficult. As an alternative, HDF and NASA worked together to create maps of the files that contain metadata and information about data types, locations, and sizes of data objects in the files. These maps are written in XML and have successfully been used to access and understand data in HDF4 files without the HDF libraries. While originally developed to support sustainable access to these data, these maps can also be used to provide access to HDF4 metadata, facilitate user understanding of files prior to download, and validate the files for compliance with particular conventions. These capabilities are now available as a service for HDF4 archives and users.
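
    As a rough illustration of how such a map might be used, the sketch below reads a dataset's raw bytes directly from an HDF4 file given only byte offsets recorded in an XML map; the element and attribute names (name, offset, length) are assumptions for illustration and do not reflect the actual HDF4 map schema.

        import xml.etree.ElementTree as ET

        def read_object_from_map(map_xml, hdf4_path, object_name):
            # Locate a dataset's byte offset and length in an XML map and read
            # the raw bytes without linking against the HDF4 library.
            root = ET.parse(map_xml).getroot()
            for obj in root.iter():
                if obj.get("name") == object_name and obj.get("offset"):
                    offset, nbytes = int(obj.get("offset")), int(obj.get("length"))
                    with open(hdf4_path, "rb") as f:
                        f.seek(offset)
                        return f.read(nbytes)
            raise KeyError(object_name)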

  2. An overview of the National Space Science data Center Standard Information Retrieval System (SIRS)

    NASA Technical Reports Server (NTRS)

    Shapiro, A.; Blecher, S.; Verson, E. E.; King, M. L. (Editor)

    1974-01-01

    A general overview is given of the National Space Science Data Center (NSSDC) Standard Information Retrieval System. A description is given, in general terms, of the information system that contains the data files and of the software system that processes and manipulates the files maintained at the Data Center. Emphasis is placed on providing users with an overview of the capabilities and uses of the NSSDC Standard Information Retrieval System (SIRS). Examples given are taken from the files at the Data Center. Detailed information about NSSDC data files is documented in a set of File Users Guides, with one user's guide prepared for each file processed by SIRS. Detailed information about SIRS is presented in the SIRS Users Guide.

  3. GWM-VI: groundwater management with parallel processing for multiple MODFLOW versions

    USGS Publications Warehouse

    Banta, Edward R.; Ahlfeld, David P.

    2013-01-01

    Groundwater Management–Version Independent (GWM–VI) is a new version of the Groundwater Management Process of MODFLOW. The Groundwater Management Process couples groundwater-flow simulation with a capability to optimize stresses on the simulated aquifer based on an objective function and constraints imposed on stresses and aquifer state. GWM–VI extends prior versions of Groundwater Management in two significant ways—(1) it can be used with any version of MODFLOW that meets certain requirements on input and output, and (2) it is structured to allow parallel processing of the repeated runs of the MODFLOW model that are required to solve the optimization problem. GWM–VI uses the same input structure for files that describe the management problem as that used by prior versions of Groundwater Management. GWM–VI requires only minor changes to the input files used by the MODFLOW model. GWM–VI uses the Joint Universal Parameter IdenTification and Evaluation of Reliability Application Programming Interface (JUPITER-API) to implement both version independence and parallel processing. GWM–VI communicates with the MODFLOW model by manipulating certain input files and interpreting results from the MODFLOW listing file and binary output files. Nearly all capabilities of prior versions of Groundwater Management are available in GWM–VI. GWM–VI has been tested with MODFLOW-2005, MODFLOW-NWT (a Newton formulation for MODFLOW-2005), MF2005-FMP2 (the Farm Process for MODFLOW-2005), SEAWAT, and CFP (Conduit Flow Process for MODFLOW-2005). This report provides sample problems that demonstrate a range of applications of GWM–VI and the directory structure and input information required to use the parallel-processing capability.
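
    The parallel-processing idea described above, repeated MODFLOW runs driven by manipulating input files and reading the listing file back, can be sketched generically as below; the executable name (mf2005), file names, and the objective parser are hypothetical placeholders, not GWM-VI or JUPITER-API code.

        import shutil, subprocess
        from concurrent.futures import ProcessPoolExecutor
        from pathlib import Path

        def parse_objective(listing_text):
            # Placeholder: extract whatever quantity the optimizer needs
            # from the MODFLOW listing file.
            return listing_text.count("CONVERGED")

        def run_modflow_case(case):
            # Copy a template model, patch a stress value into an input file,
            # run MODFLOW, and read the listing file back.
            case_id, pumping_rate = case
            workdir = Path(f"run_{case_id:04d}")
            shutil.copytree("template_model", workdir)
            wel = workdir / "model.wel"
            wel.write_text(wel.read_text().replace("{RATE}", f"{pumping_rate:.3f}"))
            subprocess.run(["mf2005", "model.nam"], cwd=workdir, check=True)
            return case_id, parse_objective((workdir / "model.lst").read_text())

        if __name__ == "__main__":
            cases = [(i, -500.0 * (i + 1)) for i in range(8)]
            with ProcessPoolExecutor() as pool:
                results = dict(pool.map(run_modflow_case, cases))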

  4. Use of Grasp Force Focus Positioning to Enhance the Torque Resistance Capability of Robotic Grasps

    DTIC Science & Technology

    1990-12-13

  5. Program Description: Financial Master File Processor-SWRL Financial System.

    ERIC Educational Resources Information Center

    Ideda, Masumi

    Computer routines designed to produce various management and accounting reports required by the Southwest Regional Laboratory's (SWRL) Financial System are described. Input data requirements and output report formats are presented together with a discussion of the Financial Master File updating capabilities of the system. This document should be…

  6. 47 CFR 1.7001 - Scope and content of filed reports.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Telecommunications Capability Data § 1.7001 Scope and content of filed reports. (a) Definitions. Terms used in this... services over their own facilities or over Unbundled Network Elements (UNEs), special access lines, and other leased lines and wireless channels that the entity obtains from a communications service provider...

  7. Student-Built Underwater Video and Data Capturing Device

    NASA Astrophysics Data System (ADS)

    Whitt, F.

    2016-12-01

    The Stockbridge High School Robotics Team's invention is a low-cost underwater video and data capturing device. This system is capable of shooting time-lapse photography and/or video for up to 3 days at a time. It can be used in remote locations without having to change batteries or add external hard drives for data storage. The video capturing device has a unique base and mounting system which houses a Pi Drive and a programmable Raspberry Pi with a camera module. This system is powered by two 12 volt batteries, which makes it easier for users to recharge after use. Our data capturing device has the same unique base and mounting system as the underwater camera. The data capturing device consists of an Arduino and SD card shield that is capable of collecting continuous temperature and pH readings underwater. This data is then logged onto the SD card for easy access and recording. The low-cost underwater video and data capturing device can reach depths up to 100 meters while recording 36 hours of video on 1 terabyte of storage. It also features night vision infrared light capabilities. The cost to build our invention is $500. The goal of this project was to provide a device that can easily be accessed by marine biologists, teachers, researchers and citizen scientists to capture photographic and water quality data in marine environments over extended periods of time.
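
    A minimal sketch of the Raspberry Pi time-lapse side of such a device, assuming the legacy picamera library; the capture interval and the Pi Drive mount path are invented examples, not the team's actual configuration.

        import time
        from picamera import PiCamera  # legacy picamera library on the Pi

        INTERVAL_S = 60  # one frame per minute (an arbitrary example setting)

        camera = PiCamera(resolution=(1920, 1080))
        camera.start_preview()
        time.sleep(2)  # let exposure and white balance settle

        try:
            # capture_continuous numbers the frames via the {counter} field
            for _ in camera.capture_continuous("/mnt/pidrive/frame_{counter:06d}.jpg"):
                time.sleep(INTERVAL_S)
        finally:
            camera.close()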

  8. A generic interface between COSMIC/NASTRAN and PATRAN (R)

    NASA Technical Reports Server (NTRS)

    Roschke, Paul N.; Premthamkorn, Prakit; Maxwell, James C.

    1990-01-01

    Despite its powerful analytical capabilities, COSMIC/NASTRAN lacks adequate post-processing adroitness. PATRAN, on the other hand is widely accepted for its graphical capabilities. A nonproprietary, public domain code mnemonically titled CPI (for COSMIC/NASTRAN-PATRAN Interface) is designed to manipulate a large number of files rapidly and efficiently between the two parent codes. In addition to PATRAN's results file preparation, CPI also prepares PATRAN's P/PLOT data files for xy plotting. The user is prompted for necessary information during an interactive session. Current implementation supports NASTRAN's displacement approach including the following rigid formats: (1) static analysis, (2) normal modal analysis, (3) direct transient response, and (4) modal transient response. A wide variety of data blocks are also supported. Error trapping is given special consideration. A sample session with CPI illustrates its simplicity and ease of use.

  9. Catalytic conversion reactions mediated by single-file diffusion in linear nanopores: hydrodynamic versus stochastic behavior.

    PubMed

    Ackerman, David M; Wang, Jing; Wendel, Joseph H; Liu, Da-Jiang; Pruski, Marek; Evans, James W

    2011-03-21

    We analyze the spatiotemporal behavior of species concentrations in a diffusion-mediated conversion reaction which occurs at catalytic sites within linear pores of nanometer diameter. Diffusion within the pores is subject to a strict single-file (no passing) constraint. Both transient and steady-state behavior is precisely characterized by kinetic Monte Carlo simulations of a spatially discrete lattice-gas model for this reaction-diffusion process considering various distributions of catalytic sites. Exact hierarchical master equations can also be developed for this model. Their analysis, after application of mean-field type truncation approximations, produces discrete reaction-diffusion type equations (mf-RDE). For slowly varying concentrations, we further develop coarse-grained continuum hydrodynamic reaction-diffusion equations (h-RDE) incorporating a precise treatment of single-file diffusion in this multispecies system. The h-RDE successfully describe nontrivial aspects of transient behavior, in contrast to the mf-RDE, and also correctly capture unreactive steady-state behavior in the pore interior. However, steady-state reactivity, which is localized near the pore ends when those regions are catalytic, is controlled by fluctuations not incorporated into the hydrodynamic treatment. The mf-RDE partly capture these fluctuation effects, but cannot describe scaling behavior of the reactivity.
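
    A minimal sketch of the single-file (no passing) constraint at the heart of the lattice-gas model: particles on a 1-D lattice may hop only onto empty neighboring sites, so their ordering is preserved. This toy Monte Carlo omits the catalytic conversion reaction and the adsorption/desorption at the pore ends treated by the authors.

        import random

        def single_file_hops(occupied, steps, length):
            # Random hops on a 1-D lattice with a strict no-passing rule:
            # a particle may hop only onto an empty neighboring site.
            sites = [False] * length
            for i in occupied:
                sites[i] = True
            for _ in range(steps):
                i = random.randrange(length)
                if not sites[i]:
                    continue
                j = i + random.choice((-1, 1))
                if 0 <= j < length and not sites[j]:
                    sites[i], sites[j] = False, True  # hop; ordering is preserved
            return [k for k, occ in enumerate(sites) if occ]

        # Example: 20 particles initially packed in the left half of a 100-site pore
        print(single_file_hops(range(0, 40, 2), steps=10_000, length=100))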

  10. Application of whole slide image markup and annotation for pathologist knowledge capture.

    PubMed

    Campbell, Walter S; Foster, Kirk W; Hinrichs, Steven H

    2013-01-01

    The ability to transfer image markup and annotation data from one scanned image of a slide to a newly acquired image of the same slide within a single vendor platform was investigated. The goal was to study the ability to use image markup and annotation data files as a mechanism to capture and retain pathologist knowledge without retaining the entire whole slide image (WSI) file. Accepted mathematical principles were investigated as a method to overcome variations in scans of the same glass slide and to accurately associate image markup and annotation data across different WSI of the same glass slide. Trilateration was used to link fixed points within the image and slide to the placement of markups and annotations of the image in a metadata file. Variation in markup and annotation placement between WSI of the same glass slide was reduced from over 80 μ to less than 4 μ in the x-axis and from 17 μ to 6 μ in the y-axis (P < 0.025). This methodology allows for the creation of a highly reproducible image library of histopathology images and interpretations for educational and research use.
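
    A worked sketch of the trilateration step described above: given a markup point's distances to three fixed reference points, its coordinates can be recovered by linearizing the circle equations and solving in a least-squares sense. This generic NumPy version is illustrative only and is not the vendor-platform implementation used in the study.

        import numpy as np

        def trilaterate(p1, p2, p3, d1, d2, d3):
            # Recover a 2-D point from its distances to three reference points
            # by subtracting the circle equations and solving the linear system.
            p1, p2, p3 = map(np.asarray, (p1, p2, p3))
            A = 2.0 * np.array([p2 - p1, p3 - p1], dtype=float)
            b = np.array([
                d1**2 - d2**2 + np.dot(p2, p2) - np.dot(p1, p1),
                d1**2 - d3**2 + np.dot(p3, p3) - np.dot(p1, p1),
            ])
            xy, *_ = np.linalg.lstsq(A, b, rcond=None)
            return xy

        # Example: distances measured from three fiducials recover roughly (250, 250)
        print(trilaterate((0, 0), (1000, 0), (0, 1000), 353.6, 790.6, 790.6))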

  11. Application of whole slide image markup and annotation for pathologist knowledge capture

    PubMed Central

    Campbell, Walter S.; Foster, Kirk W.; Hinrichs, Steven H.

    2013-01-01

    Objective: The ability to transfer image markup and annotation data from one scanned image of a slide to a newly acquired image of the same slide within a single vendor platform was investigated. The goal was to study the ability to use image markup and annotation data files as a mechanism to capture and retain pathologist knowledge without retaining the entire whole slide image (WSI) file. Methods: Accepted mathematical principles were investigated as a method to overcome variations in scans of the same glass slide and to accurately associate image markup and annotation data across different WSI of the same glass slide. Trilateration was used to link fixed points within the image and slide to the placement of markups and annotations of the image in a metadata file. Results: Variation in markup and annotation placement between WSI of the same glass slide was reduced from over 80 μ to less than 4 μ in the x-axis and from 17 μ to 6 μ in the y-axis (P < 0.025). Conclusion: This methodology allows for the creation of a highly reproducible image library of histopathology images and interpretations for educational and research use. PMID:23599902

  12. Developments in capture-γ libraries for nonproliferation applications

    NASA Astrophysics Data System (ADS)

    Hurst, A. M.; Firestone, R. B.; Sleaford, B. W.; Bleuel, D. L.; Basunia, M. S.; Bečvář, F.; Belgya, T.; Bernstein, L. A.; Carroll, J. J.; Detwiler, B.; Escher, J. E.; Genreith, C.; Goldblum, B. L.; Krtička, M.; Lerch, A. G.; Matters, D. A.; McClory, J. W.; McHale, S. R.; Révay, Zs.; Szentmiklosi, L.; Turkoglu, D.; Ureche, A.; Vujic, J.

    2017-09-01

    The neutron-capture reaction is fundamental for identifying and analyzing the γ-ray spectrum from an unknown assembly because it provides unambiguous information on the neutron-absorbing isotopes. Nondestructive-assay applications may exploit this phenomenon passively, for example, in the presence of spontaneous-fission neutrons, or actively where an external neutron source is used as a probe. There are known gaps in the Evaluated Nuclear Data File libraries corresponding to neutron-capture γ-ray data that otherwise limit transport-modeling applications. In this work, we describe how new thermal neutron-capture data are being used to improve information in the neutron-data libraries for isotopes relevant to nonproliferation applications. We address this problem by providing new experimentally-deduced partial and total neutron-capture reaction cross sections and then evaluate these data by comparison with statistical-model calculations.

  13. 76 FR 23571 - Marine Mammals and Endangered Species; File Nos. 15415 and 14622

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-27

    ... border. Dr. Kraus is authorized to conduct control and experimental visual trials to determine if right... during non-linear transect surveys and hand capture loggerhead, Kemp's ridley, and hawksbill sea turtles...

  14. Digital geologic map and GIS database of Venezuela

    USGS Publications Warehouse

    Garrity, Christopher P.; Hackley, Paul C.; Urbani, Franco

    2006-01-01

    The digital geologic map and GIS database of Venezuela captures GIS compatible geologic and hydrologic data from the 'Geologic Shaded Relief Map of Venezuela,' which was released online as U.S. Geological Survey Open-File Report 2005-1038. Digital datasets and corresponding metadata files are stored in ESRI geodatabase format; accessible via ArcGIS 9.X. Feature classes in the geodatabase include geologic unit polygons, open water polygons, coincident geologic unit linework (contacts, faults, etc.) and non-coincident geologic unit linework (folds, drainage networks, etc.). Geologic unit polygon data were attributed for age, name, and lithologic type following the Lexico Estratigrafico de Venezuela. All digital datasets were captured from source data at 1:750,000. Although users may view and analyze data at varying scales, the authors make no guarantee as to the accuracy of the data at scales larger than 1:750,000.

  15. Effective Elastic and Neutron Capture Cross Section Calculations Corresponding to Simulated Fluid Properties from CO2 Push-Pull Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chugunov, Nikita; Altundas, Bilgin

    The submission contains an .xls file consisting of 10 Excel sheets, which contain a combined list of pressure, saturation, salinity, and temperature profiles from the simulation of CO2 push-pull using the Brady reservoir model and the corresponding effective compressional and shear velocity, bulk density, and fluid and time-lapse neutron capture cross section profiles of rock at times 0 day (baseline) through 14 days. The first 9 sheets (each named after the corresponding CO2 push-pull simulation time) contain simulated pressure, saturation, temperature, and salinity profiles and the corresponding effective elastic and neutron capture cross section profiles of the rock matrix at the time of CO2 injection. Each sheet contains two sets of effective compressional velocity profiles of the rock, one based on the Gassmann and the other based on the Patchy saturation model. Effective neutron capture cross section calculations are done using a proprietary neutron cross-section simulator (SNUPAR), whereas for the thermodynamic properties of CO2 and the bulk density of the rock matrix filled with fluid, a standalone fluid substitution tool by Schlumberger is used. The last sheet in the file contains the bulk modulus of the solid rock, which is inverted from the rock properties (porosity, sound speed, etc.) based on the Gassmann model. The bulk modulus of the solid rock in turn is used in the fluid substitution.

  16. Complete-arch accuracy of intraoral scanners.

    PubMed

    Treesh, Joshua C; Liacouras, Peter C; Taft, Robert M; Brooks, Daniel I; Raiciulescu, Sorana; Ellert, Daniel O; Grant, Gerald T; Ye, Ling

    2018-04-30

    Intraoral scanners have shown varied results in complete-arch applications. The purpose of this in vitro study was to evaluate the complete-arch accuracy of 4 intraoral scanners based on trueness and precision measurements compared with a known reference (trueness) and with each other (precision). Four intraoral scanners were evaluated: CEREC Bluecam, CEREC Omnicam, TRIOS Color, and Carestream CS 3500. A complete-arch reference cast was created and printed using a 3-dimensional dental cast printer with photopolymer resin. The reference cast was digitized using a laboratory-based white light 3-dimensional scanner. The printed reference cast was scanned 10 times with each intraoral scanner. The digital standard tessellation language (STL) files from each scanner were then registered to the reference file and compared for differences in trueness and precision using 3-dimensional modeling software. Additionally, scanning time was recorded for each scan performed. The Wilcoxon signed rank, Kruskal-Wallis, and Dunn tests were used to detect differences for trueness, precision, and scanning time (α=.05). Carestream CS 3500 had the lowest overall trueness and precision compared with Bluecam and TRIOS Color. The fourth scanner, Omnicam, had intermediate trueness and precision. All of the scanners tended to underestimate the size of the reference file, with the exception of the Carestream CS 3500, which was more variable. Based on visual inspection of the color rendering of signed differences, the greatest amount of error tended to be in the posterior aspects of the arch, with local errors exceeding 100 μm for all scans. The single capture scanner Carestream CS 3500 had the overall longest scan times and was significantly slower than the continuous capture scanners TRIOS Color and Omnicam. Significant differences in both trueness and precision were found among the scanners. Scan times of the continuous capture scanners were faster than those of the single capture scanners. Published by Elsevier Inc.
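
    The trueness comparison can be pictured as a nearest-neighbor distance computation between a registered scan and the reference geometry, as in the simplified sketch below; it assumes both surfaces are already registered and sampled as point arrays, whereas the study used dedicated 3-dimensional modeling software on STL meshes.

        import numpy as np
        from scipy.spatial import cKDTree

        def surface_deviation(scan_pts, ref_pts):
            # Nearest-neighbor distances from scan vertices to a reference point
            # cloud: a crude stand-in for a trueness metric after registration.
            tree = cKDTree(ref_pts)
            dist, _ = tree.query(scan_pts)
            return dist.mean(), np.percentile(dist, 95)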

  17. Use of On- Board File System: A Real Simplification for the Operators?

    NASA Astrophysics Data System (ADS)

    Olive, X.; Garcia, G.; Alison, B.; Charmeau, M. C.

    2008-08-01

    An on-board file system allows a spacecraft to be controlled and operated in a new way, offering more possibilities. It should permit providing the operator with a more abstract data view of the spacecraft, letting them focus on the functional part of their work and not on the exchange mechanism between ground and board. Files are already used in recent space projects, but in a restricted way that limits their capabilities. In this paper we describe what we consider to be a file system and its usage on two examples among those studied: OBCP and patch. We discuss how files can be handled with the PUS standard and give in the last section some perspectives, such as the use of files to standardize all the exchanges between ground and board and between board and board.

  18. File Server-Based CD-ROM Networking: Using SCSI Express.

    ERIC Educational Resources Information Center

    McQueen, Howard

    1992-01-01

    Provides guidelines for evaluating SCSI Express Novell 386, a new product allowing CD-ROM drives to be attached to a Netware 3.11 file server, increasing CD-ROM networking capability. Specific limitations concerning software, hardware, and human resources are outlined, as well as its unique features and potential for future networking uses. (EA)

  19. 78 FR 5765 - Wireline Competition Bureau Releases Connect America Phase II Cost Model Virtual Workshop...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-28

    ... variation by geography; inter-office transport cost; voice capability; wire center facilities; sizing of... through traditional channels at the FCC, such as the Commission's Electronic Comment Filing System (ECFS... Electronic Comment Filing System (ECFS). In the meantime, parties are encouraged to examine both the Virtual...

  20. The Area Resource File: ARF. A Manpower Planning and Research Tool.

    ERIC Educational Resources Information Center

    Applied Management Sciences, Inc., Silver Spring, MD.

    This publication describes the Area Resource File (ARF), a computer-based, county-specific health information system with broad analytical capabilities which utilizes manpower and manpower-related data that are available on a compatible basis for all counties in the United States, and which was developed to summarize statistics from many disparate…

  1. 'Best' Practices for Aggregating Subset Results from Archived Datasets

    NASA Astrophysics Data System (ADS)

    Baskin, W. E.; Perez, J.

    2013-12-01

    In response to the exponential growth in science data analysis and visualization capabilities, Data Centers have been developing new delivery mechanisms to package and deliver large volumes of aggregated subsets of archived data. New standards are evolving to help data providers and application programmers deal with the growing needs of the science community. These standards evolve from the best practices gleaned from new products and capabilities. The NASA Atmospheric Sciences Data Center (ASDC) has developed and deployed production provider-specific search and subset web applications for the CALIPSO, CERES, TES, and MOPITT missions. This presentation explores several use cases that leverage aggregated subset results and examines the standards and formats ASDC developers applied to the delivered files, as well as the implementation strategies for subsetting and processing the aggregated products. The following topics will be addressed: - Applications of NetCDF CF conventions to aggregated level 2 satellite subsets - Data-Provider-Specific format requirements vs. generalized standards - Organization of the file structure of aggregated NetCDF subset output - Global Attributes of individual subsetted files vs. aggregated results - Specific applications and framework used for subsetting and delivering derivative data files
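
    For illustration, a minimal netCDF4 sketch of writing an aggregated subset with CF-style global and variable attributes is shown below; the file name, dimension, and variable names are invented examples, not ASDC product conventions.

        import numpy as np
        from netCDF4 import Dataset

        with Dataset("aggregated_subset.nc", "w") as nc:
            nc.Conventions = "CF-1.6"
            nc.title = "Aggregated Level-2 subset (illustrative)"
            nc.createDimension("profile", 100)
            lat = nc.createVariable("lat", "f4", ("profile",))
            lat.standard_name, lat.units = "latitude", "degrees_north"
            aod = nc.createVariable("aerosol_od", "f4", ("profile",), fill_value=-9999.0)
            aod.long_name, aod.units = "aerosol optical depth (example)", "1"
            lat[:] = np.linspace(-30.0, 30.0, 100)
            aod[:] = np.random.rand(100)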

  2. Cambio : a file format translation and analysis application for the nuclear response emergency community.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lasche, George P.

    2009-10-01

    Cambio is an application intended to automatically read and display any spectrum file of any format in the world that the nuclear emergency response community might encounter. Cambio also provides an analysis capability suitable for HPGe spectra when detector response and scattering environment are not well known. Why is Cambio needed? (1) Cambio solves the following problem: with over 50 types of formats from instruments used in the field and new format variations appearing frequently, it is impractical for every responder to have current versions of the manufacturer's software from every instrument used in the field; (2) Cambio converts field spectra to any one of several common formats that are used for analysis, saving valuable time in an emergency situation; (3) Cambio provides basic tools for comparing spectra, calibrating spectra, and isotope identification with analysis suited especially for HPGe spectra; and (4) Cambio has a batch processing capability to automatically translate a large number of archival spectral files of any format to one of several common formats, such as the IAEA SPE or the DHS N42. Currently over 540 analysts and members of the nuclear emergency response community worldwide are on the distribution list for updates to Cambio. Cambio users come from all levels of government, university, and commercial partners around the world that support efforts to counter terrorist nuclear activities. Cambio is Unclassified Unlimited Release (UUR) and distributed by internet downloads with email notifications whenever a new build of Cambio provides for new formats, bug fixes, or new or improved capabilities. Cambio is also provided as a DLL to the Karlsruhe Institute for Transuranium Elements so that Cambio's automatic file-reading capability can be included at the Nucleonica web site.
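
    As an illustration of the kind of common ASCII output involved, the sketch below parses the $DATA: block of an IAEA SPE spectrum file; it assumes the usual layout (a channel-range line followed by counts) and is not Cambio code.

        def read_spe_counts(path):
            # Parse the $DATA: block of an IAEA SPE ASCII spectrum file,
            # assuming the usual layout: a "first last" channel-range line
            # followed by one or more counts per line.
            counts, in_data, saw_range = [], False, False
            with open(path) as f:
                for raw in f:
                    line = raw.strip()
                    if line.startswith("$"):
                        in_data = line.upper().startswith("$DATA")
                        saw_range = False
                        continue
                    if not in_data or not line:
                        continue
                    if not saw_range:
                        saw_range = True  # skip the channel-range line
                        continue
                    counts.extend(int(float(x)) for x in line.split())
            return counts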

  3. SU-F-T-469: A Clinically Observed Discrepancy Between Image-Based and Log- Based MLC Position

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neal, B; Ahmed, M; Siebers, J

    2016-06-15

    Purpose: To present a clinical case which challenges the base assumption of log-file based QA, by showing that the actual position of an MLC leaf can suddenly deviate from its programmed and logged position by >1 mm as observed with real-time imaging. Methods: An EPID-based exit-fluence dosimetry system designed to prevent gross delivery errors was used in cine mode to capture portal images during treatment. Visual monitoring identified an anomalous MLC leaf pair gap not otherwise detected by the automatic position verification. The position of the erred leaf was measured on EPID images and log files were analyzed for the treatment in question, the prior day's treatment, and for daily MLC test patterns acquired on those treatment days. Additional standard test patterns were used to quantify the leaf position. Results: Whereas the log file reported no difference between planned and recorded positions, image-based measurements showed the leaf to be 1.3±0.1 mm medial from the planned position. This offset was confirmed with the test pattern irradiations. Conclusion: It has been clinically observed that log-file derived leaf positions can differ from their actual positions by >1 mm, and therefore cannot be considered to be the actual leaf positions. This cautions the use of log-based methods for MLC or patient quality assurance without independent confirmation of log integrity. Frequent verification of MLC positions through independent means is a necessary precondition to trusting log file records. Intra-treatment EPID imaging provides a method to capture departures from MLC planned positions. Work was supported in part by Varian Medical Systems.

  4. Rhinoplasty perioperative database using a personal digital assistant.

    PubMed

    Kotler, Howard S

    2004-01-01

    To construct a reliable, accurate, and easy-to-use handheld computer database that facilitates the point-of-care acquisition of perioperative text and image data specific to rhinoplasty. A user-modified database (Pendragon Forms [v.3.2]; Pendragon Software Corporation, Libertyville, Ill) and graphic image program (Tealpaint [v.4.87]; Tealpaint Software, San Rafael, Calif) were used to capture text and image data, respectively, on a Palm OS (v.4.11) handheld operating with 8 megabytes of memory. The handheld and desktop databases were maintained secure using PDASecure (v.2.0) and GoldSecure (v.3.0) (Trust Digital LLC, Fairfax, Va). The handheld data were then uploaded to a desktop database of either FileMaker Pro 5.0 (v.1) (FileMaker Inc, Santa Clara, Calif) or Microsoft Access 2000 (Microsoft Corp, Redmond, Wash). Patient data were collected from 15 patients undergoing rhinoplasty in a private practice outpatient ambulatory setting. Data integrity was assessed after 6 months' disk and hard drive storage. The handheld database was able to facilitate data collection and accurately record, transfer, and reliably maintain perioperative rhinoplasty data. Query capability allowed rapid search using a multitude of keyword search terms specific to the operative maneuvers performed in rhinoplasty. Handheld computer technology provides a method of reliably recording and storing perioperative rhinoplasty information. The handheld computer facilitates the reliable and accurate storage and query of perioperative data, assisting the retrospective review of one's own results and enhancement of surgical skills.

  5. Context-dependent JPEG backward-compatible high-dynamic range image compression

    NASA Astrophysics Data System (ADS)

    Korshunov, Pavel; Ebrahimi, Touradj

    2013-10-01

    High-dynamic range (HDR) imaging is expected, together with ultrahigh definition and high-frame rate video, to become a technology that may change photo, TV, and film industries. Many cameras and displays capable of capturing and rendering both HDR images and video are already available in the market. The popularity and full-public adoption of HDR content is, however, hindered by the lack of standards in evaluation of quality, file formats, and compression, as well as the large legacy base of low-dynamic range (LDR) displays that are unable to render HDR. To facilitate the wide spread of HDR usage, the backward compatibility of HDR with commonly used legacy technologies for storage, rendering, and compression of video and images is necessary. Although many tone-mapping algorithms are developed for generating viewable LDR content from HDR, there is no consensus on which algorithm to use and under which conditions. We, via a series of subjective evaluations, demonstrate the dependency of the perceptual quality of the tone-mapped LDR images on the context: environmental factors, display parameters, and image content itself. Based on the results of the subjective tests, we propose to extend the JPEG file format, the most popular image format, in a backward-compatible manner to also handle HDR images. An architecture to achieve such backward compatibility with JPEG is proposed. A simple implementation of lossy compression demonstrates the efficiency of the proposed architecture compared with the state-of-the-art HDR image compression.

  6. MISSE in the Materials and Processes Technical Information System (MAPTIS )

    NASA Technical Reports Server (NTRS)

    Burns, DeWitt; Finckenor, Miria; Henrie, Ben

    2013-01-01

    Materials International Space Station Experiment (MISSE) data is now being collected and distributed through the Materials and Processes Technical Information System (MAPTIS) at Marshall Space Flight Center in Huntsville, Alabama. MISSE data has been instrumental in many programs and continues to be an important source of data for the space community. To facilitate greater access to the MISSE data, the International Space Station (ISS) program office and MAPTIS are working to gather this data into a central location. The MISSE database contains information about materials, samples, and flights along with pictures, PDFs, Excel files, Word documents, and other file types. Major capabilities of the system are: access control, browsing, searching, reports, and record comparison. The search capabilities will search within any searchable files, so data can still be retrieved even if the desired metadata has not been associated. Other functionality will continue to be added to the MISSE database as the Athena Platform is expanded.

  7. A clinically observed discrepancy between image-based and log-based MLC positions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neal, Brian, E-mail: bpn2p@virginia.edu; Ahmed, Mahmoud; Kathuria, Kunal

    2016-06-15

    Purpose: To present a clinical case in which real-time intratreatment imaging identified a multileaf collimator (MLC) leaf to be consistently deviating from its programmed and logged position by >1 mm. Methods: An EPID-based exit-fluence dosimetry system designed to prevent gross delivery errors was used to capture cine images during treatment. The author serendipitously visually identified a suspected MLC leaf displacement that was not otherwise detected. The leaf position as recorded on the EPID images was measured and log-files were analyzed for the treatment in question, the prior day's treatment, and for daily MLC test patterns acquired on those treatment days. Additional standard test patterns were used to quantify the leaf position. Results: Whereas the log-file reported no difference between planned and recorded positions, image-based measurements showed the leaf to be 1.3 ± 0.1 mm medial from the planned position. This offset was confirmed with the test pattern irradiations. Conclusions: It has been clinically observed that log-file derived leaf positions can differ from their actual position by >1 mm, and therefore cannot be considered to be the actual leaf positions. This cautions the use of log-based methods for MLC or patient quality assurance without independent confirmation of log integrity. Frequent verification of MLC positions through independent means is a necessary precondition to trusting log-file records. Intratreatment EPID imaging provides a method to capture departures from MLC planned positions.
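
    Independent confirmation of this kind can be reduced to a simple per-leaf comparison between logged and image-derived positions, as sketched below; the 1 mm tolerance and the array layout are assumptions for illustration, not the authors' analysis code.

        import numpy as np

        TOLERANCE_MM = 1.0  # assumed action level for leaf-position discrepancies

        def flag_leaf_discrepancies(logged_mm, imaged_mm, tol=TOLERANCE_MM):
            # Return indices of leaves whose imaged position departs from the
            # logged position by more than the tolerance, plus all differences.
            diff = np.abs(np.asarray(imaged_mm) - np.asarray(logged_mm))
            return np.flatnonzero(diff > tol), diff

        # Example: leaf 12 is 1.3 mm off while the log reports perfect agreement
        logged, imaged = np.zeros(60), np.zeros(60)
        imaged[12] = 1.3
        bad, diff = flag_leaf_discrepancies(logged, imaged)
        print(bad, diff[bad])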

  8. A high-speed scintillation-based electronic portal imaging device to quantitatively characterize IMRT delivery.

    PubMed

    Ranade, Manisha K; Lynch, Bart D; Li, Jonathan G; Dempsey, James F

    2006-01-01

    We have developed an electronic portal imaging device (EPID) employing a fast scintillator and a high-speed camera. The device is designed to accurately and independently characterize the fluence delivered by a linear accelerator during intensity modulated radiation therapy (IMRT) with either step-and-shoot or dynamic multileaf collimator (MLC) delivery. Our aim is to accurately obtain the beam shape and fluence of all segments delivered during IMRT, in order to study the nature of discrepancies between the plan and the delivered doses. A commercial high-speed camera was combined with a terbium-doped gadolinium-oxy-sulfide (Gd2O2S:Tb) scintillator to form an EPID for the unaliased capture of two-dimensional fluence distributions of each beam in an IMRT delivery. The high speed EPID was synchronized to the accelerator pulse-forming network and gated to capture every possible pulse emitted from the accelerator, with an approximate frame rate of 360 frames-per-second (fps). A 62-segment beam from a head-and-neck IMRT treatment plan requiring 68 s to deliver was recorded with our high speed EPID producing approximately 6 Gbytes of imaging data. The EPID data were compared with the MLC instruction files and the MLC controller log files. The frames were binned to provide a frame rate of 72 fps with a signal-to-noise ratio that was sufficient to resolve leaf positions and segment fluence. The fractional fluence from the log files and EPID data agreed well. An ambiguity in the motion of the MLC during beam on was resolved. The log files reported leaf motions at the end of 33 of the 42 segments, while the EPID observed leaf motions in only 7 of the 42 segments. The static IMRT segment shapes observed by the high speed EPID were in good agreement with the shapes reported in the log files. The leaf motions observed during beam-on for step-and-shoot delivery were not temporally resolved by the log files.
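
    The frame binning mentioned above (360 fps captured, 72 fps analyzed) amounts to summing consecutive groups of frames to trade temporal resolution for signal-to-noise, as in this generic NumPy sketch with an assumed binning factor of 5.

        import numpy as np

        def bin_frames(frames, factor=5):
            # Sum consecutive groups of frames (e.g. 360 fps -> 72 fps for a
            # factor of 5) to trade temporal resolution for signal-to-noise.
            frames = np.asarray(frames, dtype=np.float64)
            n = (len(frames) // factor) * factor
            return frames[:n].reshape(-1, factor, *frames.shape[1:]).sum(axis=1)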

  9. 78 FR 54879 - Notice of Filing of Self-Certification of Coal Capability Under the Powerplant and Industrial...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-06

    ... Capability Under the Powerplant and Industrial Fuel Use Act AGENCY: Office of Electricity Delivery and Energy... the Powerplant and Industrial Fuel Use Act of 1978 (FUA), as amended, and DOE regulations in 10 CFR... requirement of coal capability, the owner or operator of such a facility proposing to use natural gas or...

  10. LACIE performance predictor final operational capability program description, volume 2

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Given the swath table files, the segment set for one country and cloud cover data, the SAGE program determines how many times and under what conditions each segment is accessed by satellites. The program writes a record for each segment on a data file which contains the pertinent acquisition data. The weather data file can also be generated from a NASA supplied tape. The Segment Acquisition Selector Program (SACS) selects data from the segment reference file based upon data input manually and from a crop window file. It writes the extracted data to a data acquisition file and prints two summary reports. The POUT program reads from associated LACIE files and produces printed reports. The major types of reports that can be produced are: (1) Substrate Reference Data Reports, (2) Population Mean, Standard Deviation and Histogram Reports, (3) Histograms of Monte Carlo Statistics Reports, and (4) Frequency of Sample Segment Acquisitions Reports.

  11. Enhancing AFLOW Visualization using Jmol

    NASA Astrophysics Data System (ADS)

    Lanasa, Jacob; New, Elizabeth; Stefek, Patrik; Honaker, Brigette; Hanson, Robert; Aflow Collaboration

    The AFLOW library is a database of theoretical solid-state structures and calculated properties created using high-throughput ab initio calculations. Jmol is a Java-based program capable of visualizing and analyzing complex molecular structures and energy landscapes. In collaboration with the AFLOW consortium, our goal is the enhancement of the AFLOWLIB database through the extension of Jmol's capabilities in the area of materials science. Modifications made to Jmol include the ability to read and visualize AFLOW binary alloy data files, the ability to extract from these files information using Jmol scripting macros that can be utilized in the creation of interactive web-based convex hull graphs, the capability to identify and classify local atomic environments by symmetry, and the ability to search one or more related crystal structures for atomic environments using a novel extension of inorganic polyhedron-based SMILES strings

  12. 76 FR 31942 - Marine Mammals; File No. 14329

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-02

    ... appointment in the following offices: Permits, Conservation and Education Division, Office of Protected... sampling and instrument attachment for lactating female fur seals already authorized for capture, and..., 2011. P. Michael Payne, Chief, Permits, Conservation and Education Division, Office of Protected...

  13. PHREEQCI; a graphical user interface for the geochemical computer program PHREEQC

    USGS Publications Warehouse

    Charlton, Scott R.; Macklin, Clifford L.; Parkhurst, David L.

    1997-01-01

    PhreeqcI is a Windows-based graphical user interface for the geochemical computer program PHREEQC. PhreeqcI provides the capability to generate and edit input data files, run simulations, and view text files containing simulation results, all within the framework of a single interface. PHREEQC is a multipurpose geochemical program that can perform speciation, inverse, reaction-path, and 1D advective reaction-transport modeling. Interactive access to all of the capabilities of PHREEQC is available with PhreeqcI. The interface is written in Visual Basic and will run on personal computers under the Windows 3.1, Windows 95, and Windows NT operating systems.
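
    A minimal sketch of the batch workflow that such an interface wraps: write a PHREEQC input file and run the command-line program on it. The executable name, file names, and database path are assumptions that depend on the local installation.

        import subprocess
        from pathlib import Path

        # A tiny speciation problem written as a PHREEQC keyword input deck
        lines = ["SOLUTION 1", "    temp      25", "    pH        7.0",
                 "    units     mmol/kgw", "    Na        10", "    Cl        10",
                 "END", ""]
        Path("example.pqi").write_text("\n".join(lines))

        # Batch PHREEQC is typically invoked as: phreeqc input output database
        subprocess.run(["phreeqc", "example.pqi", "example.pqo", "phreeqc.dat"], check=True)
        print(Path("example.pqo").read_text()[:500])  # peek at the results file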

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burnett, R.A.

    A major goal of the Analysis of Large Data Sets (ALDS) research project at Pacific Northwest Laboratory (PNL) is to provide efficient data organization, storage, and access capabilities for statistical applications involving large amounts of data. As part of the effort to achieve this goal, a self-describing binary (SDB) data file structure has been designed and implemented together with a set of basic data manipulation functions and supporting SDB data access routines. Logical and physical data descriptors are stored in SDB files preceding the data values. SDB files thus provide a common data representation for interfacing diverse software components. This paper describes the various types of data descriptors and data structures permitted by the file design. Data buffering, file segmentation, and a segment overflow handler are also discussed.
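
    The idea of descriptors preceding the data can be illustrated with a toy self-describing binary file: a small JSON descriptor (name, type, count) is written ahead of the packed values, so a reader needs no external schema. This is only a sketch of the concept, not the ALDS SDB layout.

        import json, struct

        def write_sdb(path, name, values):
            # Toy self-describing binary file: a JSON descriptor (name, dtype,
            # count) is written ahead of the packed little-endian doubles.
            desc = json.dumps({"name": name, "dtype": "f8", "count": len(values)}).encode()
            with open(path, "wb") as f:
                f.write(struct.pack("<I", len(desc)))
                f.write(desc)
                f.write(struct.pack(f"<{len(values)}d", *values))

        def read_sdb(path):
            # The reader needs no external schema: it reads the descriptor first.
            with open(path, "rb") as f:
                (dlen,) = struct.unpack("<I", f.read(4))
                desc = json.loads(f.read(dlen))
                data = struct.unpack(f"<{desc['count']}d", f.read(8 * desc["count"]))
            return desc, list(data)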

  15. Modeling Thermally Driven Flow Problems with a Grid-Free Vortex Filament Scheme: Part 1

    DTIC Science & Technology

    2018-02-01

    Provision has been made to include grid-free energy particles and thus a capability of capturing 2-way coupling between momentum and energy via barotropic... vorticity generation associated with thermal gradients. The validation studies have focused on natural convection following a release of energy into... a stagnant field and show that this new method is capable of capturing the correct physics of 3-D natural convection problems.

  16. Purity and Cleanness of Aerogel as a Cosmic Dust Capture Medium

    NASA Technical Reports Server (NTRS)

    Tsou, P.; Fleming, R.; Lindley, P.; Craig, A.; Blake, D.

    1994-01-01

    The capability for capturing micrometeoroids intact through laboratory simulations [Tsou 1988] and in space [Tsou 1993] in passive underdense silica aerogel offers a valuable tool for cosmic dust research. The integrity of the sample handling medium can substantially modify the integrity of the sample. Intact capture is a violent hypervelocity event: the integrity of the capturing medium can cause even greater modification of the sample.

  17. Qualification of the RSRM field joint CF case-to-insulation bondline inspection using the Thiokol Corporation ultrasonic RSRM bondline inspection system

    NASA Technical Reports Server (NTRS)

    Cook, M.

    1990-01-01

    Qualification testing of Combustion Engineering's AMDATA Intraspect/98 Data Acquisition and Imaging System that applies to the redesigned solid rocket motor field joint capture feature case-to-insulation bondline inspection was performed. Testing was performed at M-111, the Thiokol Corp. Inert Parts Preparation Building. The purpose of the inspection was to verify the integrity of the capture feature area case-to-insulation bondline. The capture feature scanner was calibrated over an intentional 1.0 to 1.0 in. case-to-insulation unbond. The capture feature scanner was then used to scan 60 deg of a capture feature field joint. Calibration of the capture feature scanner was then rechecked over the intentional unbond to ensure that the calibration settings did not change during the case scan. This procedure was successfully performed five times to qualify the unbond detection capability of the capture feature scanner. The capture feature scanner qualified in this test contains many points of mechanical instability that can affect the overall ultrasonic signal response. A new generation scanner, designated the sigma scanner, should be implemented to replace the current configuration scanner. The sigma scanner eliminates the unstable connection points of the current scanner and has additional inspection capabilities.

  18. Tilted pillar array fabrication by the combination of proton beam writing and soft lithography for microfluidic cell capture: Part 1 Design and feasibility.

    PubMed

    Rajta, Istvan; Huszánk, Robert; Szabó, Atilla T T; Nagy, Gyula U L; Szilasi, Szabolcs; Fürjes, Peter; Holczer, Eszter; Fekete, Zoltan; Járvás, Gabor; Szigeti, Marton; Hajba, Laszlo; Bodnár, Judit; Guttman, Andras

    2016-02-01

    Design, fabrication, integration, and feasibility test results of a novel microfluidic cell capture device is presented, exploiting the advantages of proton beam writing to make lithographic irradiations under multiple target tilting angles and UV lithography to easily reproduce large area structures. A cell capture device is demonstrated with a unique doubly tilted micropillar array design for cell manipulation in microfluidic applications. Tilting the pillars increased their functional surface and therefore enhanced fluidic interaction when a special bioaffinity coating was used, and improved fluid dynamic behavior regarding cell culture injection. The proposed microstructures were capable of supporting adequate distribution of body fluids, such as blood, spinal fluid, etc., between the inlet and outlet of the microfluidic sample reservoirs, offering advanced cell capture capability on the functionalized surfaces. The hydrodynamic characteristics of the microfluidic systems were tested with yeast cells (similar size as red blood cells) for efficient capture. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Development of a Risk-Based Comparison Methodology of Carbon Capture Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engel, David W.; Dalton, Angela C.; Dale, Crystal

    2014-06-01

    Given the varying degrees of maturity among existing carbon capture (CC) technology alternatives, an understanding of the inherent technical and financial risk and uncertainty associated with these competing technologies is requisite to the success of carbon capture as a viable solution to the greenhouse gas emission challenge. The availability of tools and capabilities to conduct rigorous, risk-based technology comparisons is thus highly desirable for directing valuable resources toward the technology option(s) with a high return on investment, superior carbon capture performance, and minimum risk. To address this research need, we introduce a novel risk-based technology comparison method supported by an integrated multi-domain risk model set to estimate risks related to technological maturity, technical performance, and profitability. Through a comparison between solid sorbent and liquid solvent systems, we illustrate the feasibility of estimating risk and quantifying uncertainty in a single domain (modular analytical capability) as well as across multiple risk dimensions (coupled analytical capability) for comparison. This method brings technological maturity and performance to bear on profitability projections, and carries risk and uncertainty modeling across domains via inter-model sharing of parameters, distributions, and input/output. The integration of the models facilitates multidimensional technology comparisons within a common probabilistic risk analysis framework. This approach and model set can equip potential technology adopters with the necessary computational capabilities to make risk-informed decisions about CC technology investment. The method and modeling effort can also be extended to other industries where robust tools and analytical capabilities are currently lacking for evaluating nascent technologies.

  20. Program documentation for the space environment test division post-test data reduction program (GNFLEX)

    NASA Technical Reports Server (NTRS)

    Jones, L. D.

    1979-01-01

    The Space Environment Test Division Post-Test Data Reduction Program processes data from test history tapes generated on the Flexible Data System in the Space Environment Simulation Laboratory at the National Aeronautics and Space Administration/Lyndon B. Johnson Space Center. The program reads the tape's data base records to retrieve the item directory conversion file, the item capture file and the process link file to determine the active parameters. The desired parameter names are read in by lead cards after which the periodic data records are read to determine parameter data level changes. The data is considered to be compressed rather than full sample rate. Tabulations and/or a tape for generating plots may be output.

  1. Evolution of the LBT Telemetry System

    NASA Astrophysics Data System (ADS)

    Summers, K.; Biddick, C.; De La Peña, M. D.; Summers, D.

    2014-05-01

    The Large Binocular Telescope (LBT) Telescope Control System (TCS) records about 10GB of telemetry data per night. Additionally, the vibration monitoring system records about 9GB of telemetry data per night. Through 2013, we have amassed over 6TB of Hierarchical Data Format (HDF5) files and almost 9TB in a MySQL database of TCS and vibration data. The LBT telemetry system, in its third major revision since 2004, provides the mechanism to capture and store this data. The telemetry system has evolved from a simple HDF file system with MySQL stream definitions within the TCS, to a separate system using a MySQL database system for the definitions and data, and finally to no database use at all, using HDF5 files.
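
    A toy sketch of HDF5-based telemetry capture: each stream is an extendable dataset of (timestamp, value) records appended as samples arrive. The dataset names and record layout are assumptions, not the LBT TCS telemetry schema.

        import time
        import h5py
        import numpy as np

        record = np.dtype([("t", "f8"), ("value", "f8")])
        with h5py.File("telemetry_20140501.h5", "a") as f:
            # Reuse the stream if it already exists, otherwise create it extendable
            if "mount/azimuth" in f:
                stream = f["mount/azimuth"]
            else:
                stream = f.create_dataset("mount/azimuth", shape=(0,),
                                          maxshape=(None,), dtype=record)
            stream.resize(stream.shape[0] + 1, axis=0)
            stream[-1] = np.array((time.time(), 123.456), dtype=record)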

  2. 77 FR 13097 - Endangered Species; File Nos. 15661, 10027, and 15685

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-05

    ... population structure, size class composition, foraging ecology, and migration patterns for green and... of green and hawksbill sea turtles focusing on distribution and abundance, ecology, health, and... and population structure, foraging ecology, habitat use, and movements. Researchers may capture...

  3. 77 FR 69670 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of a...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-20

    ... some cases its port fees are less expensive than many of its primary competitors.\\12\\ The Exchange... monitoring of the drop copy port is more substantial and because drop copy ports capture cumulative activity...

  4. The National Anesthesia Clinical Outcomes Registry.

    PubMed

    Liau, Adrian; Havidich, Jeana E; Onega, Tracy; Dutton, Richard P

    2015-12-01

    The Anesthesia Quality Institute (AQI) was chartered in 2008 by the American Society of Anesthesiologists to develop the National Anesthesia Clinical Outcomes Registry (NACOR). In this Technical Communication, we will describe how data enter NACOR, how they are authenticated, and how they are analyzed and reported. NACOR accepts case-level administrative, clinical, and quality capture data from voluntarily participating anesthesia practices and health care facilities in the United States. All data are transmitted to the AQI in summary electronic files generated by billing, quality capture, and electronic health care record software, typically on a monthly basis. All data elements are mapped to fields in the NACOR schema in accordance with a publicly available data dictionary. Incoming data are loaded into NACOR by AQI technologists and are subject to both manual and automated review to identify systematically missing elements, miscoding, and inadvertent corruption. Data are deidentified in compliance with Health Insurance Portability and Accountability Act regulations. The database server of AQI, which houses the NACOR database, is protected by 2 firewalls within the American Society of Anesthesiologists' network infrastructure; this system has not been breached. The NACOR Participant User File, a deidentified case-level dataset of information from NACOR, is available to researchers at participating institutions. NACOR architecture and the nature of the Participant User File include both strengths and weaknesses.

  5. Updated User's Guide for Sammy: Multilevel R-Matrix Fits to Neutron Data Using Bayes' Equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larson, Nancy M

    2008-10-01

    In 1980 the multilevel multichannel R-matrix code SAMMY was released for use in analysis of neutron-induced cross section data at the Oak Ridge Electron Linear Accelerator. Since that time, SAMMY has evolved to the point where it is now in use around the world for analysis of many different types of data. SAMMY is not limited to incident neutrons but can also be used for incident protons, alpha particles, or other charged particles; likewise, Coulomb exit channels can be included. Corrections for a wide variety of experimental conditions are available in the code: Doppler and resolution broadening, multiple-scattering corrections for capture or reaction yields, normalizations and backgrounds, to name but a few. The fitting procedure is Bayes' method, and data and parameter covariance matrices are properly treated within the code. Pre- and post-processing capabilities are also available, including (but not limited to) connections with the Evaluated Nuclear Data Files. Though originally designed for use in the resolved resonance region, SAMMY also includes a treatment for data analysis in the unresolved resonance region.
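
    For readers unfamiliar with the fitting procedure named above, the update below is the standard generalized-least-squares form of Bayes' equations; the notation is assumed here and is not quoted from the SAMMY manual.

```latex
% Standard Bayes'/generalized-least-squares update for parameters p with
% prior covariance M, sensitivity matrix G = dt/dp, data d with covariance V,
% and theoretical values t evaluated at the prior parameters
% (notation assumed, not copied from the SAMMY documentation):
\begin{align}
  p' &= p + P\, G^{\mathsf{T}} V^{-1} \left( d - t \right) \\
  P  &= \left( M^{-1} + G^{\mathsf{T}} V^{-1} G \right)^{-1}
\end{align}
```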

  6. For Kids, by Kids: Our City Podcast

    ERIC Educational Resources Information Center

    Vincent, Tony; van't Hooft, Mark

    2007-01-01

    In this article, the authors discuss podcasting and provide ways on how to create podcasts. A podcast is an audio or video file that is posted on the web that can easily be cataloged and automatically downloaded to a computer or mobile device capable of playing back audio or video files. Podcasting is a powerful tool for educators to get students…

  7. 78 FR 26339 - Exelon Generation Company, LLC; Notice of Application Accepted for Filing, Soliciting Motions To...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-06

    ... prior registration, using the eComment system at http://www.ferc.gov/docs-filing/ecomment.asp . You must... capability of 28,000 cfs. Water flowing through the turbines is discharged via the draft tubes into the Susquehanna River adjacent to the powerhouse. The units are equipped with trash racks between the draft tube...

  8. 77 FR 30508 - Marine Mammals; File No. 16991

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-23

    ... Road, Moss Landing, California 95039 to conduct research on harbor seals (Phoca vitulina). ADDRESSES... seals had been submitted by the above-named applicant. The requested permit has been issued under the... harassment and capture of harbor seals in California, Oregon, Washington, and Alaska. Harassment is...

  9. 77 FR 26747 - Marine Mammals; File No. 15748

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-07

    ... Scientific Research Permit No. 15748. DATES: Written, telefaxed, or email comments must be received on or... shore of Ross Island, Antarctica to study thermoregulation. The research involves capture and restraint of adult females and pups/juveniles of either sex for attachment of scientific instruments...

  10. Scanners, optical character readers, Cyrillic alphabet and Russian translations

    NASA Technical Reports Server (NTRS)

    Johnson, Gordon G.

    1995-01-01

    The writing of code for capture, in a uniform format, of bit maps of words and characters from scanner PICT files is presented. The coding of Dynamic Pattern Matched for the identification of the characters, words and sentences in preparation for translation is discussed.

  11. 75 FR 67347 - Marine Mammals; File No. 14326

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-02

    ...(s): Permits, Conservation and Education Division, Office of Protected Resources, NMFS, 1315 East... amended permit allows takes of up to 20 adult female Steller sea lions annually by capture using... the ESA. Dated: October 27, 2010. Tammy C. Adams, Acting Chief, Permits, Conservation and Education...

  12. Study on a High Compression Processing for Video-on-Demand e-learning System

    NASA Astrophysics Data System (ADS)

    Nomura, Yoshihiko; Matsuda, Ryutaro; Sakamoto, Ryota; Sugiura, Tokuhiro; Matsui, Hirokazu; Kato, Norihiko

    The authors proposed a high-quality, small-capacity lecture-video-file creating system for distance e-learning. Examining the features of the lecturing scene, the authors ingeniously employ two kinds of image-capturing equipment with complementary characteristics: one is a digital video camera with a low resolution and a high frame rate, and the other is a digital still camera with a high resolution and a very low frame rate. By managing the two kinds of image-capturing equipment and integrating them with image processing, course materials can be produced with greatly reduced file capacity: the materials satisfy the requirements both for the temporal resolution needed to see the lecturer's pointing actions and for the high spatial resolution needed to read small written letters. In a comparative experiment, the e-lecture using the proposed system was confirmed to be more effective than an ordinary lecture from the viewpoint of educational effect.

  13. Ground Software Maintenance Facility (GSMF) system manual

    NASA Technical Reports Server (NTRS)

    Derrig, D.; Griffith, G.

    1986-01-01

    The Ground Software Maintenance Facility (GSMF) is designed to support development and maintenance of spacelab ground support software. THE GSMF consists of a Perkin Elmer 3250 (Host computer) and a MITRA 125s (ATE computer), with appropriate interface devices and software to simulate the Electrical Ground Support Equipment (EGSE). This document is presented in three sections: (1) GSMF Overview; (2) Software Structure; and (3) Fault Isolation Capability. The overview contains information on hardware and software organization along with their corresponding block diagrams. The Software Structure section describes the modes of software structure including source files, link information, and database files. The Fault Isolation section describes the capabilities of the Ground Computer Interface Device, Perkin Elmer host, and MITRA ATE.

  14. Communities that thrive in extreme conditions captured from a freshwater lake.

    PubMed

    Low-Décarie, Etienne; Fussmann, Gregor F; Dumbrell, Alex J; Bell, Graham

    2016-09-01

    Organisms that can grow in extreme conditions would be expected to be confined to extreme environments. However, we were able to capture highly productive communities of algae and bacteria capable of growing in acidic (pH 2), basic (pH 12) and saline (40 ppt) conditions from an ordinary freshwater lake. Microbial communities may thus include taxa that are highly productive in conditions that are far outside the range of conditions experienced in their host ecosystem. The organisms we captured were not obligate extremophiles, but were capable of growing in both extreme and benign conditions. The ability to grow in extreme conditions may thus be a common functional attribute in microbial communities. © 2016 The Author(s).

  15. CARMA: Software for continuous affect rating and media annotation

    PubMed Central

    Girard, Jeffrey M

    2017-01-01

    CARMA is a media annotation program that collects continuous ratings while displaying audio and video files. It is designed to be highly user-friendly and easily customizable. Based on Gottman and Levenson's affect rating dial, CARMA enables researchers and study participants to provide moment-by-moment ratings of multimedia files using a computer mouse or keyboard. The rating scale can be configured on a number of parameters including the labels for its upper and lower bounds, its numerical range, and its visual representation. Annotations can be displayed alongside the multimedia file and saved for easy import into statistical analysis software. CARMA provides a tool for researchers in affective computing, human-computer interaction, and the social sciences who need to capture the unfolding of subjective experience and observable behavior over time. PMID:29308198

  16. Screencasts

    ERIC Educational Resources Information Center

    Yee, Kevin; Hargis, Jace

    2010-01-01

    This article discusses the benefits of screencasts and its instructional uses. Well-known for some years to advanced technology users, Screen Capture Software (SCS) offers the promise of recording action on the computer desktop together with voiceover narration, all combined into a single movie file that can be shared, emailed, or uploaded.…

  17. 76 FR 13603 - Marine Mammals; File No. 16087

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-14

    ... californianus), Pacific harbor seals (Phoca vitulina), and northern elephant seals (Mirounga angustrirostris... capture and handling, and 1,135 by incidental disturbance. Up to 2,766 northern elephant seals may be... northern elephant seals. Up to 4,500 northern fur seals (Callorhinus ursinus) may be incidentally disturbed...

  18. 75 FR 55625 - Occupational Information Development Advisory Panel Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-13

    ... claims; occupational analysis, including definitions, ratings and capture of physical and mental... should contact the Panel staff by any one of these three methods: Mail: Occupational Information... Federal Officer. [FR Doc. 2010-22711 Filed 9-10-10; 8:45 am] BILLING CODE 4191-02-P ...

  19. Downloading from the OPAC: The Innovative Interfaces Environment.

    ERIC Educational Resources Information Center

    Spore, Stuart

    1991-01-01

    Discussion of downloading from online public access catalogs focuses on downloading to MS-DOS microcomputers from the INNOPAC online catalog system. Tools for capturing and postprocessing downloaded files are described, technical and institutional constraints on downloading are addressed, and an innovative program for overcoming such constraints…

  20. 75 FR 44042 - Self-Regulatory Organizations; Notice of Filing and Immediate Effectiveness of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-27

    ... submit order execution reports to the Exchange's Front End Systemic Capture (``FESC'') database linking... that would apply across their respective marketplaces, including a harmonized approach to riskless... approach to customer order protection rules, including how riskless principal transactions should be...

  1. 78 FR 17359 - Endangered Species; File No. 17095-01

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-21

    ...: Monitor shortnose and Atlantic sturgeon abundance and distribution through the Hudson River Biological.../or larvae (ELS) annually. To account for a higher than expected catch per tow sampling performed... sturgeon captured over the permit life. The Permit Holder will also expand the sampling activities for...

  2. Efficacy of ProTaper universal retreatment files in removing filling materials during root canal retreatment.

    PubMed

    Giuliani, Valentina; Cocchetti, Roberto; Pagavino, Gabriella

    2008-11-01

    The aim of this study was to evaluate the efficacy of the ProTaper Universal System rotary retreatment system, ProFile 0.06 rotary instruments, and hand instruments (K-files) in the removal of root filling materials. Forty-two extracted single-rooted anterior teeth were selected. The root canals were enlarged with nickel-titanium (NiTi) rotary files, filled with gutta-percha and sealer, and randomly divided into 3 experimental groups. The filling materials were removed with solvent in conjunction with one of the following devices and techniques: the ProTaper Universal System for retreatment, ProFile 0.06, or hand instruments (K-file). The roots were sectioned longitudinally, and the root surfaces were photographed. The images were captured in JPEG format; the areas of remaining filling material and the time required for removing the gutta-percha and sealer were compared using the nonparametric one-way Kruskal-Wallis and Tukey-Kramer tests, respectively. The ProTaper Universal System retreatment files removed filling material most effectively, and the ProFile rotary instruments yielded better root canal cleanliness than the hand instruments, although the difference was not statistically significant. The ProTaper Universal System for retreatment and the ProFile rotary instruments worked significantly faster than the K-files. The ProTaper Universal System retreatment files left cleaner root canal walls than the K-file hand instruments and the ProFile rotary instruments, although none of the devices guaranteed complete removal of the filling materials. The rotary NiTi systems proved to be faster than hand instruments in removing root filling materials.
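
    For context on the statistics mentioned above, the snippet below runs a Kruskal-Wallis test across three groups with scipy; the numbers are invented for illustration and are not data from the study.

```python
# Illustrative only: a Kruskal-Wallis test across three groups of
# "remaining filling material" values (made-up numbers, not study data).
from scipy import stats

protaper = [4.1, 3.8, 5.0, 4.4]
profile  = [6.2, 5.9, 7.1, 6.5]
kfile    = [7.0, 6.8, 8.2, 7.5]

h_stat, p_value = stats.kruskal(protaper, profile, kfile)
print(f"H = {h_stat:.2f}, p = {p_value:.3f}")
```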

  3. Data File Standard for Flow Cytometry, version FCS 3.1.

    PubMed

    Spidlen, Josef; Moore, Wayne; Parks, David; Goldberg, Michael; Bray, Chris; Bierre, Pierre; Gorombey, Peter; Hyun, Bill; Hubbard, Mark; Lange, Simon; Lefebvre, Ray; Leif, Robert; Novo, David; Ostruszka, Leo; Treister, Adam; Wood, James; Murphy, Robert F; Roederer, Mario; Sudar, Damir; Zigon, Robert; Brinkman, Ryan R

    2010-01-01

    The flow cytometry data file standard provides the specifications needed to completely describe flow cytometry data sets within the confines of the file containing the experimental data. In 1984, the first Flow Cytometry Standard format for data files was adopted as FCS 1.0. This standard was modified in 1990 as FCS 2.0 and again in 1997 as FCS 3.0. We report here on the next generation flow cytometry standard data file format. FCS 3.1 is a minor revision based on suggested improvements from the community. The unchanged goal of the standard is to provide a uniform file format that allows files created by one type of acquisition hardware and software to be analyzed by any other type.The FCS 3.1 standard retains the basic FCS file structure and most features of previous versions of the standard. Changes included in FCS 3.1 address potential ambiguities in the previous versions and provide a more robust standard. The major changes include simplified support for international characters and improved support for storing compensation. The major additions are support for preferred display scale, a standardized way of capturing the sample volume, information about originality of the data file, and support for plate and well identification in high throughput, plate based experiments. Please see the normative version of the FCS 3.1 specification in Supporting Information for this manuscript (or at http://www.isac-net.org/ in the Current standards section) for a complete list of changes.

  4. Data File Standard for Flow Cytometry, Version FCS 3.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spidlen, Josef; Moore, Wayne; Parks, David

    2009-11-10

    The flow cytometry data file standard provides the specifications needed to completely describe flow cytometry data sets within the confines of the file containing the experimental data. In 1984, the first Flow Cytometry Standard format for data files was adopted as FCS 1.0. This standard was modified in 1990 as FCS 2.0 and again in 1997 as FCS 3.0. We report here on the next generation flow cytometry standard data file format. FCS 3.1 is a minor revision based on suggested improvements from the community. The unchanged goal of the standard is to provide a uniform file format that allows files created by one type of acquisition hardware and software to be analyzed by any other type. The FCS 3.1 standard retains the basic FCS file structure and most features of previous versions of the standard. Changes included in FCS 3.1 address potential ambiguities in the previous versions and provide a more robust standard. The major changes include simplified support for international characters and improved support for storing compensation. The major additions are support for preferred display scale, a standardized way of capturing the sample volume, information about originality of the data file, and support for plate and well identification in high throughput, plate based experiments. Please see the normative version of the FCS 3.1 specification in Supporting Information for this manuscript (or at http://www.isac-net.org/ in the Current standards section) for a complete list of changes.
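
    As a rough illustration of the file structure described in these two records, the sketch below reads the fixed-width HEADER offsets and the delimiter-separated TEXT keywords of an FCS 3.x file. It follows the published FCS layout as understood here, but it is an unvalidated sketch, not a reference reader.

```python
# Minimal sketch (not a validated FCS reader): read the HEADER offsets and
# the delimited TEXT segment key/value pairs from an FCS 3.x file.
def read_fcs_text(path):
    with open(path, "rb") as f:
        header = f.read(58)
        version = header[0:6].decode("ascii")   # e.g. "FCS3.1"
        text_begin = int(header[10:18])          # ASCII-encoded byte offsets
        text_end = int(header[18:26])

        f.seek(text_begin)
        text = f.read(text_end - text_begin + 1).decode("utf-8", errors="replace")

    delim = text[0]                              # first byte is the delimiter
    tokens = text[1:].split(delim)
    # Keys and values alternate; escaped delimiters are ignored in this sketch.
    keywords = dict(zip(tokens[0::2], tokens[1::2]))
    return version, keywords

# Example usage with a hypothetical file:
# version, kw = read_fcs_text("sample.fcs")
# print(version, kw.get("$TOT"), kw.get("$PAR"))
```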

  5. Incorporating a Capability for Estimating Inhalation Doses in ...

    EPA Pesticide Factsheets

    Report and Data Files. This report presents the approach to be used to incorporate into the U.S. Environmental Protection Agency's TEVA-SPOT software (U.S. EPA 2014) a capability for estimating inhalation doses that result from the most important sources of contaminated aerosols and volatile contaminants during a contamination event.

  6. AQUATOX Frequently Asked Questions

    EPA Pesticide Factsheets

    Capabilities, Installation, Source Code, Example Study Files, Biotic State Variables, Initial Conditions, Loadings, Volume, Sediments, Parameters, Libraries, Ecotoxicology, Waterbodies, Link to Watershed Models, Output, Metals, Troubleshooting

  7. Continuous-Energy Data Checks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haeck, Wim; Conlin, Jeremy Lloyd; McCartney, Austin Paul

    The purpose of this report is to provide an overview of all Quality Assurance tests that have to be performed on a nuclear data set to be transformed into an ACE formatted nuclear data file. The ACE file is capable of containing different types of data such as continuous energy neutron data, thermal scattering data, etc. Within this report, we will limit ourselves to continuous energy neutron data.

  8. A Modernization Plan for the Technical Data Department of the Naval Ships Weapon Systems Engineering Station

    DTIC Science & Technology

    1976-09-01

    technology has made possible the deployment of very sophisticated and highly capable weapon systems. Taking advantage of this technology has carried...3) Ancillary Equipment 208 Types Numerous Notes: 1. Number of ships with this system 2. Includes Tartar used only for surface capability 3. These...maintains the Configuration Item Identification File (CIIF). The CIIF provides storage and retrieval capability for technical and logistics data specified on

  9. VSO For Dummies

    NASA Astrophysics Data System (ADS)

    Schwartz, Richard A.; Zarro, D.; Csillaghy, A.; Dennis, B.; Tolbert, A. K.; Etesi, L.

    2009-05-01

    We report on our activities to integrate VSO search and retrieval capabilities into standard data access, display, and analysis tools. In addition to its standard Web-based search form, the VSO provides an Interactive Data Language (IDL) client (vso_search) that is available through the Solar Software (SSW) package. We have incorporated this client into an IDL-widget interface program (show_synop) that allows for more simplified searching and downloading of VSO datasets directly into a user's IDL data analysis environment. In particular, we have provided the capability to read VSO datasets into a general purpose IDL package (plotman) that can display different datatypes (lightcurves, images, and spectra) and perform basic data operations such as zooming, image overlays, solar rotation, etc. Currently, the show_synop tool supports access to ground-based and space-based (SOHO, STEREO, and Hinode) observations, and has the capability to include new datasets as they become available. A user encounters two major hurdles when using the VSO: (1) Instrument-specific software (such as level-0 file readers and data-prepping procedures) may not be available in the user's local SSW distribution. (2) Recent calibration files (such as flat-fields) are not automatically distributed with the analysis software. To address these issues, we have developed a dedicated server (prepserver) that incorporates all the latest instrument-specific software libraries and calibration files. The prepserver uses an IDL-Java bridge to read and implement data processing requests from a client and return a processed data file that can be readily displayed with the show_synop/plotman package. The advantage of the prepserver is that the user is only required to install the general branch (gen) of the SSW tree, and is freed from the more onerous task of installing instrument-specific libraries and calibration files. We will demonstrate how the prepserver can be used to read, process, and overlay SOHO/EIT, TRACE, SECCHI/EUVI, and RHESSI images.

  10. 77 FR 38587 - Marine Mammals; File No. 14325

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-28

    ... appointment in the following offices: Permits and Conservation Division, Office of Protected Resources, NMFS... restraint of pups in the eastern Distinct Population Segment (eDPS) and western DPS (wDPS); capture of adult... have a significant adverse impact on the human environment. As required by the ESA, issuance of this...

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruiz, Steven Adriel

    The following discussion contains a high-level description of methods used to implement software for data processing. It describes the directory structures and file handling required to use Excel's Visual Basic for Applications programming language, and how to identify shot, test, and capture types to appropriately process data. It also describes how to interface with the software.

  12. 76 FR 78890 - Endangered Species; File No. 15566

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-20

    ... coriacea), and 1 hawksbill (Eretmochelys imbricata) sea turtle in order to assess temporal change in catch rates, size distributions, sex and genetic ratios, and health of sea turtles. Captures occur annually in... PIT tagged, photographed, and released. No other changes would be made to the permit. The purpose of...

  13. 78 FR 16392 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-14

    ... Broadcast messaging system to a self-managed software with almost immediate dissemination; and (9) upgraded... exporting used self-propelled vehicles. The requirement to file in the AES for all used self- propelled... the AES are for used self-propelled vehicles. The Census Bureau does not capture statistics for used...

  14. 76 FR 58469 - Endangered Species; File Nos. 16526, 16323, 16436, 16422, 16438, 16431, 16507, 16547, 16375...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-21

    ... juvenile Atlantic sturgeon in the Delaware River to locate nursery habitat, characterize population ecology... movement patterns and rate of exchange between coastal river systems in Maine, characterize the population structure and generate estimates of population abundance. Researchers would capture adult, juvenile, and...

  15. 77 FR 21754 - Endangered Species; File Nos. 16526, 16323, 16436, 16422, 16438, 16431, 16507, 16547, 16375...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-11

    ... locate nursery habitat, and characterize population ecology and habitat use. Fish will be captured using... river systems in Maine, characterize the population structure, and generate estimates of population... populations to determine behavior, movement, and current status of the species in Connecticut waters. Adult...

  16. 75 FR 33578 - Endangered Species; File Nos. 14508 and 14655

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-14

    ... (Lepidochelys kempii) sea turtles for purposes of scientific research. ADDRESSES: The permit and related... abundance, genetic origin and feeding ecology of sea turtles using Lake Worth Lagoon in Palm Beach County, Florida. Up to 50 green, 5 loggerhead, 2 hawksbill, and 1 Kemp's ridley sea turtles may be captured...

  17. 78 FR 19217 - Endangered Species; File No. 16547-01

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-29

    ... below 22 parts per thousand salinity. Researchers are currently authorized to capture adult, juvenile... telemetry tags dependent on the life stage (adult, sub-adult and juvenile) and the salinity level where... salinity level in the waters of Virginia and Maryland. All previous activities are authorized; however, the...

  18. Variable-Speed Screw Chiller, Sidney Yates Building, Washington, DC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ostrouchov, George; Adams, Mark B.; Howett, Daniel H.

    2017-07-01

    This report captures the findings from an evaluation ORNL performed on a new chiller technology as part of GSA's Proving Ground Program. Note: Appendices B&C were removed from this report while the author looks for a way to insert them without consuming over 200MB of file size.

  19. Records Inventory Data Collection Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, Brian A.

    1995-03-01

    DATALINK was created to provide an easy to use data collection program for records management software products. It provides several useful tools for capturing and validating record index data in the field. It also allows users to easily create a comma delimited, ASCII text file for data export into most records management software products.
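
    The comma-delimited ASCII export mentioned above can be produced generically as in the sketch below; the column names are hypothetical and are not DATALINK's actual schema.

```python
# Generic example of writing a comma-delimited ASCII export file of record
# index data; the field names are hypothetical, not DATALINK's actual schema.
import csv

records = [
    {"box_id": "B-0001", "title": "Quarterly reports", "date_range": "1991-1993"},
    {"box_id": "B-0002", "title": "Correspondence",    "date_range": "1994"},
]

with open("inventory_export.txt", "w", newline="", encoding="ascii") as f:
    writer = csv.DictWriter(f, fieldnames=["box_id", "title", "date_range"])
    writer.writeheader()
    writer.writerows(records)
```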

  20. 75 FR 11132 - Marine Mammals; File No. 555-1870

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-10

    ... appointment in the following offices: Permits, Conservation and Education Division, Office of Protected... capture, sedation, tagging, and sampling from 40 seals (20 males and 20 females) to 70 seals (35 males and 35 females). The request to bring up to six seals into temporary captivity for a pilot study to...

  1. 75 FR 54093 - Marine Mammals; File No. 555-1870

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-03

    ... appointment in the following offices: Permits, Conservation and Education Division, Office of Protected... annually in California by capture, sedation, tagging, and sampling from 40 seals (20 males and 20 females) to 70 seals (35 males and 35 females). Also, a pilot study is authorized to bring up to six seals...

  2. Files synchronization from a large number of insertions and deletions

    NASA Astrophysics Data System (ADS)

    Ellappan, Vijayan; Kumari, Savera

    2017-11-01

    Synchronization between different versions of files is becoming a major issue that most applications face. To make applications more efficient, an economical algorithm is developed from the previously used "File Loading Algorithm". We extend this algorithm in three ways: first, by dealing with non-binary files; second, by generating a backup for uploaded files; and lastly, by synchronizing each file over insertions and deletions. A user can reconstruct a file from the former file while minimizing error, and the approach also provides interactive communication by eliminating the frequency without any disturbance. The drawback of the previous system is overcome by using synchronization, in which multiple copies of each file/record are created and stored in a backup database and efficiently restored in case of any unwanted deletion or loss of data. That is, we introduce a protocol that user B may use to reconstruct file X from file Y with suitably low probability of error. Synchronization algorithms find numerous areas of use, including data storage, file sharing, source code control systems, and cloud applications. For example, cloud storage services such as Dropbox synchronize between local copies and cloud backups each time users make changes to local versions. Similarly, synchronization tools are necessary in mobile devices. Specialized synchronization algorithms are used for video and sound editing. Synchronization tools are also capable of performing data duplication.
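
    A minimal illustration of the reconstruction idea described above (not the authors' algorithm) is sketched below: the difference between an old file Y and a new file X is encoded as a list of insertions and deletions with Python's difflib, and X is rebuilt from Y plus that delta.

```python
# Illustration only (not the paper's algorithm): encode the difference between
# an old file Y and a new file X as opcodes, then rebuild X from Y + opcodes.
import difflib

def make_delta(old_lines, new_lines):
    sm = difflib.SequenceMatcher(a=old_lines, b=new_lines)
    # Keep only what a receiver holding `old_lines` needs to rebuild `new_lines`.
    return [(tag, i1, i2, new_lines[j1:j2])
            for tag, i1, i2, j1, j2 in sm.get_opcodes()]

def apply_delta(old_lines, delta):
    out = []
    for tag, i1, i2, new_chunk in delta:
        if tag == "equal":
            out.extend(old_lines[i1:i2])   # copy unchanged region from Y
        elif tag in ("replace", "insert"):
            out.extend(new_chunk)          # take new text from the delta
        # "delete": copy nothing
    return out

old = ["a\n", "b\n", "c\n"]
new = ["a\n", "B\n", "c\n", "d\n"]
assert apply_delta(old, make_delta(old, new)) == new
```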

  3. A Next-Generation Parallel File System Environment for the OLCF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dillow, David A; Fuller, Douglas; Gunasekaran, Raghul

    2012-01-01

    When deployed in 2008/2009, the Spider system at the Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) was the world's largest-scale Lustre parallel file system. Envisioned as a shared parallel file system capable of delivering both the bandwidth and capacity requirements of the OLCF's diverse computational environment, Spider has since become a blueprint for shared Lustre environments deployed worldwide. Designed to support the parallel I/O requirements of the Jaguar XT5 system and other smaller-scale platforms at the OLCF, the upgrade to the Titan XK6 heterogeneous system will begin to push the limits of Spider's original design by mid 2013. With a doubling in total system memory and a 10x increase in FLOPS, Titan will require both higher bandwidth and larger total capacity. Our goal is to provide a 4x increase in total I/O bandwidth, from over 240 GB/sec today to 1 TB/sec, and a doubling in total capacity. While aggregate bandwidth and total capacity remain important capabilities, an equally important goal in our efforts is dramatically increasing metadata performance, currently the Achilles heel of parallel file systems at leadership scale. We present in this paper an analysis of our current I/O workloads, our operational experiences with the Spider parallel file systems, the high-level design of our Spider upgrade, and our efforts in developing benchmarks that synthesize our performance requirements based on our workload characterization studies.

  4. Interfaces between statistical analysis packages and the ESRI geographic information system

    NASA Technical Reports Server (NTRS)

    Masuoka, E.

    1980-01-01

    Interfaces between ESRI's geographic information system (GIS) data files and real valued data files written to facilitate statistical analysis and display of spatially referenced multivariable data are described. An example of data analysis which utilized the GIS and the statistical analysis system is presented to illustrate the utility of combining the analytic capability of a statistical package with the data management and display features of the GIS.

  5. JOVIAL (J73) to Ada Translator.

    DTIC Science & Technology

    1982-06-01

    editors, file managers, and other APSE, the Translator will provide significant (though not total) automation of the conversion of J73 programs for use...global knowledge only of compool declarations; externals are not resolved until the compiled modules are linked. Creating a global data base during...translation (as shown in Figure 2-1) will require the job control, file management, and text editing capabilities which are provided by a typical

  6. Protecting Your Computer from Viruses

    ERIC Educational Resources Information Center

    Descy, Don E.

    2006-01-01

    A computer virus is defined as a software program capable of reproducing itself and usually capable of causing great harm to files or other programs on the same computer. The existence of computer viruses--or the necessity of avoiding viruses--is part of using a computer. With the advent of the Internet, the door was opened wide for these…

  7. WADeG Cell Phone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2009-09-01

    The on-phone software captures images from the CMOS camera periodically, stores the pictures, and periodically transmits those images over the cellular network to the server. The cell phone software consists of several modules: CamTest.cpp, CamStarter.cpp, StreamIOHandler.cpp, and covertSmartDevice.cpp. The camera application on the SmartPhone is CamStarter, which is "the" user interface for the camera system. The CamStarter user interface allows a user to start/stop the camera application and transfer files to the server. The CamStarter application interfaces to the CamTest application through registry settings. Both the CamStarter and CamTest applications must be separately deployed on the smartphone to run the camera system application. When a user selects the Start button in CamStarter, CamTest is created as a process. The smartphone begins taking small pictures (CAPTURE mode), analyzing those pictures for certain conditions, and saving those pictures on the smartphone. This process terminates when the user selects the Stop button. The CamTest code spins off an asynchronous thread, StreamIOHandler, to check for pictures taken by the camera. The received image is then tested by StreamIOHandler to see if it meets certain conditions. If those conditions are met, the CamTest program is notified through the setting of a registry key value and the image is saved in a designated directory in a custom BMP file which includes a header and the image data. When the user selects the Transfer button in the CamStarter user interface, the covertSmartDevice code is created as a process. CovertSmartDevice gets all of the files in a designated directory, opens a socket connection to the server, sends each file, and then terminates.

  8. Performance of the Galley Parallel File System

    NASA Technical Reports Server (NTRS)

    Nieuwejaar, Nils; Kotz, David

    1996-01-01

    As the input/output (I/O) needs of parallel scientific applications increase, file systems for multiprocessors are being designed to provide applications with parallel access to multiple disks. Many parallel file systems present applications with a conventional Unix-like interface that allows the application to access multiple disks transparently. This interface conceals the parallelism within the file system, which increases the ease of programmability, but makes it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. Furthermore, most current parallel file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic parallel workloads. Initial experiments, reported in this paper, indicate that Galley is capable of providing high-performance I/O to applications that access data in patterns that have been observed to be common.

  9. LVFS: A Big Data File Storage Bridge for the HPC Community

    NASA Astrophysics Data System (ADS)

    Golpayegani, N.; Halem, M.; Mauoka, E.; Fonseca, L. F.

    2015-12-01

    Merging Big Data capabilities into High Performance Computing architecture starts at the file storage level. Heterogeneous storage systems are emerging which offer enhanced features for dealing with Big Data, such as the IBM GPFS storage system's integration into Hadoop Map-Reduce. Taking advantage of these capabilities requires file storage systems to be adaptive and accommodate these new storage technologies. We present the extension of the Lightweight Virtual File System (LVFS), currently running as the production system for the MODIS Level 1 and Atmosphere Archive and Distribution System (LAADS), to incorporate a flexible plugin architecture which allows easy integration of new HPC hardware and/or software storage technologies without disrupting workflows or system architectures, and with only minimal impact on existing tools. We consider two essential aspects provided by the LVFS plugin architecture needed for the future HPC community. First, it allows for the seamless integration of new and emerging hardware technologies which are significantly different than existing technologies, such as Seagate's Kinetic disks and Intel's 3D XPoint non-volatile storage. Second is the transparent and instantaneous conversion between new software technologies and various file formats. With most current storage systems, a switch in file format would require costly reprocessing and a near doubling of storage requirements. We will install LVFS on UMBC's IBM iDataPlex cluster with a heterogeneous storage architecture utilizing local, remote, and Seagate Kinetic storage as a case study. LVFS merges different kinds of storage architectures to show users a uniform layout and, therefore, prevent any disruption in workflows, architecture design, or tool usage. We will show how LVFS will convert HDF data, produced by applying machine learning algorithms to Xco2 Level 2 data from the OCO-2 satellite to produce CO2 surface fluxes, into GeoTIFF for visualization.
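
    The plugin idea described above can be illustrated, very loosely, with a storage-backend interface like the sketch below; the class and method names are hypothetical and are not the LVFS API.

```python
# Conceptual sketch of a storage-backend plugin interface (hypothetical names;
# not the actual LVFS API). A new storage technology is added by registering
# another backend class, without changing the code that reads files.
from abc import ABC, abstractmethod

class StorageBackend(ABC):
    @abstractmethod
    def read(self, path: str) -> bytes: ...
    @abstractmethod
    def write(self, path: str, data: bytes) -> None: ...

class PosixBackend(StorageBackend):
    def read(self, path):
        with open(path, "rb") as f:
            return f.read()
    def write(self, path, data):
        with open(path, "wb") as f:
            f.write(data)

BACKENDS = {"posix": PosixBackend()}   # e.g. add "kinetic", "s3", ... here

def read_file(url):
    """Dispatch a read to whichever backend owns the URL scheme."""
    scheme, _, path = url.partition("://")
    return BACKENDS[scheme].read(path)
```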

  10. MPST Software: MoonKommand

    NASA Technical Reports Server (NTRS)

    Kwok, John H.; Call, Jared A.; Khanampornpan, Teerapat

    2012-01-01

    This software automatically processes Sally Ride Science (SRS) delivered MoonKAM camera control files (ccf) into uplink products for the GRAIL-A and GRAIL-B spacecraft as part of an education and public outreach (EPO) extension to the GRAIL mission. Once properly validated and deemed safe for execution onboard the spacecraft, MoonKommand generates the command products via the Automated Sequence Processor (ASP) and generates uplink (.scmf) files for radiation to the GRAIL-A and/or GRAIL-B spacecraft. Any errors detected along the way are reported back to SRS via email. With MoonKommand, SRS can control their EPO instrument as part of a fully automated process. Inputs are received from SRS as either image capture files (.ccficd) for new image requests, or downlink/delete files (.ccfdl) for requesting image downlink from the instrument and on-board memory management. The MoonKommand outputs are command and file-load (.scmf) files that will be uplinked by the Deep Space Network (DSN). Without MoonKommand software, uplink product generation for the MoonKAM instrument would be a manual process. The software is specific to the MoonKAM instrument on the GRAIL mission. At the time of this writing, the GRAIL mission was making final preparations to begin the science phase, which was scheduled to continue until June 2012.

  11. ESCHER: An interactive mesh-generating editor for preparing finite-element input

    NASA Technical Reports Server (NTRS)

    Oakes, W. R., Jr.

    1984-01-01

    ESCHER is an interactive mesh generation and editing program designed to help the user create a finite-element mesh, create additional input for finite-element analysis, including initial conditions, boundary conditions, and slidelines, and generate a NEUTRAL FILE that can be postprocessed for input into several finite-element codes, including ADINA, ADINAT, DYNA, NIKE, TSAAS, and ABAQUS. Two important ESCHER capabilities, interactive geometry creation and mesh archival storage, are described in detail. Also described is the interactive command language and the use of interactive graphics. The archival storage and restart file is a modular, entity-based mesh data file. Modules of this file correspond to separate editing modes in the mesh editor, with data definition syntax preserved between the interactive commands and the archival storage file. Because ESCHER was expected to be highly interactive, extensive user documentation was provided in the form of an interactive HELP package.

  12. Efficiency of the Self Adjusting File, WaveOne, Reciproc, ProTaper and hand files in root canal debridement.

    PubMed

    Topcu, K Meltem; Karatas, Ertugrul; Ozsu, Damla; Ersoy, Ibrahim

    2014-07-01

    The aim of this study was to compare the canal debridement capabilities of three single file systems, ProTaper, and K-files in oval-shaped canals. Seventy-five extracted human mandibular central incisors with oval-shaped root canals were selected. A radiopaque contrast medium (Metapex; Meta Biomed Co. Ltd., Chungcheongbuk-do, Korea) was introduced into the canal systems and the self-adjusting file (SAF), WaveOne, Reciproc, ProTaper, and K-files were used for the instrumentation of the canals. The percentage of removed contrast medium was calculated using pre- and post-operative radiographs. An overall comparison between the groups revealed that the hand file (HF) and SAF groups presented the lowest percentage of removed contrast medium, whereas the WaveOne group showed the highest percentage (P < 0.001). The ProTaper group removed more contrast medium than the SAF and HF groups (P < 0.05). None of the instruments was able to remove the contrast medium completely. WaveOne performed significantly better than other groups.

  13. Efficiency of the Self Adjusting File, WaveOne, Reciproc, ProTaper and hand files in root canal debridement

    PubMed Central

    Topcu, K. Meltem; Karatas, Ertugrul; Ozsu, Damla; Ersoy, Ibrahim

    2014-01-01

    Objectives: The aim of this study was to compare the canal debridement capabilities of three single file systems, ProTaper, and K-files in oval-shaped canals. Materials and Methods: Seventy-five extracted human mandibular central incisors with oval-shaped root canals were selected. A radiopaque contrast medium (Metapex; Meta Biomed Co. Ltd., Chungcheongbuk-do, Korea) was introduced into the canal systems and the self-adjusting file (SAF), WaveOne, Reciproc, ProTaper, and K-files were used for the instrumentation of the canals. The percentage of removed contrast medium was calculated using pre- and post-operative radiographs. Results: An overall comparison between the groups revealed that the hand file (HF) and SAF groups presented the lowest percentage of removed contrast medium, whereas the WaveOne group showed the highest percentage (P < 0.001). The ProTaper group removed more contrast medium than the SAF and HF groups (P < 0.05). Conclusions: None of the instruments was able to remove the contrast medium completely. WaveOne performed significantly better than other groups. PMID:25202211

  14. Technology Tips

    ERIC Educational Resources Information Center

    Mathematics Teacher, 2004

    2004-01-01

    Some inexpensive or free ways that enable to capture and use images in work are mentioned. The first tip demonstrates the methods of using some of the built-in capabilities of the Macintosh and Windows-based PC operating systems, and the second tip describes methods to capture and create images using SnagIt.

  15. Brady's Geothermal Field - DTS Raw Data

    DOE Data Explorer

    Thomas Coleman

    2016-03-26

    The submitted data correspond to the complete raw temperature datasets captured by the distributed temperature sensing (DTS) horizontal and vertical arrays during the PoroTomo Experiment. Files in each submitted resource include: .xml (level 0), data that includes Stokes, Anti-Stokes, and temperature data; .csv (level 1), data that includes temperature; and PT100 reference probe data.

  16. DATALINK. Records Inventory Data Collection Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, B.

    1995-03-01

    DATALINK was created to provide an easy to use data collection program for records management software products. It provides several useful tools for capturing and validating record index data in the field. It also allows users to easily create a comma delimited, ASCII text file for data export into most records management software products.

  17. 76 FR 13603 - Marine Mammals; File No. 15748

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-14

    ... capture of up to 30 adult females and 20 pups/juveniles of either sex annually. Adult females determined to not be pregnant and pups/juveniles of either sex would be anesthetized or sedated, have scientific... retrieve instruments. An additional 300 seals of any age and either sex may be harassed incidental to the...

  18. The IBM PC as an Online Search Machine. Part 5: Searching through Crosstalk.

    ERIC Educational Resources Information Center

    Kolner, Stuart J.

    1985-01-01

    This last of a five-part series on using the IBM personal computer for online searching highlights a brief review, search process, making the connection, switching between screens and modes, online transaction, capture buffer controls, coping with options, function keys, script files, processing downloaded information, note to TELEX users, and…

  19. The Development of an Occupational Information System (OIS), Volume II.

    ERIC Educational Resources Information Center

    Louisiana State Univ., Baton Rouge. Div. of Continuing Education.

    Suppliers of postsecondary trained manpower data in Louisiana were surveyed during a project to obtain labor market information and occupational supply/demand information. All supply data were computerized and assigned to the appropriate training file. The data compiled fell into four major categories with regard to the method(s) of capture and…

  20. 75 FR 14154 - Notice of Receipt of Several Pesticide Petitions Filed for Residues of Pesticide Chemicals in or...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-24

    ... for plant commodities) are both gas liquid chromatography (GLC) methods with electron capture... analytical methodology using liquid chromatography/mass spectrometry/mass spectrometry (LC/MS/MS) detection...-aminopropane]) using gas chromatography (GC) have been submitted to the EPA. In addition, a new validated...

  1. Hypersonic flow analysis

    NASA Technical Reports Server (NTRS)

    Chow, Chuen-Yen; Ryan, James S.

    1987-01-01

    While the zonal grid system of Transonic Navier-Stokes (TNS) provides excellent modeling of complex geometries, improved shock capturing, and a higher Mach number range will be required if flows about hypersonic aircraft are to be modeled accurately. A computational fluid dynamics (CFD) code, the Compressible Navier-Stokes (CNS), is under development to combine the required high Mach number capability with the existing TNS geometry capability. One of several candidate flow solvers for inclusion in the CNS is that of F3D. This upwinding flow solver promises improved shock capturing, and more accurate hypersonic solutions overall, compared to the solver currently used in TNS.

  2. NASA's Kepler Reveals Potential New Worlds - Raw Video New File

    NASA Image and Video Library

    2017-06-19

    This is a video file, or a collection of unedited video clips for media usage, in support of the Kepler mission's latest discovery announcement. Launched in 2009, the Kepler space telescope is our first mission capable of identifying Earth-size planets around other stars. On Monday, June 19, 2017, scientists announced the results from the latest Kepler candidate catalog of the mission at a press conference at NASA's Ames Research Center.

  3. The Use of NanoTrap Particles as a Sample Enrichment Method to Enhance the Detection of Rift Valley Fever Virus

    PubMed Central

    Shafagati, Nazly; Narayanan, Aarthi; Baer, Alan; Fite, Katherine; Pinkham, Chelsea; Bailey, Charles; Kashanchi, Fatah; Lepene, Benjamin; Kehn-Hall, Kylene

    2013-01-01

    Background: Rift Valley Fever Virus (RVFV) is a zoonotic virus that is not only an emerging pathogen but is also considered a biodefense pathogen due to the threat it may pose to public health and national security. The current state of diagnosis has led to misdiagnosis early on in infection. Here we describe the use of a novel sample preparation technology, NanoTrap particles, to enhance the detection of RVFV. Previous studies demonstrated that NanoTrap particles lead both to 100 percent capture of protein analytes and to an improvement of more than 100-fold in sensitivity compared to existing methods. Here we extend these findings by demonstrating the capture and enrichment of viruses. Results: Screening of NanoTrap particles indicated that one particle, NT53, was the most efficient at RVFV capture as demonstrated by both qRT-PCR and plaque assays. Importantly, NT53 capture of RVFV resulted in greater than 100-fold enrichment from low viral titers when other diagnostic assays may produce false negatives. NT53 was also capable of capturing and enhancing RVFV detection from serum samples. RVFV that was inactivated through either detergent or heat treatment was still found bound to NT53, indicating the ability to use NanoTrap particles for viral capture prior to transport to a BSL-2 environment. Furthermore, both NP-40-lysed virus and purified RVFV RNA were bound by NT53. Importantly, NT53 protected viral RNA from RNase A degradation, which was not observed with other commercially available beads. Incubation of RVFV samples with NT53 also resulted in increased viral stability as demonstrated through preservation of infectivity at elevated temperatures. Finally, NanoTrap particles were capable of capturing VEEV and HIV, demonstrating the broad applicability of NanoTrap particles for viral diagnostics. Conclusion: This study demonstrates that NanoTrap particles are capable of capturing, enriching, and protecting RVFV virions. Furthermore, the use of NanoTrap particles can be extended to a variety of viruses, including VEEV and HIV. PMID:23861988

  4. Evaluation of prompt gamma-ray data and nuclear structure of niobium-94 with statistical model calculations

    NASA Astrophysics Data System (ADS)

    Turkoglu, Danyal

    Precise knowledge of prompt gamma-ray intensities following neutron capture is critical for elemental and isotopic analyses, homeland security, modeling nuclear reactors, etc. A recently-developed database of prompt gamma-ray production cross sections and nuclear structure information in the form of a decay scheme, called the Evaluated Gamma-ray Activation File (EGAF), is under revision. Statistical model calculations are useful for checking the consistency of the decay scheme, providing insight on its completeness and accuracy. Furthermore, these statistical model calculations are necessary to estimate the contribution of continuum gamma-rays, which cannot be experimentally resolved due to the high density of excited states in medium- and heavy-mass nuclei. Decay-scheme improvements in EGAF lead to improvements to other databases (Evaluated Nuclear Structure Data File, Reference Input Parameter Library) that are ultimately used in nuclear-reaction models to generate the Evaluated Nuclear Data File (ENDF). Gamma-ray transitions following neutron capture in 93Nb have been studied at the cold-neutron beam facility at the Budapest Research Reactor. Measurements have been performed using a coaxial HPGe detector with Compton suppression. Partial gamma-ray production capture cross sections at a neutron velocity of 2200 m/s have been deduced relative to that of the 255.9-keV transition after cold-neutron capture by 93Nb. With the measurement of a niobium chloride target, this partial cross section was internally standardized to the cross section for the 1951-keV transition after cold-neutron capture by 35Cl. The resulting (0.1377 +/- 0.0018) barn (b) partial cross section produced a calibration factor that was 23% lower than previously measured for the EGAF database. The thermal-neutron cross sections were deduced for the 93Nb(n,gamma)94mNb and 93Nb(n,gamma)94gNb reactions by summing the experimentally-measured partial gamma-ray production cross sections associated with the ground-state transitions below the 396-keV level and combining that summation with the contribution to the ground state from the quasi-continuum above 396 keV, determined with Monte Carlo statistical model calculations using the DICEBOX computer code. These values, sigma_m and sigma_0, were (0.83 +/- 0.05) b and (1.16 +/- 0.11) b, respectively, and found to be in agreement with literature values. Comparison of the modeled population and experimental depopulation of individual levels confirmed tentative spin assignments and suggested changes where imbalances existed.
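
    Schematically, the thermal capture cross section quoted above is obtained by summing the measured partial cross sections for ground-state transitions and adding the simulated quasi-continuum feeding of the ground state; the notation below is assumed rather than quoted from the thesis.

```latex
% Schematic relation (notation assumed): sigma_0 is the total thermal capture
% cross section, the sum runs over experimentally resolved ground-state
% transitions below the critical energy E_crit (396 keV here), and sigma_cont
% is the simulated quasi-continuum contribution to the ground state.
\begin{equation}
  \sigma_0 \;=\; \sum_{E_i \,\le\, E_{\mathrm{crit}}} \sigma_{\gamma,i}^{\,\mathrm{g.s.}}
  \;+\; \sigma_{\mathrm{cont}}
\end{equation}
```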

  5. Proceedings of the Workshop on The Human-Computer Partnership in Decision-Support Held in San Luis Obispo, California on May 2-4, 2000

    DTIC Science & Technology

    2000-09-01

    commission in 1979. He holds a Bachelor of Science Degree from Southwest Missouri State University and is a graduate of the US Army’s Armor Officer Advance...1195/12 TARAWA ARG / 13TH MEU ~ WARNET successfully supported VTC, chat, file transfers, whiteboard collaboration • Used regularly to conduct CPR5 staff...Novel employment of WARNET capability • Whiteboard capability supported CIWS repair • Whiteboard capability used to familiarize medical staff on

  6. Functional Gap Analysis of the Maritime Operations Centers

    DTIC Science & Technology

    2009-12-01

    Messaging Services TBMCS, DJC2 MI.1.3.5 Manage Suspense Control Capability Gap MI.1.3.6 Provide Component IM Cell Services Capability Gap MI.1.4 Provide...Admin Support MSRT MI.1.3.3 Manage Electronic File Plan Capability Gap MI.1.3.4 Manage Messaging Services TBMCS, DJC2 MI.1.3.5 Manage Suspense...1.5.9 Execute C4 Policies & Procedures for the Joint Operations Area GCCS-J, DCGS-N, TBMCS, CENTRIX-M EHQ.1.11 Sub Component Interagency

  7. 76 FR 48833 - Notice of Filings of Self-Certifications of Coal Capability Under the Powerplant and Industrial...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-09

    ... Capability Under the Powerplant and Industrial Fuel Use Act AGENCY: Office Electricity Delivery and Energy...) of the Powerplant and Industrial Fuel Use Act of 1978 (FUA), as amended, and DOE regulations in 10..., the owner or operator of such a facility proposing to use natural gas or petroleum as its primary...

  8. 78 FR 26337 - Notice of Filing of Self-Certification of Coal Capability Under the Powerplant and Industrial...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-06

    ... Capability Under the Powerplant and Industrial Fuel Use Act AGENCY: Office of Electricity Delivery and Energy... Department of Energy (DOE) pursuant to Sec. 201(d) of the Powerplant and Industrial Fuel Use Act of 1978 (FUA... proposing to use natural gas or petroleum as its primary energy source shall certify to the Secretary of...

  9. 77 FR 74473 - Notice of Filing of Self-Certification of Coal Capability Under the Powerplant and Industrial...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-14

    ... Capability Under the Powerplant and Industrial Fuel Use Act AGENCY: Office Electricity Delivery and Energy... Department of Energy (DOE) pursuant to Sec. 201(d) of the Powerplant and Industrial Fuel Use Act of 1978 (FUA... of such a facility proposing to use natural gas or petroleum as its primary energy source shall...

  10. Multiplexed evaluation of capture agent binding kinetics using arrays of silicon photonic microring resonators.

    PubMed

    Byeon, Ji-Yeon; Bailey, Ryan C

    2011-09-07

    High affinity capture agents recognizing biomolecular targets are essential in the performance of many proteomic detection methods. Herein, we report the application of a label-free silicon photonic biomolecular analysis platform for simultaneously determining kinetic association and dissociation constants for two representative protein capture agents: a thrombin-binding DNA aptamer and an anti-thrombin monoclonal antibody. The scalability and inherent multiplexing capability of the technology make it an attractive platform for simultaneously evaluating the binding characteristics of multiple capture agents recognizing the same target antigen, and thus a tool complementary to emerging high-throughput capture agent generation strategies.
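
    For context, kinetic association and dissociation constants of the kind measured above are conventionally extracted by fitting the sensor response to a 1:1 binding model; the generic form is shown below and is not taken from the paper.

```latex
% Standard 1:1 binding model used to extract kinetic constants from
% label-free sensor data (generic form, not quoted from the paper):
% R(t) is the sensor response, C the analyte concentration, R_max the
% saturation response, k_a and k_d the association and dissociation rates.
\begin{align}
  \frac{dR}{dt} &= k_a\, C \left( R_{\max} - R \right) - k_d\, R \\
  K_D &= \frac{k_d}{k_a}
\end{align}
```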

  11. Carcinogen File: The Ames Test.

    ERIC Educational Resources Information Center

    Kendall, Jim; Kriebel, David

    1979-01-01

    This test measures the capability of a chemical substance to cause mutations in special strains of the bacterium Salmonella. It is quick, taking only forty-eight hours, inexpensive, and reliable. (BB)

  12. Capabilities, methodologies, and use of the cambio file-translation application.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lasche, George P.

    2007-03-01

    This report describes the capabilities, methodologies, and uses of the Cambio computer application, designed to automatically read and display nuclear spectral data files of any known format in the world and to convert spectral data to one of several commonly used analysis formats. To further assist responders, Cambio incorporates an analysis method based on non-linear fitting techniques found in open literature and implemented in openly published source code in the late 1980s. A brief description is provided of how Cambio works, of what basic formats it can currently read, and how it can be used. Cambio was developed at Sandia National Laboratories and is provided as a free service to assist nuclear emergency response analysts anywhere in the world in the fight against nuclear terrorism.

  13. Biodegradable nano-films for capture and non-invasive release of circulating tumor cells.

    PubMed

    Li, Wei; Reátegui, Eduardo; Park, Myoung-Hwan; Castleberry, Steven; Deng, Jason Z; Hsu, Bryan; Mayner, Sarah; Jensen, Anne E; Sequist, Lecia V; Maheswaran, Shyamala; Haber, Daniel A; Toner, Mehmet; Stott, Shannon L; Hammond, Paula T

    2015-10-01

    Selective isolation and purification of circulating tumor cells (CTCs) from whole blood is an important capability for both clinical medicine and biological research. Current techniques to perform this task place the isolated cells under excessive stresses that reduce cell viability, and potentially induce phenotype change, therefore losing valuable information about the isolated cells. We present a biodegradable nano-film coating on the surface of a microfluidic chip, which can be used to effectively capture as well as non-invasively release cancer cell lines such as PC-3, LNCaP, DU 145, H1650 and H1975. We have applied layer-by-layer (LbL) assembly to create a library of ultrathin coatings using a broad range of materials through complementary interactions. By developing an LbL nano-film coating with an affinity-based cell-capture surface that is capable of selectively isolating cancer cells from whole blood, and that can be rapidly degraded on command, we are able to gently isolate cancer cells and recover them without compromising cell viability or proliferative potential. Our approach has the capability to overcome practical hurdles and provide viable cancer cells for downstream analyses, such as live cell imaging, single cell genomics, and in vitro cell culture of recovered cells. Furthermore, CTCs from cancer patients were also captured, identified, and successfully released using the LbL-modified microchips. Published by Elsevier Ltd.

  14. A Scientific Data Provenance API for Distributed Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raju, Bibi; Elsethagen, Todd O.; Stephan, Eric G.

    Data provenance has been an active area of research as a means to standardize how the origin of data, process event history, and what or who was responsible for influencing results is explained. There are two approaches to capture provenance information. The first approach is to collect observed evidence produced by an executing application using log files, event listeners, and temporary files that are used by the application or application developer. The provenance translated from these observations is an interpretation of the provided evidence. The second approach is called disclosed because the application provides a firsthand account of the provenance based on the anticipated questions on data flow, process flow, and responsible agents. Most observed provenance collection systems collect a large amount of provenance information during an application run or workflow execution. The common trend in capturing provenance is to collect all possible information and then attempt to find the relevant information, which is not efficient. Existing disclosed provenance system APIs do not work well in distributed environments and have trouble finding where to fit the individual pieces of provenance information. This work focuses on determining more reliable solutions for provenance capture. As part of the Integrated End-to-end Performance Prediction and Diagnosis for Extreme Scientific Workflows (IPPD) project, an API was developed, called Producer API (PAPI), which can disclose application-targeted provenance and is designed to work in distributed environments by means of unique object identification methods. The disclosure approach adds metadata to the provenance information to uniquely identify the pieces and connect them together. PAPI uses a common provenance model to support this provenance integration across disclosure sources. The API also provides the flexibility to let the user decide what to do with the collected provenance. The collected provenance can be sent to a triple store using REST services or it can be logged to a file.
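
    A minimal sketch of the disclosed-provenance pattern described above: unique identifiers for each entity and activity so that records produced in different processes can be joined later, with the result logged to a file or pushed to a service. The class and field names are hypothetical and do not reproduce the real PAPI interface.

        import json
        import uuid
        from datetime import datetime, timezone

        class ProvenanceRecorder:
            """Hypothetical disclosed-provenance recorder (not the actual PAPI API)."""

            def __init__(self, log_path="provenance.jsonl"):
                self.log_path = log_path

            def new_entity(self, name, **attrs):
                # Globally unique IDs let distributed processes disclose pieces
                # independently and still be connected afterwards.
                return {"id": str(uuid.uuid4()), "type": "entity", "name": name, **attrs}

            def disclose_activity(self, name, used, generated):
                record = {
                    "id": str(uuid.uuid4()),
                    "type": "activity",
                    "name": name,
                    "used": [e["id"] for e in used],
                    "generated": [e["id"] for e in generated],
                    "time": datetime.now(timezone.utc).isoformat(),
                }
                with open(self.log_path, "a") as fh:   # could instead POST to a triple store
                    fh.write(json.dumps(record) + "\n")
                return record

        rec = ProvenanceRecorder()
        raw = rec.new_entity("input.nc", role="raw data")
        out = rec.new_entity("result.csv", role="derived data")
        rec.disclose_activity("run_simulation", used=[raw], generated=[out])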

  15. Intact capture of hypervelocity projectiles

    NASA Technical Reports Server (NTRS)

    Tsou, P.

    1990-01-01

    The ability to capture projectiles intact at hypervelocities opens new applications in science and technology that would either not be possible or would be very costly by other means. This capability has been demonstrated in the laboratory for aluminum projectiles of 1.6 mm diameter, captured at 6 km/s, in one unmelted piece, and retaining up to 95% of the original mass. Furthermore, capture was accomplished passively using microcellular underdense polymer foam. Another advantage of capturing projectiles in an underdense medium is the ability of such a medium to preserve a record of the projectile's original velocity components of speed and direction. A survey of these experimental results is described in terms of a dozen parameters which characterize the amount of capture and the effect on the projectile due to different capture media.

  16. Intact capture of hypervelocity projectiles.

    PubMed

    Tsou, P

    1990-01-01

    The ability to capture projectiles intact at hypervelocities opens new applications in science and technology that would either not be possible or would be very costly by other means. This capability has been demonstrated in the laboratory for aluminum projectiles of 1.6 mm diameter, captured at 6 km/s, in one unmelted piece, and retaining up to 95% of the original mass. Furthermore, capture was accomplished passively using microcellular underdense polymer foam. Another advantage of capturing projectiles in an underdense medium is the ability of such a medium to preserve a record of the projectile's original velocity components of speed and direction. A survey of these experimental results is described in terms of a dozen parameters which characterize the amount of capture and the effect on the projectile due to different capture media.

  17. SU-F-T-288: Impact of Trajectory Log Files for Clarkson-Based Independent Dose Verification of IMRT and VMAT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takahashi, R; Kamima, T; Tachibana, H

    2016-06-15

    Purpose: To investigate the effect of the trajectory log files from the linear accelerator on Clarkson-based independent dose verification of IMRT and VMAT plans. Methods: A CT-based independent dose verification software (Simple MU Analysis: SMU, Triangle Products, Japan) with a Clarkson-based algorithm was modified to calculate dose using the trajectory log files. Eclipse with the three techniques of step and shoot (SS), sliding window (SW) and RapidArc (RA) was used as the treatment planning system (TPS). In this study, clinically approved IMRT and VMAT plans for prostate and head and neck (HN) at two institutions were retrospectively analyzed to assess the dose deviation between the DICOM-RT plan (PL) and the trajectory log file (TJ). An additional analysis was performed to evaluate the MLC error detection capability of SMU when the trajectory log files were modified by adding systematic errors (0.2, 0.5, 1.0 mm) and random errors (5, 10, 30 mm) to the actual MLC positions. Results: The dose deviations for prostate and HN at the two sites were 0.0% and 0.0% in SS, 0.1±0.0% and 0.1±0.1% in SW, and 0.6±0.5% and 0.7±0.9% in RA, respectively. The MLC error detection analysis showed that the HN IMRT plans were the most sensitive; a 0.2 mm systematic error produced a 0.7% dose deviation on average. The MLC random errors did not affect the dose deviation. Conclusion: The use of trajectory log files, which include the actual MLC positions, gantry angles, etc., should be more effective for an independent verification. The tolerance level for the secondary check using the trajectory file may be similar to that of the verification using the DICOM-RT plan file. In terms of the resolution of MLC positional error detection, the secondary check could detect MLC position errors corresponding to the treatment sites and techniques. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
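
    The error-injection study described above can be pictured with a small perturbation sketch: offset the MLC leaf positions by a systematic and a random amount and compare a dose surrogate before and after. This is a toy illustration with invented numbers, not the SMU software or a clinical dose calculation.

        import numpy as np

        def perturb_mlc(positions_mm, systematic_mm=0.0, random_mm=0.0, seed=0):
            """Return leaf positions with a constant offset plus Gaussian jitter."""
            rng = np.random.default_rng(seed)
            return positions_mm + systematic_mm + rng.normal(0.0, random_mm, positions_mm.shape)

        def percent_deviation(reference, test):
            return 100.0 * (test - reference) / reference

        bank_a = np.full(60, -20.0)                 # left leaf bank (mm), illustrative
        bank_b = np.full(60, 20.0)                  # right leaf bank (mm)
        bank_b_err = perturb_mlc(bank_b, systematic_mm=0.2)

        # Crude stand-in for delivered fluence: total open aperture width
        aperture = np.sum(bank_b - bank_a)
        aperture_err = np.sum(bank_b_err - bank_a)
        print(f"{percent_deviation(aperture, aperture_err):.2f}% change from a 0.2 mm systematic shift")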

  18. Doing Your Science While You're in Orbit

    NASA Astrophysics Data System (ADS)

    Green, Mark L.; Miller, Stephen D.; Vazhkudai, Sudharshan S.; Trater, James R.

    2010-11-01

    Large-scale neutron facilities such as the Spallation Neutron Source (SNS) located at Oak Ridge National Laboratory need easy-to-use access to Department of Energy Leadership Computing Facilities and experiment repository data. The Orbiter thick- and thin-client and its supporting Service Oriented Architecture (SOA) based services (available at https://orbiter.sns.gov) consist of standards-based components that are reusable and extensible for accessing high performance computing, data and computational grid infrastructure, and cluster-based resources easily from a user configurable interface. The primary Orbiter system goals consist of (1) developing infrastructure for the creation and automation of virtual instrumentation experiment optimization, (2) developing user interfaces for thin- and thick-client access, (3) providing a prototype incorporating major instrument simulation packages, and (4) facilitating neutron science community access and collaboration. The secure Orbiter SOA authentication and authorization is achieved through the developed Virtual File System (VFS) services, which use Role-Based Access Control (RBAC) for data repository file access, thin- and thick-client functionality and application access, and computational job workflow management. The VFS Relational Database Management System (RDMS) consists of approximately 45 database tables describing 498 user accounts with 495 groups over 432,000 directories with 904,077 repository files. Over 59 million NeXus file metadata records are associated with the 12,800 unique NeXus file field/class names generated from the 52,824 repository NeXus files. Services that enable (a) summary dashboards of data repository status with Quality of Service (QoS) metrics, (b) data repository NeXus file field/class name full-text search capabilities within a Google-like interface, (c) a fully functional RBAC browser for the read-only data repository and shared areas, (d) user/group defined and shared metadata for data repository files, and (e) user, group, repository, and Web 2.0-based global positioning with additional service capabilities are currently available. The SNS-based Orbiter SOA integration progress with the Distributed Data Analysis for Neutron Scattering Experiments (DANSE) software development project is summarized with an emphasis on DANSE Central Services and the Virtual Neutron Facility (VNF). Additionally, the DANSE utilization of the Orbiter SOA authentication, authorization, and data transfer services best practice implementations is presented.
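
    The role-based access control (RBAC) layer described above can be summarized in a few lines: a user is granted an action on repository data only if one of the user's roles carries that permission. The roles, users, and table layout below are invented for illustration and are not Orbiter's actual schema.

        # Minimal RBAC check, assuming invented role and user tables
        ROLE_PERMISSIONS = {
            "instrument_scientist": {"read", "write"},
            "external_collaborator": {"read"},
        }

        USER_ROLES = {
            "alice": {"instrument_scientist"},
            "bob": {"external_collaborator"},
        }

        def can_access(user, action):
            """Grant the action if any of the user's roles includes that permission."""
            return any(action in ROLE_PERMISSIONS.get(role, set())
                       for role in USER_ROLES.get(user, set()))

        assert can_access("alice", "write")
        assert not can_access("bob", "write")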

  19. The NCAR Research Data Archive's Hybrid Approach for Data Discovery and Access

    NASA Astrophysics Data System (ADS)

    Schuster, D.; Worley, S. J.

    2013-12-01

    The NCAR Research Data Archive (RDA http://rda.ucar.edu) maintains a variety of data discovery and access capabilities for its 600+ dataset collections to support the varying needs of a diverse user community. In-house developed and standards-based community tools offer services to more than 10,000 users annually. By number of users, the largest group is external and accesses the RDA through web-based protocols; the internal NCAR HPC users are fewer in number, but typically access more data volume. This paper will detail the data discovery and access services maintained by the RDA to support both user groups, and show metrics that illustrate how the community is using the services. The distributed search capability enabled by standards-based community tools, such as Geoportal and an OAI-PMH access point that serves multiple metadata standards, provides pathways for external users to initially discover RDA holdings. From here, in-house developed web interfaces leverage primary discovery-level metadata databases that support keyword and faceted searches. Internal NCAR HPC users, or those familiar with the RDA, may go directly to the dataset collection of interest and refine their search based on rich file collection metadata. Multiple levels of metadata have proven to be invaluable for discovery within terabyte-sized archives composed of many atmospheric or oceanic levels, hundreds of parameters, and often numerous grid and time resolutions. Once users find the data they want, their access needs may vary as well. A THREDDS data server running on targeted dataset collections enables remote file access through OPENDAP and other web-based protocols primarily for external users. In-house developed tools give all users the capability to submit data subset extraction and format conversion requests through scalable, HPC-based delayed-mode batch processing. Users can monitor their RDA-based data processing progress and receive instructions on how to access the data when it is ready. External users are provided with RDA server-generated scripts to download the resulting request output. Similarly, they can download native dataset collection files or partial files using Wget- or cURL-based scripts supplied by the RDA server. Internal users can access the resulting request output or native dataset collection files directly from centralized file systems.
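
    One of the standards-based discovery pathways mentioned above, OAI-PMH, can be exercised with a short harvesting script: a ListRecords request returns an XML page of metadata records that a portal or external catalog can index. The endpoint URL below is a placeholder, not the RDA's actual OAI-PMH address.

        import requests
        import xml.etree.ElementTree as ET

        DC = "{http://purl.org/dc/elements/1.1/}"

        def harvest_titles(base_url):
            """Fetch one page of Dublin Core records from an OAI-PMH endpoint."""
            resp = requests.get(base_url,
                                params={"verb": "ListRecords", "metadataPrefix": "oai_dc"},
                                timeout=30)
            resp.raise_for_status()
            root = ET.fromstring(resp.content)
            return [t.text for t in root.iter(DC + "title")]

        # Placeholder endpoint; substitute the archive's advertised OAI-PMH base URL.
        for title in harvest_titles("https://example.org/oai"):
            print(title)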

  20. On-orbit demonstration of automated closure and capture using ESA-developed proximity operations technologies and an existing, serviceable NASA Explorer Platform spacecraft

    NASA Technical Reports Server (NTRS)

    Hohwiesner, Bill; Claudinon, Bernard

    1991-01-01

    The European Space Agency (ESA) has been working to develop an autonomous rendezvous and docking capability since 1984 to enable Hermes to automatically dock with Columbus. As a result, ESA with Matra, MBB, and other space companies have developed technologies that are also directly supportive of the current NASA initiative for Automated Rendezvous and Capture. Fairchild and Matra would like to discuss the results of the applicable ESA/Matra rendezvous and capture developments, and suggest how these capabilities could be used, together with an existing NASA Explorer Platform satellite, to minimize new development and accomplish a cost effective automatic closure and capture demonstration program. Several RV sensors have been developed at breadboard level for the Hermes/Columbus program by Matra, MBB, and SAAB. Detailed algorithms for automatic rendezvous, closure, and capture have been developed by ESA and CNES for application with Hermes to Columbus rendezvous and docking, and they currently are being verified with closed-loop software simulation. The algorithms have multiple closed-loop control modes and phases starting at long range using GPS navigation. Differential navigation is used for coast/continuous thrust homing, holdpoint acquisition, V-bar hopping, and station point acquisition. The proximity operation sensor is used for final closure and capture. A subset of these algorithms, comprising the proximity operations algorithms, could easily be extracted and tailored to a limited objective closure and capture flight demonstration.

  1. PcapDB: Search Optimized Packet Capture, Version 0.1.0.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrell, Paul; Steinfadt, Shannon

    PcapDB is a packet capture system designed to optimize the captured data for fast search in the typical (network incident response) use case. The technology involved in this software has been submitted via the IDEAS system and has been filed as a provisional patent. It includes the following primary components: capture: The capture component utilizes existing capture libraries to retrieve packets from network interfaces. Once retrieved, the packets are passed to additional threads for sorting into flows and indexing. The sorted flows and indexes are passed to other threads so that they can be written to disk. These components are written in the C programming language. search: The search components provide a means to find relevant flows and the associated packets. A search query is parsed and represented as a search tree. Various search commands, written in C, are then used to resolve this tree into a set of search results. The tree generation and search execution management components are written in Python. interface: The PcapDB web interface is written in Python on the Django framework. It provides a series of pages, APIs, and asynchronous tasks that allow the user to manage the capture system, perform searches, and retrieve results. Web page components are written in HTML, CSS, and JavaScript.
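
    The search component described above parses a query into a tree and then resolves that tree against flow indexes. A compact way to picture the idea is shown below, using an invented in-memory index and tuple-based tree; the real PcapDB query grammar, index format, and C search commands are not reproduced here.

        # Toy inverted index: search term -> set of matching flow ids
        FLOW_INDEX = {
            "port:443": {1, 2, 5},
            "host:10.0.0.7": {2, 3, 5},
            "proto:tcp": {1, 2, 3, 4, 5},
        }

        def resolve(node):
            """Recursively evaluate a search tree of ('and'|'or', left, right) tuples."""
            if isinstance(node, str):
                return FLOW_INDEX.get(node, set())
            op, left, right = node
            if op == "and":
                return resolve(left) & resolve(right)
            return resolve(left) | resolve(right)

        query = ("and", "proto:tcp", ("or", "port:443", "host:10.0.0.7"))
        print(sorted(resolve(query)))   # flows matching the query -> [1, 2, 3, 5]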

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wackerbarth, David

    Sandia National Laboratories has developed a computer program to review, reduce and manipulate waveform data. PlotData is designed for post-acquisition waveform data analysis and also serves as an advanced interactive data analysis environment. PlotData requires unidirectional waveform data with both uniform and discrete time-series measurements. PlotData operates on a National Instruments' LabVIEW™ software platform. Using PlotData, the user can capture waveform data from digitizing oscilloscopes over GPIB, USB and Ethernet interfaces from Tektronix, Lecroy or Agilent scopes. PlotData can both import and export several types of binary waveform files including, but not limited to, Tektronix .wmf files, Lecroy .trc files, and x-y pair ASCII files. Waveform manipulation includes numerous math functions, integration, differentiation, smoothing, truncation, and other specialized data reduction routines such as VISAR, POV, PVDF (Bauer) piezoelectric gauges, and piezoresistive gauges such as carbon manganin pressure gauges.
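
    The waveform operations listed above (integration, differentiation, smoothing) are straightforward on a uniformly sampled record. The sketch below shows generic NumPy equivalents on synthetic data; it is not PlotData code and the signal parameters are arbitrary.

        import numpy as np

        t = np.linspace(0.0, 1e-3, 1000)                 # 1 ms record, uniform sampling
        v = np.sin(2 * np.pi * 5e3 * t) + 0.05 * np.random.default_rng(0).normal(size=t.size)

        dt = t[1] - t[0]
        integral = np.cumsum(v) * dt                     # running time integral
        derivative = np.gradient(v, dt)                  # numerical derivative
        smoothed = np.convolve(v, np.ones(11) / 11, mode="same")   # 11-point moving average

        print(integral[-1], derivative.max(), smoothed.std())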

  3. Automated Rendezvous and Capture System Development and Simulation for NASA

    NASA Technical Reports Server (NTRS)

    Roe, Fred D.; Howard, Richard T.; Murphy, Leslie

    2004-01-01

    The United States does not have an Automated Rendezvous and Capture/Docking (AR and C) capability and is reliant on manned control for rendezvous and docking of orbiting spacecraft. This reliance on the labor-intensive manned interface for control of rendezvous and docking vehicles has a significant impact on the cost of the operation of the International Space Station (ISS) and precludes the use of any U.S. expendable launch capabilities for Space Station resupply. The Soviets have the capability to autonomously dock in space, but their system produces a hard docking with excessive force and contact velocity. Automated Rendezvous and Capture/Docking has been identified as a key enabling technology for the Space Launch Initiative (SLI) Program, DARPA Orbital Express and other DOD Programs. The development and implementation of an AR&C capability can significantly enhance system flexibility, improve safety, and lower the cost of maintaining, supplying, and operating the International Space Station. The Marshall Space Flight Center (MSFC) has conducted pioneering research in the development of an automated rendezvous and capture (or docking) (AR and C) system for U.S. space vehicles. This AR&C system was tested extensively using hardware-in-the-loop simulations in the Flight Robotics Laboratory, and a rendezvous sensor, the Video Guidance Sensor, was developed and successfully flown on the Space Shuttle on flights STS-87 and STS-95, proving the concept of a video-based sensor. Further developments in sensor technology and vehicle and target configuration have led to continued improvements and changes in AR&C system development and simulation. A new Advanced Video Guidance Sensor (AVGS) with target will be utilized on the Demonstration of Autonomous Rendezvous Technologies (DART) flight experiment in 2004.

  4. Traj_opt User's Guide

    NASA Technical Reports Server (NTRS)

    Saunders, David A.

    2005-01-01

    Trajectory optimization program Traj_opt was developed at Ames Research Center to help assess the potential benefits of ultrahigh temperature ceramic materials applied to reusable space vehicles with sharp noses and wing leading edges. Traj_opt loosely couples the Ames three-degrees-of-freedom trajectory package Traj (see NASA-TM-2004-212847) with the SNOPT optimization package (Stanford University Technical Report SOL 98-1). Traj_opt version January 22, 2003 is covered by this user guide. The program has been applied extensively to entry and ascent abort trajectory calculations for sharp and blunt crew transfer vehicles. The main optimization variables are control points for the angle of attack and bank angle time histories. No propulsion options are provided, but numerous objective functions may be specified and the nonlinear constraints implemented include a distributed surface heating constraint capability. Aero-capture calculations are also treated with an option to minimize orbital eccentricity at apoapsis. Traj_opt runs efficiently on a single processor, using forward or central differences for the gradient calculations. Results may be displayed conveniently with Gnuplot scripts. Control files recommended for five standard reentry and ascent abort trajectories are included along with detailed descriptions of the inputs and outputs.

  5. MolabIS--an integrated information system for storing and managing molecular genetics data.

    PubMed

    Truong, Cong V C; Groeneveld, Linn F; Morgenstern, Burkhard; Groeneveld, Eildert

    2011-10-31

    Long-term sample storage, tracing of data flow and data export for subsequent analyses are of great importance in genetics studies. Therefore, molecular labs need a proper information system to handle an increasing amount of data from different projects. We have developed a molecular labs information management system (MolabIS). It was implemented as a web-based system allowing the users to capture original data at each step of their workflow. MolabIS provides essential functionality for managing information on individuals, tracking samples and storage locations, capturing raw files, importing final data from external files, searching results, and accessing and modifying data. Further important features are options to generate ready-to-print reports and convert sequence and microsatellite data into various data formats, which can be used as input files in subsequent analyses. Moreover, MolabIS also provides a tool for data migration. MolabIS is designed for small- to medium-sized labs conducting Sanger sequencing and microsatellite genotyping to store and efficiently handle a relatively large amount of data. MolabIS not only helps to avoid time-consuming tasks but also ensures the availability of data for further analyses. The software is packaged as a virtual appliance which can run on different platforms (e.g. Linux, Windows). MolabIS can be distributed to a wide range of molecular genetics labs since it was developed according to a general data model. Released under GPL, MolabIS is freely available at http://www.molabis.org.

  6. MolabIS - An integrated information system for storing and managing molecular genetics data

    PubMed Central

    2011-01-01

    Background Long-term sample storage, tracing of data flow and data export for subsequent analyses are of great importance in genetics studies. Therefore, molecular labs need a proper information system to handle an increasing amount of data from different projects. Results We have developed a molecular labs information management system (MolabIS). It was implemented as a web-based system allowing the users to capture original data at each step of their workflow. MolabIS provides essential functionality for managing information on individuals, tracking samples and storage locations, capturing raw files, importing final data from external files, searching results, and accessing and modifying data. Further important features are options to generate ready-to-print reports and convert sequence and microsatellite data into various data formats, which can be used as input files in subsequent analyses. Moreover, MolabIS also provides a tool for data migration. Conclusions MolabIS is designed for small- to medium-sized labs conducting Sanger sequencing and microsatellite genotyping to store and efficiently handle a relatively large amount of data. MolabIS not only helps to avoid time-consuming tasks but also ensures the availability of data for further analyses. The software is packaged as a virtual appliance which can run on different platforms (e.g. Linux, Windows). MolabIS can be distributed to a wide range of molecular genetics labs since it was developed according to a general data model. Released under GPL, MolabIS is freely available at http://www.molabis.org. PMID:22040322

  7. Oscar — Using Byte Pairs to Find File Type and Camera Make of Data Fragments

    NASA Astrophysics Data System (ADS)

    Karresand, Martin; Shahmehri, Nahid

    Mapping out the contents of fragmented storage media is hard if the file system has been corrupted, especially as current forensic tools rely on metadata to do their job. If it were possible to find all fragments belonging to a certain file type, it would also be possible to recover a lost file. Such a tool could, for example, be used in the hunt for child pornography. The Oscar method identifies the file type of data fragments based solely on statistics calculated from their structure. The method does not need any metadata to work. We have previously used the byte frequency distribution and the rate of change between consecutive bytes as the basis for the statistics, as well as calculating the 2-gram frequency distribution to create a model of different file types. This paper presents a variant of the 2-gram method that uses a dynamic smoothing factor. In this way, we take the amount of data used to create the centroid into consideration. A previous experiment on file type identification is extended with .mp3 files, reaching a detection rate of 76% with a false positive rate of 0.4%. We also use the method to identify the camera make used to capture a .jpg picture from a fragment of the picture. The results show that we can clearly separate a picture fragment from a Fuji or Olympus camera from fragments of pictures taken by the other camera makes used in our test.
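
    The byte-pair idea above can be illustrated with a tiny 2-gram model: build a centroid histogram from known fragments of a file type, then score an unknown fragment by its distance to that centroid. The dynamic smoothing factor that distinguishes the Oscar variant, and its calibrated thresholds, are not reproduced; the training data below is a synthetic stand-in.

        import numpy as np

        def bigram_hist(data: bytes) -> np.ndarray:
            """Normalized 256x256 histogram of consecutive byte pairs."""
            hist = np.zeros((256, 256), dtype=np.float64)
            for a, b in zip(data, data[1:]):
                hist[a, b] += 1
            return hist / max(len(data) - 1, 1)

        def build_centroid(fragments):
            return np.mean([bigram_hist(f) for f in fragments], axis=0)

        def distance(fragment, centroid):
            return float(np.abs(bigram_hist(fragment) - centroid).sum())

        # Synthetic "known type" fragments; real centroids would come from labeled files
        training = [bytes((i * k) % 256 for i in range(4096)) for k in (1, 3, 5)]
        centroid = build_centroid(training)
        print(distance(training[0], centroid))    # small distance suggests the same file type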

  8. Motor Transport Operator Training: An Approach to Preparing Training Managers and Instructors to Design, Conduct, and Evaluate Performance Oriented Training

    DTIC Science & Technology

    1977-09-01

    the needed knowledge and skills must be provided. Ideally, these programs should be self-contained, capable of easy administration within...type of vehicle, duplicate cards were prepared. In effect, a task file was prepared, the file containing cards which described the task and...test situation contains a description of the situation. (If the operator does not actually encounter the situation, the situation is read to him by

  9. VLBA Archive &Distribution Architecture

    NASA Astrophysics Data System (ADS)

    Wells, D. C.

    1994-01-01

    Signals from the 10 antennas of NRAO's VLBA [Very Long Baseline Array] are processed by a Correlator. The complex fringe visibilities produced by the Correlator are archived on magnetic cartridges using a low-cost architecture which is capable of scaling and evolving. Archive files are copied to magnetic media to be distributed to users in FITS format, using the BINTABLE extension. Archive files are labelled using SQL INSERT statements, in order to bind the DBMS-based archive catalog to the archive media.

  10. Volume serving and media management in a networked, distributed client/server environment

    NASA Technical Reports Server (NTRS)

    Herring, Ralph H.; Tefend, Linda L.

    1993-01-01

    The E-Systems Modular Automated Storage System (EMASS) is a family of hierarchical mass storage systems providing complete storage/'file space' management. The EMASS volume server provides the flexibility to work with different clients (file servers), different platforms, and different archives with a 'mix and match' capability. The EMASS design considers all file management programs as clients of the volume server system. System storage capacities are tailored to customer needs ranging from small data centers to large central libraries serving multiple users simultaneously. All EMASS hardware is commercial off the shelf (COTS), selected to provide the performance and reliability needed in current and future mass storage solutions. All interfaces use standard commercial protocols and networks suitable to service multiple hosts. EMASS is designed to efficiently store and retrieve in excess of 10,000 terabytes of data. Current clients include CRAY's YMP Model E based Data Migration Facility (DMF), IBM's RS/6000 based Unitree, and CONVEX based EMASS File Server software. The VolSer software provides the capability to accept client or graphical user interface (GUI) commands from the operator's console and translate them to the commands needed to control any configured archive. The VolSer system offers advanced features to enhance media handling and particularly media mounting such as: automated media migration, preferred media placement, drive load leveling, registered MediaClass groupings, and drive pooling.

  11. Thermal Neutron Capture onto the Stable Tungsten Isotopes

    NASA Astrophysics Data System (ADS)

    Hurst, A. M.; Firestone, R. B.; Sleaford, B. W.; Summers, N. C.; Revay, Zs.; Szentmiklósi, L.; Belgya, T.; Basunia, M. S.; Capote, R.; Choi, H.; Dashdorj, D.; Escher, J.; Krticka, M.; Nichols, A.

    2012-02-01

    Thermal neutron-capture measurements of the stable tungsten isotopes have been carried out using the guided thermal-neutron beam at the Budapest Reactor. Prompt singles spectra were collected and analyzed using the HYPERMET γ-ray analysis software package for the compound tungsten systems 183W, 184W, and 187W, prepared from isotopically-enriched samples of 182W, 183W, and 186W, respectively. These new data provide both confirmation and new insights into the decay schemes and structure of the tungsten isotopes reported in the Evaluated Gamma-ray Activation File based upon previous elemental analysis. The experimental data have also been compared to Monte Carlo simulations of γ-ray emission following the thermal neutron-capture process using the statistical-decay code DICEBOX. Together, the experimental cross sections and the modeled feeding contribution from the quasi-continuum have been used to determine the total radiative thermal neutron-capture cross sections for the tungsten isotopes and provide improved decay-scheme information for the structural- and neutron-data libraries.

  12. Mahanaxar: quality of service guarantees in high-bandwidth, real-time streaming data storage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bigelow, David; Bent, John; Chen, Hsing-Bung

    2010-04-05

    Large radio telescopes, cyber-security systems monitoring real-time network traffic, and others have specialized data storage needs: guaranteed capture of an ultra-high-bandwidth data stream, retention of the data long enough to determine what is 'interesting,' retention of interesting data indefinitely, and concurrent read/write access to determine what data is interesting, without interrupting the ongoing capture of incoming data. Mahanaxar addresses this problem. Mahanaxar guarantees streaming real-time data capture at (nearly) the full rate of the raw device, allows concurrent read and write access to the device on a best-effort basis without interrupting the data capture, and retains data as long as possible given the available storage. It has built-in mechanisms for reliability and indexing, can scale to meet arbitrary bandwidth requirements, and handles both small and large data elements equally well. Results from our prototype implementation show that Mahanaxar provides both better guarantees and better performance than traditional file systems.
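
    The retention behavior described above (capture continuously, keep "interesting" data, let everything else age out when space runs low) can be pictured with a toy block store. Mahanaxar's real on-disk layout, bandwidth guarantees, and indexing are not represented; this is only the eviction idea.

        from collections import deque

        class CaptureStore:
            def __init__(self, capacity_blocks):
                self.capacity = capacity_blocks
                self.blocks = deque()                  # entries of (block_id, interesting)

            def write(self, block_id):
                if len(self.blocks) >= self.capacity:
                    # Evict the oldest block that has not been marked interesting.
                    for i, (bid, keep) in enumerate(self.blocks):
                        if not keep:
                            del self.blocks[i]
                            break
                self.blocks.append((block_id, False))

            def mark_interesting(self, block_id):
                self.blocks = deque((b, keep or b == block_id) for b, keep in self.blocks)

        store = CaptureStore(capacity_blocks=3)
        for i in range(3):
            store.write(i)
        store.mark_interesting(1)
        store.write(3)
        print([b for b, _ in store.blocks])            # block 1 survives -> [1, 2, 3]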

  13. Trajectory-capture cell instrumentation for measurement of dust particle mass, velocity and trajectory, and particle capture

    NASA Technical Reports Server (NTRS)

    Simpson, J. A.; Tuzzolino, A. J.

    1989-01-01

    The development of the polyvinylidene fluoride (PVDF) dust detector for space missions--both missions such as the Halley comet missions, where the impact velocity was very high, and missions where the impact velocity is low--was extended to include: (1) the capability for impact position determination, i.e., the x,y coordinates of impact; and (2) the capability for particle velocity determination using two thin PVDF sensors spaced a given distance apart, i.e., by time-of-flight. These developments have led to space flight instrumentation for recovery-type missions, which will measure the masses (sizes), fluxes and trajectories of incoming dust particles and will capture the dust material in a form suitable for later Earth-based laboratory measurements. These laboratory measurements would determine the elemental, isotopic and mineralogical properties of the captured dust and relate these to possible sources of the dust material (i.e., comets, asteroids), using the trajectory information. The instrumentation described here has the unique advantages of providing both orbital characteristics and physical and chemical properties--as well as possible origin--of incoming dust.

  14. Planning the National Agricultural Library's Multimedia CD-ROM "Ornamental Horticulture."

    ERIC Educational Resources Information Center

    Mason, Pamela R.

    1991-01-01

    Discussion of issues involved in planning a multimedia CD-ROM product explains the selection of authoring tools, the design of a user interface, expert systems, text conversion and capture (including scanning and optical character recognition), and problems associated with image files. The use of audio is also discussed, and a 14-item glossary is…

  15. 76 FR 53667 - Establishing a One-Year Retention Period for Patent-Related Papers That Have Been Scanned Into...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-29

    ... System or the Supplemental Complex Repository for Examiners AGENCY: United States Patent and Trademark... been scanned into the Image File Wrapper system (IFW) or the Supplemental Complex Repository for..., the USPTO had fully deployed SCORE, a data repository system designed to augment IFW with the capture...

  16. Making the Decision to Provide Enhanced Podcasts to Post-Secondary Science Students

    ERIC Educational Resources Information Center

    Holbrook, Jane; Dupont, Christine

    2011-01-01

    Providing students with supplementary course materials such as audio podcasts, enhanced podcasts, video podcasts and other forms of lecture-capture video files after a lecture is now a common occurrence in many post-secondary courses. We used an online questionnaire to ask students how helpful enhanced podcasts were for a variety of course…

  17. 77 FR 48192 - Self-Regulatory Organizations; Chicago Mercantile Exchange, Inc.; Notice of Filing of Proposed...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-13

    ... Liquidity Factor of Its Credit Default Swap Margin Methodology August 7, 2012. Pursuant to Section 19(b)(1... model. The liquidity margin component of the CME CDS margin model is designed to capture the risk... CDS Clearing Member. The current methodology for the liquidity factor is a function of a portfolio's...

  18. Understanding Learners' Self-Assessment and Self-Feedback on Their Foreign Language Speaking Performance

    ERIC Educational Resources Information Center

    Huang, Shu-Chen

    2016-01-01

    This study examines university learners' self-assessment and self-feedback on performance as captured in audio files from a foreign language speaking test. The learners were guided to listen, transcribe and analyse their own speaking samples, as well as propose future actions for improvement. The content of learners' self-feedback was scrutinised…

  19. Developing an Efficient Computational Method that Estimates the Ability of Students in a Web-Based Learning Environment

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2012-01-01

    This paper presents a computational method that can efficiently estimate the ability of students from the log files of a Web-based learning environment capturing their problem solving processes. The computational method developed in this study approximates the posterior distribution of the student's ability obtained from the conventional Bayes…

  20. Ground Processing of Data From the Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Wright, Jesse; Sturdevant, Kathryn; Noble, David

    2006-01-01

    A computer program implements the Earth side of the protocol that governs the transfer of data files generated by the Mars Exploration Rovers. It also provides tools for viewing data in these files and integrating data-product files into automated and manual processes. It reconstitutes files from telemetry data packets. Even if only one packet is received, metadata provide enough information to enable this program to identify and use partial data products. This software can generate commands to acknowledge received files and retransmit missed parts of files, or it can feed a manual process to make decisions about retransmission. The software uses an Extensible Markup Language (XML) data dictionary to provide a generic capability for displaying files of basic types, and uses external "plug-in" application programs to provide more sophisticated displays. This program makes data products available with very low latency, and can trigger automated actions when complete or partial products are received. The software is easy to install and use. The only system requirement for installing the software is a Java J2SE 1.4 platform. Several instances of the software can be executed simultaneously on the same machine.
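
    The file-reconstitution step described above can be sketched as reassembling a product from numbered packet payloads and reporting which parts are missing, so that an acknowledgement or retransmit request can be generated. The packet fields and file identifier below are invented for illustration and do not describe the actual MER file-transfer protocol format.

        def reassemble(packets, total_parts):
            """Return (data, missing_part_numbers) for one file's received packets."""
            parts = {p["part"]: p["payload"] for p in packets}
            missing = [i for i in range(total_parts) if i not in parts]
            data = b"".join(parts.get(i, b"") for i in range(total_parts))
            return data, missing

        packets = [
            {"file_id": "img_0042", "part": 0, "payload": b"\x00\x01"},
            {"file_id": "img_0042", "part": 2, "payload": b"\x04\x05"},
        ]
        data, missing = reassemble(packets, total_parts=3)
        print(missing)    # [1] -> a retransmit command (or a manual decision) would target part 1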

  1. IMPLICATIONS OF CROSS DOMAIN FIRES IN MULTI-DOMAIN BATTLE

    DTIC Science & Technology

    2017-04-06

    States Air Force 6 April 2017 DISTRIBUTION A. Approved for public release: distribution unlimited. DISCLAIMER The views expressed in this...their cyber capability that will ultimately reinforce their influence and power across the Middle East. In viewing North Korea threat capabilities...land-based assets operating in cross domain denial type operations. In viewing the historical warfare capabilities captured in the case study

  2. Simultaneous capture of metal, sulfur and chlorine by sorbents during fluidized bed incineration.

    PubMed

    Ho, T C; Chuang, T C; Chelluri, S; Lee, Y; Hopper, J R

    2001-01-01

    Metal capture experiments were carried out in an atmospheric fluidized bed incinerator to investigate the effect of sulfur and chlorine on metal capture efficiency and the potential for simultaneous capture of metal, sulfur and chlorine by sorbents. In addition to the experimental investigation, the effect of sulfur and chlorine on the metal capture process was also theoretically investigated by performing equilibrium calculations based on the minimization of system free energy. The observed results have indicated that, in general, the existence of sulfur and chlorine enhances the efficiency of metal capture, especially at low to medium combustion temperatures. The capture mechanisms appear to include particulate scrubbing and chemisorption, depending on the type of sorbent. Among the three sorbents tested, calcined limestone is capable of capturing all three air pollutants simultaneously. The results also indicate that a mixture of the three sorbents, in general, captures more metals than a single sorbent during the process. In addition, the existence of sulfur and chlorine apparently enhances the metal capture process.

  3. Designed amyloid fibers as materials for selective carbon dioxide capture

    PubMed Central

    Li, Dan; Furukawa, Hiroyasu; Deng, Hexiang; Liu, Cong; Yaghi, Omar M.; Eisenberg, David S.

    2014-01-01

    New materials capable of binding carbon dioxide are essential for addressing climate change. Here, we demonstrate that amyloids, self-assembling protein fibers, are effective for selective carbon dioxide capture. Solid-state NMR proves that amyloid fibers containing alkylamine groups reversibly bind carbon dioxide via carbamate formation. Thermodynamic and kinetic capture-and-release tests show the carbamate formation rate is fast enough to capture carbon dioxide by dynamic separation, undiminished by the presence of water, in both a natural amyloid and designed amyloids having increased carbon dioxide capacity. Heating to 100 °C regenerates the material. These results demonstrate the potential of amyloid fibers for environmental carbon dioxide capture. PMID:24367077

  4. Spatial and Angular Resolution Enhancement of Light Fields Using Convolutional Neural Networks

    NASA Astrophysics Data System (ADS)

    Gul, M. Shahzeb Khan; Gunturk, Bahadir K.

    2018-05-01

    Light field imaging extends traditional photography by capturing both the spatial and angular distribution of light, which enables new capabilities, including post-capture refocusing, post-capture aperture control, and depth estimation from a single shot. Micro-lens array (MLA) based light field cameras offer a cost-effective approach to capturing the light field. A major drawback of MLA-based light field cameras is low spatial resolution, which is due to the fact that a single image sensor is shared to capture both spatial and angular information. In this paper, we present a learning-based light field enhancement approach. Both the spatial and angular resolution of the captured light field are enhanced using convolutional neural networks. The proposed method is tested with real light field data captured with a Lytro light field camera, clearly demonstrating spatial and angular resolution improvement.
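
    A minimal picture of the learning-based enhancement idea: upsample a single sub-aperture view and let a small convolutional network predict a residual correction. The layer sizes and training are placeholders; the network published in the paper, and its angular (view synthesis) branch, are not reproduced here.

        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        class TinySR(nn.Module):
            """Toy single-view super-resolution network (illustrative only)."""

            def __init__(self, scale=2):
                super().__init__()
                self.scale = scale
                self.body = nn.Sequential(
                    nn.Conv2d(1, 32, kernel_size=9, padding=4), nn.ReLU(),
                    nn.Conv2d(32, 16, kernel_size=5, padding=2), nn.ReLU(),
                    nn.Conv2d(16, 1, kernel_size=5, padding=2),
                )

            def forward(self, x):
                # Upsample first, then predict a residual on top of the interpolation.
                up = F.interpolate(x, scale_factor=self.scale, mode="bicubic", align_corners=False)
                return up + self.body(up)

        model = TinySR(scale=2)
        low_res_view = torch.rand(1, 1, 64, 64)    # one sub-aperture view of a light field
        print(model(low_res_view).shape)           # torch.Size([1, 1, 128, 128])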

  5. Spatial and Angular Resolution Enhancement of Light Fields Using Convolutional Neural Networks.

    PubMed

    Gul, M Shahzeb Khan; Gunturk, Bahadir K

    2018-05-01

    Light field imaging extends traditional photography by capturing both the spatial and angular distribution of light, which enables new capabilities, including post-capture refocusing, post-capture aperture control, and depth estimation from a single shot. Micro-lens array (MLA) based light field cameras offer a cost-effective approach to capturing the light field. A major drawback of MLA-based light field cameras is low spatial resolution, which is due to the fact that a single image sensor is shared to capture both spatial and angular information. In this paper, we present a learning-based light field enhancement approach. Both the spatial and angular resolution of the captured light field are enhanced using convolutional neural networks. The proposed method is tested with real light field data captured with a Lytro light field camera, clearly demonstrating spatial and angular resolution improvement.

  6. A design for a new catalog manager and associated file management for the Land Analysis System (LAS)

    NASA Technical Reports Server (NTRS)

    Greenhagen, Cheryl

    1986-01-01

    Due to the large number of different types of files used in an image processing system, a mechanism for file management beyond the bounds of typical operating systems is necessary. The Transportable Applications Executive (TAE) Catalog Manager was written to meet this need. Land Analysis System (LAS) users at the EROS Data Center (EDC) encountered some problems in using the TAE catalog manager, including catalog corruption, networking difficulties, and lack of a reliable tape storage and retrieval capability. These problems, coupled with the complexity of the TAE catalog manager, led to the decision to design a new file management system for LAS, tailored to the needs of the EDC user community. This design effort, which addressed catalog management, label services, associated data management, and enhancements to LAS applications, is described. The new file management design will provide many benefits, including improved system integration, increased flexibility, enhanced reliability, enhanced portability, improved performance, and improved maintainability.

  7. Rapid Diagnostics of Onboard Sequences

    NASA Technical Reports Server (NTRS)

    Starbird, Thomas W.; Morris, John R.; Shams, Khawaja S.; Maimone, Mark W.

    2012-01-01

    Keeping track of sequences onboard a spacecraft is challenging. When reviewing Event Verification Records (EVRs) of sequence executions on the Mars Exploration Rover (MER), operators often found themselves wondering which version of a named sequence the EVR corresponded to. The lack of this information drastically impacts the operators' diagnostic capabilities as well as their situational awareness with respect to the commands the spacecraft has executed, since the EVRs do not provide argument values or explanatory comments. Having this information immediately available can be instrumental in diagnosing critical events and can significantly enhance the overall safety of the spacecraft. This software provides an auditing capability that can eliminate that uncertainty while diagnosing critical conditions. Furthermore, the Restful interface provides a simple way for sequencing tools to automatically retrieve binary compiled sequence SCMFs (Space Command Message Files) on demand. It also enables developers to change the underlying database, while maintaining the same interface to the existing applications. The logging capabilities are also beneficial to operators when they are trying to recall how they solved a similar problem many days ago: this software enables automatic recovery of SCMF and RML (Robot Markup Language) sequence files directly from the command EVRs, eliminating the need for people to find and validate the corresponding sequences. To address the lack of auditing capability for sequences onboard a spacecraft during earlier missions, extensive logging support was added on the Mars Science Laboratory (MSL) sequencing server. This server is responsible for generating all MSL binary SCMFs from RML input sequences. The sequencing server logs every SCMF it generates into a MySQL database, as well as the high-level RML file and dictionary name inputs used to create the SCMF. The SCMF is then indexed by a hash value that is automatically included in all command EVRs by the onboard flight software. Then, both the binary SCMF result and the RML input file can be retrieved simply by specifying the hash to a Restful web interface. This interface enables command line tools as well as large sophisticated programs to download the SCMF and RMLs on-demand from the database, enabling a vast array of tools to be built on top of it. One such command line tool can retrieve and display RML files, or annotate a list of EVRs by interleaving them with the original sequence commands. This software has been integrated with the MSL sequencing pipeline where it will serve sequences useful in diagnostics, debugging, and situational awareness throughout the mission.
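
    The hash-based lookup described above can be sketched in a few lines: index each generated binary product by a digest, and use the digest carried in an event record to fetch the exact product and its source file later. The sketch below uses SQLite and SHA-256 purely for illustration; the text describes a MySQL database, and the actual hashing and tables of the MSL sequencing server are not shown here.

        import hashlib
        import sqlite3

        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE scmf (hash TEXT PRIMARY KEY, rml_name TEXT, blob BLOB)")

        def store_scmf(rml_name, scmf_bytes):
            """Index a generated binary product by its content digest."""
            digest = hashlib.sha256(scmf_bytes).hexdigest()
            db.execute("INSERT OR REPLACE INTO scmf VALUES (?, ?, ?)",
                       (digest, rml_name, scmf_bytes))
            return digest

        def fetch_by_hash(digest):
            """Recover the product and its source name from a hash seen in an event record."""
            return db.execute("SELECT rml_name, blob FROM scmf WHERE hash = ?",
                              (digest,)).fetchone()

        h = store_scmf("drive_sol_100.rml", b"\x01\x02\x03")
        print(fetch_by_hash(h)[0])                 # -> drive_sol_100.rml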

  8. Introducing the depth transfer curve for 3D capture system characterization

    NASA Astrophysics Data System (ADS)

    Goma, Sergio R.; Atanassov, Kalin; Ramachandra, Vikas

    2011-03-01

    3D technology has recently made a transition from movie theaters to consumer electronic devices such as 3D cameras and camcorders. In addition to what 2D imaging conveys, 3D content also contains information regarding the scene depth. Scene depth is simulated through the strongest brain depth cue, namely retinal disparity. This can be achieved by capturing images with horizontally separated cameras. Objects at different depths will be projected with different horizontal displacement on the left and right camera images. These images, when fed separately to either eye, lead to retinal disparity. Since the perception of depth is the single most important 3D imaging capability, an evaluation procedure is needed to quantify the depth capture characteristics. Evaluating depth capture characteristics subjectively is a very difficult task since the intended and/or unintended side effects from 3D image fusion (depth interpretation) by the brain are not immediately perceived by the observer, nor do such effects lend themselves easily to objective quantification. Objective evaluation of 3D camera depth characteristics is an important tool that can be used for "black box" characterization of 3D cameras. In this paper, we propose a methodology to evaluate the 3D cameras' depth capture capabilities.
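
    The depth cue discussed above follows the standard rectified-stereo relation: depth is inversely proportional to horizontal disparity, Z = f * B / d. Sweeping disparity values with assumed camera parameters traces out one form of depth transfer curve; the numbers below are illustrative, not measurements from any camera.

        def depth_from_disparity(disparity_px, focal_px, baseline_m):
            """Z = f * B / d for a rectified stereo pair."""
            return focal_px * baseline_m / disparity_px

        focal_px = 1400.0           # assumed focal length in pixels
        baseline_m = 0.065          # assumed horizontal camera separation in meters
        for d in (5, 10, 20, 40):   # disparity in pixels
            z = depth_from_disparity(d, focal_px, baseline_m)
            print(f"disparity {d:>3} px -> depth {z:.2f} m")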

  9. Hi-G electronic gated camera for precision trajectory analysis

    NASA Astrophysics Data System (ADS)

    Snyder, Donald R.; Payne, Scott; Keller, Ed; Longo, Salvatore; Caudle, Dennis E.; Walker, Dennis C.; Sartor, Mark A.; Keeler, Joe E.; Kerr, David A.; Fail, R. Wallace; Gannon, Jim; Carrol, Ernie; Jamison, Todd A.

    1997-12-01

    It is extremely difficult and expensive to determine the flight attitude and aimpoint of small maneuvering miniature air vehicles from ground based fixed or tracking photography. Telemetry alone cannot provide sufficient information bandwidth on 'what' the ground tracking is seeing and consequently 'why' it did or did not function properly. Additionally, it is anticipated that 'smart' and 'brilliant' guided vehicles now in development will require a high resolution imaging support system to determine which target and which part of a ground feature is being used for navigation or targeting. Other requirements include support of sub-component separation from developmental supersonic vehicles, where the clean separation from the container is not determinable from ground based film systems and film cameras do not survive vehicle breakup and impact. Hence, the requirement is to develop and demonstrate an imaging support system for development/testing that can provide the flight vehicle developer/analyst with imagery (combined with miniature telemetry sources) sufficient to recreate the trajectory, terminal navigation, and flight termination events. This project is a development and demonstration of a real-time, launch-rated, shuttered, electronic imager, transmitter, and analysis system. This effort demonstrated boresighted imagery from inside small flight vehicles for post flight analysis of trajectory, and capture of ground imagery during random triggered vehicle functions. The initial studies for this capability have been accomplished by the Experimental Dynamics Section of the Air Force Wright Laboratory, Armament Directorate, Eglin AFB, Florida, and the Telemetry Support Branch of the Army Material Research and Development Center at Picatinny Arsenal, New Jersey. It has been determined that at 1/10,000 of a second exposure time, new ultra-miniature CCD sensors have sufficient sensitivity to image key ground target features without blur, thereby providing data for trajectory, timing, and advanced sensor development. This system will be used for ground tracking data reduction in support of small air vehicle and munition testing. It will provide a means of integrating the imagery and telemetry data from the item with ground based photographic support. The technique we have designed will exploit off-the-shelf software and analysis components. A differential GPS survey instrument will establish a photogrammetric calibration grid throughout the range and reference targets along the flight path. Images from the on-board sensor will be used to calibrate the ortho- rectification model in the analysis software. The projectile images will be transmitted and recorded on several tape recorders to insure complete capture of each video field. The images will be combined with a non-linear video editor into a time-correlated record. Each correlated video field will be written to video disk. The files will be converted to DMA compatible format and then analyzed for determination of the projectile altitude, attitude and position in space. The resulting data file will be used to create a photomosaic of the ground the projectile flew over and the targets it saw. The data will be then transformed to a trajectory file and used to generate a graphic overlay that will merge digital photo data of the range with actual images captured. The plan is to superimpose the flight path of the projectile, the path of the weapons aimpoint, and annotation of each internal sequence event. 
With tools used to produce state-of-the-art computer graphics, we now think it will be possible to reconstruct the test event from the viewpoint of the warhead, the target, and a 'God's-Eye' view looking over the shoulder of the projectile.

  10. Identifying compromised systems through correlation of suspicious traffic from malware behavioral analysis

    NASA Astrophysics Data System (ADS)

    Camilo, Ana E. F.; Grégio, André; Santos, Rafael D. C.

    2016-05-01

    Malware detection may be accomplished through the analysis of their infection behavior. To do so, dynamic analysis systems run malware samples and extract their operating system activities and network traffic. This traffic may represent malware accessing external systems, either to steal sensitive data from victims or to fetch other malicious artifacts (configuration files, additional modules, commands). In this work, we propose the use of visualization as a tool to identify compromised systems based on correlating malware communications in the form of graphs and finding isomorphisms between them. We produced graphs from over 6 thousand distinct network traffic files captured during malware execution and analyzed the existing relationships among malware samples and IP addresses.
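
    The correlation step described above can be pictured by turning each sample's captured traffic into a directed graph of communicating endpoints and comparing graph structure. The sketch below uses NetworkX isomorphism as a stand-in; the paper's actual feature extraction and visual matching are richer than this, and the addresses are fabricated examples.

        import networkx as nx

        def traffic_graph(connections):
            """connections: iterable of (src_ip, dst_ip) pairs from one capture file."""
            g = nx.DiGraph()
            g.add_edges_from(connections)
            return g

        sample_a = traffic_graph([("10.0.0.5", "203.0.113.9"),
                                  ("10.0.0.5", "198.51.100.2"),
                                  ("198.51.100.2", "198.51.100.3")])
        sample_b = traffic_graph([("10.0.0.8", "203.0.113.20"),
                                  ("10.0.0.8", "192.0.2.7"),
                                  ("192.0.2.7", "192.0.2.8")])

        # Structure-only comparison: do both samples contact infrastructure the same way?
        print(nx.is_isomorphic(sample_a, sample_b))    # True -> same communication pattern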

  11. Learning from LANCE: Developing a Web Portal Infrastructure for NASA Earth Science Data (Invited)

    NASA Astrophysics Data System (ADS)

    Murphy, K. J.

    2013-12-01

    NASA developed the Land Atmosphere Near real-time Capability for EOS (LANCE) in response to a growing need for timely satellite observations by applications users, operational agencies and researchers. EOS capabilities originally intended for long-term Earth science research were modified to deliver satellite data products with sufficient latencies to meet the needs of the NRT user communities. LANCE products are primarily distributed as HDF data files for analysis; however, novel capabilities for distribution of NRT imagery for visualization have been added, which have expanded the user base. Additionally, systems that convert data to information, such as the MODIS hotspot/active fire data, are provided through the Fire Information for Resource Management System (FIRMS). LANCE services include: FTP/HTTP file distribution, Rapid Response (RR), Worldview, Global Imagery Browse Services (GIBS) and FIRMS. This paper discusses how NASA has developed services specifically for LANCE and is taking the lessons learned through these activities to develop an Earthdata Web Infrastructure. This infrastructure is being used as a platform to support development of data portals that address specific science issues for much of EOSDIS data.

  12. Evaluation of a sticky trap (AedesTraP), made from disposable plastic bottles, as a monitoring tool for Aedes aegypti populations.

    PubMed

    de Santos, Eloína Maria Mendonça; de Melo-Santos, Maria Alice Varjal; de Oliveira, Claudia Maria Fontes; Correia, Juliana Cavalcanti; de Albuquerque, Cleide Maria Ribeiro

    2012-09-07

    Dengue virus, which is transmitted by Aedes aegypti mosquitoes, is the most important emerging viral disease, infecting more than 50 million people annually. Currently used sticky traps are useful tools for monitoring and control of A. aegypti, despite differences in efficiency, labor requirements, and cost. In the present work, a field assay was carried out to evaluate the performance of a sticky trap (AedesTrap), produced using disposable material, in capturing gravid Aedes spp. females. Additionally, conditions necessary for the improved performance of the device, such as the number of traps per site and location (indoors or outdoors), were evaluated. During a one-year period, traps were placed in a dengue endemic area in 28-day cycles. The trap, named AedesTrap, consisted of a disposable plastic soda bottle coated inside with colophony resin, which served as a sticky substrate. Disposable bottles were donated by restaurants, and traps were made by laboratory staff, reducing the cost of the sticky trap (less than US$3). Mosquito capture in indoor and outdoor areas was compared by placing the traps in a laundry room, kitchen, or bedroom (indoors) and a front or back yard (outdoors). The relationship between the number of AedesTraps and the quantity of captured mosquitoes was investigated by utilizing one or three traps per site. During a 28-day cycle, a single AedesTrap was capable of capturing up to 15 A. aegypti in a house, with a mean capture of 0.5 to 2.63 females per premise. The AedesTrap collected three times more mosquitoes outdoors than indoors. Similarly, the capability of detecting Aedes spp. infestation, and of capturing females, was three times higher when using three AedesTraps per house compared with one trap per house. The AedesTrap was shown to be capable of capturing A. aegypti and other Culicidae, providing information on the adult mosquito population and allowing the identification of areas critically infested by mosquitoes. Low requirements for skilled labor together with easy maintenance and low cost are additional advantages of using this sticky trap.

  13. Evaluation of a sticky trap (AedesTraP), made from disposable plastic bottles, as a monitoring tool for Aedes aegypti populations

    PubMed Central

    2012-01-01

    Background Dengue virus, which is transmitted by Aedes aegypti mosquitoes, is the most important emerging viral disease, infecting more than 50 million people annually. Currently used sticky traps are useful tools for monitoring and control of A. aegypti, despite differences in efficiency, labor requirements, and cost. In the present work, a field assay was carried out to evaluate the performance of a sticky trap (AedesTrap), produced using disposable material, in capturing gravid Aedes spp. females. Additionally, conditions necessary for the improved performance of the device, such as the number of traps per site and location (indoors or outdoors), were evaluated. Methods During a one-year period, traps were placed in a dengue endemic area in 28-day cycles. The trap, named AedesTrap, consisted of a disposable plastic soda bottle coated inside with colophony resin, which served as a sticky substrate. Disposable bottles were donated by restaurants, and traps were made by laboratory staff, reducing the cost of the sticky trap (less than US$3). Mosquito capture in indoor and outdoor areas was compared by placing the traps in a laundry room, kitchen, or bedroom (indoors) and a front or back yard (outdoors). The relationship between the number of AedesTraps and the quantity of captured mosquitoes was investigated by utilizing one or three traps per site. Results During a 28-day cycle, a single AedesTrap was capable of capturing up to 15 A. aegypti in a house, with a mean capture of 0.5 to 2.63 females per premise. The AedesTrap collected three times more mosquitoes outdoors than indoors. Similarly, the capability of detecting Aedes spp. infestation, and of capturing females, was three times higher when using three AedesTraps per house compared with one trap per house. Conclusions The AedesTrap was shown to be capable of capturing A. aegypti and other Culicidae, providing information on the adult mosquito population and allowing the identification of areas critically infested by mosquitoes. Low requirements for skilled labor together with easy maintenance and low cost are additional advantages of using this sticky trap. PMID:22958376

  14. Managing Data From Signal-Propagation Experiments

    NASA Technical Reports Server (NTRS)

    Kantak, A. V.

    1989-01-01

    Computer programs generate characteristic plots from amplitudes and phases. Software system enables minicomputer to process data on amplitudes and phases of signals received during experiments in ground-mobile/satellite radio propagation. Takes advantage of file-handling capabilities of UNIX operating system and C programming language. Interacts with user, under whose guidance programs in FORTRAN language generate plots of spectra or other curves of types commonly used to characterize signals. FORTRAN programs used to process file-handling outputs into any of several useful forms.

  15. Development and integration of sub-hourly rainfall-runoff modeling capability within a watershed model

    USDA-ARS?s Scientific Manuscript database

    Increasing urbanization changes runoff patterns to be flashy and instantaneous with decreased base flow. A model with the ability to simulate sub-daily rainfall–runoff processes and continuous simulation capability is required to realistically capture the long-term flow and water quality trends in w...

  16. A quartz nanopillar hemocytometer for high-yield separation and counting of CD4+ T lymphocytes

    NASA Astrophysics Data System (ADS)

    Kim, Dong-Joo; Seol, Jin-Kyeong; Wu, Yu; Ji, Seungmuk; Kim, Gil-Sung; Hyung, Jung-Hwan; Lee, Seung-Yong; Lim, Hyuneui; Fan, Rong; Lee, Sang-Kwon

    2012-03-01

    We report the development of a novel quartz nanopillar (QNP) array cell separation system capable of selectively capturing and isolating a single cell population including primary CD4+ T lymphocytes from the whole pool of splenocytes. Integrated with a photolithographically patterned hemocytometer structure, the streptavidin (STR)-functionalized-QNP (STR-QNP) arrays allow for direct quantitation of captured cells using high content imaging. This technology exhibits an excellent separation yield (efficiency) of ~95.3 +/- 1.1% for the CD4+ T lymphocytes from the mouse splenocyte suspensions and good linear response for quantitating captured CD4+ T-lymphoblasts, which is comparable to flow cytometry and outperforms any non-nanostructured surface capture techniques, i.e. cell panning. This nanopillar hemocytometer represents a simple, yet efficient cell capture and counting technology and may find immediate applications for diagnosis and immune monitoring in the point-of-care setting. Electronic supplementary information (ESI) available. See DOI: 10.1039/c2nr11338d

  17. Attention Capture by Faces

    ERIC Educational Resources Information Center

    Langton, Stephen R. H.; Law, Anna S.; Burton, A. Mike; Schweinberger, Stefan R.

    2008-01-01

    We report three experiments that investigate whether faces are capable of capturing attention when in competition with other non-face objects. In Experiment 1a participants took longer to decide that an array of objects contained a butterfly target when a face appeared as one of the distracting items than when the face did not appear in the array.…

  18. THE DEVELOPMENT OF IODINE BASED IMPINGER SOLUTIONS FOR THE EFFICIENT CAPTURE OF HG USING DIRECT INJECTION NEBULIZATION - INDUCTIVELY COUPLED PLASMA MASS SPECTROMETRY ANALYSIS

    EPA Science Inventory

    Inductively coupled plasma mass spectrometry (ICP/MS) with direct injection nebulization (DIN) was used to evaluate novel impinger solution compositions capable of capturing elemental mercury (Hgo) in EPA Method 5 type sampling. An iodine based impinger solution proved to be ver...

  19. Lecture Capture with Real-Time Rearrangement of Visual Elements: Impact on Student Performance

    ERIC Educational Resources Information Center

    Yu, P.-T.; Wang, B.-Y.; Su, M.-H.

    2015-01-01

    The primary goal of this study is to create and test a lecture-capture system that can rearrange visual elements while recording is still taking place, in such a way that student performance can be positively influenced. The system we have devised is capable of integrating and rearranging multimedia sources, including learning content, the…

  20. Web-based document and content management with off-the-shelf software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schuster, J

    1999-03-18

    This, then, is the current status of the project: Since we made the switch to Intradoc, we are now treating the project as a document and image management system. In reality, it could be considered a document and content management system since we can manage almost any file input to the system such as video or audio. At present, however, we are concentrating on images. As mentioned above, my CRADA funding was only targeted at including thumbnails of images in Intradoc. We still had to modify Intradoc so that it would compress images submitted to the system. All processing of files submitted to Intradoc is handled in what is called the Document Refinery. Even though MrSID created thumbnails in the process of compressing an image, work needed to be done to somehow build this capability into the Document Refinery. Therefore we made the decision to contract the Intradoc Engineering Team to perform this custom development work. To make Intradoc even more capable of handling images, we have also contracted for customization of the Document Refinery to accept Adobe Photoshop and Illustrator files in their native formats.

  1. Comparing Commercial WWW Browsers.

    ERIC Educational Resources Information Center

    Notess, Greg R.

    1995-01-01

    Four commercial World Wide Web browsers are evaluated for features such as handling of WWW protocols and different URLs: FTP, Telnet, Gopher and WAIS, and e-mail and news; bookmark capabilities; navigation features; file management; and security support. (JKP)

  2. Software Document Inventory Program

    NASA Technical Reports Server (NTRS)

    Merwarth, P. D.

    1984-01-01

    Program offers ways to file and locate sources of reference. The DOCLIB system consists of two parts to serve the needs of two types of users: the general user and the librarian. The DOCLIB system provides the user with an interactive, menu-driven document inventory capability.

  3. FERMI/GLAST Integrated Trending and Plotting System Release 5.0

    NASA Technical Reports Server (NTRS)

    Ritter, Sheila; Brumer, Haim; Reitan, Denise

    2012-01-01

    An Integrated Trending and Plotting System (ITPS) is a trending, analysis, and plotting system used by space missions to determine the performance and status of a spacecraft and its instruments. ITPS supports several NASA mission operational control centers, providing engineers, ground controllers, and scientists with access to the entire spacecraft telemetry data archive for the life of the mission, and includes a secure Web component for remote access. FERMI/GLAST ITPS Release 5.0 features include the option to display dates (yyyy/ddd) instead of orbit numbers along the orbital Long-Term Trend (LTT) plot axis, the ability to save statistics from daily production plots as image files, and removal of redundant edit/create Input Definition File (IDF) screens. Other features are a fix to address invalid packet lengths, a change in the naming convention of image files so they can be used in scripts, the ability to save all ITPS plot images (from Windows or the Web) in GIF or PNG format, the ability to specify ymin and ymax on plots where previously only the desired range could be specified, Web interface capability to plot IDFs that contain out-of-order page and plot numbers, and a fix to change all default file names to show yyyydddhhmmss time stamps instead of hhmmssdddyyyy. A Web interface capability sorts files based on modification date (with the newest at the top), and the statistics block can be displayed via a Web interface. Via the Web, users can graphically view the volume of telemetry data from each day contained in the ITPS archive in the Web digest. The ITPS could also be used in non-space fields that need to plot or trend data, including financial and banking systems, aviation and transportation systems, healthcare and educational systems, sales and marketing, and housing and construction.

  4. 75 FR 65546 - Self-Regulatory Organizations; National Securities Clearing Corporation; Notice of Filing of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-25

    ... interested persons. \\1\\ 15 U.S.C. 78s(b)(1). \\2\\ 17 CFR 240.19b-4. \\3\\ The text of the proposed rule change... comparison or recording through other NSCC trade capture services; (2) tracking, storage, and maintenance of... obligations. The tracking, storage, and maintenance functionality of the OW will provide transparency, will...

  5. Storage, retrieval, and edit of digital video using Motion JPEG

    NASA Astrophysics Data System (ADS)

    Sudharsanan, Subramania I.; Lee, D. H.

    1994-04-01

    In a companion paper we describe a Micro Channel adapter card that can perform real-time JPEG (Joint Photographic Experts Group) compression of a 640 by 480 24-bit image within 1/30th of a second. Since this corresponds to NTSC video rates at considerably good perceptual quality, this system can be used for real-time capture and manipulation of continuously fed video. To facilitate capturing the compressed video in a storage medium, an IBM Bus master SCSI adapter with cache is utilized. Efficacy of the data transfer mechanism is considerably improved using the System Control Block architecture, an extension to Micro Channel bus masters. We show experimental results that the overall system can perform at compressed data rates of about 1.5 MBytes/second sustained and with sporadic peaks to about 1.8 MBytes/second depending on the image sequence content. We also describe mechanisms to access the compressed data very efficiently through special file formats. This in turn permits creation of simpler sequence editors. Another advantage of the special file format is easy control of forward, backward and slow motion playback. The proposed method can be extended for design of a video compression subsystem for a variety of personal computing systems.
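
    A minimal sketch of the special-file-format idea, under the assumption that compressed frames are stored back to back and indexed by byte offset; such an index is what makes forward, backward, and slow-motion playback cheap. The names are illustrative, not the authors' format.

    ```python
    def build_frame_index(frame_sizes):
        """Given per-frame compressed sizes (bytes), compute the byte offset of each
        frame so any frame in the concatenated sequence file can be seeked directly."""
        offsets, pos = [], 0
        for size in frame_sizes:
            offsets.append(pos)
            pos += size
        return offsets

    def read_frame(f, offsets, sizes, n):
        """Random access to compressed frame n; backward or slow-motion playback is
        simply a different order of n values."""
        f.seek(offsets[n])
        return f.read(sizes[n])
    ```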

  6. Wireless local area network in a prehospital environment

    PubMed Central

    Chen, Dongquan; Soong, Seng-jaw; Grimes, Gary J; Orthner, Helmuth F

    2004-01-01

    Background Wireless local area networks (WLANs) are considered the next generation of clinical data networks. They open the possibility of capturing clinical data in a prehospital setting (e.g., a patient's home) using various devices, such as personal digital assistants, laptops, digital electrocardiogram (EKG) machines, and even cellular phones, and transmitting the captured data to a physician or hospital. The transmission rate is crucial to the applicability of the technology in the prehospital setting. Methods We created two separate WLANs to simulate a virtual local area network environment such as a patient's home or an emergency room (ER). The effects of different methods of data transmission, number of clients, and roaming among different access points on the file transfer rate were determined. Results The present results suggest that it is feasible to transfer small files such as patient demographics and EKG data from the patient's home to the ER at a reasonable speed. Encryption, user control, and access control were implemented and results discussed. Conclusions Implementing a WLAN with a centrally managed and multiple-layer-controlled access control server is the key to ensuring its security and accessibility. Future studies should focus on product capacity, speed, compatibility, interoperability, and security management. PMID:15339336

  7. Wireless local area network in a prehospital environment.

    PubMed

    Chen, Dongquan; Soong, Seng-jaw; Grimes, Gary J; Orthner, Helmuth F

    2004-08-31

    Wireless local area networks (WLANs) are considered the next generation of clinical data networks. They open the possibility of capturing clinical data in a prehospital setting (e.g., a patient's home) using various devices, such as personal digital assistants, laptops, digital electrocardiogram (EKG) machines, and even cellular phones, and transmitting the captured data to a physician or hospital. The transmission rate is crucial to the applicability of the technology in the prehospital setting. We created two separate WLANs to simulate a virtual local area network environment such as a patient's home or an emergency room (ER). The effects of different methods of data transmission, number of clients, and roaming among different access points on the file transfer rate were determined. The present results suggest that it is feasible to transfer small files such as patient demographics and EKG data from the patient's home to the ER at a reasonable speed. Encryption, user control, and access control were implemented and results discussed. Implementing a WLAN with a centrally managed and multiple-layer-controlled access control server is the key to ensuring its security and accessibility. Future studies should focus on product capacity, speed, compatibility, interoperability, and security management.

  8. Laboratory data manipulation tools basic data handling programs. Volume 2: Detailed software/hardware documentation

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The set of computer programs described allows for data definition, data input, and data transfer between the LSI-11 microcomputers and the VAX-11/780 minicomputer. Program VAXCOM allows for a simple method of textual file transfer from the LSI to the VAX. Program LSICOM allows for easy file transfer from the VAX to the LSI. Program TTY changes the LSI-11 operators console to the LSI's printing device. Program DICTIN provides a means for defining a data set for input to either computer. Program DATAIN is a simple to operate data entry program which is capable of building data files on either machine. Program LEDITV is an extremely powerful, easy to use, line oriented text editor. Program COPYSBF is designed to print out textual files on the line printer without character loss from FORTRAN carriage control or wide record transfer.

  9. Conflict Detection Algorithm to Minimize Locking for MPI-IO Atomicity

    NASA Astrophysics Data System (ADS)

    Sehrish, Saba; Wang, Jun; Thakur, Rajeev

    Many scientific applications require high-performance concurrent I/O accesses to a file by multiple processes. Those applications rely indirectly on atomic I/O capabilities in order to perform updates to structured datasets, such as those stored in HDF5 format files. Current support for atomicity in MPI-IO is provided by locking around the operations, imposing lock overhead in all situations, even though in many cases these operations are non-overlapping in the file. We propose to isolate non-overlapping accesses from overlapping ones in independent I/O cases, allowing the non-overlapping ones to proceed without imposing lock overhead. To enable this, we have implemented an efficient conflict detection algorithm in MPI-IO using MPI file views and datatypes. We show that our conflict detection scheme incurs minimal overhead on I/O operations, making it an effective mechanism for avoiding locks when they are not needed.
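
    A simplified sketch of the conflict-detection idea (plain Python over explicit byte ranges; the actual algorithm operates on MPI file views and datatypes): accesses from different ranks are checked pairwise for byte-range overlap, and only the conflicting ones need to be routed through locking.

    ```python
    def overlaps(a, b):
        """Two half-open byte ranges (start, end) conflict if they intersect."""
        return a[0] < b[1] and b[0] < a[1]

    def split_conflicting(accesses):
        """accesses: list of (rank, (start, end)) file accesses.
        Returns (conflicting, independent); only the conflicting accesses need locks."""
        conflicting_idx = set()
        for i in range(len(accesses)):
            for j in range(i + 1, len(accesses)):
                rank_i, range_i = accesses[i]
                rank_j, range_j = accesses[j]
                if rank_i != rank_j and overlaps(range_i, range_j):
                    conflicting_idx.update((i, j))
        conflicting = [accesses[i] for i in sorted(conflicting_idx)]
        independent = [a for i, a in enumerate(accesses) if i not in conflicting_idx]
        return conflicting, independent

    # Hypothetical accesses: ranks 0 and 1 overlap, rank 2 is independent
    print(split_conflicting([(0, (0, 100)), (1, (50, 150)), (2, (200, 300))]))
    ```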

  10. Mission Operations Center (MOC) - Precipitation Processing System (PPS) Interface Software System (MPISS)

    NASA Technical Reports Server (NTRS)

    Ferrara, Jeffrey; Calk, William; Atwell, William; Tsui, Tina

    2013-01-01

    MPISS is an automatic file transfer system that implements a combination of standard and mission-unique transfer protocols required by the Global Precipitation Measurement Mission (GPM) Precipitation Processing System (PPS) to control the flow of data between the MOC and the PPS. The primary features of MPISS are file transfers (both with and without PPS specific protocols), logging of file transfer and system events to local files and a standard messaging bus, short term storage of data files to facilitate retransmissions, and generation of file transfer accounting reports. The system includes a graphical user interface (GUI) to control the system, allow manual operations, and to display events in real time. The PPS specific protocols are an enhanced version of those that were developed for the Tropical Rainfall Measuring Mission (TRMM). All file transfers between the MOC and the PPS use the SSH File Transfer Protocol (SFTP). For reports and data files generated within the MOC, no additional protocols are used when transferring files to the PPS. For observatory data files, an additional handshaking protocol of data notices and data receipts is used. MPISS generates and sends to the PPS data notices containing data start and stop times along with a checksum for the file for each observatory data file transmitted. MPISS retrieves the PPS generated data receipts that indicate the success or failure of the PPS to ingest the data file and/or notice. MPISS retransmits the appropriate files as indicated in the receipt when required. MPISS also automatically retrieves files from the PPS. The unique feature of this software is the use of both standard and PPS specific protocols in parallel. The advantage of this capability is that it supports users that require the PPS protocol as well as those that do not require it. The system is highly configurable to accommodate the needs of future users.
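
    A hedged sketch of the data-notice handshake, with illustrative field names rather than the actual GPM/PPS notice format: each observatory file is described by its start/stop times and a checksum that the receiving side can verify before issuing a data receipt.

    ```python
    import hashlib
    import json

    def make_data_notice(path, start_time, stop_time):
        """Build a data notice for an observatory file: start/stop times plus a
        checksum the receiver can use to verify the transfer before acknowledging it.
        (Field names are illustrative, not the actual MPISS/PPS notice format.)"""
        digest = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return json.dumps({
            "file": path,
            "start": start_time,
            "stop": stop_time,
            "checksum": digest.hexdigest(),
        })
    ```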

  11. PIMS-Universal Payload Information Management

    NASA Technical Reports Server (NTRS)

    Elmore, Ralph; McNair, Ann R. (Technical Monitor)

    2002-01-01

    As the overall manager and integrator of International Space Station (ISS) science payloads and experiments, the Payload Operations Integration Center (POIC) at Marshall Space Flight Center had a critical need to provide an information management system for the exchange and management of ISS payload files as well as to coordinate ISS payload related operational changes. The POIC's information management system has a fundamental requirement to provide secure operational access not only to users physically located at the POIC, but also to provide collaborative access to remote experimenters and International Partners. The Payload Information Management System (PIMS) is a ground based electronic document configuration management and workflow system that was built to serve that need. Functionally, PIMS provides the following document management related capabilities: 1. File access control, storage, and retrieval from a central repository vault. 2. Collection of supplemental data about files in the vault. 3. File exchange with a PIMS GUI client, or any FTP connection. 4. Placement of files into an FTP-accessible dropbox for pickup by interfacing facilities, including files transmitted for spacecraft uplink. 5. Transmission of email messages to users notifying them of new version availability. 6. Polling of intermediate facility dropboxes for files that will automatically be processed by PIMS. 7. An API that allows other POIC applications to access PIMS information. Functionally, PIMS provides the following Change Request processing capabilities: 1. The ability to create, view, manipulate, and query information about Operations Change Requests (OCRs). 2. An adaptable workflow approval of OCRs with routing through developers, facility leads, POIC leads, reviewers, and implementers. Email messages can be sent to users either involving them in the workflow process or simply notifying them of OCR approval progress. All PIMS document management and OCR workflow controls are coordinated through and routed to individual users' "to do" list tasks. A user is given a task when it is their turn to perform some action relating to the approval of the document or OCR. The user's available actions are restricted to only the functions available for the assigned task. Certain actions, such as review or action implementation by non-PIMS users, can also be coordinated through automated emails.

  12. Metal–organic framework based mixed matrix membranes: a solution for highly efficient CO2 capture?

    PubMed Central

    Seoane, Beatriz; Coronas, Joaquin; Gascon, Ignacio; Benavides, Miren Etxeberria; Karvan, Oğuz; Caro, Jürgen; Kapteijn, Freek

    2015-01-01

    The field of metal–organic framework based mixed matrix membranes (M4s) is critically reviewed, with special emphasis on their application in CO2 capture during energy generation. After introducing the most relevant parameters affecting membrane performance, we define targets in terms of selectivity and productivity based on existing literature on process design for pre- and post-combustion CO2 capture. Subsequently, the state of the art in M4s is reviewed against these targets. Because final application of these membranes will only be possible if thin separation layers can be produced, the latest advances in the manufacture of M4 hollow fibers are discussed. Finally, the recent efforts in understanding the separation performance of these complex composite materials and future research directions are outlined. PMID:25692487

  13. A Pixel Correlation Technique for Smaller Telescopes to Measure Doubles

    NASA Astrophysics Data System (ADS)

    Wiley, E. O.

    2013-04-01

    Pixel correlation uses the same reduction techniques as speckle imaging but relies on autocorrelation among captured pixel hits rather than true speckles. A video camera operating at exposure times (8-66 milliseconds) similar to lucky imaging is used to capture 400-1,000 video frames. The AVI files are converted to bitmap images and analyzed using the interferometric algorithms in REDUC, using all frames. This results in a series of correlograms from which theta and rho can be measured. Results using a 20 cm (8") Dall-Kirkham working at f/22.5 are presented for doubles with separations between 1" and 5.7" under average seeing conditions. I conclude that this form of visualizing and analyzing visual double stars is a viable alternative to lucky imaging that can be employed by telescopes that are too small in aperture to capture a sufficient number of speckles for true speckle interferometry.
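
    A minimal sketch of the underlying reduction step, assuming the video frames are available as NumPy arrays (this is not the REDUC code itself): by the Wiener-Khinchin theorem the autocorrelation of each frame is the inverse FFT of its power spectrum, so averaging power spectra over many short exposures builds up the correlogram from which rho and theta are read off.

    ```python
    import numpy as np

    def mean_autocorrelation(frames):
        """Average the spatial autocorrelation of a stack of short-exposure frames
        by accumulating their power spectra and inverse-transforming the mean."""
        power = np.zeros(frames[0].shape)
        for frame in frames:
            spectrum = np.fft.fft2(frame - frame.mean())
            power += np.abs(spectrum) ** 2
        acf = np.fft.ifft2(power / len(frames)).real
        return np.fft.fftshift(acf)  # peak at the center; companion shows as side lobes
    ```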

  14. Development of a High Fidelity Dynamic Module of the Advanced Resistive Exercise Device (ARED) Using Adams

    NASA Technical Reports Server (NTRS)

    Humphreys, B. T.; Thompson, W. K.; Lewandowski, B. E.; Cadwell, E. E.; Newby, N. J.; Fincke, R. S.; Sheehan, C.; Mulugeta, L.

    2012-01-01

    NASA's Digital Astronaut Project (DAP) implements well-vetted computational models to predict and assess spaceflight health and performance risks, and enhance countermeasure development. DAP provides expertise and computation tools to its research customers for model development, integration, or analysis. DAP is currently supporting the NASA Exercise Physiology and Countermeasures (ExPC) project by integrating their biomechanical models of specific exercise movements with dynamic models of the devices on which the exercises were performed. This presentation focuses on the development of a high fidelity dynamic module of the Advanced Resistive Exercise Device (ARED) on board the ISS. The ARED module, illustrated in the figure below, was developed using the Adams (MSC Santa Ana, California) simulation package. The Adams package provides the capabilities to perform multi rigid body, flexible body, and mixed dynamic analyses of complex mechanisms. These capabilities were applied to accurately simulate: Inertial and mass properties of the device such as the vibration isolation system (VIS) effects and other ARED components, Non-linear joint friction effects, The gas law dynamics of the vacuum cylinders and VIS components using custom written differential state equations, The ARED flywheel dynamics, including torque limiting clutch. Design data from the JSC ARED Engineering team was utilized in developing the model. This included solid modeling geometry files, component/system specifications, engineering reports and available data sets. The Adams ARED module is importable into LifeMOD (Life Modeler, Inc., San Clemente, CA) for biomechanical analyses of different resistive exercises such as squat and dead-lift. Using motion capture data from ground test subjects, the ExPC developed biomechanical exercise models in LifeMOD. The Adams ARED device module was then integrated with the exercise subject model into one integrated dynamic model. This presentation will describe the development of the Adams ARED module including its capabilities, limitations, and assumptions. Preliminary results, validation activities, and a practical application of the module to inform the relative effect of the flywheels on exercise will be discussed.

  15. Functional evaluation of telemedicine with super high definition images and B-ISDN.

    PubMed

    Takeda, H; Matsumura, Y; Okada, T; Kuwata, S; Komori, M; Takahashi, T; Minatom, K; Hashimoto, T; Wada, M; Fujio, Y

    1998-01-01

    In order to determine whether a super high definition (SHD) image format running at 2048 x 2048 pixels and 60 frames/sec is suitable for telemedicine, we established a filing system for medical images and performed two experiments on the transmission of high quality images. All images of various types, produced from one case of ischemic heart disease, were digitized and registered into the filing system. Images consisted of plain chest x-ray, electrocardiogram, ultrasound cardiogram, cardiac scintigram, coronary angiogram, left ventriculogram, and so on. All images were animated and totaled 243. We prepared a graphic user interface (GUI) for image retrieval based on the medical events and modalities. Twenty-one cardiac specialists evaluated the quality of the SHD images as somewhat poor compared to the original pictures but sufficient for making diagnoses, and effective as a tool for teaching and case study purposes. The system's capability of simultaneously displaying several animated images was deemed especially effective in aiding comprehension of the diagnosis. Efficient input methods and the capacity to file all produced images are future issues. Using the B-ISDN network, the SHD file was prefetched to the servers at Kyoto University Hospital and the BBCC (Broadband ISDN Business chance & Culture Creation) laboratory as a telemedicine experiment. The simultaneous video conference system, control of image retrieval, and pointing function made the teleconference successful in terms of high quality of medical images, quick response time, and interactive data exchange.

  16. A Scientific Data Provenance Harvester for Distributed Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephan, Eric G.; Raju, Bibi; Elsethagen, Todd O.

    Data provenance provides a way for scientists to observe how experimental data originates, conveys process history, and explains influential factors such as experimental rationale and associated environmental factors from system metrics measured at runtime. The US Department of Energy Office of Science Integrated end-to-end Performance Prediction and Diagnosis for Extreme Scientific Workflows (IPPD) project has developed a provenance harvester that is capable of collecting observations from file based evidence typically produced by distributed applications. To achieve this, file based evidence is extracted and transformed into an intermediate data format inspired in part by W3C CSV on the Web recommendations, called the Harvester Provenance Application Interface (HAPI) syntax. This syntax provides a general means to pre-stage provenance into messages that are both human readable and capable of being written to a provenance store, Provenance Environment (ProvEn). HAPI is being applied to harvest provenance from climate ensemble runs for the Accelerated Climate Modeling for Energy (ACME) project funded under the U.S. Department of Energy's Office of Biological and Environmental Research (BER) Earth System Modeling (ESM) program. ACME informally provides provenance in a native form through configuration files, directory structures, and log files that contain success/failure indicators, code traces, and performance measurements. Because of its generic format, HAPI is also being applied to harvest tabular job management provenance from Belle II DIRAC scheduler relational database tables as well as other scientific applications that log provenance related information.
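
    A loose sketch of file-based provenance harvesting in the spirit described above (the regular expression and field names are hypothetical, not the actual HAPI syntax): scan a run log for success/failure lines and stage them as rows of an intermediate tabular file.

    ```python
    import csv
    import re

    def harvest_log(log_path, out_path):
        """Scan a run log for timestamped SUCCESS/FAILURE lines and stage them as
        rows of a simple CSV file, a stand-in for a richer intermediate syntax."""
        pattern = re.compile(r"^(?P<time>\S+)\s+(?P<status>SUCCESS|FAILURE)\s+(?P<task>.+)$")
        with open(log_path) as log, open(out_path, "w", newline="") as out:
            writer = csv.writer(out)
            writer.writerow(["time", "status", "task"])
            for line in log:
                match = pattern.match(line.strip())
                if match:
                    writer.writerow([match.group("time"),
                                     match.group("status"),
                                     match.group("task")])
    ```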

  17. Improved Ascertainment of Pregnancy-Associated Suicides and Homicides in North Carolina.

    PubMed

    Austin, Anna E; Vladutiu, Catherine J; Jones-Vessey, Kathleen A; Norwood, Tammy S; Proescholdbell, Scott K; Menard, M Kathryn

    2016-11-01

    Injuries, including those resulting from violence, are a leading cause of death during pregnancy and the postpartum period. North Carolina, along with other states, has implemented surveillance systems to improve reporting of maternal deaths, but their ability to capture violent deaths is unknown. The purpose of this study was to quantify the improvement in ascertainment of pregnancy-associated suicides and homicides by linking data from the North Carolina Violent Death Reporting System (NC-VDRS) to traditional maternal mortality surveillance files. Enhanced case ascertainment was used to identify suicides and homicides that occurred during or up to 1 year after pregnancy from 2005 to 2011 in North Carolina. NC-VDRS data were linked to traditional maternal mortality surveillance files (i.e., death certificates with any mention of pregnancy or matched to a live birth or fetal death record and hospital discharge records for women who died in the hospital with a pregnancy-related diagnosis). Mortality ratios were calculated by case ascertainment method. Analyses were conducted in 2015. A total of 29 suicides and 55 homicides were identified among pregnant and postpartum women through enhanced case ascertainment as compared with 20 and 34, respectively, from traditional case ascertainment. Linkage to NC-VDRS captured 55.6% more pregnancy-associated violent deaths than traditional surveillance alone, resulting in higher mortality ratios for suicide (2.3 vs 3.3 deaths per 100,000 live births) and homicide (3.9 vs 6.2 deaths per 100,000 live births). Linking traditional maternal mortality files to NC-VDRS provided a notable improvement in ascertainment of pregnancy-associated violent deaths. Published by Elsevier Inc.

  18. Efficacy of CM-Wire, M-Wire, and Nickel-Titanium Instruments for Removing Filling Material from Curved Root Canals: A Micro-Computed Tomography Study.

    PubMed

    Rodrigues, Clarissa Teles; Duarte, Marco Antonio Hungaro; de Almeida, Marcela Milanezi; de Andrade, Flaviana Bombarda; Bernardineli, Norberti

    2016-11-01

    The aim of this ex vivo study was to evaluate the removal of filling material after using CM-wire, M-wire, and nickel-titanium instruments in both reciprocating and rotary motions in curved canals. Thirty maxillary lateral incisors were divided into 9 groups according to retreatment procedures: Reciproc R25 followed by Mtwo 40/.04 and ProDesign Logic 50/.01 files; ProDesign R 25/.06 followed by ProDesign Logic 40/.05 and ProDesign Logic 50/.01 files; and Gates-Glidden drills, Hedström files, and K-files up to apical size 30 followed by K-file 40 and K-file 50 up to the working length. Micro-computed tomography scans were performed before and after each reinstrumentation procedure to evaluate root canal filling removal. Statistical analysis was performed with Kruskal-Wallis, Friedman, and Wilcoxon tests (P < .05). No significant differences in filling material removal were found in the 3 groups of teeth. The use of Mtwo and ProDesign Logic 40/.05 rotary files did not enhance filling material removal after the use of reciprocating files. The use of ProDesign Logic 50/.01 files significantly reduced the amount of filling material at the apical levels compared with the use of reciprocating files. Association of reciprocating and rotary files was capable of removing a large amount of filling material in the retreatment of curved canals, irrespective of the type of alloy of the instruments. The use of a ProDesign Logic 50/.01 file for apical preparation significantly reduced the amount of remnant material in the apical portion when compared with reciprocating instruments. Copyright © 2016 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  19. Purity and cleanness of aerogel as a cosmic dust capture medium

    NASA Technical Reports Server (NTRS)

    Tsou, P.; Fleming, R. H.; Lindley, P. M.; Craig, A. Y.; Blake, D.

    1994-01-01

    The capability for capturing micrometeoroids intact, through laboratory simulations and in space, in passive underdense silica aerogel offers a valuable tool for cosmic dust research. The integrity of the sample handling medium can substantially modify the integrity of the sample. Intact capture is a violent hypervelocity event: the integrity of the capturing medium can cause even greater modification of the sample. Doubts about the suitability of silica aerogel as a capture medium were raised at the 20th LPSC, and questions were raised again at the recent workshop on Particle Capture, Recovery, and Velocity Trajectory Measurement Technologies. Assessments of aerogel's volatile components and carbon contents have been made. We report the results of laboratory measurements of the purity and cleanliness of silica aerogel used for several Sample Return Experiments flown on the Get Away Special program.

  20. Development of a database for Louisiana highway bridge scour data : technical summary.

    DOT National Transportation Integrated Search

    1999-10-01

    The objectives of the project included: 1) developing a database with manipulation capabilities such as data retrieval, visualization, and update; and 2) inputting the existing scour data from DOTD files into the database.

  1. 78 FR 54968 - Defense Federal Acquisition Regulation Supplement; Technical Amendments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-09

    ...--COMPETITION REQUIREMENTS 0 3. Revise the section heading of 203.302-3 to read as follows: 206.302-3 Industrial mobilization, engineering, developmental, or research capability, or expert services. [FR Doc. 2013-21835 Filed...

  2. Statistical evaluation of PACSTAT random number generation capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, G.F.; Toland, M.R.; Harty, H.

    1988-05-01

    This report summarizes the work performed in verifying the general purpose Monte Carlo driver-program PACSTAT. The main objective of the work was to verify the performance of PACSTAT's random number generation capabilities. Secondary objectives were to document (using controlled configuration management procedures) changes made in PACSTAT at Pacific Northwest Laboratory, and to assure that PACSTAT input and output files satisfy quality assurance traceability constraints. Upon receipt of the PRIME version of the PACSTAT code from the Basalt Waste Isolation Project, Pacific Northwest Laboratory staff converted the code to run on Digital Equipment Corporation (DEC) VAXs. The modifications to PACSTAT were implemented using the WITNESS configuration management system, with the modifications themselves intended to make the code as portable as possible. Certain modifications were made to make the PACSTAT input and output files conform to quality assurance traceability constraints. 10 refs., 17 figs., 6 tabs.

  3. The distributed production system of the SuperB project: description and results

    NASA Astrophysics Data System (ADS)

    Brown, D.; Corvo, M.; Di Simone, A.; Fella, A.; Luppi, E.; Paoloni, E.; Stroili, R.; Tomassetti, L.

    2011-12-01

    The SuperB experiment needs large samples of MonteCarlo simulated events in order to finalize the detector design and to estimate the data analysis performances. The requirements are beyond the capabilities of a single computing farm, so a distributed production model capable of exploiting the existing HEP worldwide distributed computing infrastructure is needed. In this paper we describe the set of tools that have been developed to manage the production of the required simulated events. The production of events follows three main phases: distribution of input data files to the remote site Storage Elements (SE); job submission, via SuperB GANGA interface, to all available remote sites; output files transfer to CNAF repository. The job workflow includes procedures for consistency checking, monitoring, data handling and bookkeeping. A replication mechanism allows storing the job output on the local site SE. Results from 2010 official productions are reported.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilsche, Thomas; Schuchart, Joseph; Cope, Joseph

    Event tracing is an important tool for understanding the performance of parallel applications. As concurrency increases in leadership-class computing systems, the quantity of performance log data can overload the parallel file system, perturbing the application being observed. In this work we present a solution for event tracing at leadership scales. We enhance the I/O forwarding system software to aggregate and reorganize log data prior to writing to the storage system, significantly reducing the burden on the underlying file system for this type of traffic. Furthermore, we augment the I/O forwarding system with a write buffering capability to limit the impact of artificial perturbations from log data accesses on traced applications. To validate the approach, we modify the Vampir tracing tool to take advantage of this new capability and show that the approach increases the maximum traced application size by a factor of 5x to more than 200,000 processors.
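
    A schematic sketch of the write-buffering idea (not the actual I/O forwarding or Vampir code): trace records accumulate in memory and are flushed in large batches, so the traced application sees far fewer small writes hitting the file system.

    ```python
    class BufferedTraceWriter:
        """Accumulate trace records in memory and flush them to storage in large
        batches, reducing the number of small writes issued while tracing."""

        def __init__(self, path, flush_bytes=4 << 20):
            self._f = open(path, "ab")
            self._buf = bytearray()
            self._limit = flush_bytes

        def write_event(self, record: bytes):
            self._buf += record
            if len(self._buf) >= self._limit:
                self.flush()

        def flush(self):
            if self._buf:
                self._f.write(self._buf)
                self._buf.clear()

        def close(self):
            self.flush()
            self._f.close()
    ```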

  5. Automated Predictive Diagnosis (APD): A 3-tiered shell for building expert systems for automated predictions and decision making

    NASA Technical Reports Server (NTRS)

    Steib, Michael

    1991-01-01

    The APD software features include: on-line help; a three-level architecture (logic environment, setup/application environment, data environment); an explanation capability; and file handling. The kinds of experimentation and record keeping that lead to effective expert systems are facilitated by: (1) a library of inferencing modules (in the logic environment); (2) an explanation capability which reveals logic strategies to users; (3) automated file naming conventions; (4) an information retrieval system; and (5) on-line help. These aid in the effective use of knowledge, debugging, and experimentation. Since the APD software anticipates the logical rules becoming complicated, it is embedded in a production system language (CLIPS) to ensure the full power of the production system paradigm of CLIPS and the availability of the procedural language C. The development of the APD software and three example applications is discussed: a toy application, an experimental application, and an operational prototype for submarine maintenance predictions.

  6. An introduction to the Marshall information retrieval and display system

    NASA Technical Reports Server (NTRS)

    1974-01-01

    An on-line terminal oriented data storage and retrieval system is presented which allows a user to extract and process information from stored data bases. The use of on-line terminals for extracting and displaying data from the data bases provides a fast and responsive method for obtaining needed information. The system consists of general purpose computer programs that provide the overall capabilities of the total system. The system can process any number of data files via a Dictionary (one for each file) which describes the data format to the system. New files may be added to the system at any time, and reprogramming is not required. Illustrations of the system are shown, and sample inquiries and responses are given.

  7. A Subgrid Approach for Modeling Microtopography Effects on Overland Flow: Application to Polygonal Tundra: Modeling Archive

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahmad Jan; Ethan Coon; Scott Painter

    This Modeling Archive is in support of an NGEE Arctic manuscript under review. A new subgrid model was implemented in the Advanced Terrestrial Simulator (ATS) to capture micro-topography effects on surface flow. Fine-scale simulations of seven individual ice-wedge polygons and a cluster of polygons were compared between the subgrid model and the no-subgrid model. Our findings confirm that the effects of small-scale spatial heterogeneities can be captured in the coarsened models. The dataset contains the meshes, input files, and subgrid parameters used in the simulations. Python scripts for post-processing and files for geometric analyses are also included.

  8. Transforming War Fighting through the Use of Service Based Architecture (SBA) Technology

    DTIC Science & Technology

    2006-05-04

    Disseminates near-real-time video & telemetry to users on the network using standard web-based protocols; provides web-based access to archived video files. MTI/Target Tracks Service Capabilities: disseminates near-real-time MTI and Target Tracks to users on the network based on a consumer-specified geographic filter. IBS SIGINT Service Capabilities: disseminates near-real-time IBS SIGINT data to users on the network based on a consumer-specified geographic filter.

  9. Enhanced Modeling of First-Order Plant Equations of Motion for Aeroelastic and Aeroservoelastic Applications

    NASA Technical Reports Server (NTRS)

    Pototzky, Anthony S.

    2010-01-01

    A methodology is described for generating first-order plant equations of motion for aeroelastic and aeroservoelastic applications. The description begins with the process of generating data files representing specialized mode-shapes, such as rigid-body and control surface modes, using both PATRAN and NASTRAN analysis. NASTRAN executes the 146 solution sequence using numerous Direct Matrix Abstraction Program (DMAP) calls to import the mode-shape files and to perform the aeroelastic response analysis. The aeroelastic response analysis calculates and extracts structural frequencies, generalized masses, frequency-dependent generalized aerodynamic force (GAF) coefficients, sensor deflections and load coefficients data as text-formatted data files. The data files are then re-sequenced and re-formatted using a custom written FORTRAN program. The text-formatted data files are stored and coefficients for s-plane equations are fitted to the frequency-dependent GAF coefficients using two Interactions of Structures, Aerodynamics and Controls (ISAC) programs. With tabular files from stored data created by ISAC, MATLAB generates the first-order aeroservoelastic plant equations of motion. These equations include control-surface actuator, turbulence, sensor and load modeling. Altitude varying root-locus plot and PSD plot results for a model of the F-18 aircraft are presented to demonstrate the capability.

  10. Capturing asteroids into bound orbits around the earth: Massive early return on an asteroid terminal defense system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hills, J.G.

    1992-02-06

    Nuclear explosives may be used to capture small asteroids (e.g., 20-50 meters in diameter) into bound orbits around the earth. The captured objects could be used for construction material for manned and unmanned activity in Earth orbit. Asteroids with small approach velocities, which are the ones most likely to have close approaches to the Earth, require the least energy for capture. They are particularly easy to capture if they pass within one Earth radius of the surface of the Earth. They could be intercepted with intercontinental missiles if the latter were retrofit with a more flexible guiding and homing capability. This asteroid capture-defense system could be implemented in a few years at low cost by using decommissioned ICMs. The economic value of even one captured asteroid is many times the initial investment. The asteroid capture system would be an essential part of the learning curve for dealing with larger asteroids that can hit the earth.

  11. ION Configuration Editor

    NASA Technical Reports Server (NTRS)

    Borgen, Richard L.

    2013-01-01

    The configuration of ION (Interplanetary Overlay Network) network nodes is a manual task that is complex, time-consuming, and error-prone. This program seeks to accelerate this job and produce reliable configurations. The ION Configuration Editor is a model-based smart editor based on Eclipse Modeling Framework technology. An ION network designer uses this Eclipse-based GUI to construct a data model of the complete target network and then generate configurations. The data model is captured in an XML file. Intrinsic editor features aid in achieving model correctness, such as field fill-in, type-checking, lists of valid values, and suitable default values. Additionally, an explicit "validation" feature executes custom rules to catch more subtle model errors. A "survey" feature provides a set of reports giving an overview of the entire network, enabling a quick assessment of the model's completeness and correctness. The "configuration" feature produces the main final result, a complete set of ION configuration files (eight distinct file types) for each ION node in the network.

  12. Development of an e-VLBI Data Transport Software Suite with VDIF

    NASA Technical Reports Server (NTRS)

    Sekido, Mamoru; Takefuji, Kazuhiro; Kimura, Moritaka; Hobiger, Thomas; Kokado, Kensuke; Nozawa, Kentarou; Kurihara, Shinobu; Shinno, Takuya; Takahashi, Fujinobu

    2010-01-01

    We have developed a software library (KVTP-lib) for VLBI data transmission over the network with the VDIF (VLBI Data Interchange Format), which is the newly proposed standard VLBI data format designed for electronic data transfer over the network. The software package keeps the application layer (VDIF frame) and the transmission layer separate, so that each layer can be developed efficiently. The real-time VLBI data transmission tool sudp-send is an application tool based on the KVTP-lib library. sudp-send captures the VLBI data stream from the VSI-H interface with the K5/VSI PC board and writes the data to file in standard Linux file format or transmits it to the network using the simple-UDP (SUDP) protocol. Another tool, sudp-recv, receives the data stream from the network and writes the data to file in a specific VLBI format (K5/VSSP, VDIF, or Mark 5B). This software system has been implemented on the Wettzell-Tsukuba baseline; evaluation before operational employment is under way.
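
    An illustrative sender loosely in the spirit of sudp-send, with a made-up wire format (the real SUDP protocol and VDIF framing are more involved): each frame is prefixed with a sequence number and pushed over UDP so the receiver can detect gaps.

    ```python
    import socket
    import struct

    def send_frames(frames, host, port):
        """Send a list of data frames (bytes) over UDP, each prefixed with a
        64-bit sequence number (illustrative format, not the actual SUDP protocol)."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        try:
            for seq, frame in enumerate(frames):
                sock.sendto(struct.pack("!Q", seq) + frame, (host, port))
        finally:
            sock.close()
    ```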

  13. Atmospheric Dispersion Capability for T2VOC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oldenburg, Curtis M.

    2005-09-19

    Atmospheric transport by variable-K theory dispersion has been added to T2VOC. The new code, T2VOCA, models flow and transport in the subsurface identically to T2VOC, but includes also the capability for modeling passive multicomponent variable-K theory dispersion in an atmospheric region assumed to be flat, horizontal, and with a logarithmic wind profile. The specification of the logarithmic wind profile in the T2VOC input file is automated through the use of a build code called ATMDISPV. The new capability is demonstrated on 2-D and 3-D example problems described in this report.
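
    For reference, a minimal sketch of the standard logarithmic wind profile assumed for such a flat atmospheric region, u(z) = (u*/kappa) ln(z/z0); the parameter names below are illustrative and are not T2VOCA input keywords.

    ```python
    import math

    def log_wind_profile(z, u_star, z0, kappa=0.41):
        """Wind speed at height z (m) given friction velocity u_star (m/s),
        roughness length z0 (m), and the von Karman constant kappa."""
        return (u_star / kappa) * math.log(z / z0)

    # Example: u* = 0.3 m/s, z0 = 0.05 m, wind speed at 10 m above ground
    print(log_wind_profile(10.0, 0.3, 0.05))
    ```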

  14. The integrated analysis capability (IAC Level 2.0)

    NASA Technical Reports Server (NTRS)

    Frisch, Harold P.; Vos, Robert G.

    1988-01-01

    The critical data management issues involved in the development of the integral analysis capability (IAC), Level 2, to support the design analysis and performance evaluation of large space structures, are examined. In particular, attention is given to the advantages and disadvantages of the formalized data base; merging of the matrix and relational data concepts; data types, query operators, and data handling; sequential versus direct-access files; local versus global data access; programming languages and host machines; and data flow techniques. The discussion also covers system architecture, recent system level enhancements, executive/user interface capabilities, and technology applications.

  15. The design of a microfluidic biochip for the rapid, multiplexed detection of foodborne pathogens by surface plasmon resonance imaging

    NASA Astrophysics Data System (ADS)

    Zordan, Michael D.; Grafton, Meggie M. G.; Park, Kinam; Leary, James F.

    2010-02-01

    The rapid detection of foodborne pathogens is increasingly important due to the rising occurrence of contaminated food supplies. We have previously demonstrated the design of a hybrid optical device that has the capability to perform realtime surface plasmon resonance (SPR) and epi-fluorescence imaging. We now present the design of a microfluidic biochip consisting of a two-dimensional array of functionalized gold spots. The spots on the array have been functionalized with capture peptides that specifically bind E. coli O157:H7 or Salmonella enterica. This array is enclosed by a PDMS microfluidic flow cell. A magnetically pre-concentrated sample is injected into the biochip, and whole pathogens will bind to the capture array. The previously constructed optical device is being used to detect the presence and identity of captured pathogens using SPR imaging. This detection occurs in a label-free manner, and does not require the culture of bacterial samples. Molecular imaging can also be performed using the epi-fluorescence capabilities of the device to determine pathogen state, or to validate the identity of the captured pathogens using fluorescently labeled antibodies. We demonstrate the real-time screening of a sample for the presence of E. coli O157:H7 and Salmonella enterica. Additionally the mechanical properties of the microfluidic flow cell will be assessed. The effect of these properties on pathogen capture will be examined.

  16. Androgynous, Reconfigurable Closed Loop Feedback Controlled Low Impact Docking System With Load Sensing Electromagnetic Capture Ring

    NASA Technical Reports Server (NTRS)

    Lewis, James L. (Inventor); Carroll, Monty B. (Inventor); Morales, Ray H. (Inventor); Le, Thang D. (Inventor)

    2002-01-01

    The present invention relates to a fully androgynous, reconfigurable, closed loop feedback controlled low impact docking system with a load sensing electromagnetic capture ring. The docking system of the present invention preferably comprises two docking assemblies, each docking assembly comprising a load sensing ring having an outer face, one or more electromagnets, and one or more load cells coupled to said load sensing ring. The docking assembly further comprises a plurality of actuator arms coupled to said load sensing ring and capable of dynamically adjusting the orientation of said load sensing ring, and a reconfigurable closed loop control system capable of analyzing signals originating from said plurality of load cells and of outputting real time control for each of the actuators. The docking assembly of the present invention incorporates an active load sensing system to automatically and dynamically adjust the load sensing ring during capture instead of requiring significant force to push and realign the ring.

  17. Applying Content Management to Automated Provenance Capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schuchardt, Karen L.; Gibson, Tara D.; Stephan, Eric G.

    2008-04-10

    Workflows and data pipelines are becoming increasingly valuable in both computational and experimental sciences. These automated systems are capable of generating significantly more data within the same amount of time than their manual counterparts. Automatically capturing and recording data provenance and annotation as part of these workflows is critical for data management, verification, and dissemination. Our goal in addressing the provenance challenge was to develop an end-to-end system that demonstrates real-time capture, persistent content management, and ad-hoc searches of both provenance and metadata using open source software and standard protocols. We describe our prototype, which extends the Kepler workflow tools for the execution environment, the Scientific Annotation Middleware (SAM) content management software for data services, and an existing HTTP-based query protocol. Our implementation offers several unique capabilities, and through the use of standards, is able to provide access to the provenance record to a variety of commonly available client tools.

  18. Converting laserdisc video to digital video: a demonstration project using brain animations.

    PubMed

    Jao, C S; Hier, D B; Brint, S U

    1995-01-01

    Interactive laserdiscs are of limited value in large group learning situations due to the expense of establishing multiple workstations. The authors implemented an alternative to laserdisc video by using indexed digital video combined with an expert system. High-quality video was captured from a laserdisc player and combined with waveform audio into an audio-video-interleave (AVI) file format in the Microsoft Video-for-Windows environment (Microsoft Corp., Seattle, WA). With the use of an expert system, a knowledge-based computer program provided random access to these indexed AVI files. The program can be played on any multimedia computer without the need for laserdiscs. This system offers a high level of interactive video without the overhead and cost of a laserdisc player.

  19. JWST science data products

    NASA Astrophysics Data System (ADS)

    Swade, Daryl; Bushouse, Howard; Greene, Gretchen; Swam, Michael

    2014-07-01

    Science data products for James Webb Space Telescope (JWST) observations will be generated by the Data Management Subsystem (DMS) within the JWST Science and Operations Center (S&OC) at the Space Telescope Science Institute (STScI). Data processing pipelines within the DMS will produce uncalibrated and calibrated exposure files, as well as higher level data products that result from combined exposures, such as mosaic images. Information to support the science observations, for example data from engineering telemetry, proposer inputs, and observation planning, will be captured and incorporated into the science data products. All files will be generated in Flexible Image Transport System (FITS) format. The data products will be made available through the Mikulski Archive for Space Telescopes (MAST) and adhere to International Virtual Observatory Alliance (IVOA) standard data protocols.
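
    Since the products described here are FITS files, programmatic access can be illustrated with a minimal sketch (not part of the DMS pipelines; the file name is hypothetical and the astropy library is an assumed dependency):

        # Minimal sketch: inspect a FITS science product with astropy (hypothetical file name).
        from astropy.io import fits

        with fits.open("jw_example_cal.fits") as hdul:   # hypothetical product name
            hdul.info()                                  # list the HDUs in the file
            primary = hdul[0].header                     # primary header keywords
            print(primary.get("TELESCOP"), primary.get("INSTRUME"))
            data = hdul[1].data                          # first extension, e.g. the science image
            if data is not None:
                print(data.shape, data.dtype)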

  20. REPHLEX II: An information management system for the ARS Water Data Base

    NASA Astrophysics Data System (ADS)

    Thurman, Jane L.

    1993-08-01

    The REPHLEX II computer system is an on-line information management system which allows scientists, engineers, and other researchers to retrieve data from the ARS Water Data Base using asynchronous communications. The system features two phone lines handling baud rates from 300 to 2400, customized menus to facilitate browsing, help screens, direct access to information and data files, electronic mail processing, file transfers using the XMODEM protocol, and log-in procedures which capture information on new users, process passwords, and log activity for a permanent audit trail. The primary data base on the REPHLEX II system is the ARS Water Data Base which consists of rainfall and runoff data from experimental agricultural watersheds located in the United States.

  1. Teaching Bibliometric Analysis and MS/DOS Commands.

    ERIC Educational Resources Information Center

    Dou, Henri; And Others

    1988-01-01

    Outlines the steps involved in bibliometric studies, and demonstrates the ability to execute simple studies on microcomputers by downloading files using only the capability of MS/DOS. Detailed illustrations of the MS/DOS commands used are provided. (eight references) (CLB)

  2. BOPACE 3-D addendum: The Boeing plastic analysis capabilities for 3-dimensional solids using isoparametric finite elements

    NASA Technical Reports Server (NTRS)

    Vos, R. G.; Straayer, J. W.

    1975-01-01

    Modifications and additions incorporated into the BOPACE 3-D program are described. Updates to the program input data formats, error messages, file usage, size limitations, and overlay schematic are included.

  3. Neutron Capture Gamma-Ray Libraries for Nuclear Applications

    NASA Astrophysics Data System (ADS)

    Sleaford, B. W.; Firestone, R. B.; Summers, N.; Escher, J.; Hurst, A.; Krticka, M.; Basunia, S.; Molnar, G.; Belgya, T.; Revay, Z.; Choi, H. D.

    2011-06-01

    The neutron capture reaction is useful in identifying and analyzing the gamma-ray spectrum from an unknown assembly as it gives unambiguous information on its composition. This can be done passively or actively where an external neutron source is used to probe an unknown assembly. There are known capture gamma-ray data gaps in the ENDF libraries used by transport codes for various nuclear applications. The Evaluated Gamma-ray Activation file (EGAF) is a new thermal neutron capture database of discrete line spectra and cross sections for over 260 isotopes that was developed as part of an IAEA Coordinated Research Project. EGAF is being used to improve the capture gamma production in ENDF libraries. For medium to heavy nuclei the quasi continuum contribution to the gamma cascades is not experimentally resolved. The continuum contains up to 90% of all the decay energy and is modeled here with the statistical nuclear structure code DICEBOX. This code also provides a consistency check of the level scheme nuclear structure evaluation. The calculated continuum is of sufficient accuracy to include in the ENDF libraries. This analysis also determines new total thermal capture cross sections and provides an improved RIPL database. For higher energy neutron capture there is less experimental data available making benchmarking of the modeling codes more difficult. We are investigating the capture spectra from higher energy neutrons experimentally using surrogate reactions and modeling this with Hauser-Feshbach codes. This can then be used to benchmark CASINO, a version of DICEBOX modified for neutron capture at higher energy. This can be used to simulate spectra from neutron capture at incident neutron energies up to 20 MeV to improve the gamma-ray spectrum in neutron data libraries used for transport modeling of unknown assemblies.

  4. Database Objects vs Files: Evaluation of alternative strategies for managing large remote sensing data

    NASA Astrophysics Data System (ADS)

    Baru, Chaitan; Nandigam, Viswanath; Krishnan, Sriram

    2010-05-01

    Increasingly, the geoscience user community expects modern IT capabilities to be available in service of their research and education activities, including the ability to easily access and process large remote sensing datasets via online portals such as GEON (www.geongrid.org) and OpenTopography (opentopography.org). However, serving such datasets via online data portals presents a number of challenges. In this talk, we will evaluate the pros and cons of alternative storage strategies for management and processing of such datasets using binary large object implementations (BLOBs) in database systems versus implementation in Hadoop files using the Hadoop Distributed File System (HDFS). The storage and I/O requirements for providing online access to large datasets dictate the need for declustering data across multiple disks, for capacity as well as bandwidth and response time performance. This requires partitioning larger files into a set of smaller files, and is accompanied by the concomitant requirement for managing large numbers of files. Storing these sub-files as BLOBs in a shared-nothing database implemented across a cluster provides the advantage that all the distributed storage management is done by the DBMS. Furthermore, subsetting and processing routines can be implemented as user-defined functions (UDFs) on these BLOBs and would run in parallel across the set of nodes in the cluster. On the other hand, there are both storage overheads and constraints, and software licensing dependencies created by such an implementation. Another approach is to store the files in an external filesystem with pointers to them from within database tables. The filesystem may be a regular UNIX filesystem, a parallel filesystem, or HDFS. In the HDFS case, HDFS would provide the file management capability, while the subsetting and processing routines would be implemented as Hadoop programs using the MapReduce model. Hadoop and its related software libraries are freely available. Another consideration is the strategy used for partitioning large data collections, and large datasets within collections, using round-robin, hash, or range partitioning methods. Each has different characteristics in terms of spatial locality of data and the resultant degree of declustering of the computations on the data. Furthermore, we have observed that, in practice, there can be large variations in the frequency of access to different parts of a large data collection and/or dataset, thereby creating "hotspots" in the data. We will evaluate the ability of different approaches to deal effectively with such hotspots, and alternative strategies for mitigating them.
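
    The three partitioning strategies mentioned above can be sketched in a few lines (a toy illustration, not the evaluated implementations; the node count and tile IDs are hypothetical):

        # Toy sketch of three declustering strategies for assigning tile IDs to storage nodes.
        import hashlib

        NODES = 8  # hypothetical cluster size

        def round_robin(tile_id: int) -> int:
            return tile_id % NODES

        def hash_partition(tile_id: int) -> int:
            digest = hashlib.md5(str(tile_id).encode()).hexdigest()
            return int(digest, 16) % NODES

        def range_partition(tile_id: int, max_tile: int = 1_000_000) -> int:
            width = (max_tile + NODES - 1) // NODES   # tiles per node for contiguous ranges
            return min(tile_id // width, NODES - 1)

        for tid in (0, 17, 123456, 999999):
            print(tid, round_robin(tid), hash_partition(tid), range_partition(tid))

    Range partitioning preserves spatial locality of neighboring tiles but is the most prone to hotspots; hash partitioning spreads hotspots across nodes at the cost of locality.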

  5. Data display and analysis with μView

    NASA Astrophysics Data System (ADS)

    Tucakov, Ivan; Cosman, Jacob; Brewer, Jess H.

    2006-03-01

    The μView utility is a new Java applet version of the old db program, extended to include direct access to MUD data files, from which it can construct a variety of spectrum types, including complex and RRF-transformed spectra. By using graphics features built into all modern Web browsers, it provides full graphical display capabilities consistently across all platforms. It has the full command-line functionality of db as well as a more intuitive graphical user interface and extensive documentation, and can read and write db, csv and XML format files.

  6. DATALINK: Records inventory data collection software. User`s guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, B.A.

    1995-03-01

    DATALINK was created to provide an easy-to-use data collection program for records management software products. It provides several useful tools for capturing and validating record index data in the field. It also allows users to easily create a comma delimited, ASCII text file for data export into most records management software products. It runs on virtually any computer using MS-DOS.

  7. Schema for Spacecraft-Command Dictionary

    NASA Technical Reports Server (NTRS)

    Laubach, Sharon; Garcia, Celina; Maxwell, Scott; Wright, Jesse

    2008-01-01

    An Extensible Markup Language (XML) schema was developed as a means of defining and describing a structure for capturing spacecraft command-definition and tracking information in a single location in a form readable by both engineers and software used to generate software for flight and ground systems. A structure defined within this schema is then used as the basis for creating an XML file that contains command definitions.
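
    As an illustration only (the actual schema is not reproduced in this record; the element and attribute names below are hypothetical), a command definition captured in XML could be read like this:

        # Parse a hypothetical command-definition snippet with the Python standard library.
        import xml.etree.ElementTree as ET

        sample = """
        <commandDictionary>
          <command stem="PWR_HEATER_ON" opcode="0x4A2">
            <argument name="heater_id" type="uint8"/>
            <argument name="duration"  type="uint16" units="seconds"/>
          </command>
        </commandDictionary>
        """

        root = ET.fromstring(sample)
        for cmd in root.findall("command"):
            args = [(a.get("name"), a.get("type")) for a in cmd.findall("argument")]
            print(cmd.get("stem"), cmd.get("opcode"), args)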

  8. VCF-Explorer: filtering and analysing whole genome VCF files.

    PubMed

    Akgün, Mete; Demirci, Hüseyin

    2017-11-01

    The decreasing cost of high-throughput technologies has led to a number of sequencing projects consisting of thousands of whole genomes. The paradigm shift from exome to whole genome brings a significant increase in the size of output files. Most of the existing tools, which were developed to analyse exome files, are not adequate for the larger VCF files produced by whole genome studies. In this work we present VCF-Explorer, a variant analysis software capable of handling large files. Memory efficiency and the avoidance of a computationally costly pre-processing step allow the analysis to be performed on ordinary computers. VCF-Explorer provides an easy to use environment where users can define various types of queries based on variant and sample genotype level annotations. VCF-Explorer can be run in different environments and computational platforms ranging from a standard laptop to a high performance server. VCF-Explorer is freely available at: http://vcfexplorer.sourceforge.net/. mete.akgun@tubitak.gov.tr. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
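
    The streaming, low-memory style of filtering described above can be sketched in plain Python (this is not VCF-Explorer's implementation; the file name and quality threshold are hypothetical):

        # Stream a (possibly gzipped) VCF file and keep records passing a simple QUAL filter.
        import gzip

        def filter_vcf(path: str, min_qual: float = 30.0):
            opener = gzip.open if path.endswith(".gz") else open
            with opener(path, "rt") as handle:
                for line in handle:
                    if line.startswith("#"):          # header lines pass through unchanged
                        yield line.rstrip("\n")
                        continue
                    fields = line.rstrip("\n").split("\t")
                    qual = fields[5]                  # VCF column 6 is QUAL
                    if qual != "." and float(qual) >= min_qual:
                        yield line.rstrip("\n")

        # for record in filter_vcf("sample.vcf.gz"):   # hypothetical input file
        #     print(record)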

  9. NetpathXL - An Excel Interface to the Program NETPATH

    USGS Publications Warehouse

    Parkhurst, David L.; Charlton, Scott R.

    2008-01-01

    NetpathXL is a revised version of NETPATH that runs under Windows® operating systems. NETPATH is a computer program that uses inverse geochemical modeling techniques to calculate net geochemical reactions that can account for changes in water composition between initial and final evolutionary waters in hydrologic systems. The inverse models also can account for the isotopic composition of waters and can be used to estimate radiocarbon ages of dissolved carbon in ground water. NETPATH relies on an auxiliary database program, DB, to enter the chemical analyses and to perform speciation calculations that define total concentrations of elements, charge balance, and redox state of aqueous solutions that are then used in inverse modeling. Instead of DB, NetpathXL relies on Microsoft Excel® to enter the chemical analyses. The speciation calculation formerly included in DB is implemented within the program NetpathXL. A program DBXL can be used to translate files from the old DB format (.lon files) to NetpathXL spreadsheets, or to create new NetpathXL spreadsheets. Once users have a NetpathXL spreadsheet with the proper format, new spreadsheets can be generated by copying or saving NetpathXL spreadsheets. In addition, DBXL can convert NetpathXL spreadsheets to PHREEQC input files. New capabilities in PHREEQC (version 2.15) allow solution compositions to be written to a .lon file, and inverse models developed in PHREEQC to be written as NetpathXL .pat and model files. NetpathXL can open NetpathXL spreadsheets, NETPATH-format path files (.pat files), and NetpathXL-format path files (.pat files). Once the speciation calculations have been performed on a spreadsheet file or a .pat file has been opened, the NetpathXL calculation engine is identical to the original NETPATH. Development of models and viewing of results in NetpathXL rely on keyboard entry as in NETPATH.

  10. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Pack, G.

    1994-01-01

    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraphs models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be saved as a library file which represents a generic digraph structure for a class of components. The Generate Model feature can then use library files to generate digraphs for every component listed in the modeling tables, and these individual digraph files can be used in a variety of ways to speed generation of complete digraph models. FEAT contains a preprocessor which performs transitive closure on the digraph. This multi-step algorithm builds a series of phantom bridges, or gates, that allow accurate bi-directional processing of digraphs. This preprocessing can be time-consuming, but once preprocessing is complete, queries can be answered and displayed within seconds. A UNIX X-Windows port of version 3.5 of FEAT, XFEAT, is also available to speed the processing of digraph models created on the Macintosh. FEAT v3.6, which is only available for the Macintosh, has some report generation capabilities which are not available in XFEAT. For very large integrated systems, FEAT can be a real cost saver in terms of design evaluation, training, and knowledge capture. The capability of loading multiple digraphs and schematics into FEAT allows modelers to build smaller, more focused digraphs. Typically, each digraph file will represent only a portion of a larger failure scenario. FEAT will combine these files and digraphs from other modelers to form a continuous mathematical model of the system's failure logic. 
Since multiple digraphs can be cumbersome to use, FEAT ties propagation results to schematic drawings produced using MacDraw II (v1.1v2 or later) or MacDraw Pro. This makes it easier to identify single and double point failures that may have to cross several system boundaries and multiple engineering disciplines before creating a hazardous condition. FEAT v3.6 for the Macintosh is written in C-language using Macintosh Programmer's Workshop C v3.2. It requires at least a Mac II series computer running System 7 or System 6.0.8 and 32 Bit QuickDraw. It also requires a math coprocessor or coprocessor emulator and a color monitor (or one with 256 gray scale capability). A minimum of 4Mb of free RAM is highly recommended. The UNIX version of FEAT includes both FEAT v3.6 for the Macintosh and XFEAT. XFEAT is written in C-language for Sun series workstations running SunOS, SGI workstations running IRIX, DECstations running ULTRIX, and Intergraph workstations running CLIX version 6. It requires the MIT X Window System, Version 11 Revision 4, with OSF/Motif 1.1.3, and 16Mb of RAM. The standard distribution medium for FEAT 3.6 (Macintosh version) is a set of three 3.5 inch Macintosh format diskettes. The standard distribution package for the UNIX version includes the three FEAT 3.6 Macintosh diskettes plus a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format which contains XFEAT. Alternate distribution media and formats for XFEAT are available upon request. FEAT has been under development since 1990. Both FEAT v3.6 for the Macintosh and XFEAT v3.5 were released in 1993.
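
    The failure-propagation idea behind FEAT's digraph processing can be illustrated with a small reachability sketch (this is a generic illustration, not FEAT's algorithm; the node names and edges are hypothetical):

        # Toy failure propagation on a digraph: which nodes are affected by a set of failures?
        from collections import deque

        edges = {                      # hypothetical "failure causes failure" digraph
            "power_bus": ["pump_A", "pump_B"],
            "pump_A": ["coolant_loop"],
            "pump_B": ["coolant_loop"],
            "coolant_loop": ["reactor_trip"],
        }

        def propagate(failures):
            seen, queue = set(failures), deque(failures)
            while queue:
                node = queue.popleft()
                for downstream in edges.get(node, []):
                    if downstream not in seen:
                        seen.add(downstream)
                        queue.append(downstream)
            return seen

        print(propagate({"power_bus"}))   # power_bus, pump_A, pump_B, coolant_loop, reactor_trip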

  11. Successful linking of the Society of Thoracic Surgeons Database to Social Security data to examine the accuracy of Society of Thoracic Surgeons mortality data.

    PubMed

    Jacobs, Jeffrey P; O'Brien, Sean M; Shahian, David M; Edwards, Fred H; Badhwar, Vinay; Dokholyan, Rachel S; Sanchez, Juan A; Morales, David L; Prager, Richard L; Wright, Cameron D; Puskas, John D; Gammie, James S; Haan, Constance K; George, Kristopher M; Sheng, Shubin; Peterson, Eric D; Shewan, Cynthia M; Han, Jane M; Bongiorno, Phillip A; Yohe, Courtney; Williams, William G; Mayer, John E; Grover, Frederick L

    2013-04-01

    The Society of Thoracic Surgeons Adult Cardiac Surgery Database has been linked to the Social Security Death Master File to verify "life status" and evaluate long-term surgical outcomes. The objective of this study is to explore practical applications of the linkage of the Society of Thoracic Surgeons Adult Cardiac Surgery Database to the Social Security Death Master File, including the use of the Social Security Death Master File to examine the accuracy of the Society of Thoracic Surgeons 30-day mortality data. On January 1, 2008, the Society of Thoracic Surgeons Adult Cardiac Surgery Database began collecting Social Security numbers in its new version 2.61. This study includes all Society of Thoracic Surgeons Adult Cardiac Surgery Database records for operations with nonmissing Social Security numbers between January 1, 2008, and December 31, 2010, inclusive. To match records between the Society of Thoracic Surgeons Adult Cardiac Surgery Database and the Social Security Death Master File, we used a combined probabilistic and deterministic matching rule with reported high sensitivity and nearly perfect specificity. Between January 1, 2008, and December 31, 2010, the Society of Thoracic Surgeons Adult Cardiac Surgery Database collected data for 870,406 operations. Social Security numbers were available for 541,953 operations and unavailable for 328,453 operations. According to the Society of Thoracic Surgeons Adult Cardiac Surgery Database, the 30-day mortality rate was 17,757/541,953 = 3.3%. Linkage to the Social Security Death Master File identified 16,565 cases of suspected 30-day deaths (3.1%). Of these, 14,983 were recorded as 30-day deaths in the Society of Thoracic Surgeons database (relative sensitivity = 90.4%). Relative sensitivity was 98.8% (12,863/13,014) for suspected 30-day deaths occurring before discharge and 59.7% (2120/3551) for suspected 30-day deaths occurring after discharge. Linkage to the Social Security Death Master File confirms the accuracy of data describing "mortality within 30 days of surgery" in the Society of Thoracic Surgeons Adult Cardiac Surgery Database. The Society of Thoracic Surgeons and Social Security Death Master File link reveals that capture of 30-day deaths occurring before discharge is highly accurate, and that these in-hospital deaths represent the majority (79% [13,014/16,565]) of all 30-day deaths. Capture of the remaining 30-day deaths occurring after discharge is less complete and needs improvement. Efforts continue to encourage Society of Thoracic Surgeons Database participants to submit Social Security numbers to the Database, thereby enhancing accurate determination of 30-day life status. The Society of Thoracic Surgeons and Social Security Death Master File linkage can facilitate ongoing refinement of mortality reporting. Copyright © 2013 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.

  12. Study on data acquisition system based on reconfigurable cache technology

    NASA Astrophysics Data System (ADS)

    Zhang, Qinchuan; Li, Min; Jiang, Jun

    2018-03-01

    The waveform capture rate is one of the key features of a digital acquisition system; it represents the waveform processing capability of the system per unit time. The higher the waveform capture rate, the greater the chance of capturing elusive events and the more reliable the test result. First, this paper analyzes the impact of several factors on the waveform capture rate of the system; then a novel technology based on a reconfigurable cache is proposed to optimize the system architecture. Simulation results show that the signal-to-noise ratio of the signal and the capacity and structure of the cache have significant effects on the waveform capture rate. Finally, the technology is demonstrated in engineering practice, and the results show that the waveform capture rate of the system is improved substantially without a significant increase in the system's cost; the proposed technology has broad application prospects.

  13. Efficient stereoscopic contents file format on the basis of ISO base media file format

    NASA Astrophysics Data System (ADS)

    Kim, Kyuheon; Lee, Jangwon; Suh, Doug Young; Park, Gwang Hoon

    2009-02-01

    A lot of 3D content has been widely used for multimedia services; however, real 3D video content has been adopted only for limited applications, such as specially designed 3D cinemas. This is because of the difficulty of capturing real 3D video content and the limitations of the display devices available on the market. Recently, however, diverse types of display devices for stereoscopic video content have been released. In particular, mobile phones with stereoscopic cameras have been released, which allow a user, as a consumer, to have more realistic experiences without glasses and, as a content creator, to take stereoscopic images or record stereoscopic video. However, a user can only store and display these acquired stereoscopic contents on his/her own devices due to the non-existence of a common file format for such content. This limitation prevents users from sharing their content with other users, which makes it difficult for the market for stereoscopic content to expand. Therefore, this paper proposes a common file format for stereoscopic contents on the basis of the ISO base media file format, which enables users to store and exchange pure stereoscopic contents. This technology is also currently under development as an international standard of MPEG, known as the stereoscopic video application format.

  14. Online Patent Searching: The Realities.

    ERIC Educational Resources Information Center

    Kaback, Stuart M.

    1983-01-01

    Considers patent subject searching capabilities of major online databases, noting patent claims, "deep-indexed" files, test searches, retrieval of related references, multi-database searching, improvements needed in indexing of chemical structures, full text searching, improvements needed in handling numerical data, and augmenting a…

  15. Information Systems: Fact or Fiction.

    ERIC Educational Resources Information Center

    Bearley, William

    Rising costs of programming and program maintenance have caused discussion concerning the need for generalized information systems. These would provide data base functions plus complete report writing and file maintenance capabilities. All administrative applications, including online registration, student records, and financial applications are…

  16. 77 FR 29435 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-17

    ... types to make benchmarking easier and more efficient. For the members that already have such... benchmarking capability to firms that currently lack it or lack an exchange-based alternative. NASDAQ further...

  17. 75 FR 81626 - Government-Owned Inventions; Availability for Licensing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-28

    ... of federally-funded research and development. Foreign patent applications are filed on selected... . Collaborative Research Opportunity: The National Institute on Aging, Cellular Biophysics Section, is seeking statements of capability or interest from parties interested in collaborative research to further develop...

  18. 75 FR 14168 - Government-Owned Inventions; Availability for Licensing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-24

    ... of federally-funded research and development. Foreign patent applications are filed on selected... . Collaborative Research Opportunity: The Immunotherapeutics Unit, National Institute on Aging, is seeking statements of capability or interest from parties interested in collaborative research to further develop...

  19. Visualizing 3D data obtained from microscopy on the Internet.

    PubMed

    Pittet, J J; Henn, C; Engel, A; Heymann, J B

    1999-01-01

    The Internet is a powerful communication medium increasingly exploited by business and science alike, especially in structural biology and bioinformatics. The traditional presentation of static two-dimensional images of real-world objects on the limited medium of paper can now be shown interactively in three dimensions. Many facets of this new capability have already been developed, particularly in the form of VRML (virtual reality modeling language), but there is a need to extend this capability for visualizing scientific data. Here we introduce a real-time isosurfacing node for VRML, based on the marching cube approach, allowing interactive isosurfacing. A second node does three-dimensional (3D) texture-based volume-rendering for a variety of representations. The use of computers in the microscopic and structural biosciences is extensive, and many scientific file formats exist. To overcome the problem of accessing such data from VRML and other tools, we implemented extensions to SGI's IFL (image format library). IFL is a file format abstraction layer defining communication between a program and a data file. These technologies are developed in support of the BioImage project, aiming to establish a database prototype for multidimensional microscopic data with the ability to view the data within a 3D interactive environment. Copyright 1999 Academic Press.
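
    A minimal isosurfacing sketch in the same spirit as the marching-cube node described above (not the VRML implementation from the record; the scikit-image library is an assumed dependency):

        # Extract an isosurface triangle mesh from a synthetic 3-D volume with marching cubes.
        import numpy as np
        from skimage import measure   # assumed third-party dependency

        # Synthetic volume: distance from the center, so level=10 gives a sphere of radius 10 voxels.
        z, y, x = np.mgrid[-16:16, -16:16, -16:16]
        volume = np.sqrt(x**2 + y**2 + z**2)

        verts, faces, normals, values = measure.marching_cubes(volume, level=10.0)
        print(verts.shape, faces.shape)   # vertex and triangle arrays, ready for display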

  20. CBP PHASE I CODE INTEGRATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, F.; Brown, K.; Flach, G.

    The goal of the Cementitious Barriers Partnership (CBP) is to develop a reasonable and credible set of software tools to predict the structural, hydraulic, and chemical performance of cement barriers used in nuclear applications over extended time frames (greater than 100 years for operating facilities and greater than 1000 years for waste management). The simulation tools will be used to evaluate and predict the behavior of cementitious barriers used in near surface engineered waste disposal systems including waste forms, containment structures, entombments, and environmental remediation. These cementitious materials are exposed to dynamic environmental conditions that cause changes in material properties via (i) aging, (ii) chloride attack, (iii) sulfate attack, (iv) carbonation, (v) oxidation, and (vi) primary constituent leaching. A set of state-of-the-art software tools has been selected as a starting point to capture these important aging and degradation phenomena. Integration of existing software developed by the CBP partner organizations was determined to be the quickest method of meeting the CBP goal of providing a computational tool that improves the prediction of the long-term behavior of cementitious materials. These partner codes were selected based on their maturity and ability to address the problems outlined above. The GoldSim Monte Carlo simulation program (GTG 2010a, GTG 2010b) was chosen as the code integration platform (Brown & Flach 2009b). GoldSim (current Version 10.5) is a Windows based graphical object-oriented computer program that provides a flexible environment for model development (Brown & Flach 2009b). The linking of GoldSim to external codes has previously been successfully demonstrated (Eary 2007, Mattie et al. 2007). GoldSim is capable of performing deterministic and probabilistic simulations and of modeling radioactive decay and constituent transport. As part of the CBP project, a general Dynamic Link Library (DLL) interface was developed to link GoldSim with external codes (Smith III et al. 2010). The DLL uses a list of code inputs provided by GoldSim to create an input file for the external application, runs the external code, and returns a list of outputs (read from files created by the external application) back to GoldSim. In this way GoldSim provides: (1) a unified user interface to the applications, (2) the capability of coupling selected codes in a synergistic manner, and (3) the capability of performing probabilistic uncertainty analysis with the codes. GoldSim is made available by the GoldSim Technology Group as a free 'Player' version that allows running but not editing GoldSim models. The player version makes the software readily available to a wider community of users that would wish to use the CBP application but do not have a license for GoldSim.
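
    The write-inputs / run-external-code / read-outputs pattern that the DLL interface implements can be sketched in a few lines (an illustration only, not the CBP DLL; the executable name and file formats are hypothetical):

        # Sketch of the wrapper pattern: pass inputs to an external code and collect its outputs.
        import subprocess
        from pathlib import Path

        def run_external(inputs: dict, workdir: str = ".") -> dict:
            work = Path(workdir)
            # 1. Write an input file in whatever format the external code expects.
            (work / "model.inp").write_text(
                "\n".join(f"{key} = {value}" for key, value in inputs.items())
            )
            # 2. Run the external application (hypothetical executable name).
            subprocess.run(["external_code", "model.inp"], cwd=work, check=True)
            # 3. Read the outputs the code wrote and return them to the caller.
            outputs = {}
            for line in (work / "model.out").read_text().splitlines():
                key, _, value = line.partition("=")
                outputs[key.strip()] = float(value)
            return outputs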

  1. File-Based One-Way BISON Coupling Through VERA: User's Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stimpson, Shane G.

    Activities to incorporate fuel performance capabilities into the Virtual Environment for Reactor Applications (VERA) are receiving increasing attention [1–6]. The multiphysics emphasis is expanding as the neutronics (MPACT) and thermal-hydraulics (CTF) packages are becoming more mature. Capturing the finer details of fuel phenomena (swelling, densification, relocation, gap closure, etc.) is the natural next step in the VERA development process since these phenomena are currently not directly taken into account. While several codes could be used to accomplish this, the BISON fuel performance code [8,9] being developed by the Idaho National Laboratory (INL) is the focus of ongoing work in the Consortium for Advanced Simulation of Light Water Reactors (CASL). Built on INL's MOOSE framework [10], BISON uses the finite element method for geometric representation and a Jacobian-free Newton-Krylov (JFNK) scheme to solve systems of partial differential equations for various fuel characteristic relationships. There are several modes of operation in BISON, but this work uses a 2D azimuthally symmetric (R-Z) smeared-pellet model. This manual is intended to cover (1) the procedure pertaining to the standalone BISON one-way coupling from VERA and (2) the procedure to generate BISON fuel temperature tables that VERA can use.

  2. Development of Software to Model AXAF-I Image Quality

    NASA Technical Reports Server (NTRS)

    Geary, Joseph; Hawkins, Lamar; Ahmad, Anees; Gong, Qian

    1997-01-01

    This report describes work conducted on Delivery Order 181 between October 1996 through June 1997. During this period software was written to: compute axial PSD's from RDOS AXAF-I mirror surface maps; plot axial surface errors and compute PSD's from HDOS "Big 8" axial scans; plot PSD's from FITS format PSD files; plot band-limited RMS vs axial and azimuthal position for multiple PSD files; combine and organize PSD's from multiple mirror surface measurements formatted as input to GRAZTRACE; modify GRAZTRACE to read FITS formatted PSD files; evaluate AXAF-I test results; improve and expand the capabilities of the GT x-ray mirror analysis package. During this period work began on a more user-friendly manual for the GT program, and improvements were made to the on-line help manual.
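
    The PSD computation mentioned above can be sketched as follows (a minimal illustration, not the delivered software; a uniformly sampled synthetic profile is assumed):

        # One-sided power spectral density of a 1-D surface profile via the FFT.
        import numpy as np

        def axial_psd(profile, dx):
            """profile: surface heights sampled every dx along the axial direction."""
            n = len(profile)
            detrended = profile - np.mean(profile)
            spectrum = np.fft.rfft(detrended)
            freqs = np.fft.rfftfreq(n, d=dx)
            psd = (np.abs(spectrum) ** 2) * dx / n      # simple periodogram normalization
            return freqs, psd

        x = np.arange(0, 0.1, 1e-4)                      # 10 cm scan, 0.1 mm sample spacing
        heights = 1e-6 * np.sin(2 * np.pi * 500 * x)     # synthetic ripple at 500 cycles/m
        f, p = axial_psd(heights, dx=1e-4)
        print(f[np.argmax(p)])                           # peak near 500 cycles/m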

  3. Evaluating standard terminologies for encoding allergy information.

    PubMed

    Goss, Foster R; Zhou, Li; Plasek, Joseph M; Broverman, Carol; Robinson, George; Middleton, Blackford; Rocha, Roberto A

    2013-01-01

    Allergy documentation and exchange are vital to ensuring patient safety. This study aims to analyze and compare various existing standard terminologies for representing allergy information. Five terminologies were identified, including the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT), National Drug File-Reference Terminology (NDF-RT), Medical Dictionary for Regulatory Activities (MedDRA), Unique Ingredient Identifier (UNII), and RxNorm. A qualitative analysis was conducted to compare desirable characteristics of each terminology, including content coverage, concept orientation, formal definitions, multiple granularities, vocabulary structure, subset capability, and maintainability. A quantitative analysis was also performed to compare the content coverage of each terminology for (1) common food, drug, and environmental allergens and (2) descriptive concepts for common drug allergies, adverse reactions (AR), and no known allergies. Our qualitative results show that SNOMED CT fulfilled the greatest number of desirable characteristics, followed by NDF-RT, RxNorm, UNII, and MedDRA. Our quantitative results demonstrate that RxNorm had the highest concept coverage for representing drug allergens, followed by UNII, SNOMED CT, NDF-RT, and MedDRA. For food and environmental allergens, UNII demonstrated the highest concept coverage, followed by SNOMED CT. For representing descriptive allergy concepts and adverse reactions, SNOMED CT and NDF-RT showed the highest coverage. Only SNOMED CT was capable of representing unique concepts for encoding no known allergies. The proper terminology for encoding a patient's allergy is complex, as multiple elements need to be captured to form a fully structured clinical finding. Our results suggest that while gaps still exist, a combination of SNOMED CT and RxNorm can satisfy most criteria for encoding common allergies and provide sufficient content coverage.

  4. Relative navigation requirements for automatic rendezvous and capture systems

    NASA Technical Reports Server (NTRS)

    Kachmar, Peter M.; Polutchko, Robert J.; Chu, William; Montez, Moises

    1991-01-01

    This paper will discuss in detail the relative navigation system requirements and sensor trade-offs for Automatic Rendezvous and Capture. Rendezvous navigation filter development will be discussed in the context of navigation performance requirements for a 'Phase One' AR&C system capability. Navigation system architectures and the resulting relative navigation performance for both cooperative and uncooperative target vehicles will be assessed. Relative navigation performance using rendezvous radar, star tracker, radiometric, laser and GPS navigation sensors during appropriate phases of the trajectory will be presented. The effect of relative navigation performance on the Integrated AR&C system performance will be addressed. Linear covariance and deterministic simulation results will be used. Evaluation of relative navigation and IGN&C system performance for several representative relative approach profiles will be presented in order to demonstrate the full range of system capabilities. A summary of the sensor requirements and recommendations for AR&C system capabilities for several programs requiring AR&C will be presented.

  5. Airborne Cloud Computing Environment (ACCE)

    NASA Technical Reports Server (NTRS)

    Hardman, Sean; Freeborn, Dana; Crichton, Dan; Law, Emily; Kay-Im, Liz

    2011-01-01

    Airborne Cloud Computing Environment (ACCE) is JPL's internal investment to improve the return on airborne missions by improving the development performance of the data system and the return on the captured science data. The investment is to develop a common science data system capability for airborne instruments that encompasses the end-to-end lifecycle, covering planning, provisioning of data system capabilities, and support for scientific analysis, in order to improve quality and cost effectiveness and to enable new scientific discovery and research in Earth observation.

  6. Digital PIV (DPIV) Software Analysis System

    NASA Technical Reports Server (NTRS)

    Blackshire, James L.

    1997-01-01

    A software package was developed to provide a Digital PIV (DPIV) capability for NASA LaRC. The system provides an automated image capture, test correlation, and autocorrelation analysis capability for the Kodak Megaplus 1.4 digital camera system for PIV measurements. The package includes three separate programs that, when used together with the PIV data validation algorithm, constitutes a complete DPIV analysis capability. The programs are run on an IBM PC/AT host computer running either Microsoft Windows 3.1 or Windows 95 using a 'quickwin' format that allows simple user interface and output capabilities to the windows environment.
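
    The core correlation step of a PIV analysis can be sketched as follows (an illustration of the general technique, not the NASA LaRC package; two small synthetic interrogation windows are assumed):

        # Estimate particle displacement between two interrogation windows by FFT cross-correlation.
        import numpy as np

        def displacement(window_a, window_b):
            a = window_a - window_a.mean()
            b = window_b - window_b.mean()
            corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
            peak = np.unravel_index(np.argmax(corr), corr.shape)
            # Unwrap circular shifts larger than half the window size.
            shift = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
            return shift   # (dy, dx) in pixels

        # Synthetic test: shift a random particle image by (3, 5) pixels.
        rng = np.random.default_rng(0)
        img = rng.random((32, 32))
        shifted = np.roll(img, shift=(3, 5), axis=(0, 1))
        print(displacement(img, shifted))   # -> [3, 5]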

  7. BigWig and BigBed: enabling browsing of large distributed datasets.

    PubMed

    Kent, W J; Zweig, A S; Barber, G; Hinrichs, A S; Karolchik, D

    2010-09-01

    BigWig and BigBed files are compressed binary indexed files containing data at several resolutions that allow the high-performance display of next-generation sequencing experiment results in the UCSC Genome Browser. The visualization is implemented using a multi-layered software approach that takes advantage of specific capabilities of web-based protocols and Linux and UNIX operating systems files, R trees and various indexing and compression tricks. As a result, only the data needed to support the current browser view is transmitted rather than the entire file, enabling fast remote access to large distributed data sets. Binaries for the BigWig and BigBed creation and parsing utilities may be downloaded at http://hgdownload.cse.ucsc.edu/admin/exe/linux.x86_64/. Source code for the creation and visualization software is freely available for non-commercial use at http://hgdownload.cse.ucsc.edu/admin/jksrc.zip, implemented in C and supported on Linux. The UCSC Genome Browser is available at http://genome.ucsc.edu.
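
    The "fetch only the region you need" behaviour can also be reproduced from a script; a minimal sketch (not part of the Genome Browser code; the pyBigWig package and the file path are assumptions):

        # Read summary values for one region of a BigWig file without reading the whole file.
        import pyBigWig   # assumed third-party dependency

        bw = pyBigWig.open("example.bw")          # hypothetical local path or http:// URL
        print(bw.chroms())                        # available chromosomes and their lengths
        print(bw.stats("chr1", 0, 1_000_000, type="mean", nBins=10))  # binned mean coverage
        bw.close()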

  8. Perl-speaks-NONMEM (PsN)--a Perl module for NONMEM related programming.

    PubMed

    Lindbom, Lars; Ribbing, Jakob; Jonsson, E Niclas

    2004-08-01

    The NONMEM program is the most widely used nonlinear regression software in population pharmacokinetic/pharmacodynamic (PK/PD) analyses. In this article we describe a programming library, Perl-speaks-NONMEM (PsN), intended for programmers that aim at using the computational capability of NONMEM in external applications. The library is object oriented and written in the programming language Perl. The classes of the library are built around NONMEM's data, model and output files. The specification of the NONMEM model is easily set or changed through the model and data file classes while the output from a model fit is accessed through the output file class. The classes have methods that help the programmer perform common repetitive tasks, e.g. summarising the output from a NONMEM run, setting the initial estimates of a model based on a previous run or truncating values over a certain threshold in the data file. PsN creates a basis for the development of high-level software using NONMEM as the regression tool.

  9. The Use Of Videography For Three-Dimensional Motion Analysis

    NASA Astrophysics Data System (ADS)

    Hawkins, D. A.; Hawthorne, D. L.; DeLozier, G. S.; Campbell, K. R.; Grabiner, M. D.

    1988-02-01

    Special video path editing capabilities, with custom hardware and software, have been developed for use in conjunction with existing video acquisition hardware and firmware. This system has simplified the task of quantifying the kinematics of human movement. A set of retro-reflective markers are secured to a subject performing a given task (i.e., walking, throwing, swinging a golf club, etc.). Multiple cameras, a video processor, and a computer workstation collect video data while the task is performed. Software has been developed to edit video files, create centroid data, and identify marker paths. Multi-camera path files are combined to form a 3D path file using the DLT method of cinematography. A separate program converts the 3D path file into kinematic data by creating a set of local coordinate axes and performing a series of coordinate transformations from one local system to the next. The kinematic data are then displayed for appropriate review and/or comparison.
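
    The reconstruction step of the DLT method can be sketched as a small least-squares problem (illustrative only; the 11 DLT coefficients per camera are assumed to come from a prior calibration):

        # Reconstruct a 3-D point from two or more camera views using 11-parameter DLT coefficients.
        import numpy as np

        def dlt_reconstruct(dlt_coeffs, image_points):
            """dlt_coeffs: list of length-11 arrays, one per camera; image_points: list of (u, v)."""
            rows, rhs = [], []
            for L, (u, v) in zip(dlt_coeffs, image_points):
                # Standard DLT projection rearranged into two linear equations in (X, Y, Z).
                rows.append([L[0] - u * L[8], L[1] - u * L[9], L[2] - u * L[10]])
                rhs.append(u - L[3])
                rows.append([L[4] - v * L[8], L[5] - v * L[9], L[6] - v * L[10]])
                rhs.append(v - L[7])
            xyz, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(rhs), rcond=None)
            return xyz   # least-squares estimate of (X, Y, Z)

    With two cameras this gives four equations in three unknowns, so the marker position is an overdetermined least-squares solution; additional views simply add rows.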

  10. MarFS, a Near-POSIX Interface to Cloud Objects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Inman, Jeffrey Thornton; Vining, William Flynn; Ransom, Garrett Wilson

    The engineering forces driving development of “cloud” storage have produced resilient, cost-effective storage systems that can scale to 100s of petabytes, with good parallel access and bandwidth. These features would make a good match for the vast storage needs of High-Performance Computing datacenters, but cloud storage gains some of its capability from its use of HTTP-style Representational State Transfer (REST) semantics, whereas most large datacenters have legacy applications that rely on POSIX file-system semantics. MarFS is an open-source project at Los Alamos National Laboratory that allows us to present cloud-style object-storage as a scalable near-POSIX file system. We have also developed a new storage architecture to improve bandwidth and scalability beyond what’s available in commodity object stores, while retaining their resilience and economy. Additionally, we present a scheme for scaling the POSIX interface to allow billions of files in a single directory and trillions of files in total.

  11. MarFS, a Near-POSIX Interface to Cloud Objects

    DOE PAGES

    Inman, Jeffrey Thornton; Vining, William Flynn; Ransom, Garrett Wilson; ...

    2017-01-01

    The engineering forces driving development of “cloud” storage have produced resilient, cost-effective storage systems that can scale to 100s of petabytes, with good parallel access and bandwidth. These features would make a good match for the vast storage needs of High-Performance Computing datacenters, but cloud storage gains some of its capability from its use of HTTP-style Representational State Transfer (REST) semantics, whereas most large datacenters have legacy applications that rely on POSIX file-system semantics. MarFS is an open-source project at Los Alamos National Laboratory that allows us to present cloud-style object-storage as a scalable near-POSIX file system. We have also developed a new storage architecture to improve bandwidth and scalability beyond what’s available in commodity object stores, while retaining their resilience and economy. Additionally, we present a scheme for scaling the POSIX interface to allow billions of files in a single directory and trillions of files in total.

  12. OMERO and Bio-Formats 5: flexible access to large bioimaging datasets at scale

    NASA Astrophysics Data System (ADS)

    Moore, Josh; Linkert, Melissa; Blackburn, Colin; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gillen, Kenneth; Leigh, Roger; Li, Simon; Lindner, Dominik; Moore, William J.; Patterson, Andrew J.; Pindelski, Blazej; Ramalingam, Balaji; Rozbicki, Emil; Tarkowska, Aleksandra; Walczysko, Petr; Allan, Chris; Burel, Jean-Marie; Swedlow, Jason

    2015-03-01

    The Open Microscopy Environment (OME) has built and released Bio-Formats, a Java-based proprietary file format conversion tool, and OMERO, an enterprise data management platform, under open source licenses. In this report, we describe new versions of Bio-Formats and OMERO that are specifically designed to support large, multi-gigabyte or terabyte scale datasets that are routinely collected across most domains of biological and biomedical research. Bio-Formats reads image data directly from native proprietary formats, bypassing the need for conversion into a standard format. It implements the concept of a file set, a container that defines the contents of multi-dimensional data comprised of many files. OMERO uses Bio-Formats to read files natively, and provides a flexible access mechanism that supports several different storage and access strategies. These new capabilities of OMERO and Bio-Formats make them especially useful in imaging applications like digital pathology, high content screening and light sheet microscopy that routinely create large datasets that must be managed and analyzed.

  13. Scanning microarray slides.

    PubMed

    Ares, Manuel

    2014-02-01

    Here we describe some practical concerns surrounding the scanning of microarray slides that have been hybridized with fluorescent dyes. We use a laser scanner that has two lasers, each set to excite a different fluor, and separate detectors to capture emission from each fluor. The laser passes over an address (position on the scanned surface) and the detectors capture photons emitted from each address. Two superimposed image files are written that carry intensities for each channel for each pixel of the image scan. These are the raw data. Image analysis software is used to identify and summarize the intensities of the pixels that make up each spot. After comparison to background pixels, the processed intensity levels representing the gene expression measurements are associated with the identity of each spot.

  14. The Use of an On-Board MV Imager for Plan Verification of Intensity Modulated Radiation Therapy and Volumetrically Modulated Arc Therapy

    NASA Astrophysics Data System (ADS)

    Walker, Justin A.

    The introduction of complex treatment modalities such as IMRT and VMAT has led to the development of many devices for plan verification. One such innovation in this field is the repurposing of the portal imager not only for tumor localization but also for recording dose distributions. Several advantages make portal imagers attractive options for this purpose. Very high spatial resolution allows for better verification of small field plans than may be possible with commercially available devices. Because the portal imager is attached to the gantry, setup is simpler than with any other method available, requiring no additional accessories, and often can be accomplished from outside the treatment room. Dose images captured by the portal imager are in digital format and make permanent records that can be analyzed immediately. Portal imaging suffers from a few limitations, however, that must be overcome. Images captured contain dose information, and a calibration must be maintained for image-to-dose conversion. Dose images can only be taken perpendicular to the treatment beam, allowing only for planar dose comparison. Planar dose files are themselves difficult to obtain for VMAT treatments, and an in-house script had to be developed to create such a file before analysis could be performed. Using the methods described in this study, excellent agreement between the planar dose files generated and the dose images taken was found. The average agreement for the IMRT fields analyzed was greater than 97% for non-normalized images at 3 mm and 3%. Comparable agreement was found for VMAT plans, with the average agreement being greater than 98%.
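
    The 3%/3 mm comparison quoted above is typically a gamma analysis; a simplified, brute-force sketch of a global gamma pass rate (a generic illustration, not the in-house script from the thesis; the grid spacing and criteria values are assumptions):

        # Simplified global gamma pass rate (3%/3 mm) between two planar dose grids of equal shape.
        import numpy as np

        def gamma_pass_rate(reference, evaluated, spacing_mm=1.0, dta_mm=3.0, dose_pct=3.0):
            dose_crit = dose_pct / 100.0 * reference.max()      # global dose criterion
            search = int(np.ceil(dta_mm / spacing_mm))          # half-width of the search window
            ny, nx = reference.shape
            passed, total = 0, 0
            for j in range(ny):
                for i in range(nx):
                    best = np.inf
                    for dj in range(-search, search + 1):
                        for di in range(-search, search + 1):
                            jj, ii = j + dj, i + di
                            if 0 <= jj < ny and 0 <= ii < nx:
                                dist2 = (dj * spacing_mm) ** 2 + (di * spacing_mm) ** 2
                                dd2 = (evaluated[jj, ii] - reference[j, i]) ** 2
                                best = min(best, dist2 / dta_mm**2 + dd2 / dose_crit**2)
                    passed += best <= 1.0
                    total += 1
            return passed / total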

  15. AASG State Geothermal Data Repository for the National Geothermal Data System.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-01-01

    This Drupal metadata and documents capture and management system is a repository, used for maintenance of metadata which describe resources contributed to the AASG State Geothermal Data System. The repository also provides an archive for files that are not hosted by the agency contributing the resource. Data from all 50 state geological surveys is represented here, and is contributed in turn to the National Geothermal Data System.

  16. Steganography -- The New Intelligence Threat

    DTIC Science & Technology

    2004-01-01

    Information can be embedded within text files, digital music and videos, and digital photographs by simply changing bits and bytes. HOW IT WORKS: ...International Airport could be embedded in Britney Spears' latest music release in MP3 format. The wide range of steganography capabilities has been
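
    The "changing bits and bytes" idea can be made concrete with a least-significant-bit sketch (a generic illustration, not a description of any capability cited in the report):

        # Hide and recover a short message in the least-significant bits of a byte buffer.
        def embed(carrier: bytearray, message: bytes) -> bytearray:
            bits = "".join(f"{b:08b}" for b in message)
            assert len(bits) <= len(carrier), "carrier too small"
            out = bytearray(carrier)
            for i, bit in enumerate(bits):
                out[i] = (out[i] & 0xFE) | int(bit)      # overwrite only the lowest bit
            return out

        def extract(carrier: bytes, n_bytes: int) -> bytes:
            bits = "".join(str(b & 1) for b in carrier[: n_bytes * 8])
            return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

        cover = bytearray(range(256)) * 4                # stand-in for image or audio samples
        stego = embed(cover, b"meet at dawn")
        print(extract(stego, len(b"meet at dawn")))      # -> b'meet at dawn'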

  17. The Changing Environment of Personal Information Systems.

    ERIC Educational Resources Information Center

    Burton, Hilary D.

    1985-01-01

    Discusses technological developments causing changes in personal information systems: increase in commercial support services; proliferation of microcomputers; capability to download from secondary services into private files; and developing desire to utilize functions such as electronic mail and automated office functions. Appendices list 21…

  18. A comparative analysis of exome capture.

    PubMed

    Parla, Jennifer S; Iossifov, Ivan; Grabill, Ian; Spector, Mona S; Kramer, Melissa; McCombie, W Richard

    2011-09-29

    Human exome resequencing using commercial target capture kits has been and is being used for sequencing large numbers of individuals to search for variants associated with various human diseases. We rigorously evaluated the capabilities of two solution exome capture kits. These analyses help clarify the strengths and limitations of those data as well as systematically identify variables that should be considered in the use of those data. Each exome kit performed well at capturing the targets they were designed to capture, which mainly corresponds to the consensus coding sequences (CCDS) annotations of the human genome. In addition, based on their respective targets, each capture kit coupled with high coverage Illumina sequencing produced highly accurate nucleotide calls. However, other databases, such as the Reference Sequence collection (RefSeq), define the exome more broadly, and so not surprisingly, the exome kits did not capture these additional regions. Commercial exome capture kits provide a very efficient way to sequence select areas of the genome at very high accuracy. Here we provide the data to help guide critical analyses of sequencing data derived from these products.

  19. Requirements for a network storage service

    NASA Technical Reports Server (NTRS)

    Kelly, Suzanne M.; Haynes, Rena A.

    1991-01-01

    Sandia National Laboratories provides a high performance classified computer network as a core capability in support of its mission of nuclear weapons design and engineering, physical sciences research, and energy research and development. The network, locally known as the Internal Secure Network (ISN), comprises multiple distributed local area networks (LAN's) residing in New Mexico and California. The TCP/IP protocol suite is used for inter-node communications. Scientific workstations and mid-range computers, running UNIX-based operating systems, compose most LAN's. One LAN, operated by the Sandia Corporate Computing Directorate, is a general purpose resource providing a supercomputer and a file server to the entire ISN. The current file server on the supercomputer LAN is an implementation of the Common File Server (CFS). Subsequent to the design of the ISN, Sandia reviewed its mass storage requirements and chose to enter into a competitive procurement to replace the existing file server with one more adaptable to a UNIX/TCP/IP environment. The requirements study for the network was the starting point for the requirements study for the new file server. The file server is called the Network Storage Service (NSS) and its requirements are described. An application or functional description of the NSS is given. The final section adds performance, capacity, and access constraints to the requirements.

  20. Text to Speech (TTS) Capabilities for the Common Driver Trainer (CDT)

    DTIC Science & Technology

    2010-10-01

    ...including Julie, Kate, and Paul. Based upon the names of the voices, it may be that the VoiceText capability is the technology being used currently on... DFTTSExportToFileEx(0, "Paul", 1, 1033, "Testing the Digital Future Text-to-Speech SDK.", -1, -1, -1, -1, -1, DFTTS_TEXT_TYPE_XML, "test.wav", 0, "", -1

  1. CALNPS: Computer Analysis Language Naval Postgraduate School Version

    DTIC Science & Technology

    1989-06-01

    The graphics capabilities were expanded to include hard copy options using the Plot-10 and DISSPLA graphics libraries. Display options... are now available and the user now has the capability to plot curves from data files from within the CALNPS domain. As CALNPS is a very large program

  2. IAC - INTEGRATED ANALYSIS CAPABILITY

    NASA Technical Reports Server (NTRS)

    Frisch, H. P.

    1994-01-01

    The objective of the Integrated Analysis Capability (IAC) system is to provide a highly effective, interactive analysis tool for the integrated design of large structures. With the goal of supporting the unique needs of engineering analysis groups concerned with interdisciplinary problems, IAC was developed to interface programs from the fields of structures, thermodynamics, controls, and system dynamics with an executive system and database to yield a highly efficient multi-disciplinary system. Special attention is given to user requirements such as data handling and on-line assistance with operational features, and the ability to add new modules of the user's choice at a future date. IAC contains an executive system, a data base, general utilities, interfaces to various engineering programs, and a framework for building interfaces to other programs. IAC has shown itself to be effective in automatic data transfer among analysis programs. IAC 2.5, designed to be compatible as far as possible with Level 1.5, contains a major upgrade in executive and database management system capabilities, and includes interfaces to enable thermal, structures, optics, and control interaction dynamics analysis. The IAC system architecture is modular in design. 1) The executive module contains an input command processor, an extensive data management system, and driver code to execute the application modules. 2) Technical modules provide standalone computational capability as well as support for various solution paths or coupled analyses. 3) Graphics and model generation interfaces are supplied for building and viewing models. Advanced graphics capabilities are provided within particular analysis modules such as INCA and NASTRAN. 4) Interface modules provide for the required data flow between IAC and other modules. 5) User modules can be arbitrary executable programs or JCL procedures with no pre-defined relationship to IAC. 6) Special purpose modules are included, such as MIMIC (Model Integration via Mesh Interpolation Coefficients), which transforms field values from one model to another; LINK, which simplifies incorporation of user specific modules into IAC modules; and DATAPAC, the National Bureau of Standards statistical analysis package. The IAC database contains structured files which provide a common basis for communication between modules and the executive system, and can contain unstructured files such as NASTRAN checkpoint files, DISCOS plot files, object code, etc. The user can define groups of data and relations between them. A full data manipulation and query system operates with the database. The current interface modules comprise five groups: 1) Structural analysis - IAC contains a NASTRAN interface for standalone analysis or certain structural/control/thermal combinations. IAC provides enhanced structural capabilities for normal modes and static deformation analysis via special DMAP sequences. IAC 2.5 contains several specialized interfaces from NASTRAN in support of multidisciplinary analysis. 2) Thermal analysis - IAC supports finite element and finite difference techniques for steady state or transient analysis. There are interfaces for the NASTRAN thermal analyzer, SINDA/SINFLO, and TRASYS II. FEMNET, which converts finite element structural analysis models to finite difference thermal analysis models, is also interfaced with the IAC database. 
3) System dynamics - The DISCOS simulation program which allows for either nonlinear time domain analysis or linear frequency domain analysis, is fully interfaced to the IAC database management capability. 4) Control analysis - Interfaces for the ORACLS, SAMSAN, NBOD2, and INCA programs allow a wide range of control system analyses and synthesis techniques. Level 2.5 includes EIGEN, which provides tools for large order system eigenanalysis, and BOPACE, which allows for geometric capabilities and finite element analysis with nonlinear material. Also included in IAC level 2.5 is SAMSAN 3.1, an engineering analysis program which contains a general purpose library of over 600 subroutines for numerical analysis. 5) Graphics - The graphics package IPLOT is included in IAC. IPLOT generates vector displays of tabular data in the form of curves, charts, correlation tables, etc. Either DI3000 or PLOT-10 graphics software is required for full graphic capability. In addition to these analysis tools, IAC 2.5 contains an IGES interface which allows the user to read arbitrary IGES files into an IAC database and to edit and output new IGES files. IAC is available by license for a period of 10 years to approved U.S. licensees. The licensed program product includes one set of supporting documentation. Additional copies may be purchased separately. IAC is written in FORTRAN 77 and has been implemented on a DEC VAX series computer operating under VMS. IAC can be executed by multiple concurrent users in batch or interactive mode. The program is structured to allow users to easily delete those program capabilities and "how to" examples they do not want in order to reduce the size of the package. The basic central memory requirement for IAC is approximately 750KB. The following programs are also available from COSMIC as separate packages: NASTRAN, SINDA/SINFLO, TRASYS II, DISCOS, ORACLS, SAMSAN, NBOD2, and INCA. The development of level 2.5 of IAC was completed in 1989.

  3. The impact of workplace factors on filing of workers' compensation claims among nursing home workers.

    PubMed

    Qin, Jin; Kurowski, Alicia; Gore, Rebecca; Punnett, Laura

    2014-01-29

    Injuries reported to the workers' compensation (WC) system are often used to estimate incidence of health outcomes and evaluate interventions in musculoskeletal epidemiology studies. However, WC claims represent a relatively small subset of all musculoskeletal disorders among employed individuals, and perhaps not a representative subset. This study determined the influence of workplace and individual factors on filing of workers' compensation claims by nursing home employees with back pain. Surveys were conducted in 18 skilled nursing facilities in four U.S. states. Self-administered questionnaires obtained information on demographic characteristics, working environment, and health behaviors/status. Employees who reported low back pain at least once in four questionnaire surveys were included. WC claims from the same facilities were obtained from the employer's workers' compensation insurer and matched by employee name. The dichotomous dependent variable was the filing of a back-related workers' compensation claim. Association with predictors of interest, including pain severity, physical job demand, job strain, social support, schedule control, and safety climate, was assessed using multivariate regression modeling. Individual characteristics were tested as potential confounders. Pain severity level was significantly associated with filing low-back related claims (odds ratio (OR) = 1.49, 95% CI = 1.18 - 1.87). Higher physical demands at work (OR = 1.07, 95% CI = 1.01 - 1.14) also increased the likelihood of claim filing. Higher job strain (OR = 0.83, 95% CI = 0.73 - 0.94), social support at work (OR = 0.90, 95% CI = 0.82 - 0.99), and education (OR = 0.79, 95% CI = 0.71 - 0.89) decreased the likelihood of claim filing. The results suggest that the WC system captured the most severe occupational injuries. Workplace factors had additional influence on workers' decision to file claims, after adjusting for low back pain severity. Education was correlated with workers' socioeconomic status; its influence on claim filing is difficult to interpret because of the possible mixed effects of working conditions, self-efficacy, and content knowledge.
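
    The analysis described above is a multivariable logistic regression whose exponentiated coefficients are reported as odds ratios with 95% confidence intervals. A minimal sketch of that kind of model, written against the statsmodels library with invented column and file names (this is not the study's actual code or data), might look like:

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Hypothetical survey data: one row per employee who reported low back pain.
      df = pd.read_csv("survey_with_claims.csv")  # assumed file name

      # Dichotomous outcome: did the employee file a back-related WC claim?
      model = smf.logit(
          "filed_claim ~ pain_severity + physical_demand + job_strain"
          " + social_support + education",
          data=df,
      ).fit()

      # Exponentiate coefficients and confidence limits to report odds ratios.
      odds_ratios = np.exp(model.params)
      conf_int = np.exp(model.conf_int())
      print(pd.concat([odds_ratios, conf_int], axis=1))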

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bower, J.C.; Burford, M.J.; Downing, T.R.

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool that is being developed under the direction of the US Army Nuclear and Chemical Agency (USANCA). The IBS Data Management Guide provides the background, as well as the operations and procedures needed to generate and maintain a site-specific map database. Data and system managers use this guide to manage the data files and database that support the administrative, user-environment, database management, and operational capabilities of the IBS. This document provides a description of the data files and structures necessary for running the IBS software and using the site map database.

  5. Cpp Utility - Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oppel, Fred, III; Rigdon, J. Brian

    2014-09-08

    A collection of general Umbra modules that are reused by other Umbra libraries. These capabilities include line segments, file utilities, color utilities, string utilities (for std::string), list utilities (for std::vector), bounding box intersections, range limiters, simple filters, cubic root solvers, and a few other utilities.

  6. Betty Petersen Memorial Library - NCWCP Publications - NWS

    Science.gov Websites

    Web-page listing of NCWCP publications held by the library, which serves the National Centers for Environmental Prediction (NCEP) and the NESDIS Center for Satellite Applications and Research (STAR); the visible fragment of the listing includes an entry titled "Justifying new Arctic Observation Capabilities" (.PDF file) and a 2013 entry (no. 474) by R. James Purser.

  7. GTEX: An expert system for diagnosing faults in satellite ground stations

    NASA Technical Reports Server (NTRS)

    Schlegelmilch, Richard F.; Durkin, John; Petrik, Edward J.

    1991-01-01

    A proof of concept expert system called Ground Terminal Expert (GTEX) was developed at The University of Akron in collaboration with NASA Lewis Research Center. The objective of GTEX is to aid in diagnosing data faults occurring with a digital ground terminal. This strategy can also be applied to the Very Small Aperture Terminal (VSAT) technology. An expert system which detects and diagnoses faults would enhance the performance of the VSAT by improving reliability and reducing maintenance time. GTEX is capable of detecting faults, isolating the cause and recommending appropriate actions. Isolation of faults is completed to board-level modules. A graphical user interface provides control and a medium where data can be requested and cryptic information logically displayed. Interaction with GTEX consists of user responses and input from data files. The use of data files provides a method of simulating dynamic interaction between the digital ground terminal and the expert system. GTEX as described is capable of both improving reliability and reducing the time required for necessary maintenance.

  8. MIRADS-2 Implementation Manual

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The Marshall Information Retrieval and Display System (MIRADS), a data base management system designed to provide the user with a set of generalized file capabilities, is presented. The system provides a wide variety of ways to process the contents of the data base and includes capabilities to search, sort, compute, update, and display the data. The process of creating, defining, and loading a data base is generally called the loading process. The steps in the loading process, which include (1) structuring, (2) creating, (3) defining, and (4) implementing the data base for use by MIRADS, are defined. The execution of several computer programs is required to successfully complete all steps of the loading process. The MIRADS Library must be established as a cataloged mass storage file as the first step in MIRADS implementation. The procedure for establishing the MIRADS Library is given. The system is currently operational for the UNIVAC 1108 computer system utilizing the Executive Operating System. All procedures relate to the use of MIRADS on the U-1108 computer.

  9. UPEML Version 2. 0: A machine-portable CDC Update emulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mehlhorn, T.A.; Young, M.F.

    1987-05-01

    UPEML is a machine-portable CDC Update emulation program. UPEML is written in ANSI standard Fortran-77 and is relatively simple and compact. It is capable of emulating a significant subset of the standard CDC Update functions, including program library creation and subsequent modification. Machine-portability is an essential attribute of UPEML. UPEML was written primarily to facilitate the use of CDC-based scientific packages on alternate computer systems such as the VAX 11/780 and the IBM 3081. UPEML has also been successfully used on the multiprocessor ELXSI, on CRAYs under both COS and CTSS operating systems, on APOLLO workstations, and on the HP-9000. Version 2.0 includes enhanced error checking, full ASCII character support, a program library audit capability, and a partial update option in which only selected or modified decks are written to the compile file. Further enhancements include checks for overlapping corrections, processing of nested calls to common decks, and reads and addfiles from alternate input files.

  10. GTEX: An expert system for diagnosing faults in satellite ground stations

    NASA Astrophysics Data System (ADS)

    Schlegelmilch, Richard F.; Durkin, John; Petrik, Edward J.

    1991-11-01

    A proof of concept expert system called Ground Terminal Expert (GTEX) was developed at The University of Akron in collaboration with NASA Lewis Research Center. The objective of GTEX is to aid in diagnosing data faults occurring with a digital ground terminal. This strategy can also be applied to the Very Small Aperture Terminal (VSAT) technology. An expert system which detects and diagnoses faults would enhance the performance of the VSAT by improving reliability and reducing maintenance time. GTEX is capable of detecting faults, isolating the cause and recommending appropriate actions. Isolation of faults is completed to board-level modules. A graphical user interface provides control and a medium where data can be requested and cryptic information logically displayed. Interaction with GTEX consists of user responses and input from data files. The use of data files provides a method of simulating dynamic interaction between the digital ground terminal and the expert system. GTEX as described is capable of both improving reliability and reducing the time required for necessary maintenance.

  11. Spacelab Data Processing Facility

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The capabilities of the Spacelab Data Processing Facility (SPDPF) are highlighted. The capturing, quality monitoring, processing, accounting, and forwarding of vital Spacelab data to various user facilities around the world are described.

  12. Historical Orthoimagery of the Lake Tahoe Basin

    USGS Publications Warehouse

    Soulard, Christopher E.; Raumann, Christian G.

    2008-01-01

    The U.S. Geological Survey (USGS) Western Geographic Science Center has developed a series of historical digital orthoimagery (HDO) datasets covering part or all of the Lake Tahoe Basin. Three datasets are available: (A) 1940 HDOs for the southern Lake Tahoe Basin, (B) 1969 HDOs for the entire Lake Tahoe Basin, and (C) 1987 HDOs for the southern Lake Tahoe Basin. The HDOs (for 1940, 1969, and 1987) were compiled photogrammetrically from aerial photography with varying scales, camera characteristics, image quality, and capture dates. The resulting datasets have a 1-meter horizontal resolution. Precision-corrected Ikonos multispectral satellite imagery was used as a substitute for HDOs/DOQs for the 2002 imagery date, but these data are not available for download in this series due to licensing restrictions. The projection of the HDO data is set to UTM Zone 10, NAD 1983. The data for each of the three available dates are clipped into files that spatially approximate the 3.75-minute USGS quarter quadrangles (roughly 3,000 to 4,000 hectares), and have roughly 100 pixels (or 100 meters) of overlap to facilitate combining the files into larger regions without data gaps. The files are named after 3.75-minute USGS quarter quadrangles that cover the same general spatial extent. These files are available in the ERDAS Imagine (.img) format.

  13. Metal- and Polymer-Matrix Composites: Functional Lightweight Materials for High-Performance Structures

    NASA Astrophysics Data System (ADS)

    Gupta, Nikhil; Paramsothy, Muralidharan

    2014-06-01

    The special topic "Metal- and Polymer-Matrix Composites" is intended to capture the state of the art in the research and practice of functional composites. The current set of articles related to metal-matrix composites includes reviews on functionalities such as self-healing, self-lubricating, and self-cleaning capabilities; research results on a variety of aluminum-matrix composites; and investigations on advanced composites manufacturing methods. In addition, the processing and properties of carbon nanotube-reinforced polymer-matrix composites and adhesive bonding of laminated composites are discussed. The literature on functional metal-matrix composites is relatively scarce compared to functional polymer-matrix composites. The demand for lightweight composites in the transportation sector is fueling the rapid development in this field, which is captured in the current set of articles. The possibility of simultaneously tailoring several desired properties is attractive but very challenging, and it requires significant advancements in the science and technology of composite materials. The progress captured in the current set of articles shows promise for developing materials that seem capable of moving this field from laboratory-scale prototypes to actual industrial applications.

  14. Capturing Change: Integrating Art and Science

    NASA Astrophysics Data System (ADS)

    Gillerman, J.

    2011-12-01

    The evolving capabilities of interactive media have broadened the potential, and the challenges, of sharing scientific knowledge. From video capture to mobile devices, new technologies have enabled artists to tackle previously demanding or out-of-reach topics and have opened new avenues for disseminating both art and science. These changes and capabilities affect not only the context and possibilities of scientific data collection, but also how information is presented and communicated innovatively to the public. When recording video of science material, whether it is of a ridley sea turtle laying eggs on a beach in Costa Rica, an active lava flow from the volcano Kilauea in Hawaii, or solar eclipses in remote locations around the world, one has to be prepared technically and artistically, and patient in specialized and/or challenging conditions, to capture video that satisfies the scientific and artistic imagination. This presentation will include material from varied natural phenomena, creative interfacing in a multimedia context integrating art, science, culture and technology to reach a broad and diverse public, and teaching the integration of art and science through varied art media. (http://www.vipervertex.com).

  15. Setting Up the JBrowse Genome Browser

    PubMed Central

    Skinner, Mitchell E; Holmes, Ian H

    2010-01-01

    JBrowse is a web-based tool for visualizing genomic data. Unlike most other web-based genome browsers, JBrowse exploits the capabilities of the user's web browser to make scrolling and zooming fast and smooth. It supports the browsers used by almost all internet users, and is relatively simple to install. JBrowse can utilize multiple types of data in a variety of common genomic data formats, including genomic feature data in bioperl databases, GFF files, and BED files, and quantitative data in wiggle files. This unit describes how to obtain the JBrowse software, set it up on a Linux or Mac OS X computer running as a web server and incorporate genome annotation data from multiple sources into JBrowse. After completing the protocols described in this unit, the reader will have a web site that other users can visit to browse the genomic data. PMID:21154710
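
    For readers unfamiliar with the annotation formats named above, the sketch below writes a minimal GFF3 feature file and a matching BED interval file of the kind JBrowse can display; the chromosome name, coordinates, and feature names are invented for illustration.

      # One gene feature in GFF3 (1-based, tab-delimited, 9 columns) and the
      # same interval in BED (0-based, half-open). All values are made up.
      gff_line = "\t".join([
          "chr1", "example", "gene", "1000", "9000",
          ".", "+", ".", "ID=gene0001;Name=ExampleGene",
      ])
      with open("features.gff3", "w") as fh:
          fh.write("##gff-version 3\n")
          fh.write(gff_line + "\n")

      bed_line = "\t".join(["chr1", "999", "9000", "ExampleGene", "0", "+"])
      with open("features.bed", "w") as fh:
          fh.write(bed_line + "\n")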

  16. New Powder Diffraction File (PDF-4) in relational database format: advantages and data-mining capabilities.

    PubMed

    Kabekkodu, Soorya N; Faber, John; Fawcett, Tim

    2002-06-01

    The International Centre for Diffraction Data (ICDD) is responding to the changing needs in powder diffraction and materials analysis by developing the Powder Diffraction File (PDF) in a very flexible relational database (RDB) format. The PDF now contains 136,895 powder diffraction patterns. In this paper, an attempt is made to give an overview of the PDF-4, search/match methods and the advantages of having the PDF-4 in RDB format. Some case studies have been carried out to search for crystallization trends, properties, frequencies of space groups and prototype structures. These studies give a good understanding of the basic structural aspects of classes of compounds present in the database. The present paper also reports data-mining techniques and demonstrates the power of a relational database over the traditional (flat-file) database structures.
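
    The kind of data mining described, such as tallying how often each space group occurs, maps naturally onto SQL queries once the patterns are held in relational tables rather than a flat file. The sketch below uses SQLite with an invented single-table schema purely to illustrate the idea; it is not the actual PDF-4 schema.

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute(
          "CREATE TABLE patterns ("
          " pdf_number TEXT, compound TEXT, space_group TEXT, density REAL)"
      )
      conn.executemany(
          "INSERT INTO patterns VALUES (?, ?, ?, ?)",
          [
              ("00-001-0001", "NaCl", "Fm-3m", 2.17),
              ("00-001-0002", "KCl", "Fm-3m", 1.99),
              ("00-001-0003", "TiO2", "P42/mnm", 4.23),
          ],
      )

      # Frequency of space groups across the (toy) file.
      for row in conn.execute(
          "SELECT space_group, COUNT(*) AS n FROM patterns"
          " GROUP BY space_group ORDER BY n DESC"
      ):
          print(row)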

  17. LACIE performance predictor final operational capability program description, volume 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The program EPHEMS computes the orbital parameters for up to two vehicles orbiting the earth for up to 549 days. The data represents a continuous swath about the earth, producing tables which can be used to determine when and if certain land segments will be covered. The program GRID processes NASA's climatology tape to obtain the weather indices along with associated latitudes and longitudes. The program LUMP takes substrata historical data and sample segment ID, crop window, crop window error and statistical data, checks for valid input parameters and generates the segment ID file, crop window file and the substrata historical file. Finally, the System Error Executive (SEE) Program checks YES error and truth data, CAMS error data, and signature extension data for validity and missing elements. A message is printed for each error found.

  18. Aerobraking Maneuver (ABM) Report Generator

    NASA Technical Reports Server (NTRS)

    Fisher, Forrest; Gladden, Roy; Khanampornpan, Teerapat

    2008-01-01

    abmREPORT Version 3.1 is a Perl script that extracts vital summarization information from the Mars Reconnaissance Orbiter (MRO) aerobraking ABM build process. This information facilitates sequence reviews, and provides a high-level summarization of the sequence for mission management. The script extracts information from the ENV, SSF, FRF, SCMFmax, and OPTG files and burn magnitude configuration files and presents them in a single, easy-to-check report that provides the majority of the parameters necessary for cross check and verification during the sequence review process. This means that needed information, formerly spread across a number of different files and each in a different format, is all available in this one application. This program is built on the capabilities developed in dragReport and then the scripts evolved as the two tools continued to be developed in parallel.

  19. X-MATE: a flexible system for mapping short read data

    PubMed Central

    Pearson, John V.; Cloonan, Nicole; Grimmond, Sean M.

    2011-01-01

    Summary: Accurate and complete mapping of short-read sequencing to a reference genome greatly enhances the discovery of biological results and improves statistical predictions. We recently presented RNA-MATE, a pipeline for the recursive mapping of RNA-Seq datasets. With the rapid increase in genome re-sequencing projects, progression of available mapping software and the evolution of file formats, we now present X-MATE, an updated version of RNA-MATE, capable of mapping both RNA-Seq and DNA datasets and with improved performance, output file formats, configuration files, and flexibility in core mapping software. Availability: Executables, source code, junction libraries, test data and results and the user manual are available from http://grimmond.imb.uq.edu.au/X-MATE/. Contact: n.cloonan@uq.edu.au; s.grimmond@uq.edu.au Supplementary information: Supplementary data are available at Bioinformatics Online. PMID:21216778

  20. Requirement Development Process and Tools

    NASA Technical Reports Server (NTRS)

    Bayt, Robert

    2017-01-01

    Requirements capture the system-level capabilities in a set of complete, necessary, clear, attainable, traceable, and verifiable statements of need. Requirements should not be unduly restrictive, but should set limits that eliminate items outside the boundaries drawn, encourage competition (or alternatives), and capture source and reason of requirement. If it is not needed by the customer, it is not a requirement. They establish the verification methods that will lead to product acceptance. These must be reproducible assessment methods.

  1. Variable-speed wind power system with improved energy capture via multilevel conversion

    DOEpatents

    Erickson, Robert W.; Al-Naseem, Osama A.; Fingersh, Lee Jay

    2005-05-31

    A system and method for efficiently capturing electrical energy from a variable-speed generator are disclosed. The system includes a matrix converter using full-bridge, multilevel switch cells, in which semiconductor devices are clamped to a known constant DC voltage of a capacitor. The multilevel matrix converter is capable of generating multilevel voltage waveforms of arbitrary magnitude and frequency. The matrix converter can be controlled by using space vector modulation.

  2. Automated rendezvous and capture development infrastructure

    NASA Technical Reports Server (NTRS)

    Bryan, Thomas C.; Roe, Fred; Coker, Cynthia

    1992-01-01

    The facilities at Marshall Space Flight Center and JSC to be utilized to develop and test an autonomous rendezvous and capture (ARC) system are described. This includes equipment and personnel facility capabilities to devise, develop, qualify, and integrate ARC elements and subsystems into flight programs. Attention is given to the use of a LEO test facility, the current concept and unique system elements of the ARC, and the options available to develop ARC technology.

  3. Context-dependent control of attention capture: Evidence from proportion congruent effects.

    PubMed

    Crump, Matthew J C; Milliken, Bruce; Leboe-McGowan, Jason; Leboe-McGowan, Launa; Gao, Xiaoqing

    2018-06-01

    There are several independent demonstrations that attentional phenomena can be controlled in a context-dependent manner by cues associated with differing attentional control demands. The present set of experiments provide converging evidence that attention-capture phenomena can be modulated in a context-dependent fashion. We determined whether methods from the proportion congruent literature (listwide and item- and context-specific proportion congruent designs) that are known to modulate distractor interference effects in Stroop and flanker tasks are capable of modulating attention capture by salient feature singletons. Across experiments we found evidence that attention capture can be modulated by listwide, item-specific, and context-specific manipulations of proportion congruent. We discuss challenges associated with interpreting results from proportion congruent studies but propose that our findings converge with existing work that has demonstrated context-dependent control of attention capture. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  4. 56Fe capture cross section experiments at the RPI LINAC Center

    NASA Astrophysics Data System (ADS)

    McDermott, Brian; Blain, Ezekiel; Thompson, Nicholas; Weltz, Adam; Youmans, Amanda; Danon, Yaron; Barry, Devin; Block, Robert; Daskalakis, Adam; Epping, Brian; Leinweber, Gregory; Rapp, Michael

    2017-09-01

    A new array of C6D6 detectors installed at the RPI LINAC Center has enabled the capability to measure neutron capture cross sections above the 847 keV inelastic scattering threshold of 56Fe through the use of digital post-processing filters and pulse-integral discriminators, without sacrificing the statistical quality of data at lower incident neutron energies where such filtering is unnecessary. The C6D6 detectors were used to perform time-of-flight capture cross section measurements on a sample of 99.87% enriched iron-56. The total-energy method, combined with the pulse height weighting technique, was then applied to the raw data to determine the energy-dependent capture yield. Above the inelastic threshold, the data were analyzed with a pulse-integral filter to reveal the capture signal, extending the full data set to 2 MeV.
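
    In a time-of-flight measurement such as this one, the incident neutron energy is recovered from the flight time over a known path length using the non-relativistic relation E = m(L/t)^2/2. A small sketch of that conversion follows; the flight-path length and timing values are invented for illustration and are not the RPI LINAC parameters.

      # Convert a neutron time of flight to kinetic energy (non-relativistic).
      NEUTRON_MASS_KG = 1.674927e-27
      EV_PER_JOULE = 1.0 / 1.602177e-19

      def tof_to_energy_ev(flight_path_m: float, tof_s: float) -> float:
          velocity = flight_path_m / tof_s                  # m/s
          energy_joule = 0.5 * NEUTRON_MASS_KG * velocity**2
          return energy_joule * EV_PER_JOULE

      # Example: a 25 m flight path and a 2.5 microsecond flight time.
      print(tof_to_energy_ev(25.0, 2.5e-6))  # roughly 5.2e5 eV (523 keV)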

  5. Ionic liquid-functionalized mesoporous sorbents and their use in the capture of polluting gases

    DOEpatents

    Lee, Jong Suk; Koros, William J.; Bhuwania, Nitesh; Hillesheim, Patrick C.; Dai, Sheng

    2016-01-12

    A composite structure for capturing a gaseous electrophilic species, the composite structure comprising mesoporous refractory sorbent particles on which an ionic liquid is covalently attached, wherein said ionic liquid includes an accessible functional group that is capable of binding to said gaseous electrophilic species. In particular embodiments, the mesoporous sorbent particles are contained within refractory hollow fibers. Also described is a method for capturing a gaseous electrophilic species by use of the above-described composite structure, wherein the gaseous electrophilic species is contacted with the composite structure. In particular embodiments thereof, cooling water is passed through the refractory hollow fibers containing the IL-functionalized sorbent particles in order to facilitate capture of the gaseous electrophilic species, and then steam is passed through the refractory hollow fibers to facilitate release of the gaseous electrophilic species such that the composite structure can be re-used to capture additional gas.

  6. The design, construction and implementation of a computerised trauma registry in a developing South African metropolitan trauma service.

    PubMed

    Laing, G L; Bruce, J L; Aldous, C; Clarke, D L

    2014-01-01

    The Pietermaritzburg Metropolitan Trauma Service formerly lacked a robust computerised trauma registry. This made surgical audit difficult for the purpose of quality of care improvement and development. We aimed to design, construct and implement a computerised trauma registry within our service. Twelve months following its implementation, we sought to examine and report on the quality of the registry. Formal ethical approval to maintain a computerised trauma registry was obtained prior to undertaking any design and development. Appropriate commercial software was sourced to develop this project. The registry was designed as a flat file. A flat file is a plain text or mixed text and binary file which usually contains one record per line or physical record. Thereafter the registry file was launched onto a secure server. This provided the benefits of access security and automated backups. Registry training was provided to clients by the developer. The exercise of data capture was then integrated into the process of service delivery, taking place at the endpoint of patient care (discharge, transfer or death). Twelve months following its implementation, the compliance rates of data entry were measured. The developer of this project managed to design, construct and implement an electronic trauma registry into the service. Twelve months following its implementation the data were extracted and audited to assess the quality. A total of 2640 patient entries were captured onto the registry. Compliance rates were in the order of eighty percent and client satisfaction rates were high. A number of deficits were identified. These included the omission of weekend discharges and underreporting of deaths. The construction and implementation of the computerised trauma registry was the beginning of an endeavour to continue improvements in the quality of care within our service. The registry provided a reliable audit at twelve months post implementation. Deficits and limitations were identified and new strategies have been planned to overcome these problems and integrate the trauma registry into the process of clinical care. Copyright © 2013 Elsevier Ltd. All rights reserved.
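
    As a concrete illustration of the flat-file layout described above (one record per line, written at the endpoint of care), the sketch below appends a delimited record per patient; the field names and values are invented and are not those of the Pietermaritzburg registry.

      import csv
      import os

      FIELDS = ["record_id", "admission_date", "mechanism", "outcome"]

      def append_record(path: str, record: dict) -> None:
          """Append one patient record as a single line of the flat file."""
          new_file = not os.path.exists(path) or os.path.getsize(path) == 0
          with open(path, "a", newline="") as fh:
              writer = csv.DictWriter(fh, fieldnames=FIELDS)
              if new_file:
                  writer.writeheader()   # header only when the file is empty
              writer.writerow(record)

      append_record("trauma_registry.csv", {
          "record_id": "2013-0001",
          "admission_date": "2013-01-05",
          "mechanism": "road traffic collision",
          "outcome": "discharged",
      })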

  7. Case-Based Capture and Reuse of Aerospace Design Rationale

    NASA Technical Reports Server (NTRS)

    Leake, David B.

    2001-01-01

    The goal of this project was to apply artificial intelligence techniques to facilitate capture and reuse of aerospace design rationale. The project combined case-based reasoning (CBR) and concept maps (CMaps) to develop methods for capturing, organizing, and interactively accessing records of experiences encapsulating the methods and rationale underlying expert aerospace design, in order to bring the captured knowledge to bear to support future reasoning. The project's results contribute both principles and methods for effective design-aiding systems that aid capture and access of useful design knowledge. The project has been guided by the tenets that design-aiding systems must: (1) Leverage a designer's knowledge, rather than attempting to replace it; (2) Be able to reflect different designers' differing conceptualizations of the design task, and to clarify those conceptualizations to others; (3) Include capabilities to capture information both by interactive knowledge modeling and during normal use; and (4) Integrate into normal designer tasks as naturally and unobtrusively as possible.

  8. A low-cost digital filing system for echocardiography data with MPEG4 compression and its application to remote diagnosis.

    PubMed

    Umeda, Akira; Iwata, Yasushi; Okada, Yasumasa; Shimada, Megumi; Baba, Akiyasu; Minatogawa, Yasuyuki; Yamada, Takayasu; Chino, Masao; Watanabe, Takafumi; Akaishi, Makoto

    2004-12-01

    The high cost of digital echocardiographs and the large size of data files hinder the adoption of remote diagnosis of digitized echocardiography data. We have developed a low-cost digital filing system for echocardiography data. In this system, data from a conventional analog echocardiograph are captured using a personal computer (PC) equipped with an analog-to-digital converter board. Motion picture data are promptly compressed using a Moving Picture Experts Group (MPEG) 4 codec. The digitized data, with preliminary reports obtained in a rural hospital, are then sent to cardiologists at distant urban general hospitals via the internet. The cardiologists can evaluate the data using widely available movie-viewing software (Windows Media Player). The diagnostic accuracy of this double-check system was confirmed by comparison with ordinary super-VHS videotapes. We have demonstrated that digitization of echocardiography data from a conventional analog echocardiograph and MPEG 4 compression can be performed using an ordinary PC-based system, and that this system enables highly efficient digital storage and remote diagnosis at low cost.
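
    The capture-and-compress step described above can be approximated today with commodity tools. The sketch below hands a captured AVI clip to the ffmpeg command-line tool (assumed to be installed) to produce an MPEG-4 compressed file; it only illustrates the general approach and is not the authors' PC-based system.

      import subprocess

      def compress_to_mpeg4(source_avi: str, output_mp4: str) -> None:
          """Compress a captured clip with an MPEG-4 video codec via ffmpeg."""
          subprocess.run(
              [
                  "ffmpeg", "-y",
                  "-i", source_avi,   # captured echocardiography clip
                  "-c:v", "mpeg4",    # MPEG-4 part 2 video codec
                  "-q:v", "5",        # quality-based rate control
                  output_mp4,
              ],
              check=True,
          )

      compress_to_mpeg4("study_clip.avi", "study_clip.mp4")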

  9. Collaborative Workspaces within Distributed Virtual Environments.

    DTIC Science & Technology

    1996-12-01

    ...such as a text document, a 3D model, or a captured image using a collaborative workspace called the InPerson Whiteboard. The Whiteboard contains a... commands for editing objects drawn on the screen. Finally, when the call is completed, the Whiteboard can be saved to a file for future use. IRIS Annotator... use, and a shared whiteboard that includes a number of multimedia annotation tools. Both systems are also mindful of bandwidth limitations and can

  10. NASA ARCH- A FILE ARCHIVAL SYSTEM FOR THE DEC VAX

    NASA Technical Reports Server (NTRS)

    Scott, P. J.

    1994-01-01

    The function of the NASA ARCH system is to provide a permanent storage area for files that are infrequently accessed. The NASA ARCH routines were designed to provide a simple mechanism by which users can easily store and retrieve files. The user treats NASA ARCH as the interface to a black box where files are stored. There are only five NASA ARCH user commands, even though NASA ARCH employs standard VMS directives and the VAX BACKUP utility. Special care is taken to provide the security needed to ensure file integrity over a period of years. The archived files may exist in any of three storage areas: a temporary buffer, the main buffer, and a magnetic tape library. When the main buffer fills up, it is transferred to permanent magnetic tape storage and deleted from disk. Files may be restored from any of the three storage areas. A single file, multiple files, or entire directories can be stored and retrieved. Archived entities hold the same name, extension, version number, and VMS file protection scheme as they had in the user's account prior to archival. NASA ARCH is capable of handling up to 7 directory levels. Wildcards are supported. User commands include TEMPCOPY, DISKCOPY, DELETE, RESTORE, and DIRECTORY. The DIRECTORY command searches a directory of savesets covering all three archival areas, listing matches according to area, date, filename, or other criteria supplied by the user. The system manager commands include 1) ARCHIVE- to transfer the main buffer to duplicate magnetic tapes, 2) REPORT- to determine when the main buffer is full enough to archive, 3) INCREMENT- to back up the partially filled main buffer, and 4) FULLBACKUP- to back up the entire main buffer. On-line help files are provided for all NASA ARCH commands. NASA ARCH is written in DEC VAX DCL for interactive execution and has been implemented on a DEC VAX computer operating under VMS 4.X. This program was developed in 1985.

  11. Fast in-situ tool inspection based on inverse fringe projection and compact sensor heads

    NASA Astrophysics Data System (ADS)

    Matthias, Steffen; Kästner, Markus; Reithmeier, Eduard

    2016-11-01

    Inspection of machine elements is an important task in production processes in order to ensure the quality of produced parts and to gather feedback for the continuous improvement process. A new measuring system is presented, which is capable of performing the inspection of critical tool geometries, such as gearing elements, inside the forming machine. To meet the constraints on sensor head size and inspection time imposed by the limited space inside the machine and the cycle time of the process, the measuring device employs a combination of endoscopy techniques with the fringe projection principle. Compact gradient index lenses enable a compact design of the sensor head, which is connected to a CMOS camera and a flexible micro-mirror based projector via flexible fiber bundles. Using common fringe projection patterns, the system achieves measuring times of less than five seconds. To further reduce the time required for inspection, the generation of inverse fringe projection patterns has been implemented for the system. Inverse fringe projection speeds up the inspection process by employing object-adapted patterns, which enable the detection of geometry deviations in a single image. Two different approaches to generate object adapted patterns are presented. The first approach uses a reference measurement of a manufactured tool master to generate the inverse pattern. The second approach is based on a virtual master geometry in the form of a CAD file and a ray-tracing model of the measuring system. Virtual modeling of the measuring device and inspection setup allows for geometric tolerancing for free-form surfaces by the tool designer in the CAD-file. A new approach is presented, which uses virtual tolerance specifications and additional simulation steps to enable fast checking of metric tolerances. Following the description of the pattern generation process, the image processing steps required for inspection are demonstrated on captures of gearing geometries.

  12. 77 FR 4765 - Marine Mammals; File No. 15142

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-31

    ... comment. Those individuals requesting a public hearing should submit a written request to the Chief... reasons why a hearing on this application would be appropriate. FOR FURTHER INFORMATION CONTACT: Amy Sloan...-01 (75 FR 58352) and will provide quantitative measurements of the amphibious hearing capabilities of...

  13. Capabilities and the Definition of Health: Comments on Venkatapuram.

    PubMed

    Richardson, Henry S

    2016-01-01

    Sridhar Venkatapuram's Health Justice argues that health is a 'metacapability', specifically the metacapability of having the ten 'central human capabilities' described by Martha Nussbaum. This cannot be right, as it provides no basis for distinguishing health from education, riches, or love. An amendment correcting this problem is suggested, namely that health is the involuntary, bodily aspect of the metacapability for the central capabilities. This amendment is defended against the objection that it fails to capture some important aspects of mental health. © 2016 John Wiley & Sons Ltd.

  14. A mobile app for securely capturing and transferring clinical images to the electronic health record: description and preliminary usability study.

    PubMed

    Landman, Adam; Emani, Srinivas; Carlile, Narath; Rosenthal, David I; Semakov, Simon; Pallin, Daniel J; Poon, Eric G

    2015-01-02

    Photographs are important tools to record, track, and communicate clinical findings. Mobile devices with high-resolution cameras are now ubiquitous, giving clinicians the opportunity to capture and share images from the bedside. However, secure and efficient ways to manage and share digital images are lacking. The aim of this study is to describe the implementation of a secure application for capturing and storing clinical images in the electronic health record (EHR), and to describe initial user experiences. We developed CliniCam, a secure Apple iOS (iPhone, iPad) application that allows for user authentication, patient selection, image capture, image annotation, and storage of images as a Portable Document Format (PDF) file in the EHR. We leveraged our organization's enterprise service-oriented architecture to transmit the image file from CliniCam to our enterprise clinical data repository. There is no permanent storage of protected health information on the mobile device. CliniCam also required connection to our organization's secure WiFi network. Resident physicians from emergency medicine, internal medicine, and dermatology used CliniCam in clinical practice for one month. They were then asked to complete a survey on their experience. We analyzed the survey results using descriptive statistics. Twenty-eight physicians participated and 19/28 (68%) completed the survey. Of the respondents who used CliniCam, 89% found it useful or very useful for clinical practice and easy to use, and wanted to continue using the app. Respondents provided constructive feedback on location of the photos in the EHR, preferring to have photos embedded in (or linked to) clinical notes instead of storing them as separate PDFs within the EHR. Some users experienced difficulty with WiFi connectivity which was addressed by enhancing CliniCam to check for connectivity on launch. CliniCam was implemented successfully and found to be easy to use and useful for clinical practice. CliniCam is now available to all clinical users in our hospital, providing a secure and efficient way to capture clinical images and to insert them into the EHR. Future clinical image apps should more closely link clinical images and clinical documentation and consider enabling secure transmission over public WiFi or cellular networks.
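
    One step in the workflow above, packaging a captured image as a PDF for storage, can be sketched with the Pillow imaging library; the file names are placeholders and this is not the CliniCam implementation, which also handles authentication and transmission to the EHR.

      from PIL import Image

      def image_to_pdf(image_path: str, pdf_path: str) -> None:
          """Wrap a single captured clinical image in a one-page PDF."""
          with Image.open(image_path) as img:
              # PDF pages cannot carry an alpha channel, so flatten to RGB.
              img.convert("RGB").save(pdf_path, format="PDF")

      image_to_pdf("wound_photo.jpg", "wound_photo.pdf")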

  15. ISA-TAB-Nano: a specification for sharing nanomaterial research data in spreadsheet-based format.

    PubMed

    Thomas, Dennis G; Gaheen, Sharon; Harper, Stacey L; Fritts, Martin; Klaessig, Fred; Hahn-Dantona, Elizabeth; Paik, David; Pan, Sue; Stafford, Grace A; Freund, Elaine T; Klemm, Juli D; Baker, Nathan A

    2013-01-14

    The high-throughput genomics communities have been successfully using standardized spreadsheet-based formats to capture and share data within labs and among public repositories. The nanomedicine community has yet to adopt similar standards to share the diverse and multi-dimensional types of data (including metadata) pertaining to the description and characterization of nanomaterials. Owing to the lack of standardization in representing and sharing nanomaterial data, most of the data currently shared via publications and data resources are incomplete, poorly-integrated, and not suitable for meaningful interpretation and re-use of the data. Specifically, in its current state, data cannot be effectively utilized for the development of predictive models that will inform the rational design of nanomaterials. We have developed a specification called ISA-TAB-Nano, which comprises four spreadsheet-based file formats for representing and integrating various types of nanomaterial data. Three file formats (Investigation, Study, and Assay files) have been adapted from the established ISA-TAB specification; while the Material file format was developed de novo to more readily describe the complexity of nanomaterials and associated small molecules. In this paper, we have discussed the main features of each file format and how to use them for sharing nanomaterial descriptions and assay metadata. The ISA-TAB-Nano file formats provide a general and flexible framework to record and integrate nanomaterial descriptions, assay data (metadata and endpoint measurements) and protocol information. Like ISA-TAB, ISA-TAB-Nano supports the use of ontology terms to promote standardized descriptions and to facilitate search and integration of the data. The ISA-TAB-Nano specification has been submitted as an ASTM work item to obtain community feedback and to provide a nanotechnology data-sharing standard for public development and adoption.
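
    Because the ISA-TAB-Nano files are plain tab-delimited spreadsheets, they can be produced and consumed with ordinary tooling. The fragment below writes a tiny tab-separated table with illustrative column headings to show the general shape of such a file; the headings are not the official ISA-TAB-Nano field names.

      import csv

      rows = [
          {"Material Name": "nanoparticle-001",
           "Material Type": "metal oxide",
           "Mean Diameter (nm)": "25"},
          {"Material Name": "nanoparticle-002",
           "Material Type": "polymer",
           "Mean Diameter (nm)": "80"},
      ]

      with open("material_file.txt", "w", newline="") as fh:
          writer = csv.DictWriter(fh, fieldnames=list(rows[0]), delimiter="\t")
          writer.writeheader()
          writer.writerows(rows)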

  16. ISA-TAB-Nano: A Specification for Sharing Nanomaterial Research Data in Spreadsheet-based Format

    PubMed Central

    2013-01-01

    Background and motivation The high-throughput genomics communities have been successfully using standardized spreadsheet-based formats to capture and share data within labs and among public repositories. The nanomedicine community has yet to adopt similar standards to share the diverse and multi-dimensional types of data (including metadata) pertaining to the description and characterization of nanomaterials. Owing to the lack of standardization in representing and sharing nanomaterial data, most of the data currently shared via publications and data resources are incomplete, poorly-integrated, and not suitable for meaningful interpretation and re-use of the data. Specifically, in its current state, data cannot be effectively utilized for the development of predictive models that will inform the rational design of nanomaterials. Results We have developed a specification called ISA-TAB-Nano, which comprises four spreadsheet-based file formats for representing and integrating various types of nanomaterial data. Three file formats (Investigation, Study, and Assay files) have been adapted from the established ISA-TAB specification; while the Material file format was developed de novo to more readily describe the complexity of nanomaterials and associated small molecules. In this paper, we have discussed the main features of each file format and how to use them for sharing nanomaterial descriptions and assay metadata. Conclusion The ISA-TAB-Nano file formats provide a general and flexible framework to record and integrate nanomaterial descriptions, assay data (metadata and endpoint measurements) and protocol information. Like ISA-TAB, ISA-TAB-Nano supports the use of ontology terms to promote standardized descriptions and to facilitate search and integration of the data. The ISA-TAB-Nano specification has been submitted as an ASTM work item to obtain community feedback and to provide a nanotechnology data-sharing standard for public development and adoption. PMID:23311978

  17. Requirements for a network storage service

    NASA Technical Reports Server (NTRS)

    Kelly, Suzanne M.; Haynes, Rena A.

    1992-01-01

    Sandia National Laboratories provides a high performance classified computer network as a core capability in support of its mission of nuclear weapons design and engineering, physical sciences research, and energy research and development. The network, locally known as the Internal Secure Network (ISN), was designed in 1989 and comprises multiple distributed local area networks (LAN's) residing in Albuquerque, New Mexico and Livermore, California. The TCP/IP protocol suite is used for inter-node communications. Scientific workstations and mid-range computers, running UNIX-based operating systems, compose most LAN's. One LAN, operated by the Sandia Corporate Computing Directorate, is a general purpose resource providing a supercomputer and a file server to the entire ISN. The current file server on the supercomputer LAN is an implementation of the Common File System (CFS) developed by Los Alamos National Laboratory. Subsequent to the design of the ISN, Sandia reviewed its mass storage requirements and chose to enter into a competitive procurement to replace the existing file server with one more adaptable to a UNIX/TCP/IP environment. The requirements study for the network was the starting point for the requirements study for the new file server. The file server is called the Network Storage Service (NSS), and its requirements are described in this paper. The next section gives an application or functional description of the NSS. The final section adds performance, capacity, and access constraints to the requirements.

  18. PHASS99: A software program for retrieving and decoding the radiometric ages of igneous rocks from the international database IGBADAT

    NASA Astrophysics Data System (ADS)

    Al-Mishwat, Ali T.

    2016-05-01

    PHASS99 is a FORTRAN program designed to retrieve and decode radiometric and other physical age information of igneous rocks contained in the international database IGBADAT (Igneous Base Data File). In the database, ages are stored in a proprietary format using mnemonic representations. The program can handle up to 99 ages in an igneous rock specimen and caters to forty radiometric age systems. The radiometric age alphanumeric strings assigned to each specimen description in the database consist of four components: the numeric age and its exponential modifier, a four-character mnemonic method identification, a two-character mnemonic name of analysed material, and the reference number in the rock group bibliography vector. For each specimen, the program searches for radiometric age strings, extracts them, parses them, decodes the different age components, and converts them to high-level English equivalents. IGBADAT and similarly-structured files are used for input. The output includes three files: a flat raw ASCII text file containing retrieved radiometric age information, a generic spreadsheet-compatible file for data import to spreadsheets, and an error file. PHASS99 builds on the old TSTPHA (Test Physical Age) decoder program and greatly expands its capabilities. PHASS99 is simple, user friendly, fast, efficient, and does not require users to have knowledge of programming.
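
    The abstract describes each stored age as a fixed set of fields: a numeric age with exponential modifier, a four-character method mnemonic, a two-character material mnemonic, and a bibliography reference number. The parsing idea can be sketched as below; the delimiter, field order, and mnemonic tables are invented for illustration, since the actual IGBADAT encoding is proprietary and not given in the abstract.

      # Hypothetical example string: "2.75E9 KAR6 WR 12"
      METHOD_NAMES = {"KAR6": "potassium-argon (variant 6)"}   # invented
      MATERIAL_NAMES = {"WR": "whole rock"}                    # invented

      def decode_age(record: str) -> dict:
          """Split one (hypothetical) age string into its four components."""
          age_text, method, material, ref = record.split()
          return {
              "age_years": float(age_text),
              "method": METHOD_NAMES.get(method, method),
              "material": MATERIAL_NAMES.get(material, material),
              "reference_index": int(ref),
          }

      print(decode_age("2.75E9 KAR6 WR 12"))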

  19. A PC-based bus monitor program for use with the transport systems research vehicle RS-232 communication interfaces

    NASA Technical Reports Server (NTRS)

    Easley, Wesley C.

    1991-01-01

    Experiment-critical use of RS-232 data busses in the Transport Systems Research Vehicle (TSRV) operated by the Advanced Transport Operating Systems Program Office at the NASA Langley Research Center has recently increased. Each application utilizes a number of nonidentical computer and peripheral configurations and requires task specific software development. To aid these development tasks, an IBM PC-based RS-232 bus monitoring system was produced. It can simultaneously monitor two communication ports of a PC or clone, including the nonstandard bus expansion of the TSRV Grid laptop computers. Display occurs in a separate window for each port's input, with binary display being selectable. A number of other features, including binary log files, screen capture to files, and a full range of communication parameters, are provided.
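
    A present-day equivalent of the monitoring loop described above, reading two serial ports and logging the raw bytes from each to its own file, could be sketched with the pyserial package; the device names, baud rate, and file names are assumptions, not the TSRV configuration.

      import serial  # pyserial

      PORTS = {"port_a": "/dev/ttyUSB0", "port_b": "/dev/ttyUSB1"}  # assumed

      def monitor(cycles: int = 1000) -> None:
          links = {name: serial.Serial(dev, 9600, timeout=0.1)
                   for name, dev in PORTS.items()}
          logs = {name: open(f"{name}.bin", "wb") for name in PORTS}
          try:
              for _ in range(cycles):
                  for name, link in links.items():
                      data = link.read(256)       # whatever arrived this cycle
                      if data:
                          logs[name].write(data)  # binary log file per port
          finally:
              for link in links.values():
                  link.close()
              for log in logs.values():
                  log.close()

      monitor()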

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    M. P. Jensen; Toto, T.

    Standard Atmospheric Radiation Measurement (ARM) Climate Research Facility sounding files provide atmospheric state data in one dimension of increasing time and height per sonde launch. Many applications require a quick estimate of the atmospheric state at higher time resolution. The INTERPOLATEDSONDE (i.e., Interpolated Sounding) Value-Added Product (VAP) transforms sounding data into continuous daily files on a fixed time-height grid, at 1-minute time resolution, on 332 levels, from the surface up to a limit of approximately 40 km. The grid extends that high so the full height of soundings can be captured; however, most soundings terminate at an altitude between 25 and 30 km, above which no data is provided. Between soundings, the VAP linearly interpolates atmospheric state variables in time for each height level. In addition, INTERPOLATEDSONDE provides relative humidity scaled to microwave radiometer (MWR) observations.
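
    The core of the VAP, linear interpolation of each atmospheric state variable in time at every level of a fixed height grid, can be sketched with NumPy as below. The 332-level, one-minute grid follows the description above, while the variable names and toy launch values are placeholders.

      import numpy as np

      def interpolate_soundings(launch_times, profiles, output_times):
          """Linearly interpolate sounding profiles onto a fixed time grid.

          launch_times : (n_sondes,) seconds since midnight, one per launch
          profiles     : (n_sondes, n_levels) values on the fixed height grid
          output_times : (n_times,) target grid, e.g. one value per minute
          Returns an array of shape (n_times, n_levels).
          """
          n_levels = profiles.shape[1]
          out = np.empty((output_times.size, n_levels))
          for k in range(n_levels):               # one height level at a time
              out[:, k] = np.interp(output_times, launch_times, profiles[:, k])
          return out

      # Toy example: two launches, 332 height levels, 1-minute output grid.
      times = np.array([0.0, 43200.0])            # launches at 00Z and 12Z
      profiles = np.vstack([np.full(332, 280.0), np.full(332, 290.0)])
      minute_grid = np.arange(0, 86400, 60, dtype=float)
      print(interpolate_soundings(times, profiles, minute_grid).shape)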

  1. My Beloved Blackboard: Teacher Empowerment for Students' Success.

    ERIC Educational Resources Information Center

    Caplan-Carbin, Elizabeth

    This paper describes a university German teacher's experience using the file transfer capabilities of the Blackboard Internet tool. The introduction highlights some of the features and advantages of Blackboard. The first section discusses teacher empowerment, noting that the Internet empowers the teacher by providing the wealth of the worlds…

  2. 47 CFR 2.1205 - Filing of required declaration.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... TREATY MATTERS; GENERAL RULES AND REGULATIONS Importation of Devices Capable of Causing Harmful... Customs has not been implemented, use FCC Form 740 to provide the needed information and declarations. Attach a copy of the completed FCC Form 740 to the Customs entry papers. (b)(1) For points of entry where...

  3. 47 CFR 2.1205 - Filing of required declaration.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... TREATY MATTERS; GENERAL RULES AND REGULATIONS Importation of Devices Capable of Causing Harmful... Customs has not been implemented, use FCC Form 740 to provide the needed information and declarations. Attach a copy of the completed FCC Form 740 to the Customs entry papers. (b)(1) For points of entry where...

  4. 47 CFR 2.1205 - Filing of required declaration.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... TREATY MATTERS; GENERAL RULES AND REGULATIONS Importation of Devices Capable of Causing Harmful... Customs has not been implemented, use FCC Form 740 to provide the needed information and declarations. Attach a copy of the completed FCC Form 740 to the Customs entry papers. (b)(1) For points of entry where...

  5. 47 CFR 2.1205 - Filing of required declaration.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... TREATY MATTERS; GENERAL RULES AND REGULATIONS Importation of Devices Capable of Causing Harmful... Customs has not been implemented, use FCC Form 740 to provide the needed information and declarations. Attach a copy of the completed FCC Form 740 to the Customs entry papers. (b)(1) For points of entry where...

  6. Library Circulation Systems -- An Overview.

    ERIC Educational Resources Information Center

    Surace, Cecily J.

    The model circulation system outlined is an on-line real time system in which the circulation file is created from the shelf list and the terminal inquiry system includes the capability to query and browse through the bibliographic system and the circulation subsystem together to determine the availability for circulation of specific documents, or…

  7. The Defense Message System and the U.S. Coast Guard

    DTIC Science & Technology

    1992-06-01

    these mail services, the Internet also provides File Transfer Protocol (FTP) and remote login between host computers (TELNET) capabilities. [Ref...the Joint Maritime Intelligence Element (JMIE), Zincdust, and Emerald. [Ref. 27] 4. Secure Data Network: The Coast Guard's Secure Data Network (SDN

  8. The Air Force Academy Instructor Workstation (IWS): I. Design and Implementation.

    ERIC Educational Resources Information Center

    Gist, Thomas E.; And Others

    1989-01-01

    Discusses the design and implementation of a computer-controlled instructor workstation (IWS), including a videodisc player, that was developed at the Air Force Academy. System capabilities for lesson presentation, administrative functions, an authoring system, and a file server for courseware maintenance are explained. (seven references) (LRW)

  9. New CD-ROM Technologies Help the Unemployed Search for Jobs.

    ERIC Educational Resources Information Center

    Fries, James R.; Dow, Ronald F.

    1992-01-01

    Describes the use of CD-ROM products containing company and industrial information for job searches and career planning. Examples of potential applications are provided, and search capabilities are examined. Brief descriptions of several products are presented, including a database of Securities and Exchange Commission filings, Disclosure, Lotus One…

  10. Creation of a Book Order Management System Using a Microcomputer and a DBMS.

    ERIC Educational Resources Information Center

    Neill, Charlotte; And Others

    1985-01-01

    Describes management decisions and resultant technology-based system that allowed a medical library to meet increasing workloads without accompanying increases in resources available. Discussion covers system analysis; capabilities of book-order management system, "BOOKDIRT;" software and training; hardware; data files; data entry;…

  11. Database Management Systems: New Homes for Migrating Bibliographic Records.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.; Bierbaum, Esther G.

    1987-01-01

    Assesses bibliographic databases as part of visionary text systems such as hypertext and scholars' workstations. Downloading is discussed in terms of the capability to search records and to maintain unique bibliographic descriptions, and relational database management systems, file managers, and text databases are reviewed as possible hosts for…

  12. VizieR Online Data Catalog: Berkeley 32 BVI photometry and spectroscopy (D'Orazi+, 2006)

    NASA Astrophysics Data System (ADS)

    D'Orazi, V.; Bragaglia, A.; Tosi, M.; Fabrizio, L. D.; Held, E. V.

    2010-05-01

    Our data were acquired at the Italian Telescopio Nazionale Galileo, on the Canary Islands, using DOLORES (device optimized for the low resolution), a focal reducer capable of imaging and low-resolution spectroscopy, on UT 2000 November 26 and 2004 February 14. (2 data files).

  13. 76 FR 76435 - Certain Devices With Secure Communication Capabilities, Components Thereof, and Products...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-07

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-818] Certain Devices With Secure... AGENCY: U.S. International Trade Commission. ACTION: Notice. SUMMARY: Notice is hereby given that a complaint was filed with the U.S. International Trade Commission on November 4, 2011, under section 337 of...

  14. Reporting Capabilities and Management of the DSN Energy Data Base

    NASA Technical Reports Server (NTRS)

    Hughes, R. D.; Boyd, S. T.

    1981-01-01

    The DSN Energy Data Base is a collection of computer files developed and maintained by DSN Engineering. The energy consumption data must be updated monthly and summarized and displayed in printed output as desired. The methods used to handle the data and perform these tasks are described.

  15. ROGER a potential orbital space debris removal system

    NASA Astrophysics Data System (ADS)

    Starke, Juergen; Bischof, Bernd; Foth, W.-O.; Günther, J.-J.

    The previous activities in the field of On-Orbit Servicing studied in the 1990s included in particular the capability of vehicles in GEO to capture and support satellites (mainly communication satellites) to enable repair and continuation of operations, and finally the controlled transfer of the target into a permanent graveyard orbit. The specific capture tools for these applications were mostly based on robotic systems designed to capture and fix the target under specific dynamic constraints (e.g., a slowly tumbling target) without damage, and to allow the stabilization, re-orientation and potential repair of the target and its subsequent release or transport to the final disposal orbit. Due to the drastically increasing number of debris objects, particularly in low Earth orbits (SSO), active debris removal is now necessary to counteract the predicted debris production cascade (Kessler Syndrome), which would pollute the entire low Earth orbit sphere and not only the SSO region. Most debris congresses have recommended starting removal with still-integrated systems as soon as possible. In the case of large debris objects, the soft capture system can be replaced by a simpler and robust system able to operate from a safe distance to the target and flexible enough to capture and hold different types of targets, such as deactivated and/or defective satellites, upper stages and big fragments. These nominally non-cooperative targets might be partially destroyed by the capture process, but the production of additional debris shall be avoided. A major argument for commercial applications is a multi-target mission potential, which is possible in GEO because the propellant required to transfer to the disposal orbit and return to the orbit of the next potential target is relatively low (orbits with similar inclination and altitude). The proposed ROGER system is designed as a spacecraft with rendezvous capabilities, including inspection in the vicinity of the target, and with stabilization and transportation features for the combined configuration. The capture system is a deployable and closable net. The net is ejected from the mother spacecraft at a safe distance to prevent any collision with the target. After transport to the disposal orbit the net will be cut and the spacecraft will return to the operational orbit of the next target. An initial down-scaled demonstration of the net capture system is planned on a parabolic flight in autumn 2010. Further representative demonstrations, including, for example, one in LEO, are under discussion. The capture system can also be used operationally in other orbits, e.g. LEO, but the propellant requirements for transporting the target into a direct controlled re-entry orbit and then returning the mother spacecraft to a new target orbit would be very high. This could impact the multi-mission capability of the system. The potential applications are under discussion with different customers, including satellite operators, insurance companies and international organisations. juergen.starke@astrium.eads.net Tel.: +49-421-539-4573

  16. Incorporation of Automated ISR Systems by the 75th Ranger Regiment

    DTIC Science & Technology

    2003-06-06

    Intelligence, describes the intelligence challenges faced by a commander, the command estimate process, military intelligence unit capabilities... [Table excerpt: interview-subject ISR capability responses — strategic range; intercept, monitor and jam frequencies; send live feed and capture stills; voice recognition for target/combat ID.]

  17. A Compendium on the NIST Radionuclidic Assays of the Massic Activity of 63Ni and 55Fe Solutions Used for an International Intercomparison of Liquid Scintillation Spectrometry Techniques

    PubMed Central

    Collé, R.; Zimmerman, B. E.

    1997-01-01

    The National Institute of Standards and Technology recently participated in an international measurement intercomparison for 63Ni and 55Fe, which was conducted amongst principal national radionuclidic metrology laboratories. The intercomparison was sponsored by EUROMET, and was primarily intended to evaluate the capabilities of liquid scintillation (LS) spectrometry techniques for standardizing nuclides that decay by low-energy β-emission (like 63Ni) and by low-Z (atomic number) electron capture (like 55Fe). The intercomparison findings exhibit a very good agreement for 63Ni amongst the various participating laboratories, including that for NIST, which suggests that the presently invoked LS methodologies are very capable of providing internationally-compatible standardizations for low-energy β-emitters. The results for 55Fe are in considerably poorer agreement, and demonstrated the existence of several unresolved problems. It has thus become apparent that there is a need for the various international laboratories to conduct rigorous, systematic evaluations of their LS capabilities in assaying radionuclides that decay by low-Z electron capture. PMID:27805141

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ullmann, John Leonard; Couture, Aaron Joseph; Koehler, Paul E.

    An accurate knowledge of the neutron capture cross section is important for many applications. Experimental measurements are important since theoretical calculations of capture have been notoriously difficult, with the ratio of measured to calculated cross sections often a factor of 2 or more in the 10 keV to 1 MeV region. However, a direct measurement of capture cannot be made on many interesting radioactive nuclides because of their short half-life or backgrounds caused by their nuclear decay. On the other hand, neutron transmission measurements of the total cross section are feasible for a wide range of radioactive nuclides since the detectors are far from the sample, and often are less sensitive to decay radiation. The parameters extracted from a total cross section measurement, which include the average resonance spacing, the neutron strength function, and the average total radiation width (Γγ), provide tight constraints on the calculation of the capture cross section, and when applied produce much more accurate results. These measurements can be made using the intense epithermal neutron flux at the Lujan Center on relatively small quantities of target material. It was the purpose of this project to investigate and develop the capability to make these measurements. A great deal of progress was made towards establishing this capability during 2016, including setting up the flight path and obtaining preliminary results, but more work remains to be done.

  19. NASA Has Joined America True's Design Mission for 2000

    NASA Technical Reports Server (NTRS)

    Steele, Gynelle C.

    1999-01-01

    Engineers at the NASA Lewis Research Center will support the America True design team led by America's Cup innovator Phil Kaiko. The joint effort between NASA and America True is encouraged by Mission HOME, the official public awareness campaign of the U.S. space community. NASA Lewis and America True have entered into a Space Act Agreement to focus on the interaction between the airfoil and the large deformation of the pretensioned sails and rigs, along with the dynamic motions related to the boat motions. This work will require a coupled fluid and structural simulation. Included in the simulation will be both a steady-state capability, to capture the quasi-static interactions between the air loads and sail geometry and the lift and drag on the boat, and a transient capability, to capture the sail/mast pumping effects resulting from hull motions.

  20. Nanoplasmon-enabled macroscopic thermal management

    PubMed Central

    Jonsson, Gustav Edman; Miljkovic, Vladimir; Dmitriev, Alexandre

    2014-01-01

    In numerous applications of energy harvesting via transformation of light into heat, the focus has recently shifted towards highly absorptive nanoplasmonic materials. It is currently established that noble-metal-based absorptive plasmonic platforms deliver significant light-capturing capability and can be viewed as super-absorbers of optical radiation. Naturally, approaches to the direct experimental probing of the macroscopic temperature increase resulting from these absorbers are welcomed. Here we derive a general quantitative method of characterizing the heat-generating properties of optically absorptive layers via macroscopic thermal imaging. We further monitor macroscopic areas that are homogeneously heated by several degrees with nanostructures that occupy a mere 8% of the surface, leaving it essentially transparent and evidencing the significant heat generation capability of nanoplasmon-enabled light capture. This has a direct bearing on a large number of applications where thermal management is crucial. PMID:24870613

  1. FPGA Based Adaptive Rate and Manifold Pattern Projection for Structured Light 3D Camera System †

    PubMed Central

    Lee, Sukhan

    2018-01-01

    The quality of the captured point cloud and the scanning speed of a structured-light 3D camera system depend on its ability to handle object surfaces with large reflectance variation, traded off against the number of patterns that must be projected. In this paper, we propose and implement a flexible embedded framework that is capable of triggering the camera one or more times to capture single or multiple projections within a single camera exposure setting. This allows the 3D camera system to synchronize the camera and projector even for mismatched frame rates, so that the system can project different types of patterns for different scan-speed applications. As a result, the system captures a high-quality 3D point cloud even for surfaces with large reflectance variation, while achieving a high scan speed. The proposed framework is implemented on a Field Programmable Gate Array (FPGA), where the camera trigger is generated adaptively such that the position and number of triggers are determined automatically according to the camera exposure settings. In other words, the projection frequency adapts to different scanning applications without altering the architecture. In addition, the proposed framework is unique in that it does not require any external memory for storage, because pattern pixels are generated in real time, which minimizes the complexity and size of the application-specific integrated circuit (ASIC) design and implementation. PMID:29642506

  2. Preliminary ISIS users manual

    NASA Technical Reports Server (NTRS)

    Grantham, C.

    1979-01-01

    The Interactive Software Invocation System (ISIS), an interactive data management system, was developed to act as a buffer between the user and the host computer system. ISIS provides the user with a powerful system for developing software or systems in the interactive environment. The user is protected from the idiosyncrasies of the host computer system by the provision of such a complete range of capabilities that the user should have no need for direct access to the host computer. These capabilities are divided into four areas: desk-top calculator, data editor, file manager, and tool invoker.

  3. Development of Filtered Rayleigh Scattering for Accurate Measurement of Gas Velocity

    NASA Technical Reports Server (NTRS)

    Miles, Richard B.; Lempert, Walter R.

    1995-01-01

    The overall goals of this research were to develop new diagnostic tools capable of capturing unsteady and/or time-evolving, high-speed flow phenomena. The program centers on the development of Filtered Rayleigh Scattering (FRS) for velocity, temperature, and density measurement, and on the construction of narrow-linewidth laser sources capable of producing a 'burst' of high-power pulses at an MHz-order repetition rate.

  4. Imaging autofluorescence temporal signatures of the human ocular fundus in vivo

    NASA Astrophysics Data System (ADS)

    Papour, Asael; Taylor, Zachary; Stafsudd, Oscar; Tsui, Irena; Grundfest, Warren

    2015-11-01

    We demonstrate real-time in vivo fundus imaging capabilities of our fluorescence lifetime imaging technology for the first time. This implementation of lifetime imaging uses light emitting diodes to capture full-field images capable of showing direct tissue contrast without executing curve fitting or lifetime calculations. Preliminary results of fundus images are presented, investigating autofluorescence imaging potential of various retina biomarkers for early detection of macular diseases.

  5. Acquisition Risks in a World of Joint Capabilities: Evaluating Complex Configurations

    DTIC Science & Technology

    2015-07-06

    embeddedness” refers to the quality and depth of a single dyadic tie. “Structural embeddedness” refers to the extent to which a node’s alters are...the entire network. Studies of the influence of dyadic ties on performance have yielded mixed and contradictory findings. For example, Perry-Smith and...observed network (x) (Snijders et al., 2006). ERGMs are capable of incorporating three dependency structures. Dyadic dependence captures the presence

  6. LTCP 2D Graphical User Interface. Application Description and User's Guide

    NASA Technical Reports Server (NTRS)

    Ball, Robert; Navaz, Homayun K.

    1996-01-01

    A graphical user interface (GUI) written for NASA's LTCP (Liquid Thrust Chamber Performance) two-dimensional computational fluid dynamics code is described. The GUI is written in C++ for a desktop personal computer running under a Microsoft Windows operating environment. Through the use of common and familiar dialog boxes, features, and tools, the user can easily and quickly create and modify input files for the LTCP code. In addition, old input files used with the LTCP code can be opened and modified using the GUI. The program and its capabilities are presented, followed by a detailed description of each menu selection and the method of creating an input file for LTCP. A cross reference is included to help experienced users quickly find the variables which commonly need changes. Finally, the system requirements and installation instructions are provided.

  7. The Development of Design Tools for Fault Tolerant Quantum Dot Cellular Automata Based Logic

    NASA Technical Reports Server (NTRS)

    Armstrong, Curtis D.; Humphreys, William M.

    2003-01-01

    We are developing software to explore the fault tolerance of quantum dot cellular automata gate architectures in the presence of manufacturing variations and device defects. The Topology Optimization Methodology using Applied Statistics (TOMAS) framework extends the capabilities of AQUINAS (A Quantum Interconnected Network Array Simulator) by adding front-end and back-end software and creating an environment that integrates all of these components. The front-end tools establish all simulation parameters, configure the simulation system, automate the Monte Carlo generation of simulation files, and execute the simulation of these files. The back-end tools perform automated data parsing, statistical analysis and report generation.
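
    A minimal sketch of the Monte Carlo file-generation idea mentioned above follows. The file format, parameter names, and distributions are hypothetical and are not the actual TOMAS/AQUINAS inputs; the sketch only illustrates generating many perturbed simulation cases automatically.

    ```python
    # Illustrative sketch: generate simulation input files with randomly
    # perturbed device parameters (hypothetical parameter names and format).
    import json
    import numpy as np
    from pathlib import Path

    rng = np.random.default_rng(seed=42)
    nominal = {"dot_diameter_nm": 5.0, "cell_spacing_nm": 20.0, "defect_rate": 0.0}
    sigma = {"dot_diameter_nm": 0.2, "cell_spacing_nm": 0.5, "defect_rate": 0.01}

    outdir = Path("mc_runs")
    outdir.mkdir(exist_ok=True)
    for i in range(100):                                  # 100 Monte Carlo cases
        params = {k: float(rng.normal(v, sigma[k])) for k, v in nominal.items()}
        params["defect_rate"] = max(params["defect_rate"], 0.0)  # keep physical
        (outdir / f"case_{i:03d}.json").write_text(json.dumps(params, indent=2))
    ```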

  8. Comparison of VDL Modes in the Aeronautical Telecommunications Network

    NASA Technical Reports Server (NTRS)

    Bretmersky, Steven; Konangi, Vijay K.; Kerczewski, Robert J.

    2002-01-01

    VHF Digital Link (VDL) has been identified as a method of communication between aircraft and ground stations in the Aeronautical Telecommunications Network (ATN). Three different modes of VDL have been suggested for implementation. Simulations were conducted to compare the data transfer capabilities of VDL Modes 2, 3, and 4. These simulations focus on up to 50 aircraft communicating with a single VDL ground station. The data traffic is generated by the standard File Transfer Protocol (FTP) and Hyper Text Transfer Protocol (HTTP) applications in the aircraft. Comparisons of the modes are based on the number of files and pages transferred and the response time.

  9. U.S.-China Military Contacts: Issues for Congress

    DTIC Science & Technology

    2010-07-22

    then Secretary of Defense Dick Cheney, alleging that “several dozen” American military personnel captured in the Korean War (1950-1953) were sent...U.S. request to access the archives. In March 2003, DPMO Director Jerry Jennings visited China and said that PRC records likely hold “the key...files in its archives on the PRC’s foreign relations from 1949 to 1955. However, this step apparently excluded wartime records, and General Myers did not

  10. Understanding and Capturing People’s Mobile App Privacy Preferences

    DTIC Science & Technology

    2013-10-28

    The entire apps’ metadata takes up about 500MB of storage space when stored in a MySQL database, and all the binary files take approximately 300GB of...functionality that can decompile Dalvik bytecodes to Java source code faster than other decompilers. Given the scale of the app analysis we planned on... [Table excerpt: Java libraries such as parsers, SQL connectors, etc.; Targeted Ads (137): admob, adwhirl, greystripe… provided by mobile behavioral ads companies to]

  11. Update of the U.S. Army Research Institute’s Longitudinal Research Data Base of Enlisted Personnel

    DTIC Science & Technology

    1992-08-01

    accession data elements, including Composite score data from the Army Classification Battery Test (ACB), were captured for each individual. For each...individual, Skill Qualifying Test (SQT) scores were included beginning in 1980 and additional data were included from the Enlisted Master File (EMF). The...entry into active duty for current tour, pay grade, Composite and SQT scores, and military occupation specialty (MOS). The EPRDB is designed to play an

  12. Xeml Lab: a tool that supports the design of experiments at a graphical interface and generates computer-readable metadata files, which capture information about genotypes, growth conditions, environmental perturbations and sampling strategy.

    PubMed

    Hannemann, Jan; Poorter, Hendrik; Usadel, Björn; Bläsing, Oliver E; Finck, Alex; Tardieu, Francois; Atkin, Owen K; Pons, Thijs; Stitt, Mark; Gibon, Yves

    2009-09-01

    Data mining depends on the ability to access machine-readable metadata that describe genotypes, environmental conditions, and sampling times and strategy. This article presents Xeml Lab. The Xeml Interactive Designer provides an interactive graphical interface at which complex experiments can be designed, and concomitantly generates machine-readable metadata files. It uses a new eXtensible Mark-up Language (XML)-derived dialect termed XEML. Xeml Lab includes a new ontology for environmental conditions, called Xeml Environment Ontology. However, to provide versatility, it is designed to be generic and also accepts other commonly used ontology formats, including OBO and OWL. A review summarizing important environmental conditions that need to be controlled, monitored and captured as metadata is posted in a Wiki (http://www.codeplex.com/XeO) to promote community discussion. The usefulness of Xeml Lab is illustrated by two meta-analyses of a large set of experiments that were performed with Arabidopsis thaliana during 5 years. The first reveals sources of noise that affect measurements of metabolite levels and enzyme activities. The second shows that Arabidopsis maintains remarkably stable levels of sugars and amino acids across a wide range of photoperiod treatments, and that adjustment of starch turnover and the leaf protein content contribute to this metabolic homeostasis.
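
    To make the idea of machine-readable experiment metadata concrete, here is a small Python sketch that emits an XML description of genotype, growth conditions, and sampling. The element and attribute names are invented for illustration and do not follow the actual XEML schema or the Xeml Environment Ontology.

    ```python
    # Minimal sketch of generating machine-readable experiment metadata as XML.
    # Tag and attribute names are hypothetical, not the real XEML dialect.
    import xml.etree.ElementTree as ET

    exp = ET.Element("experiment", id="exp-001", species="Arabidopsis thaliana")
    ET.SubElement(exp, "genotype", name="Col-0")
    env = ET.SubElement(exp, "environment")
    ET.SubElement(env, "condition", name="photoperiod", value="12", unit="h")
    ET.SubElement(env, "condition", name="temperature", value="21", unit="degC")
    sampling = ET.SubElement(exp, "sampling")
    ET.SubElement(sampling, "timepoint", hoursAfterDawn="4", tissue="rosette leaf")

    ET.ElementTree(exp).write("experiment_metadata.xml",
                              encoding="utf-8", xml_declaration=True)
    ```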

  13. Advanced Capabilities for Wind Tunnel Testing in the 21st Century

    NASA Technical Reports Server (NTRS)

    Kegelman, Jerome T.; Danehy, Paul M.; Schwartz, Richard J.

    2010-01-01

    Wind tunnel testing methods and test technologies for the 21st century using advanced capabilities are presented. These capabilities are necessary to capture more accurate, higher-quality test results by eliminating uncertainties in testing and to facilitate verification of computational tools for design. This paper discusses near-term developments underway in ground testing capabilities, which will enhance the quality of information about both the test article and airstream flow details. Also discussed is a selection of new capability investments that have been made to accommodate such developments. Examples include advanced experimental methods for measuring the test gas itself; efficient experiment methodologies, including quality assurance strategies within the test; and increasing test-result information density by using extensive optical visualization together with computed flow field results. These points apply both to major investments in existing tunnel capabilities and to entirely new capabilities.

  14. Quantumness-generating capability of quantum dynamics

    NASA Astrophysics Data System (ADS)

    Li, Nan; Luo, Shunlong; Mao, Yuanyuan

    2018-04-01

    We study quantumness-generating capability of quantum dynamics, where quantumness refers to the noncommutativity between the initial state and the evolving state. In terms of the commutator of the square roots of the initial state and the evolving state, we define a measure to quantify the quantumness-generating capability of quantum dynamics with respect to initial states. Quantumness-generating capability is absent in classical dynamics and hence is a fundamental characteristic of quantum dynamics. For qubit systems, we present an analytical form for this measure, by virtue of which we analyze several prototypical dynamics such as unitary dynamics, phase damping dynamics, amplitude damping dynamics, and random unitary dynamics (Pauli channels). Necessary and sufficient conditions for the monotonicity of quantumness-generating capability are also identified. Finally, we compare these conditions for the monotonicity of quantumness-generating capability with those for various Markovianities and illustrate that quantumness-generating capability and quantum Markovianity are closely related, although they capture different aspects of quantum dynamics.

  15. Catching errors with patient-specific pretreatment machine log file analysis.

    PubMed

    Rangaraj, Dharanipathy; Zhu, Mingyao; Yang, Deshan; Palaniswaamy, Geethpriya; Yaddanapudi, Sridhar; Wooten, Omar H; Brame, Scott; Mutic, Sasa

    2013-01-01

    A robust, efficient, and reliable quality assurance (QA) process is highly desired for modern external beam radiation therapy treatments. Here, we report the results of a semiautomatic, pretreatment, patient-specific QA process based on dynamic machine log file analysis, clinically implemented for intensity modulated radiation therapy (IMRT) treatments delivered by high energy linear accelerators (Varian 2100/2300 EX, Trilogy, iX-D, Varian Medical Systems Inc, Palo Alto, CA). The multileaf collimator (MLC) machine log files are called Dynalog by Varian. Using an in-house developed computer program called "Dynalog QA," we automatically compare the beam delivery parameters in the log files that are generated during pretreatment point dose verification measurements with the treatment plan to determine any discrepancies in IMRT deliveries. Fluence maps are constructed and compared between the delivered and planned beams. Since clinical introduction in June 2009, 912 machine log file analysis QAs were performed by the end of 2010. Among these, 14 errors causing dosimetric deviation were detected and required further investigation and intervention. These errors were the result of human operating mistakes, flawed treatment planning, and data modification during plan file transfer. Minor errors were also reported in 174 other log file analyses, some of which stemmed from false positives and unreliable results; the origins of these are discussed herein. It has been demonstrated that machine log file analysis is a robust, efficient, and reliable QA process capable of detecting errors originating from human mistakes, flawed planning, and data transfer problems. The possibility of detecting these errors is low using point and planar dosimetric measurements. Copyright © 2013 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
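
    The following sketch illustrates the kind of planned-versus-delivered comparison such a log-file QA performs. It is schematic only: it does not parse Varian's proprietary Dynalog format and is not the clinical "Dynalog QA" program; the arrays, tolerance, and function name are illustrative.

    ```python
    # Schematic example: compare planned vs. delivered MLC leaf positions sampled
    # at matching time points and flag deviations above a tolerance.
    import numpy as np

    def leaf_position_deviations(planned, delivered, tol_mm=1.0):
        """planned, delivered: (n_samples, n_leaves) leaf positions in mm.
        Returns per-sample maximum absolute deviation and a boolean flag array."""
        dev = np.abs(delivered - planned)
        max_dev = dev.max(axis=1)
        return max_dev, max_dev > tol_mm

    # Toy data: 1000 samples (e.g., 40 ms spacing), 120 leaves, noisy delivery
    planned = np.cumsum(np.random.default_rng(0).normal(0, 0.05, (1000, 120)), axis=0)
    delivered = planned + np.random.default_rng(1).normal(0, 0.2, planned.shape)
    max_dev, flagged = leaf_position_deviations(planned, delivered, tol_mm=1.0)
    print(f"{flagged.sum()} of {len(max_dev)} samples exceed tolerance")
    ```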

  16. Autoplot: a Browser for Science Data on the Web

    NASA Astrophysics Data System (ADS)

    Faden, J.; Weigel, R. S.; West, E. E.; Merka, J.

    2008-12-01

    Autoplot (www.autoplot.org) is software for plotting data from many different sources and in many different file formats. Data from CDF, CEF, FITS, NetCDF, and OpenDAP can be plotted, along with many other sources such as ASCII tables and Excel spreadsheets. This is done by adapting these various data formats and APIs into a common data model that borrows from the netCDF and CDF data models. Autoplot uses a web browser metaphor to simplify use. The user specifies a parameter URL, for example a CDF file accessible via http with a parameter name appended, and the file resource is downloaded and the parameter is rendered in a scientifically meaningful way. When data span multiple files, the user can use a file name template in the URL to aggregate (combine) a set of remote files, so the problem of aggregating data across file boundaries is handled on the client side, allowing simple web servers to be used. The das2 graphics library provides rich controls for exploring the data. Scripting is supported through Python, providing not just programmatic control but also a way to calculate new parameters in a language that will look familiar to IDL and Matlab users. Autoplot is Java-based software and will run on most computers without a burdensome installation process. It can also be used as an applet or as a servlet that serves static images. Autoplot was developed as part of the Virtual Radiation Belt Observatory (ViRBO) project, and is also being used for the Virtual Magnetospheric Observatory (VMO). It is expected that this flexible, general-purpose plotting tool will be useful for allowing a data provider to add instant visualization capabilities to a directory of files or for general use in the Virtual Observatory environment.
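
    The client-side aggregation idea can be illustrated with a short sketch that expands a file-name template over a date range into the per-day file URLs a client would fetch and combine. The strftime-style template syntax below is for illustration only and is not necessarily Autoplot's own template syntax.

    ```python
    # Illustration of client-side aggregation: expand a daily file-name template
    # over a date range to obtain the list of remote files spanned by a request.
    from datetime import date, timedelta

    def expand_template(template, start, stop):
        """Yield one URL per day in [start, stop)."""
        day = start
        while day < stop:
            yield day.strftime(template)
            day += timedelta(days=1)

    urls = list(expand_template("http://example.org/data/%Y/%m/data_%Y%m%d.cdf",
                                date(2008, 1, 1), date(2008, 1, 4)))
    # -> three daily file URLs that a client could download and combine
    ```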

  17. Wrapping Python around MODFLOW/MT3DMS based groundwater models

    NASA Astrophysics Data System (ADS)

    Post, V.

    2008-12-01

    Numerical models that simulate groundwater flow and solute transport require a large amount of input data that is often organized into different files. A large proportion of the input data consists of spatially distributed model parameters. The model output consists of a variety of data such as heads, fluxes and concentrations. Typically, all of these files have different formats. Consequently, preparing input and managing output is a complex and error-prone task. Proprietary software tools are available that facilitate the preparation of input files and analysis of model outcomes. The use of such software may be limited if it does not support all the features of the groundwater model or when the costs of such tools are prohibitive. Therefore, a Python library was developed that contains routines to generate input files and process output files of MODFLOW/MT3DMS-based models. The library is freely available and has an open structure so that the routines can be customized and linked into other scripts and libraries. The current set of functions supports the generation of input files for MODFLOW and MT3DMS, including the capability to read spatially distributed input parameters (e.g. hydraulic conductivity) from PNG files. Both ASCII and binary output files can be read efficiently, allowing for visualization of, for example, solute concentration patterns in contour plots with superimposed flow vectors using matplotlib. Series of contour plots are then easily saved as an animation. The subroutines can also be used within scripts to calculate derived quantities such as the mass of a solute within a particular region of the model domain. Using Python as a wrapper around groundwater models provides an efficient and flexible way of processing input and output data, which is not constrained by limitations of third-party products.
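
    A library-agnostic sketch of one step described above, reading a spatially distributed parameter from a PNG and writing it as a plain ASCII array, is shown below. It uses only numpy and matplotlib and is not the library presented in the abstract; file names and the conductivity bounds are illustrative.

    ```python
    # Read a grayscale PNG as a spatially distributed hydraulic-conductivity
    # field and write it out as a whitespace-delimited ASCII array (one model
    # row per line). Illustrative sketch only.
    import numpy as np
    import matplotlib.image as mpimg

    img = mpimg.imread("hk_zones.png")       # 8-bit PNGs are read as floats in [0, 1]
    if img.ndim == 3:                         # drop RGB(A) channels if present
        img = img[..., 0]

    k_min, k_max = 0.1, 50.0                  # m/d, illustrative bounds
    hk = k_min + img * (k_max - k_min)        # map gray level linearly to K

    np.savetxt("hk_layer1.dat", hk, fmt="%12.4e")
    ```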

  18. Lessons Learned in Deploying the World's Largest Scale Lustre File System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dillow, David A; Fuller, Douglas; Wang, Feiyi

    2010-01-01

    The Spider system at the Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) is the world's largest scale Lustre parallel file system. Envisioned as a shared parallel file system capable of delivering both the bandwidth and capacity requirements of the OLCF's diverse computational environment, the project had a number of ambitious goals. To support the workloads of the OLCF's diverse computational platforms, the aggregate performance and storage capacity of Spider exceed those of our previously deployed systems by factors of 6x (240 GB/sec) and 17x (10 Petabytes), respectively. Furthermore, Spider supports over 26,000 clients concurrently accessing the file system, which exceeds our previously deployed systems by nearly 4x. In addition to these scalability challenges, moving to a center-wide shared file system required dramatically improved resiliency and fault-tolerance mechanisms. This paper details our efforts in designing, deploying, and operating Spider. Through a phased approach of research and development, prototyping, deployment, and transition to operations, this work has resulted in a number of insights into large-scale parallel file system architectures, from both the design and the operational perspectives. We present in this paper our solutions to issues such as network congestion, performance baselining and evaluation, file system journaling overheads, and high availability in a system with tens of thousands of components. We also discuss areas of continued challenges, such as stressed metadata performance and the need for file system quality of service, alongside our efforts to address them. Finally, operational aspects of managing a system of this scale are discussed along with real-world data and observations.

  19. Capturing and Understanding Experiment Provenance using NiNaC

    NASA Astrophysics Data System (ADS)

    Rosati, C.

    2017-12-01

    A problem the model development team faces at the GFDL is determining climate model experiment provenance. Each experiment is configured with at least one configuration file which may reference other files. The experiment then passes through three phases before completion. Configuration files or other input files may be modified between phases. Finding the modifications later is tedious due to the expanse of the experiment input and duplication across phases. Determining provenance may be impossible if any file has been changed or deleted. To reduce these efforts and address these problems, we propose a new toolset, NiNaC, for archiving experiment provenance from the beginning of the experiment to the end and every phase in-between. Each of the three phases, check-out, build, and run, of the experiment depends on the previous phase. We use a graph to model the phase dependencies. Let each phase be represented by a node. Let each edge correspond to a dependency between phases where the node incident with the tail depends on the node incident with the head. It follows that the dependency graph is a tree. We reduce the problem to finding the lowest common ancestor and diffing the successor nodes. All files related to input for a phase are assigned a checksum. A new file is created to aggregate the checksums. Then each phase is assigned a checksum of aforementioned file as an identifier. Any change to part of a phase configuration will create unique checksums in all subsequent phases. Finding differences between experiments with this toolset is as simple as diffing two files containing checksums found by traversing the tree. One new benefit is that this toolset now allows differences in source code to be found after experiments are run, which was previously impossible for executables that cannot be linked to a known version controlled source code. Knowing that these changes exist allows us to give priority to help desk tickets concerning unmodified supported experiment releases, and minimize effort spent on unsupported experiments. It is also possible that a change is made, either by mistake or by system error. NiNaC would find the exact file in the precise phase with the change. In this way, NiNaC makes provenance tracking less tedious and solves problems where tracking provenance may previously have been impossible to do.
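
    The provenance scheme described above (per-file checksums aggregated into a phase identifier) can be sketched in a few lines of Python. This follows the abstract's description but is not the NiNaC toolset itself; the function names and manifest layout are illustrative.

    ```python
    # Sketch of the phase-identifier idea: checksum every input file of a phase,
    # aggregate the checksums into one manifest, and use the checksum of that
    # manifest as the phase identifier.
    import hashlib
    from pathlib import Path

    def sha256_of(path, chunk=1 << 20):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(chunk), b""):
                h.update(block)
        return h.hexdigest()

    def phase_id(phase_dir, manifest_name="checksums.txt"):
        """Write '<checksum>  <relative path>' for every file in the phase, then
        return the checksum of that manifest as the phase identifier."""
        phase_dir = Path(phase_dir)
        lines = sorted(
            f"{sha256_of(p)}  {p.relative_to(phase_dir)}"
            for p in phase_dir.rglob("*")
            if p.is_file() and p.name != manifest_name
        )
        manifest = phase_dir / manifest_name
        manifest.write_text("\n".join(lines) + "\n")
        return sha256_of(manifest)

    # Two experiments differ iff their phase identifiers differ; diffing the two
    # manifests then pinpoints exactly which input files changed.
    ```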

  20. MOBBED: a computational data infrastructure for handling large collections of event-rich time series datasets in MATLAB

    PubMed Central

    Cockfield, Jeremy; Su, Kyungmin; Robbins, Kay A.

    2013-01-01

    Experiments to monitor human brain activity during active behavior record a variety of modalities (e.g., EEG, eye tracking, motion capture, respiration monitoring) and capture a complex environmental context leading to large, event-rich time series datasets. The considerable variability of responses within and among subjects in more realistic behavioral scenarios requires experiments to assess many more subjects over longer periods of time. This explosion of data requires better computational infrastructure to more systematically explore and process these collections. MOBBED is a lightweight, easy-to-use, extensible toolkit that allows users to incorporate a computational database into their normal MATLAB workflow. Although capable of storing quite general types of annotated data, MOBBED is particularly oriented to multichannel time series such as EEG that have event streams overlaid with sensor data. MOBBED directly supports access to individual events, data frames, and time-stamped feature vectors, allowing users to ask questions such as what types of events or features co-occur under various experimental conditions. A database provides several advantages not available to users who process one dataset at a time from the local file system. In addition to archiving primary data in a central place to save space and avoid inconsistencies, such a database allows users to manage, search, and retrieve events across multiple datasets without reading the entire dataset. The database also provides infrastructure for handling more complex event patterns that include environmental and contextual conditions. The database can also be used as a cache for expensive intermediate results that are reused in such activities as cross-validation of machine learning algorithms. MOBBED is implemented over PostgreSQL, a widely used open source database, and is freely available under the GNU general public license at http://visual.cs.utsa.edu/mobbed. Source and issue reports for MOBBED are maintained at http://vislab.github.com/MobbedMatlab/ PMID:24124417

  1. MOBBED: a computational data infrastructure for handling large collections of event-rich time series datasets in MATLAB.

    PubMed

    Cockfield, Jeremy; Su, Kyungmin; Robbins, Kay A

    2013-01-01

    Experiments to monitor human brain activity during active behavior record a variety of modalities (e.g., EEG, eye tracking, motion capture, respiration monitoring) and capture a complex environmental context leading to large, event-rich time series datasets. The considerable variability of responses within and among subjects in more realistic behavioral scenarios requires experiments to assess many more subjects over longer periods of time. This explosion of data requires better computational infrastructure to more systematically explore and process these collections. MOBBED is a lightweight, easy-to-use, extensible toolkit that allows users to incorporate a computational database into their normal MATLAB workflow. Although capable of storing quite general types of annotated data, MOBBED is particularly oriented to multichannel time series such as EEG that have event streams overlaid with sensor data. MOBBED directly supports access to individual events, data frames, and time-stamped feature vectors, allowing users to ask questions such as what types of events or features co-occur under various experimental conditions. A database provides several advantages not available to users who process one dataset at a time from the local file system. In addition to archiving primary data in a central place to save space and avoid inconsistencies, such a database allows users to manage, search, and retrieve events across multiple datasets without reading the entire dataset. The database also provides infrastructure for handling more complex event patterns that include environmental and contextual conditions. The database can also be used as a cache for expensive intermediate results that are reused in such activities as cross-validation of machine learning algorithms. MOBBED is implemented over PostgreSQL, a widely used open source database, and is freely available under the GNU general public license at http://visual.cs.utsa.edu/mobbed. Source and issue reports for MOBBED are maintained at http://vislab.github.com/MobbedMatlab/

  2. Distribution of immunodeficiency fact files with XML--from Web to WAP.

    PubMed

    Väliaho, Jouni; Riikonen, Pentti; Vihinen, Mauno

    2005-06-26

    Although biomedical information is growing rapidly, it is difficult to find and retrieve validated data, especially for rare hereditary diseases. There is an increased need for services capable of integrating and validating information as well as providing it in a logically organized structure. An XML-based language enables the creation of open source databases for storage, maintenance and delivery to different platforms. Here we present a new data model called the fact file and an XML-based specification, the Inherited Disease Markup Language (IDML), that were developed to facilitate disease information integration, storage and exchange. The data model was applied to primary immunodeficiencies, but it can be used for any hereditary disease. Fact files integrate biomedical, genetic and clinical information related to hereditary diseases. IDML and fact files were used to build a comprehensive Web- and WAP-accessible knowledge base, the ImmunoDeficiency Resource (IDR), available at http://bioinf.uta.fi/idr/. A fact file is a user-oriented interface, which serves as a starting point for exploring information on hereditary diseases. IDML enables the seamless integration and presentation of genetic and disease information resources on the Internet. IDML can be used to build information services for all kinds of inherited diseases. The open source specification and related programs are available at http://bioinf.uta.fi/idml/.

  3. HAT DRMs 2013

    NASA Technical Reports Server (NTRS)

    Drake, Bret G.

    2013-01-01

    NASA uses a set of Design Reference Missions (DRMs) to help focus capability development activities across the agency. The DRMs are intended to show capability needs and represent a set of various implementations. The "mission class" context is used to establish temporal priorities, and a LIMITED set of DRMs is used to capture driving mission capabilities. The DRMs represent a snapshot in time of current thinking, and do not represent all potential future missions. The DRMs are generic in nature, with stated assumptions for some supporting capabilities and elements; they do not represent firm requirements. SLS/Orion DRMs are being developed and refined as part of the development program for SLS and Orion and are not included in this package.

  4. A Model-Driven, Science Data Product Registration Service

    NASA Astrophysics Data System (ADS)

    Hardman, S.; Ramirez, P.; Hughes, J. S.; Joyner, R.; Cayanan, M.; Lee, H.; Crichton, D. J.

    2011-12-01

    The Planetary Data System (PDS) has undertaken an effort to overhaul the PDS data architecture (including the data model, data structures, data dictionary, etc.) and to deploy an upgraded software system (including data services, a distributed data catalog, etc.) that fully embraces the PDS federation as an integrated system while taking advantage of modern innovations in information technology (including networking capabilities, processing speeds, and software breakthroughs). A core component of this new system is the Registry Service, which will provide functionality for tracking, auditing, locating, and maintaining artifacts within the system. These artifacts range from data files and label files to schemas, dictionary definitions for objects and elements, documents, and services. This service offers a single reference implementation of the registry capabilities detailed in the Consultative Committee for Space Data Systems (CCSDS) Registry Reference Model White Book. The CCSDS Reference Model in turn relies heavily on the Electronic Business using eXtensible Markup Language (ebXML) standards for registry services and the registry information model, managed by the OASIS consortium. Registries are pervasive components in most information systems. For example, data dictionaries, service registries, LDAP directory services, and even databases provide registry-like services. These all include an account of informational items that are used in large-scale information systems, ranging from data values such as names and codes to vocabularies, services and software components. The problem is that many of these registry-like services were designed with their own data models associated with the specific type of artifact they track. Additionally, these services each have their own specific interface for interacting with the service. This Registry Service implements the data model specified in the ebXML Registry Information Model (RIM) specification, which supports the various artifacts above as well as offering the flexibility to support customer-defined artifacts. Key features of the Registry Service include: - Model-based configuration specifying customer-defined artifact types, metadata attributes to capture for each artifact type, supported associations and classification schemes. - A REST-based external interface that is accessible via the Hypertext Transfer Protocol (HTTP). - Federation of Registry Service instances, allowing associations between registered artifacts across registries as well as queries for artifacts across those same registries. A federation also enables features such as replication and synchronization if desired for a given deployment. In addition to its use as a core component of the PDS, the generic implementation of the Registry Service facilitates its applicability as a core component in any science data archive or science data system.
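
    To give a feel for what a REST-based registry interaction looks like, here is a hedged Python sketch that registers an artifact and queries it back over HTTP. The endpoint paths, JSON fields, and service root are invented for illustration; they are not the actual PDS Registry Service or ebXML RegRep interface.

    ```python
    # Hypothetical sketch of registering and querying an artifact via REST/HTTP.
    import requests

    BASE = "http://registry.example.org/registry"   # hypothetical service root

    artifact = {
        "guid": "urn:example:pds:product:orbit-data-001",
        "name": "orbit-data-001",
        "objectType": "Product",
        "description": "Example science data product label",
    }

    # register the artifact
    resp = requests.post(f"{BASE}/artifacts", json=artifact, timeout=30)
    resp.raise_for_status()

    # query it back by name
    found = requests.get(f"{BASE}/artifacts",
                         params={"name": "orbit-data-001"}, timeout=30).json()
    print(found)
    ```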

  5. Trajectory determinations and collection of micrometeoroids on the space station. Report of the Workshop on Micrometeorite Capture Experiments

    NASA Technical Reports Server (NTRS)

    Hoerz, F. (Editor)

    1986-01-01

    Summaries of papers presented at the Workshop on Micrometeorite Capture Experiments are compiled. The goals of the workshop were to define the scientific objectives and the resulting performance requirements of a potential Space Station facility and to identify the major elements of a coherent development program that would generate the desired capabilities within the next decade. Specific topics include cosmic dust and space debris collection techniques, particle trajectory and source determination, and specimen analysis methods.

  6. The Demonstration and Science Experiments (DSX): A Fundamental Science Research Mission Advancing Technologies that Enable MEO Spaceflight

    DTIC Science & Technology

    2006-01-01

    dosimeters aboard the TSX5 and DSP satellites in LEO and GEO, respectively. Figure 13: Space weather data from TSX5 and DSP. The Space Weather...capabilities are described in detail in the following sub-sections. 3.2.1 Compact Environment Anomaly Sensor (CEASE): Composed of two dosimeters, two...for DSX is that CEASE will capture and downlink the full dose spectra from each dosimeter, whereas prior versions only captured six reduced data

  7. Differential Mobility Spectrometry: Preliminary Findings on Determination of Fundamental Constants

    NASA Technical Reports Server (NTRS)

    Limero, Thomas; Cheng, Patti; Boyd, John

    2007-01-01

    The electron capture detector (ECD) has been used for 40+ years (1) to derive fundamental constants such as a compound's electron affinity. Given this historical perspective, it is not surprising that differential mobility spectrometry (DMS) might be used in a like manner. This paper will present data from a gas chromatography (GC)-DMS instrument that illustrates the potential capability of this device to derive fundamental constants for electron-capturing compounds. Potential energy curves will be used to provide possible explanation of the data.

  8. Application Program Interface for the Orion Aerodynamics Database

    NASA Technical Reports Server (NTRS)

    Robinson, Philip E.; Thompson, James

    2013-01-01

    The Application Programming Interface (API) for the Crew Exploration Vehicle (CEV) Aerodynamic Database has been developed to provide software developers with an easily implemented, fully self-contained method of accessing the CEV Aerodynamic Database for use in their analysis and simulation tools. The API is programmed in C and provides a series of functions to interact with the database, such as initialization, selecting various options, and calculating the aerodynamic data. No special functions (file read/write, table lookup) are required on the host system other than those included with a standard ANSI C installation. It reads one or more files of aero data tables. Previous releases of aerodynamic databases for space vehicles have included only data tables and a document describing the algorithm and equations for combining them into the total aerodynamic forces and moments. This process required each software tool to have a unique implementation of the database code. Errors or omissions in the documentation, or errors in the implementation, led to a lengthy and burdensome process of having to debug each instance of the code. Additionally, input file formats differ for each space vehicle simulation tool, requiring the aero database tables to be reformatted to meet the tool's input file structure requirements. Finally, the capabilities of built-in table lookup routines vary for each simulation tool. Implementation of a new database may require an update to, and verification of, the table lookup routines. This may be required if the number of dimensions of a data table exceeds the capability of the simulation tool's built-in lookup routines. A single software solution was created to provide an aerodynamics software model that can be integrated into other simulation and analysis tools. The highly complex Orion aerodynamics model can then be quickly included in a wide variety of tools. The API code is written in ANSI C for ease of portability to a wide variety of systems. The input data files are in standard formatted ASCII, also for improved portability. The API contains its own implementation of multidimensional table reading and lookup routines. The same aerodynamics input file can be used without modification on all implementations. The turnaround time from aerodynamics model release to a working implementation is significantly reduced.
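
    As a purely illustrative sketch of how a host tool might call a self-contained C aerodynamics-database API, the following uses Python's ctypes. The library name, function names, and signatures below are invented for illustration and are not the actual CEV aero database API.

    ```python
    # Hypothetical example: calling a C aero-database library through ctypes.
    import ctypes

    lib = ctypes.CDLL("./libaerodb_example.so")        # hypothetical shared library

    lib.aero_init.argtypes = [ctypes.c_char_p]         # path to table file
    lib.aero_init.restype = ctypes.c_int
    lib.aero_coeffs.argtypes = [ctypes.c_double,                     # Mach
                                ctypes.c_double,                     # alpha, deg
                                ctypes.POINTER(ctypes.c_double),     # out: CA
                                ctypes.POINTER(ctypes.c_double)]     # out: CN
    lib.aero_coeffs.restype = ctypes.c_int

    if lib.aero_init(b"aero_tables.dat") != 0:
        raise RuntimeError("database initialization failed")

    ca, cn = ctypes.c_double(), ctypes.c_double()
    if lib.aero_coeffs(0.8, 5.0, ctypes.byref(ca), ctypes.byref(cn)) != 0:
        raise RuntimeError("lookup failed")
    print(ca.value, cn.value)
    ```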

  9. TH-AB-201-12: Using Machine Log-Files for Treatment Planning and Delivery QA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stanhope, C; Liang, J; Drake, D

    2016-06-15

    Purpose: To determine the segment reduction and dose resolution necessary for machine log-files to effectively replace current phantom-based patient-specific quality assurance, while minimizing computational cost. Methods: Elekta’s Log File Convertor R3.2 records linac delivery parameters (dose rate, gantry angle, leaf position) every 40ms. Five VMAT plans [4 H&N, 1 Pulsed Brain] comprised of 2 arcs each were delivered on the ArcCHECK phantom. Log-files were reconstructed in Pinnacle on the phantom geometry using 1/2/3/4° control point spacing and 2/3/4mm dose grid resolution. Reconstruction effectiveness was quantified by comparing 2%/2mm gamma passing rates of the original and log-file plans. Modulation complexity scores (MCS) were calculated for each beam to correlate reconstruction accuracy and beam modulation. Percent error in absolute dose for each plan-pair combination (log-file vs. ArcCHECK, original vs. ArcCHECK, log-file vs. original) was calculated for each arc and every diode greater than 10% of the maximum measured dose (per beam). Comparing standard deviations of the three plan-pair distributions, relative noise of the ArcCHECK and log-file systems was elucidated. Results: The original plans exhibit a mean passing rate of 95.1±1.3%. The eight more modulated H&N arcs [MCS=0.088±0.014] and two less modulated brain arcs [MCS=0.291±0.004] yielded log-file pass rates most similar to the original plan when using 1°/2mm [0.05%±1.3% lower] and 2°/3mm [0.35±0.64% higher] log-file reconstructions respectively. Log-file and original plans displayed percent diode dose errors 4.29±6.27% and 3.61±6.57% higher than measurement. Excluding the phantom eliminates diode miscalibration and setup errors; log-file dose errors were 0.72±3.06% higher than the original plans – significantly less noisy. Conclusion: For log-file reconstructed VMAT arcs, 1° control point spacing and 2mm dose resolution is recommended, however, less modulated arcs may allow less stringent reconstructions. Following the aforementioned reconstruction recommendations, the log-file technique is capable of detecting delivery errors with equivalent accuracy and less noise than ArcCHECK QA. I am funded by an Elekta Research Grant.

  10. A volume-filtered formulation to capture particle-shock interactions in multiphase compressible flows

    NASA Astrophysics Data System (ADS)

    Shallcross, Gregory; Capecelatro, Jesse

    2017-11-01

    Compressible particle-laden flows are common in engineering systems. Applications include but are not limited to water injection in high-speed jet flows for noise suppression, rocket-plume surface interactions during planetary landing, and explosions during coal mining operations. Numerically, it is challenging to capture these interactions due to the wide range of length and time scales. Additionally, there are many forms of the multiphase compressible flow equations with volume fraction effects, some of which are conflicting in nature. The purpose of this presentation is to develop the capability to accurately capture particle-shock interactions in systems with a large number of particles from dense to dilute regimes. A thorough derivation of the volume filtered equations is presented. The volume filtered equations are then implemented in a high-order, energy-stable Eulerian-Lagrangian framework. We show this framework is capable of decoupling the fluid mesh from the particle size, enabling arbitrary particle size distributions in the presence of shocks. The proposed method is then assessed against particle-laden shock tube data. Quantities of interest include fluid-phase pressure profiles and particle spreading rates. The effect of collisions in 2D and 3D are also evaluated.
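
    The abstract does not give the filtered equations themselves, but for orientation, a commonly used volume-filter definition from the Eulerian-Lagrangian literature (after Anderson & Jackson) is shown below; the authors' derivation may differ in detail. Here g is a normalized filter kernel, V_f the fluid-occupied region, and α_f the fluid volume fraction.

    ```latex
    % Commonly used volume-filter definition (illustrative, not taken from the abstract)
    \alpha_f(\mathbf{x},t)\,\overline{a}(\mathbf{x},t)
      = \int_{\mathcal{V}_f} g\!\left(|\mathbf{x}-\mathbf{y}|\right)\, a(\mathbf{y},t)\,\mathrm{d}\mathbf{y},
    \qquad
    \alpha_f(\mathbf{x},t)
      = \int_{\mathcal{V}_f} g\!\left(|\mathbf{x}-\mathbf{y}|\right)\mathrm{d}\mathbf{y}.
    ```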

  11. Estimation of Full-Body Poses Using Only Five Inertial Sensors: An Eager or Lazy Learning Approach?

    PubMed Central

    Wouda, Frank J.; Giuberti, Matteo; Bellusci, Giovanni; Veltink, Peter H.

    2016-01-01

    Human movement analysis has become easier with the wide availability of motion capture systems. Inertial sensing has made it possible to capture human motion without external infrastructure, therefore allowing measurements in any environment. As high-quality motion capture data is available in large quantities, this creates possibilities to further simplify hardware setups, by use of data-driven methods to decrease the number of body-worn sensors. In this work, we contribute to this field by analyzing the capabilities of using either artificial neural networks (eager learning) or nearest neighbor search (lazy learning) for such a problem. Sparse orientation features, resulting from sensor fusion of only five inertial measurement units with magnetometers, are mapped to full-body poses. Both eager and lazy learning algorithms are shown to be capable of constructing this mapping. The full-body output poses are visually plausible with an average joint position error of approximately 7 cm, and average joint angle error of 7°. Additionally, the effects of magnetic disturbances typical in orientation tracking on the estimation of full-body poses were also investigated, where nearest neighbor search showed better performance for such disturbances. PMID:27983676
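
    A toy sketch of the "lazy learning" variant described above, mapping sparse IMU orientation features to full-body joint angles with nearest-neighbor search, is shown below. The dimensions and data are synthetic; this is not the authors' pipeline.

    ```python
    # Nearest-neighbor mapping from sparse orientation features to full-body pose.
    import numpy as np
    from sklearn.neighbors import KNeighborsRegressor

    rng = np.random.default_rng(0)
    n_frames = 5000
    X = rng.normal(size=(n_frames, 5 * 4))    # 5 IMUs x 4 quaternion components
    Y = rng.normal(size=(n_frames, 23 * 3))   # e.g., 23 joints x 3 angles (synthetic)

    model = KNeighborsRegressor(n_neighbors=5, weights="distance")
    model.fit(X, Y)                           # "training" is just storing the data

    x_query = rng.normal(size=(1, 20))        # one new frame of sparse features
    full_body_pose = model.predict(x_query)   # weighted pose of the 5 nearest frames
    ```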

  12. Semantic Likelihood Models for Bayesian Inference in Human-Robot Interaction

    NASA Astrophysics Data System (ADS)

    Sweet, Nicholas

    Autonomous systems, particularly unmanned aerial systems (UAS), remain limited in autonomous capabilities largely due to a poor understanding of their environment. Current sensors simply do not match human perceptive capabilities, impeding progress towards full autonomy. Recent work has shown the value of humans as sources of information within a human-robot team; in target applications, communicating human-generated 'soft data' to autonomous systems enables higher levels of autonomy through large, efficient information gains. This requires development of a 'human sensor model' that allows soft data fusion through Bayesian inference to update the probabilistic belief representations maintained by autonomous systems. Current human sensor models that capture linguistic inputs as semantic information are limited in their ability to generalize likelihood functions for semantic statements: they may be learned from dense data; they do not exploit the contextual information embedded within groundings; and they often limit human input to restrictive and simplistic interfaces. This work provides mechanisms to synthesize human sensor models from constraints based on easily attainable a priori knowledge, develops compression techniques to capture information-dense semantics, and investigates the problem of capturing and fusing semantic information contained within unstructured natural language. A robotic experimental testbed is also developed to validate the above contributions.

  13. Estimation of Full-Body Poses Using Only Five Inertial Sensors: An Eager or Lazy Learning Approach?

    PubMed

    Wouda, Frank J; Giuberti, Matteo; Bellusci, Giovanni; Veltink, Peter H

    2016-12-15

    Human movement analysis has become easier with the wide availability of motion capture systems. Inertial sensing has made it possible to capture human motion without external infrastructure, therefore allowing measurements in any environment. As high-quality motion capture data is available in large quantities, this creates possibilities to further simplify hardware setups, by use of data-driven methods to decrease the number of body-worn sensors. In this work, we contribute to this field by analyzing the capabilities of using either artificial neural networks (eager learning) or nearest neighbor search (lazy learning) for such a problem. Sparse orientation features, resulting from sensor fusion of only five inertial measurement units with magnetometers, are mapped to full-body poses. Both eager and lazy learning algorithms are shown to be capable of constructing this mapping. The full-body output poses are visually plausible with an average joint position error of approximately 7 cm, and average joint angle error of 7°. Additionally, the effects of magnetic disturbances typical in orientation tracking on the estimation of full-body poses were also investigated, where nearest neighbor search showed better performance for such disturbances.

  14. Capturing reflected cladding modes from a fiber Bragg grating with a double-clad fiber coupler.

    PubMed

    Baiad, Mohamad Diaa; Gagné, Mathieu; Lemire-Renaud, Simon; De Montigny, Etienne; Madore, Wendy-Julie; Godbout, Nicolas; Boudoux, Caroline; Kashyap, Raman

    2013-03-25

    We present a novel measurement scheme using a double-clad fiber coupler (DCFC) and a fiber Bragg grating (FBG) to resolve cladding modes. Direct measurement of the optical spectra and power in the cladding modes is obtained through the use of a specially designed DCFC spliced to a highly reflective FBG written into slightly etched standard photosensitive single mode fiber to match the inner cladding diameter of the DCFC. The DCFC is made by tapering and fusing two double-clad fibers (DCF) together. The device is capable of capturing backward propagating low and high order cladding modes simply and efficiently. Also, we demonstrate the capability of such a device to measure the surrounding refractive index (SRI) with an extremely high sensitivity of 69.769 ± 0.035 μW/RIU and a resolution of 1.433 × 10⁻⁵ ± 8 × 10⁻⁹ RIU between 1.37 and 1.45 RIU. The device provides a large SRI operating range from 1.30 to 1.45 RIU with sufficient discrimination for all individual captured cladding modes. The proposed scheme can be adapted to many different types of bend, temperature, refractive index and other evanescent wave based sensors.

  15. Quality and noise measurements in mobile phone video capture

    NASA Astrophysics Data System (ADS)

    Petrescu, Doina; Pincenti, John

    2011-02-01

    The quality of videos captured with mobile phones has become increasingly important particularly since resolutions and formats have reached a level that rivals the capabilities available in the digital camcorder market, and since many mobile phones now allow direct playback on large HDTVs. The video quality is determined by the combined quality of the individual parts of the imaging system including the image sensor, the digital color processing, and the video compression, each of which has been studied independently. In this work, we study the combined effect of these elements on the overall video quality. We do this by evaluating the capture under various lighting, color processing, and video compression conditions. First, we measure full reference quality metrics between encoder input and the reconstructed sequence, where the encoder input changes with light and color processing modifications. Second, we introduce a system model which includes all elements that affect video quality, including a low light additive noise model, ISP color processing, as well as the video encoder. Our experiments show that in low light conditions and for certain choices of color processing the system level visual quality may not improve when the encoder becomes more capable or the compression ratio is reduced.
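
    A full-reference metric of the kind used in the first experiment can be illustrated with PSNR between the encoder input frame and the decoded frame. This is a generic sketch with synthetic 8-bit frames, not the metric set or data used in the study.

        import numpy as np

        def psnr(reference, reconstructed, peak=255.0):
            """Full-reference quality metric between the encoder input and the
            decoded frame (higher is better)."""
            mse = np.mean((reference.astype(float) - reconstructed.astype(float)) ** 2)
            return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

        # Synthetic 8-bit luma frames: encoder input vs. decoded output with small errors.
        rng = np.random.default_rng(0)
        src = rng.integers(0, 256, size=(720, 1280), dtype=np.uint8)
        dec = np.clip(src.astype(int) + rng.integers(-4, 5, size=src.shape), 0, 255)
        print(f"PSNR: {psnr(src, dec):.1f} dB")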

  16. Collection and Analysis of Crowd Data with Aerial, Rooftop, and Ground Views

    DTIC Science & Technology

    2014-11-10

    collected these datasets using different aircrafts. Erista 8 HL OctaCopter is a heavy-lift aerial platform capable of using high-resolution cinema ...is another high-resolution camera that is cinema grade and high quality, with the capability of capturing videos with 4K resolution at 30 frames per...292.58 Imaging Systems and Accessories Blackmagic Production Camera 4 Crowd Counting using 4K Cameras High resolution cinema grade digital video

  17. Smart Sensing and Dynamic Fitting for Enhanced Comfort and Performance of Prosthetics

    DTIC Science & Technology

    2015-10-01

    developing the antenna pressure/shear sensors and the bubble actuators for pressure regulation. An antenna sensor that is capable of measuring shear and...liner materials. We have characterized the load bearing capability of bubble actuator arrays at different actuation pressures. A “limb-socket...laboratory test setup was developed for capturing the internal pressure change of bubble actuators when the “limb” was subjected to the external force. 15

  18. Exploring Innovation Capabilities of Hospital CIOs: An Empirical Assessment.

    PubMed

    Esdar, Moritz; Liebe, Jan-David; Weiß, Jan-Patrick; Hübner, Ursula

    2017-01-01

    Hospital CIOs play a central role in the adoption of innovative health IT. Until now, it remained unclear which particular conditions constitute their capability to innovate in terms of intrapersonal as well as organisational factors. An inventory of 20 items was developed to capture these conditions and examined by analysing data obtained from 164 German hospital CIOs. Principal component analysis resulted in three internally consistent components that constitute large portions of the CIOs' innovation capability: organisational innovation culture, entrepreneurship personality and openness towards users. Results were used to build composite indicators that allow further evaluations.

  19. Techno-economic assessment of polymer membrane systems for postcombustion carbon capture at coal-fired power plants.

    PubMed

    Zhai, Haibo; Rubin, Edward S

    2013-03-19

    This study investigates the feasibility of polymer membrane systems for postcombustion carbon dioxide (CO2) capture at coal-fired power plants. Using newly developed performance and cost models, our analysis shows that membrane systems configured with multiple stages or steps are capable of meeting capture targets of 90% CO2 removal efficiency and 95+% product purity. A combined driving force design using both compressors and vacuum pumps is most effective for reducing the cost of CO2 avoided. Further reductions in the overall system energy penalty and cost can be obtained by recycling a portion of CO2 via a two-stage, two-step membrane configuration with air sweep to increase the CO2 partial pressure of feed flue gas. For a typical plant with carbon capture and storage, this yielded a 15% lower cost per metric ton of CO2 avoided compared to a plant using a current amine-based capture system. A series of parametric analyses also is undertaken to identify paths for enhancing the viability of membrane-based capture technology.
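
    The headline economic figure, cost of CO2 avoided, is commonly computed as the increase in levelized cost of electricity divided by the reduction in CO2 emitted per MWh. A small sketch with purely illustrative numbers (not taken from the paper):

        def cost_of_co2_avoided(lcoe_ref, lcoe_cap, em_ref, em_cap):
            """Cost of CO2 avoided ($/tonne): extra cost of electricity divided by the
            reduction in CO2 emitted per MWh.

            lcoe_* in $/MWh, em_* in tonne CO2 per MWh."""
            return (lcoe_cap - lcoe_ref) / (em_ref - em_cap)

        # Illustrative values for a coal plant without and with a capture system.
        print(cost_of_co2_avoided(lcoe_ref=60.0, lcoe_cap=95.0, em_ref=0.80, em_cap=0.10))
        # -> 50.0 $/tonne CO2 avoided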

  20. Computational materials chemistry for carbon capture using porous materials

    NASA Astrophysics Data System (ADS)

    Sharma, Abhishek; Huang, Runhong; Malani, Ateeque; Babarao, Ravichandar

    2017-11-01

    Control over carbon dioxide (CO2) release is extremely important to decrease its hazardous effects on the environment such as global warming, ocean acidification, etc. For CO2 capture and storage at industrial point sources, nanoporous materials offer an energetically viable and economically feasible approach compared to chemisorption in amines. There is a growing need to design and synthesize new nanoporous materials with enhanced capability for carbon capture. Computational materials chemistry offers tools to screen and design cost-effective materials for CO2 separation and storage, and it is less time consuming compared to trial and error experimental synthesis. It also provides a guide to synthesize new materials with better properties for real world applications. In this review, we briefly highlight the various carbon capture technologies and the need of computational materials design for carbon capture. This review discusses the commonly used computational chemistry-based simulation methods for structural characterization and prediction of thermodynamic properties of adsorbed gases in porous materials. Finally, simulation studies reported on various potential porous materials, such as zeolites, porous carbon, metal organic frameworks (MOFs) and covalent organic frameworks (COFs), for CO2 capture are discussed.

  1. Highly efficient capture and harvest of circulating tumor cells on a microfluidic chip integrated with herringbone and micropost arrays.

    PubMed

    Xue, Peng; Wu, Yafeng; Guo, Jinhong; Kang, Yuejun

    2015-04-01

    Circulating tumor cells (CTCs), which are derived from the primary tumor site and transported to distant organs, are considered the major cause of metastasis. So far, various techniques have been applied for CTC isolation and enumeration. However, there exists great demand to improve the sensitivity of CTC capture, and it remains challenging to elute the cells efficiently from the device for further biomolecular and cellular analyses. In this study, we fabricate a dual functional chip integrated with herringbone structure and micropost array to achieve CTC capture and elution through EpCAM-based immunoreaction. Hep3B tumor cell line is selected as the model of CTCs for processing using this device. The results demonstrate that the capture limit of Hep3B cells can reach up to 10 cells (per mL of sample volume) with capture efficiency of 80% on average. Moreover, the elution rate of the captured Hep3B cells can reach up to 69.4% on average for cell number ranging from 1 to 100. These results demonstrate that this device exhibits dual functions with considerably high capture rate and elution rate, indicating its promising capability for cancer diagnosis and therapeutics.
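
    The two performance figures reported here are straightforward ratios; a tiny worked sketch with illustrative cell counts (not the study's raw data):

        def capture_efficiency(captured, spiked):
            """Percentage of spiked tumor cells retained on the chip."""
            return 100.0 * captured / spiked

        def elution_rate(eluted, captured):
            """Percentage of captured cells recovered after release."""
            return 100.0 * eluted / captured

        # Illustrative numbers only: 100 cells spiked into 1 mL of sample.
        cap = capture_efficiency(captured=80, spiked=100)   # ~80% capture
        elu = elution_rate(eluted=56, captured=80)           # ~70% elution
        print(f"capture {cap:.0f}%, elution {elu:.0f}%")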

  2. 76 FR 50493 - Notice of Availability of the Record of Decision for the Desert Sunlight Holdings, LLC, Desert...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-15

    ... a solar photovoltaic (PV) facility, capable of producing 550 MW of electrical output. Southern... Sunlight Solar Farm (DSSF) and California Desert Conservation Area Plan Amendment, California AGENCY... . SUPPLEMENTARY INFORMATION: Desert Sunlight Holdings, LLC, a wholly owned subsidiary of First Solar, Inc., filed...

  3. 75 FR 67427 - Self-Regulatory Organizations; The NASDAQ OMX PHLX LLC; Notice of Filing of Proposed Rule Change...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-02

    ... specialist unit. \\20\\ Moreover, recognizing that the market making functions of Remote Specialists and RSQTs... have constrained open outcry capabilities, have been fulfilling market making requirements on the Exchange (e.g. making two-sided market quotations) for years without specific back-up personnel...

  4. Diary of a Conversion--Lotus 1-2-3 to Symphony 1.1.

    ERIC Educational Resources Information Center

    Dunnewin, Larry

    1986-01-01

    Describes the uses of Lotus 1-2-3 (a spreadsheet-graphics-database program created by Lotus Development Corporation) and Symphony 1.1 (a refinement and expansion of Symphony 1.01 providing memory efficiency, speed, ease of use, greater file compatibility). Spreadsheet and graphics capabilities, the use of windows, database environment, and…

  5. 75 FR 71708 - Proposed Data Collections Submitted for Public Comment and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-24

    ... level data, from recruited hospitals for the year 2011 onward. A pretest of a survey supplement on acute... demographic information, clinical capabilities, and financial information. The pretest of the supplement on... and re-filing Patient Records 151 133 1/60 335 (ED, OPD, and ASC). ACUTE CORONARY SYNDROME PRETEST...

  6. On the Nets. Comparing Web Browsers: Mosaic, Cello, Netscape, WinWeb and InternetWorks Life.

    ERIC Educational Resources Information Center

    Notess, Greg R.

    1995-01-01

    World Wide Web browsers are compared by speed, setup, hypertext transport protocol (HTTP) handling, management of file transfer protocol (FTP), telnet, gopher, and wide area information server (WAIS); bookmark options; and communication functions. Netscape has the most features, the fastest retrieval, sophisticated bookmark capabilities. (JMV)

  7. Lunar Data Information Center: A Shortcut to the Riddle of the Moon

    ERIC Educational Resources Information Center

    Waranius, Frances B.; Heiken, Jody H.

    1975-01-01

    The Lunar Data Information Center is a reference and lending collection for researchers, educators, and students of lunar science, worldwide. Such methods as a classification scheme for mission-oriented documentation, sample photo browse files, lunar feature index, and color coding have resulted in a user-oriented collection. Search capability is…

  8. 75 FR 9989 - Self-Regulatory Organizations; NASDAQ OMX PHLX, Inc.; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-04

    ... Exchanges to operate a stand-alone system or ``Linkage'' for sending order-flow between exchanges to limit trade-throughs.\\6\\ The Options Clearing Corporation (``OCC'') operated the Linkage system (the ``System.... options markets are linked together on a real-time basis through a network capable of transporting orders...

  9. File level metadata generation and use for diverse airborne and in situ data: Experiences with Operation IceBridge and SnowEx

    NASA Astrophysics Data System (ADS)

    Tanner, S.; Schwab, M.; Beam, K.; Skaug, M.

    2017-12-01

    Operation IceBridge has been flying campaigns in the Arctic and Antarctic for nearly 10 years and will soon be a decadal mission. During that time, the generation and use of file level metadata has evolved from nearly non-existent to robust spatio-temporal support. This evolution has been difficult at times, but the results speak for themselves in the form of production tools for search, discovery, access and analysis. The lessons learned from this experience are now being incorporated into SnowEx, a new mission to measure snow cover using airborne and ground-based measurements. This presentation will focus on techniques for generating metadata for such a diverse set of measurements as well as the resulting tools that utilize this information. This includes the development and deployment of MetGen, a semi-automated metadata generation capability that relies on collaboration between data producers and data archivers, the newly deployed IceBridge data portal which incorporates data browse capabilities and limited in-line analysis, and programmatic access to metadata and data for incorporation into larger automated workflows.
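
    The core of file-level metadata for such airborne granules is a spatio-temporal summary. The sketch below derives a minimal record (bounding box plus temporal extent) from along-track points; the file name, field names, and values are hypothetical, and this is a generic illustration rather than the MetGen tool itself.

        import json
        import numpy as np

        def file_level_metadata(filename, lats, lons, times_iso):
            """Build a minimal spatio-temporal metadata record for one data file:
            bounding box plus temporal extent, the core of what search and
            discovery tools need."""
            return {
                "filename": filename,
                "spatial_extent": {
                    "west": float(np.min(lons)), "east": float(np.max(lons)),
                    "south": float(np.min(lats)), "north": float(np.max(lats)),
                },
                "temporal_extent": {"begin": min(times_iso), "end": max(times_iso)},
            }

        # Hypothetical along-track points from one flight-line granule.
        meta = file_level_metadata(
            "flightline_granule_example.h5",
            lats=[69.1, 69.4, 69.9], lons=[-49.8, -49.5, -49.1],
            times_iso=["2017-03-25T14:02:11Z", "2017-03-25T14:40:57Z"],
        )
        print(json.dumps(meta, indent=2))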

  10. The impact of workplace factors on filing of workers’ compensation claims among nursing home workers

    PubMed Central

    2014-01-01

    Background Injuries reported to workers’ compensation (WC) system are often used to estimate incidence of health outcomes and evaluate interventions in musculoskeletal epidemiology studies. However, WC claims represent a relatively small subset of all musculoskeletal disorders among employed individuals, and perhaps not a representative subset. This study determined the influence of workplace and individual factors on filing of workers’ compensation claims by nursing home employees with back pain. Methods Surveys were conducted in 18 skilled nursing facilities in four U.S. states. Self-administered questionnaires obtained information on demographic characteristics, working environment, and health behaviors/status. Employees who reported low back pain at least once in four questionnaire surveys were included. WC claims from the same facilities were obtained from the employer’s workers compensation insurer and matched by employee name. The dichotomous dependent variable was filing of back-related worker’s compensation claim. Association with predictors of interest, including pain severity, physical job demand, job strain, social support, schedule control, and safety climate, was assessed using multivariate regression modeling. Individual characteristics were tested as potential confounders. Results Pain severity level was significantly associated with filing low-back related claims (odds ratio (OR) = 1.49, 95% CI = 1.18 – 1.87). Higher physical demands at work (OR = 1.07, 95% CI = 1.01 – 1.14) also increased the likelihood of claim filing. Higher job strain (OR = 0.83, 95% CI = 0.73 – 0.94), social support at work (OR = 0.90, 95% CI = 0.82 – 0.99), and education (OR = 0.79, 95% CI = 0.71 – 0.89) decreased the likelihood of claim filing. Conclusions The results suggest that the WC system captured the most severe occupational injuries. Workplace factors had additional influence on workers’ decision to file claims, after adjusting for low back pain severity. Education was correlated with worker’s socioeconomic status; its influence on claim filing is difficult to interpret because of the possible mixed effects of working conditions, self-efficacy, and content knowledge. PMID:24476529

  11. DyHAP: Dynamic Hybrid ANFIS-PSO Approach for Predicting Mobile Malware.

    PubMed

    Afifi, Firdaus; Anuar, Nor Badrul; Shamshirband, Shahaboddin; Choo, Kim-Kwang Raymond

    2016-01-01

    To deal with the large number of malicious mobile applications (e.g. mobile malware), a number of malware detection systems have been proposed in the literature. In this paper, we propose a hybrid method to find the optimum parameters that can be used to facilitate mobile malware identification. We also present a multi agent system architecture comprising three system agents (i.e. sniffer, extraction and selection agent) to capture and manage the pcap file for data preparation phase. In our hybrid approach, we combine an adaptive neuro fuzzy inference system (ANFIS) and particle swarm optimization (PSO). Evaluations using data captured on a real-world Android device and the MalGenome dataset demonstrate the effectiveness of our approach, in comparison to two hybrid optimization methods which are differential evolution (ANFIS-DE) and ant colony optimization (ANFIS-ACO).
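
    The sniffer/extraction stage can be pictured as reducing a pcap capture to a handful of traffic features for the selection agent. A hedged sketch using the scapy library (assumed to be installed; the pcap path is hypothetical, and the feature set is illustrative rather than the one used in the paper):

        # Requires scapy (pip install scapy); the file name below is hypothetical.
        from scapy.all import rdpcap

        def traffic_features(pcap_path):
            """Summarize a capture into a few simple features that a selection agent
            could pass to a classifier (packet count, bytes, duration, mean size)."""
            packets = rdpcap(pcap_path)
            sizes = [len(p) for p in packets]
            times = [float(p.time) for p in packets]
            return {
                "n_packets": len(packets),
                "total_bytes": sum(sizes),
                "duration_s": max(times) - min(times) if times else 0.0,
                "mean_pkt_size": sum(sizes) / len(sizes) if sizes else 0.0,
            }

        print(traffic_features("capture_sample.pcap"))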

  12. DyHAP: Dynamic Hybrid ANFIS-PSO Approach for Predicting Mobile Malware

    PubMed Central

    Afifi, Firdaus; Anuar, Nor Badrul; Shamshirband, Shahaboddin

    2016-01-01

    To deal with the large number of malicious mobile applications (e.g. mobile malware), a number of malware detection systems have been proposed in the literature. In this paper, we propose a hybrid method to find the optimum parameters that can be used to facilitate mobile malware identification. We also present a multi agent system architecture comprising three system agents (i.e. sniffer, extraction and selection agent) to capture and manage the pcap file for data preparation phase. In our hybrid approach, we combine an adaptive neuro fuzzy inference system (ANFIS) and particle swarm optimization (PSO). Evaluations using data captured on a real-world Android device and the MalGenome dataset demonstrate the effectiveness of our approach, in comparison to two hybrid optimization methods which are differential evolution (ANFIS-DE) and ant colony optimization (ANFIS-ACO). PMID:27611312

  13. Capture and fission with DANCE and NEUANCE

    DOE PAGES

    Jandel, M.; Baramsai, B.; Bond, E.; ...

    2015-12-23

    A summary of the current and future experimental program at DANCE is presented. Measurements of neutron capture cross sections are planned for many actinide isotopes with the goal to reduce the present uncertainties in nuclear data libraries. Detailed studies of capture gamma rays in the neutron resonance region will be performed in order to derive correlated data on the de-excitation of the compound nucleus. New approaches on how to remove the DANCE detector response from experimental data and retain the correlations between the cascade gamma rays are presented. Studies on 235U are focused on quantifying the population of short-lived isomeric states in 236U after neutron capture. For this purpose, a new neutron detector array NEUANCE is under construction. It will be installed in the central cavity of the DANCE array and enable the highly efficient tagging of fission and capture events. In addition, developments of fission fragment detectors are also underway to expand DANCE capabilities to measurements of fully correlated data on fission observables.

  14. Current and Future Research at DANCE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jandel, M.; Baramsai, B.; Bredeweg, T. A.

    2015-05-28

    An overview of the current experimental program on measurements of neutron capture and neutron induced fission at the Detector for Advanced Neutron Capture Experiments (DANCE) is presented. Three major projects are currently under way: 1) high precision measurements of neutron capture cross sections on Uranium isotopes, 2) research aimed at studies of the short-lived actinide isomer production in neutron capture on 235U and 3) measurements of correlated data of fission observables. New projects include developments of auxiliary detectors to improve the capability of DANCE. We are building a compact, segmented NEUtron detector Array at DANCE (NEUANCE), which will be installed in the central cavity of the DANCE array. It will thus provide experimental information on prompt fission neutrons in coincidence with the prompt fission gamma-rays measured by 160 BaF2 crystals of DANCE. Additionally, unique correlated data will be obtained for neutron capture and neutron-induced fission using the DANCE-NEUANCE experimental set up in the future.

  15. Capture and fission with DANCE and NEUANCE

    NASA Astrophysics Data System (ADS)

    Jandel, M.; Baramsai, B.; Bond, E.; Rusev, G.; Walker, C.; Bredeweg, T. A.; Chadwick, M. B.; Couture, A.; Fowler, M. M.; Hayes, A.; Kawano, T.; Mosby, S.; Stetcu, I.; Taddeucci, T. N.; Talou, P.; Ullmann, J. L.; Vieira, D. J.; Wilhelmy, J. B.

    2015-12-01

    A summary of the current and future experimental program at DANCE is presented. Measurements of neutron capture cross sections are planned for many actinide isotopes with the goal to reduce the present uncertainties in nuclear data libraries. Detailed studies of capture gamma rays in the neutron resonance region will be performed in order to derive correlated data on the de-excitation of the compound nucleus. New approaches on how to remove the DANCE detector response from experimental data and retain the correlations between the cascade gamma rays are presented. Studies on 235U are focused on quantifying the population of short-lived isomeric states in 236U after neutron capture. For this purpose, a new neutron detector array NEUANCE is under construction. It will be installed in the central cavity of the DANCE array and enable the highly efficient tagging of fission and capture events. In addition, developments of fission fragment detectors are also underway to expand DANCE capabilities to measurements of fully correlated data on fission observables.

  16. To modify the definition of armor piercing ammunition to better capture its capabilities.

    THOMAS, 113th Congress

    Rep. Speier, Jackie [D-CA-14

    2013-06-27

    House - 07/15/2013 Referred to the Subcommittee on Crime, Terrorism, Homeland Security, and Investigations. This bill has the status: Introduced.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    T.Rex is used to explore tabular data sets containing up to ten million records to help rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools and supports robust export capabilities so that selected data can be incorporated into other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and perform trial and error guesses on the structure of the data; and also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, trends within the content, and capture knowledge about the content. Finally, T.Rex has a rich export capability, to extract relevant subsets of a larger data source, to further analyze their data in other analytic tools.

  18. Introducing WISDEM:An Integrated System Modeling for Wind Turbines and Plant (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dykes, K.; Graf, P.; Scott, G.

    2015-01-01

    The National Wind Technology Center wind energy systems engineering initiative has developed an analysis platform to leverage its research capabilities toward integrating wind energy engineering and cost models across wind plants. This Wind-Plant Integrated System Design & Engineering Model (WISDEM) platform captures the important interactions between various subsystems to achieve a better understanding of how to improve system-level performance and achieve system-level cost reductions. This work illustrates a few case studies with WISDEM that focus on the design and analysis of wind turbines and plants at different system levels.

  19. Object onset and parvocellular guidance of attentional allocation.

    PubMed

    Cole, Geoff G; Kentridge, Robert W; Heywood, Charles A

    2005-04-01

    The parvocellular visual pathway in the primate brain is known to be involved with the processing of color. However, a subject of debate is whether an abrupt change in color, conveyed via this pathway, is capable of automatically attracting attention. It has been shown that the appearance of new objects defined solely by color is indeed capable of modulating attention. However, given evidence suggesting that the visual system is particularly sensitive to new onsets, it is unclear to what extent such results reflect effects of color change per se, rather than effects of object onset. We assessed attentional capture by color change that occurred as a result of either new objects appearing or already-present "old" objects changing color. Results showed that although new object onsets accrued attention, changing the color of old objects did not. We conclude that abrupt color change per se is not sufficient to capture attention.

  20. T.Rex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-06-08

    T.Rex is used to explore tabular data sets containing up to ten million records to help rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools and supports robust export capabilities so that selected data can be incorporated into other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and perform trial and error guesses on the structure of the data; and also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, trends within the content, and capture knowledge about the content. Finally, T.Rex has a rich export capability, to extract relevant subsets of a larger data source, to further analyze their data in other analytic tools.

  1. Capture zones for simple aquifers

    USGS Publications Warehouse

    McElwee, Carl D.

    1991-01-01

    Capture zones showing the area influenced by a well within a certain time are useful for both aquifer protection and cleanup. If hydrodynamic dispersion is neglected, a deterministic curve defines the capture zone. Analytical expressions for the capture zones can be derived for simple aquifers. However, the capture zone equations are transcendental and cannot be explicitly solved for the coordinates of the capture zone boundary. Fortunately, an iterative scheme allows the solution to proceed quickly and efficiently even on a modest personal computer. Three forms of the analytical solution must be used in an iterative scheme to cover the entire region of interest, after the extreme values of the x coordinate are determined by an iterative solution. The resulting solution is a discrete one, and usually 100-1000 intervals along the x-axis are necessary for a smooth definition of the capture zone. The presented program is written in FORTRAN and has been used in a variety of computing environments. No graphics capability is included with the program; it is assumed the user has access to a commercial package. The superposition of capture zones for multiple wells is expected to be satisfactory if the spacing is not too close. Because this program deals with simple aquifers, the results rarely will be the final word in a real application.
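
    The flavour of the iterative scheme can be shown on the simplest related case, the steady-state capture-zone envelope of a single well in uniform flow, whose bounding streamline satisfies the transcendental relation y = Q/(2πBU)·(π − atan2(y, x)). The sketch below solves it by fixed-point iteration; it is a simplified stand-in for the program's time-dependent capture zones, and the parameter values are illustrative only.

        import numpy as np

        def capture_zone_halfwidth(x, Q, B, U, tol=1e-10, max_iter=200):
            """Upper boundary y(x) of the steady-state capture zone of a single well
            at the origin in uniform flow of Darcy velocity U (ambient flow in the
            -x direction), from fixed-point iteration of the transcendental
            streamline equation  y = Q/(2*pi*B*U) * (pi - atan2(y, x))."""
            c = Q / (2.0 * np.pi * B * U)
            y = c * np.pi / 2.0                      # starting guess: half-width at the well
            for _ in range(max_iter):
                y_new = c * (np.pi - np.arctan2(y, x))
                if abs(y_new - y) < tol:
                    break
                y = y_new
            return y

        # Illustrative parameters: Q = 500 m^3/d, B = 20 m, U = 0.05 m/d.
        for x in (-1.0, 0.0, 100.0, 1000.0):
            print(f"x = {x:7.1f} m  ->  half-width {capture_zone_halfwidth(x, 500, 20, 0.05):7.1f} m")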

  2. Large-scale electrophysiology: acquisition, compression, encryption, and storage of big data.

    PubMed

    Brinkmann, Benjamin H; Bower, Mark R; Stengel, Keith A; Worrell, Gregory A; Stead, Matt

    2009-05-30

    The use of large-scale electrophysiology to obtain high spatiotemporal resolution brain recordings (>100 channels) capable of probing the range of neural activity from local field potential oscillations to single-neuron action potentials presents new challenges for data acquisition, storage, and analysis. Our group is currently performing continuous, long-term electrophysiological recordings in human subjects undergoing evaluation for epilepsy surgery using hybrid intracranial electrodes composed of up to 320 micro- and clinical macroelectrode arrays. DC-capable amplifiers, sampling at 32kHz per channel with 18-bits of A/D resolution are capable of resolving extracellular voltages spanning single-neuron action potentials, high frequency oscillations, and high amplitude ultra-slow activity, but this approach generates 3 terabytes of data per day (at 4 bytes per sample) using current data formats. Data compression can provide several practical benefits, but only if data can be compressed and appended to files in real-time in a format that allows random access to data segments of varying size. Here we describe a state-of-the-art, scalable, electrophysiology platform designed for acquisition, compression, encryption, and storage of large-scale data. Data are stored in a file format that incorporates lossless data compression using range-encoded differences, a 32-bit cyclically redundant checksum to ensure data integrity, and 128-bit encryption for protection of patient information.
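
    The storage scheme described combines difference encoding, lossless compression, an integrity checksum, and encryption. The sketch below shows the first three steps for one appendable block, using delta encoding plus plain zlib compression as a stand-in for the paper's range-encoded differences and omitting the encryption layer.

        import zlib
        import numpy as np

        def compress_block(samples):
            """Pack one block of int32 samples: delta-encode, losslessly compress,
            and append a CRC-32 so corrupted blocks can be detected on read.
            (A stand-in for the paper's range-encoded differences; a real file
            would also encrypt the payload, e.g. with 128-bit AES.)"""
            deltas = np.diff(samples, prepend=0).astype(np.int32)
            payload = zlib.compress(deltas.tobytes(), 6)
            crc = zlib.crc32(payload)
            return len(payload).to_bytes(4, "little") + crc.to_bytes(4, "little") + payload

        def decompress_block(block):
            size = int.from_bytes(block[:4], "little")
            crc = int.from_bytes(block[4:8], "little")
            payload = block[8:8 + size]
            assert zlib.crc32(payload) == crc, "block failed integrity check"
            deltas = np.frombuffer(zlib.decompress(payload), dtype=np.int32)
            return np.cumsum(deltas)

        sig = (1000 * np.sin(np.linspace(0, 20, 32000))).astype(np.int32)  # ~1 s at 32 kHz
        blk = compress_block(sig)
        assert np.array_equal(decompress_block(blk), sig)
        print(f"{sig.nbytes} bytes -> {len(blk)} bytes")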

  3. Pulseq-Graphical Programming Interface: Open source visual environment for prototyping pulse sequences and integrated magnetic resonance imaging algorithm development.

    PubMed

    Ravi, Keerthi Sravan; Potdar, Sneha; Poojar, Pavan; Reddy, Ashok Kumar; Kroboth, Stefan; Nielsen, Jon-Fredrik; Zaitsev, Maxim; Venkatesan, Ramesh; Geethanath, Sairam

    2018-03-11

    To provide a single open-source platform for comprehensive MR algorithm development inclusive of simulations, pulse sequence design and deployment, reconstruction, and image analysis. We integrated the "Pulseq" platform for vendor-independent pulse programming with Graphical Programming Interface (GPI), a scientific development environment based on Python. Our integrated platform, Pulseq-GPI, permits sequences to be defined visually and exported to the Pulseq file format for execution on an MR scanner. For comparison, Pulseq files using either MATLAB only ("MATLAB-Pulseq") or Python only ("Python-Pulseq") were generated. We demonstrated three fundamental sequences on a 1.5 T scanner. Execution times of the three variants of implementation were compared on two operating systems. In vitro phantom images indicate equivalence with the vendor supplied implementations and MATLAB-Pulseq. The examples demonstrated in this work illustrate the unifying capability of Pulseq-GPI. The execution times of all the three implementations were fast (a few seconds). The software is capable of user-interface based development and/or command line programming. The tool demonstrated here, Pulseq-GPI, integrates the open-source simulation, reconstruction and analysis capabilities of GPI Lab with the pulse sequence design and deployment features of Pulseq. Current and future work includes providing an ISMRMRD interface and incorporating Specific Absorption Ratio and Peripheral Nerve Stimulation computations. Copyright © 2018 Elsevier Inc. All rights reserved.

  4. Large-scale Electrophysiology: Acquisition, Compression, Encryption, and Storage of Big Data

    PubMed Central

    Brinkmann, Benjamin H.; Bower, Mark R.; Stengel, Keith A.; Worrell, Gregory A.; Stead, Matt

    2009-01-01

    The use of large-scale electrophysiology to obtain high spatiotemporal resolution brain recordings (>100 channels) capable of probing the range of neural activity from local field potential oscillations to single neuron action potentials presents new challenges for data acquisition, storage, and analysis. Our group is currently performing continuous, long-term electrophysiological recordings in human subjects undergoing evaluation for epilepsy surgery using hybrid intracranial electrodes composed of up to 320 micro- and clinical macroelectrode arrays. DC-capable amplifiers, sampling at 32 kHz per channel with 18-bits of A/D resolution are capable of resolving extracellular voltages spanning single neuron action potentials, high frequency oscillations, and high amplitude ultraslow activity, but this approach generates 3 terabytes of data per day (at 4 bytes per sample) using current data formats. Data compression can provide several practical benefits, but only if data can be compressed and appended to files in real-time in a format that allows random access to data segments of varying size. Here we describe a state-of-the-art, scalable, electrophysiology platform designed for acquisition, compression, encryption, and storage of large-scale data. Data are stored in a file format that incorporates lossless data compression using range encoded differences, a 32-bit cyclically redundant checksum to ensure data integrity, and 128-bit encryption for protection of patient information. PMID:19427545

  5. Reconciling ethical and economic conceptions of value in health policy using the capabilities approach: A qualitative investigation of Non-Invasive Prenatal Testing.

    PubMed

    Kibel, Mia; Vanstone, Meredith

    2017-12-01

    When evaluating new morally complex health technologies, policy decision-makers consider a broad range of different evaluations, which may include the technology's clinical effectiveness, cost effectiveness, and social or ethical implications. This type of holistic assessment is challenging, because each of these evaluations may be grounded in different and potentially contradictory assumptions about the technology's value. One such technology where evaluations conflict is Non-Invasive Prenatal Testing (NIPT). Cost-effectiveness evaluations of NIPT often assess NIPT's ability to deliver on goals (i.e preventing the birth of children with disabilities) that social and ethical analyses suggest it should not have. Thus, cost effectiveness analyses frequently contradict social and ethical assessments of NIPT's value. We use the case of NIPT to explore how economic evaluations using a capabilities approach may be able to capture a broader, more ethical view of the value of NIPT. The capabilities approach is an evaluative framework which bases wellbeing assessments on a person's abilities, rather than their expressed preferences. It is linked to extra-welfarist approaches in health economic assessment. Beginning with Nussbaum's capability framework, we conducted a directed qualitative content analysis of interview data collected in 2014 from 27 Canadian women with personal experience of NIPT. We found that eight of Nussbaum's ten capabilities related to options, states, or choices that women valued in the context of NIPT, and identified one new capability. Our findings suggest that women value NIPT for its ability to provide more and different choices in the prenatal care pathway, and that a capabilities approach can indeed capture the value of NIPT in a way that goes beyond measuring health outcomes of ambiguous social and ethical value. More broadly, the capabilities approach may serve to resolve contradictions between ethical and economic evaluations of health technologies, and contribute to extra-welfarist approaches in the assessment of morally complex health technologies. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lindsey, Nicholas C.

    The growth of additive manufacturing as a disruptive technology poses nuclear proliferation concerns worthy of serious consideration. Additive manufacturing began in the early 1980s with technological advances in polymer manipulation, computer capabilities, and computer-aided design (CAD) modeling. It was originally limited to rapid prototyping; however, it eventually developed into a complete means of production that has slowly penetrated the consumer market. Today, additive manufacturing machines can produce complex and unique items in a vast array of materials including plastics, metals, and ceramics. These capabilities have democratized the manufacturing industry, allowing almost anyone to produce items as simple as cup holders or as complex as jet fuel nozzles. Additive manufacturing, or three-dimensional (3D) printing as it is commonly called, relies on CAD files created or shared by individuals with additive manufacturing machines to produce a 3D object from a digital model. This sharing of files means that a 3D object can be scanned or rendered as a CAD model in one country, and then downloaded and printed in another country, allowing items to be shared globally without physically crossing borders. The sharing of CAD files online has been a challenging task for the export controls regime to manage over the years, and additive manufacturing could make these transfers more common. In this sense, additive manufacturing is a disruptive technology not only within the manufacturing industry but also within the nuclear nonproliferation world. This paper provides an overview of additive manufacturing concerns of proliferation.

  7. IAC - INTEGRATED ANALYSIS CAPABILITY

    NASA Technical Reports Server (NTRS)

    Frisch, H. P.

    1994-01-01

    The objective of the Integrated Analysis Capability (IAC) system is to provide a highly effective, interactive analysis tool for the integrated design of large structures. With the goal of supporting the unique needs of engineering analysis groups concerned with interdisciplinary problems, IAC was developed to interface programs from the fields of structures, thermodynamics, controls, and system dynamics with an executive system and database to yield a highly efficient multi-disciplinary system. Special attention is given to user requirements such as data handling and on-line assistance with operational features, and the ability to add new modules of the user's choice at a future date. IAC contains an executive system, a data base, general utilities, interfaces to various engineering programs, and a framework for building interfaces to other programs. IAC has shown itself to be effective in automatic data transfer among analysis programs. IAC 2.5, designed to be compatible as far as possible with Level 1.5, contains a major upgrade in executive and database management system capabilities, and includes interfaces to enable thermal, structures, optics, and control interaction dynamics analysis. The IAC system architecture is modular in design. 1) The executive module contains an input command processor, an extensive data management system, and driver code to execute the application modules. 2) Technical modules provide standalone computational capability as well as support for various solution paths or coupled analyses. 3) Graphics and model generation interfaces are supplied for building and viewing models. Advanced graphics capabilities are provided within particular analysis modules such as INCA and NASTRAN. 4) Interface modules provide for the required data flow between IAC and other modules. 5) User modules can be arbitrary executable programs or JCL procedures with no pre-defined relationship to IAC. 6) Special purpose modules are included, such as MIMIC (Model Integration via Mesh Interpolation Coefficients), which transforms field values from one model to another; LINK, which simplifies incorporation of user specific modules into IAC modules; and DATAPAC, the National Bureau of Standards statistical analysis package. The IAC database contains structured files which provide a common basis for communication between modules and the executive system, and can contain unstructured files such as NASTRAN checkpoint files, DISCOS plot files, object code, etc. The user can define groups of data and relations between them. A full data manipulation and query system operates with the database. The current interface modules comprise five groups: 1) Structural analysis - IAC contains a NASTRAN interface for standalone analysis or certain structural/control/thermal combinations. IAC provides enhanced structural capabilities for normal modes and static deformation analysis via special DMAP sequences. IAC 2.5 contains several specialized interfaces from NASTRAN in support of multidisciplinary analysis. 2) Thermal analysis - IAC supports finite element and finite difference techniques for steady state or transient analysis. There are interfaces for the NASTRAN thermal analyzer, SINDA/SINFLO, and TRASYS II. FEMNET, which converts finite element structural analysis models to finite difference thermal analysis models, is also interfaced with the IAC database. 
3) System dynamics - The DISCOS simulation program, which allows for either nonlinear time domain analysis or linear frequency domain analysis, is fully interfaced to the IAC database management capability. 4) Control analysis - Interfaces for the ORACLS, SAMSAN, NBOD2, and INCA programs allow a wide range of control system analyses and synthesis techniques. Level 2.5 includes EIGEN, which provides tools for large order system eigenanalysis, and BOPACE, which allows for geometric capabilities and finite element analysis with nonlinear material. Also included in IAC level 2.5 is SAMSAN 3.1, an engineering analysis program which contains a general purpose library of over 600 subroutines.

  8. Shipboard Analytical Capabilities on the Renovated JOIDES Resolution, IODP Riserless Drilling Vessel

    NASA Astrophysics Data System (ADS)

    Blum, P.; Foster, P.; Houpt, D.; Bennight, C.; Brandt, L.; Cobine, T.; Crawford, W.; Fackler, D.; Fujine, K.; Hastedt, M.; Hornbacher, D.; Mateo, Z.; Moortgat, E.; Vasilyev, M.; Vasilyeva, Y.; Zeliadt, S.; Zhao, J.

    2008-12-01

    The JOIDES Resolution (JR) has conducted 121 scientific drilling expeditions during the Ocean Drilling Program (ODP) and the first phase of the Integrated Ocean Drilling Program (IODP) (1983-2006). The vessel and scientific systems have just completed an NSF-sponsored renovation (2005-2008). Shipboard analytical systems have been upgraded, within funding constraints imposed by market driven vessel conversion cost increases, to include: (1) enhanced shipboard analytical services including instruments and software for sampling and the capture of chemistry, physical properties, and geological data; (2) new data management capabilities built around a laboratory information management system (LIMS), digital asset management system, and web services; (3) operations data services with enhanced access to navigation and rig instrumentation data; and (4) a combination of commercial and home-made user applications for workflow- specific data extractions, generic and customized data reporting, and data visualization within a shipboard production environment. The instrumented data capture systems include a new set of core loggers for rapid and non-destructive acquisition of images and other physical properties data from drill cores. Line-scan imaging and natural gamma ray loggers capture data at unprecedented quality due to new and innovative designs. Many instruments used to characterize chemical compounds of rocks, sediments, and interstitial fluids were upgraded with the latest technology. The shipboard analytical environment features a new and innovative framework (DESCinfo) and application (DESClogik) for capturing descriptive and interpretive data from geological sub-domains such as sedimentology, petrology, paleontology, structural geology, stratigraphy, etc. This system fills a long-standing gap by providing a global database, controlled vocabularies and taxa name lists with version control, a highly configurable spreadsheet environment for data capture, and visualization of context data collected with the shipboard core loggers and other instruments.

  9. FRAMES Metadata Reporting Templates for Ecohydrological Observations, version 1.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christianson, Danielle; Varadharajan, Charuleka; Christoffersen, Brad

    FRAMES is a set of Excel metadata files and package-level descriptive metadata that are designed to facilitate and improve capture of desired metadata for ecohydrological observations. The metadata are bundled with data files into a data package and submitted to a data repository (e.g. the NGEE Tropics Data Repository) via a web form. FRAMES standardizes reporting of diverse ecohydrological and biogeochemical data for synthesis across a range of spatiotemporal scales and incorporates many best data science practices. This version of FRAMES supports observations for primarily automated measurements collected by permanently located sensors, including sap flow (tree water use), leaf surface temperature, soil water content, dendrometry (stem diameter growth increment), and solar radiation. Version 1.1 extends the controlled vocabulary and incorporates functionality to facilitate programmatic use of data and FRAMES metadata (R code available at NGEE Tropics Data Repository).
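
    One practical benefit of such templates is that reported variable names can be checked against the controlled vocabulary before a data package is submitted. The sketch below illustrates that kind of check; the vocabulary terms and variable names are hypothetical stand-ins, not the actual FRAMES vocabulary.

        # Hypothetical controlled vocabulary for sensor variable names.
        CONTROLLED_VOCAB = {"sap_flow", "leaf_surface_temperature", "soil_water_content",
                            "stem_diameter_increment", "solar_radiation"}

        def validate_variables(reported_variables):
            """Return the reported variable names that are not in the vocabulary."""
            return sorted(set(reported_variables) - CONTROLLED_VOCAB)

        bad = validate_variables(["sap_flow", "leaf_temp", "solar_radiation"])
        print("unrecognized variables:", bad)   # -> ['leaf_temp']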

  10. Artificial Intelligent Platform as Decision Tool for Asset Management, Operations and Maintenance.

    PubMed

    2018-01-04

    An Artificial Intelligence (AI) system has been developed and implemented for water, wastewater and reuse plants to improve management of sensors, short and long term maintenance plans, asset and investment management plans. It is based on an integrated approach to capture data from different computer systems and files. It adds a layer of intelligence to the data. It serves as a repository of key current and future operations and maintenance conditions that a plant needs to have knowledge of. With this information, it is able to simulate the configuration of processes and assets for those conditions to improve or optimize operations, maintenance and asset management, using the IViewOps (Intelligent View of Operations) model. Based on the optimization through model runs, it is able to create output files that can feed data to other systems and inform the staff regarding optimal solutions to the conditions experienced or anticipated in the future.

  11. Deployment Area Selection and Land Withdrawal/Acquisition. M-X/MPS (M-X/Multiple Protective Shelter) Environmental Technical Report. Washington County, Utah.

    DTIC Science & Technology

    1981-10-02

    file a FEIS for the MPS system. However, the attached preliminary FEIS captures the environmental data and analysis in the document that was nearing...efforts in the study area. Therefore, in response to requests for environmental technical data from the Congress, federal agencies and the states involved...associated Environmental Technical Reports (ETRs). The data tables presented here provide projections of the key socioeconomic impacts of M-X deployment for

  12. Automated knowledge base development from CAD/CAE databases

    NASA Technical Reports Server (NTRS)

    Wright, R. Glenn; Blanchard, Mary

    1988-01-01

    Knowledge base development requires a substantial investment in time, money, and resources in order to capture the knowledge and information necessary for anything other than trivial applications. This paper addresses a means to integrate the design and knowledge base development process through automated knowledge base development from CAD/CAE databases and files. Benefits of this approach include the development of a more efficient means of knowledge engineering, resulting in the timely creation of large knowledge based systems that are inherently free of error.

  13. Multi-scale and multi-physics simulations using the multi-fluid plasma model

    DTIC Science & Technology

    2017-04-25

    The simulation uses 512 second-order elements; Bz = 1.0, Te = Ti = 0.01, ui = ue = 0, ne = ni = 1.0 + e^(-10(x-6)^2) (Baboolal, Math. and Comp. Sim. 55)... SUMMARY: The blended finite element method (BFEM) is presented: DG spatial discretization with explicit Runge-Kutta time stepping (i+, n); CG spatial discretization with implicit Crank-Nicolson (e-, fields). DG captures shocks and discontinuities; CG is efficient and robust for

  14. Dual-responsive surfaces modified with phenylboronic acid-containing polymer brush to reversibly capture and release cancer cells.

    PubMed

    Liu, Hongliang; Li, Yingying; Sun, Kang; Fan, Junbing; Zhang, Pengchao; Meng, Jingxin; Wang, Shutao; Jiang, Lei

    2013-05-22

    Artificial stimuli-responsive surfaces that can mimic the dynamic function of living systems have attracted much attention. However, there exist few artificial systems capable of responding to dual- or multistimulation as the natural system does. Herein, we synthesize a pH and glucose dual-responsive surface by grafting poly(acrylamidophenylboronic acid) (polyAAPBA) brush from aligned silicon nanowire (SiNW) array. The as-prepared surface can reversibly capture and release targeted cancer cells by precisely controlling pH and glucose concentration, exhibiting dual-responsive AND logic. In the presence of 70 mM glucose, the surface is pH responsive, which can vary from a cell-adhesive state to a cell-repulsive state by changing the pH from 6.8 to 7.8. While keeping the pH at 7.8, the surface becomes glucose responsive--capturing cells in the absence of glucose and releasing cells by adding 70 mM glucose. Through simultaneously changing the pH and glucose concentration from pH 6.8/0 mM glucose to pH 7.8/70 mM glucose, the surface is dual responsive with the capability to switch between cell capture and release for at least 5 cycles. The cell capture and release process on this dual-responsive surface is noninvasive with cell viability higher than 95%. Moreover, topographical interaction between the aligned SiNW array and cell protrusions greatly amplifies the responsiveness and accelerates the response rate of the dual-responsive surface between cell capture and release. The responsive mechanism of the dual-responsive surface is systematically studied using a quartz crystal microbalance, which shows that the competitive binding between polyAAPBA/sialic acid and polyAAPBA/glucose contributes to the dual response. Such dual-responsive surface can significantly impact biomedical and biological applications including cell-based diagnostics, in vivo drug delivery, etc.

  15. C2x: A tool for visualisation and input preparation for CASTEP and other electronic structure codes

    NASA Astrophysics Data System (ADS)

    Rutter, M. J.

    2018-04-01

    The c2x code fills two distinct roles. Its first role is in acting as a converter between the binary format .check files from the widely-used CASTEP [1] electronic structure code and various visualisation programs. Its second role is to manipulate and analyse the input and output files from a variety of electronic structure codes, including CASTEP, ONETEP and VASP, as well as the widely-used 'Gaussian cube' file format. Analysis includes symmetry analysis, and manipulation includes arbitrary cell transformations. It continues to be under development, with growing functionality, and is written in a form which would make it easy to extend it to working directly with files from other electronic structure codes. Data which c2x is capable of extracting from CASTEP's binary checkpoint files include charge densities, spin densities, wavefunctions, relaxed atomic positions, forces, the Fermi level, the total energy, and symmetry operations. It can recreate .cell input files from checkpoint files. Volumetric data can be output in formats useable by many common visualisation programs, and c2x will itself calculate integrals, expand data into supercells, and interpolate data via combinations of Fourier and trilinear interpolation. It can extract data along arbitrary lines (such as lines between atoms) as 1D output. C2x is able to convert between several common formats for describing molecules and crystals, including the .cell format of CASTEP. It can construct supercells, reduce cells to their primitive form, and add specified k-point meshes. It uses the spglib library [2] to report symmetry information, which it can add to .cell files. C2x is a command-line utility, so is readily included in scripts. It is available under the GPL and can be obtained from http://www.c2x.org.uk. It is believed to be the only open-source code which can read CASTEP's .check files, so it will have utility in other projects.

  16. Immuno capture PCR for rapid and sensitive identification of pathogenic Bacillus anthracis.

    PubMed

    Makam, Shivakiran S; Majumder, Saugata; Kingston, Joseph J; Urs, Radhika M; Tuteja, Urmil; Sripathi, Murali H; Batra, Harsh V

    2013-12-01

    Immuno capture PCR (IPCR) is a technique capable of detecting pathogens with high specificity and sensitivity. Rapid and accurate detection of Bacillus anthracis was achieved using anti-EA1 antibodies to capture the cells and two primer sets targeting the virulence factors of the pathogen, i.e., protective antigen (pag) and capsule (cap), in an IPCR format. Monoclonal antibodies specific to B. anthracis were generated against extractable antigen 1 protein and used as capture antibody on 96-well polystyrene plates. Following the binding of the pathogen, the DNA extraction was carried out in the well itself and further processed for PCR assay. We compared the IPCR described here with conventional duplex PCR using the same primers and with sandwich ELISA using the monoclonal antibodies developed in the present study. IPCR was capable of detecting as few as 10 and 100 cfu ml⁻¹ of bacterial cells and spores, respectively. IPCR was found to be 2-3 logs more sensitive than conventional duplex PCR and the sandwich ELISA. The effect of other bacteria and organic materials on IPCR was also analyzed, and the method was found to be robust, with little change in sensitivity in the presence of interfering agents. Moreover, we could demonstrate a simple microwave treatment process for disrupting spores, which are otherwise resistant to chemical treatments. The IPCR could also clearly distinguish pathogenic from nonpathogenic strains of B. anthracis in the same assay. This can help in saving resources on unnecessary decontamination procedures during false alarms.

  17. Active 3D camera design for target capture on Mars orbit

    NASA Astrophysics Data System (ADS)

    Cottin, Pierre; Babin, François; Cantin, Daniel; Deslauriers, Adam; Sylvestre, Bruno

    2010-04-01

    During the ESA Mars Sample Return (MSR) mission, a sample canister launched from Mars will be autonomously captured by an orbiting satellite. We present the concept and the design of an active 3D camera supporting the orbiter navigation system during the rendezvous and capture phase. This camera aims at providing the range and bearing of a 20 cm diameter canister from 2 m to 5 km within a 20° field-of-view without moving parts (scannerless). The concept exploits the sensitivity and the gating capability of a gated intensified camera. It is supported by a pulsed source based on an array of laser diodes with adjustable amplitude and pulse duration (from nanoseconds to microseconds). The ranging capability is obtained by adequately controlling the timing between the acquisition of 2D images and the emission of the light pulses. Three modes of acquisition are identified to accommodate the different levels of ranging and bearing accuracy and the 3D data refresh rate. To come up with a single 3D image, each mode requires a different number of images to be processed. These modes can be applied to the different approach phases. The entire concept of operation of this camera is detailed with an emphasis on the extreme lighting conditions. Its uses for other space missions and terrestrial applications are also highlighted. This design is implemented in a prototype with shorter ranging capabilities for concept validation. Preliminary results obtained with this prototype are also presented. This work is financed by the Canadian Space Agency.
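
    The ranging principle described above rests on the time-of-flight relation between the laser pulse and the camera gate: a return detected at gate delay t corresponds to a range of c*t/2. The short sketch below only illustrates that relation; the delays and ranges are examples, not design values.

      # Minimal sketch of the gated time-of-flight relation (illustrative values only).
      C = 299_792_458.0  # speed of light, m/s

      def range_from_gate_delay(delay_s: float) -> float:
          """Range (m) corresponding to a round-trip gate delay (s)."""
          return 0.5 * C * delay_s

      def gate_delay_for_range(range_m: float) -> float:
          """Gate delay (s) needed to capture a return from a given range (m)."""
          return 2.0 * range_m / C

      # Values spanning the 2 m to 5 km envelope quoted in the abstract.
      for r in (2.0, 100.0, 5000.0):
          print(f"{r:7.1f} m  ->  gate delay {gate_delay_for_range(r) * 1e9:10.2f} ns")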

  18. SE-FIT

    NASA Technical Reports Server (NTRS)

    Chen, Yongkang; Weislogel, Mark; Schaeffer, Ben; Semerjian, Ben; Yang, Lihong; Zimmerli, Gregory

    2012-01-01

    The mathematical theory of capillary surfaces has developed steadily over the centuries, but it was not until the last few decades that new technologies have put a more urgent demand on a substantially more qualitative and quantitative understanding of phenomena relating to capillarity in general. So far, the new theory development successfully predicts the behavior of capillary surfaces for special cases. However, an efficient quantitative mathematical prediction of capillary phenomena related to the shape and stability of geometrically complex equilibrium capillary surfaces remains a significant challenge. As one of many numerical tools, the open-source Surface Evolver (SE) algorithm has played an important role over the last two decades. The current effort was undertaken to provide a front-end to enhance the accessibility of SE for the purposes of design and analysis. Like SE, the new code is open-source and will remain under development for the foreseeable future. The ultimate goal of the current Surface Evolver Fluid Interface Tool (SE-FIT) development is to build a fully integrated front-end with a set of graphical user interface (GUI) elements. Such a front-end enables access to functionalities that are developed along with the GUIs to deal with pre-processing, convergence computation operation, and post-processing. In other words, SE-FIT is not just a GUI front-end, but an integrated environment that can perform sophisticated computational tasks, e.g. importing industry standard file formats and employing parameter sweep functions, which are both lacking in SE, and require minimal interaction by the user. These functions are created using a mixture of Visual Basic and the SE script language. These form the foundation for a high-performance front-end that substantially simplifies use without sacrificing the proven capabilities of SE. The real power of SE-FIT lies in its automated pre-processing, pre-defined geometries, convergence computation operation, computational diagnostic tools, and crash-handling capabilities to sustain extensive computations. SE-FIT performance is enabled by its so-called file-layer mechanism. During the early stages of SE-FIT development, it became necessary to modify the original SE code to enable capabilities required for an enhanced and synchronized communication. To this end, a file-layer was created that serves as a command buffer to ensure a continuous and sequential execution of commands sent from the front-end to SE. It also establishes a proper means for handling crashes. The file layer logs input commands and SE output; it also supports user interruption requests, back and forward operation (i.e. undo and redo), and others. It especially enables the batch mode computation of a series of equilibrium surfaces and the searching of critical parameter values in studying the stability of capillary surfaces. In this way, the modified SE significantly extends the capabilities of the original SE.
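
    As a rough illustration of the file-layer idea described above (a file that buffers commands so they are executed sequentially and logged for crash recovery and undo/redo), here is a toy Python sketch. The file names, command format, and worker loop are assumptions for illustration; SE-FIT's actual implementation uses Visual Basic and the SE script language.

      # Toy file-based command buffer in the spirit of the "file layer" described above.
      import os

      BUFFER = "command_buffer.txt"   # hypothetical buffer file name
      LOG = "command_log.txt"         # hypothetical log file name

      def enqueue(command: str) -> None:
          """Front-end side: append one command to the buffer."""
          with open(BUFFER, "a") as f:
              f.write(command.rstrip("\n") + "\n")

      def drain(execute) -> None:
          """Worker side: run queued commands in order, log them, then clear the buffer."""
          if not os.path.exists(BUFFER):
              return
          with open(BUFFER) as f, open(LOG, "a") as log:
              for line in f:
                  cmd = line.strip()
                  if cmd:
                      execute(cmd)           # hand the command to the solver process
                      log.write(cmd + "\n")  # replayable record (undo/redo, crash recovery)
          open(BUFFER, "w").close()          # truncate the buffer once everything has run

      # 'r' (refine) and 'g 100' (iterate 100 steps) are ordinary Surface Evolver commands;
      # here "executing" them just prints them.
      enqueue("r")
      enqueue("g 100")
      drain(lambda cmd: print("running:", cmd))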

  19. Program for Experimentation With Expert Systems

    NASA Technical Reports Server (NTRS)

    Engle, S. W.

    1986-01-01

    CERBERUS is forward-chaining, knowledge-based system program useful for experimentation with expert systems. Inference-engine mechanism performs deductions according to user-supplied rule set. Information stored in intermediate area, and user interrogated only when no applicable data found in storage. Each assertion posed by CERBERUS answered with certainty ranging from 0 to 100 percent. Rule processor stops investigating applicable rules when goal reaches certainty of 95 percent or higher. Capable of operating for wide variety of domains. Sample rule files included for animal identification, pixel classification in image processing, and rudimentary car repair for novice mechanic. User supplies set of end goals or actions. System complexity decided by user's rule file. CERBERUS written in FORTRAN 77.
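
    The behavior described above (forward chaining over a user-supplied rule set, assertions carrying certainties from 0 to 100 percent, and a stopping threshold of 95 percent) maps onto a small rule engine. The sketch below is a hedged Python illustration of that mechanism only; CERBERUS itself is FORTRAN 77, and the example rules and the evidence-combination formula are assumptions.

      # Hedged sketch of forward chaining with 0-100 certainty factors (not CERBERUS code).
      facts = {"has_fur": 100, "eats_meat": 80}          # assertion -> certainty (0-100)

      rules = [
          # (premises, conclusion, rule strength 0-100); illustrative rules only
          (["has_fur"], "is_mammal", 95),
          (["is_mammal", "eats_meat"], "is_carnivore", 90),
      ]

      GOAL, STOP_AT = "is_carnivore", 95                 # stop once the goal reaches 95%

      def combine(old: float, new: float) -> float:
          """Accumulate evidence from independent rules (MYCIN-style rule, an assumption)."""
          return old + new * (100 - old) / 100

      fired = set()
      changed = True
      while changed and facts.get(GOAL, 0) < STOP_AT:
          changed = False
          for i, (premises, conclusion, strength) in enumerate(rules):
              if i in fired or not all(p in facts for p in premises):
                  continue
              # A conclusion is no more certain than its weakest premise times the rule strength.
              support = min(facts[p] for p in premises) * strength / 100
              facts[conclusion] = combine(facts.get(conclusion, 0), support)
              fired.add(i)
              changed = True

      print(facts)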

  20. Development and application of operational techniques for the inventory and monitoring of resources and uses for the Texas coastal zone. [Galveston Bay and San Antonio test sites]

    NASA Technical Reports Server (NTRS)

    Jones, R. (Principal Investigator); Harwood, P.; Finley, R.; Clements, G.; Lodwick, L.; Mcculloch, S.; Marphy, D.

    1976-01-01

    The author has identified the following significant results. The most significant ADP result was the modification of the DAM package to produce classified printouts, scaled and registered to U.S.G.S. 7 1/2-minute topographic maps, from LARSYS-type classification files. With this modification, all the powerful scaling and registration capabilities of DAM become available for multiclass classification files. The most significant results with respect to image interpretation were the application of mapping techniques to a new, more complex area, and the refinement of an image interpretation procedure which should yield the best results.

  1. MAGIC: Model and Graphic Information Converter

    NASA Technical Reports Server (NTRS)

    Herbert, W. C.

    2009-01-01

    MAGIC is a software tool capable of converting highly detailed 3D models from an open, standard format, VRML 2.0/97, into the proprietary DTS file format used by the Torque Game Engine from GarageGames. MAGIC is used to convert 3D simulations from authoritative sources into the data needed to run the simulations in NASA's Distributed Observer Network. The Distributed Observer Network (DON) is a simulation presentation tool built by NASA to facilitate the simulation sharing requirements of the Data Presentation and Visualization effort within the Constellation Program. DON is built on top of the Torque Game Engine (TGE) and has chosen TGE's Dynamix Three Space (DTS) file format to represent 3D objects within simulations.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koning, A.J.; Bersillon, O.; Forrest, R. A.

    The status of the Joint Evaluated Fission and Fusion file (JEFF) is described. The next version of the library, JEFF-3.1, comprises a significant update of actinide evaluations, evaluations emerging from European nuclear data projects, the activation library JEFF-3/A, the decay data and fission yield library, and fusion-related data files from the EFF project. The revisions were motivated by the availability of new measurements, modelling capabilities, or trends from integral experiments. Various pre-release validation efforts are underway, mainly for criticality and shielding of thermal and fast systems. This JEFF-3.1 library is expected to provide improved performances with respect to previous releases for a variety of scientific and industrial applications.

  3. Software Aids In Graphical Depiction Of Flow Data

    NASA Technical Reports Server (NTRS)

    Stegeman, J. D.

    1995-01-01

    Interactive Data Display System (IDDS) computer program is graphical-display program designed to assist in visualization of three-dimensional flow in turbomachinery. Grid and simulation data files in PLOT3D format required for input. Able to unwrap volumetric data cone associated with centrifugal compressor and display results in easy-to-understand two- or three-dimensional plots. IDDS provides majority of visualization and analysis capability for Integrated Computational Fluid Dynamics and Experiment (ICE) system. IDDS invoked from any subsystem, or used as stand-alone package of display software. Generates contour, vector, shaded, x-y, and carpet plots. Written in C language. Input file format used by IDDS is that of PLOT3D (COSMIC item ARC-12782).
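
    IDDS takes its grid and solution input in PLOT3D format. As a rough illustration of how a grid file in that family is laid out, below is a hedged Python reader for a single-block ASCII PLOT3D grid (dimensions on the first line, then all x, all y, and all z values in Fortran ordering). Multi-block and binary variants differ, and this layout is a common convention rather than a statement of exactly what IDDS expects; the file name is hypothetical.

      # Hedged sketch: read a single-block ASCII PLOT3D grid file.
      import numpy as np

      def read_plot3d_single_block_ascii(path):
          with open(path) as f:
              values = np.array(f.read().split(), dtype=float)    # all whitespace-separated numbers
          ni, nj, nk = (int(v) for v in values[:3])                # grid dimensions from the header
          npts = ni * nj * nk
          coords = values[3:3 + 3 * npts]
          # Fortran (column-major) ordering: the i index varies fastest.
          x = coords[0 * npts:1 * npts].reshape((ni, nj, nk), order="F")
          y = coords[1 * npts:2 * npts].reshape((ni, nj, nk), order="F")
          z = coords[2 * npts:3 * npts].reshape((ni, nj, nk), order="F")
          return x, y, z

      # x, y, z = read_plot3d_single_block_ascii("grid.xyz")   # hypothetical file name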

  4. Treatment planning capability assessment of a beam shaping assembly for accelerator-based BNCT.

    PubMed

    Herrera, M S; González, S J; Burlon, A A; Minsky, D M; Kreiner, A J

    2011-12-01

    Within the frame of an ongoing project to develop a folded Tandem-Electrostatic-Quadrupole accelerator facility for Accelerator-Based Boron Neutron Capture Therapy (AB-BNCT) a theoretical study was performed to assess the treatment planning capability of different configurations of an optimized beam shaping assembly for such a facility. In particular this study aims at evaluating treatment plans for a clinical case of Glioblastoma. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. Acoustic Scattering of Broadband Echolocation Signals from Prey of Blainville’s Beaked Whales: Modeling and Analysis

    DTIC Science & Technology

    2006-09-01

    biosonar, summarized in the following paragraphs, provides context for this study. 1.1.1 Echolocation in bats: Researchers have debated for over two centuries the capabilities of certain species of animals to use biosonar in orientation, communication, and prey capture. As early as 1793, Italian ... marine organisms: In complement to the research on the biosonar systems of these capable predators, a concurrent body of research has been conducted on ...

  6. Block Copolymer Membranes for Efficient Capture of a Chemotherapy Drug

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, X. Chelsea; Oh, Hee Jeung; Yu, Jay F.

    In this paper, we introduce the use of block copolymer membranes for an emerging application, “drug capture”. The polymer is incorporated in a new class of biomedical devices, referred to as ChemoFilter, which is an image-guided temporarily deployable endovascular device designed to increase the efficacy of chemotherapy-based cancer treatment. We show that block copolymer membranes consisting of functional sulfonated polystyrene end blocks and a structural polyethylene middle block (S-SES) are capable of capturing doxorubicin, a chemotherapy drug. We focus on the relationship between morphology of the membrane in the ChemoFilter device and efficacy of doxorubicin capture measured in vitro. Using small-angle X-ray scattering and cryogenic scanning transmission electron microscopy, we discovered that rapid doxorubicin capture is associated with the presence of water-rich channels in the lamellar-forming S-SES membranes in an aqueous environment.

  7. Spatially selective photonic crystal enhanced fluorescence and application to background reduction for biomolecule detection assays

    PubMed Central

    Chaudhery, Vikram; Huang, Cheng-Sheng; Pokhriyal, Anusha; Polans, James; Cunningham, Brian T.

    2011-01-01

    By combining photonic crystal label-free biosensor imaging with photonic crystal enhanced fluorescence, it is possible to selectively enhance the fluorescence emission from regions of the PC surface based upon the density of immobilized capture molecules. A label-free image of the capture molecules enables determination of optimal coupling conditions of the laser used for fluorescence imaging of the photonic crystal surface on a pixel-by-pixel basis, allowing maximization of fluorescence enhancement factor from regions incorporating a biomolecule capture spot and minimization of background autofluorescence from areas between capture spots. This capability significantly improves the contrast of enhanced fluorescent images, and when applied to an antibody protein microarray, provides a substantial advantage over conventional fluorescence microscopy. Using the new approach, we demonstrate detection limits as low as 0.97 pg/ml for a representative protein biomarker in buffer. PMID:22109210

  8. A Self-Assessment Stereo Capture Model Applicable to the Internet of Things

    PubMed Central

    Lin, Yancong; Yang, Jiachen; Lv, Zhihan; Wei, Wei; Song, Houbing

    2015-01-01

    The realization of the Internet of Things greatly depends on the information communication among physical terminal devices and informationalized platforms, such as smart sensors, embedded systems and intelligent networks. Playing an important role in information acquisition, sensors for stereo capture have gained extensive attention in various fields. In this paper, we concentrate on promoting such sensors in an intelligent system with self-assessment capability to deal with the distortion and impairment in long-distance shooting applications. The core design is the establishment of objective evaluation criteria that can reliably predict shooting quality with different camera configurations. Two types of stereo capture systems, the toed-in camera configuration and the parallel camera configuration, are considered. The experimental results show that the proposed evaluation criteria can effectively predict the visual perception of stereo capture quality for long-distance shooting. PMID:26308004

  9. Block Copolymer Membranes for Efficient Capture of a Chemotherapy Drug

    DOE PAGES

    Chen, X. Chelsea; Oh, Hee Jeung; Yu, Jay F.; ...

    2016-07-23

    In this paper, we introduce the use of block copolymer membranes for an emerging application, “drug capture”. The polymer is incorporated in a new class of biomedical devices, referred to as ChemoFilter, which is an image-guided temporarily deployable endovascular device designed to increase the efficacy of chemotherapy-based cancer treatment. We show that block copolymer membranes consisting of functional sulfonated polystyrene end blocks and a structural polyethylene middle block (S-SES) are capable of capturing doxorubicin, a chemotherapy drug. We focus on the relationship between morphology of the membrane in the ChemoFilter device and efficacy of doxorubicin capture measured in vitro. Using small-angle X-ray scattering and cryogenic scanning transmission electron microscopy, we discovered that rapid doxorubicin capture is associated with the presence of water-rich channels in the lamellar-forming S-SES membranes in an aqueous environment.

  10. Three-Dimensionally Functionalized Reverse Phase Glycoprotein Array for Cancer Biomarker Discovery and Validation.

    PubMed

    Pan, Li; Aguilar, Hillary Andaluz; Wang, Linna; Iliuk, Anton; Tao, W Andy

    2016-11-30

    Glycoproteins have vast structural diversity that plays an important role in many biological processes and have great potential as disease biomarkers. Here, we report a novel functionalized reverse phase protein array (RPPA), termed polymer-based reverse phase glycoprotein array (polyGPA), to capture and profile glycoproteomes specifically, and to validate glycoproteins. Nitrocellulose membrane functionalized with globular hydroxyaminodendrimers was used to covalently capture preoxidized glycans on glycoproteins from complex protein samples such as biofluids. The captured glycoproteins were subsequently detected using the same validated antibodies as in RPPA. We demonstrated the outstanding specificity, sensitivity, and quantitative capabilities of polyGPA by capturing and detecting purified as well as endogenous α-1-acid glycoprotein (AGP) in human plasma. We further applied quantitative N-glycoproteomics together with the polyGPA strategy to validate a panel of glycoproteins identified as potential biomarkers for bladder cancer, by analyzing urine glycoproteins from bladder cancer patients and matched healthy individuals.

  11. Spatially selective photonic crystal enhanced fluorescence and application to background reduction for biomolecule detection assays.

    PubMed

    Chaudhery, Vikram; Huang, Cheng-Sheng; Pokhriyal, Anusha; Polans, James; Cunningham, Brian T

    2011-11-07

    By combining photonic crystal label-free biosensor imaging with photonic crystal enhanced fluorescence, it is possible to selectively enhance the fluorescence emission from regions of the PC surface based upon the density of immobilized capture molecules. A label-free image of the capture molecules enables determination of optimal coupling conditions of the laser used for fluorescence imaging of the photonic crystal surface on a pixel-by-pixel basis, allowing maximization of fluorescence enhancement factor from regions incorporating a biomolecule capture spot and minimization of background autofluorescence from areas between capture spots. This capability significantly improves the contrast of enhanced fluorescent images, and when applied to an antibody protein microarray, provides a substantial advantage over conventional fluorescence microscopy. Using the new approach, we demonstrate detection limits as low as 0.97 pg/ml for a representative protein biomarker in buffer.

  12. NASA/FLAGRO - FATIGUE CRACK GROWTH COMPUTER PROGRAM

    NASA Technical Reports Server (NTRS)

    Forman, R. G.

    1994-01-01

    Structural flaws and cracks may grow under fatigue inducing loads and, upon reaching a critical size, cause structural failure to occur. The growth of these flaws and cracks may occur at load levels well below the ultimate load bearing capability of the structure. The Fatigue Crack Growth Computer Program, NASA/FLAGRO, was developed as an aid in predicting the growth of pre-existing flaws and cracks in structural components of space systems. The earlier version of the program, FLAGRO4, was the primary analysis tool used by Rockwell International and the Shuttle subcontractors for fracture control analysis on the Space Shuttle. NASA/FLAGRO is an enhanced version of the program and incorporates state-of-the-art improvements in both fracture mechanics and computer technology. NASA/FLAGRO provides the fracture mechanics analyst with a computerized method of evaluating the "safe crack growth life" capabilities of structural components. NASA/FLAGRO could also be used to evaluate the damage tolerance aspects of a given structural design. The propagation of an existing crack is governed by the stress field in the vicinity of the crack tip. The stress intensity factor is defined in terms of the relationship between the stress field magnitude and the crack size. The propagation of the crack becomes catastrophic when the local stress intensity factor reaches the fracture toughness of the material. NASA/FLAGRO predicts crack growth using a two-dimensional model which predicts growth independently in two directions based on the calculation of stress intensity factors. The analyst can choose to use either a crack growth rate equation or a nonlinear interpolation routine based on tabular data. The growth rate equation is a modified Forman equation which can be converted to a Paris or Walker equation by substituting different values into the exponent. This equation provides accuracy and versatility and can be fit to data using standard least squares methods. Stress-intensity factor numerical values can be computed for making comparisons or checks of solutions. NASA/FLAGRO can check for failure of a part-through crack in the mode of a through crack when net ligament yielding occurs. NASA/FLAGRO has a number of special subroutines and files which provide enhanced capabilities and easy entry of data. These include crack case solutions, cyclic load spectrums, nondestructive examination initial flaw sizes, table interpolation, and material properties. The materials properties files are divided into two types, a user defined file and a fixed file. Data is entered and stored in the user defined file during program execution, while the fixed file contains already coded-in property value data for many different materials. Prompted input from CRT terminals consists of initial crack definition (which can be defined automatically), rate solution type, flaw type and geometry, material properties (if they are not in the built-in tables of material data), load spectrum data (if not included in the loads spectrum file), and design limit stress levels. NASA/FLAGRO output includes an echo of the input with any error or warning messages, the final crack size, whether or not critical crack size has been reached for the specified stress level, and a life history profile of the crack propagation. NASA/FLAGRO is modularly designed to facilitate revisions and operation on minicomputers. The program was implemented on a DEC VAX 11/780 with the VMS operating system. 
    NASA/FLAGRO is written in FORTRAN 77 and has a memory requirement of 1.4 MB. The program was developed in 1986.
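
    The crack-growth relation mentioned above (a modified Forman equation that reduces to a Paris- or Walker-type law for particular constants) can be illustrated with a cycle-by-cycle integration of the simple Paris law, da/dN = C (ΔK)^m with ΔK = Y Δσ sqrt(πa). The sketch below only illustrates that relation; the constants and geometry factor are made up and are not NASA/FLAGRO material data.

      # Hedged sketch: integrate the Paris crack-growth law cycle by cycle.
      # Constants are purely illustrative, NOT NASA/FLAGRO material data.
      import math

      C, m = 1.0e-11, 3.0       # Paris constants (illustrative; a in m, dK in MPa*sqrt(m))
      Y = 1.0                   # geometry factor (center crack in a wide plate)
      delta_sigma = 100.0       # cyclic stress range, MPa
      a = 0.001                 # initial half crack length, m
      a_crit = 0.02             # treat this crack size as failure for the example, m

      cycles = 0
      while a < a_crit:
          delta_K = Y * delta_sigma * math.sqrt(math.pi * a)   # stress-intensity factor range
          a += C * delta_K ** m                                # crack growth in this cycle
          cycles += 1

      print(f"Illustrative life: {cycles} cycles to grow from 1 mm to {a_crit * 1000:.0f} mm")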

  13. Laser dissection sampling modes for direct mass spectral analysis [using a hybrid optical microscopy/laser ablation liquid vortex capture/electrospray ionization system

    DOE PAGES

    Cahill, John F.; Kertesz, Vilmos; Van Berkel, Gary J.

    2016-02-01

    Here, laser microdissection coupled directly with mass spectrometry provides the capability of on-line analysis of substrates with high spatial resolution, high collection efficiency, and freedom in the shape and size of the sampling area. Establishing the merits and capabilities of the different sampling modes that the system provides is necessary in order to select the best sampling mode for characterizing analytically challenging samples. The capabilities of the laser ablation spot sampling, laser ablation raster sampling, and laser 'cut and drop' sampling modes of a hybrid optical microscopy/laser ablation liquid vortex capture electrospray ionization mass spectrometry system were compared for the analysis of single cells and tissue. Single Chlamydomonas reinhardtii cells were monitored for their monogalactosyldiacylglycerol (MGDG) and diacylglyceryltrimethylhomo-Ser (DGTS) lipid content using the laser spot sampling mode, which was capable of ablating individual cells (4-15 μm) even when agglomerated together. Turbid Allium cepa cells (150 μm) having unique shapes difficult to precisely measure using the other sampling modes could be ablated in their entirety using laser raster sampling. Intact microdissections of specific regions of a cocaine-dosed mouse brain tissue were compared using laser 'cut and drop' sampling. Since in laser 'cut and drop' sampling whole and otherwise unmodified sections are captured into the probe, 100% collection efficiencies were achieved. In conclusion, laser ablation spot sampling has the highest spatial resolution of the three sampling modes, useful in this case for the analysis of single cells. Laser ablation raster sampling has the highest sampling-area adaptability and was best for sampling regions with unique shapes that are difficult to measure using the other sampling modes. Laser 'cut and drop' sampling can be used for cases where the highest sensitivity is needed, for example, monitoring drugs present in trace amounts in tissue.

  14. Laser dissection sampling modes for direct mass spectral analysis [using a hybrid optical microscopy/laser ablation liquid vortex capture/electrospray ionization system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cahill, John F.; Kertesz, Vilmos; Van Berkel, Gary J.

    Here, laser microdissection coupled directly with mass spectrometry provides the capability of on-line analysis of substrates with high spatial resolution, high collection efficiency, and freedom in the shape and size of the sampling area. Establishing the merits and capabilities of the different sampling modes that the system provides is necessary in order to select the best sampling mode for characterizing analytically challenging samples. The capabilities of the laser ablation spot sampling, laser ablation raster sampling, and laser 'cut and drop' sampling modes of a hybrid optical microscopy/laser ablation liquid vortex capture electrospray ionization mass spectrometry system were compared for the analysis of single cells and tissue. Single Chlamydomonas reinhardtii cells were monitored for their monogalactosyldiacylglycerol (MGDG) and diacylglyceryltrimethylhomo-Ser (DGTS) lipid content using the laser spot sampling mode, which was capable of ablating individual cells (4-15 μm) even when agglomerated together. Turbid Allium cepa cells (150 μm) having unique shapes difficult to precisely measure using the other sampling modes could be ablated in their entirety using laser raster sampling. Intact microdissections of specific regions of a cocaine-dosed mouse brain tissue were compared using laser 'cut and drop' sampling. Since in laser 'cut and drop' sampling whole and otherwise unmodified sections are captured into the probe, 100% collection efficiencies were achieved. In conclusion, laser ablation spot sampling has the highest spatial resolution of the three sampling modes, useful in this case for the analysis of single cells. Laser ablation raster sampling has the highest sampling-area adaptability and was best for sampling regions with unique shapes that are difficult to measure using the other sampling modes. Laser 'cut and drop' sampling can be used for cases where the highest sensitivity is needed, for example, monitoring drugs present in trace amounts in tissue.

  15. Quantification of localized vertebral deformities using a sparse wavelet-based shape model.

    PubMed

    Zewail, R; Elsafi, A; Durdle, N

    2008-01-01

    Medical experts often examine hundreds of spine x-ray images to determine the existence of various pathologies. Common pathologies of interest are anterior osteophytes, disc space narrowing, and wedging. By careful inspection of the outline shapes of the vertebral bodies, experts are able to identify and assess vertebral abnormalities with respect to the pathology under investigation. In this paper, we present a novel method for quantification of vertebral deformation using a sparse shape model. Using wavelets and independent component analysis (ICA), we construct a sparse shape model that benefits from the approximation power of wavelets and the capability of ICA to capture higher order statistics in wavelet space. The new model is able to capture localized pathology-related shape deformations, hence it allows for quantification of vertebral shape variations. We investigate the capability of the model to predict localized pathology-related deformations. Next, using support-vector machines, we demonstrate the diagnostic capabilities of the method through the discrimination of anterior osteophytes in lumbar vertebrae. Experiments were conducted using a set of 150 contours from digital x-ray images of the lumbar spine. Each vertebra is labeled as normal or abnormal. Results reported in this work focus on anterior osteophytes as the pathology of interest.
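
    The pipeline described above (wavelet decomposition of each contour, ICA on the wavelet coefficients to obtain a sparse, localized basis, and an SVM for normal/abnormal discrimination) can be sketched as follows. The libraries (PyWavelets, scikit-learn), the random stand-in data, and every parameter are assumptions for illustration; this is not the authors' implementation.

      # Hedged sketch of a wavelet + ICA + SVM shape-classification pipeline.
      import numpy as np
      import pywt
      from sklearn.decomposition import FastICA
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      n_contours, n_points = 150, 64
      contours = rng.normal(size=(n_contours, n_points))   # stand-in for vertebral contour signals
      labels = rng.integers(0, 2, size=n_contours)          # 0 = normal, 1 = anterior osteophyte

      def wavelet_features(signal):
          """Concatenate all levels of a db4 wavelet decomposition of one contour."""
          return np.concatenate(pywt.wavedec(signal, "db4", level=3))

      X_wav = np.array([wavelet_features(c) for c in contours])

      # ICA in wavelet space to capture higher-order, localized modes of variation.
      ica = FastICA(n_components=10, random_state=0, max_iter=1000)
      X_ica = ica.fit_transform(X_wav)

      # SVM discriminates normal vs. pathological shapes from the ICA loadings.
      clf = SVC(kernel="rbf").fit(X_ica[:100], labels[:100])
      print("held-out accuracy (random data, so about chance):", clf.score(X_ica[100:], labels[100:]))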

  16. Volterra representation enables modeling of complex synaptic nonlinear dynamics in large-scale simulations.

    PubMed

    Hu, Eric Y; Bouteiller, Jean-Marie C; Song, Dong; Baudry, Michel; Berger, Theodore W

    2015-01-01

    Chemical synapses comprise a wide collection of intricate signaling pathways involving complex dynamics. These mechanisms are often reduced to simple spikes or exponential representations in order to enable computer simulations at higher spatial levels of complexity. However, these representations cannot capture important nonlinear dynamics found in synaptic transmission. Here, we propose an input-output (IO) synapse model capable of generating complex nonlinear dynamics while maintaining low computational complexity. This IO synapse model is an extension of a detailed mechanistic glutamatergic synapse model, capturing the input-output relationships of the mechanistic model using the Volterra functional power series. We demonstrate that the IO synapse model is able to successfully track the nonlinear dynamics of the synapse up to the third order with high accuracy. We also evaluate the accuracy of the IO synapse model at different input frequencies and compare its performance with that of kinetic models in compartmental neuron models. Our results demonstrate that the IO synapse model is capable of efficiently replicating complex nonlinear dynamics that were represented in the original mechanistic model, and they provide a method to replicate complex and diverse synaptic transmission within neuron network simulations.
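
    The IO model above is built on a truncated Volterra functional power series. As a purely illustrative sketch of that functional form, the code below evaluates a discrete second-order expansion, y[n] = k0 + Σ_i k1[i] x[n-i] + Σ_{i,j} k2[i,j] x[n-i] x[n-j]; the paper's model is third order with kernels fitted to the mechanistic synapse simulation, whereas the kernels here are random placeholders.

      # Hedged sketch of a discrete truncated (second-order) Volterra series.
      import numpy as np

      rng = np.random.default_rng(1)
      M = 8                                     # kernel memory length (samples), an assumption
      k0 = 0.0
      k1 = rng.normal(scale=0.1, size=M)        # placeholder first-order kernel
      k2 = rng.normal(scale=0.01, size=(M, M))  # placeholder second-order kernel

      def volterra_2nd_order(x):
          y = np.zeros(len(x))
          for n in range(len(x)):
              past = x[max(0, n - M + 1):n + 1][::-1]    # x[n], x[n-1], ... most recent first
              past = np.pad(past, (0, M - len(past)))    # zero-pad at the start of the record
              y[n] = k0 + k1 @ past + past @ k2 @ past   # first- plus second-order terms
          return y

      spikes = (rng.random(200) < 0.05).astype(float)    # sparse input spike train
      print(volterra_2nd_order(spikes)[:10])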

  17. Volterra representation enables modeling of complex synaptic nonlinear dynamics in large-scale simulations

    PubMed Central

    Hu, Eric Y.; Bouteiller, Jean-Marie C.; Song, Dong; Baudry, Michel; Berger, Theodore W.

    2015-01-01

    Chemical synapses comprise a wide collection of intricate signaling pathways involving complex dynamics. These mechanisms are often reduced to simple spikes or exponential representations in order to enable computer simulations at higher spatial levels of complexity. However, these representations cannot capture important nonlinear dynamics found in synaptic transmission. Here, we propose an input-output (IO) synapse model capable of generating complex nonlinear dynamics while maintaining low computational complexity. This IO synapse model is an extension of a detailed mechanistic glutamatergic synapse model, capturing the input-output relationships of the mechanistic model using the Volterra functional power series. We demonstrate that the IO synapse model is able to successfully track the nonlinear dynamics of the synapse up to the third order with high accuracy. We also evaluate the accuracy of the IO synapse model at different input frequencies and compare its performance with that of kinetic models in compartmental neuron models. Our results demonstrate that the IO synapse model is capable of efficiently replicating complex nonlinear dynamics that were represented in the original mechanistic model, and they provide a method to replicate complex and diverse synaptic transmission within neuron network simulations. PMID:26441622

  18. Imaging of blood cells based on snapshot Hyper-Spectral Imaging systems

    NASA Astrophysics Data System (ADS)

    Robison, Christopher J.; Kolanko, Christopher; Bourlai, Thirimachos; Dawson, Jeremy M.

    2015-05-01

    Snapshot Hyper-Spectral imaging systems are capable of capturing several spectral bands simultaneously, offering coregistered images of a target. With appropriate optics, these systems are potentially able to image blood cells in vivo as they flow through a vessel, eliminating the need for a blood draw and sample staining. Our group has evaluated the capability of a commercial Snapshot Hyper-Spectral imaging system, the Arrow system from Rebellion Photonics, in differentiating between white and red blood cells on unstained blood smear slides. We evaluated the imaging capabilities of this hyperspectral camera attached to a microscope at varying objective powers and illumination intensities. Hyperspectral data consisting of 25 bands of 443x313 pixels with ~3 nm spacing were captured over the range of 419 to 494 nm. Open-source hyper-spectral data cube analysis tools, used primarily in Geographic Information Systems (GIS) applications, indicate that white blood cell features are most prominent in the 428-442 nm band for blood samples viewed under 20x and 50x magnification over a varying range of illumination intensities. These images could potentially be used in subsequent automated white blood cell segmentation and counting algorithms for performing in vivo white blood cell counting.
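
    The band-selection step implied above (isolating the 428-442 nm portion of the data cube where white-blood-cell features are reported to be most prominent) can be sketched as below. The cube is synthetic, and the axis order and exact wavelength grid are assumptions; only the band range and cube dimensions come from the abstract.

      # Hedged sketch: average the 428-442 nm bands of a (band, row, col) hyperspectral cube.
      import numpy as np

      wavelengths = np.linspace(419, 494, 25)                   # 25 bands, ~3 nm spacing
      cube = np.random.default_rng(2).random((25, 443, 313))    # synthetic stand-in data cube

      band_mask = (wavelengths >= 428) & (wavelengths <= 442)
      wbc_image = cube[band_mask].mean(axis=0)                  # band-averaged image

      # A crude intensity threshold on the band-averaged image as a stand-in for segmentation.
      threshold = wbc_image.mean() + 2 * wbc_image.std()
      print("bands used:", wavelengths[band_mask].round(1))
      print("candidate pixels above threshold:", int((wbc_image > threshold).sum()))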

  19. Plenoptic background oriented schlieren imaging

    NASA Astrophysics Data System (ADS)

    Klemkowsky, Jenna N.; Fahringer, Timothy W.; Clifford, Christopher J.; Bathel, Brett F.; Thurow, Brian S.

    2017-09-01

    The combination of the background oriented schlieren (BOS) technique with the unique imaging capabilities of a plenoptic camera, termed plenoptic BOS, is introduced as a new addition to the family of schlieren techniques. Compared to conventional single camera BOS, plenoptic BOS is capable of sampling multiple lines-of-sight simultaneously. Displacements from each line-of-sight are collectively used to build a four-dimensional displacement field, which is a vector function structured similarly to the original light field captured in a raw plenoptic image. The displacement field is used to render focused BOS images, which qualitatively are narrow depth of field slices of the density gradient field. Unlike focused schlieren methods that require manually changing the focal plane during data collection, plenoptic BOS synthetically changes the focal plane position during post-processing, such that all focal planes are captured in a single snapshot. Through two different experiments, this work demonstrates that plenoptic BOS is capable of isolating narrow depth of field features, qualitatively inferring depth, and quantitatively estimating the location of disturbances in 3D space. Such results motivate future work to transition this single-camera technique towards quantitative reconstructions of 3D density fields.
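
    At the core of any BOS measurement, including each line-of-sight in plenoptic BOS, is an estimate of the apparent displacement of a background pattern between a reference image and an image taken through the density gradient. The sketch below illustrates only that step for a single global shift on synthetic data, using phase cross-correlation from scikit-image; real BOS processing estimates displacements per interrogation window, and this is not the authors' processing chain.

      # Hedged sketch of the basic BOS displacement estimate on synthetic images.
      import numpy as np
      from scipy.ndimage import shift as nd_shift
      from skimage.registration import phase_cross_correlation

      rng = np.random.default_rng(3)
      reference = rng.random((256, 256))             # random-dot background pattern
      distorted = nd_shift(reference, (0.6, -1.3))   # pretend the flow shifted the pattern

      # Returns the shift that registers 'distorted' onto 'reference'; the apparent
      # background displacement is its negative.
      shift_est, error, _ = phase_cross_correlation(reference, distorted, upsample_factor=20)
      print("registration shift (rows, cols):", shift_est)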

  20. Dynamic Non-Hierarchical File Systems for Exascale Storage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, Darrell E.; Miller, Ethan L

    This constitutes the final report for “Dynamic Non-Hierarchical File Systems for Exascale Storage”. The ultimate goal of this project was to improve data management in scientific computing and high-end computing (HEC) applications, and to achieve this goal we proposed: to develop the first, HEC-targeted, file system featuring rich metadata and provenance collection, extreme scalability, and future storage hardware integration as core design goals, and to evaluate and develop a flexible non-hierarchical file system interface suitable for providing more powerful and intuitive data management interfaces to HEC and scientific computing users. Data management is swiftly becoming a serious problem in the scientific community – while copious amounts of data are good for obtaining results, finding the right data is often daunting and sometimes impossible. Scientists participating in a Department of Energy workshop noted that most of their time was spent “...finding, processing, organizing, and moving data and it’s going to get much worse”. Scientists should not be forced to become data mining experts in order to retrieve the data they want, nor should they be expected to remember the naming convention they used several years ago for a set of experiments they now wish to revisit. Ideally, locating the data you need would be as easy as browsing the web. Unfortunately, existing data management approaches are usually based on hierarchical naming, a 40 year-old technology designed to manage thousands of files, not exabytes of data. Today’s systems do not take advantage of the rich array of metadata that current high-end computing (HEC) file systems can gather, including content-based metadata and provenance information. As a result, current metadata search approaches are typically ad hoc and often work by providing a parallel management system to the “main” file system, as is done in Linux (the locate utility), personal computers, and enterprise search appliances. These search applications are often optimized for a single file system, making it difficult to move files and their metadata between file systems. Users have tried to solve this problem in several ways, including the use of separate databases to index file properties, the encoding of file properties into file names, and separately gathering and managing provenance data, but none of these approaches has worked well, either due to limited usefulness or scalability, or both. Our research addressed several key issues: High-performance, real-time metadata harvesting: extracting important attributes from files dynamically and immediately updating indexes used to improve search; Transparent, automatic, and secure provenance capture: recording the data inputs and processing steps used in the production of each file in the system; Scalable indexing: indexes that are optimized for integration with the file system; Dynamic file system structure: our approach provides dynamic directories similar to those in semantic file systems, but these are the native organization rather than a feature grafted onto a conventional system. In addition to these goals, our research effort will include evaluating the impact of new storage technologies on the file system design and performance. In particular, the indexing and metadata harvesting functions can potentially benefit from the performance improvements promised by new storage class memories.
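
    The central idea above, finding data by its attributes and provenance rather than by its path in a hierarchy, can be illustrated with a toy inverted index from (attribute, value) pairs to files. This sketch is purely illustrative and is not the project's implementation; all names and metadata are made up.

      # Toy sketch of attribute-based (non-hierarchical) file lookup via an inverted index.
      from collections import defaultdict

      index = defaultdict(set)   # (attribute, value) -> set of file identifiers

      def harvest(file_id, metadata):
          """Index a file's harvested metadata (would run at create/modify time)."""
          for attr, value in metadata.items():
              index[(attr, value)].add(file_id)

      def query(**criteria):
          """Return files matching ALL given attribute=value criteria."""
          sets = [index[(a, v)] for a, v in criteria.items()]
          return set.intersection(*sets) if sets else set()

      harvest("run_0042.h5", {"experiment": "combustion", "year": 2011, "variable": "temperature"})
      harvest("run_0043.h5", {"experiment": "combustion", "year": 2012, "variable": "pressure"})

      print(query(experiment="combustion", variable="temperature"))   # {'run_0042.h5'}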
