Sample records for history file processing

  1. Description of the process used to create the 1992 Hanford Mortality Study database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, E.S.; Buchanan, J.A.; Holter, N.A.

    1992-12-01

    An updated and expanded database for the Hanford Mortality Study has been developed by PNL's Epidemiology and Biometry Department. The purpose of this report is to document this process. The primary sources of data were the Occupational Health History (OHH) files maintained by the Hanford Environmental Health Foundation (HEHF) and including demographic data and job histories; the Hanford Mortality (HMO) files also maintained by HEHF and including information on deaths of Hanford workers; the Occupational Radiation Exposure (ORE) files maintained by PNL's Health Physics Department and containing data on external dosimetry; and a file of workers with confirmed internal depositions of radionuclides also maintained by PNL's Health Physics Department. This report describes each of these files in detail, and also describes the many edits that were performed to address the consistency and accuracy of data within and between these files.

  2. Description of the process used to create the 1992 Hanford Mortality Study database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, E. S.; Buchanan, J. A.; Holter, N. A.

    1992-12-01

    An updated and expanded database for the Hanford Mortality Study has been developed by PNL's Epidemiology and Biometry Department. The purpose of this report is to document this process. The primary sources of data were the Occupational Health History (OHH) files maintained by the Hanford Environmental Health Foundation (HEHF) and including demographic data and job histories; the Hanford Mortality (HMO) files also maintained by HEHF and including information on deaths of Hanford workers; the Occupational Radiation Exposure (ORE) files maintained by PNL's Health Physics Department and containing data on external dosimetry; and a file of workers with confirmed internal depositions of radionuclides also maintained by PNL's Health Physics Department. This report describes each of these files in detail, and also describes the many edits that were performed to address the consistency and accuracy of data within and between these files.

  3. 32 CFR 1653.3 - Review by the National Appeal Board.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... review the file to insure that no procedural errors have occurred during the history of the current claim. Files containing procedural errors will be returned to the board where the errors occurred for any additional processing necessary to correct such errors. (c) Files containing procedural errors that were not...

  4. Radar Unix: a complete package for GPR data processing

    NASA Astrophysics Data System (ADS)

    Grandjean, Gilles; Durand, Herve

    1999-03-01

    A complete package for ground-penetrating radar data interpretation, including data processing, forward modeling and consultation of a case history database, is presented. Running on a Unix operating system, its architecture consists of a graphical user interface that generates batch files transmitted to a library of processing routines. This design allows better software maintenance and lets the user run processing or modeling batch files independently and deferred in time. A case history database is available; it consists of a hypertext document which can be consulted using a standard HTML browser. All the software specifications are presented through a realistic example.

  5. Processing tracking in jMRUI software for magnetic resonance spectra quantitation reproducibility assurance.

    PubMed

    Jabłoński, Michał; Starčuková, Jana; Starčuk, Zenon

    2017-01-23

    Proton magnetic resonance spectroscopy is a non-invasive measurement technique which provides information about concentrations of up to 20 metabolites participating in intracellular biochemical processes. In order to obtain any metabolic information from measured spectra, processing must be done in specialized software, such as jMRUI. The processing is interactive and complex and often requires many trials before a correct result is obtained. This paper proposes a jMRUI enhancement for efficient and unambiguous history tracking and file identification. A database storing all processing steps, parameters and files used in processing was developed for jMRUI. The solution was developed in Java; the authors used an SQL database for robust storage of parameters and SHA-256 hash codes for unambiguous file identification. The developed system was integrated directly in jMRUI and will be publicly available. A graphical user interface was implemented to make the user experience more comfortable. The database operation is invisible to the common user; all tracking operations are performed in the background. The implemented jMRUI database is a tool that can significantly help the user track the processing history performed on data in jMRUI. The created tool is designed to be user-friendly, robust and easy to use. The database GUI allows the user to browse the whole processing history of a selected file and to learn, e.g., what processing led to the results and where the original data are stored, and to obtain the list of all processing actions performed on spectra.
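    The tracking scheme described above, an SQL store of processing steps keyed by SHA-256 file hashes, can be illustrated with a minimal sketch. The table layout and function names below are assumptions made for this example, not the actual jMRUI implementation.

```python
# Minimal sketch of processing-history tracking with SQLite and SHA-256,
# in the spirit of the jMRUI database described above. The schema and
# function names are illustrative assumptions, not the jMRUI code.
import hashlib
import sqlite3
from datetime import datetime, timezone

def file_sha256(path: str) -> str:
    """Return the SHA-256 hex digest that unambiguously identifies a file."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def record_step(db: sqlite3.Connection, in_path: str, out_path: str,
                action: str, parameters: str) -> None:
    """Store one processing step, linking input and output files by hash."""
    db.execute(
        "CREATE TABLE IF NOT EXISTS history ("
        " input_hash TEXT, output_hash TEXT, action TEXT,"
        " parameters TEXT, timestamp TEXT)"
    )
    db.execute(
        "INSERT INTO history VALUES (?, ?, ?, ?, ?)",
        (file_sha256(in_path), file_sha256(out_path), action, parameters,
         datetime.now(timezone.utc).isoformat()),
    )
    db.commit()
```

    A call such as record_step(conn, "raw.dat", "filtered.dat", "apodization", "lb=5") (hypothetical file names and parameters) would then let the full history of a result file be reconstructed by following input hashes backwards.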

  6. Guide to GFS History File Change on May 1, 2007

    Science.gov Websites

    Guide to GFS History File Change on May 1, 2007 On May 1, 2007 12Z, the GFS had a major change. The change caused the internal binary GFS history file to change formats. The file is still in spectral space but now pressure is calculated in a different way. Sometime in the future, the GFS history file may be

  7. 21 CFR 820.3 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... designs, manufactures, fabricates, assembles, or processes a finished device. Manufacturer includes but is... numbers, or both, from which the history of the manufacturing, packaging, labeling, and distribution of a unit, lot, or batch of finished devices can be determined. (e) Design history file (DHF) means a...

  8. 7 CFR 1980.452 - FmHA or its successor agency under Public Law 103-354 evaluation of application.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... National Officer that the Statements of Personal History(s) have been processed and cleared. FmHA or its... borrower, individual customer credit file, installment Loan Ledger Card or Computer printouts and other...

  9. Data handling with SAM and art at the NOvA experiment

    DOE PAGES

    Aurisano, A.; Backhouse, C.; Davies, G. S.; ...

    2015-12-23

    During operations, NOvA produces between 5,000 and 7,000 raw files per day with peaks in excess of 12,000. These files must be processed in several stages to produce fully calibrated and reconstructed analysis files. In addition, many simulated neutrino interactions must be produced and processed through the same stages as data. To accommodate the large volume of data and Monte Carlo, production must be possible both on the Fermilab grid and on off-site farms, such as the ones accessible through the Open Science Grid. To handle the challenge of cataloging these files and to facilitate their off-line processing, we have adopted the SAM system developed at Fermilab. SAM indexes files according to metadata, keeps track of each file's physical locations, provides dataset management facilities, and facilitates data transfer to off-site grids. To integrate SAM with Fermilab's art software framework and the NOvA production workflow, we have developed methods to embed metadata into our configuration files, art files, and standalone ROOT files. A module in the art framework propagates the embedded information from configuration files into art files, and from input art files to output art files, allowing us to maintain a complete processing history within our files. Embedding metadata in configuration files also allows configuration files indexed in SAM to be used as inputs to Monte Carlo production jobs. Further, SAM keeps track of the input files used to create each output file. Parentage information enables the construction of self-draining datasets which have become the primary production paradigm used at NOvA. In this study we will present an overview of SAM at NOvA and how it has transformed the file production framework used by the experiment.
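    As a rough illustration of the bookkeeping pattern described above, metadata propagated from input files to output files plus parentage records that make self-draining datasets possible, the sketch below uses a plain in-memory catalog. The function names and fields are assumptions for the example and are not the SAM API.

```python
# Illustrative sketch of metadata propagation and file parentage in the spirit
# of the SAM-based workflow described above. The in-memory "catalog" and the
# function names are assumptions for this example, not the actual SAM API.
catalog: dict[str, dict] = {}  # file name -> {"metadata": ..., "parents": ...}

def declare_file(name: str, metadata: dict, parents: list[str]) -> None:
    """Register a file with its metadata and the input files it came from."""
    catalog[name] = {"metadata": metadata, "parents": parents}

def propagate(inputs: list[str], output: str, extra: dict) -> None:
    """Carry embedded metadata from the input files into the output file."""
    merged: dict = {}
    for name in inputs:
        merged.update(catalog[name]["metadata"])
    merged.update(extra)
    declare_file(output, merged, parents=list(inputs))

def undrained(dataset: list[str]) -> list[str]:
    """Files in a dataset not yet listed as a parent of any output file."""
    consumed = {p for rec in catalog.values() for p in rec["parents"]}
    return [name for name in dataset if name not in consumed]
```

    The undrained() helper mirrors the self-draining idea: a production job repeatedly asks for the dataset members that no output file yet claims as a parent, until the list is empty.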

  10. Google earth as a source of ancillary material in a history of psychology class.

    PubMed

    Stevison, Blake K; Biggs, Patrick T; Abramson, Charles I

    2010-06-01

    This article discusses the use of Google Earth to visit significant geographical locations associated with events in the history of psychology. The process of opening files, viewing content, adding placemarks, and saving customized virtual tours on Google Earth is explained. Suggestions for incorporating Google Earth into a history of psychology course are also described.

  11. 49 CFR 391.53 - Driver investigation history file.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 5 2011-10-01 2011-10-01 false Driver investigation history file. 391.53 Section... Driver investigation history file. (a) After October 29, 2004, each motor carrier must maintain records relating to the investigation into the safety performance history of a new or prospective driver pursuant...

  12. 49 CFR 391.53 - Driver investigation history file.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 5 2014-10-01 2014-10-01 false Driver investigation history file. 391.53 Section... Driver investigation history file. (a) After October 29, 2004, each motor carrier must maintain records relating to the investigation into the safety performance history of a new or prospective driver pursuant...

  13. 49 CFR 391.53 - Driver investigation history file.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 5 2012-10-01 2012-10-01 false Driver investigation history file. 391.53 Section... Driver investigation history file. (a) After October 29, 2004, each motor carrier must maintain records relating to the investigation into the safety performance history of a new or prospective driver pursuant...

  14. 49 CFR 391.53 - Driver investigation history file.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 5 2013-10-01 2013-10-01 false Driver investigation history file. 391.53 Section... Driver investigation history file. (a) After October 29, 2004, each motor carrier must maintain records relating to the investigation into the safety performance history of a new or prospective driver pursuant...

  15. 49 CFR 391.53 - Driver investigation history file.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 5 2010-10-01 2010-10-01 false Driver investigation history file. 391.53 Section... Driver investigation history file. (a) After October 29, 2004, each motor carrier must maintain records relating to the investigation into the safety performance history of a new or prospective driver pursuant...

  16. Data sharing system for lithography APC

    NASA Astrophysics Data System (ADS)

    Kawamura, Eiichi; Teranishi, Yoshiharu; Shimabara, Masanori

    2007-03-01

    We have developed a simple and cost-effective data sharing system between fabs for lithography advanced process control (APC). Lithography APC requires process flow, inter-layer information, history information, mask information and so on, so an inter-APC data sharing system has become necessary when lots are to be processed in multiple fabs (usually two fabs). The development cost and maintenance cost also have to be taken into account. The system handles the minimum information necessary to make trend predictions for the lots. Three types of data have to be shared for precise trend prediction. The first is device information for the lots, e.g., the process flow of the device and inter-layer information. The second is mask information from mask suppliers, e.g., pattern characteristics and pattern widths. The last is history data for the lots. Device information is an electronic file and easy to handle. The electronic file is common between APCs and uploaded into the database. As for mask information sharing, mask information described in a common format is obtained via a Wide Area Network (WAN) from the mask vendor and stored in the mask-information data server. This information is periodically transferred to one specific lithography-APC server and compiled into the database. This lithography-APC server periodically delivers the mask information to every other lithography-APC server. The process-history data sharing system mainly consists of a function for delivering process-history data. When shipping production lots to another fab, the product-related process-history data is delivered by the lithography-APC server at the shipping site. We have confirmed the function and effectiveness of the data sharing systems.

  17. 20 CFR 30.103 - How does a claimant make sure that OWCP has the evidence necessary to process the claim?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Certain Cancer Claims Filing Claims for Benefits Under Eeoicpa § 30.103 How does a claimant make sure that... Compensation Program Act. (3) EE-3 Employment History for a Claim Under the Energy Employees Occupational Illness Compensation Program Act. (4) EE-4 Employment History Affidavit for a Claim Under the Energy...

  18. 20 CFR 30.103 - How does a claimant make sure that OWCP has the evidence necessary to process the claim?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Certain Cancer Claims Filing Claims for Benefits Under Eeoicpa § 30.103 How does a claimant make sure that... Compensation Program Act. (3) EE-3 Employment History for a Claim Under the Energy Employees Occupational Illness Compensation Program Act. (4) EE-4 Employment History Affidavit for a Claim Under the Energy...

  19. 20 CFR 30.103 - How does a claimant make sure that OWCP has the evidence necessary to process the claim?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Certain Cancer Claims Filing Claims for Benefits Under Eeoicpa § 30.103 How does a claimant make sure that... Compensation Program Act. (3) EE-3 Employment History for a Claim Under the Energy Employees Occupational Illness Compensation Program Act. (4) EE-4 Employment History Affidavit for a Claim Under the Energy...

  20. 20 CFR 30.103 - How does a claimant make sure that OWCP has the evidence necessary to process the claim?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Certain Cancer Claims Filing Claims for Benefits Under Eeoicpa § 30.103 How does a claimant make sure that... Compensation Program Act. (3) EE-3 Employment History for a Claim Under the Energy Employees Occupational Illness Compensation Program Act. (4) EE-4 Employment History Affidavit for a Claim Under the Energy...

  1. Program documentation for the space environment test division post-test data reduction program (GNFLEX)

    NASA Technical Reports Server (NTRS)

    Jones, L. D.

    1979-01-01

    The Space Environment Test Division Post-Test Data Reduction Program processes data from test history tapes generated on the Flexible Data System in the Space Environment Simulation Laboratory at the National Aeronautics and Space Administration/Lyndon B. Johnson Space Center. The program reads the tape's data base records to retrieve the item directory conversion file, the item capture file and the process link file to determine the active parameters. The desired parameter names are read in by lead cards after which the periodic data records are read to determine parameter data level changes. The data is considered to be compressed rather than full sample rate. Tabulations and/or a tape for generating plots may be output.

  2. Sequence History Update Tool

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; DelGuercio, Chris

    2008-01-01

    The Sequence History Update Tool performs Web-based sequence statistics archiving for Mars Reconnaissance Orbiter (MRO). Using a single UNIX command, the software takes advantage of sequencing conventions to automatically extract the needed statistics from multiple files. This information is then used to populate a PHP database, which is then seamlessly formatted into a dynamic Web page. This tool replaces a previously tedious and error-prone process of manually editing HTML code to construct a Web-based table. Because the tool manages all of the statistics gathering and file delivery to and from multiple data sources spread across multiple servers, there is also a considerable savings of time and effort. With the Sequence History Update Tool, what previously took minutes is now done in less than 30 seconds, and the tool provides a more accurate archival record of the sequence commanding for MRO.

  3. Manual for Getdata Version 3.1: a FORTRAN Utility Program for Time History Data

    NASA Technical Reports Server (NTRS)

    Maine, Richard E.

    1987-01-01

    This report documents version 3.1 of the GetData computer program. GetData is a utility program for manipulating files of time history data, i.e., data giving the values of parameters as functions of time. The most fundamental capability of GetData is extracting selected signals and time segments from an input file and writing the selected data to an output file. Other capabilities include converting file formats, merging data from several input files, time skewing, interpolating to common output times, and generating calculated output signals as functions of the input signals. This report also documents the interface standards for the subroutines used by GetData to read and write the time history files. All interface to the data files is through these subroutines, keeping the main body of GetData independent of the precise details of the file formats. Different file formats can be supported by changes restricted to these subroutines. Other computer programs conforming to the interface standards can call the same subroutines to read and write files in compatible formats.
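    The key design point above, that all file I/O goes through a fixed subroutine interface so the main program never sees format details, can be sketched as follows. The actual GetData interface is a set of FORTRAN subroutines; the Python class and method names here are hypothetical stand-ins for that idea.

```python
# Minimal sketch of a format-independent time-history I/O interface, in the
# spirit of the GetData subroutine standard described above. Class and method
# names are hypothetical; the real interface is defined as FORTRAN subroutines.
from abc import ABC, abstractmethod

class TimeHistoryFile(ABC):
    """All format-specific details live behind this fixed interface."""

    @abstractmethod
    def read(self) -> dict[str, list[float]]:
        """Return signals keyed by name, including a 'time' signal."""

    @abstractmethod
    def write(self, signals: dict[str, list[float]]) -> None:
        """Write the signals in this backend's native file format."""

def extract(src: TimeHistoryFile, dst: TimeHistoryFile,
            names: list[str], t0: float, t1: float) -> None:
    """Copy selected signals and a time segment, format-independently."""
    data = src.read()
    keep = [i for i, t in enumerate(data["time"]) if t0 <= t <= t1]
    dst.write({n: [data[n][i] for i in keep] for n in ["time", *names]})
```

    Supporting a new file format then only requires a new TimeHistoryFile subclass; extraction, merging, and conversion logic stays unchanged, which is the property the report attributes to the subroutine standard.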

  4. A Data Handling System for Modern and Future Fermilab Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Illingworth, R. A.

    2014-01-01

    Current and future Fermilab experiments such as Minerva, NOνA, and MicroBooNE are now using an improved version of the Fermilab SAM data handling system. SAM was originally used by the CDF and D0 experiments for Run II of the Fermilab Tevatron to provide file metadata and location cataloguing, uploading of new files to tape storage, dataset management, file transfers between global processing sites, and processing history tracking. However, SAM was heavily tailored to the Run II environment and required complex and hard-to-deploy client software, which made it hard to adapt to new experiments. The Fermilab Computing Sector has progressively updated SAM to use modern, standardized technologies in order to more easily deploy it for current and upcoming Fermilab experiments, and to support the data preservation efforts of the Run II experiments.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, P E

    Tips and case histories on computer use for idea and outline processing: Productivity software to solve problems of idea hierarchy, transitions, and developments is matched to solutions for communicators. One case is text that ranges from methods and procedures to histories and legal definitions of classification for the US Department of Energy. Applications of value to writers, editors, and managers are for research; calendars; creativity; prioritization; idea discovery and manipulation; file and time management; and contents, indexes, and glossaries. 6 refs., 7 figs.

  6. Case file coding of child maltreatment: Methods, challenges, and innovations in a longitudinal project of youth in foster care.

    PubMed

    Huffhines, Lindsay; Tunno, Angela M; Cho, Bridget; Hambrick, Erin P; Campos, Ilse; Lichty, Brittany; Jackson, Yo

    2016-08-01

    State social service agency case files are a common mechanism for obtaining information about a child's maltreatment history, yet these documents are often challenging for researchers to access, and then to process in a manner consistent with the requirements of social science research designs. Specifically, accessing and navigating case files is an extensive undertaking, and a task that many researchers have had to maneuver with little guidance. Even after the files are in hand and the research questions and relevant variables have been clarified, case file information about a child's maltreatment exposure can be idiosyncratic, vague, inconsistent, and incomplete, making coding such information into useful variables for statistical analyses difficult. The Modified Maltreatment Classification System (MMCS) is a popular tool used to guide the process, and though comprehensive, this coding system cannot cover all idiosyncrasies found in case files. It is not clear from the literature how researchers implement this system while accounting for issues outside of the purview of the MMCS or that arise during MMCS use. Finally, a large yet reliable file coding team is essential to the process, however, the literature lacks training guidelines and methods for establishing reliability between coders. In an effort to move the field toward a common approach, the purpose of the present discussion is to detail the process used by one large-scale study of child maltreatment, the Studying Pathways to Adjustment and Resilience in Kids (SPARK) project, a longitudinal study of resilience in youth in foster care. The article addresses each phase of case file coding, from accessing case files, to identifying how to measure constructs of interest, to dealing with exceptions to the coding system, to coding variables reliably, to training large teams of coders and monitoring for fidelity. Implications for a comprehensive and efficient approach to case file coding are discussed.

  7. Case file coding of child maltreatment: Methods, challenges, and innovations in a longitudinal project of youth in foster care☆

    PubMed Central

    Huffhines, Lindsay; Tunno, Angela M.; Cho, Bridget; Hambrick, Erin P.; Campos, Ilse; Lichty, Brittany; Jackson, Yo

    2016-01-01

    State social service agency case files are a common mechanism for obtaining information about a child’s maltreatment history, yet these documents are often challenging for researchers to access, and then to process in a manner consistent with the requirements of social science research designs. Specifically, accessing and navigating case files is an extensive undertaking, and a task that many researchers have had to maneuver with little guidance. Even after the files are in hand and the research questions and relevant variables have been clarified, case file information about a child’s maltreatment exposure can be idiosyncratic, vague, inconsistent, and incomplete, making coding such information into useful variables for statistical analyses difficult. The Modified Maltreatment Classification System (MMCS) is a popular tool used to guide the process, and though comprehensive, this coding system cannot cover all idiosyncrasies found in case files. It is not clear from the literature how researchers implement this system while accounting for issues outside of the purview of the MMCS or that arise during MMCS use. Finally, a large yet reliable file coding team is essential to the process, however, the literature lacks training guidelines and methods for establishing reliability between coders. In an effort to move the field toward a common approach, the purpose of the present discussion is to detail the process used by one large-scale study of child maltreatment, the Studying Pathways to Adjustment and Resilience in Kids (SPARK) project, a longitudinal study of resilience in youth in foster care. The article addresses each phase of case file coding, from accessing case files, to identifying how to measure constructs of interest, to dealing with exceptions to the coding system, to coding variables reliably, to training large teams of coders and monitoring for fidelity. Implications for a comprehensive and efficient approach to case file coding are discussed. PMID:28138207

  8. Navy.mil - Photo Galleries

    Science.gov Websites


  9. Trustworthy History and Provenance for Files and Databases

    ERIC Educational Resources Information Center

    Hasan, Ragib

    2009-01-01

    In today's world, information is increasingly created, processed, transmitted, and stored digitally. While the digital nature of information has brought enormous benefits, it has also created new vulnerabilities and attacks against data. Unlike physical documents, digitally stored information can be rapidly copied, erased, or modified. The…

  10. Puzzling History--The Personal File in Residential Care: A Source for Life History and Historical Research

    ERIC Educational Resources Information Center

    De Wilde, Lieselot; Vanobbergen, Bruno

    2017-01-01

    Since the turn of the century large groups of former institutionalised children have exercised their right to see their "personal files", and this has drawn widespread attention to these documents and their potential in scholarly research. This article explores the meanings of personal files from the period 1945-1984 as sources for both…

  11. 20 CFR 30.105 - What must DOE do after an employee or survivor files a claim?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Filing Claims; Evidence and Burden of Proof; Special Procedures for Certain Cancer Claims Verification of... employment history provided by the claimant. Upon receipt of such a request, DOE will complete Form EE-5 as... concurs with the employment history provided by the claimant, that it disagrees with such history, or that...

  12. 20 CFR 30.105 - What must DOE do after an employee or survivor files a claim?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Filing Claims; Evidence and Burden of Proof; Special Procedures for Certain Cancer Claims Verification of... employment history provided by the claimant. Upon receipt of such a request, DOE will complete Form EE-5 as... concurs with the employment history provided by the claimant, that it disagrees with such history, or that...

  13. 20 CFR 30.105 - What must DOE do after an employee or survivor files a claim?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Filing Claims; Evidence and Burden of Proof; Special Procedures for Certain Cancer Claims Verification of... employment history provided by the claimant. Upon receipt of such a request, DOE will complete Form EE-5 as... concurs with the employment history provided by the claimant, that it disagrees with such history, or that...

  14. 20 CFR 30.105 - What must DOE do after an employee or survivor files a claim?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Filing Claims; Evidence and Burden of Proof; Special Procedures for Certain Cancer Claims Verification of... employment history provided by the claimant. Upon receipt of such a request, DOE will complete Form EE-5 as... concurs with the employment history provided by the claimant, that it disagrees with such history, or that...

  15. Data Bookkeeping Service 3 - Providing Event Metadata in CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giffels, Manuel; Guo, Y.; Riley, Daniel

    The Data Bookkeeping Service 3 provides a catalog of event metadata for Monte Carlo and recorded data of the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) at CERN, Geneva. It comprises all necessary information for tracking datasets, their processing history and associations between runs, files and datasets, on a large scale of about 200,000 datasets and more than 40 million files, which adds up to around 700 GB of metadata. The DBS is an essential part of the CMS Data Management and Workload Management (DMWM) systems [1]; all kinds of data processing, such as Monte Carlo production, processing of recorded event data, and physics analysis done by the users, rely heavily on the information stored in DBS.
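    A toy relational layout conveys the kind of bookkeeping described above: datasets, files, and file parentage for processing history. The schema and the example dataset name are illustrative assumptions, not the actual CMS DBS 3 schema.

```python
# Toy relational layout for dataset/file/parentage bookkeeping, loosely
# modelled on the role DBS plays as described above. The schema is an
# illustrative assumption, not the actual CMS DBS 3 schema.
import sqlite3

schema = """
CREATE TABLE datasets (dataset_id INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE files (
    file_id INTEGER PRIMARY KEY,
    logical_name TEXT UNIQUE,
    dataset_id INTEGER REFERENCES datasets(dataset_id),
    run_number INTEGER,
    event_count INTEGER
);
CREATE TABLE file_parents (          -- processing history: child <- parent
    child_id INTEGER REFERENCES files(file_id),
    parent_id INTEGER REFERENCES files(file_id)
);
"""

db = sqlite3.connect(":memory:")
db.executescript(schema)
# Hypothetical dataset name, used only to show how a row would be registered.
db.execute("INSERT INTO datasets (name) VALUES (?)", ("/Example/Run/RECO",))
db.commit()
```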

  16. View_SPECPR: Software for Plotting Spectra (Installation Manual and User's Guide, Version 1.2)

    USGS Publications Warehouse

    Kokaly, Raymond F.

    2008-01-01

    This document describes procedures for installing and using the 'View_SPECPR' software system to plot spectra stored in SPECPR (SPECtrum Processing Routines) files. The View_SPECPR software is comprised of programs written in IDL (Interactive Data Language) that run within the ENVI (ENvironment for Visualizing Images) image processing system. SPECPR files are used by earth-remote-sensing scientists and planetary scientists for storing spectra collected by laboratory, field, and remote sensing instruments. A widely distributed SPECPR file is the U.S. Geological Survey (USGS) spectral library that contains thousands of spectra of minerals, vegetation, and man-made materials (Clark and others, 2007). SPECPR files contain reflectance data and associated wavelength and spectral resolution data, as well as meta-data on the time and date of collection and spectrometer settings. Furthermore, the SPECPR file automatically tracks changes to data records through its 'history' fields. For more details on the format and content of SPECPR files, see Clark (1993). For more details on ENVI, see ITT (2008). This program has been updated using an ENVI 4.5/IDL7.0 full license operating on a Windows XP operating system and requires the installation of the iTools components of IDL7.0; however, this program should work with full licenses on UNIX/LINUX systems. This software has not been tested with ENVI licenses on Windows Vista or Apple Operating Systems.

  17. 77 FR 6867 - Practice Guide for Proposed Trial Rules

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-09

    ... maintaining a complete and understandable file history and the parties' interest in protecting truly sensitive... complete and understandable file history for public notices purposes. The rule encourages parties to redact... been obvious over the prior art, the Board may review objective evidence of secondary considerations. B...

  18. PipeOnline 2.0: automated EST processing and functional data sorting.

    PubMed

    Ayoubi, Patricia; Jin, Xiaojing; Leite, Saul; Liu, Xianghui; Martajaja, Jeson; Abduraham, Abdurashid; Wan, Qiaolan; Yan, Wei; Misawa, Eduardo; Prade, Rolf A

    2002-11-01

    Expressed sequence tags (ESTs) are generated and deposited in the public domain, as redundant, unannotated, single-pass reactions, with virtually no biological content. PipeOnline automatically analyses and transforms large collections of raw DNA-sequence data from chromatograms or FASTA files by calling the quality of bases, screening and removing vector sequences, assembling and rewriting consensus sequences of redundant input files into a unigene EST data set and finally through translation, amino acid sequence similarity searches, annotation of public databases and functional data. PipeOnline generates an annotated database, retaining the processed unigene sequence, clone/file history, alignments with similar sequences, and proposed functional classification, if available. Functional annotation is automatic and based on a novel method that relies on homology of amino acid sequence multiplicity within GenBank records. Records are examined through a function ordered browser or keyword queries with automated export of results. PipeOnline offers customization for individual projects (MyPipeOnline), automated updating and alert service. PipeOnline is available at http://stress-genomics.org.

  19. 76 FR 63537 - Mandatory Electronic Filing for Agencies and Attorneys at Washington Regional Office and Denver...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-13

    ... believes e-filing would create an undue burden may request an exemption from the administrative [email protected] . SUPPLEMENTARY INFORMATION: 1. History of MSPB's E-Filing Initiative On February 26, 2008, MSPB issued final regulations at 5 CFR parts 1201, 1203, 1208, and 1209 governing e-filing. 73 FR 10127...

  20. Progress in defining a standard for file-level metadata

    NASA Technical Reports Server (NTRS)

    Williams, Joel; Kobler, Ben

    1996-01-01

    In the following narrative, metadata required to locate a file on tape or collection of tapes will be referred to as file-level metadata. This paper describes the rationale for and the history of the effort to define a standard for this metadata.

  1. Agent Based Computing Machine

    DTIC Science & Technology

    2005-12-09

    decision making logic that responds to the environment (concentration of operands - the state vector), and bias or "mood" as established by its history of...mentioned in the chart, there is no need for file management in an ABC Machine. Information is distributed; no history is maintained. The instruction set... PostgreSQL) for collection of cluster samples/snapshots over intervals of time. A prototypical example of an XML file to configure and launch the ABC

  2. Revision history aware repositories of computational models of biological systems.

    PubMed

    Miller, Andrew K; Yu, Tommy; Britten, Randall; Cooling, Mike T; Lawson, James; Cowan, Dougal; Garny, Alan; Halstead, Matt D B; Hunter, Peter J; Nickerson, David P; Nunns, Geo; Wimalaratne, Sarala M; Nielsen, Poul M F

    2011-01-14

    Building repositories of computational models of biological systems ensures that published models are available for both education and further research, and can provide a source of smaller, previously verified models to integrate into a larger model. One problem with earlier repositories has been the limitations in facilities to record the revision history of models. Often, these facilities are limited to a linear series of versions which were deposited in the repository. This is problematic for several reasons. Firstly, there are many instances in the history of biological systems modelling where an 'ancestral' model is modified by different groups to create many different models. With a linear series of versions, if the changes made to one model are merged into another model, the merge appears as a single item in the history. This hides useful revision history information, and also makes further merges much more difficult, as there is no record of which changes have or have not already been merged. In addition, a long series of individual changes made outside of the repository are also all merged into a single revision when they are put back into the repository, making it difficult to separate out individual changes. Furthermore, many earlier repositories only retain the revision history of individual files, rather than of a group of files. This is an important limitation to overcome, because some types of models, such as CellML 1.1 models, can be developed as a collection of modules, each in a separate file. The need for revision history is widely recognised for computer software, and a lot of work has gone into developing version control systems and distributed version control systems (DVCSs) for tracking the revision history. However, to date, there has been no published research on how DVCSs can be applied to repositories of computational models of biological systems. We have extended the Physiome Model Repository software to be fully revision history aware, by building it on top of Mercurial, an existing DVCS. We have demonstrated the utility of this approach, when used in conjunction with the model composition facilities in CellML, to build and understand more complex models. We have also demonstrated the ability of the repository software to present version history to casual users over the web, and to highlight specific versions which are likely to be useful to users. Providing facilities for maintaining and using revision history information is an important part of building a useful repository of computational models, as this information is useful both for understanding the source of and justification for parts of a model, and to facilitate automated processes such as merges. The availability of fully revision history aware repositories, and associated tools, will therefore be of significant benefit to the community.
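    Since the repository described above is built on top of Mercurial, the general mechanism can be sketched with plain hg commands driven from Python. This is only an illustration of DVCS-backed tracking of a group of model files, not the Physiome Model Repository code; it assumes the hg client is installed, the repository already exists, and there are changes to commit.

```python
# Minimal sketch of tracking a group of model files with Mercurial, the DVCS
# the repository above is built on. Illustrative only; not the Physiome Model
# Repository implementation. Requires the `hg` command on PATH.
import subprocess

def hg(repo: str, *args: str) -> str:
    """Run an hg command against the given repository and return its output."""
    return subprocess.run(["hg", "-R", repo, *args],
                          check=True, capture_output=True, text=True).stdout

def commit_model(repo: str, message: str, user: str) -> None:
    """Record the current state of all model files as one revision."""
    hg(repo, "add")                      # track any new module files
    hg(repo, "commit", "-m", message, "-u", user)

def history(repo: str) -> str:
    """Return the revision history of the whole model, newest first."""
    return hg(repo, "log", "--template", "{node|short} {date|isodate} {desc}\n")
```

    Because the whole repository is versioned, a CellML 1.1 model split across several module files gets a single coherent history, and merges between divergent model lineages are handled by the DVCS rather than by hand.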

  3. BOREAS TGB-12 Soil Carbon and Flux Data of NSA-MSA in Raster Format

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G. (Editor); Knapp, David E. (Editor); Rapalee, Gloria; Davidson, Eric; Harden, Jennifer W.; Trumbore, Susan E.; Veldhuis, Hugo

    2000-01-01

    The BOREAS TGB-12 team made measurements of soil carbon inventories, carbon concentration in soil gases, and rates of soil respiration at several sites. This data set provides: (1) estimates of soil carbon stocks by horizon based on soil survey data and analyses of data from individual soil profiles; (2) estimates of soil carbon fluxes based on stocks, fire history, drainage, and soil carbon inputs and decomposition constants based on field work using radiocarbon analyses; (3) fire history data estimating age ranges of time since last fire; and (4) a raster image and an associated soils table file from which area-weighted maps of soil carbon and fluxes and fire history may be generated. This data set was created from raster files, soil polygon data files, and detailed lab analysis of soils data that were received from Dr. Hugo Veldhuis, who did the original mapping in the field during 1994. Also used were soils data from Susan Trumbore and Jennifer Harden (BOREAS TGB-12). The binary raster file covers a 733-km² area within the NSA-MSA.

  4. An analysis of the high-latitude thermospheric wind pattern calculated by a thermospheric general circulation model. I - Momentum forcing

    NASA Technical Reports Server (NTRS)

    Killeen, T. L.; Roble, R. G.

    1984-01-01

    A diagnostic processor (DP) was developed for analysis of hydrodynamic and thermodynamic processes predicted by the NCAR thermospheric general circulation model (TGCM). The TGCM contains a history file on the projected wind, temperature and composition fields at each grid point for each hour of universal time. The DP assimilates the history file plus ion drag tensors and drift velocities, specific heats, coefficients of viscosity, and thermal conductivity and calculates the individual forcing terms for the momentum and energy equations for a given altitude. Sample momentum forcings were calculated for high latitudes in the presence of forcing by solar radiation and magnetospheric convection with a 60 kV cross-tail potential, i.e., conditions on Oct. 21, 1981. It was found that ion drag and pressure forces balance out at F region heights where ion drift velocities are small. The magnetic polar cap/auroral zone boundary featured the largest residual force or net acceleration. Diurnal oscillations were detected in the thermospheric convection, and geostrophic balance was dominant in the E layer.

  5. A History of the Andrew File System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brashear, Derrick

    2011-02-22

    Derrick Brashear and Jeffrey Altman will present a technical history of the evolution of Andrew File System starting with the early days of the Andrew Project at Carnegie Mellon through the commercialization by Transarc Corporation and IBM and a decade of OpenAFS. The talk will be technical with a focus on the various decisions and implementation trade-offs that were made over the course of AFS versions 1 through 4, the development of the Distributed Computing Environment Distributed File System (DCE DFS), and the course of the OpenAFS development community. The speakers will also discuss the various AFS branches developed at the University of Michigan, Massachusetts Institute of Technology and Carnegie Mellon University.

  6. MINC 2.0: A Flexible Format for Multi-Modal Images.

    PubMed

    Vincent, Robert D; Neelin, Peter; Khalili-Mahani, Najmeh; Janke, Andrew L; Fonov, Vladimir S; Robbins, Steven M; Baghdadi, Leila; Lerch, Jason; Sled, John G; Adalat, Reza; MacDonald, David; Zijdenbos, Alex P; Collins, D Louis; Evans, Alan C

    2016-01-01

    It is often useful that an imaging data format can afford rich metadata, be flexible, scale to very large file sizes, support multi-modal data, and have strong inbuilt mechanisms for data provenance. Beginning in 1992, MINC was developed as a system for flexible, self-documenting representation of neuroscientific imaging data with arbitrary orientation and dimensionality. The MINC system incorporates three broad components: a file format specification, a programming library, and a growing set of tools. In the early 2000's the MINC developers created MINC 2.0, which added support for 64-bit file sizes, internal compression, and a number of other modern features. Because of its extensible design, it has been easy to incorporate details of provenance in the header metadata, including an explicit processing history, unique identifiers, and vendor-specific scanner settings. This makes MINC ideal for use in large scale imaging studies and databases. It also makes it easy to adapt to new scanning sequences and modalities.

  7. National Geochemical Database reformatted data from the National Uranium Resource Evaluation (NURE) Hydrogeochemical and Stream Sediment Reconnaissance (HSSR) program

    USGS Publications Warehouse

    Smith, Steven M.

    1997-01-01

    The National Uranium Resource Evaluation (NURE) Hydrogeochemical and Stream Sediment Reconnaissance (HSSR) program produced a large amount of geochemical data. To fully understand how these data were generated, it is recommended that you read the History of NURE HSSR Program for a summary of the entire program. By the time the NURE program had ended, the HSSR data consisted of 894 separate data files stored with 47 different formats. Many files contained duplication of data found in other files. The University of Oklahoma's Information Systems Programs of the Energy Resources Institute (ISP) was contracted by the Department of Energy to enhance the accessibility and usefulness of the NURE HSSR data. ISP created a single standard-format master file to replace the 894 original files. ISP converted 817 of the 894 original files before its funding apparently ran out. The ISP-reformatted NURE data files have been released by the USGS on CD-ROM (Lower 48 States, Hoffman and Buttleman, 1994; Alaska, Hoffman and Buttleman, 1996). A description of each NURE database field, derived from a draft NURE HSSR data format manual (unpubl. commun., Stan Moll, ISP, Oct 7, 1988), was included in a readme file on each CD-ROM. That original manual was incomplete and assumed that the reformatting process had gone to completion. A lot of vital information was not included. Efforts to correct that manual and the NURE data revealed a large number of problems and missing data. As a result of the frustrating process of cleaning and re-cleaning data from the ISP-reformatted NURE files, a new NURE HSSR data format was developed. This work represents a totally new attempt to reformat the original NURE files into 2 consistent database structures; one for water samples and a second for sediment samples, on a quadrangle by quadrangle basis, from the original NURE files. Although this USGS-reformatted NURE HSSR data format is different than that created by the ISP, many of their ideas were incorporated and expanded in this effort. All of the data from each quadrangle are being examined thoroughly in an attempt to eliminate problems, to combine partial or duplicate records, to convert all coding to a common scheme, and to identify problems even if they can not be solved at this time.

  8. 78 FR 57353 - Endangered Species; File No. 14726

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-18

    ... gather information on their life-history, genetics, movements, behavior, and diet. Researchers are... from this sampling would help Dr. Witherington determine the trophic history of pelagic neonate and... with the trophic histories would further describe the sea turtles' home range, habitat use, residency...

  9. Records of the Office of the Director of Navy Laboratories: Historical Files, 1960-1980, Records Collection 3-1

    DTIC Science & Technology

    1984-07-01

    This report describes the historical records of the office of the Director of Navy Laboratories from 1960-1980. It lists the records down to the file...heading level and indexes them by chronological period, alphabetical file heading, and keyword. The report includes a brief administrative history of the office and organizational charts.

  10. Treatment of patients with a history of penicillin allergy in a large tertiary-care academic hospital.

    PubMed

    Picard, Matthieu; Bégin, Philippe; Bouchard, Hugues; Cloutier, Jonathan; Lacombe-Barrios, Jonathan; Paradis, Jean; Des Roches, Anne; Laufer, Brian; Paradis, Louis

    2013-01-01

    Prescribing antibiotics to patients with a history of penicillin allergy is common in clinical practice. Opting for non-beta-lactam antibiotics has its inconveniences and is often unnecessary, because most of these patients are in fact not allergic. This study aimed to determine how physicians in a large Canadian tertiary-care academic hospital without allergists on staff treat patients with a history of penicillin allergy. A retrospective study was conducted during a 1-year period among all patients hospitalized in the intensive care unit, coronary care unit, and internal medicine wards. Files of patients with a record of penicillin allergy were reviewed to assess the need for antibiotics during their hospitalization and the decision-making process underlying the choice of antibiotic. The additional costs of alternative antibiotics were calculated. The files of 1738 patients admitted over a 1-year period were hand reviewed. A history of penicillin allergy was found in 172 patients (9.9%). The allergic reaction was described in only 30% of cases and left unmentioned in 20.7%. Beta-lactam antibiotics were used on 56 occasions despite a history of penicillin allergy. The use of alternative antibiotics in place of the beta-lactam standard of care carried an additional cost of $15,672 Canadian. Alleged penicillin allergy is common among hospitalized patients and leads to substantial additional costs. Poor documentation of penicillin allergy likely reflects a lack of knowledge on this issue in the medical community, which impairs optimal treatment of these patients. Increased education on this matter is needed, and allergists on staff could be part of the solution. Copyright © 2013 American Academy of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.

  11. TOAD Editor

    NASA Technical Reports Server (NTRS)

    Bingle, Bradford D.; Shea, Anne L.; Hofler, Alicia S.

    1993-01-01

    The Transferable Output ASCII Data (TOAD) computer program (LAR-13755) implements a format designed to facilitate transfer of data across communication networks and dissimilar host computer systems. Any data file conforming to the TOAD format standard is called a TOAD file. TOAD Editor is an interactive software tool for manipulating the contents of TOAD files. It is commonly used to extract filtered subsets of data for visualization of results of computation. It also offers such user-oriented features as on-line help, clear English error messages, a startup file, user-defined macroinstructions, command history, user variables, UNDO features, and a full complement of mathematical, statistical, and conversion functions. A companion program, TOAD Gateway (LAR-14484), converts data files from a variety of other file formats to that of TOAD. TOAD Editor is written in FORTRAN 77.

  12. Reconstructing the history of holography

    NASA Astrophysics Data System (ADS)

    Johnston, Sean F.

    2003-05-01

    This paper discusses large-scale but gradual changes in the subject of holography that have only recently become readily observable. Presenting an analysis of publications in holography over the past half century, the paper illustrates and discusses the evolving shape of the subject. Over 40,000 international information sources have been recorded, including some 20,000 papers, 10,000 books, nearly as many of these and at least 500 exhibitions. This statistical and sociological approach is combined with the identification of specific factors - notably the role of individuals, conferences, proof-of-concept demonstrations and exhibitions - to suggest that the development of holography has been unusually contingent on a variety of intellectual and social influences. The paper situates these observations about holography and holographers in the context of a wider discussion about the styles, purposes and difficulties of historical writing on technological subjects. It further suggests that this ongoing process of both recording and reconstructing technological history can be aided by identification of sources sometimes overlooked or undervalued by practitioners: unpublished archival materials such as private file collections; business records; accounts of unsuccessful activities; and, by no means least, anecdotal accounts inter-linked between participants.

  13. VizieR Online Data Catalog: Metal enrichment in semi-analytical model (Cousin+, 2016)

    NASA Astrophysics Data System (ADS)

    Cousin, M.; Buat, V.; Boissier, S.; Bethermin, M.; Roehlly, Y. Genois M.

    2016-04-01

    The repository contains outputs from the different models: - m1: Classical (only hot gas) isotropic accretion scenario + Standard Schmidt-Kennicutt law - m2: Bimodal accretion (cold streams) + Standard Schmidt-Kennicutt law - m3: Classical (only hot gas) isotropic accretion scenario + ad-hoc non-star forming gas reservoir - m4: Bimodal accretion (cold streams) + ad-hoc non-star forming gas reservoir For each of these models, data are saved in an eGalICS_m*.fits file. All these FITS-formatted files are compatible with the TOPCAT software available at: http://www.star.bris.ac.uk/~mbt/topcat/ We also provide, for each Initial Mass Function available, a set of two FITS-formatted files associated with the chemodynamical library presented in the paper. For these two files, data are available for all metallicity bins used. - masslossrates_IMF.fits: The instantaneous total ejecta rate associated with an SSP for the six different main-ISM elements. - SNratesIMF.fits: The total SN rate (SNII+SNIa [nb/Gyr]) associated with an SSP; the individual contributions of SNII and SNIa are also given. These files are available for four different IMFs: Salpeter+55 (1955ApJ...121..161S), Chabrier+03 (2003PASP..115..763C), Kroupa+93 (2001MNRAS.322..231K) and Scalo+98 (1998ASPC..142..201S). Both ejecta rates and SN rates are computed for the complete list of stellar ages provided in the BC03 spectra library. They are saved in FITS-formatted files and structured with different extensions corresponding to the different initial stellar metallicity bins. We finally provide the median star formation history, the median gas accretion history and the metal enrichment histories associated with our MW-sisters sample: MWsistershistories.dat If you use data associated with the eGalICS semi-analytic model, please cite the following paper: Cousin et al., 2015A&A...575A..33C, "Toward a new modelling of gas flows in a semi-analytical model of galaxy formation and evolution" (3 data files).

  14. 49 CFR 564.5 - Information filing; agency processing of filings.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 6 2010-10-01 2010-10-01 false Information filing; agency processing of filings... HIGHWAY TRAFFIC SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION REPLACEABLE LIGHT SOURCE INFORMATION (Eff. until 12-01-12) § 564.5 Information filing; agency processing of filings. (a) Each manufacturer...

  15. An overview of the catalog manager

    NASA Technical Reports Server (NTRS)

    Irani, Frederick M.

    1986-01-01

    The Catalog Manager (CM) is being used at the Goddard Space Flight Center in conjunction with the Land Analysis System (LAS) running under the Transportable Applications Executive (TAE). CM maintains a catalog of file names for all users of the LAS system. The catalog provides a cross-reference between TAE user file names and fully qualified host-file names. It also maintains information about the content and status of each file. A brief history of CM development is given and a description of naming conventions, catalog structure and file attributes, and archive/retrieve capabilities is presented. General user operation and the LAS user scenario are also discussed.

  16. 78 FR 28732 - Revisions to Electric Quarterly Report Filing Process; Availability of Draft XML Schema

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... posting CSV file samples. Order No. 770 revised the process for filing EQRs. Pursuant to Order No. 770, one of the new processes for filing allows EQRs to be filed using an XML file. The XML schema that is needed to file EQRs in this manner is now posted on the Commission's Web site at http://www.ferc.gov/docs...

  17. 47 CFR 0.285 - Record of actions taken.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION Delegations of Authority Chief, Media Bureau § 0.285 Record of actions taken. The history card, the station file, and other appropriate files are designated to be the official records of action taken by the Chief of the Media Bureau. The...

  18. The Historian and Electronic Research: File Transfer Protocol (FTP).

    ERIC Educational Resources Information Center

    McCarthy, Michael J.

    1993-01-01

    Asserts that the Internet will become the academic communication medium for historians in the 1990s. Describes the "file transfer protocol" (FTP) access approach to the Internet and discusses its significance for historical research. Includes instructions for using FTP and a list of history-related FTP sites. (CFR)

  19. Alaska Department of Natural Resources

    Science.gov Websites


  20. SHARAF: The Canadian Shared Authority File Project.

    ERIC Educational Resources Information Center

    MacIntosh, Helen

    1982-01-01

    Describes history, operating procedures, and current activities of group of users of the University of Toronto Library Automation System (UTLAS) who cooperated with each other, the bibliographic utility, and the National Library of Canada to produce an automated authority control system, termed Shared Authority File (SHARAF). Five references are…

  1. Cloud object store for checkpoints of high performance computing applications using decoupling middleware

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-04-19

    Cloud object storage is enabled for checkpoints of high performance computing applications using a middleware process. A plurality of files, such as checkpoint files, generated by a plurality of processes in a parallel computing system are stored by obtaining said plurality of files from said parallel computing system; converting said plurality of files to objects using a log structured file system middleware process; and providing said objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.
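    The conversion step described above, turning many per-process checkpoint files into named objects for an object store, can be caricatured as below. The dict stands in for the cloud object store and the .ckpt naming is an assumption; the patented approach uses a PLFS middleware process, which is not modelled here.

```python
# Toy sketch of the middleware idea described above: checkpoint files produced
# by many processes are converted into named objects for an object store.
# The dict is a stand-in for the cloud object storage system, and the .ckpt
# file naming is an assumption for this example; PLFS itself is not modelled.
from pathlib import Path

object_store: dict[str, bytes] = {}

def archive_checkpoints(checkpoint_dir: str, job_id: str) -> list[str]:
    """Convert each checkpoint file into one object keyed by job and file name."""
    keys = []
    for path in sorted(Path(checkpoint_dir).glob("*.ckpt")):
        key = f"{job_id}/{path.name}"           # object name in the store
        object_store[key] = path.read_bytes()   # upload stand-in
        keys.append(key)
    return keys
```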

  2. 77 FR 39447 - Revisions to Electric Quarterly Report Filing Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-03

    ... Quarterly Report Filing Process AGENCY: Federal Energy Regulatory Commission, DOE. ACTION: Notice of... Rule which governs the filing of Electric Quarterly Reports (EQRs), to change the process for filing... Regulatory Commission (Commission) proposes changes to the method for filing Electric Quarterly Reports (EQRs...

  3. Cloud object store for archive storage of high performance computing data using decoupling middleware

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-06-30

    Cloud object storage is enabled for archived data, such as checkpoints and results, of high performance computing applications using a middleware process. A plurality of archived files, such as checkpoint files and results, generated by a plurality of processes in a parallel computing system are stored by obtaining the plurality of archived files from the parallel computing system; converting the plurality of archived files to objects using a log structured file system middleware process; and providing the objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.

  4. Candidacy for Bilateral Hearing Aids: A Retrospective Multicenter Study

    ERIC Educational Resources Information Center

    Boymans, Monique; Goverts, S. Theo; Kramer, Sophia E.; Festen, Joost M.; Dreschler, Wouter A.

    2009-01-01

    Purpose: The goal of this study was to find factors for refining candidacy criteria for bilateral hearing aid fittings. Clinical files of 1,000 consecutive hearing aid fittings were analyzed. Method: Case history, audiometric, and rehabilitation data were collected from clinical files, and an extensive questionnaire on long-term outcome measures…

  5. Perspective: Semantic Data Management for the Home

    DTIC Science & Technology

    2009-02-01

    stored. For example, one view might be “all files with type=music and artist= Beatles stored on Liz’s iPod” and another “all files with owner=Liz...semantic naming structures and search tech- niques from a rich history of previous work. The Seman- tic Filesystem [12] proposed the use of attribute

  6. 23 CFR 1327.5 - Conditions for becoming a participating State.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... check, NHTSA will search its computer file and mail the results (i.e., notification of no record found... official of a participating State shall implement the necessary computer system and procedures to respond...) Provide a Driver History Record from its file to the State of Inquiry upon receipt of a request for this...

  7. 23 CFR 1327.5 - Conditions for becoming a participating State.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... check, NHTSA will search its computer file and mail the results (i.e., notification of no record found... official of a participating State shall implement the necessary computer system and procedures to respond...) Provide a Driver History Record from its file to the State of Inquiry upon receipt of a request for this...

  8. f-treeGC: a questionnaire-based family tree-creation software for genetic counseling and genome cohort studies.

    PubMed

    Tokutomi, Tomoharu; Fukushima, Akimune; Yamamoto, Kayono; Bansho, Yasushi; Hachiya, Tsuyoshi; Shimizu, Atsushi

    2017-07-14

    The Tohoku Medical Megabank project aims to create a next-generation personalized healthcare system by conducting large-scale genome-cohort studies involving three generations of local residents in the areas affected by the Great East Japan Earthquake. We collected medical and genomic information for developing a biobank to be used for this healthcare system. We designed a questionnaire-based pedigree-creation software program named "f-treeGC," which enables even less experienced medical practitioners to accurately and rapidly collect family health history and create pedigree charts. f-treeGC may be run on Adobe AIR. Pedigree charts are created in the following manner: 1) At system startup, the client is prompted to provide required information on the presence or absence of children; f-treeGC is capable of creating a pedigree up to three generations. 2) An interviewer fills out a multiple-choice questionnaire on genealogical information. 3) The information requested includes name, age, gender, general status, infertility status, pregnancy status, fetal status, and physical features or health conditions of individuals over three generations. In addition, information regarding the client and the proband, and birth order information, including multiple gestation, custody, multiple individuals, donor or surrogate, adoption, and consanguinity may be included. 4) f-treeGC shows only marriages between first cousins via the overlay function. 5) f-treeGC automatically creates a pedigree chart, and the chart-creation process is visible for inspection on the screen in real time. 6) The genealogical data may be saved as a file in the original format. The created/modified date and time may be changed as required, and the file may be password-protected and/or saved in read-only format. To enable sorting or searching from the database, the file name automatically contains the terms typed into the entry fields, including physical features or health conditions, by default. 7) Alternatively, family histories are collected using a completed foldable interview paper sheet named "f-sheet", which is identical to the questionnaire in f-treeGC. We developed a questionnaire-based family tree-creation software, named f-treeGC, which is fully compliant with international recommendations for standardized human pedigree nomenclature. The present software simplifies the process of collecting family histories and pedigrees, and has a variety of uses, from genome cohort studies or primary care to genetic counseling.
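
    A minimal sketch of the kind of three-generation record such a questionnaire builds up is given below; the field names are illustrative and are not f-treeGC's actual data format.

```python
# Sketch of a three-generation pedigree record; fields are illustrative only.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Person:
    name: str
    gender: str
    age: Optional[int] = None
    conditions: List[str] = field(default_factory=list)  # physical features / health conditions
    mother: Optional["Person"] = None
    father: Optional["Person"] = None


# Grandparents, client (parent generation), and proband: three generations.
grandmother = Person("A", "F", 78, ["hypertension"])
grandfather = Person("B", "M", 80)
client = Person("C", "F", 52, [], mother=grandmother, father=grandfather)
proband = Person("D", "M", 25, ["hearing loss"], mother=client)
```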

  9. Decade of Change.

    ERIC Educational Resources Information Center

    Hunter, Leslie Gene

    1995-01-01

    Discusses advancements in the field of history-related computer-assisted instruction and research. Describes the components of Historiography and Methods of Research, a class that introduces history students to such practical applications as the World Wide Web (WWW), File Transfer Protocol (FTP), listservs, archival access, and others. Briefly…

  10. 77 FR 71587 - Wisconsin Public Service Corporation; Notices of Intent To File License Applications, Filing of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-03

    ...] Wisconsin Public Service Corporation; Notices of Intent To File License Applications, Filing of Pre-Application Documents (PAD), Commencement of Pre-Filing Processes and Scoping, Request for Comments on the...: Notices of Intent to File License Applications for Two New Licenses and Commencing the Pre-filing Process...

  11. Grandparent visitation rights: an inappropriate intrusion or appropriate protection?

    PubMed

    Keith, Pat M; Wacker, Robbyn R

    2002-01-01

    Increased divorce rates, longevity in multi-generational families, and activism by older persons are a part of the context in which the role of grandparents in the family, an overview of grandparent visitation statutes, and controversy about visitation rights are discussed. The history and characteristics of grandparent visitation statutes, the process of filing, and criteria used to grant visitation provide insight into the complexities of the request for, and determination of, these rights. Family dynamics interact with a myriad of state statutes to suggest implications for research and policy.

  12. Data acquisition and processing history for the Explorer 33 (AIMP-D) satellite

    NASA Technical Reports Server (NTRS)

    Karras, T. J.

    1972-01-01

    The quality control monitoring system, using accounting and quality control data bases, made it possible to perform an in-depth analysis. Results show that the percentage of usable data files for experimenter analysis was 97.7%; only 0.4% of the data sequences supplied to the experimenter exhibited missing data. The 50th-percentile probability delay values (referenced to station record data) indicate that the analog tapes arrived within 11 days, the data were digitized within 4.2 weeks, and the experimenter tapes were delivered in 8.95 weeks or less.

  13. [Project HOPE contribution to the setting up of the professional identity of the first nurses from Alagoas, 1973-1977].

    PubMed

    Costa, Laís de Miranda Crispim; dos Santos, Regina Maria; Santos, Tânia Cristina Franco; Trezza, Maria Cristina Soares Figueiredo; Leite, Josete Luzia

    2014-01-01

    Social-historical study conducted to examine the contribution of the American Nurses of Project HOPE to the configuration of the professional identity of the first trained nurses in Alagoas, in the period of 1973-1977. The theoretical framework was the "Civilizing Process" of Norbert Elias. Primary sources were official documents and the personal files of 13 respondents interviewed by oral history; the secondary sources were authors of the History of Brazil/Alagoas. Data analysis showed that the configuration of the professional identity of the first trained nurses in Alagoas was a civilizing process, with all the nuances that make up the power relations. American Nursing made a significant contribution. However, the movement of resistance to this domination was very strong, resulting in a course that took advantage of the technological advancement and prestige brought by the United States while building a distinctive nursing identity from the social fabric woven at this meeting of so many different cultures.

  14. Development of a relational database to capture and merge clinical history with the quantitative results of radionuclide renography.

    PubMed

    Folks, Russell D; Savir-Baruch, Bital; Garcia, Ernest V; Verdes, Liudmila; Taylor, Andrew T

    2012-12-01

    Our objective was to design and implement a clinical history database capable of linking to our database of quantitative results from (99m)Tc-mercaptoacetyltriglycine (MAG3) renal scans and exporting a data summary for physicians or our software decision support system. For database development, we used a commercial program. Additional software was developed in Interactive Data Language. MAG3 studies were processed using an in-house enhancement of a commercial program. The relational database has 3 parts: a list of all renal scans (the RENAL database), a set of patients with quantitative processing results (the Q2 database), and a subset of patients from Q2 containing clinical data manually transcribed from the hospital information system (the CLINICAL database). To test interobserver variability, a second physician transcriber reviewed 50 randomly selected patients in the hospital information system and tabulated 2 clinical data items: hydronephrosis and presence of a current stent. The CLINICAL database was developed in stages and contains 342 fields comprising demographic information, clinical history, and findings from up to 11 radiologic procedures. A scripted algorithm is used to reliably match records present in both Q2 and CLINICAL. An Interactive Data Language program then combines data from the 2 databases into an XML (extensible markup language) file for use by the decision support system. A text file is constructed and saved for review by physicians. RENAL contains 2,222 records, Q2 contains 456 records, and CLINICAL contains 152 records. The interobserver variability testing found a 95% match between the 2 observers for presence or absence of ureteral stent (κ = 0.52), a 75% match for hydronephrosis based on narrative summaries of hospitalizations and clinical visits (κ = 0.41), and a 92% match for hydronephrosis based on the imaging report (κ = 0.84). We have developed a relational database system to integrate the quantitative results of MAG3 image processing with clinical records obtained from the hospital information system. We also have developed a methodology for formatting clinical history for review by physicians and export to a decision support system. We identified several pitfalls, including the fact that important textual information extracted from the hospital information system by knowledgeable transcribers can show substantial interobserver variation, particularly when record retrieval is based on the narrative clinical records.
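
    The matching and export steps can be pictured with a short sketch: records from the quantitative and clinical tables are joined on a patient identifier and written out as XML for a decision support system. The field names below are hypothetical; the databases described above use their own schemas.

```python
# Illustrative join of quantitative and clinical records, exported as XML.
import xml.etree.ElementTree as ET

q2 = [{"patient_id": "001", "relative_function_left": 48.0}]
clinical = [{"patient_id": "001", "hydronephrosis": "yes", "stent": "no"}]

clinical_by_id = {rec["patient_id"]: rec for rec in clinical}

root = ET.Element("renal_studies")
for rec in q2:
    match = clinical_by_id.get(rec["patient_id"])
    if match is None:
        continue                       # keep only patients present in both databases
    study = ET.SubElement(root, "study", patient_id=rec["patient_id"])
    for key, value in {**rec, **match}.items():
        if key != "patient_id":
            ET.SubElement(study, key).text = str(value)

ET.ElementTree(root).write("merged_renal_data.xml", encoding="utf-8")
```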

  15. Newsworkers during the Interwar Era: A Critique of Traditional Media History.

    ERIC Educational Resources Information Center

    Brennen, Bonnie

    This essay critiques the depictions of rank and file newsworkers of the 1920s and 1930s that are offered in traditional journalism histories and in cultural, social, and women's histories of the press. Following a tradition established in the first half of the 20th century, contemporary media historians continue to reify the use of other standard…

  16. JPRS Report, Soviet Union, Military History Journal, No. 2, February 1988.

    DTIC Science & Technology

    1988-06-16

    folio 6751, inv. 1, file 1, sheet 27. 4. TsPA IML [Central Party Archives of the Marxism-Leninism Institute], folio 19, inv. 3, file 111, sheets...revolutionary, as V.I. Lenin pointed out, civil wars of the proletariat against the bourgeoisie ...."(3) From an analysis of the policy of the imperialist

  17. 13 CFR 108.660 - Other items required to be filed by NMVC Company with SBA.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Other items required to be filed by NMVC Company with SBA. 108.660 Section 108.660 Business Credit and Assistance SMALL BUSINESS... other person who was required by SBA to complete a personal history statement, is charged with or...

  18. Identifying and Preserving the History of the Latino Visual Arts: Survey of Archival Initiatives and Recommendations. CSRC Research Report. Number 6

    ERIC Educational Resources Information Center

    Grimm, Tracy

    2005-01-01

    Sometimes it is not until a piece of history is lost that its significance is recognized. In the case of the Latino arts, much of this history remains in the file drawers, storage boxes, closets, and attics of those who created it. It is not too late to save this history. Quick action to identify what remains to be saved is vital. Relatively few…

  19. Recovering Nimbus era Observations at the NASA GES DISC

    NASA Astrophysics Data System (ADS)

    Meyer, D. J.; Johnson, J. E.; Esfandiari, A. E.; Zamkoff, E. B.; Al-Jazrawi, A. F.; Gerasimov, I. V.; Alcott, G. T.

    2017-12-01

    Between 1964 and 1978, NASA launched a series of seven Nimbus meteorological satellites which provided Earth observations for 30 years. These satellites, carrying a total of 33 instruments to observe the Earth at visible, infrared, ultraviolet, and microwave wavelengths, revolutionized weather forecasting, provided early observations of ocean color and atmospheric ozone, and prototyped location-based search and rescue capabilities. The Nimbus series paved the way for a number of currently operational systems such as the EOS Terra, Aqua and Aura platforms. The original data archive included both magnetic tapes and film media. These media are well past their expected end of life, placing at risk valuable data that are critical to extending the history of Earth observations back in time. GES DISC has been incorporating these data into a modern online archive by recovering the digital data files from the tapes, and scanning images of the data from film strips. The original data products were written on obsolete hardware systems in outdated file formats, and in the absence of metadata standards at that time, were often written in proprietary file structures. Through a tedious and laborious process, oft-corrupted data are recovered, and incomplete metadata and documentation are reconstructed.

  20. Building the boundaries of a science: First representations of Italian social psychology between 1875 and 1954.

    PubMed

    Sensales, Gilda; Areni, Alessandra; Del Secco, Alessandra

    2011-11-01

    The present study embraces the critical traditions of "New History" and of social representations theory articulated with the mainstream historiographical tradition of a bibliometric approach. The historical analysis deals with the early representations of Italian social psychology articulated and disseminated by some of the main Italian scientific-cultural and philosophical journals. We examined seven journals published between 1875 and 1954, and gathered 2,030 texts dealing with the various forms of social and collective psychology. We have applied a grid of content analysis whose data have been transcribed to a numerical file. At the same time, we have created a textual file containing the titles of the contributions as well as the names of the authors and scholars reviewed. The two files have been processed by SPAD-T for a correspondence analysis in which both lexical data and category variables have been considered as active variables. Through the scree-test, two factors that explain 18.90% of the variance have been singled out. Their combination has produced a factorial plan able to highlight three distinct areas differently characterized from journals and years. The results are also discussed with regard to the contextual historical frame.

  1. IDSP- INTERACTIVE DIGITAL SIGNAL PROCESSOR

    NASA Technical Reports Server (NTRS)

    Mish, W. H.

    1994-01-01

    The Interactive Digital Signal Processor, IDSP, consists of a set of time series analysis "operators" based on the various algorithms commonly used for digital signal analysis work. The processing of a digital time series to extract information is usually achieved by the application of a number of fairly standard operations. However, it is often desirable to "experiment" with various operations and combinations of operations to explore their effect on the results. IDSP is designed to provide an interactive and easy-to-use system for this type of digital time series analysis. The IDSP operators can be applied in any sensible order (even recursively), and can be applied to single time series or to simultaneous time series. IDSP is being used extensively to process data obtained from scientific instruments onboard spacecraft. It is also an excellent teaching tool for demonstrating the application of time series operators to artificially-generated signals. IDSP currently includes over 43 standard operators. Processing operators provide for Fourier transformation operations, design and application of digital filters, and Eigenvalue analysis. Additional support operators provide for data editing, display of information, graphical output, and batch operation. User-developed operators can be easily interfaced with the system to provide for expansion and experimentation. Each operator application generates one or more output files from an input file. The processing of a file can involve many operators in a complex application. IDSP maintains historical information as an integral part of each file so that the user can display the operator history of the file at any time during an interactive analysis. IDSP is written in VAX FORTRAN 77 for interactive or batch execution and has been implemented on a DEC VAX-11/780 operating under VMS. The IDSP system generates graphics output for a variety of graphics systems. The program requires the use of Versaplot and Template plotting routines and IMSL Math/Library routines. These software packages are not included in IDSP. The virtual memory requirement for the program is approximately 2.36 MB. The IDSP system was developed in 1982 and was last updated in 1986. Versaplot is a registered trademark of Versatec Inc. Template is a registered trademark of Template Graphics Software Inc. IMSL Math/Library is a registered trademark of IMSL Inc.
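
    The idea of carrying an operator history with each file, so that the full chain of processing steps can be displayed later, can be sketched as follows; the Series class and operator names are illustrative and are not IDSP's actual interface.

```python
# Each time series carries the list of operators already applied to it.
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Series:
    samples: List[float]
    history: List[str] = field(default_factory=list)   # operator history travels with the data


def apply(series: Series, name: str, op: Callable[[List[float]], List[float]]) -> Series:
    """Apply one operator and append its name to the series' history."""
    return Series(op(series.samples), series.history + [name])


raw = Series([1.0, 2.0, 4.0, 8.0])
detrended = apply(raw, "detrend", lambda s: [x - sum(s) / len(s) for x in s])
smoothed = apply(detrended, "smooth3",
                 lambda s: [sum(s[max(0, i - 1):i + 2]) / len(s[max(0, i - 1):i + 2])
                            for i in range(len(s))])
print(smoothed.history)   # ['detrend', 'smooth3']
```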

  2. 77 FR 61585 - FPL Energy Maine Hydro LLC; Notice of Intent To File License Application, Filing of Pre...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-10

    ... Hydro LLC; Notice of Intent To File License Application, Filing of Pre-Application Document (PAD... Application for a New License and Commencing Pre-filing Process. b. Project No.: 2531-067. c. Dated Filed... Commission a Pre-Application Document (PAD; including a proposed process plan and schedule), pursuant to 18...

  3. 77 FR 61584 - FFP Missouri 12, LLC; Notice of Intent To File License Application, Filing of Pre-Application...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-10

    ..., LLC; Notice of Intent To File License Application, Filing of Pre-Application Document, and Approving... Application and Request to Use the Traditional Licensing Process. b. Project No.: 13755-001. c. Date Filed.... m. Free Flow Power filed a Pre-Application Document (PAD; including a proposed process plan and...

  4. 78 FR 20362 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-04

    ... (``SPY''), Apple, Inc. (``AAPL''), SPDR Gold Trust (``GLD''), Google Inc. (``GOOG'') and Amazon.com Inc.... Further, the options marketplace has a history of offering preferential pricing to Customers. Finally... or unfairly discriminatory. Also, the options marketplace has a history of offering preferential...

  5. Chemical Stockpile Disposal Program. Chemical Weapons Movement History Compilation.

    DTIC Science & Technology

    1987-06-12

    Arsenal, Edgewood Arsenal, and Dugway Proving Ground. (2) The Army has transferred agent from certain munitions into other containers or munitions...Aberdeen Proving Ground, Maryland (Historical Volume). 27. Sea Dump of 700 Tons of Lewisite and Mustard, NAD, Concord, California, 1958... Proving Ground, Maryland (Historical Volumes). 42. SITREP File, SFTCM II, 1980; Chemical Agent Identification Sets (CAIS) Historical File; Information

  6. Total Petroleum Systems and Geologic Assessment of Oil and Gas Resources in the Powder River Basin Province, Wyoming and Montana

    USGS Publications Warehouse

    Anna, L.O.

    2009-01-01

    The U.S. Geological Survey completed an assessment of the undiscovered oil and gas potential of the Powder River Basin in 2006. The assessment of undiscovered oil and gas used the total petroleum system concept, which includes mapping the distribution of potential source rocks and known petroleum accumulations and determining the timing of petroleum generation and migration. Geologically based, it focuses on source and reservoir rock stratigraphy, timing of tectonic events and the configuration of resulting structures, formation of traps and seals, and burial history modeling. The total petroleum system is subdivided into assessment units based on similar geologic characteristics and accumulation and petroleum type. In chapter 1 of this report, five total petroleum systems, eight conventional assessment units, and three continuous assessment units were defined and the undiscovered oil and gas resources within each assessment unit quantitatively estimated. Chapter 2 describes data used in support of the process being applied by the U.S. Geological Survey (USGS) National Oil and Gas Assessment (NOGA) project. Digital tabular data used in this report and archival data that permit the user to perform further analyses are available elsewhere on this CD-ROM. The data may be imported by computer without the reader having to transcribe them from the Portable Document Format (.pdf) files of the text. Because of the number and variety of platforms and software available, graphical images are provided as .pdf files and tabular data are provided in a raw form as tab-delimited text files (.tab files).

  7. Image editing with Adobe Photoshop 6.0.

    PubMed

    Caruso, Ronald D; Postel, Gregory C

    2002-01-01

    The authors introduce Photoshop 6.0 for radiologists and demonstrate basic techniques of editing gray-scale cross-sectional images intended for publication and for incorporation into computerized presentations. For basic editing of gray-scale cross-sectional images, the Tools palette and the History/Actions palette pair should be displayed. The History palette may be used to undo a step or series of steps. The Actions palette is a menu of user-defined macros that save time by automating an action or series of actions. Converting an image to 8-bit gray scale is the first editing function. Cropping is the next action. Both decrease file size. Use of the smallest file size necessary for the purpose at hand is recommended. Final file size for gray-scale cross-sectional neuroradiologic images (8-bit, single-layer TIFF [tagged image file format] at 300 pixels per inch) intended for publication varies from about 700 Kbytes to 3 Mbytes. Final file size for incorporation into computerized presentations is about 10-100 Kbytes (8-bit, single-layer, gray-scale, high-quality JPEG [Joint Photographic Experts Group]), depending on source and intended use. Editing and annotating images before they are inserted into presentation software is highly recommended, both for convenience and flexibility. Radiologists should find that image editing can be carried out very rapidly once the basic steps are learned and automated. Copyright RSNA, 2002
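
    The same grayscale-convert, crop, and save workflow can also be scripted, for example with the Pillow library as sketched below; the file names and crop box are placeholders.

```python
# Scripted version of the editing steps: 8-bit grayscale conversion, crop,
# then a print-resolution TIFF and a small JPEG for presentations.
from PIL import Image

img = Image.open("axial_slice.png").convert("L")      # 8-bit grayscale, first step
img = img.crop((40, 40, 472, 472))                    # crop to the region of interest

# Publication copy: single-layer grayscale TIFF at 300 pixels per inch.
img.save("axial_slice_pub.tif", dpi=(300, 300))

# Presentation copy: high-quality JPEG, much smaller file size.
img.save("axial_slice_slide.jpg", quality=90)
```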

  8. A Strategy for Computing Disease and Non-Battle Injury Rates

    DTIC Science & Technology

    1989-12-12

    information on outpatient visits, monthly morbidity reports, service history data, environment data, and deployment information. In addition, more outpatient...information from available service history records. A STRATEGY FOR COMPUTING DISEASE AND NON-BATTLE INJURY RATES William M. Pugh Medical planners need an...between 1968 and 1979. The population information was acquired from service history files. These data included information on the patients’ age, sex

  9. Global mineral resource assessment: porphyry copper assessment of Mexico: Chapter A in Global mineral resource assessment

    USGS Publications Warehouse

    Hammarstrom, Jane M.; Robinson, Gilpin R.; Ludington, Steve; Gray, Floyd; Drenth, Benjamin J.; Cendejas-Cruz, Francisco; Espinosa, Enrique; Pérez-Segura, Efrén; Valencia-Moreno, Martín; Rodríguez-Castañeda, José Luis; Vásquez-Mendoza, Rigobert; Zürcher, Lukas

    2010-01-01

    This report includes a brief overview of porphyry copper deposits in Mexico, a description of the assessment process used, a summary of results, and appendixes. Appendixes A through K contain summary information for each tract, as follows: location, the geologic feature assessed, the rationale for tract delineation, tables and descriptions of known deposits and significant prospects, exploration history, model selection, rationale for the estimates, assessment results, and references. The accompanying digital map files (shapefiles) provide permissive tract outlines, assessment results, and data for deposits and prospects in a GIS format (appendix L).

  10. 76 FR 41790 - Natural Currents Energy Services, LLC; Notice of Intent To File License Application, Filing of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-15

    ... Energy Services, LLC; Notice of Intent To File License Application, Filing of Draft Application, Request for Waivers of Integrated Licensing Process Regulations Necessary for Expedited Processing of a.... Project No.: 12718-002. c. Date Filed: June 28, 2011. d. Submitted By: Natural Currents Energy Services...

  11. 12 CFR 225.43 - Procedures for filing, processing, publishing, and acting on notices.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Procedures for filing, processing, publishing... (REGULATION Y) Regulations Change in Bank Control § 225.43 Procedures for filing, processing, publishing, and.... Any person(s) filing a notice under this subpart shall publish, in a form prescribed by the Board, an...

  12. The Full Monty: Locating Resources, Creating, and Presenting a Web Enhanced History Course.

    ERIC Educational Resources Information Center

    Bazillion, Richard J.; Braun, Connie L.

    2001-01-01

    Discusses how to develop a history course using the World Wide Web; course development software; full text digitized articles, electronic books, primary documents, images, and audio files; and computer equipment such as LCD projectors and interactive whiteboards. Addresses the importance of support for faculty using technology in teaching. (PAL)

  13. Selected Test Items in American History. Bulletin Number 6, Fifth Edition.

    ERIC Educational Resources Information Center

    Anderson, Howard R.; Lindquist, E. F.

    Designed for high school students, this bulletin provides an extensive file of 1,062 multiple-choice questions in American history. Taken largely from the Iowa Every-Pupil Program and the Cooperative Test Service standardized examinations, the questions are chronologically divided into 16 topic areas. They include exploration and discovery;…

  14. 76 FR 70178 - Self-Regulatory Organizations; International Securities Exchange, LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-10

    ... legislative history. If the free market should determine whether proprietary data is sold to broker-dealers at... reasonable and equitably allocated fees for market data. ``In fact, the legislative history indicates that... proprietary products that end users will not purchase in sufficient numbers. Internet portals, such as Google...

  15. 77 FR 21609 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-10

    ... principles reflected in its legislative history. If the free market should determine whether proprietary data.... ``In fact, the legislative history indicates that the Congress intended that the market system `evolve... Subscribers will not purchase in sufficient numbers. Internet portals, such as Google, impose a discipline by...

  16. 78 FR 41483 - Self-Regulatory Organizations; BATS Exchange, Inc.; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-10

    ... the principles reflected in its legislative history. If the free market should determine whether... reasonable and equitably allocated fees for market data. ``In fact, the legislative history indicates that... portals, such as Google, impose a discipline by providing only data that will enable them to attract...

  17. 77 FR 3313 - Self-Regulatory Organizations; the NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-23

    ... principles reflected in its legislative history. If the free market should determine whether proprietary data... equitably allocated fees for market data. ``In fact, the legislative history indicates that the Congress... will not purchase in sufficient numbers. Internet portals, such as Google, impose a discipline by...

  18. 26 CFR 1.874-1 - Allowance of deductions and credits to nonresident alien individuals.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    .... Nonresident alien with effectively connected income. In Year 1, A, a computer programmer, opened an office in... failure to file. In Year 1, A, a computer programmer, opened an office in the United States to market and.... Example 6. Nonresident alien with prior filing history. A began a U.S. trade or business in Year 1 as a...

  19. ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wieselquist, William A.; Thompson, Adam B.; Bowman, Stephen M.

    2016-04-01

    Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.
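
    The kind of per-assembly bookkeeping the Automator replaces can be illustrated with a short script that reads a table of assembly data and writes one input file per assembly from a template. The CSV columns and the template text below are placeholders, not the actual ORIGAMI input format.

```python
# Generate one placeholder input file per assembly from a table of assembly data.
import csv
from pathlib import Path

TEMPLATE = (
    "assembly {assembly_id}\n"
    "enrichment_wt_pct {enrichment}\n"
    "heavy_metal_kg {hm_mass}\n"
    "discharge_date {discharge_date}\n"
)

out_dir = Path("origami_inputs")
out_dir.mkdir(exist_ok=True)

with open("assemblies.csv", newline="") as f:
    for row in csv.DictReader(f):          # one row of operating history per assembly
        text = TEMPLATE.format(**row)
        (out_dir / f"{row['assembly_id']}.inp").write_text(text)
```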

  20. 77 FR 55210 - Public Service Company of New Hampshire; Notice of Intent To File License Application, Filing of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-07

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 2457-038] Public Service... (PAD), Commencement of Pre-Filing Process, and Scoping; Request for Comments on the PAD And Scoping... File License Application for a New License and Commencing Pre-filing Process. b. Project No.: 2457-038...

  1. Advances in Software Tools for Pre-processing and Post-processing of Overset Grid Computations

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    2004-01-01

    Recent developments in three pieces of software for performing pre-processing and post-processing work on numerical computations using overset grids are presented. The first is the OVERGRID graphical interface which provides a unified environment for the visualization, manipulation, generation and diagnostics of geometry and grids. Modules are also available for automatic boundary conditions detection, flow solver input preparation, multiple component dynamics input preparation and dynamics animation, simple solution viewing for moving components, and debris trajectory analysis input preparation. The second is a grid generation script library that enables rapid creation of grid generation scripts. A sample of recent applications will be described. The third is the OVERPLOT graphical interface for displaying and analyzing history files generated by the flow solver. Data displayed include residuals, component forces and moments, number of supersonic and reverse flow points, and various dynamics parameters.

  2. 47 CFR 0.141 - Functions of the Bureau.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... accessibility of communications services and technologies for persons with disabilities. (g) Plans, develops... Affiliation Agreements, court citation files, and legislative histories concerning telecommunications dockets...

  3. A Scientific Data Provenance Harvester for Distributed Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephan, Eric G.; Raju, Bibi; Elsethagen, Todd O.

    Data provenance provides a way for scientists to observe how experimental data originates, conveys process history, and explains influential factors such as experimental rationale and associated environmental factors from system metrics measured at runtime. The US Department of Energy Office of Science Integrated end-to-end Performance Prediction and Diagnosis for Extreme Scientific Workflows (IPPD) project has developed a provenance harvester that is capable of collecting observations from file based evidence typically produced by distributed applications. To achieve this, file based evidence is extracted and transformed into an intermediate data format inspired in part by W3C CSV on the Web recommendations, called the Harvester Provenance Application Interface (HAPI) syntax. This syntax provides a general means to pre-stage provenance into messages that are both human readable and capable of being written to a provenance store, Provenance Environment (ProvEn). HAPI is being applied to harvest provenance from climate ensemble runs for the Accelerated Climate Modeling for Energy (ACME) project funded under the U.S. Department of Energy’s Office of Biological and Environmental Research (BER) Earth System Modeling (ESM) program. ACME informally provides provenance in a native form through configuration files, directory structures, and log files that contain success/failure indicators, code traces, and performance measurements. Because of its generic format, HAPI is also being applied to harvest tabular job management provenance from Belle II DIRAC scheduler relational database tables as well as other scientific applications that log provenance-related information.
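
    The general harvesting pattern, scanning file-based evidence for key/value pairs and staging them as tabular rows, can be sketched as below. The regular expression and output columns are illustrative; they are not the project's actual HAPI syntax.

```python
# Scan a log file for key=value evidence and emit CSV rows ready for staging
# to a provenance store (columns and pattern are illustrative only).
import csv
import re

PAIR = re.compile(r"(\w+)=([^\s,]+)")

def harvest(log_path: str, run_id: str, out_csv: str) -> None:
    with open(log_path) as log, open(out_csv, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["run_id", "line", "key", "value"])   # CSV-on-the-Web-style table
        for lineno, line in enumerate(log, start=1):
            for key, value in PAIR.findall(line):
                writer.writerow([run_id, lineno, key, value])
```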

  4. Newspapers of New York State: A Statewide Plan for Bibliographic Control and Preservation. Final Report of the Task Force on Newspaper Bibliography and Preservation.

    ERIC Educational Resources Information Center

    New York State Education Dept., Albany. Div. of Library Development.

    Underscoring the importance of newspapers as information sources on social and economic history, this report puts forth a statewide plan to catalog and preserve the newspaper files of New York State. The plan proposes the compilation of a statewide bibliography in two stages: a preliminary survey to isolate and preserve files in greatest immediate…

  5. 34 CFR 682.511 - Procedures for filing a claim.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) The repayment instrument. (iv) A payment history, as described in § 682.414(a)(3)(ii)(I). (v) A collection history, as described in § 682.414(a)(3)(ii)(J). (vi) A copy of the final demand letter if... America of all rights, title, and interest of the lender in the note underlying the claim. (d) Bankruptcy...

  6. 34 CFR 682.511 - Procedures for filing a claim.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) The repayment instrument. (iv) A payment history, as described in § 682.414(a)(3)(ii)(I). (v) A collection history, as described in § 682.414(a)(3)(ii)(J). (vi) A copy of the final demand letter if... America of all rights, title, and interest of the lender in the note underlying the claim. (d) Bankruptcy...

  7. 34 CFR 682.511 - Procedures for filing a claim.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) The repayment instrument. (iv) A payment history, as described in § 682.414(a)(3)(ii)(I). (v) A collection history, as described in § 682.414(a)(3)(ii)(J). (vi) A copy of the final demand letter if... America of all rights, title, and interest of the lender in the note underlying the claim. (d) Bankruptcy...

  8. 77 FR 39752 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-05

    ... history. If the free market should determine whether proprietary data is sold to broker-dealers at all, it... for market data. ``In fact, the legislative history indicates that the Congress intended that the... purchase in sufficient numbers. Internet portals, such as Google, impose a discipline by providing only...

  9. 76 FR 47630 - Self-Regulatory Organizations; International Securities Exchange, LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-05

    ... advanced the goals of the Act and the principles reflected in its legislative history. If the free market... reasonable and equitably allocated fees for market data. ``In fact, the legislative history indicates that.... Internet portals, such as Google, impose a discipline by providing only data that will enable them to...

  10. 78 FR 18378 - Self-Regulatory Organizations; NASDAQ OMX PHLX LLC; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-26

    ... history. If the free market should determine whether proprietary data is sold to broker-dealers at all, it... equitably allocated fees for market data. ``In fact, the legislative history indicates that the Congress... Subscribers will not purchase in sufficient numbers. Internet portals, such as Google, impose a discipline by...

  11. 78 FR 41447 - Self-Regulatory Organizations; BATS Y-Exchange, Inc.; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-10

    ... history. If the free market should determine whether proprietary data is sold to BDs at all, it follows... equitably allocated fees for market data. ``In fact, the legislative history indicates that the Congress... sufficient numbers. Internet portals, such as Google, impose a discipline by providing only data that will...

  12. Pointing History Engine for the Spitzer Space Telescope

    NASA Technical Reports Server (NTRS)

    Bayard, David; Ahmed, Asif; Brugarolas, Paul

    2007-01-01

    The Pointing History Engine (PHE) is a computer program that provides mathematical transformations needed to reconstruct, from downlinked telemetry data, the attitude of the Spitzer Space Telescope (formerly known as the Space Infrared Telescope Facility) as a function of time. The PHE also serves as an example for development of similar pointing reconstruction software for future space telescopes. The transformations implemented in the PHE take account of the unique geometry of the Spitzer telescope-pointing chain, including all data on relative alignments of components, and all information available from attitude-determination instruments. The PHE makes it possible to coordinate attitude data with observational data acquired at the same time, so that any observed astronomical object can be located for future reference and re-observation. The PHE is implemented as a subroutine used in conjunction with telemetry-formatting services of the Mission Image Processing Laboratory of NASA's Jet Propulsion Laboratory to generate the Boresight Pointing History File (BPHF). The BPHF is an archival database designed to serve as Spitzer's primary astronomical reference documenting where the telescope was pointed at any time during its mission.

  13. Forensic Analysis of Compromised Computers

    NASA Technical Reports Server (NTRS)

    Wolfe, Thomas

    2004-01-01

    Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and puts out the aforementioned text file. The format of the text file is designed to enable the submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting files and examination of such characteristics of files as ownership, time of creation, and time of most recent access, all of which characteristics are among the data included in the text file.
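
    The original is a PERL script; the sketch below shows the same idea in Python: prompt for the tree root, output file, and depth, then walk the tree and write one tab-separated line per file for spreadsheet import. The exact fields recorded here are illustrative.

```python
# Walk a directory tree to a chosen depth and write one tab-separated line per
# file (owner uid, change time, last access time) for spreadsheet import.
import os
import time

def depth(root: str, path: str) -> int:
    rel = os.path.relpath(path, root)
    return 0 if rel == "." else rel.count(os.sep) + 1

def analyze(root: str, out_file: str, max_levels: int) -> None:
    with open(out_file, "w") as out:
        out.write("path\tuid\tchanged\taccessed\n")
        for dirpath, dirnames, filenames in os.walk(root):
            if depth(root, dirpath) >= max_levels:
                dirnames[:] = []            # do not descend further
                continue
            for name in filenames:          # directories themselves are skipped, as in the script
                full = os.path.join(dirpath, name)
                st = os.stat(full)
                out.write(f"{full}\t{st.st_uid}\t"
                          f"{time.ctime(st.st_ctime)}\t{time.ctime(st.st_atime)}\n")

analyze(input("Root of directory tree: "),
        input("Output file name: "),
        int(input("Subtree levels to process: ")))
```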

  14. Recovering Nimbus Era Observations at the NASA GES DISC

    NASA Technical Reports Server (NTRS)

    Meyer, D.; Johnson, J.; Esfandiari, A.; Zamkoff, E.; Al-Jazrawi, A.; Gerasimov, I.; Alcott, G.

    2017-01-01

    Between 1964 and 1978, NASA launched a series of seven Nimbus meteorological satellites which provided Earth observations for 30 years. These satellites, carrying a total of 33 instruments to observe the Earth at visible, infrared, ultraviolet, and microwave wavelengths, revolutionized weather forecasting, provided early observations of ocean color and atmospheric ozone, and prototyped location-based search and rescue capabilities. The Nimbus series paved the way for a number of currently operational systems such as the EOS (Earth Observation System) Terra, Aqua, and Aura platforms. The original data archive includes both magnetic tapes and film media. These media are well past their expected end of life, placing at risk valuable data that are critical to extending the history of Earth observations back in time. GES DISC (Goddard Earth Sciences Data and Information Services Center) has been incorporating these data into a modern online archive by recovering the digital data files from the tapes, and scanning images of the data from film strips. The digital data products were written on obsolete hardware systems in outdated file formats, and in the absence of metadata standards at that time, were often written in proprietary file structures. Through a tedious and laborious process, oft-corrupted data are recovered, and incomplete metadata and documentation are reconstructed.

  15. Framework for Integrating Science Data Processing Algorithms Into Process Control Systems

    NASA Technical Reports Server (NTRS)

    Mattmann, Chris A.; Crichton, Daniel J.; Chang, Albert Y.; Foster, Brian M.; Freeborn, Dana J.; Woollard, David M.; Ramirez, Paul M.

    2011-01-01

    A software framework called PCS Task Wrapper is responsible for standardizing the setup, process initiation, execution, and file management tasks surrounding the execution of science data algorithms, which are referred to by NASA as Product Generation Executives (PGEs). PGEs codify a scientific algorithm, some step in the overall scientific process involved in a mission science workflow. The PCS Task Wrapper provides a stable operating environment to the underlying PGE during its execution lifecycle. If the PGE requires a file, or metadata regarding the file, the PCS Task Wrapper is responsible for delivering that information to the PGE in a manner that meets its requirements. If the PGE requires knowledge of upstream or downstream PGEs in a sequence of executions, that information is also made available. Finally, if information regarding disk space, or node information such as CPU availability, etc., is required, the PCS Task Wrapper provides this information to the underlying PGE. After this information is collected, the PGE is executed, and its output Product file and Metadata generation is managed via the PCS Task Wrapper framework. The innovation is responsible for marshalling output Products and Metadata back to a PCS File Management component for use in downstream data processing and pedigree. In support of this, the PCS Task Wrapper leverages the PCS Crawler Framework to ingest (during pipeline processing) the output Product files and Metadata produced by the PGE. The architectural components of the PCS Task Wrapper framework include PGE Task Instance, PGE Config File Builder, Config File Property Adder, Science PGE Config File Writer, and PCS Met file Writer. This innovative framework is really the unifying bridge between the execution of a step in the overall processing pipeline, and the available PCS component services as well as the information that they collectively manage.
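
    A minimal stand-in for the wrapper pattern described above is sketched below: write a configuration file for the science executable, run it, then gather its output products and a small metadata record for downstream ingestion. The paths, keys, and executable interface are hypothetical, not the actual PCS component APIs.

```python
# Toy task wrapper: stage a config, run the science executable, collect outputs.
import json
import subprocess
from pathlib import Path
from typing import List

def run_pge(pge_cmd: str, work_dir: Path, upstream_inputs: List[str]) -> dict:
    work_dir.mkdir(parents=True, exist_ok=True)
    config = work_dir / "pge_config.json"
    config.write_text(json.dumps({"inputs": upstream_inputs}))   # config file for the PGE

    subprocess.run([pge_cmd, str(config)], cwd=work_dir, check=True)

    products = sorted(str(p) for p in work_dir.glob("*.product"))
    metadata = {"pge": pge_cmd, "products": products}            # handed to file management
    (work_dir / "pge_metadata.json").write_text(json.dumps(metadata))
    return metadata
```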

  16. 76 FR 32144 - Marine Mammals; File No. 15543

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-03

    ..., dynamics, life history, social structure, genetic structure including paternity patterns, and human interactions. The sampling and tagging will support health assessment, auditory system, feeding, and ranging...

  17. 76 FR 30309 - Marine Mammals; File No. 16087

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-25

    ... authorizes taking marine mammals in California, Oregon, and Washington to investigate population status, health, demographic parameters, life history and foraging ecology of California sea lions (Zalophus...

  18. SEGY to ASCII: Conversion and Plotting Program

    USGS Publications Warehouse

    Goldman, Mark R.

    1999-01-01

    This report documents a computer program to convert standard 4 byte, IBM floating point SEGY files to ASCII xyz format. The program then optionally plots the seismic data using the GMT plotting package. The material for this publication is contained in a standard tar file (of99-126.tar) that is uncompressed and 726 K in size. It can be downloaded to any Unix machine. Move the tar file to the directory you wish to use it in, then type 'tar xvf of99-126.tar'. The archive files (and diskette) contain a NOTE file, a README file, a version-history file, source code, a makefile for easy compilation, and an ASCII version of the documentation. The archive files (and diskette) also contain example test files, including a typical SEGY file along with the resulting ASCII xyz and postscript files. Compiling the source code into an executable requires a C++ compiler. The program has been successfully compiled using Gnu's g++ version 2.8.1, and use of other compilers may require modifications to the existing source code. The g++ compiler is a free, high quality C++ compiler and may be downloaded from the ftp site ftp://ftp.gnu.org/gnu. Plotting the seismic data requires the GMT plotting package. The GMT plotting package may be downloaded from the web site: http://www.soest.hawaii.edu/gmt/
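
    The heart of such a conversion is decoding the 4-byte IBM floating-point samples. The sketch below (in Python, for consistency with the other examples in this listing, rather than the program's C++) shows the decoding rule; it is an illustration of the format, not the USGS program's actual code.

```python
# Decode one big-endian 4-byte IBM/360 single-precision float:
# value = (-1)^sign * (fraction / 2^24) * 16^(exponent - 64)
import struct

def ibm_to_float(word: bytes) -> float:
    (u,) = struct.unpack(">I", word)
    sign = -1.0 if u & 0x80000000 else 1.0
    exponent = (u >> 24) & 0x7F          # excess-64, base-16 exponent
    fraction = u & 0x00FFFFFF            # 24-bit fraction, no hidden bit
    return sign * (fraction / float(1 << 24)) * 16.0 ** (exponent - 64)

# Example: 0x42640000 encodes 100.0
print(ibm_to_float(bytes.fromhex("42640000")))
```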

  19. Effect of various digital processing algorithms on the measurement accuracy of endodontic file length.

    PubMed

    Kal, Betül Ilhan; Baksi, B Güniz; Dündar, Nesrin; Sen, Bilge Hakan

    2007-02-01

    The aim of this study was to compare the accuracy of endodontic file length measurements after application of various image enhancement modalities. Endodontic files of three different ISO sizes were inserted in 20 single-rooted extracted permanent mandibular premolar teeth and standardized images were obtained. Original digital images were then enhanced using five processing algorithms. Six evaluators measured the length of each file on each image. The measurements from each processing algorithm and each file size were compared using repeated measures ANOVA and Bonferroni tests (P = 0.05). Paired t test was performed to compare the measurements with the true lengths of the files (P = 0.05). All of the processing algorithms provided significantly shorter measurements than the true length of each file size (P < 0.05). The threshold enhancement modality produced significantly higher mean error values (P < 0.05), while there was no significant difference among the other enhancement modalities (P > 0.05). A decrease in mean error value was observed with increasing file size (P < 0.05). Invert, contrast/brightness and edge enhancement algorithms may be recommended for accurate file length measurements when utilizing storage phosphor plates.

  20. Do clinicians assess patients' religiousness? An audit of an aged psychiatry community team.

    PubMed

    Payman, Vahid; Lim, Zheng Jie

    2018-03-01

    To determine the frequency and quality of religious history taking of patients by clinicians working in an old age psychiatry service. A retrospective audit of 80 randomised patient files from the Koropiko Mental Health Services for Older People (MHSOP) in Middlemore Hospital, Auckland, New Zealand. A total of 66 clinical records were available for analysis. A religious history was taken in 33/66 (50%) patients. However, when such histories were evaluated using the FICA assessment tool, only 10/33 (30.3%) histories contained detailed information regarding the patient's religiousness. The infrequency and low quality of religious histories discovered in this audit suggest that clinicians need more training in taking a religious history from patients.

  1. Storing files in a parallel computing system based on user-specified parser function

    DOEpatents

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Manzanares, Adam; Torres, Aaron

    2014-10-21

    Techniques are provided for storing files in a parallel computing system based on a user-specified parser function. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a parser from the distributed application for processing the plurality of files prior to storage; and storing one or more of the plurality of files in one or more storage nodes of the parallel computing system based on the processing by the parser. The plurality of files comprise one or more of a plurality of complete files and a plurality of sub-files. The parser can optionally store only those files that satisfy one or more semantic requirements of the parser. The parser can also extract metadata from one or more of the files and the extracted metadata can be stored with one or more of the plurality of files and used for searching for files.
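
    The idea can be sketched generically: a user-supplied parser inspects each file, decides whether it satisfies the semantic requirements, and returns metadata to store alongside the kept files. The parser signature and dict-backed storage node below are illustrative stand-ins, not the patented system's interface.

```python
# A user-supplied parser filters files and extracts metadata before storage.
from pathlib import Path
from typing import Callable, Dict, Optional, Tuple

Parser = Callable[[bytes], Optional[Dict[str, str]]]   # returns metadata, or None to skip

def store_files(paths, parser: Parser,
                storage_node: Dict[str, Tuple[bytes, Dict[str, str]]]) -> None:
    for path in paths:
        data = Path(path).read_bytes()
        metadata = parser(data)
        if metadata is None:
            continue                     # file does not meet the parser's semantic requirements
        storage_node[str(path)] = (data, metadata)   # keep extracted metadata for later searches

# Example parser: keep only files that start with a known magic string.
def keep_records(data: bytes) -> Optional[Dict[str, str]]:
    return {"kind": "record", "bytes": str(len(data))} if data.startswith(b"REC") else None
```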

  2. Current challenges and concepts of the thermomechanical treatment of nickel-titanium instruments.

    PubMed

    Shen, Ya; Zhou, Hui-min; Zheng, Yu-feng; Peng, Bin; Haapasalo, Markus

    2013-02-01

    The performance and mechanical properties of nickel-titanium (NiTi) instruments are influenced by factors such as cross-section, flute design, raw material, and manufacturing processes. Many improvements have been proposed by manufacturers during the past decade to provide clinicians with safer and more efficient instruments. The mechanical performance of NiTi alloys is sensitive to their microstructure and associated thermomechanical treatment history. Heat treatment or thermal processing is one of the most fundamental approaches toward adjusting the transition temperature in NiTi alloy, which affects the fatigue resistance of NiTi endodontic files. The newly developed NiTi instruments made from controlled memory wire, M-Wire (Dentsply Tulsa Dental Specialties, Tulsa, OK), or R-phase wire represent the next generation of NiTi alloys with improved flexibility and fatigue resistance. The advantages of NiTi files for canal cleaning and shaping are decreased canal transportation and ledging, a reduced risk of file fracture, and faster and more efficient instrumentation. The clinician must understand the nature of different NiTi raw materials and their impact on instrument performance because many new instruments are introduced on a regular basis. This review summarizes the metallurgical properties of next-generation NiTi instruments, the impact of thermomechanical treatment on instrument flexibility, and the resistance to cyclic fatigue and torsion. The aim of this review was to provide clinicians with the knowledge necessary for evidence-based practices, maximizing the benefits from the selection and application of NiTi rotary instruments for root canal treatment. Copyright © 2013 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  3. User's guide to HYPOINVERSE-2000, a Fortran program to solve for earthquake locations and magnitudes

    USGS Publications Warehouse

    Klein, Fred W.

    2002-01-01

    Hypoinverse is a computer program that processes files of seismic station data for an earthquake (like p wave arrival times and seismogram amplitudes and durations) into earthquake locations and magnitudes. It is one of a long line of similar USGS programs including HYPOLAYR (Eaton, 1969), HYPO71 (Lee and Lahr, 1972), and HYPOELLIPSE (Lahr, 1980). If you are new to Hypoinverse, you may want to start by glancing at the section “SOME SIMPLE COMMAND SEQUENCES” to get a feel for some simpler sessions. This document is essentially an advanced user’s guide, and reading it sequentially will probably plow the reader into more detail than he/she needs. Every user must have a crust model, station list and phase data input files, and glancing at these sections is a good place to begin. The program has many options because it has grown over the years to meet the needs of one of the largest seismic networks in the world, but small networks with just a few stations do use the program and can ignore most of the options and commands. History and availability. Hypoinverse was originally written for the Eclipse minicomputer in 1978 (Klein, 1978). A revised version for VAX and Pro-350 computers (Klein, 1985) was later expanded to include multiple crustal models and other capabilities (Klein, 1989). This current report documents the expanded Y2000 version and it supersedes the earlier documents. It serves as a detailed user's guide to the current version running on unix and VAX-alpha computers, and to the version supplied with the Earthworm earthquake digitizing system. Fortran-77 source code (Sun and VAX compatible) and copies of this documentation are available via anonymous ftp from computers in Menlo Park. At present, the computer is swave.wr.usgs.gov and the directory is /ftp/pub/outgoing/klein/hyp2000. If you are running Hypoinverse on one of the Menlo Park EHZ or NCSN unix computers, the executable currently is ~klein/hyp2000/hyp2000. New features. The Y2000 version of Hypoinverse includes all of the previous capabilities, but adds Y2000 formats to those defined earlier. In most cases, the new formats add 2 digits to the year field to accommodate the century. Other fields are sometimes rearranged or expanded to accommodate a better field order. The Y2000 formats are invoked with the “200” command. When the Y2000 flag is turned on, all files are read and written in the new format and there is no mixing of format types in a single run. Some formats without a date field, like station files, have not changed. A separate program called 2000CONV has been written to convert old formats to new. Other new features, like expanded station names, calculating amplitude magnitudes from a variety of digital seismometers, station history files, interactive earthquake processing, and locations from CUSP (Caltech USGS Seismic Processing) binary files have been added. General features. Hypoinverse will locate any number of events in an input file, which can be in one of several different formats. Any or all of printout, summary or archive output may be produced. Hypoinverse is driven by user commands. The various commands define input and output files, set adjustable parameters, and solve for locations of a file of earthquake data using the parameters and files currently set. It is both interactive and "batch" in that commands may be executed either from the keyboard or from a file. You execute the commands in a file by typing @filename at the Hypoinverse prompt. 
Users may either supply parameters on the command line, or omit them and are prompted interactively. The current parameter values are displayed and may be taken as defaults by pressing just the RETURN key after the prompt. This makes the program very easy to use, providing you can remember the names of the commands. Combining commands with and without their required parameters into a command file permits a variety of customized procedures such as automatic input of crustal model and station data, but prompting for a different phase file each time. All commands are 3 letters long and most require one or more parameters or file names. If they appear on a line with a command, character strings such as filenames must be enclosed in apostrophes (single quotes). Appendix 1 gives this and other free-format rules for supplying parameters, which are parsed in Fortran. When several parameters are required following a command, any of them may be omitted by replacing them with null fields (see appendix 1). A null field leaves that parameter unchanged from its current or default value. When you start HYPOINVERSE, default values are in effect for all parameters except file names. Hypoinverse is a complicated program with many features and options. Many of these "advanced" or seldom used features are documented here, but are more detailed than a typical user needs to read about when first starting with the program. I have put some of this material in smaller type so that a first time user can concentrate on the more important information.
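
    As an illustration of the command conventions just described (three-letter commands, file names in apostrophes, and execution of a whole file of commands with @filename), a command file might look like the sketch below. Only the "200" command and the quoting and @-file mechanics come from this report; the other command names, parameters, and file names are placeholders rather than documented commands, and should be checked against the command reference before use.

        200 T
        STA 'allsta.dat'
        CRH 1 'model1.crh'
        PHS 'phase.arc'

    Saved as, say, locate.hyp, such a file would be executed by typing @locate.hyp at the Hypoinverse prompt, with any omitted parameters left at their current or default values.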

  4. 76 FR 1612 - Natural Currents Energy Services, LLC; Notice of Intent To File License Application, Filing of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-11

    ... Energy Services, LLC; Notice of Intent To File License Application, Filing of Draft Application, Request for Waivers of Integrated Licensing Process Regulations Necessary for Expedited Processing of a... Energy Services, LLC. e. Name of Project: Will's Hole Tidal Electric Project. f. Location: The project...

  5. 29 CFR 1641.6 - Processing of charges filed with EEOC.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 4 2013-07-01 2013-07-01 false Processing of charges filed with EEOC. 1641.6 Section 1641.6 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES... HOLDING GOVERNMENT CONTRACTS OR SUBCONTRACTS § 1641.6 Processing of charges filed with EEOC. (a) ADA cause...

  6. 29 CFR 1641.6 - Processing of charges filed with EEOC.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 4 2012-07-01 2012-07-01 false Processing of charges filed with EEOC. 1641.6 Section 1641.6 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES... HOLDING GOVERNMENT CONTRACTS OR SUBCONTRACTS § 1641.6 Processing of charges filed with EEOC. (a) ADA cause...

  7. 29 CFR 1641.6 - Processing of charges filed with EEOC.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 4 2010-07-01 2010-07-01 false Processing of charges filed with EEOC. 1641.6 Section 1641.6 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES... HOLDING GOVERNMENT CONTRACTS OR SUBCONTRACTS § 1641.6 Processing of charges filed with EEOC. (a) ADA cause...

  8. 29 CFR 1641.6 - Processing of charges filed with EEOC.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 4 2011-07-01 2011-07-01 false Processing of charges filed with EEOC. 1641.6 Section 1641.6 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES... HOLDING GOVERNMENT CONTRACTS OR SUBCONTRACTS § 1641.6 Processing of charges filed with EEOC. (a) ADA cause...

  9. 29 CFR 1641.6 - Processing of charges filed with EEOC.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 4 2014-07-01 2014-07-01 false Processing of charges filed with EEOC. 1641.6 Section 1641.6 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES... HOLDING GOVERNMENT CONTRACTS OR SUBCONTRACTS § 1641.6 Processing of charges filed with EEOC. (a) ADA cause...

  10. The President’s Office of Science and Technology Policy: Issues for Congress

    DTIC Science & Technology

    2008-11-10

    www.ostp.gov/galleries/default-file/OSTP%20org%20charts%2010-15-08.pdf]. This report will provide an overview of the history of science and technology...Greenwood Press, 1997). 6 Jeffrey K. Stine, A History of Science Policy in the United States, 1940-1985, Report for the House Committee on Science and...Jeffrey K. Stine, A History of Science Policy in the United States, 1940-1985, Report for the House Committee on Science and Technology Task Force on

  11. Physical and Chemical Change: The Long History of the Iron Filings and Sulfur Experiment

    ERIC Educational Resources Information Center

    Palmer, W. P.

    1995-01-01

    As a part of a doctoral thesis considering the history of teaching physical and chemical change, 641 chemistry/science textbooks have currently been examined. These books are from many different countries and date from the eighteenth century to the present time. The books have described a wide variety of experiments to illustrate the difference…

  12. A SHORT HISTORY CSISRS - AT THE CUTTING EDGE OF NUCLEAR DATA INFORMATION STORAGE AND RETRIEVAL SYSTEMS AND ITS RELATIONSHIP TO CINDA, EXFOR AND ENDF.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HOLDEN, N.E.

    A short history of CSISRS, pronounced "scissors" and standing for the Cross Section Information Storage and Retrieval System, is given. The relationship of CSISRS to CINDA, to the neutron nuclear data four-centers, to EXFOR and to ENDF, the evaluated neutron nuclear data file, is briefly explained.

  13. Compiler-assisted multiple instruction rollback recovery using a read buffer

    NASA Technical Reports Server (NTRS)

    Alewine, Neal J.; Chen, Shyh-Kwei; Fuchs, W. Kent; Hwu, Wen-Mei W.

    1995-01-01

    Multiple instruction rollback (MIR) is a technique that has been implemented in mainframe computers to provide rapid recovery from transient processor failures. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs have also been developed which remove rollback data hazards directly with data-flow transformations. This paper describes compiler-assisted techniques to achieve multiple instruction rollback recovery. We observe that some data hazards resulting from instruction rollback can be resolved efficiently by providing an operand read buffer while others are resolved more efficiently with compiler transformations. The compiler-assisted scheme presented consists of hardware that is less complex than shadow files, history files, history buffers, or delayed write buffers, while experimental evaluation indicates performance improvement over compiler-based schemes.

  14. A real time ECG signal processing application for arrhythmia detection on portable devices

    NASA Astrophysics Data System (ADS)

    Georganis, A.; Doulgeraki, N.; Asvestas, P.

    2017-11-01

    Arrhythmia describes disorders of the normal heart rate which, depending on the case, can even be fatal for a patient with a severe history of heart disease. The purpose of this work is to develop an application for heart signal visualization, processing and analysis on Android portable devices, e.g. mobile phones and tablets. The application initially retrieves the signal from a file; at a later stage this signal is processed and analysed within the device so that it can be classified according to the features of the arrhythmia. The processing and analysis stage includes several algorithms, among them the Moving Average and Pan-Tompkins algorithms as well as wavelets, in order to extract features and characteristics. At the final stage, testing is performed by running the application on real-time records, using the TCP network protocol to connect the mobile device to a simulated signal source. The classification of the ECG beats is performed by neural networks.

  15. ES-doc-errata: an issue tracker platform for CMIP6

    NASA Astrophysics Data System (ADS)

    Ben Nasser, Atef; Levavasseur, Guillaume; Greenslade, Mark; Denvil, Sébastien

    2017-04-01

    In the context of overseeing data quality, and as a result of the inherent complexity of projects such as CMIP5/6, keeping track of the status of datasets and of the version evolution they undergo over their life cycle is a mandatory task. The ES-doc-errata project aims to keep track of the issues affecting specific versions of datasets and files. It enables users to resolve the history tree of each dataset or file, allowing a better-informed choice of the data used in their work based on its status. The ES-doc-errata project has been designed and built on top of the Parent-IDentifiers handle service that will be deployed in the next iteration of the CMIP project, ensuring maximum usability of the ESGF ecosystem, and is encapsulated in the ES-doc structure. Consuming PIDs from the handle service is guided by a purpose-built algorithm that extracts metadata about the issues that may affect the quality of datasets or files and cause newer versions to be published in place of older, deprecated ones. The algorithm can deduce the nature of the flaws down to file granularity, which is of high value to the end user. The platform has been designed with usability in mind, both for end users specialized in the data publishing process and for other scientists who need feedback on the reliability of the data they use. To this end, a specific set of rules and a code of conduct have been defined. A validation process ensures the quality of the newly introduced errata metadata, an authentication safeguard prevents tampering with the archived data, and a variety of tools, including a command-line client and a dedicated front end, let users interact safely with the platform.

  16. Fail-over file transfer process

    NASA Technical Reports Server (NTRS)

    Semancik, Susan K. (Inventor); Conger, Annette M. (Inventor)

    2005-01-01

    The present invention provides a fail-over file transfer process to handle data file transfer when the transfer is unsuccessful in order to avoid unnecessary network congestion and enhance reliability in an automated data file transfer system. If a file cannot be delivered after attempting to send the file to a receiver up to a preset number of times, and the receiver has indicated the availability of other backup receiving locations, then the file delivery is automatically attempted to one of the backup receiving locations up to the preset number of times. Failure of the file transfer to one of the backup receiving locations results in a failure notification being sent to the receiver, and the receiver may retrieve the file from the location indicated in the failure notification when ready.
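
    The patent abstract describes the control flow rather than code. A minimal Python sketch of that retry-then-fail-over logic might look like the following, where send_file(), notify(), and the receiver arguments are hypothetical stand-ins for the transfer and notification mechanisms.

        # Sketch of the fail-over transfer logic described above (all names hypothetical).
        def transfer_with_failover(path, primary, backups, send_file, notify, max_attempts=3):
            """Try the primary receiver, then each backup location, up to max_attempts each."""
            for receiver in [primary, *backups]:
                for _ in range(max_attempts):
                    if send_file(path, receiver):      # hypothetical transfer call
                        return receiver                # delivered successfully
            # Every location failed: tell the receiver where the file can be retrieved later.
            notify(primary, f"{path} could not be delivered; retrieve it from the sender")
            return None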

  17. 77 FR 71288 - Revisions to Electric Quarterly Report Filing Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-30

    ... its regulations to change the process for filing Electric Quarterly Reports (EQR). Due to technology... option.\\80\\ \\78\\ See, e.g., EEI at 8; Links Technology Solutions at 2; Pacific Gas and Electric at 6. \\79...; Order No. 770] Revisions to Electric Quarterly Report Filing Process AGENCY: Federal Energy Regulatory...

  18. Text and Illustration Processing System (TIPS) User’s Manual. Volume 1. Text Processing System.

    DTIC Science & Technology

    1981-07-01

    ...in the file catalog. To copy a file, begin by calling up the file. Access the Main Menu and... 2 - Edit an Existing File. After you have... MAKING REVISIONS... Call Up an Existing File... above the keyboard is called a Cathode Ray Tube (CRT). It displays information as you key it in. A CURSOR is an underscore character on the screen which

  19. Chirp subbottom profile data collected in 2015 from the northern Chandeleur Islands, Louisiana

    USGS Publications Warehouse

    Forde, Arnell S.; DeWitt, Nancy T.; Fredericks, Jake J.; Miselis, Jennifer L.

    2018-01-30

    As part of the Barrier Island Evolution Research project, scientists from the U.S. Geological Survey (USGS) St. Petersburg Coastal and Marine Science Center conducted a nearshore geophysical survey around the northern Chandeleur Islands, Louisiana, in September 2015. The objective of the project is to improve the understanding of barrier island geomorphic evolution, particularly storm-related depositional and erosional processes that shape the islands over annual to interannual time scales (1–5 years). Collecting geophysical data can help researchers identify relations between the geologic history of the islands and their present day morphology and sediment distribution. High-resolution geophysical data collected along this rapidly changing barrier island system can provide a unique time-series dataset to further the analyses and geomorphological interpretations of this and other coastal systems, improving our understanding of coastal response and evolution over medium-term time scales (months to years). Subbottom profile data were collected in September 2015 offshore of the northern Chandeleur Islands, during USGS Field Activity Number 2015-331-FA. Data products, including raw digital chirp subbottom data, processed subbottom profile images, survey trackline map, navigation files, geographic information system data files and formal Federal Geographic Data Committee metadata, and Field Activity Collection System and operation logs are available for download.

  20. DNR Recorder's Office

    Science.gov Websites

    Recording/Filing occurs between 8:00 AM and 3:30 PM, Monday through Friday, except holidays or scheduled closures. Copy Request, e-Recording information, Search and Index Guidelines, Recording District History, and...

  1. 40 CFR 51.363 - Quality assurance.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... using either electronic or written forms to be retained in the inspector and station history files, with...) Covert vehicles covering the range of vehicle technology groups (e.g., carbureted and fuel-injected...

  2. Alton Ochsner's Card File: A Profile of Medical History

    PubMed Central

    Trotter, Michael C.

    2010-01-01

    Alton Ochsner was a giant of American surgery. His career encompassed patient care, teaching, and research as symbolized on the original seal of the Ochsner Clinic. His ideas were innovative and groundbreaking on many fronts, making him and the Ochsner Clinic nationally and internationally known. Examination of his card file, a simple metal box with 3 × 5 index cards and subject dividers, gives extraordinary insight into the professional interests of this remarkable physician and surgeon. PMID:21603382

  3. 50 CFR 221.22 - How do I file a notice of intervention and response?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 50 Wildlife and Fisheries 7 2010-10-01 2010-10-01 false How do I file a notice of intervention and... LICENSES Hearing Process Initiation of Hearing Process § 221.22 How do I file a notice of intervention and...; and (ii) File with the Office of Habitat Conservation a notice of intervention and a written response...

  4. 50 CFR 221.22 - How do I file a notice of intervention and response?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 50 Wildlife and Fisheries 9 2011-10-01 2011-10-01 false How do I file a notice of intervention and... LICENSES Hearing Process Initiation of Hearing Process § 221.22 How do I file a notice of intervention and...; and (ii) File with the Office of Habitat Conservation a notice of intervention and a written response...

  5. Implementation of Rivest Shamir Adleman Algorithm (RSA) and Vigenere Cipher In Web Based Information System

    NASA Astrophysics Data System (ADS)

    Aryanti, Aryanti; Mekongga, Ikhthison

    2018-02-01

    Data security and confidentiality are among the most important aspects of information systems today, and cryptography is one way to secure data. This study developed a data security system implementing the Rivest-Shamir-Adleman (RSA) and Vigenere Cipher algorithms. The research combined the RSA and Vigenere Cipher cryptographic algorithms for document files in Word, Excel, and PDF formats. The application, built with PHP and MySQL, covers both encryption and decryption of data. On the sending side, data are encrypted with RSA using the public key and then with the Vigenere Cipher, which also uses the public key. On the receiving side, decryption first applies the Vigenere Cipher, still using the public key, and then RSA using the private key. Test results show that the system can encrypt, decrypt, and transmit files. Tests of encryption and decryption with different file sizes show that file size affects processing time: the larger the file, the longer encryption and decryption take.
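
    The paper does not include source code. The following minimal Python sketch shows one way the described pipeline could be arranged, using PyCryptodome for RSA-OAEP and a byte-wise Vigenere step; the keyword, key size, OAEP padding, and the restriction to a short message are illustrative assumptions rather than details taken from the paper.

        # Minimal sketch of the RSA + Vigenere pipeline described above.
        # Requires PyCryptodome; keyword, key size, and OAEP padding are assumptions.
        from Crypto.PublicKey import RSA
        from Crypto.Cipher import PKCS1_OAEP

        def vigenere(data: bytes, keyword: bytes, decrypt: bool = False) -> bytes:
            # Byte-wise Vigenere: add (or subtract) the repeating keyword mod 256.
            sign = -1 if decrypt else 1
            return bytes((b + sign * keyword[i % len(keyword)]) % 256
                         for i, b in enumerate(data))

        key = RSA.generate(2048)                       # receiver's key pair
        public_key, private_key = key.publickey(), key
        keyword = b"SHAREDKEYWORD"                     # illustrative Vigenere keyword

        # Sender: RSA encryption with the public key, then the Vigenere step.
        plaintext = b"short document fragment"         # RSA-OAEP limits the block size
        stage1 = PKCS1_OAEP.new(public_key).encrypt(plaintext)
        ciphertext = vigenere(stage1, keyword)

        # Receiver: undo the Vigenere step, then RSA decryption with the private key.
        stage2 = vigenere(ciphertext, keyword, decrypt=True)
        assert PKCS1_OAEP.new(private_key).decrypt(stage2) == plaintext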

  6. 77 FR 29730 - Self-Regulatory Organizations; NASDAQ OMX BX, Inc.; Notice of Filing of Proposed Rule Change, as...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-18

    ... operate the BX Options market. BX's history dates back to the 1830s. For many years, the Boston Stock... OMX BX. BX re-launched an equities marketplace utilizing state of the art NASDAQ technology, having...). Consistent with that storied history as a long-time competitor in the U.S. markets, BX now proposes to launch...

  7. 75 FR 35011 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-21

    ... Processes Manual Incorporating Proposed Revisions to the Reliability Standards Development Process. Filed..., June 21, 2010. Take notice that the Commission received the following electric reliability filings: Docket Numbers: RR10-12-000. Applicants: North American Electric Reliability Corp. Description: Petition...

  8. 75 FR 13257 - Marine Mammals; File No. 87-1743

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-19

    ... life history research studies on northern elephant seals (Mirounga angustirostris) in California. The... of weaned elephant seal pups weighed, measured, and flipper tagged by 100 animals in order to study...

  9. 76 FR 8785 - ABB Inc.; License Amendment Request for Decommissioning of the ABB Inc., Combustion Engineering...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-15

    .... Electronic Submissions (E-Filing) All documents filed in NRC adjudicatory proceedings, including a request... accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires... requirements of E-Filing, at least ten (10) days prior to the filing deadline, the participant should...

  10. Sandia Advanced MEMS Design Tools v. 3.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yarberry, Victor R.; Allen, James J.; Lantz, Jeffrey W.

    This is a major revision to the Sandia Advanced MEMS Design Tools. It replaces all previous versions. New features in this version: revised to support AutoCAD 2014 and 2015. This CD contains an integrated set of electronic files that: a) describe the SUMMiT V fabrication process; b) provide enabling educational information (including pictures, videos, and technical information); c) facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library); d) facilitate the process of having MEMS fabricated at Sandia National Laboratories; and e) facilitate the process of having post-fabrication services performed. While there exist some files on the CD that are used in conjunction with the AutoCAD software package, these files are not intended for use independent of the CD. Note that the customer must purchase his/her own copy of AutoCAD to use with these files.

  11. Barriers to success: physical separation optimizes event-file retrieval in shared workspaces.

    PubMed

    Klempova, Bibiana; Liepelt, Roman

    2017-07-08

    Sharing tasks with other persons can simplify our work and life, but seeing and hearing other people's actions may also be very distracting. The joint Simon effect (JSE) is a standard measure of referential response coding when two persons share a Simon task. Sequential modulations of the joint Simon effect (smJSE) are interpreted as a measure of event-file processing containing stimulus information, response information, and information about the currently relevant control state active in a given social situation. This study tested effects of physical (Experiment 1) and virtual (Experiment 2) separation of shared workspaces on referential coding and event-file processing using a joint Simon task. In Experiment 1, participants performed this task in individual (go-nogo), joint, and standard Simon task conditions with and without a transparent curtain (physical separation) placed along the imagined vertical midline of the monitor. In Experiment 2, participants performed the same tasks with and without background music (virtual separation). For response times, physical separation enhanced event-file retrieval, indicated by a larger smJSE in the joint Simon task with the curtain than without it (Experiment 1), but did not change referential response coding. In line with this, we also found evidence for enhanced event-file processing through physical separation in the joint Simon task for error rates. Virtual separation affected neither event-file processing nor referential coding but generally slowed response times in the joint Simon task. For errors, virtual separation hampered event-file processing in the joint Simon task. For the cognitively more demanding standard two-choice Simon task, we found music to have a degrading effect on event-file retrieval for response times. Our findings suggest that adding a physical separation optimizes event-file processing in shared workspaces, while music seems to lead to a more relaxed task processing mode under shared task conditions. In addition, music had an interfering impact on joint error processing and, more generally, when dealing with a more complex task in isolation.

  12. Bulk Extractor 1.4 User’s Manual

    DTIC Science & Technology

    2013-08-01

    optimistically decompresses data in ZIP, GZIP, RAR, and Microsoft’s Hibernation files. This has proven useful, for example, in recovering email...command line. Java 7 or above must be installed on the machine for the Bulk Extractor Viewer to run. Instructions on running bulk_extractor from the... Hibernation File Fragments (decompressed and processed, not carved) Subsection 4.6 winprefetch Windows Prefetch files, file fragments (processed

  13. The lone inventor: low success rates and common errors associated with pro-se patent applications.

    PubMed

    Gaudry, Kate S

    2012-01-01

    A pro-se patent applicant is an inventor who chooses to represent himself while pursuing ("prosecuting") a patent application. To the author's knowledge, this paper is the first empirical study addressing how applications filed by pro-se inventors fare compared to applications in which inventors were represented by patent attorneys or agents. The prosecution histories of 500 patent applications filed at the United States Patent and Trademark Office were analyzed: inventors were represented by a patent professional for 250 of the applications ("represented applications") but not in the other 250 ("pro-se applications"). 76% of the pro-se applications became abandoned (not issuing as a patent), as compared to 35% of the represented applications. Further, among applications that issued as patents, pro-se patents' claims appear to be narrower and therefore of less value than claims in the represented patent set. Case-specific data suggests that a substantial portion of pro-se applicants unintentionally abandon their applications, terminate the examination process relatively early, and/or fail to take advantage of interview opportunities that may resolve issues stalling allowance of the application.

  14. JCPDS-ICDD Research Associateship (Cooperative Program with NBS/NIST)

    PubMed Central

    Wong-Ng, W.; McMurdie, H. F.; Hubbard, C. R.; Mighell, A. D.

    2001-01-01

    The Research Associateship program of the Joint Committee on Powder Diffraction-International Centre for Diffraction Data (JCPDS-ICDD, now known as the ICDD) at NBS/NIST was a long-standing (over 35 years) and successful industry-government cooperation. The main mission of the Associateship was to publish high quality x-ray reference patterns to be included in the Powder Diffraction File (PDF). The PDF is a continuing compilation of patterns gathered from many sources, compiled and published by the ICDD. As a result of this collaboration, more than 1500 high quality powder diffraction patterns, which have had a significant impact on the scientific community, were reported. In addition, various research collaborations with NBS/NIST also led to the development of several standard reference materials (SRMs) for instrument calibration and quantitative analyses, as well as computer software for data collection, calibration, and reduction, for the editorial process of powder pattern publication, for analysis of powder data, and for quantitative analyses. This article summarizes information concerning the JCPDS-ICDD organization, the Powder Diffraction File (PDF), and the history and accomplishments of the JCPDS-ICDD Research Associateship. PMID:27500061

  15. Racial and gender variation in use of diagnostic colonic procedures in the Michigan Medicare population.

    PubMed

    McMahon, L F; Wolfe, R A; Huang, S; Tedeschi, P; Manning, W; Edlund, M J

    1999-07-01

    There is accumulating evidence that screening programs can alter the natural history of colorectal cancer, a significant cause of mortality and morbidity in the US. Understanding how the technology to diagnose colonic diseases is utilized in the population provides insight into both the access and processes of care. Using Medicare Part B billing files from the state of Michigan from 1986 to 1989, we identified all procedures used to diagnose colorectal disease. We utilized the Medicare Beneficiary File and the Area Resource File to identify beneficiary-specific and community-sociodemographic characteristics. The beneficiary and sociodemographic characteristics were then used in multiple regression analyses to identify their association with procedure utilization. Sigmoidoscopic use declined dramatically with the increasing age cohorts of Medicare beneficiaries. Urban areas and communities with higher education levels had more sigmoidoscopic use. Among procedures used to examine the entire colon, isolated barium enema was used more frequently in African Americans, the elderly, and females. The combination of barium enema and sigmoidoscopy was used more frequently among females, and the newest technology, colonoscopy, was used most frequently among White males. The existence of race, gender, and socioeconomic disparities in the use of colorectal technologies in a group of patients with near-universal insurance coverage demonstrates the necessity of understanding the reason(s) for these observed differences to improve access to appropriate technologies for all segments of our society.

  16. EMERALD: A Flexible Framework for Managing Seismic Data

    NASA Astrophysics Data System (ADS)

    West, J. D.; Fouch, M. J.; Arrowsmith, R.

    2010-12-01

    The seismological community is challenged by the vast quantity of new broadband seismic data provided by large-scale seismic arrays such as EarthScope’s USArray. While this bonanza of new data enables transformative scientific studies of the Earth’s interior, it also illuminates limitations in the methods used to prepare and preprocess those data. At a recent seismic data processing focus group workshop, many participants expressed the need for better systems to minimize the time and tedium spent on data preparation in order to increase the efficiency of scientific research. Another challenge related to data from all large-scale transportable seismic experiments is that there currently exists no system for discovering and tracking changes in station metadata. This critical information, such as station location, sensor orientation, instrument response, and clock timing data, may change over the life of an experiment and/or be subject to post-experiment correction. Yet nearly all researchers utilize metadata acquired with the downloaded data, even though subsequent metadata updates might alter or invalidate results produced with older metadata. A third long-standing issue for the seismic community is the lack of easily exchangeable seismic processing codes. This problem stems directly from the storage of seismic data as individual time series files, and the history of each researcher developing his or her preferred data file naming convention and directory organization. Because most processing codes rely on the underlying data organization structure, such codes are not easily exchanged between investigators. To address these issues, we are developing EMERALD (Explore, Manage, Edit, Reduce, & Analyze Large Datasets). The goal of the EMERALD project is to provide seismic researchers with a unified, user-friendly, extensible system for managing seismic event data, thereby increasing the efficiency of scientific enquiry. EMERALD stores seismic data and metadata in a state-of-the-art open source relational database (PostgreSQL), and can, on a timed basis or on demand, download the most recent metadata, compare it with previously acquired values, and alert the user to changes. The backend relational database is capable of easily storing and managing many millions of records. The extensible, plug-in architecture of the EMERALD system allows any researcher to contribute new visualization and processing methods written in any of 12 programming languages, and a central Internet-enabled repository for such methods provides users with the opportunity to download, use, and modify new processing methods on demand. EMERALD includes data acquisition tools allowing direct importation of seismic data, and also imports data from a number of existing seismic file formats. Pre-processed clean sets of data can be exported as standard sac files with user-defined file naming and directory organization, for use with existing processing codes. The EMERALD system incorporates existing acquisition and processing tools, including SOD, TauP, GMT, and FISSURES/DHI, making much of the functionality of those tools available in a unified system with a user-friendly web browser interface. EMERALD is now in beta test. See emerald.asu.edu or contact john.d.west@asu.edu for more details.

  17. National Dam Safety Program. Stony Brook Watershed Dam Site Number 7 (NJ00344), Raritan River Basin, Stony Brook, Mercer County, New Jersey. Phase 1 Inspection Report.

    DTIC Science & Technology

    1980-02-01

    for Permit for Construction and Repair of Dam" filed on March 16, 1959. f. Design and Construction History: Design data on file with NJDEP include... [illegible OCR fragment of a tabulated elevation/storage listing from STORCH ENGINEERS drawings]

  18. Chain of Custody Item Monitor Message Viewer v.1.0 Beta

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, Steven Robert; Fielder, Laura; Hymel, Ross W.

    The CoCIM Message Viewer software allows users to connect to and download messages from a Chain of Custody Item Monitor (CoCIM) connected to a serial port on the user’s computer. The downloaded messages are authenticated and displayed in a Graphical User Interface that allows the user a limited degree of sorting and filtering of the downloaded messages as well as the ability to save downloaded files or to open previously downloaded message history files.

  19. 76 FR 34177 - Privacy Act of 1974: Implementation of Exemptions; U.S. Citizenship and Immigration Services...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-13

    ... and Customs Enforcement, Customs and Border Protection--001 Alien File, Index, and National File... Services, Immigration and Customs Enforcement, and Customs and Border Protection--001 Alien File, Index... border protection processes. The Alien File (A-File), Index, and National File Tracking System of Records...

  20. 49 CFR 1104.6 - Timely filing required.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... offers next day delivery to Washington, DC. If the e-filing option is chosen (for those pleadings and documents that are appropriate for e-filing, as determined by reference to the information on the Board's Web site), then the e-filed pleading or document is timely filed if the e-filing process is completed...

  1. The Combatant Commander and Effective Operational HUMINT: Lessons From the Double Cross System of World War II and the CJ2X of Operation Joint Guard

    DTIC Science & Technology

    2003-05-19

    www.bbc.co.uk/cgi-bin/history/renderplain.pl?file=history/war/wwtwo/spying/sis_0/> [27 March 2003]. 37 Ibid. 38 “Operation Overlord,” Saving Private Ryan Online...Security, (Washington, DC: 2002), 17; National Strategy for Combating Terrorism. Washington, DC: 2003. “Operation Overlord,” Saving Private Ryan Online

  2. 20 CFR 30.106 - Can OWCP request employment verification from other sources?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Filing Claims; Evidence and Burden of Proof; Special Procedures for Certain Cancer Claims Verification of... for other entities to provide OWCP with the information necessary to verify an employment history...

  3. 20 CFR 30.106 - Can OWCP request employment verification from other sources?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Filing Claims; Evidence and Burden of Proof; Special Procedures for Certain Cancer Claims Verification of... for other entities to provide OWCP with the information necessary to verify an employment history...

  4. 20 CFR 30.106 - Can OWCP request employment verification from other sources?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Filing Claims; Evidence and Burden of Proof; Special Procedures for Certain Cancer Claims Verification of... for other entities to provide OWCP with the information necessary to verify an employment history...

  5. 20 CFR 30.106 - Can OWCP request employment verification from other sources?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Filing Claims; Evidence and Burden of Proof; Special Procedures for Certain Cancer Claims Verification of... for other entities to provide OWCP with the information necessary to verify an employment history...

  6. 76 FR 28421 - Marine Mammals; File No. 15646

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-17

    ... Rebecca Dickhut, Ph.D., Virginia Institute of Marine Science, P.O. Box 1346, Route 1208 Greate Road... following archived samples will be imported from the Swedish Museum of Natural History: fur, blood, and fat...

  7. 77 FR 19646 - Marine Mammals; File No. 17178

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-02

    ... Virginia Institute of Marine Science [Responsible Party: Elizabeth Canuel, Ph.D.], P.O. Box 1346, Route... will be imported from the Swedish Museum of Natural History: fur, blood, and fat biopsies from up to...

  8. VizieR Online Data Catalog: FADO code (Gomes+, 2017)

    NASA Astrophysics Data System (ADS)

    Gomes, J. M.; Papaderos, P.

    2017-03-01

    FADO comes from the Latin word "fatum", which means fate or destiny. It is also a well-known genre of Portuguese music, and by choosing this acronym for this spectral synthesis tool we would like to pay tribute to Portugal. The main goal of FADO is to explore the star-formation and chemical enrichment history (the "Fado") of galaxies based on two hitherto unique elements in spectral fitting models: a) self-consistency between the best-fitting star formation history (SFH) and the nebular characteristics of a galaxy (e.g., hydrogen Balmer-line luminosities and equivalent widths; shape of the nebular continuum, including the Balmer and Paschen discontinuity) and b) genetic optimization and artificial intelligence algorithms. This document is part of the FADO v.1 distribution package, which contains two different ASCII files, ReadMe and Read_F, and one tarball archive FADOv1.tar.gz. FADOv1.tar.gz contains the binary (executable) compiled in both OpenSuSE 13.2 64-bit LINUX (FADO) and MAC OS X (FADO_MACOSX). The former is compatible with most LINUX distributions, while the latter was only tested for Yosemite 10.10.3. It contains the configuration files for running FADO: FADO.config and PLOT.config, as well as the "Simple Stellar Population" (SSP) base library with the base file list Base.BC03.L, the FADO v.1 short manual Read_F and this file (in the ReadMe directory) and, for testing purposes, three characteristic de-redshifted spectra from SDSS-DR7 in ASCII format, corresponding to a star-forming (spec1.txt), composite (spec2.txt) and LINER (spec3.txt) galaxy. Auxiliary files needed for execution of FADO (.HIfboundem.ascii, .HeIIfbound.ascii, .HeIfboundem.ascii, grfont.dat and grfont.txt) are also included in the tarball. By decompressing the tarball the following six directories are created: input, output, plots, ReadMe, SSPs and tables (see below for a brief explanation). (2 data files).

  9. 45 CFR 16.8 - The next step in the appeal process: Preparation of an appeal file and written argument.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false The next step in the appeal process: Preparation of an appeal file and written argument. 16.8 Section 16.8 Public Welfare DEPARTMENT OF HEALTH AND... step in the appeal process: Preparation of an appeal file and written argument. Except in expedited...

  10. Automated 3D Damaged Cavity Model Builder for Lower Surface Acreage Tile on Orbiter

    NASA Technical Reports Server (NTRS)

    Belknap, Shannon; Zhang, Michael

    2013-01-01

    The 3D Automated Thermal Tool for Damaged Acreage Tile Math Model builder was developed to perform quickly and accurately 3D thermal analyses on damaged lower surface acreage tiles and structures beneath the damaged locations on a Space Shuttle Orbiter. The 3D model builder created both TRASYS geometric math models (GMMs) and SINDA thermal math models (TMMs) to simulate an idealized damaged cavity in the damaged tile(s). The GMMs are processed in TRASYS to generate radiation conductors between the surfaces in the cavity. The radiation conductors are inserted into the TMMs, which are processed in SINDA to generate temperature histories for all of the nodes on each layer of the TMM. The invention allows a thermal analyst to create quickly and accurately a 3D model of a damaged lower surface tile on the orbiter. The 3D model builder can generate a GMM and the corresponding TMM in one or two minutes, with the damaged cavity included in the tile material. A separate program creates a configuration file, which would take a couple of minutes to edit. This configuration file is read by the model builder program to determine the location of the damage, the correct tile type, tile thickness, structure thickness, and SIP thickness of the damage, so that the model builder program can build an accurate model at the specified location. Once the models are built, they are processed by TRASYS and SINDA.

  11. ULSGEN (Uplink Summary Generator)

    NASA Technical Reports Server (NTRS)

    Wang, Y.-F.; Schrock, M.; Reeve, T.; Nguyen, K.; Smith, B.

    2014-01-01

    Uplink is an important part of spacecraft operations. Ensuring the accuracy of uplink content is essential to mission success. Before commands are radiated to the spacecraft, the command and sequence must be reviewed and verified by various teams. In most cases, this process requires collecting the command data, reviewing the data during a command conference meeting, and providing physical signatures by designated members of various teams to signify approval of the data. If commands or sequences are disapproved for some reason, the whole process must be restarted. Recording data and decision history is important for traceability reasons. Given that many steps and people are involved in this process, an easily accessible software tool for managing the process is vital to reducing human error which could result in uplinking incorrect data to the spacecraft. An uplink summary generator called ULSGEN was developed to assist this uplink content approval process. ULSGEN generates a web-based summary of uplink file content and provides an online review process. Spacecraft operations personnel view this summary as a final check before actual radiation of the uplink data.

  12. 76 FR 11830 - Self-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-03

    ... Effectiveness of Proposed Rule Change to Eliminate Duplicative Filings Under FINRA Rule 9610(a) February 25... the proposed change will make the process of seeking exemptive relief more efficient by eliminating... the efficiency of the exemptive relief process by eliminating duplicative filings and providing...

  13. 78 FR 17394 - Filing via the Internet; Electronic Tariff Filings; Revisions to Electric Quarterly Report Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-21

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket Nos. RM07-16-000; RM01-5-000; RM12-3-000] Filing via the Internet; Electronic Tariff Filings; Revisions to Electric Quarterly Report Filing Process; Notice of Technical Conference Take notice that on April 16, 2013, the staff of the...

  14. 77 FR 11596 - Central Vermont Public Service Corporation, Millstone Power Station, Unit 3; Notice of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-27

    ... NRC E-filing system. Requests for a hearing and petitions for leave to intervene should be filed in.../ . IV. Electronic Submissions (E-Filing) All documents filed in NRC adjudicatory proceedings, including... NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires participants to submit...

  15. 78 FR 21930 - Aquenergy Systems, Inc.; Notice of Intent To File License Application, Filing of Pre-Application...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-12

    ... Systems, Inc.; Notice of Intent To File License Application, Filing of Pre-Application Document, and Approving Use of the Traditional Licensing Process a. Type of Filing: Notice of Intent to File License...: November 11, 2012. d. Submitted by: Aquenergy Systems, Inc., a fully owned subsidiaries of Enel Green Power...

  16. 77 FR 51985 - Archon Energy 1, Inc.; Notice of Intent To File License Application, Filing of Pre-Application...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-28

    ..., Inc.; Notice of Intent To File License Application, Filing of Pre-Application Document, and Approving... Application and Request to Use the Traditional Licensing Process. b. Project No.: 14432-000. c. Date Filed... Endangered Species Act. m. Archon filed a Pre-Application Document (PAD) with the Commission, pursuant to 18...

  17. On-Board File Management and Its Application in Flight Operations

    NASA Technical Reports Server (NTRS)

    Kuo, N.

    1998-01-01

    In this paper, the author presents the minimum functions required for an on-board file management system. We explore file manipulation processes and demonstrate how the file transfer along with the file management system will be utilized to support flight operations and data delivery.

  18. Wave data processing toolbox manual

    USGS Publications Warehouse

    Sullivan, Charlene M.; Warner, John C.; Martini, Marinna A.; Lightsom, Frances S.; Voulgaris, George; Work, Paul

    2006-01-01

    Researchers routinely deploy oceanographic equipment in estuaries, coastal nearshore environments, and shelf settings. These deployments usually include tripod-mounted instruments to measure a suite of physical parameters such as currents, waves, and pressure. Instruments such as the RD Instruments Acoustic Doppler Current Profiler (ADCP(tm)), the Sontek Argonaut, and the Nortek Aquadopp(tm) Profiler (AP) can measure these parameters. The data from these instruments must be processed using proprietary software unique to each instrument to convert measurements to real physical values. These processed files are then available for dissemination and scientific evaluation. For example, the proprietary processing program used to process data from the RD Instruments ADCP for wave information is called WavesMon. Depending on the length of the deployment, WavesMon will typically produce thousands of processed data files. These files are difficult to archive and further analysis of the data becomes cumbersome. More important, these files alone do not include sufficient information pertinent to that deployment (metadata), which could hinder future scientific interpretation. This open-file report describes a toolbox developed to compile, archive, and disseminate the processed wave measurement data from an RD Instruments ADCP, a Sontek Argonaut, or a Nortek AP. This toolbox will be referred to as the Wave Data Processing Toolbox. The Wave Data Processing Toolbox aggregates the processed files output from the proprietary software into two NetCDF files: one file contains the statistics of the burst data and the other file contains the raw burst data (additional details described below). One important advantage of this toolbox is that it converts the data into NetCDF format. Data in NetCDF format is easy to disseminate, is portable to any computer platform, and is viewable with public-domain, freely available software. Another important advantage is that a metadata structure is embedded with the data to document pertinent information regarding the deployment and the parameters used to process the data. Using this format ensures that the relevant information about how the data was collected and converted to physical units is maintained with the actual data. EPIC-standard variable names have been utilized where appropriate. These standards, developed by the NOAA Pacific Marine Environmental Laboratory (PMEL) (http://www.pmel.noaa.gov/epic/), provide a universal vernacular allowing researchers to share data without translation.
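
    Because the toolbox writes its output as standard NetCDF, the embedded metadata and EPIC-style variable names can be inspected with general-purpose tools. The short Python sketch below, using the netCDF4 package, illustrates that idea; the file name is a hypothetical placeholder, not a name produced by the toolbox.

        # Inspect a toolbox-style NetCDF file: global (deployment) metadata plus variables.
        # Requires the netCDF4 package; the file name is a hypothetical placeholder.
        from netCDF4 import Dataset

        with Dataset("adcp_wave_statistics.nc") as nc:
            # Deployment metadata is stored as global attributes embedded in the file.
            for name in nc.ncattrs():
                print(f"{name} = {getattr(nc, name)}")
            # EPIC-style variable names, with units where present.
            for var_name, var in nc.variables.items():
                print(var_name, getattr(var, "units", "(no units attribute)"))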

  19. Small file aggregation in a parallel computing system

    DOEpatents

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Zhang, Jingwang

    2014-09-02

    Techniques are provided for small file aggregation in a parallel computing system. An exemplary method for storing a plurality of files generated by a plurality of processes in a parallel computing system comprises aggregating the plurality of files into a single aggregated file; and generating metadata for the single aggregated file. The metadata comprises an offset and a length of each of the plurality of files in the single aggregated file. The metadata can be used to unpack one or more of the files from the single aggregated file.
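
    The abstract describes the aggregation bookkeeping in terms of per-file offsets and lengths. The following single-process Python sketch illustrates that bookkeeping only; it is not the patented parallel implementation, and the JSON index format and function names are assumptions.

        # Aggregate many small files into one file plus an (offset, length) index.
        # Single-process illustration; the JSON index format is an assumption.
        import json, os

        def aggregate(paths, out_path, index_path):
            index, offset = {}, 0
            with open(out_path, "wb") as out:
                for p in paths:
                    with open(p, "rb") as f:
                        data = f.read()
                    out.write(data)
                    index[os.path.basename(p)] = {"offset": offset, "length": len(data)}
                    offset += len(data)
            with open(index_path, "w") as f:
                json.dump(index, f)

        def unpack(name, aggregated_path, index_path):
            # Use the metadata to pull one original file back out of the aggregate.
            with open(index_path) as f:
                entry = json.load(f)[name]
            with open(aggregated_path, "rb") as f:
                f.seek(entry["offset"])
                return f.read(entry["length"])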

  20. Cooperative storage of shared files in a parallel computing system with dynamic block size

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-11-10

    Improved techniques are provided for parallel writing of data to a shared object in a parallel computing system. A method is provided for storing data generated by a plurality of parallel processes to a shared object in a parallel computing system. The method is performed by at least one of the processes and comprises: dynamically determining a block size for storing the data; exchanging a determined amount of the data with at least one additional process to achieve a block of the data having the dynamically determined block size; and writing the block of the data having the dynamically determined block size to a file system. The determined block size comprises, e.g., a total amount of the data to be stored divided by the number of parallel processes. The file system comprises, for example, a log structured virtual parallel file system, such as a Parallel Log-Structured File System (PLFS).
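
    As a rough illustration of the block-size rule quoted above (the total amount of data divided by the number of parallel processes), the Python sketch below works out each process's block offset and length; it ignores the data-exchange step and the PLFS layer, and all names in it are hypothetical.

        # Block-size arithmetic for N parallel processes writing one shared object.
        # Hypothetical illustration only; it omits the inter-process data exchange
        # and the parallel file system (e.g., PLFS) that the patent describes.
        def block_plan(total_bytes: int, num_procs: int):
            block_size = total_bytes // num_procs      # dynamically determined block size
            plan = []
            for rank in range(num_procs):
                offset = rank * block_size
                # The last rank absorbs any remainder so the whole object is covered.
                length = total_bytes - offset if rank == num_procs - 1 else block_size
                plan.append({"rank": rank, "offset": offset, "length": length})
            return plan

        # Example: a 1 GiB shared object written by 8 processes.
        print(block_plan(1 << 30, 8))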

  1. 49 CFR 1104.12 - Service of pleadings and papers.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... express mail. If a document is filed with the Board through the e-filing process, a copy of the e-filed... BOARD, DEPARTMENT OF TRANSPORTATION RULES OF PRACTICE FILING WITH THE BOARD-COPIES-VERIFICATION-SERVICE...

  2. 22 CFR 501.5 - Mid-level FSO Candidate Program (Class 3, 2, or 1).

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... university-level teacher of political science, history, English or other relevant disciplines. Appointments... Recruitment Branch, Office of Personnel (M/PDSE). (2) The filing of an application for the Foreign Service...

  3. 22 CFR 501.5 - Mid-level FSO Candidate Program (Class 3, 2, or 1).

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... university-level teacher of political science, history, English or other relevant disciplines. Appointments... Recruitment Branch, Office of Personnel (M/PDSE). (2) The filing of an application for the Foreign Service...

  4. STRUCTURAL ECONOMIC CHANGE AND INTERNATIONAL MIGRATION FROM MEXICO AND POLAND

    PubMed Central

    Massey, Douglas S.; Kalter, Frank; Pren, Karen A.

    2010-01-01

    In this article we use uniquely comparable data sets from two very different settings to examine how exogenous economic transformations affect the likelihood and selectivity of international out-migration. Specifically, we use data from the Mexican Migration Project to construct event history files predicting first U.S. trips from seven communities in the state of Veracruz, which until recently sent very few migrants abroad. Similarly, using data from the Polish Migration Project, we derive comparable event history files predicting first trips to Germany from four Polish communities, which also sent few migrants abroad before the 1980s. Our analyses suggest that the onset of structural adjustment in both places had a significant effect in raising the probability of international migration, even when controlling for a set of standard variables specified by other theories to influence migration propensity, such as the size of the binational income gap and various indicators of human and social capital. PMID:21765550

  5. 77 FR 60419 - Lock + Hydro Friends Fund XIX, LLC; Notice of Intent To File License Application, Filing of Pre...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-03

    ... Friends Fund XIX, LLC; Notice of Intent To File License Application, Filing of Pre-Application Document.... Date Filed: August 7, 2012. d. Submitted By: Lock + Hydro Friends Fund XIX, LLC. e. Name of Project.... Lock + Hydro Friends Fund XIX, LLC filed its request to use the Traditional Licensing Process on August...

  6. 77 FR 59680 - Request To Extend Time To Submit Decommissioning Plan; U.S. Department of the Army, Jefferson...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-28

    ... November 27, 2012. IV. Electronic Submissions (E-Filing) All documents filed in NRC adjudicatory... accordance with the NRC E-Filing rule (72 FR 49139; August 28, 2007). The E-Filing process requires... requirements of E-Filing, at least 10 days prior to the filing deadline, the participant should contact the...

  7. 36 CFR 223.118 - Appeal process for small business timber sale set-aside program share recomputation decisions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the Chief may designate. (e) Filing procedures. In order to file an appeal under this section, an... interested party in response to an appeal must be filed within 15 days after the close of the appeal filing... filing an appeal; however, when the filing period would expire on a Saturday, Sunday, or Federal holiday...

  8. 77 FR 24993 - License Amendment Request To Amend Source Materials License SUA-1310 and Proceed With Termination...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-26

    ... Commission by June 25, 2012. IV. Electronic Submissions (E-Filing) All documents filed in NRC adjudicatory... accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires... requirements of E-Filing, at least 10 days prior to the filing deadline, the participant should contact the...

  9. The Seamount Catalog in EarthRef.org

    NASA Astrophysics Data System (ADS)

    Gotberg, N. K.; Koppers, A. A.; Staudigel, H.; Perez, J.

    2004-12-01

    Seamounts are important to research and education in many scientific fields, providing a wide range of data on physical, chemical, biological and geological processes. In order to make a diverse set of seamount data accessible we have developed the Seamount Catalog in EarthRef.org, available through the http://earthref.org/databases/SC/. The primary goal of the Seamount Catalog is to provide access to digital data files on a large assortment of interdisciplinary seamount research. The catalog can be searched at a variety of ability or expert levels allowing it to be used from basic education to advanced research. Each seamount is described in terms of its location, height, volume, elongation, azimuth, irregularity, rifts, morphological classification and relation to other features. GEBCO (General Bathymetric Chart of the Ocean) gazetteer data (2002; 2003) is included in the database in order to provide information on the history, discovery and names of the seamounts. Screen-optimized bathymetry maps, grid files and the original multibeam data files are available for online viewing with higher resolution downloadable versions (AI, PS, PDF) also offered. The data files for each seamount include a map made from the multibeam data only, a map made from Smith and Sandwell's (1996) predicted bathymetry, a merged map incorporating both data sets, and a map showing the differences between the two data sets. We are working towards expanding the Seamount Catalog by integrating bathymetry data from various sources, developing and linking disciplinary reference models, and integrating information from multiple disciplines and from the literature. We hope to create a data integrative environment that provides access to seamount data and the tools needed for working with that data.

  10. Intrex Subject/Title Inverted-File Characteristics.

    ERIC Educational Resources Information Center

    Uemura, Syunsuke

    The characteristics of the Intrex subject/title inverted file are analyzed. Basic statistics of the inverted file are presented including various distributions of the index words and terms from which the file was derived, and statistics on stems, the file growth process, and redundancy measurements. A study of stems both with extremely high and…

  11. Tracking Provenance of Earth Science Data

    NASA Technical Reports Server (NTRS)

    Tilmes, Curt; Yesha, Yelena; Halem, Milton

    2010-01-01

    Tremendous volumes of data have been captured, archived and analyzed. Sensors, algorithms and processing systems for transforming and analyzing the data are evolving over time. Web Portals and Services can create transient data sets on-demand. Data are transferred from organization to organization with additional transformations at every stage. Provenance in this context refers to the source of data and a record of the process that led to its current state. It encompasses the documentation of a variety of artifacts related to particular data. Provenance is important for understanding and using scientific datasets, and critical for independent confirmation of scientific results. Managing provenance throughout scientific data processing has gained interest lately and there are a variety of approaches. Large scale scientific datasets consisting of thousands to millions of individual data files and processes offer particular challenges. This paper uses the analogy of art history provenance to explore some of the concerns of applying provenance tracking to earth science data. It also illustrates some of the provenance issues with examples drawn from the Ozone Monitoring Instrument (OMI) Data Processing System (OMIDAPS) run at NASA's Goddard Space Flight Center by the first author.

  12. Hydratools manual version 1.0, documentation for a MATLAB®-based post-processing package for the Sontek Hydra

    USGS Publications Warehouse

    Martini, Marinna A.; Sherwood, Chris; Horwitz, Rachel; Ramsey, Andree; Lightsom, Fran; Lacy, Jessie; Xu, Jingping

    2006-01-01

    3. preserving minimally processed and partially processed versions of data sets. STG usually deploys ADV and PCADP probes configured as downward looking, mounted on bottom tripods, with the objective of measuring high-resolution near-bed currents. The velocity profiles are recorded with minimal internal data processing. Also recorded are parameters such as temperature, conductivity, optical backscatter, light transmission, and high-frequency pressure. Sampling consists of high-frequency (1–10 Hz) bursts of long duration (5–30 minutes) at regular, recurring intervals over deployments of 1 to 6 months. The result is very large data files, often 500 MB per Hydra, per deployment, in Sontek's compressed binary format. This section introduces the Hydratools toolbox and provides information about the history of the system's development. The USGS philosophy regarding data quality is discussed to provide an understanding of the motivation for creating the system. General information about the following topics is also discussed: hardware and software required for the system, basic processing steps, limitations of program usage, and features that are unique to the programs.
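    Hydratools itself is a MATLAB toolbox; as a generic illustration of one basic processing step for burst-sampled data (burst averaging and simple spike flagging), the following Python/numpy sketch assumes the record has already been reshaped to one burst per row.

    # Generic burst-averaging sketch for burst-sampled velocity data (not Hydratools code).
    import numpy as np

    rng = np.random.default_rng(0)
    n_bursts, hz, minutes = 50, 2, 10                  # 2 Hz bursts lasting 10 minutes
    samples_per_burst = hz * 60 * minutes
    # Fake along-channel velocity (m/s), shaped (bursts, samples per burst).
    samples = rng.normal(0.1, 0.02, size=(n_bursts, samples_per_burst))

    burst_mean = samples.mean(axis=1)                  # one mean current per burst
    burst_std = samples.std(axis=1, ddof=1)            # burst-level variability
    spikes = np.abs(samples - burst_mean[:, None]) > 4 * burst_std[:, None]
    print(f"{spikes.sum()} samples flagged as spikes out of {samples.size}")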

  13. An overview of the National Space Science data Center Standard Information Retrieval System (SIRS)

    NASA Technical Reports Server (NTRS)

    Shapiro, A.; Blecher, S.; Verson, E. E.; King, M. L. (Editor)

    1974-01-01

    A general overview is given of the National Space Science Data Center (NSSDC) Standard Information Retrieval System. It describes, in general terms, the information system that contains the data files and the software system that processes and manipulates the files maintained at the Data Center. Emphasis is placed on providing users with an overview of the capabilities and uses of the NSSDC Standard Information Retrieval System (SIRS). Examples given are taken from the files at the Data Center. Detailed information about NSSDC data files is documented in a set of File Users Guides, with one user's guide prepared for each file processed by SIRS. Detailed information about SIRS is presented in the SIRS Users Guide.

  14. Combination of advanced encryption standard 256 bits with md5 to secure documents on android smartphone

    NASA Astrophysics Data System (ADS)

    Pasaribu, Hendra; Sitanggang, Delima; Rizki Damanik, Rudolfo; Rudianto Sitompul, Alex Chandra

    2018-04-01

    File transfer by using a smartphone has some security issues, such as data theft by irresponsible parties. To improve the quality of data security systems on smartphones, this research proposes the integration of the AES 256-bit algorithm with MD5 hashing. The use of MD5 aims to increase the key strength of the encryption and decryption process for document files. The test results show that the proposed method can increase the key strength of the encryption and decryption process for document files. Encryption and decryption using the AES and MD5 combination is faster than using AES alone for the *.txt file type, with the reverse result for *.docx, *.xlsx, *.pptx and *.pdf files. Processing time is linearly proportional to the length of the plaintext and the password.
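    As a rough illustration of the combination described (AES for the file contents, MD5 applied to the password), the sketch below uses the pycryptodome package. The step of doubling the 16-byte MD5 digest into a 32-byte AES-256 key is an assumption made here for the example, not the paper's published scheme, and MD5-based key derivation would not be recommended for new designs.

    # Hedged sketch: AES-256-CBC file encryption with an MD5-derived key (pycryptodome).
    import hashlib
    from Crypto.Cipher import AES
    from Crypto.Random import get_random_bytes
    from Crypto.Util.Padding import pad, unpad

    def derive_key(password: str) -> bytes:
        # MD5 yields 16 bytes; chaining two digests to reach 32 bytes is an
        # illustrative choice, not the method from the paper.
        d1 = hashlib.md5(password.encode()).digest()
        d2 = hashlib.md5(d1).digest()
        return d1 + d2

    def encrypt_file(path: str, password: str) -> None:
        key = derive_key(password)
        iv = get_random_bytes(16)
        cipher = AES.new(key, AES.MODE_CBC, iv)
        with open(path, "rb") as f:
            data = f.read()
        with open(path + ".enc", "wb") as f:
            f.write(iv + cipher.encrypt(pad(data, AES.block_size)))

    def decrypt_file(enc_path: str, password: str) -> bytes:
        key = derive_key(password)
        with open(enc_path, "rb") as f:
            blob = f.read()
        cipher = AES.new(key, AES.MODE_CBC, blob[:16])
        return unpad(cipher.decrypt(blob[16:]), AES.block_size)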

  15. 18 CFR 157.21 - Pre-filing procedures and review process for LNG terminal facilities and other natural gas...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... the pre-filing review of any pipeline or other natural gas facilities, including facilities not... from the subject LNG terminal facilities to the existing natural gas pipeline infrastructure. (b) Other... and review process for LNG terminal facilities and other natural gas facilities prior to filing of...

  16. 15 CFR 30.5 - Electronic Export Information filing application and certification processes and standards.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... the USPPI in the form of a power of attorney or written authorization, thus making them authorized... agencies participating in the AES postdeparture filing review process. Failure to meet the standards of the...) calendar days of receipt of the postdeparture filing application by the Census Bureau, or if a decision...

  17. Development of the Large-Scale Statistical Analysis System of Satellites Observations Data with Grid Datafarm Architecture

    NASA Astrophysics Data System (ADS)

    Yamamoto, K.; Murata, K.; Kimura, E.; Honda, R.

    2006-12-01

    In the Solar-Terrestrial Physics (STP) field, the amount of satellite observation data has been increasing every year. Three problems must be solved to achieve large-scale statistical analyses of such volumes of data. (i) More CPU power and larger memory and disk sizes are required; the total power of personal computers is not enough to analyze this amount of data, and while super-computers provide high-performance CPUs and large memory, they are usually separated from the Internet or connected only for programming or data file transfer. (ii) Most of the observation data files are managed at distributed data sites over the Internet, so users have to know where the data files are located. (iii) Since no common data format is available in the STP field, users have to prepare a reading program for each data set themselves. To overcome problems (i) and (ii), we constructed a parallel and distributed data analysis environment based on Gfarm, the reference implementation of the Grid Datafarm architecture. Gfarm shares computational resources and performs parallel distributed processing. In addition, Gfarm provides the Gfarm filesystem, which acts as a virtual directory tree spanning the nodes. The Gfarm environment is composed of three parts: a metadata server that manages information on the distributed files, filesystem nodes that provide computational resources, and a client that submits jobs to the metadata server and manages data processing schedules. In the present study, both data files and data processing are parallelized on a Gfarm with 6 filesystem nodes (each node: Pentium V 1 GHz CPU, 256 MB memory and 40 GB disk). To evaluate the performance of this Gfarm system, we scanned a large number of data files, each about 300 MB in size, using three processing methods: sequential processing on one node, sequential processing on each node, and parallel processing on each node. Comparing the number of files against the elapsed time, parallel and distributed processing shortened the elapsed time to about 1/5 of that of sequential processing. On the other hand, sequential processing was faster in another experiment in which each file was smaller than 100 KB; in that case the elapsed time to scan one file is within one second, which implies that disk swapping took place during parallel processing on each node. We note that operation became unstable when the number of files exceeded 1000. To overcome problem (iii), we developed an original data class. This class supports reading data files in various formats: it defines a schema for each type of data, encapsulates the structure of the data files, and converts them into a common internal format. In addition, because the class provides time re-sampling, users can easily convert multiple arrays with different time resolutions onto a common time base. Finally, using Gfarm, we achieved a high-performance environment for large-scale statistical data analyses. It should be noted that the present method is effective only when individual data files are large enough. At present, we are restructuring the Gfarm environment with 8 nodes (each node: Athlon 64 X2 dual-core 2 GHz CPU, 2 GB memory and 1.2 TB disk using RAID0). Our original class is to be implemented on the new Gfarm environment. In the present talk, we show the latest results of applying this system to data analyses involving a huge number of satellite observation data files.
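    The time re-sampling function of the class described above is not public; as a stand-in, the following numpy sketch shows the general idea of interpolating two series with different time resolutions onto a common time base.

    # Sketch of time re-sampling onto a common time base (numpy.interp stands in
    # for the class's re-sampling function; array contents are synthetic).
    import numpy as np

    t_fast = np.arange(0.0, 60.0, 0.5)        # 2 Hz series
    t_slow = np.arange(0.0, 60.0, 4.0)        # 0.25 Hz series
    fast = np.sin(t_fast / 10.0)
    slow = np.cos(t_slow / 10.0)

    t_common = np.arange(0.0, 60.0, 1.0)      # target 1 Hz time base
    fast_rs = np.interp(t_common, t_fast, fast)
    slow_rs = np.interp(t_common, t_slow, slow)
    aligned = np.column_stack([t_common, fast_rs, slow_rs])
    print(aligned.shape)                      # (60, 3): one row per common time step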

  18. DSN command system Mark III-78. [data processing

    NASA Technical Reports Server (NTRS)

    Stinnett, W. G.

    1978-01-01

    The Deep Space Network command Mark III-78 data processing system includes a capability for a store-and-forward handling method. The functions of (1) storing the command files at a Deep Space station; (2) attaching the files to a queue; and (3) radiating the commands to the spacecraft are straightforward. However, the total data processing capability is a result of assuming worst case, failure-recovery, or nonnominal operating conditions. Optional data processing functions include: file erase, clearing the queue, suspend radiation, command abort, resume command radiation, and close window time override.

  19. 50 CFR 221.12 - Where and how must documents be filed?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Hearing Process Document Filing and Service § 221.12 Where and how must documents be filed? (a) Place of... facsimile if: (A) The document is 20 pages or less, including all attachments; (B) The sending facsimile..., any document received after 5 p.m. at the place where the filing is due is considered filed on the...

  20. 78 FR 12365 - License Amendment Request for United Nuclear Corporation, Church Rock Mill-License No. SUA-1475

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-22

    .... Electronic Submissions (E-Filing) All documents filed in NRC adjudicatory proceedings, including a request... NRC E-Filing rule (72 FR 49139; August 28, 2007). The E-Filing process requires participants to submit... accordance with the procedures described below. To comply with the procedural requirements of E-Filing, at...

  1. P2P Watch: Personal Health Information Detection in Peer-to-Peer File-Sharing Networks

    PubMed Central

    El Emam, Khaled; Arbuckle, Luk; Neri, Emilio; Rose, Sean; Jonker, Elizabeth

    2012-01-01

    Background Users of peer-to-peer (P2P) file-sharing networks risk the inadvertent disclosure of personal health information (PHI). In addition to potentially causing harm to the affected individuals, this can heighten the risk of data breaches for health information custodians. Automated PHI detection tools that crawl the P2P networks can identify PHI and alert custodians. While there has been previous work on the detection of personal information in electronic health records, there has been a dearth of research on the automated detection of PHI in heterogeneous user files. Objective To build a system that accurately detects PHI in files sent through P2P file-sharing networks. The system, which we call P2P Watch, uses a pipeline of text processing techniques to automatically detect PHI in files exchanged through P2P networks. P2P Watch processes unstructured texts regardless of the file format, document type, and content. Methods We developed P2P Watch to extract and analyze PHI in text files exchanged on P2P networks. We labeled texts as PHI if they contained identifiable information about a person (eg, name and date of birth) and specifics of the person’s health (eg, diagnosis, prescriptions, and medical procedures). We evaluated the system’s performance through its efficiency and effectiveness on 3924 files gathered from three P2P networks. Results P2P Watch successfully processed 3924 P2P files of unknown content. A manual examination of 1578 randomly selected files marked by the system as non-PHI confirmed that these files indeed did not contain PHI, making the false-negative detection rate equal to zero. Of 57 files marked by the system as PHI, all contained both personally identifiable information and health information: 11 files were PHI disclosures, and 46 files contained organizational materials such as unfilled insurance forms, job applications by medical professionals, and essays. Conclusions PHI can be successfully detected in free-form textual files exchanged through P2P networks. Once the files with PHI are detected, affected individuals or data custodians can be alerted to take remedial action. PMID:22776692

  2. P2P watch: personal health information detection in peer-to-peer file-sharing networks.

    PubMed

    Sokolova, Marina; El Emam, Khaled; Arbuckle, Luk; Neri, Emilio; Rose, Sean; Jonker, Elizabeth

    2012-07-09

    Users of peer-to-peer (P2P) file-sharing networks risk the inadvertent disclosure of personal health information (PHI). In addition to potentially causing harm to the affected individuals, this can heighten the risk of data breaches for health information custodians. Automated PHI detection tools that crawl the P2P networks can identify PHI and alert custodians. While there has been previous work on the detection of personal information in electronic health records, there has been a dearth of research on the automated detection of PHI in heterogeneous user files. To build a system that accurately detects PHI in files sent through P2P file-sharing networks. The system, which we call P2P Watch, uses a pipeline of text processing techniques to automatically detect PHI in files exchanged through P2P networks. P2P Watch processes unstructured texts regardless of the file format, document type, and content. We developed P2P Watch to extract and analyze PHI in text files exchanged on P2P networks. We labeled texts as PHI if they contained identifiable information about a person (eg, name and date of birth) and specifics of the person's health (eg, diagnosis, prescriptions, and medical procedures). We evaluated the system's performance through its efficiency and effectiveness on 3924 files gathered from three P2P networks. P2P Watch successfully processed 3924 P2P files of unknown content. A manual examination of 1578 randomly selected files marked by the system as non-PHI confirmed that these files indeed did not contain PHI, making the false-negative detection rate equal to zero. Of 57 files marked by the system as PHI, all contained both personally identifiable information and health information: 11 files were PHI disclosures, and 46 files contained organizational materials such as unfilled insurance forms, job applications by medical professionals, and essays. PHI can be successfully detected in free-form textual files exchanged through P2P networks. Once the files with PHI are detected, affected individuals or data custodians can be alerted to take remedial action.
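    The labeling rule in both records (a file counts as PHI only if it contains both a personal identifier and health-specific content) can be caricatured in a few lines; the patterns and term list below are invented for illustration and are far simpler than the P2P Watch pipeline.

    # Toy co-occurrence rule: flag a text only if it has a personal identifier AND health content.
    import re

    ID_PATTERNS = [
        re.compile(r"\b\d{4}-\d{2}-\d{2}\b"),        # date of birth, ISO style
        re.compile(r"\b[A-Z][a-z]+ [A-Z][a-z]+\b"),  # crude full-name pattern
    ]
    HEALTH_TERMS = {"diagnosis", "prescription", "chemotherapy", "diabetes"}

    def looks_like_phi(text: str) -> bool:
        has_identifier = any(p.search(text) for p in ID_PATTERNS)
        has_health = any(term in text.lower() for term in HEALTH_TERMS)
        return has_identifier and has_health

    print(looks_like_phi("Jane Doe, 1980-04-02, diagnosis: type 2 diabetes"))  # True
    print(looks_like_phi("Quarterly sales report for Jane Doe"))               # False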

  3. DMFS: A Data Migration File System for NetBSD

    NASA Technical Reports Server (NTRS)

    Studenmund, William

    1999-01-01

    I have recently developed dmfs, a Data Migration File System, for NetBSD. This file system is based on the overlay file system, which is discussed in a separate paper, and provides kernel support for the data migration system being developed by my research group here at NASA/Ames. The file system utilizes an underlying file store to provide the file backing, and coordinates user and system access to the files. It stores its internal meta data in a flat file, which resides on a separate file system. Our data migration system provides archiving and file migration services. System utilities scan the dmfs file system for recently modified files, and archive them to two separate tape stores. Once a file has been doubly archived, files larger than a specified size will be truncated to that size, potentially freeing up large amounts of the underlying file store. Some sites will choose to retain none of the file (deleting its contents entirely from the file system) while others may choose to retain a portion, for instance a preamble describing the remainder of the file. The dmfs layer coordinates access to the file, retaining user-perceived access and modification times, file size, and restricting access to partially migrated files to the portion actually resident. When a user process attempts to read from the non-resident portion of a file, it is blocked and the dmfs layer sends a request to a system daemon to restore the file. As more of the file becomes resident, the user process is permitted to begin accessing the now-resident portions of the file. For simplicity, our data migration system divides a file into two portions, a resident portion followed by an optional non-resident portion. Also, a file is in one of three states: fully resident, fully resident and archived, and (partially) non-resident and archived. For a file which is only partially resident, any attempt to write or truncate the file, or to read a non-resident portion, will trigger a file restoration. Truncations and writes are blocked until the file is fully restored so that a restoration which only partially succeeds does not leave the file in an indeterminate state with portions existing only on tape and other portions only in the disk file system. We chose layered file system technology as it permits us to focus on the data migration functionality, and permits end system administrators to choose the underlying file store technology. We chose the overlay layered file system instead of the null layer for two reasons: first to permit our layer to better preserve meta data integrity and second to prevent even root processes from accessing migrated files. This is achieved as the underlying file store becomes inaccessible once the dmfs layer is mounted. We are quite pleased with how the layered file system has turned out. Of the 45 vnode operations in NetBSD, 20 (forty-four percent) required no intervention by our file layer - they are passed directly to the underlying file store. Of the twenty-five we do intercept, nine (such as vop_create()) are intercepted only to ensure meta data integrity. Most of the functionality was concentrated in five operations: vop_read, vop_write, vop_getattr, vop_setattr, and vop_fcntl. The first four are the core operations for controlling access to migrated files and preserving the user experience. vop_fcntl, a call generated for a certain class of fcntl codes, provides the command channel used by privileged user programs to communicate with the dmfs layer.

  4. Innovation in Library Education: Historical X-Files on Technology, People, and Change.

    ERIC Educational Resources Information Center

    Carmichael, James V., Jr.

    1998-01-01

    Discusses the history of library education and library educators. Highlights include Melvil Dewey's proposal for formal library education, the earlier apprentice system, obstacles to formal education, changes in attitudes toward patrons, accreditation, standards, and technological changes. (LRW)

  5. In-Process Items on LCS.

    ERIC Educational Resources Information Center

    Russell, Thyra K.

    Morris Library at Southern Illinois University computerized its technical processes using the Library Computer System (LCS), which was implemented in the library to streamline order processing by: (1) providing up-to-date online files to track in-process items; (2) encouraging quick, efficient accessing of information; (3) reducing manual files;…

  6. 78 FR 34370 - Revisions to Electric Quarterly Report Filing Process; Notice of Availability of Video Showing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-07

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM12-3-000] Revisions to Electric Quarterly Report Filing Process; Notice of Availability of Video Showing How To File Electric Quarterly Reports Using the Web Interface Take notice that the Federal Energy Regulatory Commission (Commission) is making available on its Web site ...

  7. 78 FR 43197 - Revisions to Electric Quarterly Report Filing Process; Notice of Availability of Sandbox...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-19

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM12-3-000] Revisions to Electric Quarterly Report Filing Process; Notice of Availability of Sandbox Electronic Test Site Take notice that a Sandbox Electronic Test Site (ETS) and instructions have been posted on the Commission's Web site at http://www.ferc.gov/docs-filing...

  8. Attitude profile design program

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The Attitude Profile Design (APD) Program was designed to be used as a stand-alone addition to the Simplex Computation of Optimum Orbital Trajectories (SCOOT). The program uses information from a SCOOT output file and the user defined attitude profile to produce time histories of attitude, angular body rates, and accelerations. The APD program is written in standard FORTRAN77 and should be portable to any machine that has an appropriate compiler. The input and output are through formatted files. The program reads the basic flight data, such as the states of the vehicles, acceleration profiles, and burn information, from the SCOOT output file. The user inputs information about the desired attitude profile during coasts in a high level manner. The program then takes these high level commands and executes the maneuvers, outputting the desired information.

  9. Implementation of Super-Encryption with Trithemius Algorithm and Double Transposition Cipher in Securing PDF Files on Android Platform

    NASA Astrophysics Data System (ADS)

    Budiman, M. A.; Rachmawati, D.; Jessica

    2018-03-01

    This study aims to combine the Trithemius algorithm and the double transposition cipher for file security, implemented as an Android-based application. The parameters examined are the real running time and the complexity value. The type of file used is a file in PDF format. The overall result shows that the complexity of the two algorithms with the super-encryption method is Θ(n²). However, the encryption process using the Trithemius algorithm is much faster than using the double transposition cipher. The processing time is linearly proportional to the length of the plaintext and the password.
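    For readers unfamiliar with the two ciphers, the sketch below shows only the primitives named in the abstract: a Trithemius (progressive Caesar) shift and a columnar transposition applied twice. Key handling, PDF I/O and the authors' actual implementation are not reproduced.

    # Illustrative super-encryption: Trithemius shift followed by double columnar transposition.
    import string

    ALPHA = string.ascii_uppercase

    def trithemius_encrypt(text: str) -> str:
        letters = [c for c in text.upper() if c in ALPHA]
        # The shift grows by one for each successive letter.
        return "".join(ALPHA[(ALPHA.index(ch) + i) % 26] for i, ch in enumerate(letters))

    def columnar_transpose(text: str, key: list[int]) -> str:
        cols = len(key)
        rows = [text[i:i + cols] for i in range(0, len(text), cols)]
        rows[-1] = rows[-1].ljust(cols, "X")               # pad the final row
        return "".join("".join(r[k] for r in rows) for k in key)

    def double_transpose(text: str, key1: list[int], key2: list[int]) -> str:
        return columnar_transpose(columnar_transpose(text, key1), key2)

    msg = "SECURE PDF FILES ON ANDROID"
    print(double_transpose(trithemius_encrypt(msg), [2, 0, 3, 1], [1, 3, 0, 2]))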

  10. Automation software for a materials testing laboratory

    NASA Technical Reports Server (NTRS)

    Mcgaw, Michael A.; Bonacuse, Peter J.

    1990-01-01

    The software environment in use at the NASA-Lewis Research Center's High Temperature Fatigue and Structures Laboratory is reviewed. This software environment is aimed at supporting the tasks involved in performing materials behavior research. The features and capabilities of the approach to specifying a materials test include static and dynamic control mode switching, enabling multimode test control; dynamic alteration of the control waveform based upon events occurring in the response variables; precise control over the nature of both command waveform generation and data acquisition; and the nesting of waveform/data acquisition strategies so that material history dependencies may be explored. To eliminate repetitive tasks in the conventional research process, a communications network software system is established which provides file interchange and remote console capabilities.

  11. 76 FR 53967 - Carolina Power & Light, Shearon Harris Nuclear Power Plant, Unit 1; Notice of Consideration of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-30

    ... hearing and intervention via electronic submission through the NRC E-Filing system. Requests for a hearing... accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires... requirements of E-Filing, at least 10 days prior to the filing deadline, the participant should contact the...

  12. 78 FR 5511 - Virgil C. Summer Nuclear Station, Units 2 and 3; Application and Amendment to Facility Combined...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-25

    ... (E-Filing) All documents filed in the NRC adjudicatory proceedings, including a request for hearing... entities participating under 10 CFR 2.315(c), must be filed in accordance with the NRC's E-Filing rule (72 FR 49139; August 28, 2007). The E-Filing process requires participants to submit and serve all...

  13. 76 FR 80982 - International Cyclotron, Inc., Hato Rey, Puerto Rico; Order Suspending Licensed Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-27

    ... under 10 CFR 2.315(c), must be filed in accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires participants to submit and serve all adjudicatory documents over... below. To comply with the procedural requirements of E-Filing, at least 10 days prior to the filing...

  14. 76 FR 39910 - Nine Mile Point Nuclear Station, LLC; Nine Mile Point Nuclear Station, Unit Nos. 1 and 2; Notice...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-07

    ... hearing and intervention via electronic submission through the NRC E-filing system. Requests for a hearing... accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires... requirements of E-Filing, at least ten (10) days prior to the filing deadline, the [[Page 39912

  15. 77 FR 11523 - Bryant Mountain, LLC; Notice of Intent To File License Application, Filing of Pre-Application...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-27

    ... and Commencing Pre-filing Process. b. Project No.: 13680-001. c. Dated Filed: December 21, 2011. d... Reclamation and private lands. g. Filed Pursuant to: 18 CFR Part 5 of the Commission's Regulations. h..., California, 94505, (925) 634-1550 or email at [email protected] . i. FERC Contact: Ryan Hansen at (202...

  16. Program to convert SUDS2ASC files to a single binary SEGY file

    USGS Publications Warehouse

    Goldman, Mark

    2000-01-01

    This program, SUDS2SEGY, converts and combines ASCII files created using SUDS2ASC Version 2.60, to a single SEGY file. SUDS2ASC has been used previously to create an ASCII file of three-component seismic data for an individual recording station. However, many seismic processing packages have difficulty reading in ASCII data. In addition, it may be cumbersome to process a separate file for each recording station, particularly if traces from different recording stations contain a different number of data samples and/or a different start time. This new program - SUDS2SEGY - combines these recording station files into a single SEGY file. In addition, SUDS2SEGY normalizes the trace times so that each trace starts at a given time and consists of a fixed number of samples. This normalization allows seismic data from many different stations to be read in as a single "data gather". SUDS2SEGY also produces a report summarizing the offset and maximum absolute amplitude for each component in a station file. These data are output separately to an ASCII file and can be subsequently input to a plotting package.
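    The normalization step (forcing every trace onto a common start time and a fixed sample count) is easy to show in outline; the function below is a generic Python sketch, not the SUDS2SEGY source, and omits the SEGY writing itself.

    # Shift a trace onto the gather start time and trim/zero-pad to a fixed length.
    import numpy as np

    def normalize_trace(data, t0, dt, gather_start, n_samples):
        # data: 1-D samples; t0: trace start time (s); dt: sample interval (s).
        offset = int(round((t0 - gather_start) / dt))   # samples between gather and trace start
        out = np.zeros(n_samples, dtype=data.dtype)
        src_start = max(0, -offset)                     # where to begin reading the trace
        dst_start = max(0, offset)                      # where to begin writing in the gather
        n = min(len(data) - src_start, n_samples - dst_start)
        if n > 0:
            out[dst_start:dst_start + n] = data[src_start:src_start + n]
        return out

    trace = np.arange(10.0)                             # a trace starting 0.02 s after the gather
    print(normalize_trace(trace, t0=1.02, dt=0.01, gather_start=1.00, n_samples=12))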

  17. Ground Processing of Data From the Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Wright, Jesse; Sturdevant, Kathryn; Noble, David

    2006-01-01

    A computer program implements the Earth side of the protocol that governs the transfer of data files generated by the Mars Exploration Rovers. It also provides tools for viewing data in these files and integrating data-product files into automated and manual processes. It reconstitutes files from telemetry data packets. Even if only one packet is received, metadata provide enough information to enable this program to identify and use partial data products. This software can generate commands to acknowledge received files and retransmit missed parts of files, or it can feed a manual process to make decisions about retransmission. The software uses an Extensible Markup Language (XML) data dictionary to provide a generic capability for displaying files of basic types, and uses external "plug-in" application programs to provide more sophisticated displays. This program makes data products available with very low latency, and can trigger automated actions when complete or partial products are received. The software is easy to install and use. The only system requirement for installing the software is a Java J2SE 1.4 platform. Several instances of the software can be executed simultaneously on the same machine.
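    A toy version of the reconstitution step is sketched below; it assumes each packet carries a byte offset and payload and that the product's total length is known from metadata. The types and names are hypothetical, not the MER ground software.

    # Reassemble a file from packets; gaps stay zero-filled so partial products remain usable.
    from dataclasses import dataclass

    @dataclass
    class Packet:
        file_id: str
        offset: int
        payload: bytes

    def reassemble(packets, total_length):
        data = bytearray(total_length)
        received = [False] * total_length
        for p in packets:
            end = min(p.offset + len(p.payload), total_length)
            data[p.offset:end] = p.payload[: end - p.offset]
            for i in range(p.offset, end):
                received[i] = True
        missing, start = [], None
        for i, ok in enumerate(received + [True]):      # sentinel closes a trailing gap
            if not ok and start is None:
                start = i
            elif ok and start is not None:
                missing.append((start, i))
                start = None
        return bytes(data), missing

    pkts = [Packet("img_001", 0, b"HEAD"), Packet("img_001", 8, b"TAIL")]
    data, gaps = reassemble(pkts, 12)
    print(gaps)   # [(4, 8)] -> the byte range to request or flag for retransmission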

  18. Storing files in a parallel computing system using list-based index to identify replica files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy

    Improved techniques are provided for storing files in a parallel computing system using a list-based index to identify file replicas. A file and at least one replica of the file are stored in one or more storage nodes of the parallel computing system. An index for the file comprises at least one list comprising a pointer to a storage location of the file and a storage location of the at least one replica of the file. The file comprises one or more of a complete file and one or more sub-files. The index may also comprise a checksum value for one or more of the file and the replica(s) of the file. The checksum value can be evaluated to validate the file and/or the file replica(s). A query can be processed using the list.
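    Read as a data structure, the abstract describes a per-file list of storage locations plus a checksum for validation. The dictionary-based sketch below is one interpretation of that description, with SHA-256 chosen arbitrarily as the checksum.

    # List-based replica index: file name -> replica locations plus a validation checksum.
    import hashlib

    index = {}

    def register(name: str, locations: list[str], content: bytes) -> None:
        index[name] = {
            "locations": list(locations),                       # file and replica storage nodes
            "checksum": hashlib.sha256(content).hexdigest(),    # checksum choice is illustrative
        }

    def validate(name: str, content: bytes) -> bool:
        # Re-hash a retrieved copy and compare against the indexed checksum.
        return hashlib.sha256(content).hexdigest() == index[name]["checksum"]

    register("run42.dat", ["node03", "node17"], b"simulation output bytes")
    print(index["run42.dat"]["locations"])                      # where the copies live
    print(validate("run42.dat", b"simulation output bytes"))    # True -> copy is intact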

  19. The Lone Inventor: Low Success Rates and Common Errors Associated with Pro-Se Patent Applications

    PubMed Central

    Gaudry, Kate S.

    2012-01-01

    A pro-se patent applicant is an inventor who chooses to represent himself while pursuing (“prosecuting”) a patent application. To the author's knowledge, this paper is the first empirical study addressing how applications filed by pro-se inventors fare compared to applications in which inventors were represented by patent attorneys or agents. The prosecution history of 500 patent applications filed at the United States Patent and Trademark Office were analyzed: inventors were represented by a patent professional for 250 of the applications (“represented applications”) but not in the other 250 (“pro-se applications”). 76% of the pro-se applications became abandoned (not issuing as a patent), as compared to 35% of the represented applications. Further, among applications that issued as patents, pro-se patents' claims appear to be narrower and therefore of less value than claims in the represented patent set. Case-specific data suggests that a substantial portion of pro-se applicants unintentionally abandon their applications, terminate the examination process relatively early, and/or fail to take advantage of interview opportunities that may resolve issues stalling allowance of the application. PMID:22470439

  20. User's manual for THPLOT, A FORTRAN 77 Computer program for time history plotting

    NASA Technical Reports Server (NTRS)

    Murray, J. E.

    1982-01-01

    A general purpose FORTRAN 77 computer program (THPLOT) for plotting time histories using Calcomp pen plotters is described. The program is designed to read a time history data file and to generate time history plots for selected time intervals and/or selected data channels. The capabilities of the program are described. The card input required to define the plotting operation is described and examples of card input and the resulting plotted output are given. The examples are followed by a description of the printed output, including both normal output and error messages. Lastly, implementation of the program is described. A complete listing of the program with reference maps produced by the CDC FTN 5.0 compiler is included.
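    A present-day equivalent of the plotting task (select a time interval and a subset of channels from a columnar time-history file) can be written in a few lines of Python with matplotlib; this is only an analogue of what THPLOT does, not a port of the FORTRAN code, and the file layout is assumed.

    # Plot selected channels of a whitespace-delimited time-history file over a chosen interval.
    import numpy as np
    import matplotlib.pyplot as plt

    # Fake time-history file: time in column 0, two data channels after it.
    t = np.arange(0.0, 60.0, 0.1)
    np.savetxt("time_history.dat", np.column_stack([t, np.sin(t), np.cos(t)]))

    data = np.loadtxt("time_history.dat")
    t0, t1 = 10.0, 30.0                                      # selected time interval (s)
    mask = (data[:, 0] >= t0) & (data[:, 0] <= t1)
    for col, name in [(1, "channel 1"), (2, "channel 2")]:   # selected channels
        plt.plot(data[mask, 0], data[mask, col], label=name)
    plt.xlabel("time (s)")
    plt.legend()
    plt.savefig("time_history.png")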

  1. 76 FR 56747 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-14

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 1 Take notice.... Description: Black Hills Power, Inc. submits tariff filing per 35.1: MBR Tariff Baseline to be effective 8/1... through WestConnect stakeholder process to be effective 11/1/2011. Filed Date: 09/02/2011. Accession...

  2. 17 CFR 202.3 - Processing of filings.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... of 1939, which are routed to the Division of Investment Management, and filings under the Public Utility Holding Company Act of 1935 which are also routed to the Division of Investment Management. A... respect to filings under the Investment Company Act of 1940 and certain filings relating to investment...

  3. 42 CFR 93.510 - Filing motions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 1 2011-10-01 2011-10-01 false Filing motions. 93.510 Section 93.510 Public Health... MISCONDUCT Opportunity To Contest ORI Findings of Research Misconduct and HHS Administrative Actions Hearing Process § 93.510 Filing motions. (a) Parties must file all motions and requests for an order or ruling...

  4. 42 CFR 93.510 - Filing motions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Filing motions. 93.510 Section 93.510 Public Health... MISCONDUCT Opportunity To Contest ORI Findings of Research Misconduct and HHS Administrative Actions Hearing Process § 93.510 Filing motions. (a) Parties must file all motions and requests for an order or ruling...

  5. 12 CFR 516.130 - Where are comments filed?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Where are comments filed? 516.130 Section 516.130 Banks and Banking OFFICE OF THRIFT SUPERVISION, DEPARTMENT OF THE TREASURY APPLICATION PROCESSING PROCEDURES Comment Procedures § 516.130 Where are comments filed? A commenter must file with the appropriate...

  6. Design of housing file box of fire academy based on RFID

    NASA Astrophysics Data System (ADS)

    Li, Huaiyi

    2018-04-01

    This paper presents a design scheme of intelligent file box based on RFID. The advantages of RFID file box and traditional file box are compared and analyzed, and the feasibility of RFID file box design is analyzed based on the actual situation of our university. After introducing the shape and structure design of the intelligent file box, the paper discusses the working process of the file box, and explains in detail the internal communication principle of the RFID file box and the realization of the control system. The application of the RFID based file box will greatly improve the efficiency of our school's archives management.

  7. The Microcomputer as an Administrative/Educational Tool in Education of the Hearing Impaired.

    ERIC Educational Resources Information Center

    Graham, Richard

    1982-01-01

    Administrative and instructional uses of microcomputers with hearing impaired students (infants to junior high level) are described. Uses include data storage and retrieval, maintenance of student history files, storage of test data, and vocabulary reinforcement for students. (CL)

  8. Accelerating Malware Detection via a Graphics Processing Unit

    DTIC Science & Technology

    2010-09-01

    GPU Graphics Processing Unit; PE Portable Executable; COFF Common Object File Format ... operating systems for the future [Szo05]. The PE format is an updated version of the common object file format (COFF) [Mic06]. Microsoft released a new ... [NAs02]. These alerts can be costly in terms of time and resources for individuals and organizations to investigate each misidentified file [YWL07] [Vak10].

  9. Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation

    DTIC Science & Technology

    2018-01-01

    ARL-TR-8284 ● JAN 2018, US Army Research Laboratory: Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation.

  10. TES Validation Reports

    Atmospheric Science Data Center

    2014-06-30

    ... Reports: TES Data Versions: TES Validation Report Version 6.0 (PDF) R13 processing version; F07_10 file versions TES Validation Report Version 5.0 (PDF) R12 processing version; F06_08, F06_09 file ...

  11. 75 FR 35846 - In the Matter of Babcock & Wilcox Nuclear Operations Group, Inc., Lynchburg, VA; Order Imposing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-23

    ... under 10 CFR 2.315(c), must be filed in accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires participants to submit and serve all adjudicatory documents over... below. To comply with the procedural requirements of E-Filing, at least 10 days prior to the filing...

  12. 78 FR 13384 - In the Matter of FirstEnergy Nuclear Operating Co. (Beaver Valley Units 1 and 2); Confirmatory...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-27

    ... under 10 CFR 2.315(c), must be filed in accordance with NRC E-Filing rule (72 FR 49139; August 28, 2007). The E-Filing process requires participants to submit and serve all adjudicatory documents over the.... To comply with the procedural requirements of E-Filing, at least 10 days prior to the filing deadline...

  13. 76 FR 22935 - Calvert Cliffs Nuclear Power Plant, LLC Independent Spent Fuel Storage Installation; Notice of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-25

    ... accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires... requirements of E-Filing, at least ten (10) days prior to the filing deadline, the participant should contact... may attempt to use other software not listed on the Web site, but should note that the NRC's E-Filing...

  14. 78 FR 72120 - Tennessee Valley Authority Watts Bar Nuclear Plant Unit No. 2; Order Approving Extension of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-02

    ... under 10 CFR 2.315(c), must be filed in accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires participants to submit and serve all adjudicatory documents over... below. To comply with the procedural requirements of E-Filing, at least ten 10 days prior to the filing...

  15. 76 FR 70766 - Notice of Application From FMRI for Consent to an Indirect Change of Control for Source Material...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-15

    ... Submissions (E-Filing) All documents filed in NRC adjudicatory proceedings, including a request for hearing, a... entities participating under 10 CFR 2.315(c), must be filed in accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires participants to submit and serve all adjudicatory...

  16. VizieR Online Data Catalog: Li enrichment histories of the thick/thin disc (Fu+, 2018)

    NASA Astrophysics Data System (ADS)

    Fu, X.; Romano, D.; Bragaglia, A.; Mucciarelli, A.; Lind, K.; Delgado Mena, E.; Sousa, S. G.; Randich, S.; Bressan, A.; Sbordone, L.; Martell, S.; Korn, A. J.; Abia, C.; Smiljanic, R.; Jofre, P.; Pancino, E.; Tautvaisiene, G.; Tang, B.; Lanzafame, A. C.; Magrini, L.; Carraro, G.; Bensby, T.; Damiani, F.; Alfaro, E. J.; Flaccomio, E.; Morbidelli, L.; Zaggia, S.; Lardo, C.; Monaco, L.; Frasca, A.; Donati, P.; Drazdauskas, A.; Chorniy, Y.; Bayo, A.; Kordopatis, G.

    2017-11-01

    To investigate the Galactic lithium enrichment history we select well-measured main sequence field stars with UVES spectra from the GES iDR4 catalogue. In our selection, 1884 UVES stars are marked as field stars, including those of the Galactic disc and halo designated as MW (GEMW) fields, standard CoRoT (GES D_CR) field, standard radial velocity (GES DRV) field, and stars to the Galactic Bulge direction (GEMWBL). (1 data file).

  17. Council of War: A History of the Joint Chiefs of Staff, 1942-1991

    DTIC Science & Technology

    2012-07-01

    PLANNING nuclear fission could produce enormous explosive power. Among those alarmed by the German breakthrough were Leo Szilard, a Hungarian expatriate...Letter, Roosevelt to Einstein, October 19, 1939, Safe File, PSF, Roosevelt Library. See also Leo Szilard, "Reminiscences," in Perspectives in American...1960, loc. cit. 85 Michael A. Palmer, Guardians of the Gulf: A History of America's Expanding Role in the Persian Gulf, 1833-1992 (New York: Free

  18. Development of an indexed integrated neuroradiology reports for teaching file creation

    NASA Astrophysics Data System (ADS)

    Tameem, Hussain Z.; Morioka, Craig; Bennett, David; El-Saden, Suzie; Sinha, Usha; Taira, Ricky; Bui, Alex; Kangarloo, Hooshang

    2007-03-01

    The decrease in reimbursement rates for radiology procedures has placed even more pressure on radiology departments to increase their clinical productivity. Clinical faculty have less time for teaching residents, but with the advent and prevalence of an electronic environment that includes PACS, RIS, and HIS, there is an opportunity to create electronic teaching files for fellows, residents, and medical students. Experienced clinicians, who select the most appropriate radiographic image and the clinical information relevant to that patient, create these teaching files. Important cases are selected based on the difficulty in determining the diagnosis or the manifestation of rare diseases. This manual process of teaching file creation is time consuming and may not be practical under the pressure of increased demands on the radiologist. It is the goal of this research to automate the process of teaching file creation by manually selecting key images and automatically extracting key sections from clinical reports and laboratory results. The text report is then processed for indexing to two standard nomenclatures, UMLS and RADLEX. Interesting teaching files can then be queried based on specific anatomy and findings found within the clinical reports.

  19. Applying the Karma Provenance tool to NASA's AMSR-E Data Production Stream

    NASA Astrophysics Data System (ADS)

    Ramachandran, R.; Conover, H.; Regner, K.; Movva, S.; Goodman, H. M.; Pale, B.; Purohit, P.; Sun, Y.

    2010-12-01

    Current procedures for capturing and disseminating provenance, or data product lineage, are limited in both what is captured and how it is disseminated to the science community. For example, the Advanced Microwave Scanning Radiometer for the Earth Observing System (AMSR-E) Science Investigator-led Processing System (SIPS) generates Level 2 and Level 3 data products for a variety of geophysical parameters. Data provenance and quality information for these data sets is either very general (e.g., user guides, a list of anomalous data receipt and processing conditions over the life of the missions) or difficult to access or interpret (e.g., quality flags embedded in the data, production history files not easily available to users). Karma is a provenance collection and representation tool designed and developed for data driven workflows such as the productions streams used to produce EOS standard products. Karma records uniform and usable provenance metadata independent of the processing system while minimizing both the modification burden on the processing system and the overall performance overhead. Karma collects both the process and data provenance. The process provenance contains information about the workflow execution and the associated algorithm invocations. The data provenance captures metadata about the derivation history of the data product, including algorithms used and input data sources transformed to generate it. As part of an ongoing NASA funded project, Karma is being integrated into the AMSR-E SIPS data production streams. Metadata gathered by the tool will be presented to the data consumers as provenance graphs, which are useful in validating the workflows and determining the quality of the data product. This presentation will discuss design and implementation issues faced while incorporating a provenance tool into a structured data production flow. Prototype results will also be presented in this talk.

  20. Preliminary investigation of single-file diffusion in complex plasma rings

    NASA Astrophysics Data System (ADS)

    Theisen, W. L.; Sheridan, T. E.

    2010-04-01

    Particles in one-dimensional (1D) systems cannot pass each other. However, it is still possible to define a diffusion process where the mean-squared displacement (msd) of an ensemble of particles in a 1D chain increases with time t. This process is called single-file diffusion. In contrast to diffusive processes that follow Fick's law, where the msd grows linearly with t, single-file diffusion is sub-Fickean and the msd is predicted to increase as t^1/2. We have recently created 1D dusty (complex) plasma rings in the DONUT (Dusty ONU experimenT) apparatus. Particle position data from these rings will be analyzed to determine the scaling of the msd with time and results will be compared with predictions of single-file diffusion theory.
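    For reference, the two scalings contrasted in the abstract can be written side by side (the notation is generic; D is the diffusion coefficient and F the single-file mobility):

    % Fickian 1D diffusion versus single-file diffusion of the mean-squared displacement
    \[
      \langle \Delta x^{2}(t) \rangle_{\text{Fickian}} = 2Dt,
      \qquad
      \langle \Delta x^{2}(t) \rangle_{\text{single-file}} = 2F\,t^{1/2}.
    \]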

  1. Trigemino-autonomic headache and Horner syndrome as a first sign of granulomatous hypophysitis

    PubMed Central

    Kreitschmann-Andermahr, Ilonka; Fisse, Anna Lena; Börnke, Christian; Schroeder, Christoph; Pitarokoili, Kalliopi; Müller, Oliver; Lukas, Carsten; van de Nes, Johannes; Buslei, Rolf; Gold, Ralf; Ayzenberg, Ilya

    2017-01-01

    Objective: To report a rare case of incipient granulomatous hypophysitis presenting by atypical trigemino-autonomic cephalalgia (TAC) and Horner syndrome. Methods: The patient was investigated with repeated brain MRI, CSF examination, thoracic CT, Doppler and duplex ultrasound of the cerebral arteries, and extensive serologic screening for endocrine and autoimmune markers. Written informed consent was obtained from the patient for access to clinical files for research purposes and for publication. Results: We present a middle-aged woman with a history of an autoimmune pancreatitis type 2 who had therapy-refractory TAC with Horner syndrome. Initial cerebral MRI showed only indistinct and unspecific signs of a pathologic process. A biopsy revealed a granulomatous hypophysitis. The symptoms disappeared after transsphenoidal subtotal resection of the pituitary mass and anti-inflammatory therapy. Conclusions: This case elucidates that inflammatory pituitary diseases must be taken into account in case of atypical and refractory TAC, especially in patients with a history of autoimmune diseases. To our knowledge, the association between TAC accompanied by Horner syndrome and hypophysitis has not yet been described before. PMID:28243612

  2. GeoChange Global Change Data

    USGS Publications Warehouse

    ,

    1997-01-01

    GeoChange is an online data system providing access to research results and data generated by the U.S. Geological Survey's Global Change Research Program. Researchers in this program study climate history and the causes of climatic variations, as well as providing baseline data sets on contemporary atmospheric chemistry, high-resolution meteorology, and dust deposition. Research results are packaged as data sets, groups of digital files containing scientific observations, documentation, and interpretation. The data sets are arranged in a consistent manner using standard file formats so that users of a variety of computer systems can access and use them.

  3. Simulation of multi-photon emission isotopes using time-resolved SimSET multiple photon history generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chiang, Chih-Chieh; Lin, Hsin-Hon; Lin, Chang-Shiun

    Multiple-photon emitters, such as In-111 or Se-75, have enormous potential in the field of nuclear medicine imaging. For example, Se-75 can be used to investigate bile acid malabsorption and measure bile acid pool loss. The simulation system for emission tomography (SimSET) is a well-known Monte Carlo simulation (MCS) code in nuclear medicine, valued for its high computational efficiency. However, the current SimSET cannot simulate these isotopes because it lacks models of the complex decay scheme and the time-dependent decay process. To extend the versatility of SimSET for simulation of these multi-photon emission isotopes, a time-resolved multiple photon history generator based on the SimSET code is developed in the present study. For developing the time-resolved SimSET (trSimSET) with a radionuclide decay process, the new MCS model introduces new features, including decay time information and photon time-of-flight information. The half-lives of energy states were tabulated from the Evaluated Nuclear Structure Data File (ENSDF) database. The MCS results indicate that the overall percent difference is less than 8.5% for all simulation trials as compared to GATE. To sum up, we demonstrated that the time-resolved SimSET multiple photon history generator can have accuracy comparable to GATE while keeping better computational efficiency. The new MCS code is very useful for studying multi-photon imaging of novel isotopes that require simulation of lifetimes and time-of-flight measurements. (authors)

  4. Reporting Differences Between Spacecraft Sequence Files

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy E.; Fisher, Forest W.

    2010-01-01

    A suite of computer programs, called seq diff suite, reports differences between the products of other computer programs involved in the generation of sequences of commands for spacecraft. These products consist of files of several types: replacement sequence of events (RSOE), DSN keyword file [DKF (wherein DSN signifies Deep Space Network)], spacecraft activities sequence file (SASF), spacecraft sequence file (SSF), and station allocation file (SAF). These products can include line numbers, request identifications, and other pieces of information that are not relevant when generating command sequence products, though these fields can result in the appearance of many changes to the files, particularly when using the UNIX diff command to inspect file differences. The outputs of prior software tools for reporting differences between such products include differences in these non-relevant pieces of information. In contrast, seq diff suite removes the fields containing the irrelevant pieces of information before processing to extract differences, so that only relevant differences are reported. Thus, seq diff suite is especially useful for reporting changes between successive versions of the various products and, in particular, for flagging differences in fields relevant to the sequence command generation and review process.
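    The canonicalize-then-diff idea is easy to show in outline; the regular expressions below are placeholders standing in for the real field definitions, and Python's difflib stands in for the UNIX diff step.

    # Strip fields irrelevant to command generation before diffing, so only meaningful changes show.
    import difflib
    import re

    IRRELEVANT = [
        re.compile(r"^\s*\d+\s+"),        # leading line numbers
        re.compile(r"REQUEST_ID=\S+"),    # request identifications
    ]

    def canonicalize(lines):
        out = []
        for line in lines:
            for pat in IRRELEVANT:
                line = pat.sub("", line)
            out.append(line.rstrip())
        return out

    old = ["001  CMD TURN_ON_HEATER REQUEST_ID=A17", "002  CMD SET_RATE 5"]
    new = ["001  CMD TURN_ON_HEATER REQUEST_ID=B02", "002  CMD SET_RATE 7"]
    for d in difflib.unified_diff(canonicalize(old), canonicalize(new), lineterm=""):
        print(d)   # only the SET_RATE change survives canonicalization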

  5. Image processing tool for automatic feature recognition and quantification

    DOEpatents

    Chen, Xing; Stoddard, Ryan J.

    2017-05-02

    A system for defining structures within an image is described. The system includes reading of an input file, preprocessing the input file while preserving metadata such as scale information and then detecting features of the input file. In one version the detection first uses an edge detector followed by identification of features using a Hough transform. The output of the process is identified elements within the image.
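    The edge-detector-plus-Hough stage named in the abstract maps directly onto standard OpenCV calls; the sketch below uses a synthetic image and arbitrary thresholds and does not reproduce the patented preprocessing or quantification steps.

    # Edge detection followed by a probabilistic Hough transform on a synthetic image.
    import cv2
    import numpy as np

    img = np.zeros((200, 200), dtype=np.uint8)
    cv2.line(img, (20, 30), (180, 30), 255, 2)          # one synthetic linear feature

    edges = cv2.Canny(img, 50, 150)                     # step 1: edge detector
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                            threshold=50, minLineLength=40, maxLineGap=5)  # step 2: Hough
    if lines is not None:
        for x1, y1, x2, y2 in lines[:, 0]:
            print(f"detected segment from ({x1}, {y1}) to ({x2}, {y2})")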

  6. 76 FR 79755 - First Meeting: RTCA Special Committee 226 Audio Systems and Equipment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-22

    ... Administrative Remarks Introductions RTCA Overview Audio Systems and Equipment--Background and History Agenda..., Discussion, Recommendations and Assignment of Responsibilities Other Business Establish Agenda for Next..., Manager, Business Operations Branch, Federal Aviation Administration. [FR Doc. 2011-32863 Filed 12-21-11...

  7. Bread: CDC 7600 program that processes Spent Fuel Test Climax data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hage, G.L.

    BREAD will process a family of files copied from a data tape made by Hewlett-Packard equipment employed for data acquisition on the Spent Fuel Test-Climax at NTS. Tapes are delivered to Livermore approximately monthly. The process at this stage consists of four steps: read the binary files and convert from H-P 16-bit words to CDC 7600 60-bit words; check identification and data ranges; write the data in 6-bit ASCII (BCD) format, one data point per line; then sort the file by identifier and time.

  8. EPA Enforcement and Compliance History Online

    EPA Pesticide Factsheets

    The Environmental Protection Agency's Enforcement and Compliance History Online (ECHO) website provides customizable and downloadable information about environmental inspections, violations, and enforcement actions for EPA-regulated facilities related to the Clean Air Act, Clean Water Act, Resource Conservation and Recovery Act, and Safe Drinking Water Act. These data are updated weekly as part of the ECHO data refresh, and ECHO offers many user-friendly options to explore data, including the following. Facility Search: ECHO information is searchable by varied criteria, including location, facility type, and compliance status. Search results are customizable and downloadable. Comparative Maps and State Dashboards: These tools offer aggregated information about facility compliance status, regulatory agency compliance monitoring, and enforcement activity at the national and state level. Bulk Data Downloads: One of ECHO's most popular features is the ability to work offline by downloading large data sets. Users can take advantage of the ECHO Exporter, which provides summary information about each facility in comma-separated values (csv) file format, or download data sets by program as zip files.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sublet, J.-Ch.; Koning, A.J.; Forrest, R.A.

    The reasons for the conversion of the European Activation File, EAF, into ENDF-6 format are threefold. First, it significantly enhances the JEFF-3.0 release by the addition of an activation file. Second, it considerably increases its usage by using a recognized, official file format, allowing existing plug-in processes to be effective. Third, it moves towards a universal nuclear data file, in contrast to the current separate general and special-purpose files. The format chosen for the JEFF-3.0/A file uses reaction cross sections (MF-3), cross sections (MF-10), and multiplicities (MF-9). Having the data in ENDF-6 format allows the ENDF suite of utilities and checker codes to be used alongside many other utility, visualizing, and processing codes. It is based on the EAF activation file used for many applications from fission to fusion, including dosimetry, inventories, depletion-transmutation, and geophysics. JEFF-3.0/A takes advantage of four generations of EAF files. Extensive benchmarking activities on these files provide feedback and validation with integral measurements. These, in parallel with a detailed graphical analysis based on EXFOR, have been applied, stimulating new measurements and significantly increasing the quality of this activation file. The next step is to include the EAF uncertainty data for all channels into JEFF-3.0/A.

  10. 78 FR 143 - Desert Mining, Inc., Eagle Broadband, Inc., Endovasc, Inc., Environmental Oil Processing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-02

    ... SECURITIES AND EXCHANGE COMMISSION [File No. 500-1] Desert Mining, Inc., Eagle Broadband, Inc., Endovasc, Inc., Environmental Oil Processing Technology Corp., Falcon Ridge Development, Inc., Fellows... Environmental Oil Processing Technology Corp. because it has not filed any periodic reports since the period...

  11. Project Integration Architecture (PIA) and Computational Analysis Programming Interface (CAPRI) for Accessing Geometry Data from CAD Files

    NASA Technical Reports Server (NTRS)

    Benyo, Theresa L.

    2002-01-01

    Integration of a supersonic inlet simulation with a computer aided design (CAD) system is demonstrated. The integration is performed using the Project Integration Architecture (PIA). PIA provides a common environment for wrapping many types of applications. Accessing geometry data from CAD files is accomplished by incorporating appropriate function calls from the Computational Analysis Programming Interface (CAPRI). CAPRI is a CAD-vendor-neutral programming interface that aids in acquiring geometry data directly from CAD files. The benefits of wrapping a supersonic inlet simulation into PIA using CAPRI are direct access to geometry data, accurate capture of geometry data, automatic conversion of data units, CAD-vendor-neutral operation, and on-line interactive history capture. This paper describes the PIA and the CAPRI wrapper and details the supersonic inlet simulation demonstration.

  12. Reducing Bits in Electrodeposition Process of Commercial Vehicle - A Case Study

    NASA Astrophysics Data System (ADS)

    Rahim, Nabiilah Ab; Hamedon, Zamzuri; Mohd Turan, Faiz; Iskandar, Ismed

    2016-02-01

    The painting process is critical in commercial vehicle manufacturing for both protection and decoration. Good quality of the painted body is important to reduce repair cost and achieve customer satisfaction. To achieve good quality, it is important to reduce defects at the first stage of the painting process, which is the electrodeposition process. The Pareto chart and the cause-and-effect diagram from the seven QC tools are utilized to reduce electrodeposition defects. The main defects in the electrodeposition process in this case study are bits, and 55% of the bits are iron filings. The iron filings, which come from the metal assembly process at the body shop, are minimised by controlling the spot-welding parameters, defect control, and a standard body cleaning process. However, some iron filings still remain on the body and are carried over to the paint shop. The remaining iron filings settle inside the dipping tank and are removed by a filtration system and magnetic separation. The implementation of the filtration system and magnetic separation reduced bits by 27% and sanding man-hours by 42%, with a total saving of RM38.00 per unit.

  13. Least-Squares Neutron Spectral Adjustment with STAYSL PNNL

    NASA Astrophysics Data System (ADS)

    Greenwood, L. R.; Johnson, C. D.

    2016-02-01

    The STAYSL PNNL computer code, a descendant of the STAY'SL code [1], performs neutron spectral adjustment of a starting neutron spectrum, applying a least squares method to determine adjustments based on saturated activation rates, neutron cross sections from evaluated nuclear data libraries, and all associated covariances. STAYSL PNNL is provided as part of a comprehensive suite of programs [2], where additional tools in the suite are used for assembling a set of nuclear data libraries and determining all required corrections to the measured data to determine saturated activation rates. Neutron cross section and covariance data are taken from the International Reactor Dosimetry File (IRDF-2002) [3], which was sponsored by the International Atomic Energy Agency (IAEA), though work is planned to update to data from the IAEA's International Reactor Dosimetry and Fusion File (IRDFF) [4]. The nuclear data and associated covariances are extracted from IRDF-2002 using the third-party NJOY99 computer code [5]. The NJpp translation code converts the extracted data into a library data array format suitable for use as input to STAYSL PNNL. The software suite also includes three utilities to calculate corrections to measured activation rates. Neutron self-shielding corrections are calculated as a function of neutron energy with the SHIELD code and are applied to the group cross sections prior to spectral adjustment, thus making the corrections independent of the neutron spectrum. The SigPhi Calculator is a Microsoft Excel spreadsheet used for calculating saturated activation rates from raw gamma activities by applying corrections for gamma self-absorption, neutron burn-up, and the irradiation history. Gamma self-absorption and neutron burn-up corrections are calculated (iteratively in the case of the burn-up) within the SigPhi Calculator spreadsheet. The irradiation history corrections are calculated using the BCF computer code and are inserted into the SigPhi Calculator workbook for use in correcting the measured activities. Output from the SigPhi Calculator is automatically produced, and consists of a portion of the STAYSL PNNL input file data that is required to run the spectral adjustment calculations. Within STAYSL PNNL, the least-squares process is performed in one step, without iteration, and provides rapid results on PC platforms. STAYSL PNNL creates multiple output files with tabulated results, data suitable for plotting, and data formatted for use in subsequent radiation damage calculations using the SPECTER computer code (which is not included in the STAYSL PNNL suite). All components of the software suite have undergone extensive testing and validation prior to release and test cases are provided with the package.
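    The least-squares step itself can be summarized compactly. Below is a minimal numpy sketch of the standard generalized least-squares spectrum adjustment; the notation and arrays are ours, and the internal formulation used by STAYSL PNNL may differ.

      # Sketch of a generalized least-squares (GLS) spectrum adjustment:
      #   phi0 : prior group fluxes (n,)        M : prior flux covariance (n, n)
      #   S    : dosimeter response matrix (m, n), i.e. group cross sections
      #   a    : measured saturated activation rates (m,)
      #   V    : covariance of the measured rates (m, m)
      import numpy as np

      def gls_adjust(phi0, M, S, a, V):
          """Return the adjusted spectrum and its covariance."""
          K = M @ S.T @ np.linalg.inv(S @ M @ S.T + V)   # gain matrix
          phi = phi0 + K @ (a - S @ phi0)                # adjusted fluxes
          M_adj = M - K @ S @ M                          # adjusted covariance
          return phi, M_adj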

  14. RivGen, Igiugig Deployment, Control System Specifications and Models

    DOE Data Explorer

    Forbush, Dominic; Cavagnaro, Robert J.; Guerra, Maricarmen; Donegan, James; McEntee, Jarlath; Thomson, Jim; Polagye, Brian; Fabien, Brian; Kilcher, Levi

    2016-03-21

    Control system simulation models, case studies, and processing codes for analyzing field data. Raw data files from the VFD and SCADA systems are included. MATLAB and Simulink are required to open some data files and all model files.

  15. Changing an automated drug inventory control system to a data base design.

    PubMed

    Bradish, R A

    1982-09-01

    A pharmacy department's change from indexed sequential access files to a data base management system (DBMS) for purposes of automated inventory control is described. The DBMS has three main functional areas: (1) inventory ordering and accountability, (2) charging of interdepartmental and intradepartmental orders, and (3) data manipulation with report design for management control. There are seven files directly related to the inventory ordering and accountability area. Each record can be accessed directly or through another file. Information on the quantity of a drug on hand, drug(s) supplied by a specific vendor, status of a purchase order, or calculation of an estimated order quantity can be retrieved quickly. In the drug master file, two records contain a reorder point and safety-stock level that are determined by searching the entries in the order history file and vendor master file. The intradepartmental and interdepartmental orders section contains five files assigned to record and store information on drug distribution. All items removed from the stockroom and distributed are recorded, and reports can be generated for itemized bills, total cost by area, and as formatted files for the accounts payable department. The design, development, and implementation of the DBMS took approximately a year using a part-time pharmacist and minimal outside help, while the previous system required the constant and expensive help of a programmer/analyst. The DBMS has given the pharmacy department a flexible inventory management system with increased drug control, decreased operating expenses, increased use of department personnel, and the ability to develop and enhance other systems.
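    As an illustration of the kind of calculation the order history and vendor master files support, the sketch below derives a reorder point from recent usage and a vendor lead time; the record layout and the formula are illustrative assumptions, not the article's actual algorithm.

      # Hypothetical reorder-point calculation from an order history list.
      # Each history record is (date, quantity_dispensed); lead_time_days and
      # safety_stock would come from the vendor master and drug master records.
      from datetime import date

      def reorder_point(history, lead_time_days, safety_stock):
          """Average daily usage over the history window times lead time,
          plus safety stock."""
          if not history:
              return safety_stock
          days = max((history[-1][0] - history[0][0]).days, 1)
          total = sum(qty for _, qty in history)
          return round(total / days * lead_time_days + safety_stock)

      history = [(date(2024, 1, 1), 30), (date(2024, 1, 15), 45), (date(2024, 1, 29), 40)]
      print(reorder_point(history, lead_time_days=7, safety_stock=20))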

  16. Automating Reference Desk Files with Microcomputers in a Public Library: An Exploration of Data Resources, Methods, and Software.

    ERIC Educational Resources Information Center

    Miley, David W.

    Many reference librarians still rely on manual searches to access vertical files, ready reference files, and other information stored in card files, drawers, and notebooks scattered around the reference department. Automated access to these materials via microcomputers using database management software may speed up the process. This study focuses…

  17. 76 FR 40906 - Pacific Gas and Electric Company; Notice of Intent To File License Application, Filing of Pre...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-12

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 2678-005] Pacific Gas and... Application for a Subsequent License and Commencing Pre-filing Process. b. Project No.: 2678-005. c. Dated Filed: April 29, 2011. d. Submitted By: Pacific Gas and Electric Company. e. Name of Project: Narrows No...

  18. 78 FR 55069 - Whitewater Green Energy, LLC; Notice of Intent to File License Application, Filing of Pre...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-09

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 14383-005] Whitewater Green... Application for an Original License and Commencing Pre-filing Process. b. Project No.: 14383-005. c. Dated Filed: July 8, 2013. d. Submitted By: Whitewater Green Energy, LLC. e. Name of Project: Whitewater Creek...

  19. Internet for Library Media Specialists.

    ERIC Educational Resources Information Center

    Simpson, Carol Mann

    This guide introduces the library media specialist to the Internet, its history and features, and provides information on specific uses of the Internet in school libraries and specific areas. Section 1, "What is the Internet?" introduces the reader to the Internet; electronic mail; telnet; file transfer protocol (FTP); wide area…

  20. Optimization of Hybrid-Electric Propulsion Systems for Small Remotely-Piloted Aircraft

    DTIC Science & Technology

    2011-03-24

    automobile manufacturer has developed its version of a HEV. In 2008, a group from the University of Padova, Italy designed a surface-mounted permanent...File:Hybridpeak.png [8] Ernest H. Wakefield, History of the Electric Automobile : Hybrid Electric Vehicles. Warrendale, PA: Society of Automotive

  1. 17 CFR 229.301 - (Item 301) Selected financial data.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... subsidiaries consolidated. 4. If interim period financial statements are included, or are required to be... issuer shall disclose also the following information in all filings containing financial statements: A.... currency of the foreign currency in which the financial statements are denominated; B. A history of...

  2. Data Processing Aspects of MEDLARS

    PubMed Central

    Austin, Charles J.

    1964-01-01

    The speed and volume requirements of MEDLARS necessitate the use of high-speed data processing equipment, including paper-tape typewriters, a digital computer, and a special device for producing photo-composed output. Input to the system is of three types: variable source data, including citations from the literature and search requests; changes to such master files as the medical subject headings list and the journal record file; and operating instructions such as computer programs and procedures for machine operators. MEDLARS builds two major stores of data on magnetic tape. The Processed Citation File includes bibliographic citations in expanded form for high-quality printing at periodic intervals. The Compressed Citation File is a coded, time-sequential citation store which is used for high-speed searching against demand request input. Major design considerations include converting variable-length, alphanumeric data to mechanical form quickly and accurately; serial searching by the computer within a reasonable period of time; high-speed printing that must be of graphic quality; and efficient maintenance of various complex computer files. PMID:14119287

  3. DATA PROCESSING ASPECTS OF MEDLARS.

    PubMed

    AUSTIN, C J

    1964-01-01

    The speed and volume requirements of MEDLARS necessitate the use of high-speed data processing equipment, including paper-tape typewriters, a digital computer, and a special device for producing photo-composed output. Input to the system is of three types: variable source data, including citations from the literature and search requests; changes to such master files as the medical subject headings list and the journal record file; and operating instructions such as computer programs and procedures for machine operators. MEDLARS builds two major stores of data on magnetic tape. The Processed Citation File includes bibliographic citations in expanded form for high-quality printing at periodic intervals. The Compressed Citation File is a coded, time-sequential citation store which is used for high-speed searching against demand request input. Major design considerations include converting variable-length, alphanumeric data to mechanical form quickly and accurately; serial searching by the computer within a reasonable period of time; high-speed printing that must be of graphic quality; and efficient maintenance of various complex computer files.

  4. A Scientific Data Provenance API for Distributed Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raju, Bibi; Elsethagen, Todd O.; Stephan, Eric G.

    Data provenance has been an active area of research as a means to standardize how the origin of data, the history of processing events, and who or what influenced the results are explained. There are two approaches to capturing provenance information. The first approach is to collect observed evidence produced by an executing application using log files, event listeners, and temporary files that are used by the application or application developer. The provenance translated from these observations is an interpretation of the provided evidence. The second approach is called disclosed, because the application provides a firsthand account of the provenance based on the anticipated questions about data flow, process flow, and responsible agents. Most observed provenance collection systems collect a large amount of provenance information during an application run or workflow execution. The common trend in capturing provenance is to collect all possible information and then attempt to find the relevant information, which is not efficient. Existing disclosed provenance system APIs do not work well in distributed environments and have trouble determining where to fit the individual pieces of provenance information. This work focuses on determining more reliable solutions for provenance capture. As part of the Integrated End-to-end Performance Prediction and Diagnosis for Extreme Scientific Workflows (IPPD) project, an API was developed, called the Producer API (PAPI), which can disclose application-targeted provenance and is designed to work in distributed environments by means of unique object identification methods. The provenance disclosure approach adds additional metadata to the provenance information to uniquely identify the pieces and connect them together. PAPI uses a common provenance model to support this provenance integration across disclosure sources. The API also provides the flexibility to let the user decide what to do with the collected provenance. The collected provenance can be sent to a triple store using REST services, or it can be logged to a file.
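    A minimal sketch of the disclosed-provenance idea is given below; the class and method names are hypothetical illustrations and do not reproduce the actual PAPI interface.

      # Hypothetical disclosed-provenance recorder: each activity record gets
      # a globally unique identifier so records produced on different nodes
      # can later be connected.  This is an illustration only, not PAPI.
      import json, socket, uuid
      from datetime import datetime, timezone

      class ProvenanceRecorder:
          def __init__(self, sink_path):
              self.sink_path = sink_path

          def disclose(self, activity, inputs, outputs, agent):
              record = {
                  "id": str(uuid.uuid4()),            # unique across nodes
                  "node": socket.gethostname(),
                  "time": datetime.now(timezone.utc).isoformat(),
                  "activity": activity,
                  "used": inputs,
                  "generated": outputs,
                  "agent": agent,
              }
              with open(self.sink_path, "a") as f:   # or POST to a triple store
                  f.write(json.dumps(record) + "\n")
              return record["id"]

      rec = ProvenanceRecorder("provenance.log")
      rec.disclose("fft_filter", ["raw.dat"], ["filtered.dat"], agent="workflow-user")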

  5. Development of a user-friendly system for image processing of electron microscopy by integrating a web browser and PIONE with Eos.

    PubMed

    Tsukamoto, Takafumi; Yasunaga, Takuo

    2014-11-01

    Eos (Extensible object-oriented system) is one of the powerful applications for image processing of electron micrographs. Eos usually works only with character user interfaces (CUI) under operating systems (OS) such as OS X or Linux, and is therefore not user-friendly. Thus, users of Eos need to be expert at image processing of electron micrographs and to have some knowledge of computer science as well. However, not all people who need Eos are experts with a CUI. We therefore extended Eos into an OS-independent web system with a graphical user interface (GUI) by integrating a web browser. The advantage of using a web browser is not only to extend Eos with a GUI, but also to allow Eos to work in a distributed computational environment. Using Ajax (Asynchronous JavaScript and XML) technology, we implemented a more comfortable user interface in the web browser. Eos has more than 400 commands related to image processing for electron microscopy, and the usage of each command differs from the others. Since the beginning of development, Eos has managed its user interface through an interface definition file, "OptionControlFile", written in CSV (Comma-Separated Value) format; each command has an "OptionControlFile" that holds the information needed to generate its interface and describe its usage. The developed GUI system, called "Zephyr" (Zone for Easy Processing of HYpermedia Resources), also accesses "OptionControlFile" and produces a web user interface automatically, because this mechanism is mature and convenient. The basic actions of the client-side system were implemented properly and support auto-generation of web forms, with functions for execution, image preview, and file upload to a web server. Thus the system can execute Eos commands with the options unique to each command and carry out image analysis. Problems remain with the image file format for visualization and the workspace for analysis: the image file format information is useful to check whether input/output files are correct, and a common workspace for analysis is also needed because the client is physically separated from the server. We solved the file format problem by extending the rules of the OptionControlFile of Eos. Furthermore, to solve the workspace problem, we have developed two types of system. The first system uses only the local environment: the user runs a web server provided by Eos, accesses a web client through a web browser, and manipulates local files through the GUI in the web browser. The second system employs PIONE (Process-rule for Input/Output Negotiation Environment), a platform we are developing that works in a heterogeneous distributed environment. Users can put their resources, such as microscopic images and text files, into the server-side environment supported by PIONE, and experts can write PIONE rule definitions, which define a workflow of image processing. PIONE runs each image-processing step on suitable computers, following the defined rules. PIONE supports interactive manipulation, and a user is able to try a command with various setting values. In this situation, we contribute the auto-generation of a GUI for a PIONE workflow. As an advanced function, we have developed a module to log user actions. The logs include information such as setting values used in image processing, the sequence of commands, and so on. If we use the logs effectively, we can gain many advantages. For example, when an expert discovers some know-how of image processing, other users can share logs containing that know-how, and by analyzing the logs we may obtain recommended workflows for image analysis. To implement a social platform of image processing for electron microscopists, we have developed the system infrastructure as well. © The Author 2014. Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
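    The CSV-driven interface generation can be pictured with a small sketch. The column layout below is a hypothetical stand-in for the real OptionControlFile format, which is defined by Eos itself.

      # Sketch: turn a hypothetical OptionControlFile-style CSV into web-form
      # field descriptors.  The assumed columns (option, type, default, help)
      # are illustrative; the real Eos definition file has its own layout.
      import csv

      def form_fields(option_control_path):
          fields = []
          with open(option_control_path, newline="") as f:
              for row in csv.DictReader(f, fieldnames=["option", "type", "default", "help"]):
                  fields.append({
                      "name": row["option"],
                      "widget": "file" if row["type"] == "infile" else "text",
                      "value": row["default"],
                      "tooltip": row["help"],
                  })
          return fields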

  6. Cleaning of endodontic files, Part I: The effect of bioburden on the sterilization of endodontic files.

    PubMed

    Johnson, M A; Primack, P D; Loushine, R J; Craft, D W

    1997-01-01

    Ninety-two new endodontic files were randomly assigned to five groups with varying parameters of contamination, cleaning method, and sterilization (steam or chemical). Files were instrumented in bovine teeth to accumulate debris and a known contaminant, Bacillus stearothermophilus. Positive controls produced growth on both T-soy agar plates and in T-soy broth. Negative controls and experimental files (some with heavy debris) failed to produce growth. The results showed that there was no significant difference between contaminated files that were not cleaned before sterilization and contaminated files that were cleaned before sterilization. Bioburden present on endodontic files does not appear to affect the sterilization process.

  7. 47 CFR 1.1703 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... (CARS). All services authorized under part 78 of this title. (e) Filings. Any application, notification... conveyed by operation of rule upon filing notification of aeronautical frequency usage by MVPDs or... database, application filing system, and processing system for Multichannel Video and Cable Television...

  8. PLEXOS Input Data Generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The PLEXOS Input Data Generator (PIDG) is a tool that enables PLEXOS users to better version their data, automate data processing, collaborate in developing inputs, and transfer data between different production cost modeling and other power systems analysis software. PIDG can process data in a generalized format from multiple input sources, including CSV files, PostgreSQL databases, and PSS/E .raw files, and write it to an Excel file that can be imported into PLEXOS with only limited manual intervention.
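    A minimal sketch of the kind of translation PIDG automates is shown below, assuming a hypothetical generator table in CSV and the pandas/openpyxl stack; PIDG's own column mappings and sheet layout are not reproduced here.

      # Sketch: read a generic generator table from CSV and write it to an
      # Excel workbook sheet ready for import.  The column names and the
      # sheet name are illustrative assumptions, not PIDG's actual mapping.
      import pandas as pd

      generators = pd.read_csv("generators.csv")          # e.g. name, max_capacity, heat_rate
      with pd.ExcelWriter("plexos_input.xlsx", engine="openpyxl") as writer:
          generators.to_excel(writer, sheet_name="Generators", index=False)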

  9. 24 CFR 103.20 - Can someone help me with filing a claim?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 1 2011-04-01 2011-04-01 false Can someone help me with filing a... FAIR HOUSING FAIR HOUSING-COMPLAINT PROCESSING Complaints § 103.20 Can someone help me with filing a claim? HUD's Office of Fair Housing and Equal Opportunity can help you in filing a claim, if you contact...

  10. 24 CFR 103.20 - Can someone help me with filing a claim?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Can someone help me with filing a... FAIR HOUSING FAIR HOUSING-COMPLAINT PROCESSING Complaints § 103.20 Can someone help me with filing a claim? HUD's Office of Fair Housing and Equal Opportunity can help you in filing a claim, if you contact...

  11. 78 FR 101 - Guidance for Industry and Food and Drug Administration Staff; Acceptance and Filing Reviews for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-02

    ... element(s). In order to enhance the consistency of our acceptance and filing decisions and to help..., thereby assuring the consistency of our acceptance and filing decisions. This guidance is applicable to... issues that need to be addressed in a PMA and the key decisions to be made during the filing process. The...

  12. 47 CFR 1.10009 - What are the steps for electronic filing?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false What are the steps for electronic filing? 1... International Bureau Filing System § 1.10009 What are the steps for electronic filing? (a) Step 1: Register for... an FRN, go to Step 2. (2) In order to process your electronic application, you must have an FRN. You...

  13. 78 FR 66970 - In the Matter of Michael J. Buhrman; Order Prohibiting Involvement in NRC-Licensed Activities...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-07

    ... governmental entities participating under 10 CFR 2.315(c), must be filed in accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires participants to submit and serve all... procedures described below. To comply with the procedural requirements of E-Filing, at least ten 10 days...

  14. 76 FR 54261 - Carolina Power & Light; H.B. Robinson Steam Electric Plant, Unit No. 2; HBRSEP Independent Spent...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-31

    ... via electronic submission through the NRC E-Filing system. Requests for a hearing and petitions for... entities participating under 10 CFR 2.315(c), must be filed in accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires participants to submit and serve all adjudicatory...

  15. 75 FR 11202 - Southern Nuclear Operating Company; Notice of Consideration of Issuance of Amendment to Facility...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-10

    ... governmental entities participating under 10 CFR 2.315(c), must be filed in accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires participants to submit and serve all... accordance with the procedures described below. To comply with the procedural requirements of E-Filing, at...

  16. 75 FR 42465 - Exelon Generation Company, LLC; Notice of Consideration of Issuance of Amendment to Facility...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-21

    ... governmental entities participating under 10 CFR 2.315(c), must be filed in accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires participants to submit and serve all... procedures described below. To comply with the procedural requirements of E-Filing, at least ten (10) days...

  17. 76 FR 41532 - Yankee Atomic Electric Company, Yankee Nuclear Power Station (Yankee-Rowe); Notice of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-14

    ... electronic submission through the NRC E-filing system. Requests for a hearing and petitions for leave to... entities participating under 10 CFR 2.315(c), must be filed in accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires participants to submit and serve all adjudicatory...

  18. 76 FR 53970 - Carolina Power & Light; Brunswick Steam Electric Plant, Units 1 and 2; Independent Spent Fuel...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-30

    ... submission through the NRC E-Filing system. Requests for a hearing and petitions for leave to intervene... participating under 10 CFR 2.315(c), must be filed in accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires participants to submit and serve all adjudicatory documents...

  19. 76 FR 41530 - Connecticut Yankee Atomic Power Company, Haddam Neck Plant; Notice of Consideration of Approval...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-14

    ... intervention via electronic submission through the NRC E-filing system. Requests for a hearing and petitions... entities participating under 10 CFR 2.315(c), must be filed in accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires participants to submit and serve all adjudicatory...

  20. 75 FR 69707 - Exelon Generation Company, LLC; Notice of Consideration of Issuance of Amendment to Facility...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-15

    ... NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires participants to submit... accordance with the procedures described below. To comply with the procedural requirements of E-Filing, at... software not listed on the Web site, but should note that the NRC's E-Filing system does not support...

  1. 78 FR 2451 - Grand Gulf ESP Site; Consideration of Approval of Application Regarding Proposed Creation of a...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-11

    ... request a hearing and intervention via electronic submission through the NRC's E-filing system. Requests... entities participating under 10 CFR 2.315(c), must be filed in accordance with the NRC's E-Filing rule (72 FR 49139; August 28, 2007). The E-Filing process requires participants to submit and serve all...

  2. 75 FR 11572 - Notice of Acceptance for Docketing of the Application, Notice of Opportunity for Hearing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-11

    ... governmental entities participating under 10 CFR 2.315(c), must be filed in accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires participants to submit and serve all... procedures described below. To comply with the procedural requirements of E-Filing, at least ten (10) days...

  3. 78 FR 14362 - Tennessee Valley Authority; Notice of Acceptance for Docketing of Application and Notice of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-05

    ... participating under 10 CFR 2.315(c), must be filed in accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires participants to submit and serve all adjudicatory documents... described below. To comply with the procedural requirements of E-Filing, at least 10 days prior to the...

  4. 76 FR 66713 - Hydro Green Energy, LLC; Notice of Intent To File License Applications and Approving Use of the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-27

    ...-001, 13424-001, 13809-001, 13649-001, 13651-001] Hydro Green Energy, LLC; Notice of Intent To File... Intent To File License Applications and Approving Requests to Use the Traditional Licensing Process. b... Filed: August 23, 2011. d. Submitted By: Hydro Green Energy, LLC (Hydro Green). e. Name of Projects...

  5. 29 CFR 1640.8 - Processing of complaints or charges of employment discrimination filed with both the EEOC and a...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... discrimination filed with both the EEOC and a section 504 agency. 1640.8 Section 1640.8 Labor Regulations... complaints or charges of employment discrimination filed with both the EEOC and a section 504 agency. (a) Procedures for handling dual-filed complaints or charges. As between the EEOC and a section 504 agency...

  6. 29 CFR 1640.8 - Processing of complaints or charges of employment discrimination filed with both the EEOC and a...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... discrimination filed with both the EEOC and a section 504 agency. 1640.8 Section 1640.8 Labor Regulations... complaints or charges of employment discrimination filed with both the EEOC and a section 504 agency. (a) Procedures for handling dual-filed complaints or charges. As between the EEOC and a section 504 agency...

  7. 29 CFR 1640.8 - Processing of complaints or charges of employment discrimination filed with both the EEOC and a...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... discrimination filed with both the EEOC and a section 504 agency. 1640.8 Section 1640.8 Labor Regulations... complaints or charges of employment discrimination filed with both the EEOC and a section 504 agency. (a) Procedures for handling dual-filed complaints or charges. As between the EEOC and a section 504 agency...

  8. 29 CFR 1640.8 - Processing of complaints or charges of employment discrimination filed with both the EEOC and a...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... discrimination filed with both the EEOC and a section 504 agency. 1640.8 Section 1640.8 Labor Regulations... complaints or charges of employment discrimination filed with both the EEOC and a section 504 agency. (a) Procedures for handling dual-filed complaints or charges. As between the EEOC and a section 504 agency...

  9. 76 FR 76705 - Inside Passage Electric Cooperative; Notice of Intent To File License Application, Filing of Pre...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-08

    ... Application and Request to Use the Traditional Licensing Process. b. Project No.: 14066-001. c. Date Filed... regulations. h. Potential Applicant Contact: Peter Bibb, Inside Passage Electric Cooperative, P.O. Box 210149... Hansen at (202) 502-8074; or email at ryan.hansen@ferc.gov . j. IPEC filed its request to use the...

  10. Data files from the Grays Harbor Sediment Transport Experiment Spring 2001

    USGS Publications Warehouse

    Landerman, Laura A.; Sherwood, Christopher R.; Gelfenbaum, Guy; Lacy, Jessica; Ruggiero, Peter; Wilson, Douglas; Chisholm, Tom; Kurrus, Keith

    2005-01-01

    This publication consists of two DVD-ROMs, both of which are presented here. This report describes data collected during the Spring 2001 Grays Harbor Sediment Transport Experiment, and provides additional information needed to interpret the data. Two DVDs accompany this report; both contain documentation in HTML format that assists the user in navigating through the data. DVD-ROM-1 contains a digital version of this report in .pdf format, raw Aquatec acoustic backscatter (ABS) data in .zip format, Sonar data files in .avi format, and coastal processes and morphology data in ASCII format. ASCII data files are provided in .zip format; bundled coastal processes ASCII files are separated by deployment and instrument; bundled morphology ASCII files are separated into monthly data collection efforts containing the beach profiles collected (or extracted from the surface map) at that time; weekly surface maps are also bundled together. DVD-ROM-2 contains a digital version of this report in .pdf format, the binary data files collected by the SonTek instrumentation, calibration files for the pressure sensors, and MATLAB m-files for loading the ABS data into MATLAB and cleaning up the optical backscatter (OBS) burst time-series data.

  11. 77 FR 5574 - United States v. Grupo Bimbo, S.A.B. de C.V., et al.; Public Comment and Response on Proposed...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-03

    .... Procedural History On October 21, 2011, the United States filed a civil antitrust lawsuit against Defendants... hypercritically, nor with a microscope, but with an artist's reducing glass''); see generally Microsoft, 56 F.3d...

  12. Computer program documentation: CYBER to Univac binary conversion user's guide

    NASA Technical Reports Server (NTRS)

    Martin, E. W.

    1980-01-01

    A user's guide for a computer program which will convert SINDA temperature history data from CDC (Cyber) binary format to UNIVAC 1100 binary format is presented. The various options available, the required input, the optional output, file assignments, and the restrictions of the program are discussed.

  13. Internet Resources on Aging: Parts of the Internet.

    ERIC Educational Resources Information Center

    Post, Joyce A.

    1996-01-01

    Provides a brief history of the Internet and a listing of various resources on aging that can be obtained through the Internet. Components of the Internet discussed are electronic-mail applications (listservs, USENET Newsgroups, Bulletin Board Systems, Freenets, and Commercial Services); File Transfer Protocol; Telnet/Remote Login; Gophers; Wide…

  14. 78 FR 47830 - Privacy Act of 1974; Report of Matching Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-06

    ... of Veterans Affairs. ACTION: Notice of Computer Matching Program. SUMMARY: The Department of Veterans Affairs (VA) provides notice that it intends to conduct a recurring computer matching program matching... necessary information from RRB-26: Payment, Rate, and Entitlement History File, published at 75 FR 43729...

  15. Labor Market Turnover and Joblessness for Hispanic American Youth.

    ERIC Educational Resources Information Center

    Stephenson, Stanley P., Jr.

    Using data from the National Longitudinal Survey of Youth's continuous work history files, this paper examines how individual and market characteristics influence the unemployment rates of Hispanic youth. The results show that family income, marital status, post-school vocational education, age, and local unemployment rates significantly influence…

  16. 16. Photocopy of original USRS photograph (from original print in ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    16. Photocopy of original USRS photograph (from original print in the Umatilla Project History 1918, on file at National Archives, Rocky Mountain Region, Denver, Colorado) Photographer unknown, ca. 1918. Office of U.S. Reclamation Service - Hermiston, Umatilla Project, Oregon - Former Umatilla Project Headquarters Buildings, Office, Hermiston, Umatilla County, OR

  17. Reprocessing of multi-channel seismic-reflection data collected in the Beaufort Sea

    USGS Publications Warehouse

    Agena, W.F.; Lee, Myung W.; Hart, P.E.

    2000-01-01

    Contained on this set of two CD-ROMs are stacked and migrated multi-channel seismic-reflection data for 65 lines recorded in the Beaufort Sea by the United States Geological Survey in 1977. All data were reprocessed by the USGS using updated processing methods resulting in improved interpretability. Each of the two CD-ROMs contains the following files: 1) 65 files containing the digital seismic data in standard, SEG-Y format; 2) 1 file containing navigation data for the 65 lines in standard SEG-P1 format; 3) an ASCII text file with cross-reference information for relating the sequential trace numbers on each line to cdp numbers and shotpoint numbers; 4) 2 small scale graphic images (stacked and migrated) of a segment of line 722 in Adobe Acrobat (R) PDF format; 5) a graphic image of the location map, generated from the navigation file; 6) PlotSeis, an MS-DOS Application that allows PC users to interactively view the SEG-Y files; 7) a PlotSeis documentation file; and 8) an explanation of the processing used to create the final seismic sections (this document).

  18. 78 FR 41014 - Online Political File and Petition for Reconsideration

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-09

    ... Political File and Petition for Reconsideration AGENCY: Federal Communications Commission. ACTION: Notice... of the rules requiring broadcast television stations to post their political files online, and on a... instructions for submitting comments and additional information on the rulemaking process, see the...

  19. Mission Operations Center (MOC) - Precipitation Processing System (PPS) Interface Software System (MPISS)

    NASA Technical Reports Server (NTRS)

    Ferrara, Jeffrey; Calk, William; Atwell, William; Tsui, Tina

    2013-01-01

    MPISS is an automatic file transfer system that implements a combination of standard and mission-unique transfer protocols required by the Global Precipitation Measurement Mission (GPM) Precipitation Processing System (PPS) to control the flow of data between the MOC and the PPS. The primary features of MPISS are file transfers (both with and without PPS specific protocols), logging of file transfer and system events to local files and a standard messaging bus, short term storage of data files to facilitate retransmissions, and generation of file transfer accounting reports. The system includes a graphical user interface (GUI) to control the system, allow manual operations, and to display events in real time. The PPS specific protocols are an enhanced version of those that were developed for the Tropical Rainfall Measuring Mission (TRMM). All file transfers between the MOC and the PPS use the SSH File Transfer Protocol (SFTP). For reports and data files generated within the MOC, no additional protocols are used when transferring files to the PPS. For observatory data files, an additional handshaking protocol of data notices and data receipts is used. MPISS generates and sends to the PPS data notices containing data start and stop times along with a checksum for the file for each observatory data file transmitted. MPISS retrieves the PPS generated data receipts that indicate the success or failure of the PPS to ingest the data file and/or notice. MPISS retransmits the appropriate files as indicated in the receipt when required. MPISS also automatically retrieves files from the PPS. The unique feature of this software is the use of both standard and PPS specific protocols in parallel. The advantage of this capability is that it supports users that require the PPS protocol as well as those that do not require it. The system is highly configurable to accommodate the needs of future users.
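    A minimal sketch of the data-notice side of that handshake is given below: a notice pairing a file's time coverage with its checksum. The JSON layout and field names are illustrative assumptions rather than the actual GPM/PPS notice format.

      # Sketch: build a data notice for an observatory data file.  Only the
      # idea (start/stop times plus a file checksum) follows the description
      # above; the field names are hypothetical.
      import hashlib, json

      def data_notice(path, start_time, stop_time):
          digest = hashlib.sha256()
          with open(path, "rb") as f:
              for chunk in iter(lambda: f.read(1 << 20), b""):
                  digest.update(chunk)
          return json.dumps({
              "file": path,
              "start": start_time,
              "stop": stop_time,
              "checksum": digest.hexdigest(),
          })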

  20. Generalized File Management System or Proto-DBMS?

    ERIC Educational Resources Information Center

    Braniff, Tom

    1979-01-01

    The use of a data base management system (DBMS) as opposed to traditional data processing is discussed. The generalized file concept is viewed as an entry level step to the DBMS. The transition process from one system to the other is detailed. (SF)

  1. Distributed Data Collection for the ATLAS EventIndex

    NASA Astrophysics Data System (ADS)

    Sánchez, J.; Fernández Casaní, A.; González de la Hoz, S.

    2015-12-01

    The ATLAS EventIndex contains records of all events processed by ATLAS, in all processing stages. These records include the references to the files containing each event (the GUID of the file) and the internal pointer to each event in the file. This information is collected by all jobs that run at Tier-0 or on the Grid and process ATLAS events. Each job produces a snippet of information for each permanent output file. This information is packed and transferred to a central broker at CERN using an ActiveMQ messaging system, and then is unpacked, sorted and reformatted in order to be stored and catalogued into a central Hadoop server. This contribution describes in detail the Producer/Consumer architecture to convey this information from the running jobs through the messaging system to the Hadoop server.
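    A schematic of the per-file information a producer job might pack before handing it to the messaging layer is sketched below; the record layout is an illustrative assumption, not the actual EventIndex message schema.

      # Sketch: pack an EventIndex-style snippet for one output file.  Each
      # event record keeps the file GUID and the internal pointer to the
      # event; field names are illustrative only.
      import json, zlib

      def pack_snippet(file_guid, events):
          payload = [
              {"guid": file_guid, "run": run, "event": evt, "offset": offset}
              for (run, evt, offset) in events
          ]
          return zlib.compress(json.dumps(payload).encode())  # ready for the broker

      blob = pack_snippet("8d3e9f4a-0000-0000-0000-c0ffee000001",
                          [(358031, 1, 0), (358031, 2, 4096)])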

  2. ColorTree: a batch customization tool for phylogenic trees

    PubMed Central

    Chen, Wei-Hua; Lercher, Martin J

    2009-01-01

    Background Genome sequencing projects and comparative genomics studies typically aim to trace the evolutionary history of large gene sets, often requiring human inspection of hundreds of phylogenetic trees. If trees are checked for compatibility with an explicit null hypothesis (e.g., the monophyly of certain groups), this daunting task is greatly facilitated by an appropriate coloring scheme. Findings In this note, we introduce ColorTree, a simple yet powerful batch customization tool for phylogenic trees. Based on pattern matching rules, ColorTree applies a set of customizations to an input tree file, e.g., coloring labels or branches. The customized trees are saved to an output file, which can then be viewed and further edited by Dendroscope (a freely available tree viewer). ColorTree runs on any Perl installation as a stand-alone command line tool, and its application can thus be easily automated. This way, hundreds of phylogenic trees can be customized for easy visual inspection in a matter of minutes. Conclusion ColorTree allows efficient and flexible visual customization of large tree sets through the application of a user-supplied configuration file to multiple tree files. PMID:19646243

  3. ColorTree: a batch customization tool for phylogenic trees.

    PubMed

    Chen, Wei-Hua; Lercher, Martin J

    2009-07-31

    Genome sequencing projects and comparative genomics studies typically aim to trace the evolutionary history of large gene sets, often requiring human inspection of hundreds of phylogenetic trees. If trees are checked for compatibility with an explicit null hypothesis (e.g., the monophyly of certain groups), this daunting task is greatly facilitated by an appropriate coloring scheme. In this note, we introduce ColorTree, a simple yet powerful batch customization tool for phylogenic trees. Based on pattern matching rules, ColorTree applies a set of customizations to an input tree file, e.g., coloring labels or branches. The customized trees are saved to an output file, which can then be viewed and further edited by Dendroscope (a freely available tree viewer). ColorTree runs on any Perl installation as a stand-alone command line tool, and its application can thus be easily automated. This way, hundreds of phylogenic trees can be customized for easy visual inspection in a matter of minutes. ColorTree allows efficient and flexible visual customization of large tree sets through the application of a user-supplied configuration file to multiple tree files.

  4. GWM-VI: groundwater management with parallel processing for multiple MODFLOW versions

    USGS Publications Warehouse

    Banta, Edward R.; Ahlfeld, David P.

    2013-01-01

    Groundwater Management–Version Independent (GWM–VI) is a new version of the Groundwater Management Process of MODFLOW. The Groundwater Management Process couples groundwater-flow simulation with a capability to optimize stresses on the simulated aquifer based on an objective function and constraints imposed on stresses and aquifer state. GWM–VI extends prior versions of Groundwater Management in two significant ways—(1) it can be used with any version of MODFLOW that meets certain requirements on input and output, and (2) it is structured to allow parallel processing of the repeated runs of the MODFLOW model that are required to solve the optimization problem. GWM–VI uses the same input structure for files that describe the management problem as that used by prior versions of Groundwater Management. GWM–VI requires only minor changes to the input files used by the MODFLOW model. GWM–VI uses the Joint Universal Parameter IdenTification and Evaluation of Reliability Application Programming Interface (JUPITER-API) to implement both version independence and parallel processing. GWM–VI communicates with the MODFLOW model by manipulating certain input files and interpreting results from the MODFLOW listing file and binary output files. Nearly all capabilities of prior versions of Groundwater Management are available in GWM–VI. GWM–VI has been tested with MODFLOW-2005, MODFLOW-NWT (a Newton formulation for MODFLOW-2005), MF2005-FMP2 (the Farm Process for MODFLOW-2005), SEAWAT, and CFP (Conduit Flow Process for MODFLOW-2005). This report provides sample problems that demonstrate a range of applications of GWM–VI and the directory structure and input information required to use the parallel-processing capability.
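    The parallel-processing capability amounts to launching many independent flow-model runs and collecting their outputs. The sketch below shows that general pattern with Python's standard library; it does not reproduce the JUPITER-API mechanism that GWM-VI actually uses, and the executable name and paths are assumptions.

      # Generic sketch: run several prepared MODFLOW input decks in parallel
      # and report the return codes.  Paths and the "mf2005" executable name
      # are illustrative assumptions.
      import subprocess
      from concurrent.futures import ProcessPoolExecutor

      def run_model(workdir):
          result = subprocess.run(["mf2005", "model.nam"], cwd=workdir,
                                  capture_output=True, text=True)
          return workdir, result.returncode

      if __name__ == "__main__":
          runs = ["run_00", "run_01", "run_02", "run_03"]
          with ProcessPoolExecutor(max_workers=4) as pool:
              for workdir, code in pool.map(run_model, runs):
                  print(workdir, "finished with code", code)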

  5. Continuation of research into language concepts for the mission support environment: Source code

    NASA Technical Reports Server (NTRS)

    Barton, Timothy J.; Ratner, Jeremiah M.

    1991-01-01

    Research into language concepts for the Mission Control Center is presented, along with the accompanying source code. The file contains the routines that allow source code files to be created and compiled. The build process assumes that all elements and the COMP exist in the current directory, and it places as much of the code generation as possible on the preprocessor. A summary is given of the source files as used and/or manipulated by the build routine.

  6. Analyzed Boise Data for Oscillatory Hydraulic Tomography

    DOE Data Explorer

    Lim, David

    2015-07-01

    The data here have been "pre-processed" and "analyzed" from the raw data previously submitted to the GDR (raw data files are found at http://gdr.openei.org/submissions/479, doi:10.15121/1176944, after 30 September 2017). First, we submit .mat files, which are the "pre-processed" data (MATLAB software is required to use them). Second, the CSV files contain the submitted data in their final analyzed form before being used for inversion; specifically, they contain Fourier coefficients obtained from Fast Fourier Transform algorithms.
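    A minimal sketch of how a Fourier coefficient at a target oscillation frequency can be pulled from an evenly sampled pressure record with a fast Fourier transform is given below; the variable names and sampling setup are illustrative and do not reproduce the project's actual processing chain.

      # Sketch: extract the complex Fourier coefficient nearest a target
      # oscillation frequency from an evenly sampled pressure record.
      import numpy as np

      def fourier_coefficient(signal, dt, target_hz):
          coeffs = np.fft.rfft(signal)
          freqs = np.fft.rfftfreq(len(signal), d=dt)
          k = int(np.argmin(np.abs(freqs - target_hz)))
          # scale so |coefficient| is the amplitude of that sinusoidal component
          return 2.0 * coeffs[k] / len(signal), freqs[k]

      t = np.arange(0, 600, 0.5)                       # 10 minutes sampled at 2 Hz
      head = 0.3 * np.sin(2 * np.pi * 0.05 * t) + 0.01 * np.random.randn(t.size)
      coeff, f = fourier_coefficient(head, dt=0.5, target_hz=0.05)
      print(f, abs(coeff))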

  7. 75 FR 60157 - Self-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-29

    ...-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Filing and Immediate..., 2010, Financial Industry Regulatory Authority, Inc. (``FINRA'') filed with the Securities and Exchange... information about the rulebook consolidation process, see Information Notice, March 12, 2008 (Rulebook...

  8. 76 FR 20759 - Self-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-13

    ...-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Filing and Immediate..., 2011, Financial Industry Regulatory Authority, Inc. (``FINRA'') filed with the Securities and Exchange.... For more information about the rulebook consolidation process, see Information Notice, March 12, 2008...

  9. Electronic Photography at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Holm, Jack; Judge, Nancianne

    1995-01-01

    An electronic photography facility has been established in the Imaging & Photographic Technology Section, Visual Imaging Branch, at the NASA Langley Research Center (LaRC). The purpose of this facility is to provide the LaRC community with access to digital imaging technology. In particular, capabilities have been established for image scanning, direct image capture, optimized image processing for storage, image enhancement, and optimized device dependent image processing for output. Unique approaches include: evaluation and extraction of the entire film information content through scanning; standardization of image file tone reproduction characteristics for optimal bit utilization and viewing; education of digital imaging personnel on the effects of sampling and quantization to minimize image processing related information loss; investigation of the use of small kernel optimal filters for image restoration; characterization of a large array of output devices and development of image processing protocols for standardized output. Currently, the laboratory has a large collection of digital image files which contain essentially all the information present on the original films. These files are stored at 8-bits per color, but the initial image processing was done at higher bit depths and/or resolutions so that the full 8-bits are used in the stored files. The tone reproduction of these files has also been optimized so the available levels are distributed according to visual perceptibility. Look up tables are available which modify these files for standardized output on various devices, although color reproduction has been allowed to float to some extent to allow for full utilization of output device gamut.
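    The device-dependent output step can be illustrated with a small sketch: applying a one-dimensional look-up table to an 8-bit image channel. The table shown (a simple gamma curve) is an illustrative assumption; the laboratory's actual device characterizations are not reproduced here.

      # Sketch: apply an 8-bit 1-D look-up table (here a gamma curve) to one
      # channel of an image array.  numpy fancy indexing does the remapping.
      import numpy as np

      def apply_lut(channel_u8, gamma=2.2):
          levels = np.arange(256) / 255.0
          lut = np.round(255.0 * levels ** (1.0 / gamma)).astype(np.uint8)
          return lut[channel_u8]                      # same shape, remapped values

      img = np.random.randint(0, 256, size=(4, 4), dtype=np.uint8)
      print(apply_lut(img))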

  10. 75 FR 30074 - Notice of Docketing, Proposed Action, and Opportunity for a Hearing for Renewal of Special...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-28

    ... hearing, in accordance with the NRC E-Filing rule, which the NRC promulgated on August 28, 2007 (72 FR... entities participating under 10 CFR 2.315(c), must be filed in accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires participants to submit and serve all adjudicatory...

  11. 78 FR 6839 - Duke Energy Carolinas, LLC, Oconee Nuclear Station, Units 1, 2, and 3 Denial of Amendment to...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-31

    ... entities participating under 10 CFR 2.315(c), must be filed in accordance with the NRC E-Filing rule (72 FR 49139; August 28, 2007). The E-Filing process requires participants to submit and serve all adjudicatory... procedures described below. To comply with the procedural requirements of E-Filing, at least 10 days prior to...

  12. Assessment of elemental composition, microstructure, and hardness of stainless steel endodontic files and reamers.

    PubMed

    Darabara, Myrsini; Bourithis, Lefteris; Zinelis, Spiros; Papadimitriou, George D

    2004-07-01

    The purpose of this study was to determine the elemental composition, microstructure, and hardness of commercially available reamers, K files, and H files. Five instruments of each type from different manufacturers (Antaeos, FKG, Maillefer, Mani, and Micromega) were embedded in epoxy resin along their longitudinal axis. After metallographic grinding and polishing, the specimens were chemically etched and their microstructure investigated under an incident light microscope. The specimens were studied under a scanning electron microscope, and their elemental compositions were determined by energy dispersive X-ray microanalysis. The same surfaces were repolished and X-ray diffraction was performed. The same specimen surface was used for the assessment of the Vickers hardness (HV200) by using a microhardness tester with a 200-g load and 20-s contact time. The hardness results were statistically analyzed with two-way ANOVA and Tukey's test (α = 0.05). All files demonstrated extensively elongated grains parallel to the longitudinal file axis because of cold drawing. The elemental compositions of Maillefer and Mani reamers, Antaeos K files, and Mani H files were found in the range of AISI 303 SS, whereas all the rest were determined as AISI 304 SS. Two different phases (austenite SSt and martensite SSt) were identified with X-ray diffraction for all files tested. The hardness results classified the reamers in the following decreasing order (HMV200): Micromega = 673 +/- 29, Mani = 662 +/- 24, Maillefer = 601 +/- 34, Antaeos = 586 +/- 18, FKG = 557 +/- 19; the K files (HV200): FKG = 673 +/- 16, Mani = 647 +/- 19, Maillefer = 603 +/- 41, Antaeos = 566 +/- 21, Micromega = 555 +/- 15; and the H files (HMV200): Mani = 640 +/- 12, FKG = 583 +/- 31, Maillefer = 581 +/- 5, Antaeos = 573 +/- 3, Micromega = 546 +/- 14. Although only two stainless steel alloys were used for the production of endodontic files, the differences in hardness are independent of the alloys used, implying that other factors, such as the production method or the thermomechanical history of the alloys, play an important role in the mechanical properties of endodontic files.

  13. File concepts for parallel I/O

    NASA Technical Reports Server (NTRS)

    Crockett, Thomas W.

    1989-01-01

    The subject of input/output (I/O) was often neglected in the design of parallel computer systems, although for many problems I/O rates will limit the speedup attainable. The I/O problem is addressed by considering the role of files in parallel systems. The notion of parallel files is introduced. Parallel files provide for concurrent access by multiple processes, and utilize parallelism in the I/O system to improve performance. Parallel files can also be used conventionally by sequential programs. A set of standard parallel file organizations using multiple storage devices is proposed. Problem areas are also identified and discussed.

  14. A digital repository with an extensible data model for biobanking and genomic analysis management.

    PubMed

    Izzo, Massimiliano; Mortola, Francesco; Arnulfo, Gabriele; Fato, Marco M; Varesio, Luigi

    2014-01-01

    Molecular biology laboratories require extensive metadata to improve data collection and analysis. The heterogeneity of the collected metadata grows as research evolves into international multi-disciplinary collaborations and data sharing among institutions increases. A single standardization is not feasible, and it becomes crucial to develop digital repositories with flexible and extensible data models, as in the case of modern integrated biobank management. We developed a novel data model in JSON format to describe heterogeneous data in a generic biomedical science scenario. The model is built on two hierarchical entities: processes and events, roughly corresponding to research studies and analysis steps within a single study. A number of sequential events can be grouped in a process, building up a hierarchical structure to track patient and sample history. Each event can produce new data. Data are described by a set of user-defined metadata and may have one or more associated files. We integrated the model into a web-based digital repository with data grid storage to manage large data sets located in geographically distinct areas. We built a graphical interface that allows authorized users to define new data types dynamically, according to their requirements. Operators compose queries on metadata fields using a flexible search interface and run them on the database and on the grid. We applied the digital repository to the integrated management of samples, patients, and medical history in the BIT-Gaslini biobank. The platform currently manages 1800 samples from over 900 patients. Microarray data from 150 analyses are stored on the grid storage and replicated on two physical resources for preservation. The system is equipped with data integration capabilities with other biobanks for worldwide information sharing. Our data model enables users to continuously define flexible, ad hoc, and loosely structured metadata for information sharing in specific research projects and purposes. This approach can considerably improve interdisciplinary research collaboration and allows patients' clinical records, sample management information, and genomic data to be tracked. The web interface allows the operators to easily manage, query, and annotate the files without dealing with the technicalities of the data grid.
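    A minimal sketch of the process/event hierarchy described above, serialized to JSON, is given below; the field names are illustrative assumptions and not the repository's actual schema.

      # Sketch: one "process" (a study) grouping sequential "events", each
      # with user-defined metadata and optional associated files.  Field
      # names are hypothetical.
      import json

      process = {
          "type": "process",
          "label": "neuroblastoma-cohort-2014",
          "events": [
              {
                  "type": "event",
                  "label": "sample-collection",
                  "metadata": {"tissue": "bone marrow", "operator": "lab-01"},
                  "files": [],
              },
              {
                  "type": "event",
                  "label": "microarray-analysis",
                  "metadata": {"platform": "array-x", "normalization": "quantile"},
                  "files": ["raw_intensities.cel"],
              },
          ],
      }
      print(json.dumps(process, indent=2))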

  15. A digital repository with an extensible data model for biobanking and genomic analysis management

    PubMed Central

    2014-01-01

    Motivation Molecular biology laboratories require extensive metadata to improve data collection and analysis. The heterogeneity of the collected metadata grows as research evolves into international multi-disciplinary collaborations and data sharing among institutions increases. A single standardization is not feasible, and it becomes crucial to develop digital repositories with flexible and extensible data models, as in the case of modern integrated biobank management. Results We developed a novel data model in JSON format to describe heterogeneous data in a generic biomedical science scenario. The model is built on two hierarchical entities: processes and events, roughly corresponding to research studies and analysis steps within a single study. A number of sequential events can be grouped in a process, building up a hierarchical structure to track patient and sample history. Each event can produce new data. Data are described by a set of user-defined metadata and may have one or more associated files. We integrated the model into a web-based digital repository with data grid storage to manage large data sets located in geographically distinct areas. We built a graphical interface that allows authorized users to define new data types dynamically, according to their requirements. Operators compose queries on metadata fields using a flexible search interface and run them on the database and on the grid. We applied the digital repository to the integrated management of samples, patients, and medical history in the BIT-Gaslini biobank. The platform currently manages 1800 samples from over 900 patients. Microarray data from 150 analyses are stored on the grid storage and replicated on two physical resources for preservation. The system is equipped with data integration capabilities with other biobanks for worldwide information sharing. Conclusions Our data model enables users to continuously define flexible, ad hoc, and loosely structured metadata for information sharing in specific research projects and purposes. This approach can considerably improve interdisciplinary research collaboration and allows patients' clinical records, sample management information, and genomic data to be tracked. The web interface allows the operators to easily manage, query, and annotate the files without dealing with the technicalities of the data grid. PMID:25077808

  16. Documentary of MFENET, a national computer network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shuttleworth, B.O.

    1977-06-01

    The national Magnetic Fusion Energy Computer Network (MFENET) is a newly operational star network of geographically separated heterogeneous hosts and a communications subnetwork of PDP-11 processors. Host processors interfaced to the subnetwork currently include a CDC 7600 at the Central Computer Center (CCC) and several DECsystem-10's at User Service Centers (USC's). The network was funded by a U.S. government agency (ERDA) to provide in an economical manner the needed computational resources to magnetic confinement fusion researchers. Phase I operation of MFENET distributed the processing power of the CDC 7600 among the USC's through the provision of file transport between any two hosts and remote job entry to the 7600. Extending the capabilities of Phase I, MFENET Phase II provided interactive terminal access to the CDC 7600 from the USC's. A file management system is maintained at the CCC for all network users. The history and development of MFENET are discussed, with emphasis on the protocols used to link the host computers and the USC software. Comparisons are made of MFENET versus ARPANET (Advanced Research Projects Agency Computer Network) and DECNET (Digital Distributed Network Architecture). DECNET and MFENET host-to-host, host-to-CCP, and link protocols are discussed in detail. The USC--CCP interface is described briefly. 43 figures, 2 tables.

  17. Storing files in a parallel computing system based on user or application specification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faibish, Sorin; Bent, John M.; Nick, Jeffrey M.

    2016-03-29

    Techniques are provided for storing files in a parallel computing system based on a user-specification. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a specification from the distributed application indicating how the plurality of files should be stored; and storing one or more of the plurality of files in one or more storage nodes of a multi-tier storage system based on the specification. The plurality of files comprise a plurality of complete files and/or a plurality of sub-files. The specification can optionally be processed by a daemon executing on one or more nodes in a multi-tier storage system. The specification indicates how the plurality of files should be stored, for example, identifying one or more storage nodes where the plurality of files should be stored.
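    As a rough illustration of the idea (not the patented implementation), the sketch below routes files to storage tiers according to an application-supplied specification; the tier names, mount points, and specification format are all assumptions made for the example.

      # Hedged sketch: copy files to a tier chosen by matching each filename
      # against an application-supplied specification. Directories are assumed
      # to exist; a real daemon would move or replicate data asynchronously.
      import fnmatch, os, shutil

      TIERS = {"burst_buffer": "/mnt/bb", "parallel_fs": "/mnt/lustre", "archive": "/mnt/tape"}

      def store_files(files, spec):
          """spec maps filename patterns to tiers, e.g. {"*.ckpt": "burst_buffer"}."""
          for path in files:
              name = os.path.basename(path)
              tier = next((t for pat, t in spec.items() if fnmatch.fnmatch(name, pat)),
                          "parallel_fs")                 # default tier if nothing matches
              dest = os.path.join(TIERS[tier], name)
              shutil.copy2(path, dest)
              print(f"{path} -> {dest} ({tier})")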

  18. 49 CFR 272.105 - Requirement to file critical incident stress plan electronically.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 4 2014-10-01 2014-10-01 false Requirement to file critical incident stress plan...) FEDERAL RAILROAD ADMINISTRATION, DEPARTMENT OF TRANSPORTATION CRITICAL INCIDENT STRESS PLANS Plan Components and Approval Process § 272.105 Requirement to file critical incident stress plan electronically...

  19. Some utilities to help produce Rich Text Files from Stata.

    PubMed

    Gillman, Matthew S

    Producing RTF files from Stata can be difficult and somewhat cryptic. Utilities are introduced to simplify this process; one builds up a table row-by-row, another inserts a PNG image file into an RTF document, and the others start and finish the RTF document.
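    The fragment below is a minimal sketch, in Python rather than Stata, of the kind of helper such utilities provide: it emits a small RTF document containing a table built row by row. The RTF control words are standard; the helper names are invented for the example and are not the commands introduced in the article.

      # Build a tiny RTF document with a two-column table, one row at a time.
      def rtf_open():
          return r"{\rtf1\ansi\deff0{\fonttbl{\f0 Arial;}}"

      def rtf_table_row(cells, col_width=2000):
          row = r"\trowd"
          for i in range(len(cells)):
              row += r"\cellx" + str(col_width * (i + 1))   # right edge of each cell, in twips
          row += "".join(r"\intbl " + c + r"\cell" for c in cells) + r"\row"
          return row

      def rtf_close():
          return "}"

      with open("example.rtf", "w") as f:
          f.write(rtf_open()
                  + rtf_table_row(["Variable", "Estimate"])
                  + rtf_table_row(["age", "0.12"])
                  + rtf_close())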

  20. 8 CFR 1003.104 - Filing of complaints; preliminary inquiries; resolutions; referral of complaints.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... authorities within the Department to ensure that neither the disciplinary process nor criminal prosecutions... Professional Conduct for Practitioners-Rules and Procedures § 1003.104 Filing of complaints; preliminary... Immigration Courts shall be filed with the EOIR disciplinary counsel. Disciplinary complaints must be...

  1. Analysis of Management Control Techniques for the Data Processing Department at the Navy Finance Center, Cleveland, Ohio.

    DTIC Science & Technology

    1983-03-01

    System are: Order processing coordinators, Order processing management, Credit and collections, Accounts receivable, Support management, Admin management...or sales secretary, then by order processing (OP). Phone-in orders go directly to OP. The information is next transcribed onto an order entry... ORDER PROCESSING: The central systems validate the order items and codes by processing them against the customer file, the product or parts file, and

  2. 19 CFR 191.143 - Drawback entry.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... (CONTINUED) DRAWBACK Foreign-Built Jet Aircraft Engines Processed in the United States § 191.143 Drawback entry. (a) Filing of entry. Drawback entries covering these foreign-built jet aircraft engines shall be filed on Customs Form 7551, modified to show that the entry covers jet aircraft engines processed under...

  3. 19 CFR 191.143 - Drawback entry.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... (CONTINUED) DRAWBACK Foreign-Built Jet Aircraft Engines Processed in the United States § 191.143 Drawback entry. (a) Filing of entry. Drawback entries covering these foreign-built jet aircraft engines shall be filed on Customs Form 7551, modified to show that the entry covers jet aircraft engines processed under...

  4. The Standard Autonomous File Server, A Customized, Off-the-Shelf Success Story

    NASA Technical Reports Server (NTRS)

    Semancik, Susan K.; Conger, Annette M.; Obenschain, Arthur F. (Technical Monitor)

    2001-01-01

    The Standard Autonomous File Server (SAFS), which includes both off-the-shelf hardware and software, uses an improved automated file transfer process to provide a quicker, more reliable, prioritized file distribution for customers of near real-time data without interfering with the assets involved in the acquisition and processing of the data. It operates as a stand-alone solution, monitoring itself, and providing an automated fail-over process to enhance reliability. This paper describes the unique problems and lessons learned both during the COTS selection and integration into SAFS, and the system's first year of operation in support of NASA's satellite ground network. COTS was the key factor in allowing the two-person development team to deploy systems in less than a year, meeting the required launch schedule. The SAFS system has been so successful that it is becoming a NASA standard resource, leading to its nomination for NASA's Software of the Year Award in 1999.

  5. The ISMARA client

    PubMed Central

    Ioannidis, Vassilios; van Nimwegen, Erik; Stockinger, Heinz

    2016-01-01

    ISMARA ( ismara.unibas.ch) automatically infers the key regulators and regulatory interactions from high-throughput gene expression or chromatin state data. However, given the large sizes of current next generation sequencing (NGS) datasets, data uploading times are a major bottleneck. Additionally, for proprietary data, users may be uncomfortable with uploading entire raw datasets to an external server. Both these problems could be alleviated by providing a means by which users could pre-process their raw data locally, transferring only a small summary file to the ISMARA server. We developed a stand-alone client application that pre-processes large input files (RNA-seq or ChIP-seq data) on the user's computer for performing ISMARA analysis in a completely automated manner, including uploading of small processed summary files to the ISMARA server. This reduces file sizes by up to a factor of 1000, and upload times from many hours to mere seconds. The client application is available from ismara.unibas.ch/ISMARA/client. PMID:28232860

  6. ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations

    NASA Astrophysics Data System (ADS)

    Laloo, Jalal Z. A.; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai

    2017-07-01

    The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user-interventions as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Application (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generation of unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.
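    ExcelAutomat itself is written in VBA; as a language-neutral illustration of the repetitive tasks it automates (generating unique filenames, parsing values out of output files), the following Python sketch scans a directory of output logs for final SCF energies. The "SCF Done" pattern is an assumption about Gaussian-style output and may need adjusting for other packages.

      # Collect the last reported SCF energy from each output file in a folder.
      import glob, os, re

      def unique_name(base, ext, existing):
          """Return base_001.ext, base_002.ext, ... avoiding names already in use."""
          n = 1
          while f"{base}_{n:03d}{ext}" in existing:
              n += 1
          return f"{base}_{n:03d}{ext}"

      def collect_energies(pattern="out/*.log"):
          energies = {}
          for path in glob.glob(pattern):
              matches = re.findall(r"SCF Done:\s+E\(\S+\)\s+=\s+(-?\d+\.\d+)", open(path).read())
              if matches:
                  energies[os.path.basename(path)] = float(matches[-1])   # last SCF energy in the file
          return energies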

  7. ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations.

    PubMed

    Laloo, Jalal Z A; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai

    2017-07-01

    The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user-interventions as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Application (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generation of unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.

  8. MPST Software: MoonKommand

    NASA Technical Reports Server (NTRS)

    Kwok, John H.; Call, Jared A.; Khanampornpan, Teerapat

    2012-01-01

    This software automatically processes Sally Ride Science (SRS) delivered MoonKAM camera control files (ccf) into uplink products for the GRAIL-A and GRAIL-B spacecraft as part of an education and public outreach (EPO) extension to the Grail Mission. Once properly validated and deemed safe for execution onboard the spacecraft, MoonKommand generates the command products via the Automated Sequence Processor (ASP) and generates uplink (.scmf) files for radiation to the Grail-A and/or Grail-B spacecraft. Any errors detected along the way are reported back to SRS via email. With MoonKommand, SRS can control their EPO instrument as part of a fully automated process. Inputs are received from SRS as either image capture files (.ccficd) for new image requests, or downlink/delete files (.ccfdl) for requesting image downlink from the instrument and on-board memory management. The MoonKommand outputs are command and file-load (.scmf) files that will be uplinked by the Deep Space Network (DSN). Without MoonKommand software, uplink product generation for the MoonKAM instrument would be a manual process. The software is specific to the MoonKAM instrument on the GRAIL mission. At the time of this writing, the GRAIL mission was making final preparations to begin the science phase, which was scheduled to continue until June 2012.
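    As a hedged sketch of the automated flow described above, the Python fragment below picks up camera control files, runs a placeholder validation, and reports failures by email. The .ccf suffixes come from the abstract; the validation rule, directory names, and addresses are placeholders, not the actual MoonKommand pipeline.

      # Watch an inbox directory for camera control files, validate them, and
      # either hand them to the next processing stage or report an error.
      import glob, shutil, smtplib
      from email.message import EmailMessage

      def validate(ccf_path):
          # placeholder: a real validator checks that commands are safe to execute on board
          return open(ccf_path).read().strip() != ""

      def process_ccf_files(indir="incoming", outdir="to_asp", notify="ops@example.org"):
          for ccf in glob.glob(f"{indir}/*.ccf*"):      # both .ccficd and .ccfdl inputs
              if validate(ccf):
                  shutil.move(ccf, outdir)              # hand off toward uplink product generation
              else:
                  msg = EmailMessage()
                  msg["Subject"] = f"CCF rejected: {ccf}"
                  msg["From"] = "pipeline@example.org"
                  msg["To"] = notify
                  msg.set_content("Validation failed; file left in place for inspection.")
                  with smtplib.SMTP("localhost") as s:
                      s.send_message(msg)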

  9. 12 CFR Appendix C to Part 360 - Deposit File Structure

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... structure for the data file to provide deposit data to the FDIC. If data or information are not maintained... covered institution's understanding of its customers and the data maintained around deposit accounts... complete its insurance determination process, it may add this information to the end of this data file...

  10. Some utilities to help produce Rich Text Files from Stata

    PubMed Central

    Gillman, Matthew S.

    2018-01-01

    Producing RTF files from Stata can be difficult and somewhat cryptic. Utilities are introduced to simplify this process; one builds up a table row-by-row, another inserts a PNG image file into an RTF document, and the others start and finish the RTF document. PMID:29731697

  11. 32 CFR 147.24 - The national agency check.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 1 2013-07-01 2013-07-01 false The national agency check. 147.24 Section 147.24 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE PERSONNEL, MILITARY AND CIVILIAN... reinvestigations. It consists of a review of; (a) Investigative and criminal history files of the FBI, including a...

  12. 32 CFR 147.24 - The national agency check.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 1 2011-07-01 2011-07-01 false The national agency check. 147.24 Section 147.24 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE PERSONNEL, MILITARY AND CIVILIAN... reinvestigations. It consists of a review of; (a) Investigative and criminal history files of the FBI, including a...

  13. 32 CFR 147.24 - The national agency check.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 1 2010-07-01 2010-07-01 false The national agency check. 147.24 Section 147.24 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE PERSONNEL, MILITARY AND CIVILIAN... reinvestigations. It consists of a review of; (a) Investigative and criminal history files of the FBI, including a...

  14. 32 CFR 147.24 - The national agency check.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 1 2014-07-01 2014-07-01 false The national agency check. 147.24 Section 147.24 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE PERSONNEL, MILITARY AND CIVILIAN... reinvestigations. It consists of a review of; (a) Investigative and criminal history files of the FBI, including a...

  15. 32 CFR 147.24 - The national agency check.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 1 2012-07-01 2012-07-01 false The national agency check. 147.24 Section 147.24 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE PERSONNEL, MILITARY AND CIVILIAN... reinvestigations. It consists of a review of; (a) Investigative and criminal history files of the FBI, including a...

  16. 77 FR 14686 - Claims for Patent and Copyright Infringement

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-13

    ... history of each patent, if it is available to the claimant. Indicate whether the patent has been the... corresponding foreign patents and patent applications and full copies of the same. (11) Pertinent prior art known to the claimant not contained in the USPTO file, for example, publications and foreign prior art...

  17. 77 FR 71669 - Qualification of Drivers; Exemption Applications; Vision

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-03

    ... proneness from crash history coupled with other factors. These factors--such as age, sex, geographic... notice of September 26, 2012 (77 FR 59248). We recognize that the vision of an applicant may change and.... Minor, Associate Administrator for Policy. [FR Doc. 2012-29161 Filed 11-30-12; 8:45 am] BILLING CODE...

  18. 81. PHOTOCOPY OF PHOTOGRAPH SHOWING NEW CREEK CHANNEL UNDER CONSTRUCTION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    81. PHOTOCOPY OF PHOTOGRAPH SHOWING NEW CREEK CHANNEL UNDER CONSTRUCTION AT P STREET BEND, FROM 1940 REPORT ON PROPOSED DEVELOPMENT OF ROCK CREEK AND POTOMAC PARKWAY, SECTION II (ROCK CREEK AND POTOMAC PARKWAY FILE, HISTORY DEPARTMENT ARCHIVES, NATIONAL PARK SERVICE, WASHINGTON, DC). - Rock Creek & Potomac Parkway, Washington, District of Columbia, DC

  19. 47 CFR 0.455 - Other locations at which records may be inspected.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... ORGANIZATION General Information Public Information and Inspection of Records § 0.455 Other locations at which... available for inspection in the Reference Information Center or the offices of the Bureau or Office which... in the Office of the Secretary. (2) Files containing information concerning the history of the...

  20. 47 CFR 0.455 - Other locations at which records may be inspected.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... ORGANIZATION General Information Public Information and Inspection of Records § 0.455 Other locations at which... available for inspection in the Reference Information Center or the offices of the Bureau or Office which... in the Office of the Secretary. (2) Files containing information concerning the history of the...

  1. 47 CFR 0.455 - Other locations at which records may be inspected.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... ORGANIZATION General Information Public Information and Inspection of Records § 0.455 Other locations at which... available for inspection in the Reference Information Center or the offices of the Bureau or Office which... in the Office of the Secretary. (2) Files containing information concerning the history of the...

  2. 47 CFR 0.455 - Other locations at which records may be inspected.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... ORGANIZATION General Information Public Information and Inspection of Records § 0.455 Other locations at which... available for inspection in the Reference Information Center or the offices of the Bureau or Office which... in the Office of the Secretary. (2) Files containing information concerning the history of the...

  3. 78 FR 11760 - 3-decen-2-one; Exemption from the Requirement of a Tolerance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-20

    ....2000(a)(1), with a history of unremarkable human exposure. 3-decen-2-one functions as a plant growth regulator, affecting plant growth by increasing tuber respiration. Data on file indicate that 3-decen-2-one interferes with membrane integrity, which results in increased oxidative stress, desiccation, and rapid...

  4. The Virtual City: Putting Charleston on the World Wide Web.

    ERIC Educational Resources Information Center

    Beagle, Donald

    1996-01-01

    Describes the Charleston Multimedia Project, a World Wide Web guide to the history, architecture, and culture of Charleston, South Carolina, which includes a timeline and virtual tours. Incorporates materials issued by many agencies that were previously held in vertical files. The Charleston County Library's role and future plans are also…

  5. RLMS Micro-File: Current State of Catalog Card Reproduction. Supplement 1.

    ERIC Educational Resources Information Center

    Nitecki, Joseph Z., Comp.

    Nine papers on various aspects and methods of catalog card reproduction are included in this supplement. Many reports include cost analyses and comparisons. A lengthy paper describes the history and the present use of technology of the Library of Congress card production operations. Other reports cover offset press and computer output microfilm…

  6. 49 CFR Appendix D to Part 222 - Determining Risk Levels

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) through (e). (c) FRA will match the highway-rail incident files for the past five years against a data... of the number of predicted collisions per year: 1. average annual daily traffic 2. total number of... history for the previous five years and it is calibrated by resetting normalizing constants. The...

  7. The Arkansas School Finance Case: Is It Over Yet?

    ERIC Educational Resources Information Center

    Schoppmeyer, Martin W.

    This paper reports on the protracted history of the Arkansas school-finance case, the longest-running school-finance lawsuit in the United States. It details in chronological sequence the lawsuit filed in 1992 by the Lakeview School District, a very small all African-American school district alleging that the state school-finance plan was…

  8. 38 CFR 74.12 - What must a concern submit to apply for VetBiz VIP Verification Program?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... personal and business tax returns, payroll records and personal history statements. An applicant must also...' Relief DEPARTMENT OF VETERANS AFFAIRS (CONTINUED) VETERANS SMALL BUSINESS REGULATIONS Application... dispatches the electronic forms, the applicant must also retain on file at the principal place of business a...

  9. My Job Application File. Third Edition.

    ERIC Educational Resources Information Center

    Kahn, Charles; And Others

    This guide contains ten exercises designed to aid students in completing job applications. Exercises included are (1) My Personal History, (2) My Educational Record, (3) Printing Neatly Helps, (4) Key Words and Abbreviations, (5) My Health Record, (6) Papers I Will Need, (7) Paid Work Experience, (8) Unpaid Work Experience, (9) My References, and…

  10. 41 CFR 101-26.100-3 - Warranties.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Form (SF) 368, Product Quality Deficiency Report, in duplicate, shall be sent to the GSA Discrepancy... should be clearly stated in the text of the SF 368. This information will be maintained as a quality history file for use in future procurements. (b) If the contractor refuses to correct, or fails to replace...

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fried, Jack

    The TSPM receives and processes ASIC signals and transmits processed data over to the PC using an Ethernet cable. The data is given a location and a time stamp. This is the heart of the device as it gathers and stamps the timing and location of events on each of the ASICs. The five files for the TSPM are needed to manufacture PET scanners that are based on the RatCAP (Rat Conscious Animal PET). They include a TSPM schematic, a raw data file to build the RatCAP TSPM, an output file that along with the assay file is used by an assembly house to build the RatCAP TSPM, an assay file that provides the part list and XY location for the components that go on the RatCAP TSPM, firmware that includes the source code to program the FPGA, and a realized program on the TSPM based on the firmware.

  12. AstroVis: Visualizing astronomical data cubes

    NASA Astrophysics Data System (ADS)

    Finniss, Stephen; Tyler, Robin; Questiaux, Jacques

    2016-08-01

    AstroVis enables rapid visualization of large data files on platforms supporting the OpenGL rendering library. Radio astronomical observations are typically three dimensional and stored as data cubes. AstroVis implements a scalable approach to accessing these files using three components: a File Access Component (FAC) that reduces the impact of reading time, which speeds up access to the data; the Image Processing Component (IPC), which breaks up the data cube into smaller pieces that can be processed locally and gives a representation of the whole file; and Data Visualization, which implements an approach of Overview + Detail to reduce the dimensions of the data being worked with and the amount of memory required to store it. The result is a 3D display paired with a 2D detail display that contains a small subsection of the original file in full resolution without reducing the data in any way.
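    The Overview + Detail access pattern described above can be illustrated with a short Python sketch using astropy and NumPy; AstroVis itself is an OpenGL application, so this only mimics the idea of a coarse full-cube view plus a full-resolution subsection, and the step size and slice bounds are arbitrary assumptions.

      # Read a FITS data cube lazily and return a downsampled overview plus a
      # full-resolution detail region, without loading the whole file into memory.
      import numpy as np
      from astropy.io import fits

      def overview_and_detail(path, step=8, detail=(slice(0, 64), slice(0, 64), slice(0, 64))):
          with fits.open(path, memmap=True) as hdul:        # memmap keeps I/O incremental
              cube = hdul[0].data
              overview = np.asarray(cube[::step, ::step, ::step])   # coarse view of the whole cube
              detail_region = np.asarray(cube[detail])              # small subsection at full resolution
          return overview, detail_region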

  13. 34 CFR 303.440 - Filing a due process complaint.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... AND REHABILITATIVE SERVICES, DEPARTMENT OF EDUCATION EARLY INTERVENTION PROGRAM FOR INFANTS AND... early intervention services to the infant or toddler with a disability and his or her family under part... filing a due process complaint under this part, in the time allowed by that State law, except that the...

  14. 34 CFR 303.440 - Filing a due process complaint.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... AND REHABILITATIVE SERVICES, DEPARTMENT OF EDUCATION EARLY INTERVENTION PROGRAM FOR INFANTS AND... early intervention services to the infant or toddler with a disability and his or her family under part... filing a due process complaint under this part, in the time allowed by that State law, except that the...

  15. 34 CFR 303.440 - Filing a due process complaint.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... AND REHABILITATIVE SERVICES, DEPARTMENT OF EDUCATION EARLY INTERVENTION PROGRAM FOR INFANTS AND... early intervention services to the infant or toddler with a disability and his or her family under part... filing a due process complaint under this part, in the time allowed by that State law, except that the...

  16. 77 FR 1040 - Provisional Waivers of Inadmissibility for Certain Immediate Relatives of U.S. Citizens

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-09

    .... SUMMARY: U.S. Citizenship and Immigration Services (USCIS) intends to change its current process for filing and adjudication of certain applications for waivers of inadmissibility filed in connection with... such ground. Typically, under current processes, aliens who are immediate relatives of U.S. citizens...

  17. 37 CFR 7.4 - Receipt of correspondence.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., DEPARTMENT OF COMMERCE RULES OF PRACTICE IN FILINGS PURSUANT TO THE PROTOCOL RELATING TO THE MADRID AGREEMENT... to review an action of the Office's Madrid Processing Unit, when filed by mail, must be addressed to: Madrid Processing Unit, 600 Dulany Street, MDE-7B87, Alexandria, VA 22314-5793. (1) International...

  18. 37 CFR 7.4 - Receipt of correspondence.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., DEPARTMENT OF COMMERCE RULES OF PRACTICE IN FILINGS PURSUANT TO THE PROTOCOL RELATING TO THE MADRID AGREEMENT... to review an action of the Office's Madrid Processing Unit, when filed by mail, must be addressed to: Madrid Processing Unit, 600 Dulany Street, MDE-7B87, Alexandria, VA 22314-5793. (1) International...

  19. 37 CFR 7.4 - Receipt of correspondence.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., DEPARTMENT OF COMMERCE RULES OF PRACTICE IN FILINGS PURSUANT TO THE PROTOCOL RELATING TO THE MADRID AGREEMENT... to review an action of the Office's Madrid Processing Unit, when filed by mail, must be addressed to: Madrid Processing Unit, 600 Dulany Street, MDE-7B87, Alexandria, VA 22314-5793. (1) International...

  20. 37 CFR 7.4 - Receipt of correspondence.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., DEPARTMENT OF COMMERCE RULES OF PRACTICE IN FILINGS PURSUANT TO THE PROTOCOL RELATING TO THE MADRID AGREEMENT... to review an action of the Office's Madrid Processing Unit, when filed by mail, must be addressed to: Madrid Processing Unit, 600 Dulany Street, MDE-7B87, Alexandria, VA 22314-5793. (1) International...

  1. 37 CFR 1.446 - Refund of international application filing and processing fees.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2014-07-01 2014-07-01 false Refund of international application filing and processing fees. 1.446 Section 1.446 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES...

  2. 37 CFR 1.446 - Refund of international application filing and processing fees.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2012-07-01 2012-07-01 false Refund of international application filing and processing fees. 1.446 Section 1.446 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES...

  3. 37 CFR 1.445 - International application filing, processing and search fees.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2014-07-01 2014-07-01 false International application filing, processing and search fees. 1.445 Section 1.445 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES...

  4. 37 CFR 1.446 - Refund of international application filing and processing fees.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2011-07-01 2011-07-01 false Refund of international application filing and processing fees. 1.446 Section 1.446 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES...

  5. 37 CFR 1.445 - International application filing, processing and search fees.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2012-07-01 2012-07-01 false International application filing, processing and search fees. 1.445 Section 1.445 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES...

  6. 37 CFR 1.445 - International application filing, processing and search fees.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false International application filing, processing and search fees. 1.445 Section 1.445 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES...

  7. 37 CFR 1.446 - Refund of international application filing and processing fees.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2013-07-01 2013-07-01 false Refund of international application filing and processing fees. 1.446 Section 1.446 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES...

  8. 37 CFR 1.445 - International application filing, processing and search fees.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2013-07-01 2013-07-01 false International application filing, processing and search fees. 1.445 Section 1.445 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES...

  9. 37 CFR 1.445 - International application filing, processing and search fees.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2011-07-01 2011-07-01 false International application filing, processing and search fees. 1.445 Section 1.445 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES...

  10. Cyclic fatigue resistance of four nickel-titanium rotary instruments: a comparative study.

    PubMed

    Pedullà, Eugenio; Plotino, Gianluca; Grande, Nicola Maria; Pappalardo, Alfio; Rapisarda, Ernesto

    2012-04-01

    The aim of this study is to investigate the cyclic fatigue resistance of four nickel-titanium rotary (NTR) instruments produced by a new method or by traditional grinding processes. Four NTR instruments from different brands were selected: group 1, Twisted File, produced by a new thermal treatment of nickel-titanium alloy; group 2, Revo S SU; group 3, Mtwo; and group 4, BioRaCe BR3, produced by traditional grinding processes. A total of 80 instruments (20 for each group) were tested for cyclic fatigue resistance inside a curved artificial canal with a 60 degree angle of curvature and 5 mm radius of curvature. Time to fracture (TtF) from the start of the test until the moment of file breakage and the length of the fractured tip were recorded for each instrument. Means and standard deviations (SD) of TtF and fragment length were calculated. Data were subjected to one-way analysis of variance (ANOVA). Group 1 (Twisted File) showed the highest mean TtF. The cyclic fatigue resistance of Twisted File and Mtwo was significantly higher than that of group 2 (Revo S SU) and group 4 (BioRaCe BR3), while no significant differences were found between group 1 (Twisted File) and group 3 (Mtwo) or between group 2 (Revo S SU) and group 4 (BioRaCe BR3). The cyclic fatigue resistance of Twisted File was significantly higher than that of instruments produced with traditional grinding processes, except for Mtwo files.

  11. Cyclic fatigue resistance of four nickel-titanium rotary instruments: a comparative study

    PubMed Central

    Pedullà, Eugenio; Plotino, Gianluca; Grande, Nicola Maria; Pappalardo, Alfio; Rapisarda, Ernesto

    2012-01-01

    Summary. Aims: The aim of this study is to investigate the cyclic fatigue resistance of four nickel-titanium rotary (NTR) instruments produced by a new method or by traditional grinding processes. Methods: Four NTR instruments from different brands were selected: group 1, Twisted File, produced by a new thermal treatment of nickel-titanium alloy; group 2, Revo S SU; group 3, Mtwo; and group 4, BioRaCe BR3, produced by traditional grinding processes. A total of 80 instruments (20 for each group) were tested for cyclic fatigue resistance inside a curved artificial canal with a 60 degree angle of curvature and 5 mm radius of curvature. Time to fracture (TtF) from the start of the test until the moment of file breakage and the length of the fractured tip were recorded for each instrument. Means and standard deviations (SD) of TtF and fragment length were calculated. Data were subjected to one-way analysis of variance (ANOVA). Results: Group 1 (Twisted File) showed the highest mean TtF. The cyclic fatigue resistance of Twisted File and Mtwo was significantly higher than that of group 2 (Revo S SU) and group 4 (BioRaCe BR3), while no significant differences were found between group 1 (Twisted File) and group 3 (Mtwo) or between group 2 (Revo S SU) and group 4 (BioRaCe BR3). Conclusions: The cyclic fatigue resistance of Twisted File was significantly higher than that of instruments produced with traditional grinding processes, except for Mtwo files. PMID:23087787

  12. Searching Process with Raita Algorithm and its Application

    NASA Astrophysics Data System (ADS)

    Rahim, Robbi; Saleh Ahmar, Ansari; Abdullah, Dahlan; Hartama, Dedy; Napitupulu, Darmawan; Putera Utama Siahaan, Andysah; Hasan Siregar, Muhammad Noor; Nasution, Nurliana; Sundari, Siti; Sriadhi, S.

    2018-04-01

    Searching is a common process performed by many computer users. The Raita algorithm is one algorithm that can be used to match and find information in accordance with the patterns entered. The Raita algorithm was applied to a file search application written in the Java programming language; testing showed that the file search is fast, returns accurate results, and supports many data types.
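    For reference, a compact Python version of the Raita variant of Boyer-Moore-Horspool is sketched below (the paper's implementation is in Java): the last, first, and middle characters of the pattern are compared before the full window, and the bad-character rule supplies the shift.

      def raita_search(text, pattern):
          """Return the start indices of every occurrence of pattern in text."""
          n, m = len(text), len(pattern)
          if m == 0 or n < m:
              return []
          # Horspool bad-character shifts, computed from all but the last pattern character
          shift = {c: m - 1 - i for i, c in enumerate(pattern[:-1])}
          first, middle, last = pattern[0], pattern[m // 2], pattern[-1]
          hits, i = [], 0
          while i <= n - m:
              if (text[i + m - 1] == last and text[i] == first
                      and text[i + m // 2] == middle
                      and text[i:i + m] == pattern):        # full comparison only after the quick checks
                  hits.append(i)
              i += shift.get(text[i + m - 1], m)            # slide by the bad-character shift
          return hits

      print(raita_search("history file processing", "file"))   # [8]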

  13. Ecosystem history of South Florida; Biscayne Bay sediment core descriptions

    USGS Publications Warehouse

    Ishman, S.E.

    1997-01-01

    The 'Ecosystem History of Biscayne Bay and the southeast Coast' project of the U.S. Geological Survey is part of a multi-disciplinary effort that includes Florida Bay and the Everglades to provide paleoecologic reconstructions for the south Florida region. Reconstructions of past salinity, nutrients, substrate, and water quality are needed to determine ecosystem variability due to both natural and human-induced causes. Our understanding of the relations between the south Florida ecosystem and introduced forces will allow managers to make informed decisions regarding the south Florida ecosystem restoration and monitoring. The record of past ecosystem conditions can be found in shallow sediment cores. This U.S. Geological Survey Open-File Report describes six shallow sediment cores collected from Biscayne Bay. The cores described herein are being processed for a variety of analytical procedures, and this provides the descriptive framework for future analyses of the included cores. This report is preliminary and has not been reviewed for conformity with U.S. Geological Survey editorial standards or with the North American Stratigraphic Code. Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government.

  14. Flow regions of granules in Dorfan Impingo filter for gas cleanup

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuo, J.T.; Smid, J.; Hsiau, S.S.

    1999-07-01

    Inside a two-dimensional model of the louvered Dorfan Impingo panel with transparent front and rear walls, the flow regions of filter granules without gas cross flow were observed. White PE beads were used as filter granules; colored PE beads served as tracers. Filter granules were discharged and circulated to the bed. The flow rate of the filter medium was controlled by the belt conveyor. The image processing system, including a frame grabber and a JVC video camera, was used to record the granular flow. Every image of motion was digitized and stored in a file. The flow patterns and the history of the quasi-stagnant zones in the moving granular bed were evaluated. The experiment showed a fast central moving region (flowing core) of filter granules and quasi-stagnant zones close to the louver walls.

  15. Using ABAQUS Scripting Interface for Materials Evaluation and Life Prediction

    NASA Technical Reports Server (NTRS)

    Powers, Lynn M.; Arnold, Steven M.; Baranski, Andrzej

    2006-01-01

    An ABAQUS script has been written to aid in the evaluation of the mechanical behavior of viscoplastic materials. The purposes of the script are to: handle complex load histories; control load/displacement with alternate stopping criteria; predict failure and life; and verify constitutive models. Material models from the ABAQUS library may be used or the UMAT routine may specify mechanical behavior. User subroutines implemented include: UMAT for the constitutive model; UEXTERNALDB for file manipulation; DISP for boundary conditions; and URDFIL for results processing. Examples presented include load, strain and displacement control tests on a single element model. The tests are creep with a life limiting strain criterion, strain control with a stress limiting cycle and a complex interrupted cyclic relaxation test. The techniques implemented in this paper enable complex load conditions to be solved efficiently with ABAQUS.

  16. Catching errors with patient-specific pretreatment machine log file analysis.

    PubMed

    Rangaraj, Dharanipathy; Zhu, Mingyao; Yang, Deshan; Palaniswaamy, Geethpriya; Yaddanapudi, Sridhar; Wooten, Omar H; Brame, Scott; Mutic, Sasa

    2013-01-01

    A robust, efficient, and reliable quality assurance (QA) process is highly desired for modern external beam radiation therapy treatments. Here, we report the results of a semiautomatic, pretreatment, patient-specific QA process based on dynamic machine log file analysis clinically implemented for intensity modulated radiation therapy (IMRT) treatments delivered by high energy linear accelerators (Varian 2100/2300 EX, Trilogy, iX-D, Varian Medical Systems Inc, Palo Alto, CA). The multileaf collimator (MLC) machine log files are called Dynalog by Varian. Using an in-house developed computer program called "Dynalog QA," we automatically compare the beam delivery parameters in the log files that are generated during pretreatment point dose verification measurements with the treatment plan to determine any discrepancies in IMRT deliveries. Fluence maps are constructed and compared between the delivered and planned beams. Between clinical introduction in June 2009 and the end of 2010, 912 machine log file analysis QA checks were performed. Among these, 14 errors causing dosimetric deviation were detected and required further investigation and intervention. These errors were the result of human operating mistakes, flawed treatment planning, and data modification during plan file transfer. Minor errors were also reported in 174 other log file analyses, some of which stemmed from false positives and unreliable results; the origins of these are discussed herein. It has been demonstrated that machine log file analysis is a robust, efficient, and reliable QA process capable of detecting errors originating from human mistakes, flawed planning, and data transfer problems. The possibility of detecting these errors is low using point and planar dosimetric measurements. Copyright © 2013 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
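    A minimal sketch of the comparison step is given below: planned and delivered leaf positions are differenced and out-of-tolerance samples are flagged. Parsing real Dynalog files is more involved; the arrays and the 1 mm tolerance here are assumptions for illustration only, not the clinical criteria used in the study.

      # Flag MLC leaf-position deviations between plan and delivery.
      import numpy as np

      def leaf_position_check(planned_mm, delivered_mm, tolerance_mm=1.0):
          planned_mm = np.asarray(planned_mm, dtype=float)
          delivered_mm = np.asarray(delivered_mm, dtype=float)
          deviation = np.abs(delivered_mm - planned_mm)
          flagged = np.argwhere(deviation > tolerance_mm)   # indices of out-of-tolerance samples
          return deviation.max(), flagged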

  17. PDT - PARTICLE DISPLACEMENT TRACKING SOFTWARE

    NASA Technical Reports Server (NTRS)

    Wernet, M. P.

    1994-01-01

    Particle Imaging Velocimetry (PIV) is a quantitative velocity measurement technique for measuring instantaneous planar cross sections of a flow field. The technique offers very high precision (1%) directionally resolved velocity vector estimates, but its use has been limited by high equipment costs and complexity of operation. Particle Displacement Tracking (PDT) is an all-electronic PIV data acquisition and reduction procedure which is simple, fast, and easily implemented. The procedure uses a low power, continuous wave laser and a Charged Coupled Device (CCD) camera to electronically record the particle images. A frame grabber board in a PC is used for data acquisition and reduction processing. PDT eliminates the need for photographic processing, system costs are moderately low, and reduced data are available within seconds of acquisition. The technique results in velocity estimate accuracies on the order of 5%. The software is fully menu-driven from the acquisition to the reduction and analysis of the data. Options are available to acquire a single image or 5- or 25-field series of images separated in time by multiples of 1/60 second. The user may process each image, specifying its boundaries to remove unwanted glare from the periphery and adjusting its background level to clearly resolve the particle images. Data reduction routines determine the particle image centroids and create time history files. PDT then identifies the velocity vectors which describe the particle movement in the flow field. Graphical data analysis routines are included which allow the user to graph the time history files and display the velocity vector maps, interpolated velocity vector grids, iso-velocity vector contours, and flow streamlines. The PDT data processing software is written in FORTRAN 77 and the data acquisition routine is written in C-Language for 80386-based IBM PC compatibles running MS-DOS v3.0 or higher. Machine requirements include 4 MB RAM (3 MB Extended), a single or multiple frequency RGB monitor (EGA or better), a math co-processor, and a pointing device. The printers supported by the graphical analysis routines are the HP Laserjet+, Series II, and Series III with at least 1.5 MB memory. The data acquisition routines require the EPIX 4-MEG video board and optional 12.5MHz oscillator, and associated EPIX software. Data can be acquired from any CCD or RS-170 compatible video camera with pixel resolution of 600hX400v or better. PDT is distributed on one 5.25 inch 360K MS-DOS format diskette. Due to the use of required proprietary software, executable code is not provided on the distribution media. Compiling the source code requires the Microsoft C v5.1 compiler, Microsoft QuickC v2.0, the Microsoft Mouse Library, EPIX Image Processing Libraries, the Microway NDP-Fortran-386 v2.1 compiler, and the Media Cybernetics HALO Professional Graphics Kernal System. Due to the complexities of the machine requirements, COSMIC strongly recommends the purchase and review of the documentation prior to the purchase of the program. The source code, and sample input and output files are provided in PKZIP format; the PKUNZIP utility is included. PDT was developed in 1990. All trade names used are the property of their respective corporate owners.
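    The centroid-location step at the heart of the data reduction can be illustrated with a few lines of Python using NumPy and SciPy; the original PDT code is FORTRAN 77 and C, so the sketch below only mirrors the idea, and the threshold value is an arbitrary assumption.

      # Locate particle-image centroids in a thresholded video frame.
      import numpy as np
      from scipy import ndimage

      def particle_centroids(frame, threshold=128):
          mask = frame > threshold                      # separate particle images from background
          labels, count = ndimage.label(mask)           # connected components = individual particles
          return ndimage.center_of_mass(mask, labels, range(1, count + 1))   # (row, col) per particle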

  18. Transferable Output ASCII Data (TOAD) editor version 1.0 user's guide

    NASA Technical Reports Server (NTRS)

    Bingel, Bradford D.; Shea, Anne L.; Hofler, Alicia S.

    1991-01-01

    The Transferable Output ASCII Data (TOAD) editor is an interactive software tool for manipulating the contents of TOAD files. The TOAD editor is specifically designed to work with tabular data. Selected subsets of data may be displayed to the user's screen, sorted, exchanged, duplicated, removed, replaced, inserted, or transferred to and from external files. It also offers a number of useful features including on-line help, macros, a command history, an 'undo' option, variables, and a full complement of mathematical functions and conversion factors. Written in ANSI FORTRAN 77 and completely self-contained, the TOAD editor is very portable and has already been installed on SUN, SGI/IRIS, and CONVEX hosts.

  19. 77 FR 42784 - Self-Regulatory Organizations; Options Clearing Corporation; Notice of Filing of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-20

    ... Organizations; Options Clearing Corporation; Notice of Filing of Proposed Rule Change Relating to the Auction Process Under Options Clearing Corporation Rule 1104 July 16, 2012. Pursuant to Section 19(b)(1) of the... July 3, 2012, The Options Clearing Corporation (``OCC'') filed with the Securities and Exchange...

  20. 49 CFR 1155.26 - Board determinations under 49 U.S.C. 10909.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 days prior to filing of application—Environmental report (and/or historic report, if applicable) filed and environmental process initiated pursuant to 49 CFR 1155.24. (i) Day 0—Application filed. (ii... date for initial comments. (iv) 30 days after the Final EIS (or other final environmental documentation...

  1. 12 CFR Appendix C to Part 360 - Deposit File Structure

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... tax identification number. Possible values are: • S = Social Security Number. • T = Federal Tax... or do not apply, a null value in the appropriate field should be indicated. The file will be in a tab... complete its insurance determination process, it may add this information to the end of this data file...

  2. 12 CFR Appendix C to Part 360 - Deposit File Structure

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... tax identification number. Possible values are: • S = Social Security Number. • T = Federal Tax... or do not apply, a null value in the appropriate field should be indicated. The file will be in a tab... complete its insurance determination process, it may add this information to the end of this data file...

  3. 12 CFR Appendix C to Part 360 - Deposit File Structure

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... tax identification number. Possible values are: • S = Social Security Number. • T = Federal Tax... or do not apply, a null value in the appropriate field should be indicated. The file will be in a tab... complete its insurance determination process, it may add this information to the end of this data file...

  4. 75 FR 33297 - Western Technical College; Notice of Intent To File License Application, Filing of Pre...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-11

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 13417-001] Western... License Application and Request to Use the Traditional Licensing Process. b. Project No.: 13417-001. c. Date Filed: December 21, 2009. d. Submitted By: Western Technical College (Western). e. Name of Project...

  5. 75 FR 74117 - Self-Regulatory Organizations; National Securities Clearing Corporation; Notice of Filing of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-30

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-63368; File No. SR-NSCC-2010-15] Self-Regulatory Organizations; National Securities Clearing Corporation; Notice of Filing of Proposed Rule Change Relating to Establishing an Automated Service for the Processing of Transfers, Replacements, and Exchanges of Insurance and Retirement Products...

  6. 77 FR 46561 - Amendments to Adjudicatory Process Rules and Related Requirements

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-03

    ... eight late-filed factors, especially not for late-filed hearing requests or intervention petitions. The... current three Sec. 2.309(f)(2) factors. As the NRC explained in the proposed rule, whether filings after... the existence of good cause, not the other factors. The commenter has not supported its assertion that...

  7. 77 FR 101 - Corbett Water District; Notice of Intent To File License Application, Filing of Pre-Application...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-03

    ... [email protected] . j. Corbett Water District filed its request to use the Traditional Licensing... approved Corbett Water District's request to use the Traditional Licensing Process. k. With this notice, we... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 14322-000] Corbett Water...

  8. Image Size Variation Influence on Corrupted and Non-viewable BMP Image

    NASA Astrophysics Data System (ADS)

    Azmi, Tengku Norsuhaila T.; Azma Abdullah, Nurul; Rahman, Nurul Hidayah Ab; Hamid, Isredza Rahmi A.; Chai Wen, Chuah

    2017-08-01

    Images are one of the evidence components sought in digital forensics. The Joint Photographic Experts Group (JPEG) format is the most popular on the Internet because JPEG files are lossy and easy to compress, which speeds up transmission over the Internet. However, corrupted JPEG images are hard to recover due to the complexity of determining the corruption point. Nowadays, Bitmap (BMP) images are preferred in image processing over other formats because a BMP image contains all the image information in a simple format. Therefore, in order to investigate the corruption point in a JPEG, the file is required to be converted into BMP format. Nevertheless, many things can corrupt a BMP image, such as changes to the image size that make the file non-viewable. In this paper, the experiment indicates that the size of the BMP file influences changes in the image itself under three conditions: deletion, replacement, and insertion. From the experiment, we learned that correcting the file size can produce an at least partially viewable file, which can then be investigated further to identify the corruption point.
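    The file-size correction mentioned above can be sketched directly from the BMP header layout: the total file size is stored as a little-endian 32-bit integer at byte offset 2, so a mismatch with the on-disk size can be patched in place. The helper below is an illustration written for this summary, not the authors' tool.

      # Compare the size recorded in a BMP header with the actual file size and fix it.
      import os, struct

      def fix_bmp_size(path):
          actual = os.path.getsize(path)
          with open(path, "r+b") as f:
              if f.read(2) != b"BM":                    # BMP signature
                  raise ValueError("not a BMP file")
              recorded = struct.unpack("<I", f.read(4))[0]   # stored file size at offset 2
              if recorded != actual:
                  f.seek(2)
                  f.write(struct.pack("<I", actual))    # overwrite with the real size
          return recorded, actual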

  9. Performance Evaluation of Various STL File Mesh Refining Algorithms Applied for FDM-RP Process

    NASA Astrophysics Data System (ADS)

    Ledalla, Siva Rama Krishna; Tirupathi, Balaji; Sriram, Venkatesh

    2018-06-01

    Layered manufacturing machines use the stereolithography (STL) file to build parts. When a curved surface is converted from a computer aided design (CAD) file to STL, it results in geometrical distortion and chordal error. Parts manufactured with this file might not satisfy geometric dimensioning and tolerancing requirements due to the approximated geometry. Current algorithms built into CAD packages have export options to globally reduce this distortion, which leads to an increase in file size and pre-processing time. In this work, different mesh subdivision algorithms are applied to the STL file of a part with complex geometric features using MeshLab software. The mesh subdivision algorithms considered in this work are the modified butterfly subdivision technique, the Loop subdivision technique, and the general triangular midpoint subdivision technique. A comparative study is made with respect to volume and build time using the above techniques. It is found that the triangular midpoint subdivision algorithm is most suitable for the geometry under consideration. Only the wheel cap part is then manufactured on a Stratasys MOJO FDM machine. The surface roughness of the part is measured on a Talysurf surface roughness tester.
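    The general triangular midpoint subdivision considered above is simple to state: each facet is replaced by four triangles obtained by joining the midpoints of its edges. The sketch below assumes the mesh is already available as a list of vertex triples; reading and writing the STL container is left to an external library.

      # Split every triangle of a mesh into four by connecting edge midpoints.
      def midpoint(a, b):
          return tuple((a[i] + b[i]) / 2.0 for i in range(3))

      def subdivide(facets):
          """facets: list of (v0, v1, v2) vertex triples -> list with 4x as many facets."""
          refined = []
          for v0, v1, v2 in facets:
              m01, m12, m20 = midpoint(v0, v1), midpoint(v1, v2), midpoint(v2, v0)
              refined += [(v0, m01, m20), (m01, v1, m12), (m20, m12, v2), (m01, m12, m20)]
          return refined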

  10. Animal cruelty and psychiatric disorders.

    PubMed

    Gleyzer, Roman; Felthous, Alan R; Holzer, Charles E

    2002-01-01

    Animal cruelty in childhood, although generally viewed as abnormal or deviant, for years was not considered symptomatic of any particular psychiatric disorder. Although animal cruelty is currently used as a diagnostic criterion for conduct disorder, research establishing the diagnostic significance of this behavior is essentially nonexistent. In the current study, investigators tested the hypothesis that a history of substantial animal cruelty is associated with a diagnosis of antisocial personality disorder (APD) and looked for associations with other disorders commonly diagnosed in a population of criminal defendants. Forty-eight subjects, criminal defendants who had histories of substantial animal cruelty, were matched with defendants without this history. Data were systematically obtained from the files by using four specifically designed data retrieval outlines. A history of animal cruelty during childhood was significantly associated with APD, antisocial personality traits, and polysubstance abuse. Mental retardation, psychotic disorders, and alcohol abuse showed no such association.

  11. Processing EOS MLS Level-2 Data

    NASA Technical Reports Server (NTRS)

    Snyder, W. Van; Wu, Dong; Read, William; Jiang, Jonathan; Wagner, Paul; Livesey, Nathaniel; Schwartz, Michael; Filipiak, Mark; Pumphrey, Hugh; Shippony, Zvi

    2006-01-01

    A computer program performs level-2 processing of thermal-microwave-radiance data from observations of the limb of the Earth by the Earth Observing System (EOS) Microwave Limb Sounder (MLS). The purpose of the processing is to estimate the composition and temperature of the atmosphere versus altitude from 8 to 90 km. "Level-2" as used here is a specialists' term signifying both vertical profiles of geophysical parameters along the measurement track of the instrument and processing performed by this or other software to generate such profiles. Designed to be flexible, the program is controlled via a configuration file that defines all aspects of processing, including contents of state and measurement vectors, configurations of forward models, measurement and calibration data to be read, and the manner of inverting the models to obtain the desired estimates. The program can operate in a parallel form in which one instance of the program acts as a master, coordinating the work of multiple slave instances on a cluster of computers, each slave operating on a portion of the data. Optionally, the configuration file can be made to instruct the software to produce files of simulated radiances based on state vectors formed from sets of geophysical data-product files taken as input.
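    The configuration-driven master/slave arrangement can be illustrated with a generic Python sketch: a master splits the measurement track into chunks sized by a configuration mapping and farms them out to worker processes. The chunk length, species list, and function names are assumptions made for the example, not the MLS production code.

      # Generic master/worker sketch driven by a small configuration mapping.
      from multiprocessing import Pool

      CONFIG = {"chunk_minutes": 15, "species": ["temperature", "O3", "H2O"]}

      def retrieve_chunk(chunk):
          start, stop = chunk
          # placeholder for inverting the forward model over one along-track chunk
          return {"chunk": chunk, "profiles": {s: [] for s in CONFIG["species"]}}

      def run_master(track_minutes=98):
          step = CONFIG["chunk_minutes"]
          chunks = [(t, min(t + step, track_minutes)) for t in range(0, track_minutes, step)]
          with Pool() as pool:                          # slaves process chunks in parallel
              return pool.map(retrieve_chunk, chunks)

      if __name__ == "__main__":
          results = run_master()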

  12. 36 CFR 218.10 - Objection time periods and process.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... calendar day (11:59 p.m. in the time zone of the receiving office) for objections filed by electronic means... 36 Parks, Forests, and Public Property 2 2010-07-01 2010-07-01 false Objection time periods and... Objection time periods and process. (a) Time to file an objection. Written objections, including any...

  13. 29 CFR 1640.7 - Processing of charges of employment discrimination filed with the EEOC.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... with the EEOC. 1640.7 Section 1640.7 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT... REHABILITATION ACT OF 1973 § 1640.7 Processing of charges of employment discrimination filed with the EEOC. (a) EEOC determination of jurisdiction. Upon receipt of a charge of employment discrimination, the EEOC...

  14. 29 CFR 1640.7 - Processing of charges of employment discrimination filed with the EEOC.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... with the EEOC. 1640.7 Section 1640.7 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT... REHABILITATION ACT OF 1973 § 1640.7 Processing of charges of employment discrimination filed with the EEOC. (a) EEOC determination of jurisdiction. Upon receipt of a charge of employment discrimination, the EEOC...

  15. 29 CFR 1640.7 - Processing of charges of employment discrimination filed with the EEOC.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... with the EEOC. 1640.7 Section 1640.7 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT... REHABILITATION ACT OF 1973 § 1640.7 Processing of charges of employment discrimination filed with the EEOC. (a) EEOC determination of jurisdiction. Upon receipt of a charge of employment discrimination, the EEOC...

  16. 29 CFR 1640.7 - Processing of charges of employment discrimination filed with the EEOC.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... with the EEOC. 1640.7 Section 1640.7 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT... REHABILITATION ACT OF 1973 § 1640.7 Processing of charges of employment discrimination filed with the EEOC. (a) EEOC determination of jurisdiction. Upon receipt of a charge of employment discrimination, the EEOC...

  17. 29 CFR 1640.7 - Processing of charges of employment discrimination filed with the EEOC.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... with the EEOC. 1640.7 Section 1640.7 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT... REHABILITATION ACT OF 1973 § 1640.7 Processing of charges of employment discrimination filed with the EEOC. (a) EEOC determination of jurisdiction. Upon receipt of a charge of employment discrimination, the EEOC...

  18. Electronic medical file exchange between on-duty care providers and the attending paediatrician: a Belgian paediatric pilot project.

    PubMed

    Deneyer, M; Hachimi-Idrissi, S; Michel, L; Nyssen, M; De Moor, G; Vandenplas, Y

    2012-01-01

    The authors propose the introduction of a pilot project: "paediatric core file exchange in emergencies" (PCF-EXEM), which enables the exchange of medical data between the attending paediatrician (AP), holder of the medical record, and on-duty medical units (i.e. general practitioners, paediatricians, surgeons, emergency physicians, ...). This project is based on two pillars: first, a protected server (PCF-server) containing paediatric core files (PCF) with important clinical data that should be available to the physician in order to quickly gain a clear insight into the relevant clinical medical history of the child; and second, the possibility of providing feedback to the attending physician about the findings recorded during the on-call duty. The permanent availability of health data on the PCF-server and the possibility to provide feedback together constitute the PCF-EXEM project. This project meets the demand of care providers to have relevant medical information permanently available in order to guarantee high quality care in emergency situations. The delicate balance between the right to informative privacy and professional confidentiality on the one hand and the right to quality health care on the other hand has been taken into account. The technical and practical feasibility of this project is described. The objectives and vision of the PCF-EXEM project conform to Belgian legislation concerning the processing of medical data and are in line with European projects, still under consideration, that focus on interoperability and the development of a common access control to databanks containing health data for care providers. PCF-EXEM could therefore be a model for other EU countries as well.

  19. Monitoring Satellite Data Ingest and Processing for the Atmosphere Science Investigator-led Processing Systems (SIPS)

    NASA Astrophysics Data System (ADS)

    Witt, J.; Gumley, L.; Braun, J.; Dutcher, S.; Flynn, B.

    2017-12-01

    The Atmosphere SIPS (Science Investigator-led Processing Systems) team at the Space Science and Engineering Center (SSEC), which is funded through a NASA contract, creates Level 2 cloud and aerosol products from the VIIRS instrument aboard the S-NPP satellite. In order to monitor the ingest and processing of files, we have developed an extensive monitoring system to observe every step in the process. The status grid is used for real-time monitoring, and shows the current state of the system, including what files we have and whether or not we are meeting our latency requirements. Our snapshot tool displays the state of the system in the past. It displays which files were available at a given hour and is used for historical and backtracking purposes. In addition to these grid-like tools, we have created histograms and other statistical graphs for tracking processing and ingest metrics, such as total processing time, job queue time, and latency statistics.
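
    The latency bookkeeping described above can be illustrated with a short sketch; the record fields and the latency requirement used here are assumptions, not the SIPS schema.

```python
# Illustrative sketch of latency statistics for product files.
from datetime import datetime, timedelta
from statistics import mean, median

records = [  # (observation time, time the product file became available) -- made-up values
    (datetime(2017, 9, 1, 0, 0), datetime(2017, 9, 1, 1, 40)),
    (datetime(2017, 9, 1, 0, 6), datetime(2017, 9, 1, 1, 55)),
    (datetime(2017, 9, 1, 0, 12), datetime(2017, 9, 1, 3, 5)),
]
requirement = timedelta(hours=2, minutes=30)      # assumed latency requirement

latencies = [done - obs for obs, done in records]
hours = [l.total_seconds() / 3600 for l in latencies]
print(f"mean {mean(hours):.2f} h, median {median(hours):.2f} h, max {max(hours):.2f} h")
print("meeting requirement:", all(l <= requirement for l in latencies))
```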

  20. Cloud Engineering Principles and Technology Enablers for Medical Image Processing-as-a-Service.

    PubMed

    Bao, Shunxing; Plassard, Andrew J; Landman, Bennett A; Gokhale, Aniruddha

    2017-04-01

    Traditional in-house, laboratory-based medical imaging studies use hierarchical data structures (e.g., NFS file stores) or databases (e.g., COINS, XNAT) for storage and retrieval. The resulting performance from these approaches is, however, impeded by standard network switches since they can saturate network bandwidth during transfer from storage to processing nodes for even moderate-sized studies. To that end, a cloud-based "medical image processing-as-a-service" offers promise in utilizing the ecosystem of Apache Hadoop, which is a flexible framework providing distributed, scalable, fault tolerant storage and parallel computational modules, and HBase, which is a NoSQL database built atop Hadoop's distributed file system. Despite this promise, HBase's load distribution strategy of region split and merge is detrimental to the hierarchical organization of imaging data (e.g., project, subject, session, scan, slice). This paper makes two contributions to address these concerns by describing key cloud engineering principles and technology enhancements we made to the Apache Hadoop ecosystem for medical imaging applications. First, we propose a row-key design for HBase, which is a necessary step that is driven by the hierarchical organization of imaging data. Second, we propose a novel data allocation policy within HBase to strongly enforce collocation of hierarchically related imaging data. The proposed enhancements accelerate data processing by minimizing network usage and localizing processing to machines where the data already exist. Moreover, our approach is amenable to the traditional scan, subject, and project-level analysis procedures, and is compatible with standard command line/scriptable image processing software. Experimental results for an illustrative sample of imaging data reveal that our new HBase policy results in a three-fold time improvement in conversion of classic DICOM to NIfTI file formats when compared with the default HBase region split policy, and nearly a six-fold improvement over a commonly available network file system (NFS) approach even for relatively small file sets. Moreover, file access latency is lower than with network-attached storage.
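
    The row-key idea described above can be illustrated with a short sketch: hierarchically related identifiers are concatenated into a single key so that related rows sort, and therefore collocate, together. The field names and padding widths below are assumptions, not the authors' exact design.

```python
# Sketch of a hierarchical row key: project, subject, session, scan, slice.
def row_key(project, subject, session, scan, slice_idx):
    # Zero-padding keeps lexicographic order consistent with numeric order.
    return f"{project}:{subject}:{session}:scan{scan:03d}:slice{slice_idx:04d}"

keys = [row_key("ProjA", "Subj0007", "Sess01", s, i)
        for s in (1, 2) for i in (1, 2, 3)]
for k in sorted(keys):          # HBase stores rows sorted by key bytes
    print(k)
```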

  1. 77 FR 43222 - Endangered and Threatened Wildlife and Plants; Designation of Critical Habitat for the Tidewater...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-24

    ... DEPARTMENT OF THE INTERIOR Fish and Wildlife Service 50 CFR Part 17 [Docket No. FWS-R8-ES-2011... Federal Actions On April 15, 2009, Natural Resources Defense Council (NRDC) filed a lawsuit in the U.S... consultation history for natural resource management projects suggests that these projects are generally...

  2. Assessment Testing: Analysis and Predictions, Spring-Fall 1985.

    ERIC Educational Resources Information Center

    Harris, Howard L.; Hansson, Claudia J.

    During spring and fall 1985, a study was conducted at Cosumnes River College (CRC) to determine how assessment testing scores related to student persistence and performance. The student history files of a random sample of 498 students who had been tested by the CRC Assessment Center during spring and fall 1985 were examined, yielding the following…

  3. 77 FR 27845 - Qualification of Drivers; Exemption Applications; Vision

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-11

    ... predicting crash proneness from crash history coupled with other factors. These factors--such as age, sex... March 23, 2012 (77 FR 17109). We recognize that the vision of an applicant may change and affect his/her.... Larry W. Minor, Associate Administrator for Policy. [FR Doc. 2012-11444 Filed 5-10-12; 8:45 am] BILLING...

  4. First Battle of Manassas: An End to Innocence. Teaching with Historic Places.

    ERIC Educational Resources Information Center

    Litterst, Michael

    This lesson is based on the National Register of Historic Places registration file, "Manassas National Battlefield Park" and other sources. The lesson could be used in units on the Civil War. Students strengthen their skills of observation and interpretation in the study of history and geography and gain practice in analyzing primary…

  5. 80. PHOTOCOPY OF VIEW OF GRADING OPERATIONS BELOW P STREET ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    80. PHOTOCOPY OF VIEW OF GRADING OPERATIONS BELOW P STREET BRIDGE, LOOKING EAST FROM SOUTHBOUND P STREET PARKWAY ACCESS, FROM 1940 REPORT ON PROPOSED DEVELOPMENT OF ROCK CREEK AND POTOMAC PARKWAY, SECTION II (ROCK CREEK AND POTOMAC PARKWAY FILE, HISTORY DEPARTMENT ARCHIVES, NATIONAL PARK SERVICE, WASHINGTON, DC). - Rock Creek & Potomac Parkway, Washington, District of Columbia, DC

  6. A Brief History of an Ethnographic Database: The HRAF Collection of Ethnography

    ERIC Educational Resources Information Center

    Roe, Sandra K.

    2007-01-01

    Since 1950, the Human Relations Area Files, Inc. has produced what is currently known as the eHRAF Collection of Ethnography. This article explores the reasons why it was created and describes the structure of this complex collection of ethnographic works. Over time, this resource has been produced in four different formats: paper slips,…

  7. NOVA - Official Website | The Pluto Files

    Science.gov Websites

    Program description and transcript excerpt for NOVA's "The Pluto Files," broadcast December 14, 2011, on PBS (program not available for streaming); features Neil deGrasse Tyson (American Museum of Natural History, Hayden Planetarium).

  8. Can Financial Need Analysis be Simplified?

    ERIC Educational Resources Information Center

    Orwig, M. D.; Jones, Paul K.

    This paper examines the problem of collecting financial data on aid applicants. A 10% sample (12,383) of student records was taken from the 1968-69 alphabetic history file for the ACT Student Need Analysis Service. Random sub-samples were taken in certain phases of the study. A relatively small number of financial variables were found to predict…

  9. Glen Echo Park: Center for Education and Recreation. Teaching with Historic Places.

    ERIC Educational Resources Information Center

    Gray, Stephanie

    This lesson is based on the National Register of Historic Places registration file "Glen Echo Amusement Park," park planning documents, and newspaper and magazine accounts. The lesson can be used in U.S. history units on the Gilded Age and the Progressive Era to explore religious and educational reform movements (including the Chautauqua…

  10. Auto Draw from Excel Input Files

    NASA Technical Reports Server (NTRS)

    Strauss, Karl F.; Goullioud, Renaud; Cox, Brian; Grimes, James M.

    2011-01-01

    The design process often involves the use of Excel files during project development. To facilitate communication of the information in the Excel files, drawings are often generated. During the design process, the Excel files are updated often to reflect new input. The problem is that the drawings often lag the updates, leading to confusion about the current state of the design. The use of this program allows visualization of complex data in a format that is more easily understandable than pages of numbers. Because the graphical output can be updated automatically, the manual labor of diagram drawing can be eliminated. The more frequent updating of system diagrams can reduce confusion and errors and is likely to uncover systemic problems earlier in the design cycle, thus reducing rework and redesign.
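
    As a rough illustration of the idea (not the NASA tool itself), the sketch below reads connection data from an Excel sheet with openpyxl and regenerates a Graphviz DOT diagram; the workbook name and the source/target/label column layout are assumptions.

```python
# Sketch: regenerate a diagram automatically from an Excel sheet.
# Assumes a sheet whose first three columns (after a header row) are
# source, target, and edge label.
from openpyxl import load_workbook   # third-party package: openpyxl

def excel_to_dot(xlsx_path, out_path="diagram.dot"):
    ws = load_workbook(xlsx_path, read_only=True).active
    lines = ["digraph design {"]
    for src, dst, label in ws.iter_rows(min_row=2, max_col=3, values_only=True):
        if src and dst:
            lines.append(f'  "{src}" -> "{dst}" [label="{label or ""}"];')
    lines.append("}")
    with open(out_path, "w") as f:
        f.write("\n".join(lines))

# excel_to_dot("interfaces.xlsx")  # render with: dot -Tpng diagram.dot -o diagram.png
```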

  11. Report on IVS-WG4

    NASA Astrophysics Data System (ADS)

    Gipson, John

    2011-07-01

    I describe the proposed data structure for storing, archiving and processing VLBI data. In this scheme, most VLBI data is stored in NetCDF files. NetCDF has the advantage that there are interfaces to most common computer languages, including Fortran, Fortran 90, C, C++, and Perl, and the most common operating systems, including Linux, Windows, and Mac. The data files for a particular session are organized by special ASCII "wrapper" files which contain pointers to the data files. This allows great flexibility in the processing and analysis of VLBI data, and also allows for extending the types of data used, e.g., source maps. I discuss the use of the new format in calc/solve and other VLBI analysis packages. I also discuss plans for transitioning to the new structure.
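
    A minimal sketch of the wrapper-file idea in Python: a small ASCII file lists the NetCDF files that make up a session, and analysis code follows those pointers. The wrapper syntax (one path per line, "!" comments) and the way variables are gathered are assumptions, not the IVS format.

```python
# Sketch: follow pointers in an ASCII wrapper file to read session data.
from netCDF4 import Dataset   # third-party package: netCDF4

def read_session(wrapper_path):
    data = {}
    with open(wrapper_path) as wrapper:
        for line in wrapper:
            line = line.strip()
            if not line or line.startswith("!"):   # assumed comment marker
                continue
            with Dataset(line) as nc:              # each line points to a NetCDF file
                data[line] = {name: var[:] for name, var in nc.variables.items()}
    return data
```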

  12. Computer Programs to Display and Modify Data in Geographic Coordinates and Methods to Transfer Positions to and from Maps, with Applications to Gravity Data Processing, Global Positioning Systems, and 30-Meter Digital Elevation Models

    USGS Publications Warehouse

    Plouff, Donald

    1998-01-01

    Computer programs were written in the Fortran language to process and display gravity data with locations expressed in geographic coordinates. The programs and associated processes have been tested for gravity data in an area of about 125,000 square kilometers in northwest Nevada, southeast Oregon, and northeast California. This report discusses the geographic aspects of data processing. Utilization of the programs begins with application of a template (printed in PostScript format) to transfer locations obtained with Global Positioning Systems to and from field maps and includes a 5-digit geographic-based map naming convention for field maps. Computer programs, with source codes that can be copied, are used to display data values (printed in PostScript format) and data coverage, insert data into files, extract data from files, shift locations, test for redundancy, and organize data by map quadrangles. It is suggested that 30-meter Digital Elevation Models needed for gravity terrain corrections and other applications should be accessed in a file search by using the USGS 7.5-minute map name as a file name, for example, file '40117_B8.DEM' contains elevation data for the map with a southeast corner at lat 40° 07' 30" N. and lon 117° 52' 30" W.
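
    The naming convention can be inferred from the single example given above; the sketch below reproduces it, but the padding and the row/column orientation are assumptions drawn from that one file name rather than from a published specification.

```python
# Inferred from '40117_B8.DEM' for a southeast corner at 40°07'30"N, 117°52'30"W:
# the name encodes whole degrees of latitude and longitude, a letter A-H for the
# 7.5-minute row counted northward, and a digit 1-8 for the 7.5-minute column
# counted westward from the east edge of the 1-degree block.
def dem_file_name(lat_se, lon_se_w):
    """lat_se: SE-corner latitude (decimal degrees N); lon_se_w: longitude (decimal degrees W, positive)."""
    lat_deg, lon_deg = int(lat_se), int(lon_se_w)
    row = "ABCDEFGH"[round((lat_se - lat_deg) * 60 / 7.5)]
    col = round((lon_se_w - lon_deg) * 60 / 7.5) + 1
    return f"{lat_deg:02d}{lon_deg:03d}_{row}{col}.DEM"

print(dem_file_name(40 + 7.5 / 60, 117 + 52.5 / 60))   # -> 40117_B8.DEM
```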

  13. 10 CFR 708.10 - Where does an employee file a complaint?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Where does an employee file a complaint? 708.10 Section 708.10 Energy DEPARTMENT OF ENERGY DOE CONTRACTOR EMPLOYEE PROTECTION PROGRAM Employee Complaint Resolution Process § 708.10 Where does an employee file a complaint? (a) If you were employed by a contractor...

  14. 10 CFR 708.10 - Where does an employee file a complaint?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Where does an employee file a complaint? 708.10 Section 708.10 Energy DEPARTMENT OF ENERGY DOE CONTRACTOR EMPLOYEE PROTECTION PROGRAM Employee Complaint Resolution Process § 708.10 Where does an employee file a complaint? (a) If you were employed by a contractor...

  15. 78 FR 44105 - Monroe Hydro, LLC; Notice of Intent to File License Application, Filing of Pre-Application...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-23

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 14430-000] Monroe Hydro... Application and Request to Use the Traditional Licensing Process. b. Project No.: 14430-000 c. Date Filed: April 2, 2013 d. Submitted By: Monroe Hydro, LLC e. Name of Project: Monroe Drop Hydroelectric Project f...

  16. 20 CFR 404.506 - When waiver may be applied and how to process the request.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... in writing and given the dates, times and place of the file review and personal conference; the procedure for reviewing the claims file prior to the personal conference; the procedure for seeking a change... individual about the personal conference. The file review is always scheduled at least 5 days before the...

  17. SHOEBOX: A Personal File Handling System for Textual Data. Information System Language Studies, Number 23.

    ERIC Educational Resources Information Center

    Glantz, Richard S.

    Until recently, the emphasis in information storage and retrieval systems has been towards batch-processing of large files. In contrast, SHOEBOX is designed for the unformatted, personal file collection of the computer-naive individual. Operating through display terminals in a time-sharing, interactive environment on the IBM 360, the user can…

  18. 20 CFR 410.561a - When waiver may be applied and how to process the request.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... personal conference; the procedure for reviewing the claims file prior to the personal conference; the... necessary to fully inform the individual about the personal conference. The file review is always scheduled at least 5 days before the personal conference. (d) At the file review, the individual and the...

  19. 20 CFR 404.506 - When waiver may be applied and how to process the request.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... in writing and given the dates, times and place of the file review and personal conference; the procedure for reviewing the claims file prior to the personal conference; the procedure for seeking a change... individual about the personal conference. The file review is always scheduled at least 5 days before the...

  20. Health aspects of Arctic exploration – Alaska’s medical history based on the research files of Dr. Robert Fortuine

    PubMed Central

    Murray, Kathleen

    2013-01-01

    Background Robert Fortuine provided basic medical care to Alaska Native people, chronicled the Health Aspects of Arctic Exploration and, through a number of influential publications, was the first to thoroughly document and analyse Alaska's Medical History. This review introduces his published work so that readers can begin to explore Dr. Fortuine's many publications in more detail. Objective This review explores Alaska's Medical History and the Health Aspects of Arctic Exploration through the research files and the 10 most significant publications of Dr. Robert Fortuine. Design Review of Dr. Fortuine's major works and of his master bibliography, which is a merger of 55 separate bibliographies and provides a wealth of bibliographic information. This paper describes his 10 most significant publications, 2 of which began as a journal issue. Results Dr. Fortuine was a prolific writer throughout his career, publishing 134 articles and books. He wrote papers and books on Alaska's medical history, tuberculosis and health care delivery from Russian America through the Public Health Service efforts in the territory and then the State of Alaska. The master bibliography has over 3,000 references and 81 subjects; almost one-third of the entries include the tuberculosis heading. Others dwell on the history of "pre-contact" health, the history of Alaska Native health care, the history of the Alaska Department of Health (especially the tuberculosis programme), the role of the US Public Health Service and traditional medicine. He completely reviewed every Governor's and US Surgeon General's report for Alaska content. Conclusions Robert Fortuine's published works offer a wealth of information and insight into Alaska's Medical History and the Health Aspects of Arctic Exploration. As is probably true for many historians, he began small, creating a bibliography and adapting a talk before tackling his first full-length book. Readers who sample his many works will be enriched and enlightened. PMID:23967418

  1. The Standard Autonomous File Server, a Customized, Off-the-Shelf Success Story

    NASA Technical Reports Server (NTRS)

    Semancik, Susan K.; Conger, Annette M.; Obenschain, Arthur F. (Technical Monitor)

    2001-01-01

    The Standard Autonomous File Server (SAFS), which includes both off-the-shelf hardware and software, uses an improved automated file transfer process to provide a quicker, more reliable, prioritized file distribution for customers of near real-time data without interfering with the assets involved in the acquisition and processing of the data. It operates as a stand-alone solution, monitoring itself, and providing an automated fail-over process to enhance reliability. This paper will describe the unique problems and lessons learned both during the COTS selection and integration into SAFS, and the system's first year of operation in support of NASA's satellite ground network. COTS was the key factor in allowing the two-person development team to deploy systems in less than a year, meeting the required launch schedule. The SAFS system has been so successful that it is becoming a NASA standard resource, leading to its nomination for NASA's Software of the Year Award in 1999.

  2. An alternative to sneakernet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orrell, S.; Ralstin, S.

    1992-04-01

    Many computer security plans specify that only a small percentage of the data processed will be classified. Thus, the bulk of the data on secure systems must be unclassified. Secure limited access sites operating approved classified computing systems sometimes also have a system ostensibly containing only unclassified files but operating within the secure environment. That system could be networked or otherwise connected to a classified system(s) in order that both be able to use common resources for file storage or computing power. Such a system must operate under the same rules as the secure classified systems. It is in the nature of unclassified files that they either came from, or will eventually migrate to, a non-secure system. Today, unclassified files are exported from systems within the secure environment typically by loading transport media and carrying them to an open system. Import of unclassified files is handled similarly. This media transport process, sometimes referred to as sneaker net, often is manually logged and controlled only by administrative procedures. A comprehensive system for secure bi-directional transfer of unclassified files between secure and open environments has yet to be developed. Any such secure file transport system should be required to meet several stringent criteria. It is the purpose of this document to begin a definition of these criteria.

  3. An alternative to sneakernet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orrell, S.; Ralstin, S.

    1992-01-01

    Many computer security plans specify that only a small percentage of the data processed will be classified. Thus, the bulk of the data on secure systems must be unclassified. Secure limited access sites operating approved classified computing systems sometimes also have a system ostensibly containing only unclassified files but operating within the secure environment. That system could be networked or otherwise connected to a classified system(s) in order that both be able to use common resources for file storage or computing power. Such a system must operate under the same rules as the secure classified systems. It is in the nature of unclassified files that they either came from, or will eventually migrate to, a non-secure system. Today, unclassified files are exported from systems within the secure environment typically by loading transport media and carrying them to an open system. Import of unclassified files is handled similarly. This media transport process, sometimes referred to as sneaker net, often is manually logged and controlled only by administrative procedures. A comprehensive system for secure bi-directional transfer of unclassified files between secure and open environments has yet to be developed. Any such secure file transport system should be required to meet several stringent criteria. It is the purpose of this document to begin a definition of these criteria.

  4. Deceit: A flexible distributed file system

    NASA Technical Reports Server (NTRS)

    Siegel, Alex; Birman, Kenneth; Marzullo, Keith

    1989-01-01

    Deceit, a distributed file system (DFS) being developed at Cornell, focuses on flexible file semantics in relation to efficiency, scalability, and reliability. Deceit servers are interchangeable and collectively provide the illusion of a single, large server machine to any clients of the Deceit service. Non-volatile replicas of each file are stored on a subset of the file servers. The user is able to set parameters on a file to achieve different levels of availability, performance, and one-copy serializability. Deceit also supports a file version control mechanism. In contrast with many recent DFS efforts, Deceit can behave like a plain Sun Network File System (NFS) server and can be used by any NFS client without modifying any client software. The current Deceit prototype uses the ISIS Distributed Programming Environment for all communication and process group management, an approach that reduces system complexity and increases system robustness.

  5. The Western Aeronautical Test Range. Chapter 10 Tools

    NASA Technical Reports Server (NTRS)

    Knudtson, Kevin; Park, Alice; Downing, Robert; Sheldon, Jack; Harvey, Robert; Norcross, April

    2011-01-01

    The Western Aeronautical Test Range (WATR) staff at the NASA Dryden Flight Research Center is developing translation software called Chapter 10 Tools in response to challenges posed in post-flight processing of data files originating from various on-board digital recorders that follow the Range Commanders Council Inter-Range Instrumentation Group (IRIG) 106 Chapter 10 Digital Recording Standard but use differing interpretations of the Standard. The software will read the data files regardless of the vendor implementation of the source recorder, displaying data, identifying and correcting errors, and producing a data file that can be successfully processed post-flight.

  6. Evolution versus creation in the public school curriculum: History of the legal battles

    NASA Astrophysics Data System (ADS)

    Machleidt, Ruprecht

    2007-04-01

    In 2004, the school board of Dover, Pennsylvania, ordered Intelligent Design (ID) to be included in the Biology curriculum of Dover High School. Tammy Kitzmiller and ten other parents of Dover students filed suit in Federal Court contending that the Dover ID policy constituted an establishment of religion prohibited by the First Amendment to the United States Constitution. The six-week trial in fall 2005 made national headlines. The judge ruled in favor of the plaintiffs. The Kitzmiller case cannot be understood properly without knowledge of the legal landscape and the history of court cases of similar kind. It is the purpose of this contribution to provide this background. Thus, I will review the 80+ year history of the controversy surrounding the teaching of evolution versus creation in public schools.

  7. Congress, NRC mull utility access to FBI criminal files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ultroska, D.

    1984-08-01

    Experiences at Alabama Power Company and other nuclear utilities have prompted a request for institutionalizing security checks of personnel in order to eliminate convicted criminals and drug users. The Nuclear Regulatory Commission (NRC), which could provide FBI criminal history information by submitting fingerprints, does not do so, and would require new legislation to take on that duty. Believing that current malevolent employees can be managed with existing procedures, NRC allows criminal background checks only on prospective employees in order to avoid a negative social impact on personnel. Legislation to transfer criminal histories to nuclear facilities is now pending, and NRC is leaning toward a request for full disclosure, partly because of terrorist threats and partly to save manpower time and costs in reviewing case histories.

  8. 75 FR 15949 - Revisions to Form, Procedures, and Criteria for Certification of Qualifying Facility Status for a...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-30

    ... of newer technologies that will reduce both the filing burden for applicants and the processing... technologies to reduce both the filing burden for applicants and the processing burden for the Commission. 3... both with administering the Form No. 556 and with new technologies for electronic data collection that...

  9. 75 FR 71625 - System Restoration Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-24

    ... processing software should be filed in native applications or print-to-PDF format, and not in a scanned... (2006), aff'd sub nom. Alcoa, Inc. v. FERC, 564 F.3d 1342 (D.C. Cir. 2009). 6. On March 16, 2007, the... electronically using word processing software should be filed in native applications or print-to-PDF format, and...

  10. 75 FR 70224 - New York Tidal Energy Company; Notice Concluding Pre-Filing Process and Approving Process Plan...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-17

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 12665-003] New York Tidal... Tidal Energy Company. e. Name of Project: East River Tidal Energy Pilot Project. f. Location: In the.... Filed Pursuant to: 18 CFR 5.3 of the Commission's regulations. h. Applicant Contact: Daniel Power...

  11. 17 CFR 240.17Ad-2 - Turnaround, processing, and forwarding of items.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 17 Commodity and Securities Exchanges 3 2011-04-01 2011-04-01 false Turnaround, processing, and forwarding of items. 240.17Ad-2 Section 240.17Ad-2 Commodity and Securities Exchanges SECURITIES AND EXCHANGE... be filed with the Comptroller of the Currency shall be filed with the Office of the Comptroller of...

  12. 17 CFR 240.17Ad-2 - Turnaround, processing, and forwarding of items.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 17 Commodity and Securities Exchanges 3 2010-04-01 2010-04-01 false Turnaround, processing, and forwarding of items. 240.17Ad-2 Section 240.17Ad-2 Commodity and Securities Exchanges SECURITIES AND EXCHANGE... be filed with the Comptroller of the Currency shall be filed with the Office of the Comptroller of...

  13. 18 CFR 157.21 - Pre-filing procedures and review process for LNG terminal facilities and other natural gas...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... and review process for LNG terminal facilities and other natural gas facilities prior to filing of... COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER NATURAL GAS ACT APPLICATIONS FOR CERTIFICATES OF PUBLIC CONVENIENCE AND NECESSITY AND FOR ORDERS PERMITTING AND APPROVING ABANDONMENT UNDER SECTION 7 OF THE NATURAL...

  14. 29 CFR 1640.9 - Processing of complaints or charges of employment discrimination filed with a designated agency...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... discrimination filed with a designated agency and either a section 504 agency, the EEOC, or both. 1640.9 Section 1640.9 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION... and either a section 504 agency, the EEOC, or both. (a) Designated agency processing. A designated...

  15. 29 CFR 1640.9 - Processing of complaints or charges of employment discrimination filed with a designated agency...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... discrimination filed with a designated agency and either a section 504 agency, the EEOC, or both. 1640.9 Section 1640.9 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION... and either a section 504 agency, the EEOC, or both. (a) Designated agency processing. A designated...

  16. 29 CFR 1640.9 - Processing of complaints or charges of employment discrimination filed with a designated agency...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... discrimination filed with a designated agency and either a section 504 agency, the EEOC, or both. 1640.9 Section 1640.9 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION... and either a section 504 agency, the EEOC, or both. (a) Designated agency processing. A designated...

  17. 29 CFR 1640.9 - Processing of complaints or charges of employment discrimination filed with a designated agency...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... discrimination filed with a designated agency and either a section 504 agency, the EEOC, or both. 1640.9 Section 1640.9 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION... and either a section 504 agency, the EEOC, or both. (a) Designated agency processing. A designated...

  18. Converting CSV Files to RKSML Files

    NASA Technical Reports Server (NTRS)

    Trebi-Ollennu, Ashitey; Liebersbach, Robert

    2009-01-01

    A computer program converts, into a format suitable for processing on Earth, files of downlinked telemetric data pertaining to the operation of the Instrument Deployment Device (IDD), which is a robot arm on either of the Mars Exploration Rovers (MERs). The raw downlinked data files are in comma-separated-value (CSV) format. The present program converts the files into Rover Kinematics State Markup Language (RKSML), which is an Extensible Markup Language (XML) format that facilitates representation of operations of the IDD and enables analysis of the operations by means of the Rover Sequencing Validation Program (RSVP), which is used to build sequences of commanded operations for the MERs. After conversion by means of the present program, the downlinked data can be processed by RSVP, enabling the MER downlink operations team to play back the actual IDD activity represented by the telemetric data against the planned IDD activity. Thus, the present program enhances the diagnosis of anomalies that manifest themselves as differences between actual and planned IDD activities.
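
    A generic sketch of this kind of CSV-to-XML conversion is shown below; the RKSML element and attribute names used here are hypothetical, since the real schema is defined by the MER ground tools and is not reproduced in the abstract.

```python
# Sketch: convert a CSV telemetry file into a simple XML document.
# Assumes the CSV header row names the fields and that those names are
# valid XML element names; "RoverKinematicsStates"/"State" are hypothetical.
import csv
import xml.etree.ElementTree as ET

def csv_to_xml(csv_path, xml_path):
    root = ET.Element("RoverKinematicsStates")
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            state = ET.SubElement(root, "State", time=row.pop("time", ""))
            for name, value in row.items():
                ET.SubElement(state, name).text = value
    ET.ElementTree(root).write(xml_path, xml_declaration=True, encoding="utf-8")

# csv_to_xml("idd_telemetry.csv", "idd_telemetry.rksml")
```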

  19. Improving the Taiwan Military’s Disaster Relief Response to Typhoons

    DTIC Science & Technology

    2015-06-01

    circulation, are mostly westbound. When they reach the vicinity of Taiwan or the Philippines, which are always at the edge of the Pacific subtropical high...files from the POM base case model, one set for each design point. To automate the process of running all the GAMS files, a Windows batch file (BAT)...is used to call on GAMS to solve each version of the model. The BAT file creates a new directory for each run to hold output, and one of the outputs
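
    The report describes a Windows BAT file that runs GAMS once per design point, writing each run's output to its own directory. An equivalent sketch in Python is shown below; the model file name, the design-point labels, and the bare "gams" invocation are assumptions about the local setup.

```python
# Sketch: run one GAMS solve per design point, each in its own directory.
import os
import shutil
import subprocess

def run_design_points(model="base_case.gms", design_points=("dp1", "dp2", "dp3")):
    for dp in design_points:
        run_dir = os.path.join("runs", dp)
        os.makedirs(run_dir, exist_ok=True)
        shutil.copy(model, run_dir)                  # each run gets its own copy of the model
        subprocess.run(["gams", model], cwd=run_dir, check=True)   # outputs stay in run_dir
```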

  20. [DNAStat, version 1.2 -- a software package for processing genetic profile databases and biostatistical calculations].

    PubMed

    Berent, Jarosław

    2007-01-01

    This paper presents the new DNAStat version 1.2 for processing genetic profile databases and biostatistical calculations. This new version contains, in addition to all the options of its predecessor (version 1.0), a calculation-results file export option in .xls format for Microsoft Office Excel, as well as the option of importing/exporting the population base of systems as .txt files for processing in Microsoft Notepad or EditPad.

  1. History by history statistical estimators in the BEAM code system.

    PubMed

    Walters, B R B; Kawrakow, I; Rogers, D W O

    2002-12-01

    A history-by-history method for estimating uncertainties has been implemented in the BEAMnrc and DOSXYZnrc codes, replacing the method of statistical batches. This method groups scored quantities (e.g., dose) by primary history. When phase-space sources are used, this method groups incident particles according to the primary histories that generated them. This necessitated adding markers (negative energy) to phase-space files to indicate the first particle generated by a new primary history. The new method greatly reduces the uncertainty in the uncertainty estimate. The new method eliminates one dimension (which kept the results for each batch) from all scoring arrays, resulting in the memory requirement being decreased by a factor of 2. Correlations between particles in phase-space sources are taken into account. The only correlations with any significant impact on uncertainty are those introduced by particle recycling. Failure to account for these correlations can result in a significant underestimate of the uncertainty. The previous method of accounting for correlations due to recycling by placing all recycled particles in the same batch did work. Neither the new method nor the batch method takes into account correlations between incident particles when a phase-space source is restarted, so one must avoid restarts.
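
    The estimator itself is simple to state: contributions are grouped by primary history, and the uncertainty on the mean follows from the sum and the sum of squares of the per-history totals. The sketch below illustrates the statistics only; it is not BEAMnrc/DOSXYZnrc code.

```python
# History-by-history statistics: one accumulated total per primary history,
# then s_mean^2 = [ sum(x_i^2)/N - (sum(x_i)/N)^2 ] / (N - 1).
from math import sqrt
from itertools import groupby

def mean_and_uncertainty(scores):
    """scores: iterable of (primary_history_id, scored_value) pairs."""
    totals = [sum(v for _, v in grp)                       # one total per primary history
              for _, grp in groupby(sorted(scores), key=lambda s: s[0])]
    n = len(totals)
    mean = sum(totals) / n
    var_of_mean = (sum(t * t for t in totals) / n - mean ** 2) / (n - 1)
    return mean, sqrt(max(var_of_mean, 0.0))

print(mean_and_uncertainty([(1, 0.2), (1, 0.1), (2, 0.4), (3, 0.25), (3, 0.05)]))
```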

  2. 42 CFR 93.522 - Filing post-hearing briefs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... RESEARCH MISCONDUCT Opportunity To Contest ORI Findings of Research Misconduct and HHS Administrative Actions Hearing Process § 93.522 Filing post-hearing briefs. (a) After the hearing and under a schedule...

  3. 42 CFR 93.522 - Filing post-hearing briefs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... RESEARCH MISCONDUCT Opportunity To Contest ORI Findings of Research Misconduct and HHS Administrative Actions Hearing Process § 93.522 Filing post-hearing briefs. (a) After the hearing and under a schedule...

  4. High School and Beyond Transcripts Survey (1982). Data File User's Manual. Contractor Report.

    ERIC Educational Resources Information Center

    Jones, Calvin; And Others

    This data file user's manual documents the procedures used to collect and process high school transcripts for a large sample of the younger cohort (1980 sophomores) in the High School and Beyond survey. The manual provides the user with the technical assistance needed to use the computer file and also discusses the following: (1) sample design for…

  5. 77 FR 58828 - Alaska Energy Authority; Notice of Extension of Time To File Comments on the Proposed Study and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-24

    ... Authority; Notice of Extension of Time To File Comments on the Proposed Study and Revised Study Plan On July 16, 2012, Alaska Energy Authority (AEA) filed its proposed study plan for the Susitna-Watana Project... Process, making comments on the study plan due October 14, 2012. During the comment period, AEA finalized...

  6. 50 CFR 221.22 - How do I file a notice of intervention and response?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 50 Wildlife and Fisheries 10 2013-10-01 2013-10-01 false How do I file a notice of intervention... intervention and response? (a) General. (1) To intervene as a party to the hearing process, you must: (i) Be a license party; and (ii) File with the Office of Habitat Conservation a notice of intervention and a...

  7. 50 CFR 221.22 - How do I file a notice of intervention and response?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 50 Wildlife and Fisheries 10 2014-10-01 2014-10-01 false How do I file a notice of intervention... intervention and response? (a) General. (1) To intervene as a party to the hearing process, you must: (i) Be a license party; and (ii) File with the Office of Habitat Conservation a notice of intervention and a...

  8. 50 CFR 221.22 - How do I file a notice of intervention and response?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 50 Wildlife and Fisheries 10 2012-10-01 2012-10-01 false How do I file a notice of intervention... intervention and response? (a) General. (1) To intervene as a party to the hearing process, you must: (i) Be a license party; and (ii) File with the Office of Habitat Conservation a notice of intervention and a...

  9. A Class of Administrative Models for Maintaining Anonymity During Merge of Data Files. A Draft.

    ERIC Educational Resources Information Center

    Boruch, Robert F.

    This report examines a series of general models that represent the process of merging records from separate files when it becomes essential to inhibit identifiability of records in at least one of the files. Models are illustrated symbolically by flow diagrams, and examples of each variation are taken from the social sciences. These variations…

  10. 76 FR 47210 - Notices of Filing of Petitions for Food Additives and Color Additives; Relocation in the Federal...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-04

    ...] Notices of Filing of Petitions for Food Additives and Color Additives; Relocation in the Federal Register...) is notifying the public that notices of filing of petitions for food additives and color additives... additive petition approval process for food additives for use in human and animal food. Section 409(b)(5...

  11. Battery Data MI Importer Template Quick Start Guide

    NASA Technical Reports Server (NTRS)

    Levinson, Laurie H.

    2017-01-01

    In order to ensure the persistent availability and reliability of test data generated over the course of the project, the M-SHELLS Project has decided to store acquired test data, as well as associated pedigree information, in the Granta Materials Intelligence (MI) database. To facilitate that effort, an importer template and associated graphical user interface (GUI) software have been developed, with this guide providing the operating instructions for their use. The template and automation software GUI are contained in the BatteryDataImporter.xlsm Excel workbook, and are to be used to import M-SHELLS summary, or pedigree, data and the associated raw test data results into an importer template-based file, formatted in such a way as to be ready for immediate upload to the Test Data: Battery Performance table of the Granta MI database. The provided GUI enables the user to select the appropriate summary data file(s), with each file containing the required information to identify any associated raw test data file(s) to be processed. In addition to describing the setup and operation of the importer template and GUI software, this guide also provides instructions for uploading processed data to the database and for viewing the data following upload.

  12. Informatics in radiology (infoRAD): free DICOM image viewing and processing software for the Macintosh computer: what's available and what it can do for you.

    PubMed

    Escott, Edward J; Rubinstein, David

    2004-01-01

    It is often necessary for radiologists to use digital images in presentations and conferences. Most imaging modalities produce images in the Digital Imaging and Communications in Medicine (DICOM) format. The image files tend to be large and thus cannot be directly imported into most presentation software, such as Microsoft PowerPoint; the large files also consume storage space. There are many free programs that allow viewing and processing of these files on a personal computer, including conversion to more common file formats such as the Joint Photographic Experts Group (JPEG) format. Free DICOM image viewing and processing software for computers running on the Microsoft Windows operating system has already been evaluated. However, many people use the Macintosh (Apple Computer) platform, and a number of programs are available for these users. The World Wide Web was searched for free DICOM image viewing or processing software that was designed for the Macintosh platform or is written in Java and is therefore platform independent. The features of these programs and their usability were evaluated. There are many free programs for the Macintosh platform that enable viewing and processing of DICOM images. (c) RSNA, 2004.
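
    As a point of comparison with the reviewed programs, the same basic task of getting a DICOM image into a presentation-friendly format can also be scripted. The sketch below uses Python with pydicom and Pillow, which are not among the Macintosh/Java programs evaluated in the article; it assumes a single-frame grayscale image and deliberately simplifies window/level handling.

```python
# Sketch: crude DICOM-to-JPEG conversion for a single-frame grayscale image.
import numpy as np
import pydicom
from PIL import Image

def dicom_to_jpeg(dcm_path, jpg_path):
    ds = pydicom.dcmread(dcm_path)
    pixels = ds.pixel_array.astype(np.float32)
    lo, hi = pixels.min(), pixels.max()
    scaled = ((pixels - lo) / max(hi - lo, 1e-6) * 255).astype(np.uint8)  # rescale to 8-bit
    Image.fromarray(scaled).convert("L").save(jpg_path, "JPEG")

# dicom_to_jpeg("slice0001.dcm", "slice0001.jpg")
```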

  13. Evaluation of an interactive case simulation system in dermatology and venereology for medical students

    PubMed Central

    Wahlgren, Carl-Fredrik; Edelbring, Samuel; Fors, Uno; Hindbeck, Hans; Ståhle, Mona

    2006-01-01

    Background Most of the many computer resources used in clinical teaching of dermatology and venereology for medical undergraduates are information-oriented and focus mostly on finding a "correct" multiple-choice alternative or free-text answer. We wanted to create an interactive computer program, which facilitates not only factual recall but also clinical reasoning. Methods Through continuous interaction with students, a new computerised interactive case simulation system, NUDOV, was developed. It is based on authentic cases and contains images of real patients, actors and healthcare providers. The student selects a patient and proposes questions for medical history, examines the skin, and suggests investigations, diagnosis, differential diagnoses and further management. Feedback is given by comparing the user's own suggestions with those of a specialist. In addition, a log file of the student's actions is recorded. The program includes a large number of images, video clips and Internet links. It was evaluated with a student questionnaire and by randomising medical students to conventional teaching (n = 85) or conventional teaching plus NUDOV (n = 31) and comparing the results of the two groups in a final written examination. Results The questionnaire showed that 90% of the NUDOV students stated that the program facilitated their learning to a large/very large extent, and 71% reported that extensive working with authentic computerised cases made it easier to understand and learn about diseases and their management. The layout, user-friendliness and feedback concept were judged as good/very good by 87%, 97%, and 100%, respectively. Log files revealed that the students, in general, worked with each case for 60–90 min. However, the intervention group did not score significantly better than the control group in the written examination. Conclusion We created a computerised case simulation program allowing students to manage patients in a non-linear format supporting the clinical reasoning process. The student gets feedback through comparison with a specialist, eliminating the need for external scoring or correction. The model also permits discussion of case processing, since all transactions are stored in a log file. The program was highly appreciated by the students, but did not significantly improve their performance in the written final examination. PMID:16907972

  14. GEOTHERM user guide

    USGS Publications Warehouse

    Swanson, James R.

    1977-01-01

    GEOTHERM is a computerized geothermal resources file developed by the U.S. Geological Survey. The file contains data on geothermal fields, wells, and chemical analyses from the United States and international sources. The General Information Processing System (GIPSY) in the IBM 370/155 computer is used to store and retrieve data. The GIPSY retrieval program contains simple commands which can be used to search the file, select a narrowly defined subset, sort the records, and output the data in a variety of forms. Eight commands are listed and explained so that the GEOTHERM file can be accessed directly by geologists. No programming experience is necessary to retrieve data from the file.

  15. LAS - LAND ANALYSIS SYSTEM, VERSION 5.0

    NASA Technical Reports Server (NTRS)

    Pease, P. B.

    1994-01-01

    The Land Analysis System (LAS) is an image analysis system designed to manipulate and analyze digital data in raster format and provide the user with a wide spectrum of functions and statistical tools for analysis. LAS offers these features under VMS with optional image display capabilities for IVAS and other display devices as well as the X-Windows environment. LAS provides a flexible framework for algorithm development as well as for the processing and analysis of image data. Users may choose between mouse-driven commands or the traditional command line input mode. LAS functions include supervised and unsupervised image classification, film product generation, geometric registration, image repair, radiometric correction and image statistical analysis. Data files accepted by LAS include formats such as Multi-Spectral Scanner (MSS), Thematic Mapper (TM) and Advanced Very High Resolution Radiometer (AVHRR). The enhanced geometric registration package now includes both image to image and map to map transformations. The over 200 LAS functions fall into image processing scenario categories which include: arithmetic and logical functions, data transformations, fourier transforms, geometric registration, hard copy output, image restoration, intensity transformation, multispectral and statistical analysis, file transfer, tape profiling and file management among others. Internal improvements to the LAS code have eliminated the VAX VMS dependencies and improved overall system performance. The maximum LAS image size has been increased to 20,000 lines by 20,000 samples with a maximum of 256 bands per image. The catalog management system used in earlier versions of LAS has been replaced by a more streamlined and maintenance-free method of file management. This system is not dependent on VAX/VMS and relies on file naming conventions alone to allow the use of identical LAS file names on different operating systems. While the LAS code has been improved, the original capabilities of the system have been preserved. These include maintaining associated image history, session logging, and batch, asynchronous and interactive mode of operation. The LAS application programs are integrated under version 4.1 of an interface called the Transportable Applications Executive (TAE). TAE 4.1 has four modes of user interaction: menu, direct command, tutor (or help), and dynamic tutor. In addition TAE 4.1 allows the operation of LAS functions using mouse-driven commands under the TAE-Facelift environment provided with TAE 4.1. These modes of operation allow users, from the beginner to the expert, to exercise specific application options. LAS is written in C-language and FORTRAN 77 for use with DEC VAX computers running VMS with approximately 16Mb of physical memory. This program runs under TAE 4.1. Since TAE 4.1 is not a current version of TAE, TAE 4.1 is included within the LAS distribution. Approximately 130,000 blocks (65Mb) of disk storage space are necessary to store the source code and files generated by the installation procedure for LAS and 44,000 blocks (22Mb) of disk storage space are necessary for TAE 4.1 installation. The only other dependencies for LAS are the subroutine libraries for the specific display device(s) that will be used with LAS/DMS (e.g. X-Windows and/or IVAS). The standard distribution medium for LAS is a set of two 9track 6250 BPI magnetic tapes in DEC VAX BACKUP format. It is also available on a set of two TK50 tape cartridges in DEC VAX BACKUP format. 
This program was developed in 1986 and last updated in 1992.

  16. Building a Steganography Program Including How to Load, Process, and Save JPEG and PNG Files in Java

    ERIC Educational Resources Information Center

    Courtney, Mary F.; Stix, Allen

    2006-01-01

    Instructors teaching beginning programming classes are often interested in exercises that involve processing photographs (i.e., files stored as .jpeg). They may wish to offer activities such as color inversion, the color manipulation effects archived with pixel thresholding, or steganography, all of which Stevenson et al. [4] assert are sought by…
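
    For readers unfamiliar with the technique, a classic least-significant-bit embedding is sketched below in Python/Pillow for consistency with the other examples (the article itself concerns Java). It only makes sense for lossless formats such as PNG, since JPEG recompression would destroy the hidden bits.

```python
# Sketch: hide a text message in the least-significant bits of a PNG image.
from PIL import Image

def hide_message(in_png, out_png, message):
    img = Image.open(in_png).convert("RGB")
    bits = "".join(f"{b:08b}" for b in message.encode()) + "0" * 8   # NUL terminator
    pixels = list(img.getdata())
    if len(bits) > len(pixels) * 3:
        raise ValueError("message too long for this image")
    flat = [c for px in pixels for c in px]
    for i, bit in enumerate(bits):
        flat[i] = (flat[i] & ~1) | int(bit)          # overwrite the lowest bit
    out = Image.new("RGB", img.size)
    out.putdata(list(zip(flat[0::3], flat[1::3], flat[2::3])))
    out.save(out_png, "PNG")
```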

  17. 37 CFR 2.190 - Addresses for trademark correspondence with the United States Patent and Trademark Office.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... recordation; requests for copies of trademark documents; and certain documents filed under the Madrid Protocol... to review an action of the Office's Madrid Processing Unit, when filed by mail, must be mailed to: Madrid Processing Unit, 600 Dulany Street, MDE-7B87, Alexandria, VA 22314-5793. [68 FR 48289, Aug. 13...

  18. 37 CFR 2.190 - Addresses for trademark correspondence with the United States Patent and Trademark Office.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... recordation; requests for copies of trademark documents; and certain documents filed under the Madrid Protocol... to review an action of the Office's Madrid Processing Unit, when filed by mail, must be mailed to: Madrid Processing Unit, 600 Dulany Street, MDE-7B87, Alexandria, VA 22314-5793. [68 FR 48289, Aug. 13...

  19. 37 CFR 2.190 - Addresses for trademark correspondence with the United States Patent and Trademark Office.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... recordation; requests for copies of trademark documents; and certain documents filed under the Madrid Protocol... to review an action of the Office's Madrid Processing Unit, when filed by mail, must be mailed to: Madrid Processing Unit, 600 Dulany Street, MDE-7B87, Alexandria, VA 22314-5793. [68 FR 48289, Aug. 13...

  20. 37 CFR 2.190 - Addresses for trademark correspondence with the United States Patent and Trademark Office.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... recordation; requests for copies of trademark documents; and certain documents filed under the Madrid Protocol... to review an action of the Office's Madrid Processing Unit, when filed by mail, must be mailed to: Madrid Processing Unit, 600 Dulany Street, MDE-7B87, Alexandria, VA 22314-5793. [68 FR 48289, Aug. 13...

  1. 18 CFR 5.8 - Notice of commencement of proceeding and scoping document, or of approval to use traditional...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... of proceeding and scoping document, or of approval to use traditional licensing process or... PROCESS § 5.8 Notice of commencement of proceeding and scoping document, or of approval to use traditional... required under § 5.5, filing of the pre-application document pursuant to § 5.6, and filing of any request...

  2. WFF TOPEX Software Documentation Altimeter Instrument File (AIF) Processing, October 1998. Volume 3

    NASA Technical Reports Server (NTRS)

    Lee, Jeffrey; Lockwood, Dennis

    2003-01-01

    This document is a compendium of the WFF TOPEX Software Development Team's knowledge regarding Sensor Data Record (SDR) Processing. It includes many elements of a requirements document, a software specification document, a software design document, and a user's manual. In the more technical sections, this document assumes the reader is familiar with TOPEX and instrument files.

  3. 78 FR 53452 - Revisions to Electric Quarterly Report Filing Process; Notice of Extended Availability of Sandbox...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-29

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM12-3-000] Revisions to Electric Quarterly Report Filing Process; Notice of Extended Availability of Sandbox Electronic Test Site Take notice that the opportunity to use the Sandbox Electronic Test Site (ETS) has been extended until September 15, 2013. The ETS including a web...

  4. Regional seismic lines reprocessed using post-stack processing techniques; National Petroleum Reserve, Alaska

    USGS Publications Warehouse

    Miller, John J.; Agena, W.F.; Lee, M.W.; Zihlman, F.N.; Grow, J.A.; Taylor, D.J.; Killgore, Michele; Oliver, H.L.

    2000-01-01

This CD-ROM contains stacked, migrated, 2-Dimensional seismic reflection data and associated support information for 22 regional seismic lines (3,470 line-miles) recorded in the National Petroleum Reserve-Alaska (NPRA) from 1974 through 1981. Together, these lines constitute about one-quarter of the seismic data collected as part of the Federal Government's program to evaluate the petroleum potential of the Reserve. The regional lines, which form a grid covering the entire NPRA, were created by combining various individual lines recorded in different years using different recording parameters. These data were reprocessed by the USGS using modern, post-stack processing techniques, to create a data set suitable for interpretation on interactive seismic interpretation computer workstations. Reprocessing was done in support of ongoing petroleum resource studies by the USGS Energy Program. The CD-ROM contains the following files: 1) 22 files containing the digital seismic data in standard SEG-Y format; 2) 1 file containing navigation data for the 22 lines in standard SEG-P1 format; 3) 22 small-scale graphic images of each seismic line in Adobe Acrobat PDF format; 4) a graphic image of the location map, generated from the navigation file, with hyperlinks to the graphic images of the seismic lines; 5) an ASCII text file with cross-reference information for relating the sequential trace numbers on each regional line to the line number and shotpoint number of the original component lines; and 6) an explanation of the processing used to create the final seismic sections (this document). The SEG-Y format seismic files and SEG-P1 format navigation file contain all the information necessary for loading the data onto a seismic interpretation workstation.

  5. Navy Occupational Health Information Management System (NOHIMS). Environmental Exposure Module. Users’ Manual

    DTIC Science & Technology

    1987-01-16

... menus, controls user and device access to the system, manages the security features associated with menus, devices, and users, provides ... in the files, or the number of files in the system. ... 3.0 MODULE INPUT PROCESSES, 3.1 Summary of Input Processes: The EE module contains many menu ... Output Processes: The EE module contains many menu options which enable the user to obtain needed information from the module. These options can be ...

  6. WEB Services Networks and Technological Hybrids — The Integration Challenges of WAN Distributed Computing for ASP Providers

    NASA Astrophysics Data System (ADS)

    Mroczkiewicz, Pawel

The need to integrate the information systems and office software used within organizations has a long history. Solutions of this kind date back to the older generation of network protocols known as EDI (Electronic Data Interchange) and the EDIFACT standard, which was initiated in 1988 and has evolved dynamically ever since (S. Michalski, M. Suskiewicz, 1995). EDI was usually used to convert documents into the native formats processed by applications. This caused problems with binary files and, furthermore, the communication mechanisms had to be modified each time new documents or applications were added. Compared with previously used communication mechanisms, EDI was nevertheless a great step forward: it was the first large-scale attempt to define standards for data interchange between applications in business transactions (V. Leyland, 1995, p. 47).

  7. 74. PHOTOCOPY OF PANORAMA 'B' DEPICTING REGRADING OPERATIONS ON EAST ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    74. PHOTOCOPY OF PANORAMA 'B' DEPICTING REGRADING OPERATIONS ON EAST BANK OF CREEK BETWEEN M AND P STREETS, FROM 1940 REPORT ON PROPOSED DEVELOPMENT OF ROCK CREEK AND POTOMAC PARKWAY, SECTION II (ROCK CREEK AND POTOMAC PARKWAY FILE, HISTORY DEPARTMENT ARCHIVES, NATIONAL PARK SERVICE, WASHINGTON, DC); NUMBER 1 OF 5. - Rock Creek & Potomac Parkway, Washington, District of Columbia, DC

  8. 70. PHOTOCOPY OF PANORAMA 'A' DEPICTING REGRADING OPERATIONS ON EAST ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    70. PHOTOCOPY OF PANORAMA 'A' DEPICTING REGRADING OPERATIONS ON EAST BANK OF CREEK BETWEEN M AND P STREETS, FROM 1940 REPORT ON PROPOSED DEVELOPMENT OF ROCK CREEK AND POTOMAC PARKWAY, SECTION II (ROCK CREEK AND POTOMAC PARKWAY FILE, HISTORY DEPARTMENT ARCHIVES, NATIONAL PARK SERVICE, WASHINGTON, DC); NUMBER 1 OF 4. - Rock Creek & Potomac Parkway, Washington, District of Columbia, DC

  9. Design Foundations for Content-Rich Acoustic Interfaces: Investigating Audemes as Referential Non-Speech Audio Cues

    ERIC Educational Resources Information Center

    Ferati, Mexhid Adem

    2012-01-01

    To access interactive systems, blind and visually impaired users can leverage their auditory senses by using non-speech sounds. The current structure of non-speech sounds, however, is geared toward conveying user interface operations (e.g., opening a file) rather than large theme-based information (e.g., a history passage) and, thus, is ill-suited…

  10. 21 CFR 801.45 - Devices that must be directly marked with a unique device identifier.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Devices that must be directly marked with a unique device identifier. 801.45 Section 801.45 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... previously marked under paragraph (a) of this section. (e) Exception to be noted in design history file. A...

  11. 25 CFR 15.202 - What items must the agency include in the probate file?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... § 15.104. (b) A completed “Data for Heirship Findings and Family History Form” or successor form...) Originals or copies of all wills, codicils, and revocations that have been provided to us. (i) A copy of any...) Any statement renouncing an interest in the estate that has been submitted to us, and the information...

  12. 78 FR 1894 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-09

    ... whole months [sic]). In the case of spin-offs, the operating history of the spin-off will be considered... component price per share, (a) the highest price per share of a component was $661.15 (Google, Inc.), (b... top five highest weighted components was 40.78% (Apple Inc., Microsoft Corporation, Google Inc...

  13. NREL Staff Honored for Innovative Thinking, Accomplishments | News | NREL

    Science.gov Websites

    Awards ceremony last month. The event occurred as NREL is on pace to see more records of invention and software records submitted this year than at any time in the lab's history. Since fiscal 2018 began in October, 111 records have been filed compared to 168 during all fiscal 2017, which is about 30 percent

  14. 26 CFR 1.882-4 - Allowance of deductions and credits to foreign corporations.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... income. In Year 1, FC, a technology company, opened an office in the United States to market and sell a..., a technology company, opened an office in the United States to market and sell a software program.... Foreign corporation with prior filing history. FC began a U.S. trade or business in Year 1. FC's tax...

  15. Sandia MEMS Visualization Tools v. 3.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yarberry, Victor; Jorgensen, Craig R.; Young, Andrew I.

This is a revision to the Sandia MEMS Visualization Tools. It replaces all previous versions. New features in this version: support for AutoCAD 2014 and 2015. This CD contains an integrated set of electronic files that: a) provides a 2D Process Visualizer that generates cross-section images of devices constructed using the SUMMiT V fabrication process; b) provides a 3D Visualizer that generates 3D images of devices constructed using the SUMMiT V fabrication process; and c) provides a MEMS 3D Model generator that creates 3D solid models of devices constructed using the SUMMiT V fabrication process. While there are some files on the CD that are used in conjunction with the AutoCAD software package, these files are not intended for use independent of the CD. Note that the customer must purchase his/her own copy of AutoCAD to use with these files.

  16. Orbital and attitude evolution of SCD-1 and SCD-2 Brazilian satellites

    NASA Astrophysics Data System (ADS)

    Murcia, J. O.; Carrara, V.; Kuga, H. K.

    2017-10-01

The SCD-1 and SCD-2 satellites were launched in 1993 and 1998, respectively, using the Pegasus launcher of OSC (Orbital Sciences Corporation). 21 and 16 years later, the satellites are still in orbit around the Earth and providing data for users. Mission and operational data from the Satellite Tracking Center Network are stored in mission files in the Satellite Control Center (SCC) and made available to the users. The SCC also stores history files of the satellite orbit and attitude ephemeris, besides the on-board telemetry, temperatures, equipment status, etc. This work presents an analysis of the orbit ephemeris evolution based upon the Two-Line Element sets (TLEs) obtained from NORAD (North American Aerospace Defense Command). Attitude evolution over time is also presented for both satellites from SCC data. The orbital decay is explained as resulting mainly from solar activity during the satellites' lifetimes. This work aims to report the history of more than 20 years of continuous operation of SCD-1 and SCD-2. At the end, the orbital decay is forecast with the use of NASA's DAS software.
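
    The decay analysis described in this record rests on the standard NORAD two-line element format, in which the mean motion field on line 2 can be converted to a semi-major axis; tracking that value across successive TLEs exposes the orbital decay. The Python sketch below is a minimal illustration of that conversion, assuming the usual TLE column layout; it is not the analysis code used by the SCC or the authors.

      import math

      MU_EARTH = 398600.4418  # km^3/s^2, Earth's standard gravitational parameter

      def semi_major_axis_from_tle_line2(line2: str) -> float:
          """Derive the semi-major axis (km) from the mean motion on TLE line 2.

          In the standard NORAD two-line element format, the mean motion
          (revolutions per day) occupies columns 53-63 of line 2.
          """
          mean_motion_rev_per_day = float(line2[52:63])
          n = mean_motion_rev_per_day * 2.0 * math.pi / 86400.0  # rad/s
          return (MU_EARTH / n ** 2) ** (1.0 / 3.0)

      # Applying this to a time-ordered series of TLEs for SCD-1 or SCD-2 and
      # plotting the result shows the slow decrease of the semi-major axis.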

  17. Indicators of Dysphagia in Aged Care Facilities.

    PubMed

    Pu, Dai; Murry, Thomas; Wong, May C M; Yiu, Edwin M L; Chan, Karen M K

    2017-09-18

The current cross-sectional study aimed to investigate risk factors for dysphagia in elderly individuals in aged care facilities. A total of 878 individuals from 42 aged care facilities were recruited for this study. The dependent outcome was speech therapist-determined swallowing function. Independent factors were Eating Assessment Tool score, oral motor assessment score, Mini-Mental State Examination, medical history, and various functional status ratings. Binomial logistic regression was used to identify independent variables associated with dysphagia in this cohort. Two statistical models were constructed. Model 1 used variables from case files without the need for hands-on assessment, and Model 2 used variables that could be obtained from hands-on assessment. Variables positively associated with dysphagia identified in Model 1 were male gender, total dependence for activities of daily living, need for feeding assistance, limited mobility (requiring assistance to walk or use of a wheelchair), and history of pneumonia. Variables positively associated with dysphagia identified in Model 2 were Mini-Mental State Examination score, edentulousness, and oral motor assessment score. Cognitive function, dentition, and oral motor function are significant indicators associated with the presence of swallowing difficulties in the elderly. When assessing the frail elderly, case file information can help clinicians identify frail elderly individuals who may be suffering from dysphagia.

  18. Photogrammetry Impression Technique: A Case History Report.

    PubMed

    Sánchez-Monescillo, Andrés; Sánchez-Turrión, Andrés; Vellon-Domarco, Elena; Salinas-Goodier, Carmen; Prados-Frutos, Juan Carlos

    2016-01-01

    The aim of this report is to present photogrammetry as a reliable step in the fabrication of a full-arch immediate rehabilitation. A 59-year-old man attended the department seeking dental rehabilitation for the sequelae of severe oral health neglect. The mandibular teeth suffered from advanced periodontal disease and the patient wore a maxillary complete denture. An irreversible hydrocolloid impression of the mandibular arch was made, poured in stone, and digitally scanned to create the first stereolithography (STL) file. All teeth with the exception of two retained as landmarks were extracted, and seven implants were placed under local anesthesia and their positions recorded using photogrammetry. Maxillary and mandibular dental arch alginate impressions were made, poured in laboratory stone, and scanned. A provisional restoration was placed 7 hours after surgery using the STL files to determine the best-fit line. Radiographic and clinical follow-up after 1 year showed a favorable evolution of the implants. No screw loosening or other mechanical or biologic complications were observed. The case history using the described system suggests certain advantages over conventional techniques. More research is needed to assess the possible benefits associated with photogrammetry when making implant-supported restorations.

  19. [The early history of "Ecstasy"].

    PubMed

    Benzenhöfer, U; Passie, T

    2006-01-01

There is no consensus in the literature regarding the early history of MDMA (methylenedioxymethamphetamine, so-called "Ecstasy"). Various authors credit the first synthesis of MDMA to the German chemist Fritz Haber, but it appears neither in his doctoral thesis (Berlin 1891) nor in his accompanying articles. The man who first synthesized MDMA was the chemist Dr. Anton Köllisch, who worked for the German pharmaceutical company Merck. He created MDMA as a by-product while trying to synthesize hydrastinine, a styptic substance. In 1912, Merck filed to patent the applied method of preparation. The patent was issued in 1914, yet no pharmaceutical testing followed at that time.

  20. Computational provenance in hydrologic science: a snow mapping example.

    PubMed

    Dozier, Jeff; Frew, James

    2009-03-13

    Computational provenance--a record of the antecedents and processing history of digital information--is key to properly documenting computer-based scientific research. To support investigations in hydrologic science, we produce the daily fractional snow-covered area from NASA's moderate-resolution imaging spectroradiometer (MODIS). From the MODIS reflectance data in seven wavelengths, we estimate the fraction of each 500 m pixel that snow covers. The daily products have data gaps and errors because of cloud cover and sensor viewing geometry, so we interpolate and smooth to produce our best estimate of the daily snow cover. To manage the data, we have developed the Earth System Science Server (ES3), a software environment for data-intensive Earth science, with unique capabilities for automatically and transparently capturing and managing the provenance of arbitrary computations. Transparent acquisition avoids the scientists having to express their computations in specific languages or schemas in order for provenance to be acquired and maintained. ES3 models provenance as relationships between processes and their input and output files. It is particularly suited to capturing the provenance of an evolving algorithm whose components span multiple languages and execution environments.
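
    ES3's model of provenance as relationships between processes and their input and output files can be pictured with a small lineage structure. The Python sketch below is a hypothetical illustration of that idea only; the function names (record_process, lineage) and the example MODIS-style file names are invented here and are not part of ES3.

      from collections import defaultdict

      # Provenance modeled as edges between processes and files:
      # each process records the files it read and the files it wrote.
      reads = defaultdict(set)     # process -> input files
      writes = defaultdict(set)    # process -> output files
      produced_by = {}             # file -> process that wrote it

      def record_process(name, inputs, outputs):
          """Register one processing step and its file relationships."""
          reads[name].update(inputs)
          writes[name].update(outputs)
          for f in outputs:
              produced_by[f] = name

      def lineage(path):
          """Walk backwards from a file to the processes and files it came from."""
          history, frontier, seen = [], [path], set()
          while frontier:
              f = frontier.pop()
              proc = produced_by.get(f)
              if proc is None or proc in seen:
                  continue
              seen.add(proc)
              history.append((proc, sorted(reads[proc])))
              frontier.extend(reads[proc])
          return history

      # Hypothetical example: a smoothed daily snow-cover product derived from reflectance.
      record_process("estimate_fsca", ["MOD09GA_2008123.hdf"], ["fsca_2008123.tif"])
      record_process("smooth_interp", ["fsca_2008123.tif"], ["fsca_2008123_smoothed.tif"])
      print(lineage("fsca_2008123_smoothed.tif"))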

  1. Documented historical landslide dams from around the world

    USGS Publications Warehouse

    Costa, John E.; Schuster, Robert L.

    1991-01-01

This data compilation consists of dBase IV data files of the location, date, triggering mechanism, kind, size, failure time and mechanism, breach dimensions, subsequent controls, materials, and references for 463 historical landslide dams and associated natural reservoirs that have been recorded throughout the World. The data base presented in this report is a compilation of information on the characteristics of 463 landslide dams from around the World. It forms a basis on which to assess potential threats from existing landslide dams, or newly-formed landslide dams. The data base includes only landslide dams that have formed in historical times - that is, those formed during times when humans were able to record their occurrence, and the information transferred through various means of written and/or oral documentation. There have been far more prehistoric landslide dams about which relatively little is known. None of these is included in this data base. The focus on historical landslide dams allows insights into this natural process that will aid in understanding their role as a significant geologic process in recent Earth history.

  2. Python Processing and Version Control using VisTrails for the Netherlands Hydrological Instrument (Invited)

    NASA Astrophysics Data System (ADS)

    Verkaik, J.

    2013-12-01

The Netherlands Hydrological Instrument (NHI) model predicts water demands in periods of drought, supporting the Dutch decision makers in taking operational as well as long-term decisions with respect to the water supply. Other applications of NHI are predicting fresh-salt interaction, nutrient loadings, and agricultural change. The NHI model consists of several coupled models: a saturated groundwater model (MODFLOW), an unsaturated groundwater model (MetaSWAP), a sub-catchment surface water model (MOZART), and a distribution network of surface waters model (DM/SOBEK). Each of these models requires specific, usually large, input data that may be the result of sophisticated schematization workflows. Input data can also depend on each other; for example, the precipitation data is input for the unsaturated zone model (cells) as well as for the surface water models (polygons). For efficient data management, we developed several Python tools such that the modeler or stakeholder can use the model in a user-friendly manner, and data is managed in a consistent, transparent and reproducible way. Two open source Python tools are presented here: the data version control module for the workflow manager VisTrails called FileSync, and the NHI model control script that uses FileSync. VisTrails is an open-source scientific workflow and provenance management system that provides support for simulations, data exploration and visualization. Since VisTrails does not directly support version control, we developed a version control module called FileSync. With this generic module, the user can synchronize data from and to his workflow through a dialog window. The FileSync dialog calls the FileSync script that is command-line based and performs the actual data synchronization. This script allows the user to easily create a model repository, upload and download data, create releases and define scenarios. The data synchronization approach applied here differs from systems such as Subversion or Git, since these systems do not perform well for large (binary) model data files. For this reason, a new concept of parameterization and data splitting has been implemented. Each file, or set of files, is uniquely labeled as a parameter, and for this parameter metadata is maintained by Subversion. The metadata contains file hashes to identify the data content and the FTP location where the actual bulk data are stored. The NHI model control script is a command-line driven Python script for pre-processing, running, and post-processing the NHI model and uses one single configuration file for all computational kernels. This configuration file is an easy-to-use, keyword-driven, Windows INI-file, having separate sections for all the kernels. It also includes a FileSync data section where the user can specify version-controlled model data to be used as input. The NHI control script keeps all the data consistent during the pre-processing. Furthermore, this script is able to do model state handling when the NHI model is used for ensemble forecasting.
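
    The two ideas in this record, a keyword-driven INI configuration with one section per kernel and hash-based parameter metadata pointing at bulk data on FTP, can be sketched with the Python standard library alone. The section names, keywords, and metadata layout below are assumptions made for illustration and are not the actual NHI or FileSync formats.

      import configparser
      import hashlib
      from pathlib import Path

      def file_hash(path: Path) -> str:
          """Content hash used to identify one version of a bulk data file."""
          h = hashlib.sha256()
          with path.open("rb") as f:
              for chunk in iter(lambda: f.read(1 << 20), b""):
                  h.update(chunk)
          return h.hexdigest()

      def register_parameter(meta, name, files, ftp_url):
          """Record a 'parameter' (a labeled set of files) as lightweight metadata:
          file hashes plus the FTP location where the bulk data live."""
          meta[name] = {
              "ftp_location": ftp_url,
              "files": ",".join(p.name for p in files),
              "hashes": ",".join(file_hash(p) for p in files),
          }

      # Hypothetical NHI-style run configuration: one section per kernel plus a
      # FileSync section naming the version-controlled input parameters.
      config = configparser.ConfigParser()
      config.read_string("""
      [MODFLOW]
      timestep_days = 1
      [MetaSWAP]
      soil_map = soil_units.idf
      [FileSync]
      parameters = precipitation_2013, soil_units
      """)
      print(config["FileSync"]["parameters"])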

  3. National Household Education Surveys Program of 2012: Data File User's Manual. Parent and Family Involvement in Education Survey. Early Childhood Program Participation Survey. NCES 2015-030

    ERIC Educational Resources Information Center

    McPhee, C.; Bielick, S.; Masterton, M.; Flores, L.; Parmer, R.; Amchin, S.; Stern, S.; McGowan, H.

    2015-01-01

    The 2012 National Household Education Surveys Program (NHES:2012) Data File User's Manual provides documentation and guidance for users of the NHES:2012 data files. The manual provides information about the purpose of the study, the sample design, data collection procedures, data processing procedures, response rates, imputation, weighting and…

  4. 7 CFR 1.622 - How do I file a notice of intervention and response?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 1 2011-01-01 2011-01-01 false How do I file a notice of intervention and response? 1... intervention and response? (a) General. (1) To intervene as a party to the hearing process, you must: (i) Be a license party; and (ii) File with NFS a notice of intervention and a written response to any request for a...

  5. 43 CFR 45.22 - How do I file a notice of intervention and response?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 43 Public Lands: Interior 1 2014-10-01 2014-10-01 false How do I file a notice of intervention and... notice of intervention and response? (a) General. (1) To intervene as a party to the hearing process, you must: (i) Be a license party; and (ii) File with OEPC a notice of intervention and a written response...

  6. 43 CFR 45.22 - How do I file a notice of intervention and response?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 1 2013-10-01 2013-10-01 false How do I file a notice of intervention and... notice of intervention and response? (a) General. (1) To intervene as a party to the hearing process, you must: (i) Be a license party; and (ii) File with OEPC a notice of intervention and a written response...

  7. 43 CFR 45.22 - How do I file a notice of intervention and response?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 1 2012-10-01 2011-10-01 true How do I file a notice of intervention and... notice of intervention and response? (a) General. (1) To intervene as a party to the hearing process, you must: (i) Be a license party; and (ii) File with OEPC a notice of intervention and a written response...

  8. 7 CFR 1.622 - How do I file a notice of intervention and response?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 1 2014-01-01 2014-01-01 false How do I file a notice of intervention and response? 1... intervention and response? (a) General. (1) To intervene as a party to the hearing process, you must: (i) Be a license party; and (ii) File with NFS a notice of intervention and a written response to any request for a...

  9. 7 CFR 1.622 - How do I file a notice of intervention and response?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 1 2010-01-01 2010-01-01 false How do I file a notice of intervention and response? 1... intervention and response? (a) General. (1) To intervene as a party to the hearing process, you must: (i) Be a license party; and (ii) File with NFS a notice of intervention and a written response to any request for a...

  10. 7 CFR 1.622 - How do I file a notice of intervention and response?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 1 2013-01-01 2013-01-01 false How do I file a notice of intervention and response? 1... intervention and response? (a) General. (1) To intervene as a party to the hearing process, you must: (i) Be a license party; and (ii) File with NFS a notice of intervention and a written response to any request for a...

  11. 43 CFR 45.22 - How do I file a notice of intervention and response?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 43 Public Lands: Interior 1 2011-10-01 2011-10-01 false How do I file a notice of intervention and... notice of intervention and response? (a) General. (1) To intervene as a party to the hearing process, you must: (i) Be a license party; and (ii) File with OEPC a notice of intervention and a written response...

  12. 7 CFR 1.622 - How do I file a notice of intervention and response?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 1 2012-01-01 2012-01-01 false How do I file a notice of intervention and response? 1... intervention and response? (a) General. (1) To intervene as a party to the hearing process, you must: (i) Be a license party; and (ii) File with NFS a notice of intervention and a written response to any request for a...

  13. 43 CFR 45.22 - How do I file a notice of intervention and response?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 43 Public Lands: Interior 1 2010-10-01 2010-10-01 false How do I file a notice of intervention and... notice of intervention and response? (a) General. (1) To intervene as a party to the hearing process, you must: (i) Be a license party; and (ii) File with OEPC a notice of intervention and a written response...

  14. [Master files: less paper, more substance. Special rules for special medicines: Plasma Master File and Vaccine Antigen Master File].

    PubMed

    Seitz, Rainer; Haase, M

    2008-07-01

The process of reviewing the European pharmaceutical legislation resulted in a codex, which contains two new instruments related to marketing authorisation of biological medicines: Plasma Master File (PMF) and Vaccine Antigen Master File (VAMF). In the manufacture of plasma derivatives (e.g., coagulation factors, albumin, immunoglobulins), usually the same starting material, i.e., a plasma pool, is used for several products. In the case of vaccines, the same active substance, i.e., vaccine antigen, may be included in several combination vaccine products. The intention behind the introduction of PMF and VAMF was to avoid unnecessary and redundant documentation, and to improve and harmonise assessment by means of procedures for certification of master files on the community level.

  15. Converting from DDOR SASF to APF

    NASA Technical Reports Server (NTRS)

    Gladden, Roy E.; Khanampompan, Teerapat; Fisher, Forest W.

    2008-01-01

A computer program called ddor_sasf2apf converts a delta-DOR (delta differential one-way range) request from an SASF (spacecraft activity sequence file) format to an APF (APGEN plan file) format for use in the Mars Reconnaissance Orbiter (MRO) mission-planning-and-sequencing process. The APF is used as an input to APGEN/AUTOGEN in the MRO activity-planning and command-sequence-generating process to sequence the delta-DOR (DDOR) activity. The DDOR activity is a spacecraft tracking technique for determining spacecraft location. The input to ddor_sasf2apf is a request SASF provided by an observation team that utilizes DDOR. ddor_sasf2apf parses this DDOR SASF input, rearranging parameters and reformatting the request to produce an APF file for use in AUTOGEN and/or APGEN. The benefit afforded by ddor_sasf2apf is to enable the use of the DDOR SASF file earlier in the planning stage of the command-sequence-generating process and to produce sequences, optimized for DDOR operations, that are more accurate and more robust than would otherwise be possible.

  16. Block Architecture Problem with Depth First Search Solution and Its Application

    NASA Astrophysics Data System (ADS)

    Rahim, Robbi; Abdullah, Dahlan; Simarmata, Janner; Pranolo, Andri; Saleh Ahmar, Ansari; Hidayat, Rahmat; Napitupulu, Darmawan; Nurdiyanto, Heri; Febriadi, Bayu; Zamzami, Z.

    2018-01-01

Searching is a common task performed by many computer users, and the Raita algorithm is one algorithm that can be used to match and find information according to the patterns entered. The Raita algorithm was applied to a file search application written in the Java programming language; testing showed that the resulting file search is fast, returns accurate results, and supports many data types.
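
    The application in this record is written in Java, but the matching idea is language-independent: Raita's variant of Boyer-Moore-Horspool checks the last, then the first, then the middle character of the pattern at each alignment before comparing the remaining characters, and shifts using the Horspool bad-character rule. The following is a minimal Python sketch of that search, not the authors' code.

      def raita_search(text, pattern):
          """Return all start indices where pattern occurs in text (Raita algorithm)."""
          m, n = len(pattern), len(text)
          if m == 0 or m > n:
              return []
          # Horspool-style bad-character shift table built from all but the last character.
          shift = {c: m - i - 1 for i, c in enumerate(pattern[:-1])}
          first, middle, last = pattern[0], pattern[m // 2], pattern[-1]
          hits, i = [], 0
          while i <= n - m:
              c = text[i + m - 1]
              # Quick checks on last, first, and middle characters before the full comparison.
              if (c == last and text[i] == first and text[i + m // 2] == middle
                      and text[i + 1:i + m - 1] == pattern[1:-1]):
                  hits.append(i)
              i += shift.get(c, m)
          return hits

      # Example: find a pattern inside a line of a text file.
      print(raita_search("history file processing of history files", "history"))  # [0, 34]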

  17. Cloud Engineering Principles and Technology Enablers for Medical Image Processing-as-a-Service

    PubMed Central

    Bao, Shunxing; Plassard, Andrew J.; Landman, Bennett A.; Gokhale, Aniruddha

    2017-01-01

Traditional in-house, laboratory-based medical imaging studies use hierarchical data structures (e.g., NFS file stores) or databases (e.g., COINS, XNAT) for storage and retrieval. The resulting performance from these approaches is, however, impeded by standard network switches since they can saturate network bandwidth during transfer from storage to processing nodes for even moderate-sized studies. To that end, a cloud-based “medical image processing-as-a-service” offers promise in utilizing the ecosystem of Apache Hadoop, which is a flexible framework providing distributed, scalable, fault tolerant storage and parallel computational modules, and HBase, which is a NoSQL database built atop Hadoop’s distributed file system. Despite this promise, HBase’s load distribution strategy of region split and merge is detrimental to the hierarchical organization of imaging data (e.g., project, subject, session, scan, slice). This paper makes two contributions to address these concerns by describing key cloud engineering principles and technology enhancements we made to the Apache Hadoop ecosystem for medical imaging applications. First, we propose a row-key design for HBase, which is a necessary step that is driven by the hierarchical organization of imaging data. Second, we propose a novel data allocation policy within HBase to strongly enforce collocation of hierarchically related imaging data. The proposed enhancements accelerate data processing by minimizing network usage and localizing processing to machines where the data already exist. Moreover, our approach is amenable to the traditional scan, subject, and project-level analysis procedures, and is compatible with standard command line/scriptable image processing software. Experimental results for an illustrative sample of imaging data reveal that our new HBase policy results in a three-fold time improvement in conversion of classic DICOM to NiFTI file formats when compared with the default HBase region split policy, and nearly a six-fold improvement over a commonly available network file system (NFS) approach even for relatively small file sets. Moreover, file access latency is lower than network attached storage. PMID:28884169
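
    The row-key idea described in this record can be illustrated with a short sketch: encoding the hierarchy (project, subject, session, scan, slice) into a single fixed-width key so that lexicographic byte ordering keeps related images adjacent and, with a matching split policy, collocated. The field widths and delimiter below are assumptions for illustration, not the paper's actual schema.

      def imaging_row_key(project, subject, session, scan, slice_no):
          """Build a hierarchical, fixed-width row key.

          Zero-padding makes lexicographic (byte) order match numeric order,
          so all slices of a scan, all scans of a session, and so on sort together.
          """
          return "|".join([
              project,
              f"{subject:06d}",
              f"{session:04d}",
              f"{scan:04d}",
              f"{slice_no:05d}",
          ]).encode("utf-8")

      def subject_prefix(project, subject):
          """Prefix for scanning everything that belongs to one subject."""
          return f"{project}|{subject:06d}|".encode("utf-8")

      # Example: all rows for subject 42 of project "ABC" share this prefix,
      # so a prefix scan retrieves them without touching unrelated regions.
      print(imaging_row_key("ABC", 42, 1, 3, 17))
      print(subject_prefix("ABC", 42))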

  18. Broken instrument retrieval with indirect ultrasonics in a primary molar.

    PubMed

    Pk, Musale; Sc, Kataria; As, Soni

    2016-02-01

    The separation of a file during pulpectomy is a rare incident in primary teeth due to inherently wider and relatively straighter root canals. A broken instrument hinders the clinician from optimal preparation and obturation of the root canal system invariably leading to failure, although in such teeth, an extraction followed by suitable space maintenance is considered as the treatment of choice. This case report demonstrates successful nonsurgical retrieval of a separated H file fragment in 84. A 7-year-old girl was referred to the Department of Paedodontics and Preventive Dentistry for endodontic management of a primary tooth 84 with a dento-alveolar abscess. Her medical history was noncontributory. After diagnosing a broken H file in the mesio-lingual canal, the tooth was endodontically treated in two appointments. At the first session, a broken file was successfully retrieved after using low intensity ultrasonic vibrations through a DG 16 endodontic explorer viewed under an operating microscope. After abscess resolution, Vitapex root canal obturation with a preformed metal crown cementation was completed at a second session. The patient was recalled at 3, 6, 12 and 15 month interval and reported to be clinically asymptomatic and radiographically with complete furcal healing. Integration of microscopes and ultrasonics in paediatric dental practice has made it possible to save such teeth with a successful outcome. Favourable location of the separated file, relatively straighter root canal system and patient cooperation resulted in successful nonsurgical management in this case.

  19. OVERSMART Reporting Tool for Flow Computations Over Large Grid Systems

    NASA Technical Reports Server (NTRS)

    Kao, David L.; Chan, William M.

    2012-01-01

Structured grid solvers such as NASA's OVERFLOW compressible Navier-Stokes flow solver can generate large data files that contain convergence histories for flow equation residuals, turbulence model equation residuals, component forces and moments, and component relative motion dynamics variables. Most of today's large-scale problems can extend to hundreds of grids, and over 100 million grid points. However, due to the lack of efficient tools, only a small fraction of information contained in these files is analyzed. OVERSMART (OVERFLOW Solution Monitoring And Reporting Tool) provides a comprehensive report of solution convergence of flow computations over large, complex grid systems. It produces a one-page executive summary of the behavior of flow equation residuals, turbulence model equation residuals, and component forces and moments. Under the automatic option, a matrix of commonly viewed plots such as residual histograms, composite residuals, sub-iteration bar graphs, and component forces and moments is automatically generated. Specific plots required by the user can also be prescribed via a command file or a graphical user interface. Output is directed to the user's computer screen and/or to an html file for archival purposes. The current implementation has been targeted for the OVERFLOW flow solver, which is used to obtain a flow solution on structured overset grids. The OVERSMART framework allows easy extension to other flow solvers.
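
    The kind of convergence summary described in this record can be approximated with very little code once the residual histories are available. The sketch below assumes a plain whitespace-separated history file with an iteration column followed by one L2-residual column per grid; that layout, and the file name, are assumptions for illustration rather than OVERFLOW's actual output format.

      import math

      def summarize_residual_history(path):
          """Report, per grid, how many orders of magnitude each residual dropped.

          Assumed layout: iteration number in column 0, then one L2 residual per
          grid in the remaining columns, whitespace separated, '#' for comments.
          """
          rows = []
          with open(path) as f:
              for line in f:
                  parts = line.split()
                  if parts and not parts[0].startswith("#"):
                      rows.append([float(x) for x in parts])
          if len(rows) < 2:
              print("not enough history to summarize")
              return
          first, last = rows[0], rows[-1]
          for grid in range(1, len(first)):
              drop = math.log10(first[grid] / last[grid]) if last[grid] > 0 else float("inf")
              print(f"grid {grid}: residual dropped {drop:.1f} orders of magnitude")

      # summarize_residual_history("resid.dat")  # hypothetical residual history file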

  20. 28 CFR 42.601 - Purpose and application.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 42.601 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Procedures for Complaints of Employment Discrimination Filed Against Recipients of... procedures for processing and resolving complaints of employment discrimination filed against recipients of...
