Encryption and decryption using FPGA
NASA Astrophysics Data System (ADS)
Nayak, Nikhilesh; Chandak, Akshay; Shah, Nisarg; Karthikeyan, B.
2017-11-01
In this paper, we perform multiple cryptographic methods on a set of data and compare their outputs. The AES and RSA algorithms are used. Using the AES algorithm, an 8-bit input (plain text) is encrypted with a cipher key and the result is displayed serially on Tera Term. For simulation, a 128-bit input is operated on with a 128-bit cipher key to generate the encrypted text. The reverse operations are then performed to recover the decrypted text. In the RSA algorithm, file handling is used to input the plain text. This text is then operated on to obtain the encrypted and decrypted data, which are stored in a file. Finally, the results of both algorithms are compared.
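The RSA half of such a comparison can be sketched with textbook small-prime RSA in a few lines of Python. The primes, exponent, and message below are illustrative only and are unrelated to the paper's FPGA implementation; real RSA requires large primes and padding.

```python
# Textbook RSA sketch (illustrative small primes; not secure).
p, q = 61, 53
n = p * q                    # public modulus
phi = (p - 1) * (q - 1)      # Euler's totient of n
e = 17                       # public exponent, coprime with phi
d = pow(e, -1, phi)          # private exponent: modular inverse of e (Python 3.8+)

message = 42
cipher = pow(message, e, n)      # encrypt: c = m^e mod n
recovered = pow(cipher, d, n)    # decrypt: m = c^d mod n
print(message, cipher, recovered)  # recovered equals the original message
```

The same encrypt-then-decrypt round trip is what both abstracts' pipelines verify, just in hardware and with file I/O.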
Decryption-decompression of AES protected ZIP files on GPUs
NASA Astrophysics Data System (ADS)
Duong, Tan Nhat; Pham, Phong Hong; Nguyen, Duc Huu; Nguyen, Thuy Thanh; Le, Hung Duc
2011-10-01
AES is a strong encryption system, so decryption-decompression of AES-encrypted ZIP files requires very large computing power and techniques for reducing the password space. This makes implementations of such techniques on common computing systems impractical. In [1], we reduced the original, very large password search space to a much smaller one that is guaranteed to contain the correct password. Based on this reduced set of passwords, in this paper we parallelize decryption, decompression, and plain-text recognition for encrypted ZIP files using CUDA on NVIDIA GeForce GTX 295 graphics cards to find the correct password. The experimental results show that the speed of decrypting, decompressing, recognizing plain text, and finding the original password increases by a factor of about 45 to 180 (depending on the number of GPUs) compared to sequential execution on an Intel Core 2 Quad Q8400 2.66 GHz. These results demonstrate the potential applicability of GPUs in this field of cryptanalysis.
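The search loop being parallelized can be sketched sequentially with Python's stdlib zipfile module (which handles legacy ZipCrypto entries, not AES-encrypted ones; the function name and candidate passwords are hypothetical):

```python
import zipfile

def find_password(zip_source, candidates):
    # Try each candidate password until one decrypts the first entry.
    # This is the sequential loop the paper distributes across GPU threads.
    with zipfile.ZipFile(zip_source) as zf:
        name = zf.namelist()[0]
        for pwd in candidates:
            try:
                zf.read(name, pwd=pwd.encode())
                return pwd          # decryption succeeded
            except RuntimeError:    # wrong password
                continue
    return None
```

On a GPU, each thread would evaluate a disjoint slice of the reduced candidate list, with plain-text recognition confirming the hit.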
78 FR 32383 - Combined Notice of Filings #2
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-30
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 2 Take notice that the Commission received the following electric rate filings: Docket Numbers: ER09-548-002; EC11-108-001. Applicants: ITC Great Plains, LLC. Description: ITC Great Plains, LLC submits compliance filing to Begin Amortization of Regulatory Assets...
Resources for comparing the speed and performance of medical autocoders.
Berman, Jules J
2004-06-15
Concept indexing is a popular method for characterizing medical text, and is one of the most important early steps in many data mining efforts. Concept indexing differs from simple word or phrase indexing because concepts are typically represented by a nomenclature code that binds a medical concept to all equivalent representations. A concept search on the term renal cell carcinoma would be expected to find occurrences of hypernephroma and renal carcinoma (concept equivalents). The purpose of this study is to provide freely available resources to compare speed and performance among different autocoders. These tools consist of: 1) a public domain autocoder written in Perl (a free and open source programming language that installs on any operating system); 2) a nomenclature database derived from the unencumbered subset of the publicly available Unified Medical Language System; 3) a large corpus of autocoded output derived from a publicly available medical text. A simple lexical autocoder was written that parses plain text into a listing of all 1-, 2-, 3-, and 4-word strings contained in the text, assigning a nomenclature code to text strings that match terms in the nomenclature. The nomenclature used is the unencumbered subset of the 2003 Unified Medical Language System (UMLS). The unencumbered subset of UMLS was reduced to exclude homonymous one-word terms and proper names, resulting in a term/code data dictionary containing about a half million medical terms. The Online Mendelian Inheritance in Man (OMIM), a 92+ Megabyte publicly available medical opus, was used as sample medical text for the autocoder. The autocoding Perl script is remarkably short, consisting of just 38 command lines. The 92+ Megabyte OMIM file was completely autocoded in 869 seconds on a 2.4 GHz processor (less than 10 seconds per Megabyte of text). The autocoded output file (9,540,442 bytes) contains 367,963 coded terms from OMIM and is distributed with this manuscript.
A public domain Perl script is provided that can parse through plain-text files of any length, matching concepts against an external nomenclature. The script and associated files can be used freely to compare the speed and performance of autocoding software.
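The core matching step described above can be sketched in a few lines, shown here in Python rather than the paper's Perl; the nomenclature entries and the code string are invented for illustration:

```python
# Minimal lexical autocoder sketch: collect all 1- to 4-word strings in the
# text and look each one up in a term -> code dictionary (hypothetical codes).
nomenclature = {"renal cell carcinoma": "C9385000", "hypernephroma": "C9385000"}

def autocode(text, nomenclature, max_words=4):
    words = text.lower().split()
    hits = []
    for size in range(1, max_words + 1):
        for i in range(len(words) - size + 1):
            phrase = " ".join(words[i:i + size])
            if phrase in nomenclature:
                hits.append((phrase, nomenclature[phrase]))
    return hits

print(autocode("The tumor was a renal cell carcinoma", nomenclature))
```

Because both synonyms map to the same code, a search on the code finds all equivalent representations of the concept, which is the point of concept indexing.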
Solar Astronomy Data Base: Packaged Information on Diskette
NASA Technical Reports Server (NTRS)
Mckinnon, John A.
1990-01-01
In its role as a library, the National Geophysical Data Center has transferred to diskette a collection of small, digital files of routinely measured solar indices for use on an IBM-compatible desktop computer. Recording these observations on diskette allows the distribution of specialized information to researchers with a wide range of expertise in computer science and solar astronomy. Every data set was made self-contained by including formats, extraction utilities, and plain-language descriptive text. Moreover, for several archives, two versions of the observations are provided - one suitable for display, the other for analysis with popular software packages. Since the files contain no control characters, each one can be modified with any text editor.
Enhanced K-means clustering with encryption on cloud
NASA Astrophysics Data System (ADS)
Singh, Iqjot; Dwivedi, Prerna; Gupta, Taru; Shynu, P. G.
2017-11-01
This paper addresses the problem of storing and managing big files in the cloud by implementing hashing on Hadoop for big data, ensuring security while uploading and downloading files. Cloud computing is a term that emphasizes sharing data and facilitates shared infrastructure and resources [10]. Hadoop is open source software that lets us store and manage big files according to our needs in the cloud. The K-means clustering algorithm calculates the distance between the centroid of each cluster and the data points. Hashing is an algorithm in which we store and retrieve data with hash keys. The hashing algorithm, called a hash function, is used to map the original data and later to fetch the data stored at the specific key [17]. Encryption is a process that transforms electronic data into an unreadable form known as cipher text. Decryption is the opposite process: it transforms the cipher text into plain text that the end user can read and understand. For encryption and decryption we use a symmetric key cryptographic algorithm; specifically, the DES algorithm is used for secure storage of the files [3].
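The distance computation at the heart of k-means, as described above, reduces to assigning each point to its nearest centroid. A minimal pure-Python sketch of one assignment step (the points and centroids are made-up examples):

```python
# One assignment step of k-means: each point goes to the index of the
# nearest centroid by squared Euclidean distance (illustrative sketch).
def assign(points, centroids):
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return [min(range(len(centroids)), key=lambda j: sqdist(p, centroids[j]))
            for p in points]

points = [(1.0, 1.0), (1.2, 0.8), (8.0, 8.0)]
centroids = [(1.0, 1.0), (8.0, 8.0)]
print(assign(points, centroids))  # -> [0, 0, 1]
```

A full k-means loop would alternate this step with recomputing each centroid as the mean of its assigned points until the assignments stop changing.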
FEQinput—An editor for the full equations (FEQ) hydraulic modeling system
Ancalle, David S.; Ancalle, Pablo J.; Domanski, Marian M.
2017-10-30
IntroductionThe Full Equations Model (FEQ) is a computer program that solves the full, dynamic equations of motion for one-dimensional unsteady hydraulic flow in open channels and through control structures. As a result, hydrologists have used FEQ to design and operate flood-control structures, delineate inundation maps, and analyze peak-flow impacts. To aid in fighting floods, hydrologists are using the software to develop a system that uses flood-plain models to simulate real-time streamflow.Input files for FEQ are composed of text files that contain large amounts of parameters, data, and instructions that are written in a format exclusive to FEQ. Although documentation exists that can aid in the creation and editing of these input files, new users face a steep learning curve in order to understand the specific format and language of the files.FEQinput provides a set of tools to help a new user overcome the steep learning curve associated with creating and modifying input files for the FEQ hydraulic model and the related utility tool, Full Equations Utilities (FEQUTL).
Ferrigno, C.F.
1986-01-01
Machine-readable files were developed for the High Plains Regional Aquifer-System Analysis project are stored on two magnetic tapes available from the U.S. Geological Survey. The first tape contains computer programs that were used to prepare, store, retrieve, organize, and preserve the areal interpretive data collected by the project staff. The second tape contains 134 data files that can be divided into five general classes: (1) Aquifer geometry data, (2) aquifer and water characteristics , (3) water levels, (4) climatological data, and (5) land use and water use data. (Author 's abstract)
The prevalence of encoded digital trace evidence in the nonfile space of computer media.
Garfinkel, Simson L
2014-09-01
Forensically significant digital trace evidence is frequently present in sectors of digital media not associated with allocated or deleted files. Modern digital forensic tools generally do not decompress such data unless a specific file with a recognized file type is first identified, potentially resulting in missed evidence. Email addresses are encoded differently in different file formats. As a result, trace evidence can be categorized as Plain in File (PF), Encoded in File (EF), Plain Not in File (PNF), or Encoded Not in File (ENF). The tool bulk_extractor finds all of these formats, but other forensic tools do not. A study of 961 storage devices purchased on the secondary market shows that 474 contained encoded email addresses that were not in files (ENF). Different encoding formats are the result of different application programs that processed different kinds of digital trace evidence. Specific encoding formats explored include BASE64, GZIP, PDF, HIBER, and ZIP. Published 2014. This article is a U.S. Government work and is in the public domain in the USA. Journal of Forensic Sciences published by Wiley Periodicals, Inc. on behalf of American Academy of Forensic Sciences.
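Why encoded evidence evades plain-text scanners can be shown with Python's stdlib: the same (hypothetical) email address survives a regex scan in raw bytes but not once BASE64-encoded or GZIP-compressed, which is exactly the gap between the Plain and Encoded categories above.

```python
import base64, gzip, re

# The same address in plain, BASE64, and GZIP forms (address is made up).
email = b"user@example.com"
plain = b"header " + email + b" trailer"
b64 = base64.b64encode(email)
gz = gzip.compress(email)

pattern = re.compile(rb"[\w.]+@[\w.]+")
print(bool(pattern.search(plain)))                # True: visible in raw bytes
print(bool(pattern.search(b64)))                  # False: hidden by encoding
print(bool(pattern.search(gzip.decompress(gz))))  # True: visible after decoding
```

A scanner in the spirit of bulk_extractor must therefore attempt the decodings first, rather than scanning raw sectors alone.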
mod_bio: Apache modules for Next-Generation sequencing data.
Lindenbaum, Pierre; Redon, Richard
2015-01-01
We describe mod_bio, a set of modules for the Apache HTTP server that allows users to access and query fastq, tabix, fasta and bam files through a Web browser. These data are made available in plain text, HTML, XML, JSON and JSON-P. A javascript-based genome browser using the JSON-P communication technique is provided as an example of a cross-domain Web service. https://github.com/lindenb/mod_bio. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
NASA Technical Reports Server (NTRS)
Martinko, E. A. (Principal Investigator); Poracsky, J.; Kipp, E. R.; Krieger, H.
1980-01-01
The activity concentrated on identifying crop and irrigation data sources for the eight states within the High Plains Aquifer and making contacts concerning the nature of these data. A mail questionnaire was developed to gather specific data not routinely reported through standard data collection channels. Input/output routines were designed for High Plains crop and irrigation data and initial statistical data on crops were input to computer files.
A Customizable Importer for the Clinical Data Warehouses PaDaWaN and I2B2.
Fette, Georg; Kaspar, Mathias; Dietrich, Georg; Ertl, Maximilian; Krebs, Jonathan; Stoerk, Stefan; Puppe, Frank
2017-01-01
In recent years, clinical data warehouses (CDW) storing routine patient data have become more and more popular for supporting scientific work in the medical domain. Although CDW systems provide interfaces to import new data, these interfaces have to be used by processing tools that are often not included in the systems themselves. In order to establish an extraction-transformation-load (ETL) workflow, already existing components have to be taken, or new components developed, to perform the load part of the ETL. We present a customizable importer for the two CDW systems PaDaWaN and I2B2, which is able to import the most common import formats (plain text, CSV and XML files). In order to run, the importer only needs a configuration file with the user credentials for the target CDW and a list of XML import configuration files, which determine how already exported data is intended to be imported. The importer is provided as a Java program, which has no further software requirements.
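The format dispatch such an importer performs can be sketched as a small function that picks a parser by file extension; this is an illustrative Python sketch (the actual importer is a Java program, and the file names here are invented):

```python
import csv, io
import xml.etree.ElementTree as ET

# Dispatch by extension to a parser for plain text, CSV, or XML.
def parse(name, content):
    if name.endswith(".csv"):
        return list(csv.reader(io.StringIO(content)))
    if name.endswith(".xml"):
        return ET.fromstring(content).tag   # just the root tag, for brevity
    return content.splitlines()             # plain-text fallback

print(parse("obs.csv", "id,value\n1,120"))  # [['id', 'value'], ['1', '120']]
```

In the real tool the per-format logic is driven by the XML import configuration files rather than hard-coded.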
Snake River Plain Geothermal Play Fairway Analysis - Phase 1 KMZ files
John Shervais
2015-10-10
This dataset contains raw data files in KMZ format (Google Earth georeferenced format). These files include volcanic vent locations and ages, the distribution of fine-grained lacustrine sediments (which act as both a seal and an insulating layer for hydrothermal fluids), and post-Miocene faults compiled from the Idaho Geological Survey, the USGS Quaternary Fault database, and unpublished mapping. It also contains the Composite Common Risk Segment Map created during Phase 1 studies, as well as a file with locations of select deep wells used to interrogate the subsurface.
COMPASS: a suite of pre- and post-search proteomics software tools for OMSSA
Wenger, Craig D.; Phanstiel, Douglas H.; Lee, M. Violet; Bailey, Derek J.; Coon, Joshua J.
2011-01-01
Here we present the Coon OMSSA Proteomic Analysis Software Suite (COMPASS): a free and open-source software pipeline for high-throughput analysis of proteomics data, designed around the Open Mass Spectrometry Search Algorithm. We detail a synergistic set of tools for protein database generation, spectral reduction, peptide false discovery rate analysis, peptide quantitation via isobaric labeling, protein parsimony and protein false discovery rate analysis, and protein quantitation. We strive for maximum ease of use, utilizing graphical user interfaces and working with data files in the original instrument vendor format. Results are stored in plain text comma-separated values files, which are easy to view and manipulate with a text editor or spreadsheet program. We illustrate the operation and efficacy of COMPASS through the use of two LC–MS/MS datasets. The first is a dataset of a highly annotated mixture of standard proteins and manually validated contaminants that exhibits the identification workflow. The second is a dataset of yeast peptides, labeled with isobaric stable isotope tags and mixed in known ratios, to demonstrate the quantitative workflow. For these two datasets, COMPASS performs equivalently or better than the current de facto standard, the Trans-Proteomic Pipeline. PMID:21298793
Dibblee, T. W.; Digital database compiled by Graham, S. E.; Mahony, T.M.; Blissenbach, J.L.; Mariant, J.J.; Wentworth, C.M.
1999-01-01
This Open-File Report is a digital geologic map database. The report serves to introduce and describe the digital data. There is no paper map included in the Open-File Report. The report includes PostScript and PDF plot files that can be used to plot images of the geologic map sheet and explanation sheet. This digital map database is prepared from a previously published map by Dibblee (1973). The geologic map database delineates map units that are identified by general age, lithology, and clast size following the stratigraphic nomenclature of the U.S. Geological Survey. For descriptions of the units, their stratigraphic relations, and sources of geologic mapping, consult the explanation sheet (of99-14_4b.ps or of99-14_4d.pdf), or the original published paper map (Dibblee, 1973). The scale of the source map limits the spatial resolution (scale) of the database to 1:125,000 or smaller. For those interested in the geology of Carrizo Plain and vicinity who do not use an ARC/INFO compatible Geographic Information System (GIS), but would like to obtain a paper map and explanation, PDF and PostScript plot files containing map images of the data in the digital database, as well as PostScript and PDF plot files of the explanation sheet and explanatory text, have been included in the database package (please see the section 'Digital Plot Files', page 5). The PostScript plot files require a gzip utility to access them. For those without computer capability, we can provide users with the PostScript or PDF files on tape that can be taken to a vendor for plotting. Paper plots can also be ordered directly from the USGS (please see the section 'Obtaining Plots from USGS Open-File Services', page 5). The content and character of the database, methods of obtaining it, and processes of extracting the map database from the tar (tape archive) file are described herein. 
The map database itself, consisting of six ARC/INFO coverages, can be obtained over the Internet or by magnetic tape copy as described below. The database was compiled using ARC/INFO, a commercial Geographic Information System (Environmental Systems Research Institute, Redlands, California), with version 3.0 of the menu interface ALACARTE (Fitzgibbon and Wentworth, 1991, Fitzgibbon, 1991, Wentworth and Fitzgibbon, 1991). The ARC/INFO coverages are stored in uncompressed ARC export format (ARC/INFO version 7.x). All data files have been compressed, and may be uncompressed with gzip, which is available free of charge over the Internet via links from the USGS Public Domain Software page (http://edcwww.cr.usgs.gov/doc/edchome/ndcdb/public.html). ARC/INFO export files (files with the .e00 extension) can be converted into ARC/INFO coverages in ARC/INFO (see below) and can be read by some other Geographic Information Systems, such as MapInfo via ArcLink and ESRI's ArcView.
Qi, Sharon
2010-01-01
This digital data set represents the extent of the High Plains aquifer in the central United States. The extent of the High Plains aquifer covers 174,000 square miles in eight states: Colorado, Kansas, Nebraska, New Mexico, Oklahoma, South Dakota, Texas, and Wyoming. This data set represents a compilation of information from digital and paper sources and personal communication. This boundary is an update to the boundary published in U.S. Geological Survey Professional Paper 1400-B, and this report supersedes Open-File Report 99-267. The purpose of this data set is to refine and update the extent of the High Plains aquifer based on currently available information. This data set represents a compilation of arcs from a variety of sources and scales that represent the 174,000 square-mile extent of the High Plains aquifer within the eight states. Where updated information was not available, the original boundary extent defined by OFR 99-267 was retained. The citations for the sources in each State are listed in the 00README.txt file. The boundary also contains internal polygons, or 'islands', that represent the areas within the aquifer boundary where the aquifer is not present due to erosion or non-deposition. The datasets that pertain to this report can be found on the U.S. Geological Survey's NSDI (National Spatial Data Infrastructure) Node, the links are provided on the sidebar.
Snake River Plain Geothermal Play Fairway Analysis - Phase 1 Raster Files
John Shervais
2015-10-09
Snake River Plain Play Fairway Analysis - Phase 1 CRS Raster Files. This dataset contains raster files created in ArcGIS. These raster images depict Common Risk Segment (CRS) maps for HEAT, PERMEABILITY, and SEAL, as well as selected maps of Evidence Layers. These evidence layers consist of either Bayesian krige functions or kernel density functions, and include: (1) HEAT: heat flow (Bayesian krige map), heat flow standard error on the krige function (data confidence), volcanic vent distribution as a function of age and size, groundwater temperature (equivalue interval and natural breaks bins), and groundwater temperature standard error. (2) PERMEABILITY: fault and lineament maps, both as mapped and as kernel density functions, processed for both dilational tendency (TD) and slip tendency (ST), along with data confidence maps for each data type. Data types include mapped surface faults from USGS and Idaho Geological Survey databases, as well as unpublished mapping; lineations derived from maximum gradients in magnetic, deep gravity, and intermediate-depth gravity anomalies. (3) SEAL: seal maps based on presence and thickness of lacustrine sediments and the base of the SRP aquifer. Raster size is 2 km. All files were generated in ArcGIS.
Case study of visualizing global user download patterns using Google Earth and NASA World Wind
NASA Astrophysics Data System (ADS)
Zong, Ziliang; Job, Joshua; Zhang, Xuesong; Nijim, Mais; Qin, Xiao
2012-01-01
Geo-visualization is significantly changing the way we view spatial data and discover information. On the one hand, a large number of spatial data are generated every day. On the other hand, these data are not well utilized due to the lack of free and easily used data-visualization tools. This becomes even worse when most of the spatial data remains in the form of plain text such as log files. This paper describes a way of visualizing massive plain-text spatial data at no cost by utilizing Google Earth and NASA World Wind. We illustrate our methods by visualizing over 170,000 global download requests for satellite images maintained by the Earth Resources Observation and Science (EROS) Center of U.S. Geological Survey (USGS). Our visualization results identify the most popular satellite images around the world and discover the global user download patterns. The benefits of this research are: 1. assisting in improving the satellite image downloading services provided by USGS, and 2. providing a proxy for analyzing the "hot spot" areas of research. Most importantly, our methods demonstrate an easy way to geo-visualize massive textual spatial data, which is highly applicable to mining spatially referenced data and information on a wide variety of research domains (e.g., hydrology, agriculture, atmospheric science, natural hazard, and global climate change).
GFFview: A Web Server for Parsing and Visualizing Annotation Information of Eukaryotic Genome.
Deng, Feilong; Chen, Shi-Yi; Wu, Zhou-Lin; Hu, Yongsong; Jia, Xianbo; Lai, Song-Jia
2017-10-01
Owing to the wide application of RNA sequencing (RNA-seq) technology, more and more eukaryotic genomes have been extensively annotated, including gene structure, alternative splicing, and noncoding loci. Genome annotation information is commonly stored as plain text in General Feature Format (GFF), which can be hundreds or thousands of megabytes in size. Manipulating a GFF file is therefore a challenge for biologists without bioinformatics skills. In this study, we provide a web server (GFFview) for parsing the annotation information of a eukaryotic genome and generating a statistical description of six indices for visualization. GFFview is very useful for investigating the quality and differences of de novo assembled transcriptomes in RNA-seq studies.
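Parsing GFF is tractable because it is a 9-column tab-separated plain-text format (seqid, source, type, start, end, score, strand, phase, attributes). A minimal sketch of one statistic a viewer like GFFview might report, counting feature types (the GFF lines below are invented examples):

```python
from collections import Counter

# Count feature types (column 3) in GFF lines, skipping comments/blank lines.
def feature_counts(lines):
    counts = Counter()
    for line in lines:
        if line.startswith("#") or not line.strip():
            continue
        cols = line.rstrip("\n").split("\t")
        if len(cols) == 9:
            counts[cols[2]] += 1
    return counts

gff = ["##gff-version 3",
       "chr1\ttest\tgene\t100\t900\t.\t+\t.\tID=gene1",
       "chr1\ttest\texon\t100\t300\t.\t+\t.\tParent=gene1"]
print(feature_counts(gff))
```

For real multi-gigabyte files the same loop works line by line over a file handle, so memory use stays constant.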
Deceit: A flexible distributed file system
NASA Technical Reports Server (NTRS)
Siegel, Alex; Birman, Kenneth; Marzullo, Keith
1989-01-01
Deceit, a distributed file system (DFS) being developed at Cornell, focuses on flexible file semantics in relation to efficiency, scalability, and reliability. Deceit servers are interchangeable and collectively provide the illusion of a single, large server machine to any clients of the Deceit service. Non-volatile replicas of each file are stored on a subset of the file servers. The user is able to set parameters on a file to achieve different levels of availability, performance, and one-copy serializability. Deceit also supports a file version control mechanism. In contrast with many recent DFS efforts, Deceit can behave like a plain Sun Network File System (NFS) server and can be used by any NFS client without modifying any client software. The current Deceit prototype uses the ISIS Distributed Programming Environment for all communication and process group management, an approach that reduces system complexity and increases system robustness.
Agaku, Israel T; Filippidis, Filippos T; Vardavas, Constantine I
2015-01-01
Tobacco product warning labels are a key health communication medium with plain packaging noted as the next step in the evolution of tobacco packaging. We assessed the self-reported impact of text versus pictorial health warnings and the determinants of support for plain packaging of tobacco products in the European Union (EU). The Special Eurobarometer 385 survey was analyzed for 26,566 adults from 27 EU countries in 2012. The self-reported impact of warning labels (text or pictorial) and determinants of EU-wide support for plain packaging were assessed using multivariate logistic regression. Current smokers in countries where cigarette pictorial warnings were implemented had higher odds of reporting that health warning labels had any effect on their smoking behavior (making a quit attempt or reducing number of cigarettes smoked per day) compared to respondents in countries with text-only warning labels (adjusted odds ratio, aOR = 1.31; 95% confidence interval, 95% CI: 1.10-1.56). Population support for plain packaging of tobacco packs was higher in countries where cigarette pictorial warnings already existed (aOR = 1.17; 95% CI: 1.07-1.28). These findings indicate that the implementation of pictorial warnings at an EU level may have a positive behavioral impact among smokers and pave the way for population support for plain packaging in the EU.
Preparing PNNL Reports with LaTeX
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waichler, Scott R.
2005-06-01
LaTeX is a mature document preparation system that is the standard in many scientific and academic workplaces. It has been used extensively by scattered individuals and research groups within PNNL for years, but until now there have been no centralized or lab-focused resources to help authors and editors. PNNL authors and editors can produce correctly formatted PNNL or PNWD reports using the LaTeX document preparation system and the available template files. Please visit the PNNL-LaTeX Project (http://stidev.pnl.gov/resources/latex/, inside the PNNL firewall) for additional information and files. In LaTeX, document content is maintained separately from document structure for the most part. This means that the author can easily produce the same content in different formats and, more importantly, can focus on the content and write it in a plain text file that doesn't go awry, is easily transferable, and won't become obsolete due to software changes. LaTeX produces the finest print quality output; its typesetting is noticeably better than that of MS Word. This is particularly true for mathematics, tables, and other types of special text. Other benefits of LaTeX: easy handling of large numbers of figures and tables; automatic and error-free captioning, citation, cross-referencing, hyperlinking, and indexing; excellent published and online documentation; free or low-cost distributions for Windows/Linux/Unix/Mac OS X. This document serves two purposes: (1) it provides instructions to produce reports formatted to PNNL requirements using LaTeX, and (2) the document itself is in the form of a PNNL report, providing examples of many solved formatting challenges. Authors can use this document or its skeleton version (with formatting examples removed) as the starting point for their own reports. The pnnreport.cls class file and pnnl.bst bibliography style file contain the required formatting specifications for reports to the Department of Energy.
Options are also provided for formatting PNWD (non-1830) reports. This documentation and the referenced files are meant to provide a complete package of PNNL particulars for authors and editors who wish to prepare technical reports using LaTeX. The example material in this document was borrowed from real reports and edited for demonstration purposes. The subject matter content of the example material is not relevant here and generally does not make literal sense in the context of this document. Brackets ''[]'' are used to denote large blocks of example text. The PDF file for this report contains hyperlinks to facilitate navigation. Hyperlinks are provided for all cross-referenced material, including section headings, figures, tables, and references. Not all hyperlinks are colored but will be obvious when you move your mouse over them.
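The content/structure separation described above can be illustrated with a minimal document skeleton that uses the class and bibliography style files named in the report; the title, author, and bibliography database name below are placeholders, not taken from the actual template.

```latex
% Minimal sketch assuming the pnnreport class and pnnl bibliography style
% described above; names other than the .cls/.bst files are placeholders.
\documentclass{pnnreport}
\bibliographystyle{pnnl}
\begin{document}
\title{Example Report}
\author{A. Author}
\maketitle
All formatting lives in the class file; this file holds only content.
\bibliography{refs}  % refs.bib is a hypothetical bibliography database
\end{document}
```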
NASA Astrophysics Data System (ADS)
Prasad, U.; Rahabi, A.
2001-05-01
The following utilities, developed to dump HDF-EOS format data, are of special use for Earth science data from NASA's Earth Observing System (EOS). This poster demonstrates their use and application. The first four tools take HDF-EOS data files as input. HDF-EOS Metadata Dumper (metadmp): extracts metadata from EOS data granules. It operates by simply copying blocks of metadata from the file to the standard output and does not process the metadata in any way. Since all metadata in EOS granules is encoded in the Object Description Language (ODL), the output of metadmp will be in the form of complete ODL statements. EOS data granules may contain up to three different sets of metadata (Core, Archive, and Structural Metadata). HDF-EOS Contents Dumper (heosls): displays the contents of HDF-EOS files. This utility provides detailed information on the POINT, SWATH, and GRID data sets in the files; for example, it will list the geolocation fields, data fields, and objects. HDF-EOS ASCII Dumper (asciidmp): extracts fields from EOS data granules into plain ASCII text. The output from asciidmp should be easily human readable, and with minor editing it can be made ingestible by any application with ASCII import capabilities. HDF-EOS Binary Dumper (bindmp): dumps HDF-EOS objects in binary format. This is useful for feeding its output into an existing program that does not understand HDF, for example custom software and COTS products. HDF-EOS User Friendly Metadata (UFM): useful for viewing ECS metadata. UFM takes an EOSDIS ODL metadata file and produces an HTML report of the metadata for display in a web browser. HDF-EOS METCHECK: can be invoked from either a Unix or DOS environment with a set of command line options that a user might use to direct the tool's inputs and output.
METCHECK validates the inventory metadata in (.met file) using The Descriptor file (.desc) as the reference. The tool takes (.desc), and (.met) an ODL file as inputs, and generates a simple output file contains the results of the checking process.
Summary and status of the Horizons ephemeris system
NASA Astrophysics Data System (ADS)
Giorgini, J.
2011-10-01
Since 1996, the Horizons system has provided searchable access to JPL ephemerides for all known solar system bodies, several dozen spacecraft, planetary system barycenters, and some libration points. Responding to 18 400 000 requests from 300 000 unique addresses, the system has recently averaged 420 000 ephemeris requests per month. Horizons is accessed and automated using three interfaces: interactive telnet, web-browser form, and e-mail command-file. Asteroid and comet ephemerides are numerically integrated from JPL's database of initial conditions. This small-body database is updated hourly by a separate process as new measurements and discoveries are reported by the Minor Planet Center and automatically incorporated into new JPL orbit solutions. Ephemerides for other objects are derived by interpolating previously developed solutions whose trajectories have been represented in a file. For asteroids and comets, such files may be dynamically created and transferred to users, effectively recording integrator output. These small-body SPK files may then be interpolated by user software to reproduce the trajectory without duplicating the numerically integrated n-body dynamical model or PPN equations of motion. Other Horizons output is numerical and in the form of plain-text observer, vector, osculating element, or close-approach tables, typically expected to be read by other software as input. About one hundred quantities can be requested in various time-scales and coordinate systems. For JPL small-body solutions, this includes statistical uncertainties derived from measurement covariance and state transition matrices. With the exception of some natural satellites, Horizons is consistent with DE405/DE406, the IAU 1976 constants, ITRF93, and IAU2009 rotational models.
Plain language: a strategic response to the health literacy challenge.
Stableford, Sue; Mettger, Wendy
2007-01-01
Low health literacy is a major challenge confronting American and international health organizations. Research in the past decade has documented the prevalence of limited literacy and limited health literacy skills among adults worldwide. This creates a major policy challenge: how to create text-based health information - a common method of health communication - that is accessible to the public. Plain language is a logical, flexible response. While touted by American, Canadian, and European health policy makers, adoption and promotion of plain language standards and skills in health-focused organizations have lagged. Most text-based health information continues to be too hard for most adults to read. Barriers to more rapid diffusion of plain language are reflected in a set of myths perpetuated by critics. These myths are identified and refuted. While plain language is only one of many broad-based solutions needed to address low health literacy, the benefits to everyone demand increased use by health organizations.
75 FR 59250 - Combined Notice of Filings #1
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-27
... Partnership, Edgecombe Genco, LLC, Logan Generating Company, L.P., Plains End II, LLC, Rathdrum Power, LLC.... Applicants: ISO New England Inc. Description: ISO New England Inc. submits Supplement to Forward Capacity.... Docket Numbers: ER10-2686-000. Applicants: West Penn Power Company. Description: The West Penn Power...
NASA Technical Reports Server (NTRS)
Kotler, R. S.
1983-01-01
File Comparator program IFCOMP is a text file comparator for IBM OS/VS-compatible systems. IFCOMP accepts two text files as input and produces a listing of differences in pseudo-update form. IFCOMP is very useful in monitoring changes made to software at the source code level.
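IFCOMP itself is an IBM OS/VS utility and its source is not reproduced here, but the core idea of a text-file comparator that produces a difference listing can be sketched in a few lines. The following is a minimal Python analogue using the standard difflib module (emitting unified-diff output rather than IFCOMP's pseudo-update format):

```python
import difflib

def compare_files(path_a, path_b):
    """Print the differences between two text files: '-' marks lines
    present only in the first file, '+' lines present only in the
    second (difflib's unified-diff convention)."""
    with open(path_a) as fa, open(path_b) as fb:
        a, b = fa.readlines(), fb.readlines()
    for line in difflib.unified_diff(a, b, fromfile=path_a, tofile=path_b):
        print(line, end="")
```

This is useful for exactly the purpose the abstract describes: spotting source-level changes between two revisions of a file.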
Laing, G L; Bruce, J L; Aldous, C; Clarke, D L
2014-01-01
The Pietermaritzburg Metropolitan Trauma Service formerly lacked a robust computerised trauma registry. This made surgical audit difficult for the purpose of quality of care improvement and development. We aimed to design, construct and implement a computerised trauma registry within our service. Twelve months following its implementation, we sought to examine and report on the quality of the registry. Formal ethical approval to maintain a computerised trauma registry was obtained prior to undertaking any design and development. Appropriate commercial software was sourced to develop this project. The registry was designed as a flat file. A flat file is a plain text or mixed text and binary file which usually contains one record per line or physical record. Thereafter the registry file was launched onto a secure server. This provided the benefits of access security and automated backups. Registry training was provided to clients by the developer. The exercise of data capture was then integrated into the process of service delivery, taking place at the endpoint of patient care (discharge, transfer or death). Twelve months following its implementation, the compliance rates of data entry were measured. The developer of this project managed to design, construct and implement an electronic trauma registry into the service. Twelve months following its implementation the data were extracted and audited to assess the quality. A total of 2640 patient entries were captured onto the registry. Compliance rates were in the order of eighty percent and client satisfaction rates were high. A number of deficits were identified. These included the omission of weekend discharges and underreporting of deaths. The construction and implementation of the computerised trauma registry was the beginning of an endeavour to continue improvements in the quality of care within our service. The registry provided a reliable audit at twelve months post implementation. 
Deficits and limitations were identified and new strategies have been planned to overcome these problems and integrate the trauma registry into the process of clinical care. Copyright © 2013 Elsevier Ltd. All rights reserved.
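The registry's flat-file layout (one record per line, as described above) is simple to illustrate. The sketch below is a minimal example of that layout; the field names are hypothetical, not the registry's actual schema:

```python
# Hypothetical field names for illustration only -- not the actual
# trauma registry schema.
FIELDS = ["patient_id", "admission_date", "mechanism", "outcome"]

def write_records(path, records):
    """Append each record as a single pipe-delimited line, giving the
    one-record-per-line flat file described in the text."""
    with open(path, "a") as f:
        for rec in records:
            f.write("|".join(str(rec[k]) for k in FIELDS) + "\n")

def read_records(path):
    """Parse the flat file back into dictionaries, one per line."""
    with open(path) as f:
        return [dict(zip(FIELDS, line.rstrip("\n").split("|")))
                for line in f]
```

Because every line is a complete record, such a file can be audited or bulk-loaded with ordinary text tools, which is what makes the flat-file design attractive for an endpoint-of-care capture workflow.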
Williams, Lester J.; Raines, Jessica E.; Lanning, Amanda E.
2013-04-04
A database of borehole geophysical logs and other types of data files was compiled as part of ongoing studies of water availability and assessment of brackish- and saline-water resources. The database contains 4,883 logs from 1,248 wells in Florida, Georgia, Alabama, South Carolina, and from a limited number of offshore wells in the eastern Gulf of Mexico and the Atlantic Ocean. The logs can be accessed through a download directory organized by state and county for onshore wells and in a single directory for the offshore wells. A flat-file database is provided that lists the wells, their coordinates, and the file listings.
An IBM Compatible Participant Data Base System for Outdoor Programs.
ERIC Educational Resources Information Center
Watters, Ron
The process of maintaining mailing lists and other informational files on outdoor program participants is, plainly and simply, a pain in the neck. Mailing list maintenance is particularly difficult for programs that deal with university students, due to their frequent moves. This paper describes a new software program, the Outdoor Program Data…
Static Extraction and Conformance Analysis of Hierarchical Runtime Architectural Structure
2010-05-14
Example: CryptoDB. The architectural component CustomerManager corresponds to the Java class cryptodb.test.CustomerManager, also known as the "crypto consumer". [Figure 7.29: CryptoDB: Level-0 OOG with String objects; domains shown include PROVIDERS, PLAIN, KEYID, KEYMANAGEMENT, KEYSTORAGE, and CRYPTO.] To better understand this communication, we declared different domains for plain-text (PLAIN), encrypted (CRYPTO), alias identifier (ALIASID), and key
Session Initiation Protocol Network Encryption Device Plain Text Domain Discovery Service
2007-12-07
Network encryption devices, such as the TACLANE, have developed unique discovery methods to establish Plain Text Domain (PTD) Security Associations (SA). All of these techniques... can include network and host Internet Protocol (IP) addresses, Information System Security Office (ISSO) point-of-contact information, and PTD status
These tables may be defined within a separate ASCII text file (see Description and Format of BUFR Tables). At run time, the BUFR tables are usually read from an external ASCII text file (although it is also possible...). An example is the ASCII text file called /nwprod/fix/bufrtab.002 on the NCEP CCS machines.
1980-03-01
Geological Survey (AAPG-USGS) thermal gradient map of North America, at a scale of 1:5,000,000, gives the hypothesized average depth (by contours) in... file reports; USGS topographic and geologic maps; AAPG-USGS special geologic maps; APL/JHU reports; VPI-SU progress reports to DOE/DGE; technical
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-25
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [CACA 52030, LLCA920000 L1310000 FI0000] Notice of Proposed Reinstatement of Terminated Oil and Gas Lease CACA 52030, California AGENCY: Bureau of... oil and gas lease CACA 52030 from Plains Exploration & Production Co. The petition was filed on time...
31 CFR 515.803 - Customs procedures; merchandise specified in § 515.204.
Code of Federal Regulations, 2013 CFR
2013-07-01
... its face the number of the license pursuant to which it is filed. The original copy of the specific..., shall bear plainly on its face the following statement: “This document is presented under the provisions... 20220, with an endorsement thereon reading: This document has been accepted pursuant to § 515.808(c) (2...
Forensic Analysis of Compromised Computers
NASA Technical Reports Server (NTRS)
Wolfe, Thomas
2004-01-01
Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about the files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and writes the text file. The format of the text file is designed to enable submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting the files and examining such characteristics as ownership, time of creation, and time of most recent access, all of which are among the data included in the text file.
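The Perl script itself is not reproduced in the abstract; the following is a minimal Python sketch of the same idea under the stated design: walk a directory tree to a user-given depth and write one tab-delimited row per file (path, owner, and timestamps) so the output can be loaded into a spreadsheet for sorting and review:

```python
import csv
import os
import time

def dump_tree(root, out_path, max_depth):
    """Walk the directory tree under `root`, descending at most
    `max_depth` levels, and write one tab-delimited row per regular
    file: path, owner UID, change/creation time, and last-access
    time.  The tab-delimited format imports cleanly into a
    spreadsheet, where the forensic sorting is done."""
    root = root.rstrip(os.sep)
    with open(out_path, "w", newline="") as f:
        w = csv.writer(f, delimiter="\t")
        w.writerow(["path", "uid", "ctime", "atime"])
        for dirpath, dirnames, filenames in os.walk(root):
            depth = dirpath[len(root):].count(os.sep)
            if depth >= max_depth:
                dirnames[:] = []  # prune: do not descend further
            for name in filenames:
                p = os.path.join(dirpath, name)
                st = os.stat(p)
                w.writerow([p, st.st_uid,
                            time.ctime(st.st_ctime),
                            time.ctime(st.st_atime)])
```

Note that `st_ctime` is metadata-change time on POSIX systems rather than true creation time, one of the subtleties any such forensic listing has to document.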
Sending Foreign Language Word Processor Files over Networks.
ERIC Educational Resources Information Center
Feustle, Joseph A., Jr.
1992-01-01
Advantages of using online systems are described, along with specific techniques for successfully transmitting computer text files. Topics covered include Microsoft's Rich Text Format, WordPerfect encoding, text compression, and especially encoding and decoding with UNIX programs. (LB)
An EXCEL macro for importing log ASCII standard (LAS) files into EXCEL worksheets
NASA Astrophysics Data System (ADS)
Özkaya, Sait Ismail
1996-02-01
An EXCEL 5.0 macro is presented for converting a LAS text file into an EXCEL worksheet. Although EXCEL has commands for importing text files and parsing text lines, LAS files must be decoded line by line because three different delimiters are used to separate fields of differing length. The macro is intended to eliminate manual decoding of LAS version 2.0. LAS is a floppy-disk format for storage and transfer of log data as text files, proposed by the Canadian Well Logging Society. The present EXCEL macro decodes the different sections of a LAS file and separates and places the fields into different columns of an EXCEL worksheet. To import a LAS file into EXCEL without errors, the file must not contain any unrecognized symbols, and the data section must be the last section. The program does not check for the presence of mandatory sections or fields as required by LAS rules. Once a file is incorporated into EXCEL, mandatory sections and fields may be inspected visually.
Ground-water withdrawals from the Coastal Plain of New Jersey, 1956-1980
Vowinkel, E.F.
1984-01-01
Withdrawals and site data for wells with a pump capacity of 100,000 gallons per day or greater in the Coastal Plain of New Jersey are stored in computer files for 1956-80. The data are aggregated by computer into tables, graphs and maps to show the distribution of ground-water withdrawals. Withdrawals are reported by type of use and aquifer for each county in the Coastal Plain. Public-supply wells withdraw the largest quantity of ground water in the Coastal Plain, followed by industrial and agricultural wells. In 1980 public-supply withdrawals were about 280 million gallons per day; the maximum monthly rate was about 355 million gallons per day in July, and the lowest was about 215 million gallons per day in February. Average industrial withdrawals were about 65 million gallons per day. Ground-water withdrawals used for agriculture vary significantly during the year. In 1980, about 75 percent of the agricultural withdrawals occurred from June through September. Several aquifers are used as sources of water supply in the Coastal Plain. Five regional aquifers are the major sources of water for public-supply, industrial, or agricultural use. In decreasing order of withdrawals in 1980, in million gallons per day, they are: the Potomac-Raritan-Magothy aquifer system, 243; Kirkwood-Cohansey aquifer system, 70; Atlantic City 800-foot sand, 21; Englishtown aquifer, 12; and the Wenonah-Mount Laurel aquifer system, 5. (USGS)
Listen Up! The Best in Educational Audio
ERIC Educational Resources Information Center
Adam, Anna; Mowers, Helen
2007-01-01
Podcasts are a great way to expand learning beyond the four walls of the classroom or library. Just imagine taking the students on a tour of the great halls of the Louvre one day and the high-altitude plains of the Peruvian altiplano the next. This and more can be done with podcasts, episodic digital files that are the 21st-century equivalent of…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-02
... notice advises the public that the Bureau of Indian Affairs intends to file a draft environmental impact... available for public review and BIA will hold a public hearing to receive comments on the mixed use project. DATES: Submit written comments by April 16, 2012. The public hearing will be held on March 26, 2012...
Compressing images for the Internet
NASA Astrophysics Data System (ADS)
Beretta, Giordano B.
1998-01-01
The World Wide Web has rapidly become the hot new mass communications medium. Content creators are using similar design and layout styles as in printed magazines, i.e., with many color images and graphics. The information is transmitted over plain telephone lines, where the speed/price trade-off is much more severe than in the case of printed media. The standard design approach is to use palettized color and to limit as much as possible the number of colors used, so that the images can be encoded with a small number of bits per pixel using the Graphics Interchange Format (GIF) file format. The World Wide Web standards contemplate a second data encoding method (JPEG) that allows color fidelity but usually performs poorly on text, which is a critical element of information communicated on this medium. We analyze the spatial compression of color images and describe a methodology for using the JPEG method in a way that allows a compact representation while preserving full color fidelity.
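The GIF-versus-truecolor trade-off described above comes down to bits per pixel: a palettized image needs only ceil(log2(n)) bits per pixel for an n-color palette, versus 24 for truecolor. A back-of-envelope sketch (ignoring LZW compression, file headers, and the palette table itself):

```python
import math

def gif_bits_per_pixel(n_colors):
    """GIF stores palette indices, so the pre-compression cost per
    pixel is ceil(log2(n_colors)) bits, with a minimum of 1."""
    return max(1, math.ceil(math.log2(n_colors)))

def raw_size_bytes(width, height, n_colors=None):
    """Uncompressed size estimate: palettized if n_colors is given,
    otherwise 24-bit truecolor."""
    bpp = gif_bits_per_pixel(n_colors) if n_colors else 24
    return width * height * bpp // 8
```

For example, limiting a 100 x 100 image to a 16-color palette cuts the raw pixel data from 30 000 bytes to 5 000 before compression even begins, which is why designers constrained to telephone-line bandwidth limit their color counts so aggressively.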
JAva GUi for Applied Research (JAGUAR) v 3.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
JAGUAR is a Java software tool for automatically rendering a graphical user interface (GUI) from a structured input specification. It is designed as a plug-in to the Eclipse workbench to enable users to create, edit, and externally execute analysis application input decks and then view the results. JAGUAR serves as a GUI for Sandia's DAKOTA software toolkit for optimization and uncertainty quantification. It includes problem (input deck) set-up, option specification, analysis execution, and results visualization. Through the use of wizards, templates, and views, JAGUAR helps users navigate the complexity of DAKOTA's complete input specification. JAGUAR is implemented in Java, leveraging Eclipse extension points and the Eclipse user interface. JAGUAR parses a DAKOTA NIDR input specification and presents the user with linked graphical and plain text representations of problem set-up and option specification for DAKOTA studies. After the data has been input by the user, JAGUAR generates one or more input files for DAKOTA, executes DAKOTA, and captures and interprets the results.
Early Detection | Division of Cancer Prevention
Scabies: Workplace Frequently Asked Questions (FAQs)
A fast image encryption algorithm based on only blocks in cipher text
NASA Astrophysics Data System (ADS)
Wang, Xing-Yuan; Wang, Qian
2014-03-01
In this paper, a fast image encryption algorithm is proposed in which the shuffling and diffusion are performed simultaneously. The cipher-text image is divided into blocks of k × k pixels each, while the pixels of the plain text are scanned one by one. Four logistic maps are used to generate the encryption key stream and the new place in the cipher image of each plain-image pixel, including the row and column of the block to which the pixel belongs and the place where the pixel is placed within the block. After each pixel is encrypted, the initial conditions of the logistic maps are changed according to the encrypted pixel's value; after each row of the plain image is encrypted, the initial condition is also changed by the skew tent map. Finally, it is illustrated that this algorithm is fast and has a large key space and better properties in withstanding differential attacks, statistical analysis, and known-plaintext and chosen-plaintext attacks.
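The full algorithm (four logistic maps, k × k block shuffling, and a skew tent map) cannot be reconstructed from the abstract alone, but the core keystream-with-feedback idea — a chaotic map generating key bytes whose state is perturbed by each cipher byte, so the keystream depends on the plaintext history — can be sketched with a single logistic map. This is an illustrative toy, not the authors' cipher and not secure:

```python
def logistic_stream(data: bytes, x0: float, decrypt: bool = False,
                    r: float = 3.99) -> bytes:
    """Toy logistic-map stream cipher.  Each keystream byte comes
    from one iterate of x = r*x*(1-x) (x0 must lie in (0, 1)); the
    state is then perturbed by the cipher byte just produced.  XOR
    makes one function serve both directions: the cipher byte is the
    output when encrypting and the input when decrypting."""
    out = bytearray()
    x = x0
    for b in data:
        x = r * x * (1.0 - x)          # iterate the chaotic map
        k = int(x * 256) % 256         # keystream byte
        o = b ^ k
        out.append(o)
        cipher_byte = b if decrypt else o
        x = (x + cipher_byte / 1024.0) % 1.0   # plaintext-dependent feedback
        if x == 0.0:
            x = 0.5                    # keep the map off its fixed point
    return bytes(out)
```

The feedback step is what gives such schemes their claimed resistance to differential and chosen-plaintext attacks: flipping one plaintext byte changes every subsequent keystream byte.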
Semi automatic indexing of PostScript files using Medical Text Indexer in medical education.
Mollah, Shamim Ara; Cimino, Christopher
2007-10-11
At Albert Einstein College of Medicine, a large part of the online lecture materials consists of PostScript files. As the collection grows, it becomes essential to create a digital library with easy access to relevant sections of lecture material that is full-text indexed; to create this index it is necessary to extract all the text from the document files that constitute the originals of the lectures. In this study we present a semi-automatic indexing method using a robust technique for extracting text from PostScript files and the National Library of Medicine's Medical Text Indexer (MTI) program for indexing the text. This model can be applied to other medical schools for indexing purposes.
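The paper's extraction technique is not detailed in the abstract. As a toy illustration of pulling indexable text out of PostScript, one can match the literal strings passed to the `show` operator. Real PostScript text extraction is far harder (kerned `ashow`/`widthshow` calls, re-encoded fonts, procedurally built strings), so this handles only the simplest case:

```python
import re

# Matches a parenthesized PostScript string immediately followed by a
# `show` operator.  Escape sequences inside the string are allowed,
# but kerning operators and re-encoded fonts are deliberately ignored
# in this simplification.
_SHOW = re.compile(r"\(((?:[^()\\]|\\.)*)\)\s*show")

def extract_text(ps_source: str) -> str:
    """Pull the literal strings drawn by simple `(...) show` calls
    out of PostScript source, in document order."""
    return " ".join(m.group(1) for m in _SHOW.finditer(ps_source))
```

The extracted string stream is the kind of raw input an indexer such as MTI would then analyze.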
... because it was fifth in a list of historical classifications of common skin rash illnesses in children.
FastStats: Chronic Liver Disease and Cirrhosis
Tools for Requirements Management: A Comparison of Telelogic DOORS and the HiVe
2006-07-01
The file types DOORS deals with are text files, spreadsheets, FrameMaker, rich text, Microsoft Word and Microsoft Project. 2.5.1 Predefined file formats DOORS... during the export. DOORS exports FrameMaker files in an incomplete format, meaning DOORS-exported files will have to be opened in FrameMaker and saved
76 FR 10405 - Federal Copyright Protection of Sound Recordings Fixed Before February 15, 1972
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-24
... file in either the Adobe Portable Document File (PDF) format that contains searchable, accessible text (not an image); Microsoft Word; WordPerfect; Rich Text Format (RTF); or ASCII text file format (not a..., comments may be delivered in hard copy. If hand delivered by a private party, an original [[Page 10406...
Data Science Bowl Launched to Improve Lung Cancer Screening | Division of Cancer Prevention
Detailed Sections from Auger Holes in the Roanoke Rapids 1:100,000 Map Sheet, North Carolina
Weems, Robert E.; Lewis, William C.
2007-01-01
Introduction The Roanoke Rapids 1:100,000 map sheet straddles the Coastal Plain / Piedmont boundary in northernmost North Carolina (Figure 1). Sediments of the Coastal Plain underlie the eastern three-fourths of this area, and patchy outliers of Coastal Plain units cap many of the higher hills in the western one-fourth of the area. Sediments dip gently to the east and reach a maximum known thickness in the extreme southeast part of the map area (Figure 2). The gentle eastward dip is disrupted in several areas due to faulting. The U.S. Geological Survey recovered one core and augered 97 research test holes within the Roanoke Rapids 1:100,000 map sheet to supplement sparse outcrop data available from the Coastal Plain portion of the map area. The recovered sediments were studied and data from them recorded to determine the lithologic characteristics, spatial distribution, and temporal framework of the represented Coastal Plain stratigraphic units. These test holes were critical for accurately determining the distribution of major geologic units and the position of unit boundaries that will be shown on the forthcoming Roanoke Rapids geologic map, but much of the detailed subsurface data cannot be shown readily through this map product. Therefore, detailed descriptions have been collected in this open-file report for geologists, hydrologists, engineers, and community planners to provide a detailed shallow-subsurface stratigraphic framework for much of the Roanoke Rapids map region.
Smoking and Tobacco Use Health Effects
NIH Seeks Input on In-patient Clinical Research Areas | Division of Cancer Prevention
Pancreatic Cancer Detection Consortium (PCDC) | Division of Cancer Prevention
Houston, Natalie A.; Gonzales-Bradford, Sophia L.; Flynn, Amanda T.; Qi, Sharon L.; Peterson, Steven M.; Stanton, Jennifer S.; Ryter, Derek W.; Sohl, Terry L.; Senay, Gabriel B.
2013-01-01
The High Plains aquifer underlies almost 112 million acres in the central United States. It is one of the largest aquifers in the Nation in terms of annual groundwater withdrawals and provides drinking water for 2.3 million people. The High Plains aquifer has gained national and international attention as a highly stressed groundwater supply, primarily because it has been appreciably depleted in some areas. The U.S. Geological Survey has an active program to monitor the changes in groundwater levels for the High Plains aquifer and has documented substantial water-level changes since predevelopment. The High Plains Groundwater Availability Study is part of a series of regional groundwater availability studies conducted to evaluate the availability and sustainability of major aquifers across the Nation. The goals of the regional groundwater studies are to quantify current groundwater resources in an aquifer system, evaluate how these resources have changed over time, and provide tools to better understand a system's response to future demands and environmental stresses. The purpose of this report is to present selected data developed and synthesized for the High Plains aquifer as part of the High Plains Groundwater Availability Study. The High Plains Groundwater Availability Study includes the development of a water-budget-component analysis for the High Plains, completed in 2011, and the development of a groundwater-flow model for the northern High Plains aquifer. Both of these tasks require large amounts of data about the High Plains aquifer. Data pertaining to the High Plains aquifer were collected, synthesized, and then organized into digital data containers called geodatabases. There are 8 geodatabases (1 file geodatabase and 7 personal geodatabases) grouped in three categories: hydrogeologic data, remote sensing data, and water-budget-component data.
The hydrogeologic data pertaining to the northern High Plains aquifer are included in three separate geodatabases: (1) base data from a groundwater-flow model; (2) hydrogeology and hydraulic properties data; and (3) groundwater-flow model data to be used as calibration targets. The remote sensing data for this study were developed by the U.S. Geological Survey Earth Resources Observation and Science Center and include historical and predicted land-use/land-cover data and actual evapotranspiration data derived by using remotely sensed temperature data. The water-budget-component data contain selected raster data from maps in the "Selected Approaches to Estimate Water-Budget Components of the High Plains, 1940 Through 1949 and 2000 Through 2009" report completed in 2011 (http://pubs.usgs.gov/sir/2011/5183/). Federal Geographic Data Committee-compliant metadata were created for each spatial and tabular data layer in the geodatabases.
Tobacco plain packaging: Evidence based policy or public health advocacy?
McKeganey, Neil; Russell, Christopher
2015-06-01
In December 2012, Australia became the first country to require all tobacco products be sold solely in standardised or 'plain' packaging, bereft of the manufacturers' trademarked branding and colours, although retaining large graphic and text health warnings. Following the publication of Sir Cyril Chantler's review of the evidence on the effects of plain tobacco packaging, the Ministers of the United Kingdom Parliament voted in March 2015 to implement similar legislation. Support for plain packaging derives from the belief that tobacco products sold in plain packs have reduced appeal and so are more likely to deter young people and non-smokers from starting tobacco use, and more likely to motivate smokers to quit and stay quit. This article considers why support for the plain packaging policy has grown among tobacco control researchers, public health advocates and government ministers, and reviews Australian survey data that speak to the possible introductory effect of plain packaging on smoking prevalence within Australia. The article concludes by emphasising the need for more detailed research to be undertaken before judging the capacity of the plain packaging policy to deliver the multitude of positive effects that have been claimed by its most ardent supporters. Copyright © 2015 Elsevier B.V. All rights reserved.
Text and Illustration Processing System (TIPS) User’s Manual. Volume 1. Text Processing System.
1981-07-01
... in the file catalog. To copy a file, begin by calling up the file. Access the Main Menu and press 2 - Edit an Existing File. After you have... MAKING REVISIONS... Call Up an Existing File... The screen above the keyboard is called a Cathode Ray Tube (CRT). It displays information as you key it in. A CURSOR is an underscore character on the screen which
P2P Watch: Personal Health Information Detection in Peer-to-Peer File-Sharing Networks
El Emam, Khaled; Arbuckle, Luk; Neri, Emilio; Rose, Sean; Jonker, Elizabeth
2012-01-01
Background: Users of peer-to-peer (P2P) file-sharing networks risk the inadvertent disclosure of personal health information (PHI). In addition to potentially causing harm to the affected individuals, this can heighten the risk of data breaches for health information custodians. Automated PHI detection tools that crawl the P2P networks can identify PHI and alert custodians. While there has been previous work on the detection of personal information in electronic health records, there has been a dearth of research on the automated detection of PHI in heterogeneous user files. Objective: To build a system that accurately detects PHI in files sent through P2P file-sharing networks. The system, which we call P2P Watch, uses a pipeline of text processing techniques to automatically detect PHI in files exchanged through P2P networks. P2P Watch processes unstructured texts regardless of the file format, document type, and content. Methods: We developed P2P Watch to extract and analyze PHI in text files exchanged on P2P networks. We labeled texts as PHI if they contained identifiable information about a person (eg, name and date of birth) and specifics of the person's health (eg, diagnosis, prescriptions, and medical procedures). We evaluated the system's performance through its efficiency and effectiveness on 3924 files gathered from three P2P networks. Results: P2P Watch successfully processed 3924 P2P files of unknown content. A manual examination of 1578 randomly selected files marked by the system as non-PHI confirmed that these files indeed did not contain PHI, making the false-negative detection rate equal to zero. Of 57 files marked by the system as PHI, all contained both personally identifiable information and health information: 11 files were PHI disclosures, and 46 files contained organizational materials such as unfilled insurance forms, job applications by medical professionals, and essays.
Conclusions: PHI can be successfully detected in free-form textual files exchanged through P2P networks. Once the files with PHI are detected, affected individuals or data custodians can be alerted to take remedial action. PMID:22776692
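The labeling rule used to evaluate P2P Watch (a text counts as PHI only when personally identifiable information and health information co-occur) can be sketched as a toy filter. This is an illustrative reimplementation with made-up patterns, not the actual P2P Watch pipeline:

```python
import re

# Toy patterns, illustrative only; a real PHI pipeline is far richer.
IDENTIFIER_PATTERNS = [
    re.compile(r"\b\d{4}-\d{2}-\d{2}\b"),      # ISO-style date of birth
    re.compile(r"\bDOB[:\s]", re.IGNORECASE),  # explicit DOB label
]
HEALTH_PATTERNS = [
    re.compile(r"\b(diagnosis|prescription|procedure)\b", re.IGNORECASE),
]

def looks_like_phi(text: str) -> bool:
    """Flag text only when an identifier AND a health detail co-occur,
    mirroring the labeling rule described in the abstract."""
    has_id = any(p.search(text) for p in IDENTIFIER_PATTERNS)
    has_health = any(p.search(text) for p in HEALTH_PATTERNS)
    return has_id and has_health
```

Under this rule, an unfilled insurance form or a job application containing health vocabulary but no personal identifiers would not be flagged, which matches the distinction the study draws between PHI disclosures and organizational materials.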
P2P watch: personal health information detection in peer-to-peer file-sharing networks.
Sokolova, Marina; El Emam, Khaled; Arbuckle, Luk; Neri, Emilio; Rose, Sean; Jonker, Elizabeth
2012-07-09
Users of peer-to-peer (P2P) file-sharing networks risk the inadvertent disclosure of personal health information (PHI). In addition to potentially causing harm to the affected individuals, this can heighten the risk of data breaches for health information custodians. Automated PHI detection tools that crawl the P2P networks can identify PHI and alert custodians. While there has been previous work on the detection of personal information in electronic health records, there has been a dearth of research on the automated detection of PHI in heterogeneous user files. Our objective was to build a system that accurately detects PHI in files sent through P2P file-sharing networks. The system, which we call P2P Watch, uses a pipeline of text processing techniques to automatically detect PHI in files exchanged through P2P networks. P2P Watch processes unstructured texts regardless of the file format, document type, and content. We developed P2P Watch to extract and analyze PHI in text files exchanged on P2P networks. We labeled texts as PHI if they contained identifiable information about a person (eg, name and date of birth) and specifics of the person's health (eg, diagnosis, prescriptions, and medical procedures). We evaluated the system's performance through its efficiency and effectiveness on 3924 files gathered from three P2P networks. P2P Watch successfully processed 3924 P2P files of unknown content. A manual examination of 1578 randomly selected files marked by the system as non-PHI confirmed that these files indeed did not contain PHI, making the false-negative detection rate equal to zero. Of 57 files marked by the system as PHI, all contained both personally identifiable information and health information: 11 files were PHI disclosures, and 46 files contained organizational materials such as unfilled insurance forms, job applications by medical professionals, and essays. PHI can be successfully detected in free-form textual files exchanged through P2P networks. 
Once the files with PHI are detected, affected individuals or data custodians can be alerted to take remedial action.
NASA Astrophysics Data System (ADS)
Suftin, I.; Read, J. S.; Walker, J.
2013-12-01
Scientists prefer not having to be tied down to a specific machine or operating system in order to analyze local and remote data sets or publish work. Increasingly, analysis has been migrating to decentralized web services and data sets, using web clients to provide the analysis interface. While simplifying workflow access, analysis, and publishing of data, the move does bring with it its own unique set of issues. Web clients used for analysis typically offer workflows geared towards a single user, with steps and results that are often difficult to recreate and share with others. Furthermore, workflow results often may not be easily used as input for further analysis. Older browsers further complicate things by having no way to maintain larger chunks of information, often offloading the job of storage to the back-end server or trying to squeeze it into a cookie. It has been difficult to provide a concept of "session storage" or "workflow sharing" without a complex orchestration of the back-end for storage depending on either a centralized file system or database. With the advent of HTML5, browsers gained the ability to store more information through the use of the Web Storage API (a browser-cookie holds a maximum of 4 kilobytes). Web Storage gives us the ability to store megabytes of arbitrary data in-browser either with an expiration date or just for a session. This allows scientists to create, update, persist and share their workflow without depending on the backend to store session information, providing the flexibility for new web-based workflows to emerge. In the DSASWeb portal ( http://cida.usgs.gov/DSASweb/ ), using these techniques, the representation of every step in the analyst's workflow is stored as plain-text serialized JSON, which we can generate as a text file and provide to the analyst as an upload. 
This file may then be shared with others and loaded back into the application, restoring the application to the state it was in when the session file was generated. A user may then view results produced during that session or go back and alter input parameters, creating new results and producing new, unique sessions which they can then again share. This technique not only provides independence for the user to manage their session as they like, but also allows much greater freedom for the application provider to scale out without having to worry about carrying over user information or maintaining it in a central location.
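The session mechanism described for DSASweb, with each workflow step serialized as plain-text JSON that can be downloaded and later re-loaded, amounts to a round trip like the following sketch. The step names and parameters here are invented for illustration:

```python
import json

# Hypothetical workflow steps; the real DSASweb session state is richer.
session = {
    "steps": [
        {"tool": "shoreline-upload", "params": {"file": "shore.zip"}},
        {"tool": "rate-calculation", "params": {"method": "linear-regression"}},
    ]
}

# Serialize to plain text, as a downloadable session file.
session_text = json.dumps(session, indent=2)

# Later (possibly in another browser, via an upload), restore the state.
restored = json.loads(session_text)
assert restored == session  # round trip is lossless
```

In the browser the serialized text would live in the HTML5 Web Storage API (localStorage/sessionStorage) rather than in a Python dict, but the serialize-share-restore cycle is the same.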
BatMass: a Java Software Platform for LC-MS Data Visualization in Proteomics and Metabolomics.
Avtonomov, Dmitry M; Raskind, Alexander; Nesvizhskii, Alexey I
2016-08-05
Mass spectrometry (MS) coupled to liquid chromatography (LC) is a commonly used technique in metabolomic and proteomic research. As the size and complexity of LC-MS-based experiments grow, it becomes increasingly more difficult to perform quality control of both raw data and processing results. In a practical setting, quality control steps for raw LC-MS data are often overlooked, and assessment of an experiment's success is based on some derived metrics such as "the number of identified compounds". The human brain interprets visual data much better than plain text, hence the saying "a picture is worth a thousand words". Here, we present the BatMass software package, which allows for performing quick quality control of raw LC-MS data through its fast visualization capabilities. It also serves as a testbed for developers of LC-MS data processing algorithms by providing a data access library for open mass spectrometry file formats and a means of visually mapping processing results back to the original data. We illustrate the utility of BatMass with several use cases of quality control and data exploration.
BatMass: a Java software platform for LC/MS data visualization in proteomics and metabolomics
Avtonomov, Dmitry; Raskind, Alexander; Nesvizhskii, Alexey I.
2017-01-01
Mass spectrometry (MS) coupled to liquid chromatography (LC) is a commonly used technique in metabolomic and proteomic research. As the size and complexity of LC/MS-based experiments grow, it becomes increasingly difficult to perform quality control of both raw data and processing results. In a practical setting, quality control steps for raw LC/MS data are often overlooked, and assessment of an experiment's success is based on derived metrics such as “the number of identified compounds”. The human brain interprets visual data much better than plain text, hence the saying “a picture is worth a thousand words”. Here we present the BatMass software package, which allows one to perform quick quality control of raw LC/MS data through its fast visualization capabilities. It also serves as a testbed for developers of LC/MS data processing algorithms by providing a data access library for open mass spectrometry file formats and a means of visually mapping processing results back to the original data. We illustrate the utility of BatMass with several use cases of quality control and data exploration. PMID:27306858
Extract and visualize geolocation from any text file
NASA Astrophysics Data System (ADS)
Boustani, M.
2015-12-01
There are a variety of text file formats, such as PDF and HTML, that contain words about locations (countries, cities, regions, and more). GeoParser was developed as a sub-project under DARPA Memex to help find geolocation information in crawled website data. It is a web application that uses Apache Tika to extract locations from any text file format and visualizes the geolocations on a map. https://github.com/MBoustani/GeoParser https://github.com/chrismattmann/tika-python http://www.darpa.mil/program/memex
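The location-extraction idea can be sketched with a tiny hand-made gazetteer. This is a toy stand-in: GeoParser itself relies on Apache Tika for text extraction and a much larger geographic index, and the place names and coordinates below are merely illustrative:

```python
# Toy gazetteer: lowercase place name -> (latitude, longitude).
GAZETTEER = {
    "paris": (48.8566, 2.3522),
    "nairobi": (-1.2921, 36.8219),
}

def extract_locations(text):
    """Return (name, (lat, lon)) pairs for gazetteer entries found in text."""
    lowered = text.lower()
    return [(name, coords) for name, coords in GAZETTEER.items()
            if name in lowered]
```

Each extracted (lat, lon) pair could then be plotted on a web map, which is the visualization step the abstract describes.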
But What Is There to See? An Exploration of a Great Plains Aesthetic
ERIC Educational Resources Information Center
Tangney, ShaunAnne
2004-01-01
In the fall of 2001 I taught a beginning college composition course at Minot State University, a small state university located in the northwestern quadrant of North Dakota. It is typical of such courses to include a fair amount of reading, and one of the texts I assigned was Ian Frazier's "Great Plains". The book is a travelogue that…
Crop-phenology and LANDSAT-based irrigated lands inventory in the high plains. [United States
NASA Technical Reports Server (NTRS)
Martinko, E. A. (Principal Investigator); Poracsky, J.; Kipp, E. R.; Krieger, H.
1981-01-01
Optimal LANDSAT image dates for 1980 were identified based on the weekly crop-weather reports for Colorado, New Mexico, South Dakota, Texas, Oklahoma, Kansas, Nebraska, and Wyoming. The 1979 agricultural statistics data were entered into computer files and a revised questionnaire was developed and mailed to ASCS county agents. A set of computer programs was developed to allow the preparation of computer-assisted graphic displays of much of the collected data.
Section 525(a) of the bankruptcy code plainly does not apply to Medicare provider agreements.
Sperow, E H
2001-01-01
Section 525(a) of the Bankruptcy Code prevents government entities from discriminating against debtors based on the debtor's bankruptcy filing. This Article analyzes how this provision is applied to healthcare providers who file for bankruptcy. Some commentators have expressed concerns that because of Section 525, the federal government is unable to deny a bankrupt provider a new Medicare provider agreement due to the debtor's failure to pay debts discharged during bankruptcy. This Article, however, argues that Section 525 does not apply to a provider agreements because it is not a "license, permit, charter, franchise, or other similar grant" as defined by the statute. Therefore, the author concludes that debtor healthcare providers should not be allowed back into the Medicare program without first paying their statutorily required debts.
Integration of Geophysical and Geochemical Data
NASA Astrophysics Data System (ADS)
Yamagishi, Y.; Suzuki, K.; Tamura, H.; Nagao, H.; Yanaka, H.; Tsuboi, S.
2006-12-01
Integration of geochemical and geophysical data would give us new insight into the nature of the Earth and should advance our understanding of the dynamics of the Earth's interior and surface processes. Today, various geochemical and geophysical data are available on the Internet. These data are stored in various database systems; each system is isolated and provides data in its own format. The goal of this study is to display the geochemical and geophysical data obtained from such databases together visually. We adopt Google Earth as the presentation tool. Google Earth is virtual-globe software provided free of charge by Google, Inc. It displays the Earth's surface using satellite images with a mean resolution of ~15 m. We display graphical features on Google Earth through KML files, and we have developed software to convert geochemical and geophysical data to KML. First, we overlaid data from Georoc and PetDB, together with seismic tomography data, on Google Earth. Georoc and PetDB are both online database systems for geochemical data. The data format of Georoc is CSV and that of PetDB is Microsoft Excel; the tomography data we used are plain text. The conversion software can process all of these file formats. The geochemical data (e.g., compositional abundance) are displayed as three-dimensional columns on the Earth's surface. The shape and color of a column indicate the element type, and the size and color tone vary according to the element's abundance. The tomography data can be converted into a KML file for each depth. This overlay of geochemical and tomography data should help us correlate internal temperature anomalies with geochemical anomalies observed at the surface of the Earth. Our tool can convert any geophysical or geochemical data to KML as long as the data are associated with longitude and latitude. We are going to support more geophysical data formats.
In addition, we are currently trying to obtain scientific insights into the Earth's interior based on the combined view of geophysical and geochemical data on Google Earth.
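Converting a point record carrying latitude and longitude into a KML placemark, the core step the abstract describes, can be sketched as follows. The field names and values are illustrative, and note that KML lists coordinates in longitude, latitude order:

```python
# Minimal KML generator for (lat, lon, label) point records; a sketch of
# the conversion idea, not the actual Georoc/PetDB converter.
def to_kml(records):
    placemarks = []
    for lat, lon, label in records:
        placemarks.append(
            "<Placemark>"
            f"<name>{label}</name>"
            # KML coordinate order is lon,lat,altitude.
            f"<Point><coordinates>{lon},{lat},0</coordinates></Point>"
            "</Placemark>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<kml xmlns="http://www.opengis.net/kml/2.2">'
        f"<Document>{''.join(placemarks)}</Document></kml>"
    )
```

The resulting file can be opened directly in Google Earth; the three-dimensional columns described in the abstract would use extruded geometry rather than simple Point placemarks.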
Design of the aerosol sampling manifold for the Southern Great Plains site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leifer, R.; Knuth, R.H.; Guggenheim, S.F.
1995-04-01
To meet the needs of the ARM program, the Environmental Measurements Laboratory (EML) has the responsibility to establish a surface aerosol measurements program at the Southern Great Plains (SGP) site in Lamont, OK. At the present time, EML has scheduled installation of five instruments at SGP: a single-wavelength nephelometer, an optical particle counter (OPC), a condensation particle counter (CPC), an optical absorption monitor (OAM), and an ozone monitor. ARM's operating protocol requires that all the observational data be placed online and sent to the main computer facility in real time. EML currently maintains a computer file containing back trajectory (BT) analyses for the SGP site. These trajectories are used to characterize air mass types as they pass over the site. EML is continuing to calculate and store the resulting trajectory analyses for future use by the ARM science team.
Flores, Romeo M.; Ochs, A.M.; Stricker, G.D.; Ellis, M.S.; Roberts, S.B.; Keighin, C.W.; Murphy, E.C.; Cavaroc, V.V.; Johnson, R.C.; Wilde, E.M.
1999-01-01
One of the objectives of the National Coal Resource Assessment in the Northern Rocky Mountains and Great Plains region was to compile stratigraphic and coal quality-trace-element data on selected and potentially minable coal beds and zones of the Fort Union Formation (Paleocene) and equivalent formations. In order to implement this objective, drill-hole information was compiled from hard-copy and digital files of the: (1) U.S. Bureau of Land Management (BLM) offices in Casper, Rawlins, and Rock Springs, Wyoming, and in Billings, Montana, (2) State geological surveys of Montana, North Dakota, and Wyoming, (3) Wyoming Department of Environmental Quality in Cheyenne, (4) U.S. Office of Surface Mining in Denver, Colorado, (5) U.S. Geological Survey, National Coal Resource Data System (NCRDS) in Reston, Virginia, (6) U.S. Geological Survey coal publications, (7) university theses, and (8) mining companies.
Ellis, M.S.; Nichols, D.J.
2002-01-01
In 1999, 1,100 million short tons of coal were produced in the United States, 38 percent from the Northern Rocky Mountains and Great Plains region. This coal has low ash content, and sulfur content is in compliance with Clean Air Act standards (U.S. Statutes at Large, 1990).The National Coal Resource Assessment for this region includes geologic, stratigraphic, palynologic, and geochemical studies and resource calculations for 18 major coal zones in the Powder River, Williston, Green River, Hanna, and Carbon Basins. Calculated resources are 660,000 million short tons. Results of the study are available in U.S. Geological Survey Professional Paper 1625?A (Fort Union Coal Assess-ment Team, 1999) and Open-File Report 99-376 (Flores and others, 1999) in CD-ROM format.
Digital radiographic imaging transfer: comparison with plain radiographs.
Averch, T D; O'Sullivan, D; Breitenbach, C; Beser, N; Schulam, P G; Moore, R G; Kavoussi, L R
1997-04-01
Advances in digital imaging and computer display technology have allowed development of clinical teleradiographic systems. There are limited data assessing the effectiveness of such systems when applied to urologic pathology. In an effort to appraise the effectiveness of teleradiology in identifying renal calculi, the accuracy of findings on transmitted radiographic images was compared with that of findings made when viewing the actual plain film. Plain films (KUB) were obtained from 26 patients who presented to the radiology department to rule out urinary calculous disease. The films were digitized by a radiograph scanner into ACR-NEMA 2 file format, compressed by a NASA algorithm, and transferred via a 28.8-kbps modem over standard telephone lines to a remote site 25 miles away, where they were decompressed and viewed on a 1600 x 1200-pixel monitor. Two attending urologists and two endourologic fellows were randomized to read either the transmitted image or the original radiograph with minimal clinical history provided. Of the 26 plain radiographic films, 24 were correctly interpreted by the fellows and 25 by the attending physicians (92% and 96% accuracy, respectively) for a total accuracy of 94%, with no statistical difference (p = 0.16). After compression, all but one of the digital images were transferred successfully. The attending physicians correctly interpreted 24 of the 25 digital images (96%), whereas the fellows were correct on 21 interpretations (84%), resulting in a total 90% accuracy with a significant difference between the groups (p < or = 0.04). Overall, no statistical difference between the interpretations of the plain film and the digital image was revealed (p = 0.21). Using available technology, KUB images can be transmitted to a remote site, and the location of a stone can be determined correctly. Higher accuracy is demonstrated by experienced surgeons.
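The digitize-compress-transmit-decompress pipeline in this study can be illustrated with a generic lossless codec from the standard library. The study used a NASA compression algorithm, not zlib, so this is only a sketch of the workflow, with placeholder data standing in for the scanned image:

```python
import zlib

# Placeholder "image" data; a real KUB scan would be an ACR-NEMA file.
image_bytes = bytes(range(256)) * 64

# Compress before sending over the slow (28.8-kbps) link...
compressed = zlib.compress(image_bytes, level=9)

# ...then decompress at the remote viewing station.
restored = zlib.decompress(compressed)
assert restored == image_bytes  # lossless: the image is bit-identical
```

Losslessness matters here: the 90-94% reading accuracies reported in the study reflect human interpretation and display limits, not information discarded by the compression step.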
2002-03-01
This inspirational publication is one that practises what it preaches. It is about clear writing and provides very useful guidance to the nursing professional who wants to get an unclouded message across. It provides useful examples throughout to guide the reader and highlights how simple interventions can make big improvements.
Chemopreventive Agent Development | Division of Cancer Prevention
Mays, Darren; Niaura, Raymond S.; Evans, W. Douglas; Hammond, David; Luta, George; Tercyak, Kenneth P.
2014-01-01
Objective: This study examined the impact of pictorial cigarette warning labels, warning label message framing, and plain cigarette packaging on young adult smokers’ motivation to quit. Methods: Smokers ages 18–30 (n=740) from a consumer research panel were randomized to one of four experimental conditions where they viewed online images of 4 cigarette packs with warnings about lung disease, cancer, stroke/heart disease, and death, respectively. Packs differed across conditions by warning message framing (gain versus loss) and packaging (branded versus plain). Measures captured demographics, smoking behavior, covariates, and motivation to quit in response to cigarette packs. Results: Pictorial warnings about lung disease and cancer generated the strongest motivation to quit across conditions. Adjusting for pre-test motivation and covariates, a message framing by packaging interaction revealed gain-framed warnings on plain packs generated greater motivation to quit for lung disease, cancer, and mortality warnings (p < 0.05), compared with loss-framed warnings on plain packs. Conclusions: Warnings combining pictorial depictions of smoking-related health risks with text-based messages about how quitting reduces risks may achieve better outcomes among young adults, especially in countries considering or implementing plain packaging regulations. PMID:24420310
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-01
... instant rule filing, the Exchange is proposing to amend the text of the Certificate previously filed with... rule change that are filed with the Commission, and all written communications relating to the proposed...
Prostate and Urologic Cancer | Division of Cancer Prevention
Code of Federal Regulations, 2013 CFR
2013-10-01
... Start field End field Record type 1 Numeric 1=Filing2=Cancellation B 1 1 Insurer number 8 Text FMCSA... Filing type 1 Numeric 1 = BI&PD2 = Cargo 3 = Bond 4 = Trust Fund B 10 10 FMCSA docket number 8 Text FMCSA... 264 265 Insured zip code 9 Numeric (Do not include dash if using 9 digit code) B 266 274 Insured...
Code of Federal Regulations, 2014 CFR
2014-10-01
... Start field End field Record type 1 Numeric 1=Filing2=Cancellation B 1 1 Insurer number 8 Text FMCSA... Filing type 1 Numeric 1 = BI&PD2 = Cargo 3 = Bond 4 = Trust Fund B 10 10 FMCSA docket number 8 Text FMCSA... 264 265 Insured zip code 9 Numeric (Do not include dash if using 9 digit code) B 266 274 Insured...
Alton Ochsner, MD (1896-1981): surgical pioneer and legacy linking smoking and disease.
Costantino, Christina L; Winter, Jordan M; Yeo, Charles J; Cowan, Scott W
2015-06-01
Edward William Alton Ochsner kept a plain, metal card file in which he recorded close to 50 years' worth of medical experiences, research, and insights. The most populated topics were filed as "Cancer, Lung" and "Cancer, Bronchogenic." These reflected his areas of greatest interest, for which he would go on to produce groundbreaking work. Of his many lifetime accomplishments, he is perhaps best known for being the first to report a link between cigarette smoking and lung cancer. This was just one of the many ways in which Ochsner worked to effect social change. The establishment of the Ochsner Health System in New Orleans was born of this same passion. Ochsner went on to become one of the giants of his generation as a result of his tireless work as a leader, educator, and mentor.
The dairy_wa.zip file is a zip file containing an Arc/Info export file and a text document. Note the DISCLAIM.TXT file as these data are not verified. Map extent: statewide. Input Source: Address database obtained from Wa Dept of Agriculture. Data was originally developed und...
Extending the Kerberos Protocol for Distributed Data as a Service
2012-09-20
exported as a UIMA [11] PEAR file for deployment to IBM Content Analytics (ICA). A UIMA PEAR file is a deployable text analytics “pipeline” (analogous ... to a web application packaged in a WAR file). ICA is a text analysis and search application that supports UIMA. The key entities targeted by NLP rules ... workbench. [Online]. Available: https://www.ibm.com/developerworks/community/alphaworks/lrw/ [11] Apache UIMA. [Online]. Available: http
Arkansas and Louisiana Aeromagnetic and Gravity Maps and Data - A Website for Distribution of Data
Bankey, Viki; Daniels, David L.
2008-01-01
This report contains digital data, image files, and text files describing data formats for aeromagnetic and gravity data used to compile the State aeromagnetic and gravity maps of Arkansas and Louisiana. The digital files include grids, images, ArcInfo, and Geosoft compatible files. In some of the data folders, ASCII files with the extension 'txt' describe the format and contents of the data files. Read the 'txt' files before using the data files.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-30
....19b-4. \\3\\ The text of the proposed rule change is attached as Exhibit 5 to NSCC's filing, which is... any comments it received on the proposed rule change. The text of these statements may be examined at... the text of the summaries prepared by NSCC. (A) Self-Regulatory Organization's Statement of the...
ERIC Educational Resources Information Center
Kichuk, Diana
2015-01-01
The electronic conversion of scanned image files to readable text using optical character recognition (OCR) software and the subsequent migration of raw OCR text to e-book text file formats are key remediation or media conversion technologies used in digital repository e-book production. Despite real progress, the OCR problem of reliability and…
Mucan, Burcu; Moodie, Crawford
2017-11-09
The Turkish Government's 'National Tobacco Control Program 2015-2018' included plans to introduce plain packaging and also a ban on brand names on cigarette packs, allowing only assigned numbers on packs. We explored perceptions of these proposed measures, and also pack inserts with cessation messages, another novel way of using the packaging to communicate with consumers. Eight focus groups were conducted with 47 young adult smokers in Manisa and Kutahya (Turkey) in December 2016. Participants were shown three straight-edged plain cigarette packs, as required in Australia, and then three bevelled-edged plain packs, as permitted in the UK. They were then shown plain packs with numbers rather than brand names, and finally three pack inserts with messages encouraging quitting or offering tips on how to do so. Participants were asked about their perceptions of each. Plain packs were considered unappealing and off-putting, although the bevelled-edged packs were viewed more favourably than the straight-edged packs. Numbered packs were thought by some to diminish the appeal created by the brand name and potentially decrease interest among never smokers and newer smokers. Pack inserts were thought to have less of an impact than the on-pack warnings, but could potentially help discourage initiation and encourage cessation. That bevelled-edged plain packs were perceived more positively than straight-edged plain packs is relevant to countries planning to introduce plain packaging. The study provides a first insight into smokers' perceptions of a ban on brand names, which was perceived to reduce appeal among young people. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
ALPS - A LINEAR PROGRAM SOLVER
NASA Technical Reports Server (NTRS)
Viterna, L. A.
1994-01-01
Linear programming is a widely-used engineering and management tool. Scheduling, resource allocation, and production planning are all well-known applications of linear programs (LP's). Most LP's are too large to be solved by hand, so over the decades many computer codes for solving LP's have been developed. ALPS, A Linear Program Solver, is a full-featured LP analysis program. ALPS can solve plain linear programs as well as more complicated mixed integer and pure integer programs. ALPS also contains an efficient solution technique for pure binary (0-1 integer) programs. One of the many weaknesses of LP solvers is the lack of interaction with the user. ALPS is a menu-driven program with no special commands or keywords to learn. In addition, ALPS contains a full-screen editor to enter and maintain the LP formulation. These formulations can be written to and read from plain ASCII files for portability. For those less experienced in LP formulation, ALPS contains a problem "parser" which checks the formulation for errors. ALPS creates fully formatted, readable reports that can be sent to a printer or output file. ALPS is written entirely in IBM's APL2/PC product, Version 1.01. The APL2 workspace containing all the ALPS code can be run on any APL2/PC system (AT or 386). On a 32-bit system, this configuration can take advantage of all extended memory. The user can also examine and modify the ALPS code. The APL2 workspace has also been "packed" to be run on any DOS system (without APL2) as a stand-alone "EXE" file, but has limited memory capacity on a 640K system. A numeric coprocessor (80X87) is optional but recommended. The standard distribution medium for ALPS is a 5.25 inch 360K MS-DOS format diskette. IBM, IBM PC and IBM APL2 are registered trademarks of International Business Machines Corporation. MS-DOS is a registered trademark of Microsoft Corporation.
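A pure binary (0-1 integer) program of the kind ALPS handles can be illustrated by brute-force enumeration. This sketch shows the problem class only; it is not the efficient solution technique ALPS implements, and the objective and constraints below are invented:

```python
from itertools import product

def solve_binary_lp(c, A, b):
    """Maximize c.x subject to A.x <= b with each x[j] in {0, 1},
    by exhaustively enumerating all 2**n binary assignments."""
    n = len(c)
    best_x, best_val = None, float("-inf")
    for x in product((0, 1), repeat=n):
        feasible = all(sum(a[j] * x[j] for j in range(n)) <= bi
                       for a, bi in zip(A, b))
        if feasible:
            val = sum(c[j] * x[j] for j in range(n))
            if val > best_val:
                best_x, best_val = x, val
    return best_x, best_val

# Example: maximize 3*x0 + 2*x1 subject to x0 + x1 <= 1.
# Feasible points (0,0), (0,1), (1,0); the best is x = (1, 0), value 3.
```

Enumeration is exponential in the number of variables, which is exactly why a dedicated solution technique for binary programs, like the one ALPS contains, is worth having.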
Klett, T.R.; Le, P.A.
2007-01-01
This chapter describes data used in support of the process being applied by the U.S. Geological Survey (USGS) National Oil and Gas Assessment (NOGA) project. Digital tabular data used in this report, along with archival data that permit the user to perform further analyses, are available elsewhere on this CD-ROM, so the data can be imported into computers and software without the reader transcribing them from the Portable Document Format (.pdf) files of the text. Graphical images are provided as .pdf files, and tabular data are provided in raw form as tab-delimited text files (.tab files) because of the number and variety of platforms and software available.
ERIC Educational Resources Information Center
Jul, Erik
1992-01-01
Describes the use of file transfer protocol (FTP) on the INTERNET computer network and considers its use as an electronic publishing system. The differing electronic formats of text files are discussed; the preparation and access of documents are described; and problems are addressed, including a lack of consistency. (LRW)
48 CFR 1552.211-75 - Working files.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Working files. 1552.211-75... FORMS SOLICITATION PROVISIONS AND CONTRACT CLAUSES Texts of Provisions and Clauses 1552.211-75 Working.... Working Files (APR 1984) The Contractor shall maintain accurate working files (by task or work assignment...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-13
... electronic filing of comments and has dedicated eFiling expert staff available to assist you at (202) 502... Filings. An eComment is an easy method for interested persons to submit brief, text-only comments on a project; (2) You may file your comments electronically by using the eFiling feature, which is located on...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-13
... Commission encourages electronic filing of comments and has expert eFiling staff available to assist you at... Documents and Filings. An eComment is an easy method for interested persons to submit brief, text-only comments on a project; (2) You may file your comments electronically by using the eFiling feature, which is...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-01
... Text April 25, 2013. Pursuant to Section 19(b)(1) \\1\\ of the Securities Exchange Act of 1934 (the ``Act... clarifications to rule text. The Exchange has designated the proposed changes as immediately effective, and proposes to implement the changes on or shortly after the 30th day after the date of the filing. The text...
Koltun, G.F.; Ostheimer, Chad J.; Griffin, Michael S.
2006-01-01
Velocity, bathymetry, and transverse (cross-channel) mixing characteristics were studied in a 34-mile study reach of the Ohio River extending from the lower pool of the Captain Anthony Meldahl Lock and Dam, near Willow Grove, Ky., to just downstream from the confluence of the Licking and Ohio Rivers, near Newport, Ky. Information gathered in this study ultimately will be used to parameterize hydrodynamic and water-quality models that are being developed for the study reach. Velocity data were measured at an average cross-section spacing of about 2,200 feet by means of boat-mounted acoustic Doppler current profilers (ADCPs). ADCP data were postprocessed to create text files describing the three-dimensional velocity characteristics in each transect. Bathymetry data were measured at an average transect spacing of about 800 feet by means of a boat-mounted single-beam echosounder. Depth information obtained from the echosounder was postprocessed with water-surface slope and elevation information collected during the surveys to compute stream-bed elevations. The bathymetry data were written to text files formatted as a series of space-delimited x-, y-, and z-coordinates. Two separate dye-tracer studies were done on different days in overlapping stream segments in an 18.3-mile section of the study reach to assess transverse mixing characteristics in the Ohio River. Rhodamine WT dye was injected into the river at a constant rate, and concentrations were measured in downstream cross sections, generally spaced 1 to 2 miles apart. The dye was injected near the Kentucky shoreline during the first study and near the Ohio shoreline during the second study. Dye concentrations were measured along transects in the river by means of calibrated fluorometers equipped with flow-through chambers, automatic temperature compensation, and internal data loggers.
The use of flow-through chambers permitted water to be pumped continuously out of the river from selected depths and through the fluorometer for measurement as the boat traversed the river. Time-tagged concentration readings were joined with horizontal coordinate data simultaneously captured from a differentially corrected Global Positioning System (GPS) device to create a plain-text, comma-separated value file containing spatially tagged dye-concentration data. Plots showing the transverse variation in relative dye concentration indicate that, within the stream segments sampled, complete transverse mixing of the dye did not occur. In addition, the highest concentrations of dye tended to be nearest the side of the river from which the dye was injected. Velocity, bathymetry, and dye-concentration data collected during this study are available for Internet download by means of hyperlinks in this report. Data contained in this report were collected between October 2004 and March 2006.
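The bathymetry text format described above, space-delimited x-, y-, and z-coordinates, is straightforward to parse. A minimal sketch in Python; the coordinate values below are made up for illustration and are not from the actual dataset:

```python
# Hypothetical space-delimited bathymetry records: x, y, and
# stream-bed elevation z, one point per line.
lines = [
    "714102.3 4322117.8 128.4",
    "714105.1 4322120.2 127.9",
    "714110.6 4322124.7 129.1",
]

# str.split() on whitespace yields the three fields; map to floats.
points = [tuple(map(float, line.split())) for line in lines]

# Example use: the minimum stream-bed elevation in this sample.
min_z = min(z for _, _, z in points)
print(min_z)  # prints 127.9
```

Real files would be read line by line from disk, but the per-record parsing is identical.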
Becker, C.J.; Runkle, D.L.; Rea, Alan
1997-01-01
ARC/INFO export files. This diskette contains digitized aquifer boundaries and maps of hydraulic conductivity, recharge, and ground-water level elevation contours for the High Plains aquifer in western Oklahoma. This area encompasses the panhandle counties of Cimarron, Texas, and Beaver, and the western counties of Harper, Ellis, Woodward, Dewey, and Roger Mills. The High Plains aquifer underlies approximately 7,000 square miles of Oklahoma and is used extensively for irrigation. The High Plains aquifer is a water-table aquifer and consists predominantly of the Tertiary-age Ogallala Formation and overlying Quaternary-age alluvial and terrace deposits. In some areas the aquifer is absent and the underlying Triassic, Jurassic, or Cretaceous-age rocks are exposed at the surface. These rocks are hydraulically connected with the aquifer in some areas. The High Plains aquifer is composed of interbedded sand, siltstone, clay, gravel, thin limestones, and caliche. The proportion of various lithological materials changes rapidly from place to place, but poorly sorted sand and gravel predominate. The rocks are poorly to moderately well cemented by calcium carbonate. The aquifer boundaries, hydraulic conductivity, and recharge data sets were created by extracting geologic contact lines from published digital surficial geology maps based on a scale of 1:125,000 for the panhandle counties and 1:250,000 for the western counties. The water-level elevation contours and some boundary lines were digitized from maps in a published water-level elevation map for 1980 based on a scale of 1:250,000. The hydraulic conductivity and recharge values in this report were used as input to the ground-water flow model of the High Plains aquifer. Ground-water flow models are numerical representations that simplify and aggregate natural systems. Models are not unique; different combinations of aquifer characteristics may produce similar results.
Therefore, values of hydraulic conductivity and recharge used in the model and presented in this data set are not precise, but are within a reasonable range when compared to independently collected data.
Chan, K C; Pharoah, M; Lee, L; Weinreb, I; Perez-Ordonez, B
2013-01-01
The purpose of this case series is to present the common features of intraosseous mucoepidermoid carcinoma (IMC) of the jaws on plain film and CT imaging. Two oral and maxillofacial radiologists reviewed and characterized the common features of four biopsy-proven cases of IMC in the jaws on plain film and CT imaging obtained from the files of the Department of Oral Radiology, Faculty of Dentistry, University of Toronto, Toronto, Canada. The common features are a well-defined sclerotic periphery; the presence of internal amorphous sclerotic bone and numerous small loculations; a lack of septa bordering many of the loculations; and expansion and perforation of the outer cortical plate with extension into surrounding soft tissue. Other characteristics include tooth displacement and root resorption. The four cases of IMC reviewed have common imaging characteristics. All cases share some diagnostic imaging features with other multilocular-appearing entities of the jaws. However, the presence of amorphous sclerotic bone and malignant characteristics can be useful in the differential diagnosis.
Okada, K; Unoki, E; Kubota, H; Abe, E; Taniwaki, M; Morita, M; Sato, K
1996-02-01
To clarify the clinicopathological features of periosteal ganglion. Three patients with periosteal ganglion were studied clinicopathologically. One patient was selected from the files of our institute and two from a consultation file. All three lesions were located over the medial aspect of the tibia. Plain radiographs showed cortical erosions of varying degrees and mild periosteal reaction of the medial side of the tibia. MR images demonstrated well-circumscribed lesions overlying the cortical bone of the tibia, shown as low-intensity areas on T1-weighted images. On T2-weighted images, lesions were homogeneous, lobulated, and showed a characteristic markedly increased signal intensity. These findings are helpful in making a diagnosis of periosteal ganglion. Each patient had an uneventful clinical course after an excision involving the wall of the ganglion, the adjoining periosteum, and the underlying sclerotic cortical bone.
Chemistry Data for Geothermometry Mapping of Deep Hydrothermal Reservoirs in Southeastern Idaho
Earl Mattson
2016-01-18
This dataset includes chemistry of geothermal water samples from the Eastern Snake River Plain and surrounding area. The samples included in this dataset were collected during the springs and summers of 2014 and 2015. All chemical analyses of the samples were conducted in the Analytical Laboratory at the Center for Advanced Energy Studies in Idaho Falls, Idaho. This data set supersedes submission #425 and is the final submission for AOP 3.1.2.1 for INL. Isotopic data collected by Mark Conrad will be submitted in a separate file.
Klett, T.R.; Le, P.A.
2006-01-01
This chapter describes data used in support of the process being applied by the U.S. Geological Survey (USGS) National Oil and Gas Assessment (NOGA) project. Digital tabular data used in this report and archival data that permit the user to perform further analyses are available elsewhere on this CD-ROM. Computers and software may import the data directly, so that the reader need not transcribe them from the Portable Document Format files (.pdf files) of the text. Because of the number and variety of platforms and software available, graphical images are provided as .pdf files and tabular data are provided in a raw form as tab-delimited text files (.tab files).
Electronic hand-drafting and picture management system.
Yang, Tsung-Han; Ku, Cheng-Yuan; Yen, David C; Hsieh, Wen-Huai
2012-08-01
The Department of Health of the Executive Yuan in Taiwan (R.O.C.) is implementing a five-stage project entitled Electronic Medical Record (EMR) to convert all health records from written to electronic form. Traditionally, physicians record patients' symptoms, related examinations, and suggested treatments on paper medical records. Currently, when implementing the EMR, all text files and image files in the Hospital Information System (HIS) and Picture Archiving and Communication Systems (PACS) are kept separate. The current medical system environment is unable to combine text files, hand-drafted files, and photographs in the same system, so it is difficult to support physicians in recording medical data. Furthermore, in surgical and other related departments, physicians need immediate access to medical records in order to understand the details of a patient's condition. To address these problems, the Department of Health has implemented an EMR project with the primary goal of building an electronic hand-drafting and picture management system (HDP system) that medical personnel can use to record medical information in a convenient way. This system can simultaneously edit text files, hand-drafted files, and image files and then integrate these data into Portable Document Format (PDF) files. In addition, the output is designed to fit a variety of formats in order to meet various laws and regulations. By combining the HDP system with HIS and PACS, its applicability can be extended to various scenarios, assisting the medical industry in moving into the final phase of the EMR.
Lossef, S V; Schwartz, L H
1990-09-01
A computerized reference system for radiology journal articles was developed by using an IBM-compatible personal computer with a hand-held optical scanner and optical character recognition software. This allows direct entry of scanned text from printed material into word processing or data-base files. Additionally, line diagrams and photographs of radiographs can be incorporated into these files. A text search and retrieval software program enables rapid searching for keywords in scanned documents. The hand scanner and software programs are commercially available, relatively inexpensive, and easily used. This permits construction of a personalized radiology literature file of readily accessible text and images requiring minimal typing or keystroke entry.
VizieR Online Data Catalog: ND2 rotational spectrum (Melosso+,
NASA Astrophysics Data System (ADS)
Melosso, M.; Degli Esposti, C.; Dore, L.
2018-01-01
files used with the SPFIT/SPCAT program suite. There are 8 files of supplementary material, including a ReadMe, which was created by the AAS data editors. The text files are as follows: 1_Explan.txt = information on the content of the other files. 2ND2.fit = the output file of the fit of spectroscopic data used in the present study. 3ND2.lin = the corresponding line file. 4ND2.par = the corresponding parameter file. 5ND2.cat = the output file of the prediction made with the parameters determined in this study. 6ND2.var = the corresponding parameter file. 7ND2.int = the corresponding intensity file (1 data file).
SUSHI: an exquisite recipe for fully documented, reproducible and reusable NGS data analysis.
Hatakeyama, Masaomi; Opitz, Lennart; Russo, Giancarlo; Qi, Weihong; Schlapbach, Ralph; Rehrauer, Hubert
2016-06-02
Next generation sequencing (NGS) produces massive datasets consisting of billions of reads and up to thousands of samples. Subsequent bioinformatic analysis is typically done with the help of open source tools, where each application performs a single step towards the final result. This situation leaves bioinformaticians with the tasks of combining the tools, managing the data files and meta-information, documenting the analysis, and ensuring reproducibility. We present SUSHI, an agile data analysis framework that relieves bioinformaticians of the administrative challenges of their data analysis. SUSHI lets users build reproducible data analysis workflows from individual applications and manages the input data, the parameters, meta-information with user-driven semantics, and the job scripts. As distinguishing features, SUSHI provides an expert command line interface as well as a convenient web interface to run bioinformatics tools. SUSHI datasets are self-contained and self-documented on the file system. This makes them fully reproducible and ready to be shared. With the associated meta-information being formatted as plain text tables, the datasets can be readily further analyzed and interpreted outside SUSHI. SUSHI provides an exquisite recipe for analysing NGS data. By following the SUSHI recipe, SUSHI makes data analysis straightforward and takes care of documentation and administration tasks. Thus, users can fully dedicate their time to the analysis itself. SUSHI is suitable for use by bioinformaticians as well as life science researchers. It is targeted for, but by no means constrained to, NGS data analysis. Our SUSHI instance is in productive use and has served as the data analysis interface for more than 1000 data analysis projects. SUSHI source code as well as a demo server are freely available.
76 FR 6311 - Regulations Affecting Publication of the United States Government Manual
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-04
... as a single PDF file that includes bookmarks. Finally, he asked if any smart phone applications... an annual online edition of the Manual in both text-only files and PDF files. It is now possible to...
18 CFR 34.7 - Number of copies to be filed.
Code of Federal Regulations, 2010 CFR
2010-04-01
..., § 34.7 was revised, effective at the time of the next e-filing release during the Commission's next fiscal year. For the convenience of the user, the revised text follows: § 34.7 Filing requirements. Each...) and (2) of this chapter. As a qualified document, no paper copy version of the filing is required...
A Simple Text File for Curing Rainbow Blindness
NASA Technical Reports Server (NTRS)
Krylo, Robert; Tomlin, Marilyn; Seager, Michael
2008-01-01
This slide presentation reviews the use of a simple text file to work with large, multi-component thermal models that present a post-processing challenge. The complexity arises because temperatures for many components, each with varying requirements, need to be examined, and because false-color temperature maps, or rainbows, provide only a qualitative assessment of results.
Mays, Darren; Niaura, Raymond S; Evans, W Douglas; Hammond, David; Luta, George; Tercyak, Kenneth P
2015-03-01
This study examined the impact of pictorial cigarette-warning labels, warning-label message framing, and plain cigarette packaging on young adult smokers' motivation to quit. Smokers aged 18-30 years (n=740) from a consumer research panel were randomised to one of four experimental conditions in which they viewed online images of four cigarette packs with warnings about lung disease, cancer, stroke/heart disease and death, respectively. Packs differed across conditions by warning-message framing (gain vs loss) and packaging (branded vs plain). Measures captured demographics, smoking behaviour, covariates and motivation to quit in response to the cigarette packs. Pictorial warnings about lung disease and cancer generated the strongest motivation to quit across conditions. Adjusting for pretest motivation and covariates, a message-framing-by-packaging interaction revealed that gain-framed warnings on plain packs generated greater motivation to quit for the lung disease, cancer and mortality warnings (p<0.05) compared with loss-framed warnings on plain packs. Warnings combining pictorial depictions of smoking-related health risks with text-based messages about how quitting reduces risks may achieve better outcomes among young adults, especially in countries considering or implementing plain-packaging regulations.
Klett, T.R.; Le, P.A.
2006-01-01
This chapter describes data used in support of the process being applied by the U.S. Geological Survey (USGS) National Oil and Gas Assessment (NOGA) project. Digital tabular data used in this report and archival data that permit the user to perform further analyses are available elsewhere on the CD-ROM. Computers and software may import the data directly, so that the reader need not transcribe them from the Portable Document Format files (.pdf files) of the text. Because of the number and variety of platforms and software available, graphical images are provided as .pdf files and tabular data are provided in a raw form as tab-delimited text files (.tab files).
Portable system to luminaries characterization
NASA Astrophysics Data System (ADS)
Tecpoyotl-Torres, M.; Vera-Dimas, J. G.; Koshevaya, S.; Escobedo-Alatorre, J.; Cisneros-Villalobos, L.; Sanchez-Mondragon, J.
2014-09-01
For illumination source designers, it is important to know the illumination distribution of their products. They can use several viewers of IES files (a standard file format defined by the Illuminating Engineering Society). These files are needed not only to know the distribution of illumination, but also to plan the construction of buildings by means of specialized software such as Autodesk Revit. In this paper, a complete portable system for luminaire characterization is presented. The main component of the system is an irradiance profile meter, which can generate photometry of small luminaires covering indoor illumination requirements as well as luminaires for general areas. One of the meter's attributes is the implemented color sensor, which allows the color temperature of the luminaire under analysis to be determined. The graphical user interface (GUI) has several characteristics: it can control the meter, acquire the data obtained by the sensor, and graph them in 2D (Cartesian and polar formats) or in 3D (Cartesian format). The graph can be exported to png, jpg, or bmp formats if necessary. These remarkable characteristics differentiate this GUI. This proposal can be considered a viable option for illumination design and manufacturing enterprises, given the relatively low investment required and the complete illumination characterization provided.
Doug Blankenship
2016-03-01
An x,y,z text file of the downhole lithologic interpretations in the wells in and around the Fallon FORGE site. All the relevant information is in the file header (the spatial reference, the projection, etc.). In addition, all the fields in the data file are identified in the header.
DOE Office of Scientific and Technical Information (OSTI.GOV)
"rsed" is an R package that contains tools for stream editing: manipulating text files by making insertions, replacements, deletions, substitutions, or commenting. It takes its name from the powerful Unix command "sed". While the "rsed" package is not nearly as powerful as "sed", it is much simpler to use. R programmers often write scripts that may require simple manipulation of text files; "rsed" addresses that need.
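The sed-style operations the package describes (substitutions, commenting, deletions) reduce to a few list and string operations on lines of text. A rough Python analogue of that workflow, for illustration only; the function names here are invented and are not the rsed API:

```python
def substitute(lines, old, new):
    """sed-style s/old/new/ applied to every line."""
    return [line.replace(old, new) for line in lines]

def comment_out(lines, pattern, marker="# "):
    """Prefix a comment marker on lines containing the pattern."""
    return [marker + line if pattern in line else line for line in lines]

def delete(lines, pattern):
    """Drop lines containing the pattern."""
    return [line for line in lines if pattern not in line]

# A tiny "file" as a list of lines, edited as a stream.
text = ["alpha = 1", "beta = 2", "temp = 99"]
text = substitute(text, "beta", "gamma")
text = comment_out(text, "temp")
print(text)  # prints ['alpha = 1', 'gamma = 2', '# temp = 99']
```

Reading a real file with `open(path).read().splitlines()` and writing the result back completes the round trip.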
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-06
... Members. The text of the proposed rule change is available on the Exchange's Internet Web site at www... proposed rule change. The text of these statements may be examined at the places specified in Item IV below... all historical order information for orders routed to away destinations by the Exchange. The Filing...
Task-Oriented Access to Data Files: An Evaluation.
ERIC Educational Resources Information Center
Watters, Carolyn; And Others
1994-01-01
Discussion of information retrieval highlights DalText, a prototype information retrieval system that provides access to nonindexed textual data files where the mode of access is determined by the user based on the task at hand. A user study is described that was conducted at Dalhousie University (Nova Scotia) to test DalText. (Contains 23…
Klett, T.R.; Le, P.A.
2007-01-01
This chapter describes data used in support of the assessment process. Digital tabular data used in this report and archival data that permit the user to perform further analyses are available elsewhere on this CD–ROM. Computers and software may import the data directly, so that the reader need not transcribe them from the portable document format (.pdf) files of the text. Because of the number and variety of platforms and software available, graphical images are provided as .pdf files and tabular data are provided in a raw form as tab-delimited text files (.tab files).
Some utilities to help produce Rich Text Files from Stata.
Gillman, Matthew S
Producing RTF files from Stata can be difficult and somewhat cryptic. Utilities are introduced to simplify this process; one builds up a table row-by-row, another inserts a PNG image file into an RTF document, and the others start and finish the RTF document.
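RTF's plain-text syntax is simple enough to show why such start/finish utilities are feasible. A minimal hand-rolled sketch in Python (not the Stata utilities themselves) that starts a document, adds a paragraph, and finishes it:

```python
def rtf_start():
    # RTF header: version 1, ANSI charset, default font 0,
    # and a one-entry font table.
    return r"{\rtf1\ansi\deff0{\fonttbl{\f0 Times New Roman;}}"

def rtf_paragraph(text):
    # \fs24 = 12 pt (RTF sizes are in half-points); \par ends the paragraph.
    return r"\f0\fs24 " + text + r"\par"

def rtf_finish():
    # Close the group opened by the header brace.
    return "}"

doc = rtf_start() + rtf_paragraph("Hello from plain text.") + rtf_finish()
print(doc.startswith(r"{\rtf1"))  # prints True
```

Writing `doc` to a file with an `.rtf` extension yields a document most word processors can open; table rows and embedded images use the same brace-and-control-word syntax with more control words.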
Klett, T.R.; Le, P.A.
2013-01-01
This chapter describes data used in support of the process being applied by the U.S. Geological Survey (USGS) National Oil and Gas Assessment (NOGA) project. Digital tabular data used in this report and archival data that permit the user to perform further analyses are available elsewhere on this CD–ROM. Computers and software may import the data directly, so that the reader need not transcribe them from the Portable Document Format files (.pdf files) of the text. Because of the number and variety of platforms and software available, graphical images are provided as .pdf files and tabular data are provided in a raw form as tab-delimited text files (.tab files).
Code of Federal Regulations, 2012 CFR
2012-10-01
... Description RequiredF=filing C=cancel B=both Start field End field Record type 1 Numeric 1=Filing 2=Cancellation B 1 1 Insurer number 8 Text FMCSA Assigned Insurer Number (Home Office) With Suffix (Issuing Office), If Different, e.g. 12345-01 B 2 9 Filing type 1 Numeric 1 = BI&PD 2 = Cargo 3 = Bond 4 = Trust...
Code of Federal Regulations, 2011 CFR
2011-10-01
... Description RequiredF=filing C=cancel B=both Start field End field Record type 1 Numeric 1=Filing 2=Cancellation B 1 1 Insurer number 8 Text FMCSA Assigned Insurer Number (Home Office) With Suffix (Issuing Office), If Different, e.g. 12345-01 B 2 9 Filing type 1 Numeric 1 = BI&PD 2 = Cargo 3 = Bond 4 = Trust...
48 CFR 1652.204-72 - Filing health benefit claims/court review of disputed claims.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Filing health benefit... System OFFICE OF PERSONNEL MANAGEMENT FEDERAL EMPLOYEES HEALTH BENEFITS ACQUISITION REGULATION CLAUSES AND FORMS CONTRACT CLAUSES Texts of FEHBP Clauses 1652.204-72 Filing health benefit claims/court...
48 CFR 1652.204-72 - Filing health benefit claims/court review of disputed claims.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 6 2013-10-01 2013-10-01 false Filing health benefit... System OFFICE OF PERSONNEL MANAGEMENT FEDERAL EMPLOYEES HEALTH BENEFITS ACQUISITION REGULATION CLAUSES AND FORMS CONTRACT CLAUSES Texts of FEHBP Clauses 1652.204-72 Filing health benefit claims/court...
48 CFR 1652.204-72 - Filing health benefit claims/court review of disputed claims.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 6 2012-10-01 2012-10-01 false Filing health benefit... System OFFICE OF PERSONNEL MANAGEMENT FEDERAL EMPLOYEES HEALTH BENEFITS ACQUISITION REGULATION CLAUSES AND FORMS CONTRACT CLAUSES Texts of FEHBP Clauses 1652.204-72 Filing health benefit claims/court...
A SARA Timeseries Utility supports analysis and management of time-varying environmental data including listing, graphing, computing statistics, computing meteorological data and saving in a WDM or text file. File formats supported include WDM, HSPF Binary (.hbn), USGS RDB, and T...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-02
...-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Filing and Immediate... hereby given that, on September 24, 2013, Financial Industry Regulatory Authority, Inc. (``FINRA'') filed... discussed any comments it received on the proposed rule change. The text of these statements may be examined...
38 CFR 75.115 - Risk analysis.
Code of Federal Regulations, 2010 CFR
2010-07-01
... preparation of the risk analysis may include data mining if necessary for the development of relevant... degree of protection for the data, e.g., unencrypted, plain text; (6) Time the data has been out of VA...
Hybrid Cryptosystem Using Tiny Encryption Algorithm and LUC Algorithm
NASA Astrophysics Data System (ADS)
Rachmawati, Dian; Sharif, Amer; Jaysilen; Andri Budiman, Mohammad
2018-01-01
Security is a very important issue in data transmission, and there are many methods for making files more secure. One of these methods is cryptography. Cryptography secures a file by encoding the original contents as hidden code; anyone not party to the cryptography cannot decrypt the hidden code to read the original file. Among the many methods used in cryptography is the hybrid cryptosystem, which uses a symmetric algorithm to secure the file and an asymmetric algorithm to secure the symmetric key. In this research, the TEA algorithm is used as the symmetric algorithm and the LUC algorithm as the asymmetric algorithm. The system is tested by encrypting and decrypting a file with the TEA algorithm and by using the LUC algorithm to encrypt and decrypt the TEA key. The result of this research is that, using the TEA algorithm to encrypt the file, the ciphertext consists of characters from the ASCII (American Standard Code for Information Interchange) table in the form of hexadecimal numbers, and the ciphertext size increases by sixteen bytes for every eight characters of plaintext.
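The TEA cipher named above operates on 64-bit blocks (two 32-bit halves) with a 128-bit key over 32 rounds. A standard textbook sketch in Python; the key and plaintext values are arbitrary, and the hybrid LUC key-exchange step from the paper is omitted:

```python
DELTA = 0x9E3779B9   # TEA's key-schedule constant
MASK = 0xFFFFFFFF    # keep all arithmetic in 32 bits

def tea_encrypt(v0, v1, k):
    """Encrypt one 64-bit block (two 32-bit halves) with key words k[0..3]."""
    s = 0
    for _ in range(32):
        s = (s + DELTA) & MASK
        v0 = (v0 + ((((v1 << 4) & MASK) + k[0]) ^ ((v1 + s) & MASK)
                    ^ ((v1 >> 5) + k[1]))) & MASK
        v1 = (v1 + ((((v0 << 4) & MASK) + k[2]) ^ ((v0 + s) & MASK)
                    ^ ((v0 >> 5) + k[3]))) & MASK
    return v0, v1

def tea_decrypt(v0, v1, k):
    """Run the 32 rounds in reverse to recover the plaintext block."""
    s = (DELTA * 32) & MASK
    for _ in range(32):
        v1 = (v1 - ((((v0 << 4) & MASK) + k[2]) ^ ((v0 + s) & MASK)
                    ^ ((v0 >> 5) + k[3]))) & MASK
        v0 = (v0 - ((((v1 << 4) & MASK) + k[0]) ^ ((v1 + s) & MASK)
                    ^ ((v1 >> 5) + k[1]))) & MASK
        s = (s - DELTA) & MASK
    return v0, v1

key = (0x0123, 0x4567, 0x89AB, 0xCDEF)   # toy 128-bit key as four words
c0, c1 = tea_encrypt(0xDEADBEEF, 0xCAFEF00D, key)
print(tea_decrypt(c0, c1, key) == (0xDEADBEEF, 0xCAFEF00D))  # prints True
```

A file would be processed eight bytes at a time with padding on the final block, which is consistent with the ciphertext growing in sixteen-hex-digit (eight-byte) increments.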
Semantator: semantic annotator for converting biomedical text to linked data.
Tao, Cui; Song, Dezhao; Sharma, Deepak; Chute, Christopher G
2013-10-01
More than 80% of biomedical data is embedded in plain text. The unstructured nature of these text-based documents makes it challenging to easily browse and query the data of interest in them. One approach to facilitate browsing and querying biomedical text is to convert the plain text to a linked web of data, i.e., converting data originally in free text to structured formats with defined meta-level semantics. In this paper, we introduce Semantator (Semantic Annotator), a semantic-web-based environment for annotating data of interest in biomedical documents, browsing and querying the annotated data, and interactively refining annotation results if needed. Through Semantator, information of interest can be annotated either manually or semi-automatically using plug-in information extraction tools. The annotated results are stored in RDF and can be queried using the SPARQL query language. In addition, semantic reasoners can be applied directly to the annotated data for consistency checking and knowledge inference. Semantator has been released online and was used by the biomedical ontology community, which provided positive feedback. Our evaluation results indicated that (1) Semantator can perform the annotation functionalities as designed; (2) Semantator can be adopted in real applications in clinical and translational research; and (3) the annotated results using Semantator can be easily used in semantic-web-based reasoning tools for further inference.
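The core conversion described above, from annotated spans of plain text to RDF triples, can be illustrated with a stdlib-only sketch that emits N-Triples lines. The URIs, predicate names, and annotations below are invented for illustration and are not Semantator's own vocabulary:

```python
# Hypothetical annotations extracted from a document: (subject, predicate, object).
annotations = [
    ("aspirin", "type", "Drug"),
    ("headache", "type", "Symptom"),
    ("aspirin", "treats", "headache"),
]

BASE = "http://example.org/"  # illustrative namespace only

def to_ntriples(triples):
    """Serialize (s, p, o) tuples as N-Triples, one statement per line."""
    return "\n".join(
        f"<{BASE}{s}> <{BASE}{p}> <{BASE}{o}> ." for s, p, o in triples
    )

rdf = to_ntriples(annotations)

# A SPARQL-style lookup reduced to tuple filtering: what does aspirin treat?
print([o for s, p, o in annotations if s == "aspirin" and p == "treats"])
```

In a real pipeline the N-Triples output would be loaded into a triple store and queried with SPARQL proper; the sketch only shows how free-text findings become machine-queryable statements.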
Goodwin, C S
1976-01-01
A manual system of microbiology reporting with a National Cash Register (NCR) form with printed names of bacteria and antibiotics required less time to compose reports than a previous manual system that involved rubber stamps and handwriting on plain report sheets. The NCR report cost 10-28 pence and, compared with a computer system, it had the advantages of simplicity and familiarity, and reports were not delayed by machine breakdown, operator error, or data being incorrectly submitted. A computer reporting system for microbiology resulted in more accurate reports costing 17-97 pence each, faster and more accurate filing and recall of reports, and a greater range of analyses of reports that was valued particularly by the control-of-infection staff. Composition of computer-readable reports by technicians on Port-a-punch cards took longer than composing NCR reports. Enquiries for past results were answered more quickly from computer printouts of reports and a day book in alphabetical order. PMID:939810
Reportable STDs in Young People 15-24 Years of Age, by State
Available charts include line graphs by year, pie charts for sex, and bar charts by state and country and for age, race/ethnicity, and transmission; data are also available as Quicktime, RealPlayer, and text files.
Aerospell Supplemental Spell Check File
NASA Technical Reports Server (NTRS)
2000-01-01
Aerospell is a supplemental spell check file that can be used as a resource for researchers, writers, editors, students, and others who compose scientific and technical texts. The file extends the general spell check dictionaries of word processors by adding more than 13,000 words used in a broad range of aerospace and related disciplines.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-03
... information could include a significant amount of statistical information that would be difficult to file... required static pool information. Given the large amount of statistical information involved, commentators....; and 18 U.S.C. 1350. * * * * * 2. Amend Sec. 232.312 paragraph (a) introductory text by removing...
Defense Data Network/TOPS-20 Tutorial. An Interactive Computer Program.
1985-12-01
Contents include sections on the Electronic Mail Host (EMH) and EMACS; editor commands such as X EXCHANGE (search for and replace text) and Z ZAP (put the entire file into the print buffer); the sample foreign-host command-level prompt USC-ISIE.ARPA>; and FTP command-level commands such as FTP (invokes the FTP protocol) and CONNECT.
Recent Subsidence and Erosion at Diverse Wetland Sites in the Southeastern Mississippi Delta Plain
Morton, Robert A.; Bernier, Julie C.; Kelso, Kyle W.
2009-01-01
A prior study (U.S. Geological Survey Open-File Report 2005-1216) examined historical land- and water-area changes and estimated magnitudes of land subsidence and erosion at five wetland sites in the Terrebonne hydrologic basin of the Mississippi delta plain. The present study extends that work by analyzing interior wetland loss and relative magnitudes of subsidence and erosion at five additional wetland sites in the adjacent Barataria hydrologic basin. The Barataria basin sites were selected for their diverse physical settings and their recent (post-1978) conversion from marsh to open water. Historical aerial photography, datum-corrected marsh elevations and water depths, sediment cores, and radiocarbon dates were integrated to evaluate land-water changes in the Mississippi delta plain on both historical and geological time scales. The thickness of the organic-rich sediments (peat) and the elevation of the stratigraphic contact between peat and underlying mud were compared at marsh and open-water sites across areas of formerly continuous marsh to estimate magnitudes of recent delta-plain elevation loss caused by vertical erosion and subsidence of the wetlands. Results of these analyses indicate that erosion exceeded subsidence at most of the study areas, although both processes have contributed to historical wetland loss. Comparison of these results with prior studies indicates that subsidence largely caused rapid interior wetland loss in the Terrebonne basin before 1978, whereas erosional processes primarily caused more gradual interior wetland loss in the Barataria basin after 1978. 
Decadal variations in rates of relative sea-level rise at a National Ocean Service tide gage, elevation changes between repeat benchmark-leveling surveys, and GPS height monitoring at three National Geodetic Survey Continuously Operating Reference Stations indicate that subsidence rates since the early 1990s are substantially lower than those previously reported and are similar in magnitude to time-averaged subsidence rates at geological time scales. The historical decrease in land-loss rates across the Mississippi delta plain generally is consistent with the recent decrease in subsidence rates within the same region.
Floods on White Rock Creek above White Rock Lake at Dallas, Texas
Gilbert, Clarence R.
1963-01-01
The White Rock Creek watershed within the city limits of Dallas, Texas, presents problems not unique in the rapid residential and industrial development encountered by many cities throughout the United States. The advantages of full development of the existing area within a city before expanding city boundaries are related to both economics and civic pride. The expansion of city boundaries usually results in higher per capita costs for the operation of city governments. Certainly no responsible city official would oppose reasonable development of watersheds and flood plains and thus sacrifice an increase in tax revenue. Within the words "reasonable development" lies the problem faced by these officials. They are aware that the natural function of a stream channel and its associated flood plain is to carry away excess water in time of flood. They are also aware that failure to recognize this has often led to haphazard development on flood plains with a consequent increase in flood damages. In the absence of factual data defining the risk involved in occupying flood plains, stringent corrective and preventive measures must be taken to regulate man's activities on flood plains to a point beyond normal precaution. Flood-flow characteristics in the reach of White Rock Creek that lies between the northern city boundary of Dallas and Northwest Highway (Loop 12) at the upper end of White Rock Lake are presented in this report. Hydrologic data shown include history and magnitude of floods, flood profiles, outlines of areas inundated by three floods, and estimates of mean velocities of flow at selected points. Approximate areas inundated by the floods of April 1942 and July 1962 along White Rock Creek and by the flood of October 1962 along Cottonwood Creek, Floyd Branch, and Jackson Branch are delineated on maps.
Greater floods have undoubtedly occurred in the past but no attempt is made to show their probable overflow limits because basic data on such floods could not be obtained. Depths of inundation can be estimated from the information shown. Elevations shown are in feet above mean sea level, datum of 1929. The data and computations supporting the results given herein are in the files of the Geological Survey in Austin, Texas.
Functional evaluation of telemedicine with super high definition images and B-ISDN.
Takeda, H; Matsumura, Y; Okada, T; Kuwata, S; Komori, M; Takahashi, T; Minatom, K; Hashimoto, T; Wada, M; Fujio, Y
1998-01-01
In order to determine whether a super high definition (SHD) image running at 2048 pixels x 2048 lines at 60 frames/sec was suitable for telemedicine, we established a filing system for medical images, and two experiments in transmission of high-quality images were performed. All images of various types produced from one case of ischemic heart disease were digitized and registered in the filing system. Images consisted of plain chest x-ray, electrocardiogram, ultrasound cardiogram, cardiac scintigram, coronary angiogram, left ventriculogram, and so on. All images were animated and totaled 243 in number. We prepared a graphical user interface (GUI) for image retrieval based on medical events and modalities. Twenty-one cardiac specialists evaluated the quality of the SHD images to be somewhat poor compared to the original pictures but sufficient for making diagnoses, and effective as a tool for teaching and case-study purposes. The system's capability of simultaneously displaying several animated images was deemed especially effective in aiding comprehension of the diagnosis. Efficient input methods and the capacity to file all produced images remain future issues. Using a B-ISDN network, the SHD file was prefetched to the servers at Kyoto University Hospital and the BBCC (Broadband ISDN Business Chance & Culture Creation) laboratory as a telemedicine experiment. A simultaneous videoconference system, control of image retrieval, and a pointing function made the teleconference successful in terms of high quality of medical images, quick response time, and interactive data exchange.
Byers, J A
1992-09-01
A compiled program, JCE-REFS.EXE (coded in the QuickBASIC language), for use on IBM-compatible personal computers is described. The program converts a DOS text file of current B-I-T-S (BIOSIS Information Transfer System) or BIOSIS Previews references into a DOS file of citations, including abstracts, in a general style used by scientific journals. The latter file can be imported directly into a word processor, or the program can convert the file into a random-access data base of the references. The program can search the data base for up to 40 text strings with Boolean logic. Selected references in the data base can be exported as a DOS text file of citations. Using the search facility, articles in the Journal of Chemical Ecology from 1975 to 1991 were searched for certain key words in regard to semiochemicals, taxa, methods, chemical classes, and biological terms to determine trends in usage over the period. Positive trends were statistically significant in the use of the words: semiochemical, allomone, allelochemic, deterrent, repellent, plants, angiosperms, dicots, wind tunnel, olfactometer, electrophysiology, mass spectrometry, ketone, evolution, physiology, herbivore, defense, and receptor. Significant negative trends were found for: pheromone, vertebrates, mammals, Coleoptera, Scolytidae, Dendroctonus, lactone, isomer, and calling.
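The kind of Boolean keyword search the program offers over citation records can be approximated in a few lines of Python; the argument names below are illustrative, not JCE-REFS.EXE's actual interface.

```python
def matches(citation, all_of=(), any_of=(), none_of=()):
    """Case-insensitive Boolean match of one citation record:
    every term in all_of must appear, at least one term in any_of
    (if given) must appear, and no term in none_of may appear."""
    text = citation.lower()
    return (all(term.lower() in text for term in all_of)
            and (not any_of or any(term.lower() in text for term in any_of))
            and not any(term.lower() in text for term in none_of))
```

A search over a reference data base would then simply filter records with this predicate, mirroring the export of selected references described above.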
Visual Basic VPython Interface: Charged Particle in a Magnetic Field
NASA Astrophysics Data System (ADS)
Prayaga, Chandra
2006-12-01
A simple Visual Basic (VB) to VPython interface is described and illustrated with the example of a charged particle in a magnetic field. This interface allows data to be passed to Python through a text file read by Python. The first component of the interface is a user-friendly data entry screen designed in VB, in which the user can input values of the charge, mass, initial position and initial velocity of the particle, and the magnetic field. Next, a command button is coded to write these values to a text file. Another command button starts the VPython program, which reads the data from the text file, numerically solves the equation of motion, and provides the 3d graphics animation. Students can use the interface to run the program several times with different data and observe changes in the motion.
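The text-file handoff and the numerical integration can be sketched in Python alone (standing in for the VB/VPython pair). The parameter-file layout below is an assumption, not the paper's format, and a Boris rotation is used for the magnetic-field step because, unlike forward Euler, it preserves the particle's speed exactly.

```python
def read_params(path):
    """Parse 'name value [value ...]' lines as written by a front end
    (this file layout is illustrative, not the paper's actual format)."""
    params = {}
    with open(path) as f:
        for line in f:
            if line.strip():
                name, *vals = line.split()
                params[name] = [float(v) for v in vals]
    return params

def cross(a, b):
    """3-vector cross product."""
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def boris_step(v, q_over_m, B, dt):
    """Advance the velocity one step in a pure magnetic field (E = 0)
    using the Boris rotation, which conserves |v| exactly."""
    t = [0.5 * q_over_m * dt * b for b in B]
    t2 = sum(ti * ti for ti in t)
    s = [2.0 * ti / (1.0 + t2) for ti in t]
    v_prime = [v[i] + cross(v, t)[i] for i in range(3)]
    return [v[i] + cross(v_prime, s)[i] for i in range(3)]
```

A front end would write charge, mass, initial velocity, and field lines to the text file; the Python side reads them back and steps the circular motion, just as the VB-to-VPython handoff does.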
Cytoscape file of chemical networks
The maximum connectivity scores of pairwise chemical conditions summarized from Cmap results in a file in Cytoscape format (http://www.cytoscape.org/). The figures in the publication were generated from this file. The Cytoscape file is formed by importing the eight text files therein. This dataset is associated with the following publication: Wang, R., A. Biales, N. Garcia-Reyero, E. Perkins, D. Villeneuve, G. Ankley, and D. Bencic. Fish Connectivity Mapping: Linking Chemical Stressors by Their MOA-Driven Transcriptomic Profiles. BMC Genomics. BioMed Central Ltd, London, UK, 17(84): 1-20, (2016).
2012-07-19
THIS TEXT is a straightforward, no-nonsense guide for novice researchers venturing into research involving quantitative methodologies. It is also an excellent resource for refreshing more experienced researchers. It is well written in plain English and encourages researchers to carry out their own analyses.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-07
...-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of Proposed Rule... Exchange Act of 1934, as amended (the ``Act''). Upon effectiveness of a plan filed pursuant to Rule 17d-2...(a) through 9.25, which concern advertising and sales literature, would be deleted and the text of...
NetCDF4/HDF5 and Linked Data in the Real World - Enriching Geoscientific Metadata without Bloat
NASA Astrophysics Data System (ADS)
Ip, Alex; Car, Nicholas; Druken, Kelsey; Poudjom-Djomani, Yvette; Butcher, Stirling; Evans, Ben; Wyborn, Lesley
2017-04-01
NetCDF4 has become the dominant generic format for many forms of geoscientific data, leveraging (and constraining) the versatile HDF5 container format, while providing metadata conventions for interoperability. However, the encapsulation of detailed metadata within each file can lead to metadata "bloat", and difficulty in maintaining consistency where metadata is replicated to multiple locations. Complex conceptual relationships are also difficult to represent in simple key-value netCDF metadata. Linked Data provides a practical mechanism to address these issues by associating the netCDF files and their internal variables with complex metadata stored in Semantic Web vocabularies and ontologies, while complying with and complementing existing metadata conventions. One of the stated objectives of the netCDF4/HDF5 formats is that they should be self-describing: containing metadata sufficient for cataloguing and using the data. However, this objective can be regarded as only partially-met where details of conventions and definitions are maintained externally to the data files. For example, one of the most widely used netCDF community standards, the Climate and Forecasting (CF) Metadata Convention, maintains standard vocabularies for a broad range of disciplines across the geosciences, but this metadata is currently neither readily discoverable nor machine-readable. We have previously implemented useful Linked Data and netCDF tooling (ncskos) that associates netCDF files, and individual variables within those files, with concepts in vocabularies formulated using the Simple Knowledge Organization System (SKOS) ontology. NetCDF files contain Uniform Resource Identifier (URI) links to terms represented as SKOS Concepts, rather than plain-text representations of those terms, so we can use simple, standardised web queries to collect and use rich metadata for the terms from any Linked Data-presented SKOS vocabulary. 
Geoscience Australia (GA) manages a large volume of diverse geoscientific data, much of which is being translated from proprietary formats to netCDF at NCI Australia. This data is made available through the NCI National Environmental Research Data Interoperability Platform (NERDIP) for programmatic access and interdisciplinary analysis. The netCDF files contain both scientific data variables (e.g. gravity, magnetic or radiometric values), but also domain-specific operational values (e.g. specific instrument parameters) best described fully in formal vocabularies. Our ncskos codebase provides access to multiple stores of detailed external metadata in a standardised fashion. Geophysical datasets are generated from a "survey" event, and GA maintains corporate databases of all surveys and their associated metadata. It is impractical to replicate the full source survey metadata into each netCDF dataset so, instead, we link the netCDF files to survey metadata using public Linked Data URIs. These URIs link to Survey class objects which we model as a subclass of Activity objects as defined by the PROV Ontology, and we provide URI resolution for them via a custom Linked Data API which draws current survey metadata from GA's in-house databases. We have demonstrated that Linked Data is a practical way to associate netCDF data with detailed, external metadata. This allows us to ensure that catalogued metadata is kept consistent with metadata points-of-truth, and we can infer complex conceptual relationships not possible with netCDF key-value attributes alone.
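The core ncskos idea, attribute values that are URIs resolving to SKOS concepts rather than plain-text terms, can be sketched with an in-memory stand-in for a vocabulary service. The URI, concept fields, and lookup below are hypothetical; the real tooling resolves the URI over HTTP and parses RDF.

```python
# Hypothetical mini-vocabulary standing in for a SKOS Linked Data endpoint.
VOCAB = {
    "http://example.org/vocab/gravity-anomaly": {
        "prefLabel": "gravity anomaly",
        "definition": "Difference between observed and modelled gravity.",
    },
}

def resolve_attribute(value, vocab=VOCAB):
    """Return a display label for a netCDF attribute value that may be
    either a plain-text term or a URI pointing at a SKOS concept."""
    if value.startswith(("http://", "https://")):
        concept = vocab.get(value)
        # Fall back to the raw URI if the concept is not in the vocabulary.
        return concept["prefLabel"] if concept else value
    return value  # plain-text attribute: use as-is
```

The point of the design is that the file stays small (one URI per variable) while the rich metadata lives at the point of truth and is fetched on demand.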
Bittner, Anja; Jonietz, Ansgar; Bittner, Johannes; Beickert, Luise; Harendza, Sigrid
2015-09-01
To train and assess undergraduate medical students' written communication skills by exercises in translating medical reports into plain language for real patients. 27 medical students participated in a newly developed communication course. They attended a 3-h seminar including a briefing on patient-centered communication and an introduction to working with the internet platform http://washabich.de. In the following ten weeks, participants "translated" one medical report every fortnight on this platform, receiving feedback from a near-peer supervisor. A pre- and post-course assignment consisted of a self-assessment questionnaire on communication skills, analysis of a medical text with respect to medical jargon, and the translation of a medical report into plain language. In the self-assessment, students rated themselves in most aspects of patient-centered communication significantly higher after attending the course. After the course they marked significantly more medical jargon terms correctly than before (p<0.001). In a written plain language translation of a medical report they scored significantly higher with respect to communicative aspects (p<0.05) and medical correctness (p<0.001). Translating medical reports into plain language under near-peer supervision is associated with improved communication skills and medical knowledge in undergraduate medical students. Translation exercises should be included in the undergraduate medical curriculum. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
User's guide to revised method-of-characteristics solute-transport model (MOC--version 31)
Konikow, Leonard F.; Granato, G.E.; Hornberger, G.Z.
1994-01-01
The U.S. Geological Survey computer model to simulate two-dimensional solute transport and dispersion in ground water (Konikow and Bredehoeft, 1978; Goode and Konikow, 1989) has been modified to improve management of input and output data and to provide progressive run-time information. All opening and closing of files are now done automatically by the program. Names of input data files are entered either interactively or using a batch-mode script file. Names of output files, created automatically by the program, are based on the name of the input file. In the interactive mode, messages are written to the screen during execution to allow the user to monitor the status and progress of the simulation and to anticipate total running time. Information reported and updated during a simulation includes the current pumping period and time step, number of particle moves, and percentage completion of the current time step. The batch mode enables a user to run a series of simulations consecutively, without additional control. A report of the model's activity in the batch mode is written to a separate output file, allowing later review. The user has several options for creating separate output files for different types of data. The formats are compatible with many commercially available applications, which facilitates graphical postprocessing of model results.
Geohydrology and Evaluation of Stream-Aquifer Relations in the Apalachicola-Chattahoochee-Flint River Basin, Southeastern Alabama, Northwestern Florida, and Southwestern Georgia
Torak, Lynn J.; Davis, Gary S.; Strain, George A.; Herndon, Jennifer G.
The lower Apalachicola-Chattahoochee-Flint River Basin is underlain by Coastal Plain sediments of pre-Cretaceous to Quaternary age consisting of alternating units of sand, clay, sandstone, dolomite, and limestone that gradually thicken and dip gently to the southeast.
The stream-aquifer system consists of carbonate (limestone and dolomite) and clastic sediments, which define the Upper Floridan aquifer and Intermediate system, in hydraulic connection with the principal rivers of the basin and other surface-water features, natural and man-made. Separate digital models of the Upper Floridan aquifer and Intermediate system were constructed by using the U.S. Geological Survey's MODular Finite-Element model of two-dimensional ground-water flow, based on conceptualizations of the stream-aquifer system, and calibrated to drought conditions of October 1986. Sensitivity analyses performed on the models indicated that aquifer hydraulic conductivity, lateral and vertical boundary flows, and pumpage have a strong influence on ground-water levels. Simulated pumpage increases in the Upper Floridan aquifer, primarily in the Dougherty Plain physiographic district of Georgia, caused significant reductions in aquifer discharge to streams that eventually flow to Lake Seminole and the Apalachicola River and Bay. Simulated pumpage increases greater than 3 times the October 1986 rates caused drying of some stream reaches and parts of the Upper Floridan aquifer in Georgia. Water budgets prepared from simulation results indicate that ground-water discharge to streams and recharge by horizontal and vertical flow are the principal mechanisms for moving water through the flow system. The potential for changes in ground-water quality is high in areas where chemical constituents can be mobilized by these mechanisms. Less than 2 percent of ground-water discharge to streams comes from the Intermediate system; thus, it plays a minor role in the hydrodynamics of the stream-aquifer system.
76 FR 47606 - Sport Fishing and Boating Partnership Council
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-05
... the following formats: One hard copy with original signature, and one electronic copy via e- mail (acceptable file formats are Adobe Acrobat PDF, WordPerfect, MS Word, MS PowerPoint, or rich text file...
NASA Technical Reports Server (NTRS)
Pototzky, Anthony S.
2010-01-01
A methodology is described for generating first-order plant equations of motion for aeroelastic and aeroservoelastic applications. The description begins with the process of generating data files representing specialized mode-shapes, such as rigid-body and control surface modes, using both PATRAN and NASTRAN analysis. NASTRAN executes the 146 solution sequence using numerous Direct Matrix Abstraction Program (DMAP) calls to import the mode-shape files and to perform the aeroelastic response analysis. The aeroelastic response analysis calculates and extracts structural frequencies, generalized masses, frequency-dependent generalized aerodynamic force (GAF) coefficients, sensor deflections, and load coefficients as text-formatted data files. The data files are then re-sequenced and re-formatted using a custom-written FORTRAN program. The text-formatted data files are stored, and coefficients for s-plane equations are fitted to the frequency-dependent GAF coefficients using two Interactions of Structures, Aerodynamics and Controls (ISAC) programs. With tabular files from stored data created by ISAC, MATLAB generates the first-order aeroservoelastic plant equations of motion. These equations include control-surface actuator, turbulence, sensor, and load modeling. Altitude-varying root-locus and PSD plot results for a model of the F-18 aircraft are presented to demonstrate the capability.
Cannon, William F.; Woodruff, Laurel G.
2003-01-01
This data set consists of nine files of geochemical information on various types of surficial deposits in northwestern Wisconsin and immediately adjacent parts of Michigan and Minnesota. The files are presented in two formats: as dbase files in dbaseIV form and Microsoft Excel form. The data present multi-element chemical analyses of soils, stream sediments, and lake sediments. Latitude and longitude values are provided in each file so that the dbf files can be readily imported to GIS applications. Metadata files are provided in outline form, question and answer form and text form. The metadata includes information on procedures for sample collection, sample preparation, and chemical analyses including sensitivity and precision.
Staradmin -- Starlink User Database Maintainer
NASA Astrophysics Data System (ADS)
Fish, Adrian
The subject of this SSN is a utility called STARADMIN. This utility allows the system administrator to build and maintain a Starlink User Database (UDB). The principal source of information for each user is a text file, named after their username. The content of each file is a list consisting of one keyword followed by the relevant user data per line. These user database files reside in a single directory. The STARADMIN program is used to manipulate these user data files and automatically generate user summary lists.
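A parser for such keyword-per-line user files is only a few lines; the sketch below follows the one-keyword-plus-data-per-line convention described above, but the keyword names in the example are illustrative, not the actual STARADMIN UDB schema.

```python
def parse_user_file(lines):
    """Build one user record from a Starlink-style text file:
    each non-blank line is a keyword followed by that user's data."""
    record = {}
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        keyword, _, value = line.partition(" ")
        record[keyword.upper()] = value.strip()
    return record
```

A maintenance tool like STARADMIN would read one such file per username from the database directory and regenerate summary lists from the resulting records.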
Creating and indexing teaching files from free-text patient reports.
Johnson, D. B.; Chu, W. W.; Dionisio, J. D.; Taira, R. K.; Kangarloo, H.
1999-01-01
Teaching files based on real patient data can enhance the education of students, staff, and other colleagues. Although information retrieval systems can index free-text documents using keywords, these systems do not work well where content-bearing terms (e.g., anatomy descriptions) frequently appear. This paper describes a system that uses multi-word indexing terms to provide access to free-text patient reports. The use of multi-word indexing allows better modeling of the content of medical reports, thus improving retrieval performance. The method used to select indexing terms, as well as an early evaluation of retrieval performance, is discussed. PMID:10566473
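The flavor of multi-word term selection can be shown with a deliberately simplified sketch: collect adjacent non-stopword pairs as candidate index terms, so that anatomy phrases like "left lower lobe" index as units rather than as isolated keywords. This is an illustration of the idea, not the paper's actual selection method.

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "of", "in", "is", "no", "with", "and", "or"}

def multiword_terms(report, n=2):
    """Count adjacent n-word sequences containing no stopwords as
    candidate multi-word index terms for a free-text report."""
    words = re.findall(r"[a-z]+", report.lower())
    terms = Counter()
    for i in range(len(words) - n + 1):
        gram = words[i:i + n]
        if not any(w in STOPWORDS for w in gram):
            terms[" ".join(gram)] += 1
    return terms
```

Retrieval then matches queries against these phrase terms instead of single keywords, which is the gain the abstract describes.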
West Flank Coso, CA FORGE 3D geologic model
Doug Blankenship
2016-03-01
This is an x,y,z file of the West Flank FORGE 3D geologic model, created in EarthVision by Dynamic Graphics, Inc. The model was constructed with a grid spacing of 100 m. Geologic surfaces were extrapolated from the input data using a minimum-tension gridding algorithm. The data file is tabular data in a text file, with lithology data associated with x,y,z grid points. All the relevant information (the spatial reference, the projection, etc.) is in the file header. In addition, all the fields in the data file are identified in the header.
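A reader for such a file might separate the header metadata from the x,y,z lithology rows along the following lines. The '#' header convention and four-column row layout shown here are assumptions for illustration; the actual FORGE file documents its own header layout.

```python
def read_xyz_model(lines):
    """Split a gridded-model text file into header metadata lines
    and (x, y, z, lithology) data rows."""
    header, rows = [], []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        if line.startswith("#"):
            header.append(line.lstrip("# "))  # assumed header marker
        else:
            x, y, z, lith = line.split()
            rows.append((float(x), float(y), float(z), lith))
    return header, rows
```

Parsing the header first matters here because it carries the spatial reference and projection needed to interpret the coordinates.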
Fallon FORGE 3D Geologic Model
Doug Blankenship
2016-03-01
An x,y,z scattered-data file for the 3D geologic model of the Fallon FORGE site, created in EarthVision by Dynamic Graphics, Inc. The model was constructed with a grid spacing of 100 m. Geologic surfaces were extrapolated from the input data using a minimum-tension gridding algorithm. The data file is tabular data in a text file, with lithology data associated with x,y,z grid points. All the relevant information (the spatial reference, the projection, etc.) is in the file header. In addition, all the fields in the data file are identified in the header.
78 FR 17233 - Notice of Opportunity To File Amicus Briefs
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-20
.... Any commonly-used word processing format or PDF format is acceptable; text formats are preferable to image formats. Briefs may also be filed with the Office of the Clerk of the Board, Merit Systems...
its references list. To use SMARTS, users construct text files of 20-30 lines of simple text, and output consists of spreadsheet-compatible American Standard Code for Information Interchange (ASCII) text
IDG - INTERACTIVE DIF GENERATOR
NASA Technical Reports Server (NTRS)
Preheim, L. E.
1994-01-01
The Interactive DIF Generator (IDG) utility is a tool used to generate and manipulate Directory Interchange Format files (DIF). Its purpose as a specialized text editor is to create and update DIF files which can be sent to NASA's Master Directory, also referred to as the International Global Change Directory at Goddard. Many government and university data systems use the Master Directory to advertise the availability of research data. The IDG interface consists of a set of four windows: (1) the IDG main window; (2) a text editing window; (3) a text formatting and validation window; and (4) a file viewing window. The IDG main window starts up the other windows and contains a list of valid keywords. The keywords are loaded from a user-designated file and selected keywords can be copied into any active editing window. Once activated, the editing window designates the file to be edited. Upon switching from the editing window to the formatting and validation window, the user has options for making simple changes to one or more files such as inserting tabs, aligning fields, and indenting groups. The viewing window is a scrollable read-only window that allows fast viewing of any text file. IDG is an interactive tool and requires a mouse or a trackball to operate. IDG uses the X Window System to build and manage its interactive forms, and also uses the Motif widget set and runs under Sun UNIX. IDG is written in C-language for Sun computers running SunOS. This package requires the X Window System, Version 11 Revision 4, with OSF/Motif 1.1. IDG requires 1.8Mb of hard disk space. The standard distribution medium for IDG is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. The program was developed in 1991 and is a copyrighted work with all copyright vested in NASA. SunOS is a trademark of Sun Microsystems, Inc. X Window System is a trademark of Massachusetts Institute of Technology. 
OSF/Motif is a trademark of the Open Software Foundation, Inc. UNIX is a trademark of Bell Laboratories.
DOCU-TEXT: A tool before the data dictionary
NASA Technical Reports Server (NTRS)
Carter, B.
1983-01-01
DOCU-TEXT, a proprietary software package that aids in the production of documentation for a data processing organization and can be installed and operated only on IBM computers is discussed. In organizing information that ultimately will reside in a data dictionary, DOCU-TEXT proved to be a useful documentation tool in extracting information from existing production jobs, procedure libraries, system catalogs, control data sets and related files. DOCU-TEXT reads these files to derive data that is useful at the system level. The output of DOCU-TEXT is a series of user selectable reports. These reports can reflect the interactions within a single job stream, a complete system, or all the systems in an installation. Any single report, or group of reports, can be generated in an independent documentation pass.
Kafkas, Şenay; Kim, Jee-Hyub; Pi, Xingjun; McEntyre, Johanna R
2015-01-01
In this study, we present an analysis of data citation practices in full text research articles and their corresponding supplementary data files, made available in the Open Access set of articles from Europe PubMed Central. Our aim is to investigate whether supplementary data files should be considered as a source of information for integrating the literature with biomolecular databases. Using text-mining methods to identify and extract a variety of core biological database accession numbers, we found that the supplemental data files contain many more database citations than the body of the article, and that those citations often take the form of a relatively small number of articles citing large collections of accession numbers in text-based files. Moreover, citation of value-added databases derived from submission databases (such as Pfam, UniProt or Ensembl) is common, demonstrating the reuse of these resources as datasets in themselves. All the database accession numbers extracted from the supplementary data are publicly accessible from http://dx.doi.org/10.5281/zenodo.11771. Our study suggests that supplementary data should be considered when linking articles with data, in curation pipelines, and in information retrieval tasks in order to make full use of the entire research article. These observations highlight the need to improve the management of supplemental data in general, in order to make this information more discoverable and useful.
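The text-mining step the study describes, recognizing database accession numbers in free text, is commonly done with per-database regular expressions. The patterns below are illustrative simplifications (UniProt's full accession grammar, for instance, is broader than shown), not the study's actual pipeline.

```python
import re

# Simplified accession patterns for illustration only.
ACCESSION_PATTERNS = {
    "UniProt": re.compile(r"\b[OPQ][0-9][A-Z0-9]{3}[0-9]\b"),
    "Pfam": re.compile(r"\bPF\d{5}\b"),
    "Ensembl gene": re.compile(r"\bENS[A-Z]*G\d{11}\b"),
}

def find_accessions(text):
    """Scan free text (e.g. a supplementary table dump) for database
    accession numbers, grouped and de-duplicated by database."""
    hits = {}
    for db, pattern in ACCESSION_PATTERNS.items():
        found = sorted(set(pattern.findall(text)))
        if found:
            hits[db] = found
    return hits
```

Run over supplementary files, such a scanner surfaces exactly the pattern the study reports: a few files citing large collections of accessions in bulk.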
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
... Proposed Rule Change To Remove the Expired Pilot Under Rule 4753(c) From the NASDAQ Rule Book August 16... ``Volatility Guard'') from the NASDAQ rule book. NASDAQ will remove the rule text 30 days after the filing date... proposing to remove the expired pilot under Rule 4753(c) from the rule book. On June 18, 2010, NASDAQ filed...
A Detailed Examination of the GPM Core Satellite Gridded Text Product
NASA Technical Reports Server (NTRS)
Stocker, Erich Franz; Kelley, Owen A.; Kummerow, C.; Huffman, George; Olson, William S.; Kwiatowski, John M.
2015-01-01
The Global Precipitation Measurement (GPM) mission quarter-degree gridded-text product has a file format and purpose similar to those of the Tropical Rainfall Measuring Mission (TRMM) 3G68 quarter-degree product. The GPM text-grid format is an hourly summary of surface precipitation retrievals from various GPM instruments and combinations of GPM instruments. The GMI Goddard Profiling (GPROF) retrieval provides the widest swath (800 km) and performs the retrieval using the GPM Microwave Imager (GMI). The Ku radar provides the widest radar swath (250 km) and also provides continuity with the TRMM Ku Precipitation Radar. GPM's Ku+Ka band matched swath (125 km) provides a dual-frequency precipitation retrieval. The "combined" retrieval (125 km swath) provides a multi-instrument precipitation retrieval based on the GMI, the DPR Ku radar, and the DPR Ka radar. While the data are reported in hourly grids, all hours for a day are packaged into a single text file that is gzipped to reduce file size and to speed up downloading. The data are reported on a 0.25 deg x 0.25 deg grid.
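The hourly quarter-degree packaging described above lends itself to a simple reader. The sketch below is a minimal illustration only: the abstract does not give the actual column layout, so a hypothetical whitespace-separated record of (hour, row, col, rain rate) is assumed.

```python
import gzip

def read_hourly_grid(path):
    """Read a gzipped daily text file of hourly grid records.
    Assumed (hypothetical) layout: "hour row col rain_rate" per line."""
    grids = {h: {} for h in range(24)}  # one sparse grid per hour
    with gzip.open(path, "rt") as f:
        for line in f:
            if not line.strip() or line.startswith("#"):
                continue  # skip blanks and comment lines
            hour, row, col, rate = line.split()[:4]
            grids[int(hour)][(int(row), int(col))] = float(rate)
    return grids

def cell_center(row, col):
    """Center lat/lon of a cell on a global 0.25-degree grid
    (720 rows x 1440 columns, row 0 at the north edge)."""
    lat = 90.0 - (row + 0.5) * 0.25
    lon = -180.0 + (col + 0.5) * 0.25
    return lat, lon
```

The sparse per-hour dictionaries mirror the product's hourly organization; a dense 720 x 1440 array would be the natural next step for mapping.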
Weems, Robert E.; Schindler, J. Stephen; Lewis, William C.
2010-01-01
The Emporia 1:100,000-scale quadrangle straddles the Tidewater Fall Line in southern Virginia and includes a small part of northernmost North Carolina. Sediments of the coastal plain underlie the eastern three-fifths of this area. These sediments onlap crystalline basement rocks toward the west and dip gently to the east, reaching a maximum known thickness of 821 feet in the extreme southeastern part of the map area. The gentle eastward dip is disrupted in several areas due to faulting delineated during the course of mapping. In order to produce a new geologic map of the Emporia 1:100,000-scale quadrangle, the U.S. Geological Survey drilled one corehole to a depth of 223 feet and augered 192 shallow research test holes (maximum depth 135 feet) to supplement sparse outcrop data available from the coastal plain part of the map area. The recovered sediments were studied and data from them recorded to determine the lithologic characteristics, spatial distribution, and temporal framework of the represented coastal plain stratigraphic units. These test holes were critical for accurately determining the distribution of major geologic units and the position of unit boundaries that will be shown on the forthcoming Emporia geologic map, but much of the detailed subsurface data cannot be shown readily through this map product. Therefore, the locations and detailed descriptions of the auger test holes and one corehole are provided in this open-file report for geologists, hydrologists, engineers, and community planners in need of a detailed shallow-subsurface stratigraphic framework for much of the Emporia map region.
Launch Control System Software Development System Automation Testing
NASA Technical Reports Server (NTRS)
Hwang, Andrew
2017-01-01
The Spaceport Command and Control System (SCCS) is the National Aeronautics and Space Administration's (NASA) launch control system for the Orion capsule and the Space Launch System, the next-generation manned rocket currently in development. This system requires high-quality testing that will measure and test the capabilities of the system. For the past two years, the Exploration and Operations Division at Kennedy Space Center (KSC) has assigned a group of interns and full-time engineers to develop automated tests to save the project time and money. The team worked on automating the testing process for the SCCS GUI, which uses simulated data streamed from the testing servers to produce data, plots, statuses, etc. in the GUI. The software used to develop the automated tests included an automated testing framework and an automation library. The automated testing framework has a tabular-style syntax, which means each line of code must have the appropriate number of tabs to function as intended. A test file's header section contains either paths to custom resources or the names of libraries being used; the data section contains any data values created strictly for the current testing file; the body section holds the tests that are being run; and the function section can include any number of functions used by the current testing file or by any other file that uses it as a resource. The resources and body sections are required for all test files; the data and function sections can be left empty if the data values and functions being used come from a resourced library or another file. The automation library contains functionality to automate anything that appears on a desired screen, using image-recognition software to detect and control GUI components.
To help equip the automation team with better tools, the Project Lead of the Automated Testing Team, Jason Kapusta, assigned the task of installing and training an optical character recognition (OCR) tool to Brandon Echols, a fellow intern, and me. The purpose of the OCR tool is to analyze an image and find the coordinates of any group of text. Some issues that arose while installing the OCR tool included the absence of certain libraries needed to train the tool and an outdated software version. We eventually resolved the issues and successfully installed the OCR tool. Training the tool required many images in different fonts and sizes, but in the end the tool learned to accurately decipher the text in the images and its coordinates. The OCR tool produced a file that contained significant metadata for each section of text, but only the text and its coordinates were required for our purpose. The team wrote a script to parse the information we wanted from the OCR file into a different file to be used by automation functions within the automated framework. Since a majority of the development and testing of the automated test cases for the GUI in question has been done using live simulated data on the workstations at the Launch Control Center (LCC), a large amount of progress has been made. As of this writing, about 60% of all automated testing has been implemented. Additionally, the OCR tool will help make our automated tests more robust, because its text recognition scales well to different text fonts and sizes. Soon we will have the whole test system automated, freeing more full-time engineers to work on development projects.
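The parsing step described above, keeping only the text and its coordinates out of the OCR tool's metadata, can be sketched as follows. The actual OCR tool and its output format are not named in the report, so a Tesseract-style tab-separated layout with left, top, width, height, and text columns is assumed here.

```python
import csv

def extract_text_coords(tsv_path):
    """Parse an assumed Tesseract-style TSV, keeping only each word
    and its bounding-box coordinates; all other metadata is discarded."""
    results = []
    with open(tsv_path, newline="") as f:
        for row in csv.DictReader(f, delimiter="\t"):
            word = row.get("text", "").strip()
            if word:  # skip empty or whitespace-only detections
                results.append({
                    "text": word,
                    "x": int(row["left"]),
                    "y": int(row["top"]),
                    "w": int(row["width"]),
                    "h": int(row["height"]),
                })
    return results
```

The filtered records could then be written to whatever file format the automation framework's functions consume.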
77 FR 60138 - Trinity Adaptive Management Working Group; Public Teleconference/Web-Based Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-02
... statements must be supplied to Elizabeth Hadley in one of the following formats: One hard copy with original... file formats are Adobe Acrobat PDF, MS Word, PowerPoint, or rich text file). Registered speakers who...
Atmospheric Science Data Center
2018-04-12
SSE Global Data: text files of monthly averaged data for the entire ... Version: V6. Location: Global. Spatial coverage: (90N, 90S), (180W, 180E). File format: ASCII.
Steamer Training System and Graphics Editor, 1987 Version
1987-09-01
NIIHAU: >simenv>documentation>simenv-readme.text.24, 7/30/87 18:29:51, Page 1. Mode: Text. Herewith are instructions for installing the Genera 7.0 (should...) Logical pathname hosts are declared in Lisp file headers, e.g.: ;;; -*- mode: lisp; base: 10; lowercase: t; package: file-system -*- (fs:set-logical-pathname-host "simenv" :physical-host "niihau" :translations (("simenv;" ">simenv>"))), and similarly (fs:set-logical-pathname-host "steamer-system" :physical-host "niihau" ...).
Raw Magnetotelluric Data, McGregor Range, Fort Bliss, New Mexico
Nash, Greg
2017-01-01
This is a zipped file containing raw magnetotelluric (MT) data collected as part of the Phase 2 Tularosa Basin geothermal play fairway analysis project in New Mexico. The data for each MT station are in standard .edi text files which are accompanied by graphic files illustrating details. These data cover part of McGregor Range, Fort Bliss, New Mexico. The MT survey was done by Quantec Geoscience.
77 FR 11483 - Request for Extension of a Currently Approved Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-27
... that is a scanned Adobe PDF file, it must be scanned as text and not as an image, thus allowing FCIC to... February 17, 2012. William J. Murphy, Manager, Federal Crop Insurance Corporation. [FR Doc. 2012-4465 Filed...
1995 Joseph E. Whitley, MD, Award. A World Wide Web gateway to the radiologic learning file.
Channin, D S
1995-12-01
Computer networks in general, and the Internet specifically, are changing the way information is manipulated in the world at large and in radiology. The goal of this project was to develop a computer system in which images from the Radiologic Learning File, available previously only via a single-user laser disc, are made available over a generic, high-availability computer network to many potential users simultaneously. Using a networked workstation in our laboratory and freely available distributed hypertext software, we established a World Wide Web (WWW) information server for radiology. Images from the Radiologic Learning File are requested through the WWW client software, digitized from a single laser disc containing the entire teaching file and then transmitted over the network to the client. The text accompanying each image is incorporated into the transmitted document. The Radiologic Learning File is now on-line, and requests to view the cases result in the delivery of the text and images. Image digitization via a frame grabber takes 1/30th of a second. Conversion of the image to a standard computer graphic format takes 45-60 sec. Text and image transmission speed on a local area network varies between 200 and 400 kilobytes (KB) per second depending on the network load. We have made images from a laser disc of the Radiologic Learning File available through an Internet-based hypertext server. The images previously available through a single-user system located in a remote section of our department are now ubiquitously available throughout our department via the department's computer network. We have thus converted a single-user, limited functionality system into a multiuser, widely available resource.
Data Compression in Full-Text Retrieval Systems.
ERIC Educational Resources Information Center
Bell, Timothy C.; And Others
1993-01-01
Describes compression methods for components of full-text systems such as text databases on CD-ROM. Topics discussed include storage media; structures for full-text retrieval, including indexes, inverted files, and bitmaps; compression tools; memory requirements during retrieval; and ranking and information retrieval. (Contains 53 references.)
Interactive publications: creation and usage
NASA Astrophysics Data System (ADS)
Thoma, George R.; Ford, Glenn; Chung, Michael; Vasudevan, Kirankumar; Antani, Sameer
2006-02-01
As envisioned here, an "interactive publication" has similarities to multimedia documents that have been in existence for a decade or more, but possesses specific differentiating characteristics. In common usage, the latter refers to online entities that, in addition to text, consist of files of images and video clips residing separately in databases, rarely providing immediate context to the document text. While an interactive publication has many media objects as does the "traditional" multimedia document, it is a self-contained document, either as a single file with media files embedded within it, or as a "folder" containing tightly linked media files. The main characteristic that differentiates an interactive publication from a traditional multimedia document is that the reader would be able to reuse the media content for analysis and presentation, and to check the underlying data and possibly derive alternative conclusions leading, for example, to more in-depth peer reviews. We have created prototype publications containing paginated text and several media types encountered in the biomedical literature: 3D animations of anatomic structures; graphs, charts and tabular data; cell development images (video sequences); and clinical images such as CT, MRI and ultrasound in the DICOM format. This paper presents developments to date including: a tool to convert static tables or graphs into interactive entities, authoring procedures followed to create prototypes, and advantages and drawbacks of each of these platforms. It also outlines future work including meeting the challenge of network distribution for these large files.
He, Sijin; Yong, May; Matthews, Paul M; Guo, Yike
2017-03-01
TranSMART has a wide range of functionalities for translational research and a large user community, but it does not support imaging data. In this context, imaging data typically includes 2D or 3D sets of magnitude data and metadata information. Imaging data may summarise complex feature descriptions in a less biased fashion than user-defined plain text and numeric values. Imaging data is also contextualised by other data sets and may be analysed jointly with other data that can explain features or their variation. Here we describe the tranSMART-XNAT Connector we have developed. This connector consists of components for data capture, organisation and analysis. Data capture is responsible for image capture either from a PACS system, directly from an MRI scanner, or from raw data files. Data are organised in a similar fashion as in tranSMART and are stored in a format that allows direct analysis within tranSMART. The connector enables selection and download of DICOM images and associated resources using subjects' clinical phenotypic and genotypic criteria. The tranSMART-XNAT Connector is written in Java/Groovy/Grails. It is maintained and available for download at https://github.com/sh107/transmart-xnat-connector.git. sijin@ebi.ac.uk. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
Xu, Guoai; Li, Qi; Guo, Yanhui; Zhang, Miao
2017-01-01
Authorship attribution is the task of identifying the most likely author of a given sample among a set of known candidate authors. It can be applied not only to discover the original author of plain text, such as novels, blogs, emails, posts, etc., but also to identify source-code programmers. Authorship attribution of source code is required in diverse applications, ranging from malicious-code tracking to resolving authorship disputes and software plagiarism detection. This paper aims to propose a new method to identify the programmer of Java source code samples with higher accuracy. To this end, it first introduces a back-propagation (BP) neural network based on particle swarm optimization (PSO) into authorship attribution of source code. It begins by computing a set of defined feature metrics, including lexical and layout metrics and structure and syntax metrics, 19 dimensions in total. These metrics are then input to the neural network for supervised learning, the weights of which are produced by the hybrid PSO-BP algorithm. The effectiveness of the proposed method is evaluated on a collected dataset of 3,022 Java files belonging to 40 authors. Experimental results show that the proposed method achieves 91.060% accuracy. A comparison with previous work on authorship attribution of source code for the Java language illustrates that the proposed method outperforms the others overall, with an acceptable overhead. PMID:29095934
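A minimal sketch of the lexical/layout end of such a feature vector follows. The paper's 19 metrics are not enumerated in the abstract, so the three computed here (line-comment density, mean line length, tab-indentation ratio) are illustrative stand-ins, not the authors' exact definitions.

```python
def lexical_features(java_source):
    """Compute three illustrative lexical/layout metrics from Java source.
    These would form part of the input vector to a classifier."""
    lines = java_source.splitlines()
    n = max(len(lines), 1)  # avoid division by zero on empty input
    comment_lines = sum(1 for l in lines if l.lstrip().startswith("//"))
    mean_len = sum(len(l) for l in lines) / n
    tab_indented = sum(1 for l in lines if l.startswith("\t"))
    return {
        "comment_density": comment_lines / n,
        "mean_line_length": mean_len,
        "tab_indent_ratio": tab_indented / n,
    }
```

In the paper's pipeline, vectors like this one would be fed to the BP network, with PSO searching the weight space during training.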
75 FR 49233 - Amendments to Form ADV
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-12
...The Securities and Exchange Commission is adopting amendments to Part 2 of Form ADV, and related rules under the Investment Advisers Act, to require investment advisers registered with us to provide new and prospective clients with a brochure and brochure supplements written in plain English. These amendments are designed to provide new and prospective advisory clients with clearly written, meaningful, current disclosure of the business practices, conflicts of interest and background of the investment adviser and its advisory personnel. Advisers must file their brochures with us electronically and we will make them available to the public through our Web site. The Commission also is withdrawing the Advisers Act rule requiring advisers to disclose certain disciplinary and financial information.
An Invisible Text Watermarking Algorithm using Image Watermark
NASA Astrophysics Data System (ADS)
Jalil, Zunera; Mirza, Anwar M.
Copyright protection of digital content is very necessary in today's digital world, with efficient communication media such as the Internet. Text is the dominant part of Internet content, and there are very limited techniques available for text protection. This paper presents a novel algorithm for the protection of plain text, which embeds the logo image of the copyright owner in the text; this logo can later be extracted from the text to prove ownership. The algorithm is robust against content-preserving modifications and, at the same time, is capable of detecting malicious tampering. Experimental results demonstrate the effectiveness of the algorithm against tampering attacks by calculating normalized Hamming distances. The results are also compared with a recent work in this domain.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-31
...In the Rules and Regulations section of this issue of the Federal Register, the IRS and the Department of the Treasury (Treasury Department) are issuing temporary regulations that provide guidance on determining the ownership of a passive foreign investment company (PFIC), the annual filing requirements for shareholders of PFICs, and an exclusion from certain filing requirement for shareholders that constructively own interests in certain foreign corporations. The temporary regulations primarily affect shareholders of PFICs that do not currently file Form 8621, ``Information Return by a Shareholder of a Passive Foreign Investment Company or Qualified Electing Fund'', with respect to their PFIC interests. The temporary regulations also affect certain shareholders that rely on a constructive ownership exception to the requirement to file Form 5471, ``Information Return of U.S. Persons with Respect to Certain Foreign Corporations.'' The text of those temporary regulations published in this issue of the Federal Register also serves as the text of these proposed regulations.
Combination Base64 Algorithm and EOF Technique for Steganography
NASA Astrophysics Data System (ADS)
Rahim, Robbi; Nurdiyanto, Heri; Hidayat, Rahmat; Saleh Ahmar, Ansari; Siregar, Dodi; Putera Utama Siahaan, Andysah; Faisal, Ilham; Rahman, Sayuti; Suita, Diana; Zamsuri, Ahmad; Abdullah, Dahlan; Napitupulu, Darmawan; Ikhsan Setiawan, Muhammad; Sriadhi, S.
2018-04-01
The steganography process combines mathematics and computer science. Steganography consists of a set of methods and techniques for embedding data into another medium so that the contents are unreadable to anyone who lacks the authority to read them. The main objective of using the Base64 method is to convert any file in order to achieve privacy. This paper discusses a steganography and encoding method using Base64, a set of encoding schemes that convert binary data into a series of ASCII characters. The EOF (end-of-file) technique is then used to embed the text encoded by Base64. As an example of the mechanism, a file is used to represent the text; using the two methods together increases the level of protection for the data. This research aims to secure many types of files in a cover medium with good security, without damaging either the stored files or the cover medium.
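The Base64-plus-EOF embedding described above can be sketched as follows. This is a minimal illustration: the delimiter marker is hypothetical, since the paper does not specify how the appended payload is located, and real use would require a cover format whose viewers ignore trailing bytes.

```python
import base64

MARKER = b"--STEGO--"  # hypothetical delimiter; not specified by the paper

def embed(cover_path, secret_text, out_path):
    """Append the Base64-encoded secret after the cover file's data (EOF
    technique); most image viewers ignore trailing bytes, so the cover
    still renders normally."""
    payload = base64.b64encode(secret_text.encode("utf-8"))
    with open(cover_path, "rb") as f:
        cover = f.read()
    with open(out_path, "wb") as f:
        f.write(cover + MARKER + payload)

def extract(stego_path):
    """Recover the secret by splitting on the marker and decoding."""
    with open(stego_path, "rb") as f:
        data = f.read()
    _, _, payload = data.partition(MARKER)
    return base64.b64decode(payload).decode("utf-8")
```

Note that Base64 is an encoding, not encryption; as the paper suggests, it obscures rather than cryptographically protects the payload.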
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barton, G.W. Jr.
In UCID-19588, Communicating between the Apple and the Wang, we described how to take Apple DOS text files and send them to the Wang, and how to return Wang files to the Apple. It is also possible to use your Apple as an Octopus terminal, and to exchange files with Octopus 7600's. Presumably, you can also talk to the Crays, or any other part of the system. This connection has another virtue. It eliminates one of the terminals in your office.
AstroBlend: Visualization package for use with Blender
NASA Astrophysics Data System (ADS)
Naiman, J. P.
2015-12-01
AstroBlend is a visualization package for use in the three-dimensional animation and modeling software Blender. It reads data in via a text file or can use pre-fabricated isosurface files stored as Wavefront OBJ files. AstroBlend supports a variety of codes such as FLASH (ascl:1010.082), Enzo (ascl:1010.072), and Athena (ascl:1010.014), and combines artistic 3D models with computational astrophysics datasets to create models and animations.
Divergence Measures Tool:An Introduction with Brief Tutorial
2014-03-01
... in detecting differences across a wide range of Arabic-language text files (they varied by genre, domain, spelling variation, size, etc.) ... These measures have been put to many uses in natural language processing (NLP), in the evaluation of machine translation (MT) ... Files uploaded into the tool must be .txt files in ASCII or UTF-8 format. This tool has been tested on English and Arabic script, but should ...
CancerNet redistribution via WWW.
Quade, G; Püschel, N; Far, F
1996-01-01
CancerNet from the National Cancer Institute contains nearly 500 ASCII files, updated monthly, with up-to-date information about cancer and the "Golden Standard" in tumor therapy. Perl scripts are used to convert these files to HTML documents. A complex algorithm, using regular-expression matching and extensive exception handling, detects headlines, listings and other constructs of the original ASCII text and converts them into their HTML counterparts. A table of contents is also created during the process. The resulting files are indexed for full-text search via WAIS. Building the complete CancerNet WWW redistribution takes less than two hours with a minimum of manual work. At 26,000 information requests per month, the average cost of delivering one document worldwide is about 19 cents.
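The regex-driven conversion described above can be sketched as follows, in Python rather than the original Perl. The ALL-CAPS headline heuristic and the anchor scheme are illustrative stand-ins for the article's more complex detection and exception-handling rules.

```python
import html
import re

# Assumed heuristic: a line entirely in capitals (plus digits, spaces,
# commas, hyphens) is a headline; everything else is paragraph text.
HEADLINE = re.compile(r"^[A-Z][A-Z0-9 ,-]+$")

def ascii_to_html(text):
    """Convert plain ASCII text to HTML, building a table of contents
    from detected headlines as we go."""
    body, toc = [], []
    for i, line in enumerate(text.splitlines()):
        line = line.rstrip()
        if not line:
            continue
        if HEADLINE.match(line):
            toc.append('<li><a href="#h%d">%s</a></li>' % (i, html.escape(line)))
            body.append('<h2 id="h%d">%s</h2>' % (i, html.escape(line)))
        else:
            body.append("<p>%s</p>" % html.escape(line))
    return "<ul>" + "".join(toc) + "</ul>" + "".join(body)
```

The original pipeline would additionally detect listings and handle the many formatting exceptions in the source files.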
Author fees for online publication
NASA Astrophysics Data System (ADS)
Like the journals themselves, AGU publication fees have been restructured to accommodate the new online, publish-as-ready approach. The new fee structure is based on authors' providing electronic files of their text and art in acceptable formats (Word, WordPerfect, and LaTeX for text, and .eps or .tif for digital art). However, if you are unable to supply electronic files, you can opt for a higher-charge, full-service route in which AGU will create electronic files from hard copy. All authors for AGU journals are expected to support the journal archive through fees based on number as well as size of article files. The revenue from these fees is set aside for the "Perpetual Care Trust Fund," which will support the migration of the journal archive to new formats or media as technology changes. For several journals, excess length fees remain in place to encourage submission of concisely written articles. During this first transition year, most author fees are based on the number of print page equivalents (pdf) in an article; in the future, however, charges are expected to be associated with file size. The specific fees for each journal are posted on AGU's Web site under Publications-Tools for Authors.
Action Research Methods: Plain and Simple
ERIC Educational Resources Information Center
Klein, Sheri R., Ed.
2012-01-01
Among the plethora of action research books on the market, there is no one text exclusively devoted to understanding how to acquire and interpret research data. Action Research Methods provides a balanced overview of the quantitative and qualitative methodologies and methods for conducting action research within a variety of educational…
Reading, Writing, and Rhetoric: An Inquiry into the Art of Legal Language.
ERIC Educational Resources Information Center
Ranney, Frances J.
1999-01-01
Describes research creating a feminist, rhetorical analysis of legal language by examining in detail both the "Plain English" and the "Law and Literature" movements. Examines legal texts that construct the "reasonable woman," asking how that hypothetical legal subject is construed by judicial discourse and what its…
Wadeable Streams Assessment Data
The Wadeable Streams Assessment (WSA) is a first-ever statistically-valid survey of the biological condition of small streams throughout the U.S. The U.S. Environmental Protection Agency (EPA) worked with the states to conduct the assessment in 2004-2005. Data for each parameter sampled in the Wadeable Streams Assessment (WSA) are available for downloading in a series of files as comma separated values (*.csv). Each *.csv data file has a companion text file (*.txt) that lists a dataset label and individual descriptions for each variable. Users should view the *.txt files first to help guide their understanding and use of the data.
7 CFR 283.17 - Post-hearing procedure.
Code of Federal Regulations, 2013 CFR
2013-01-01
..., but not later than the time fixed for filing proposed findings of fact, conclusions of law, order and... text). (b) Proposed findings of fact, conclusions of law, order, and briefs. The parties may file proposed findings of fact, conclusions of law and orders based solely upon the record and on officially...
7 CFR 283.17 - Post-hearing procedure.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., but not later than the time fixed for filing proposed findings of fact, conclusions of law, order and... text). (b) Proposed findings of fact, conclusions of law, order, and briefs. The parties may file proposed findings of fact, conclusions of law and orders based solely upon the record and on officially...
78 FR 64883 - Filing Financial and Other Reports
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-30
... paragraph (a) introductory text to read as follows: Sec. 741.6 Financial and statistical and other reports..., statistical, and other reports and credit union profiles by requiring all federally insured credit unions.... Section 741.6(a) of NCUA's regulations requires FICUs to file financial, statistical, and other reports...
Finding Related Entities by Retrieving Relations: UIUC at TREC 2009 Entity Track
2009-11-01
classes, depending on the categories they belong to. A music album could have any generic name, whereas a laptop model has a more generalizable name. ... names of music albums are simply plain text, often capitalized, and so on. Thus, we feel that a better approach would be to first identify the ... origin domain of the text to be tagged (e.g., pharmaceutical, music, journal, etc.), and then apply tagging rules that are specific to that domain
Digital geologic map of the Butler Peak 7.5' quadrangle, San Bernardino County, California
Miller, Fred K.; Matti, Jonathan C.; Brown, Howard J.; digital preparation by Cossette, P. M.
2000-01-01
Open-File Report 00-145, is a digital geologic map database of the Butler Peak 7.5' quadrangle that includes (1) ARC/INFO (Environmental Systems Research Institute) version 7.2.1 Patch 1 coverages, and associated tables, (2) a Portable Document Format (.pdf) file of the Description of Map Units, Correlation of Map Units chart, and an explanation of symbols used on the map, btlrpk_dcmu.pdf, (3) a Portable Document Format file of this Readme, btlrpk_rme.pdf (the Readme is also included as an ascii file in the data package), and (4) a PostScript plot file of the map, Correlation of Map Units, and Description of Map Units on a single sheet, btlrpk.ps. No paper map is included in the Open-File report, but the PostScript plot file (number 4 above) can be used to produce one. The PostScript plot file generates a map, peripheral text, and diagrams in the editorial format of USGS Geologic Investigation Series (I-series) maps.
Mapping DICOM to OpenDocument format
NASA Astrophysics Data System (ADS)
Yu, Cong; Yao, Zhihong
2009-02-01
In order to enhance the readability, extensibility and sharing of DICOM files, we have introduced XML into the DICOM file system (SPIE Volume 5748)[1] and a multilayer tree structure into DICOM (SPIE Volume 6145)[2]. In this paper, we propose mapping DICOM to ODF (OpenDocument Format), as it is also based on XML. As a result, the new format realizes the separation of content (including text content and images) from display style. Meanwhile, since OpenDocument files take the form of a ZIP-compressed archive, the new kind of DICOM file can benefit from ZIP's lossless compression to reduce file size. Moreover, this open format can also guarantee long-term access to data without legal or technical barriers, making medical images accessible to various fields.
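The packaging idea above, an OpenDocument file as a ZIP archive of XML content plus embedded media, can be sketched as follows. The element names and the Pictures/ entry are illustrative, not the paper's actual DICOM-to-ODF mapping.

```python
import zipfile
from xml.sax.saxutils import escape

def write_odf_like(path, text_content, image_bytes):
    """Write a minimal ODF-style ZIP: an uncompressed mimetype entry
    first, XML content, and an embedded binary media entry."""
    with zipfile.ZipFile(path, "w", zipfile.ZIP_DEFLATED) as z:
        # By ODF convention "mimetype" is the first entry and is stored
        # uncompressed, so consumers can sniff the type without inflating.
        z.writestr(zipfile.ZipInfo("mimetype"),
                   "application/vnd.oasis.opendocument.text",
                   compress_type=zipfile.ZIP_STORED)
        z.writestr("content.xml",
                   "<document><text>%s</text></document>"
                   % escape(text_content))
        z.writestr("Pictures/image1.dcm", image_bytes)
```

The text content is deflate-compressed by the archive, which is the source of the lossless size reduction the paper describes.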
Discriminability measures for predicting readability of text on textured backgrounds
NASA Technical Reports Server (NTRS)
Scharff, L. F.; Hill, A. L.; Ahumada, A. J. Jr; Watson, A. B. (Principal Investigator)
2000-01-01
Several discriminability measures were examined for their ability to predict reading search times for three levels of text contrast and a range of backgrounds (plain, a periodic texture, and four spatial-frequency-filtered textures created from the periodic texture). Search times indicate that these background variations only affect readability when the text contrast is low, and that the spatial-frequency content of the background affects readability. These results were not well predicted by the single variables of text contrast (Spearman rank correlation = -0.64) and background RMS contrast (0.08), but a global masking index and a spatial-frequency-selective masking index led to better predictions (-0.84 and -0.81, respectively). © 2000 Optical Society of America.
Bollard, Tessa; Maubach, Ninya; Walker, Natalie; Ni Mhurchu, Cliona
2016-09-01
Consumption of sugar-sweetened beverages (SSBs) is associated with increased risk of obesity, diabetes, heart disease and dental caries. Our aim was to assess the effects of plain packaging, warning labels, and a 20 % tax on predicted SSB preferences, beliefs and purchase probabilities amongst young people. A 2 × 3 × 2 between-group experimental study was conducted over a one-week period in August 2014. Intervention scenarios were delivered, and outcome data collected, via an anonymous online survey. Participants were 604 New Zealand young people aged 13-24 years who consumed soft drinks regularly. Participants were randomly allocated using a computer-generated algorithm to view one of 12 experimental conditions, specifically images of branded versus plain packaged SSBs, with either no warning, a text warning, or a graphic warning, and with or without a 20 % tax. Participant perceptions of the allocated SSB product and of those who might consume the product were measured using seven-point Likert scales. Purchase probabilities were measured using 11-point Juster scales. Six hundred and four young people completed the survey (51 % female, mean age 18 (SD 3.4) years). All three intervention scenarios had a significant negative effect on preferences for SSBs (plain packaging: F(6, 587) = 54.4, p < 0.001; warning label: F(6, 588) = 19.8, p < 0.001; 20 % tax: F(6, 587) = 11.3, p < 0.001). Plain packaging and warning labels also had a significant negative impact on reported likelihood of purchasing SSBs (p < 0.001). A 20 % tax reduced participants' purchase probability but the difference was not statistically significant (p = 0.2). Plain packaging and warning labels significantly reduce young people's predicted preferences for, and reported probability of purchasing, SSBs.
MSL: Facilitating automatic and physical analysis of published scientific literature in PDF format.
Ahmed, Zeeshan; Dandekar, Thomas
2015-01-01
Published scientific literature contains millions of figures, including information about the results obtained from different scientific experiments e.g. PCR-ELISA data, microarray analysis, gel electrophoresis, mass spectrometry data, DNA/RNA sequencing, diagnostic imaging (CT/MRI and ultrasound scans), and medicinal imaging like electroencephalography (EEG), magnetoencephalography (MEG), echocardiography (ECG), positron-emission tomography (PET) images. The importance of biomedical figures has been widely recognized in scientific and medical communities, as they play a vital role in providing major original data, experimental and computational results in concise form. One major challenge for implementing a system for scientific literature analysis is extracting and analyzing text and figures from published PDF files by physical and logical document analysis. Here we present a product line architecture based bioinformatics tool 'Mining Scientific Literature (MSL)', which supports the extraction of text and images by interpreting all kinds of published PDF files using advanced data mining and image processing techniques. It provides modules for the marginalization of extracted text based on different coordinates and keywords, visualization of extracted figures and extraction of embedded text from all kinds of biological and biomedical figures using applied Optical Character Recognition (OCR). Moreover, for further analysis and usage, it generates the system's output in different formats including text, PDF, XML and images files. Hence, MSL is an easy to install and use analysis tool to interpret published scientific literature in PDF format.
ERIC Educational Resources Information Center
Woodruff, Allison; Rosenholtz, Ruth; Morrison, Julie B.; Faulring, Andrew; Pirolli, Peter
2002-01-01
Discussion of Web search strategies focuses on a comparative study of textual and graphical summarization mechanisms applied to search engine results. Suggests that thumbnail images (graphical summaries) can increase efficiency in processing results, and that enhanced thumbnails (augmented with readable textual elements) had more consistent…
76 FR 75898 - Sport Fishing and Boating Partnership Council
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-05
... following formats: One hard copy with original signature, and one electronic copy via email (acceptable file format: Adobe Acrobat PDF, WordPerfect, MS Word, MS PowerPoint, or Rich Text files in IBM-PC/Windows 98/2000/XP format). Please submit your statement to Douglas Hobbs, Council Coordinator (see FOR FURTHER...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-10
... references to ``OTC Bulletin Board'' and ``OTCBB'' with ``Non-NMS Quotation Service'' and ``NNQS.'' The text... filing the proposed rule change to rename the OTC Bulletin Board (``OTCBB'') as the Non-NMS Quotation... OTCBB assets do not include the technology comprising the interdealer quotation system operated by FINRA...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-15
...-Regulatory Organizations; NYSE Amex LLC; Notice of Filing and Immediate Effectiveness of Proposed Rule Change Implementing Changes to the Per Contract Execution Costs for Certain Participants March 9, 2012. Pursuant to... Schedule (``Fee Schedule'') to increase the per contract execution costs for certain participants. The text...
ERIC Educational Resources Information Center
Buchanan, Larry
1996-01-01
Defines HyperText Markup Language (HTML) as it relates to the World Wide Web (WWW). Describes steps needed to create HTML files on a UNIX system and to make them accessible via the WWW. Presents a list of basic HTML formatting codes and explains the coding used in the author's personal HTML file. (JMV)
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-12
... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-66754; File No. SR-Phlx-2012-41] Self.... I. Self-Regulatory Organization's Statement of the Terms of Substance of the Proposed Rule Change... XL'' for branding purposes. The text of the proposed rule change is available on the Exchange's Web...
Windows VPN Set Up | High-Performance Computing | NREL
Save the conf file in your My Documents folder, then configure the client software using that conf file. Configure the Client Software: start the Endian Connect App, configure the connection using the hpcvpn-win.conf file, uncheck the "save password" link, and add your UserID.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-22
... exchange to a fully automated limit-order matching system. This filing would alter or eliminate references... Article 6, Rule 11 (Continuing Education for Registered Persons), we are deleting the rule text which was..., which addressed cancellation or modification of transactions due to systems malfunctions or disruptions...
Extending the Online Public Access Catalog into the Microcomputer Environment.
ERIC Educational Resources Information Center
Sutton, Brett
1990-01-01
Describes PCBIS, a database program for MS-DOS microcomputers that features a utility for automatically converting online public access catalog search results stored as text files into structured database files that can be searched, sorted, edited, and printed. Topics covered include the general features of the program, record structure, record…
Updating the Evidence for Oceans on Early Mars
NASA Technical Reports Server (NTRS)
Fairen, Alberto G.; Dohm, James M.; Oner, Tayfun; Ruiz, Javier; Rodriguez, Alexis P.; Schulze-Makuch, Dirk; Ormoe, Jens; McKay, Chris P.; Baker, Victor R.; Amils, Ricardo
2004-01-01
Different-sized bodies of water have been proposed to have occurred episodically in the lowlands of Mars throughout the planet's history, largely related to major stages of development of Tharsis and/or orbital obliquity. These water bodies range from large oceans in the Noachian-Early Hesperian, to a minor sea in the Late Hesperian, and dispersed lakes during the Amazonian. To evaluate the more recent discoveries regarding the oceanic possibility, here we perform a comprehensive analysis of the evolution of water on Mars, including: 1. Geological assessment of proposed shorelines; 2. A volumetric approximation to the plains-filling proposed oceans; 3. Geochemistry of the oceans and derived mineralogies; 4. Post-oceanic (i.e., Amazonian) evolution of the shorelines; and 5. Ultimate water evolution on Mars.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-18
... Delete the Text of Rule 1500, Which Governs MatchPoint's Functionality February 11, 2011. Pursuant to... the text of Rule 1500, which governs MatchPoint's functionality. The text of the proposed rule change... the proposed rule change. The text of those statements may be examined at the places specified in Item...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-01
... Dividend, Payment or Distribution, and To Make Related Clarifications to Rule Text April 25, 2013. Pursuant... distribution, and to make related clarifications to rule text. The text of the proposed rule change is... and discussed any comments it received on the proposed rule change. The text of these statements may...
Geology of Point Reyes National Seashore and vicinity, California: a digital database
Clark, Joseph C.; Brabb, Earl E.
1997-01-01
This Open-File report is a digital geologic map database. This pamphlet serves to introduce and describe the digital data. There is no paper map included in the Open-File report. The report does include, however, a PostScript plot file containing an image of the geologic map sheet with explanation, as well as the accompanying text describing the geology of the area. For those interested in a paper plot of information contained in the database or in obtaining the PostScript plot files, please see the section entitled 'For Those Who Aren't Familiar With Digital Geologic Map Databases' below. This digital map database, compiled from previously published and unpublished data and new mapping by the authors, represents the general distribution of surficial deposits and rock units in Point Reyes and surrounding areas. Together with the accompanying text file (pr-geo.txt or pr-geo.ps), it provides current information on the stratigraphy and structural geology of the area covered. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The scale of the source maps limits the spatial resolution (scale) of the database to 1:48,000 or smaller.
Nonnemaker, James; Kim, Annice; Shafer, Paul; Loomis, Brett; Hill, Edward; Holloway, John; Farrelly, Matthew
2016-05-01
We examined the potential impact of banning tobacco displays and mandating plain packaging and cigarette advertisements at the point of sale (POS) on adult outcomes. A virtual convenience store was created with scenarios in which the tobacco product display was either fully visible (status quo) or enclosed behind a cabinet (display ban), and cigarette packs and advertisements were either in full color (status quo) or black and white, text only (plain). A national convenience sample of 1313 adult current smokers and recent quitters was randomized to 1 of 4 conditions and given a shopping task to complete in the virtual store. Main outcomes were participants' self-reported urge to smoke and tobacco purchase attempts in the virtual store. Compared with recent quitters in the status quo conditions, recent quitters in the display ban condition had lower urges to smoke (β = -4.82, 95% CI = -8.16 to -1.49, p < 0.01). Compared with current smokers in the status quo conditions, smokers in the display ban conditions were less likely to attempt to purchase cigarettes in the virtual store (OR = 0.05, 95% CI = 0.03-0.08, p < 0.01). Smokers exposed to plain packs and ads were significantly less likely to attempt to purchase cigarettes (OR = 0.31, 95% CI = 0.20-0.47, p < 0.01) than those exposed to color packs and ads. Policies that ban the display of tobacco products or require plain packaging and advertising at the POS may help reduce adult smoking. Copyright © 2016 Elsevier Ltd. All rights reserved.
Informatics in radiology (infoRAD): HTML and Web site design for the radiologist: a primer.
Ryan, Anthony G; Louis, Luck J; Yee, William C
2005-01-01
A Web site has enormous potential as a medium for the radiologist to store, present, and share information in the form of text, images, and video clips. With a modest amount of tutoring and effort, designing a site can be as painless as preparing a Microsoft PowerPoint presentation. The site can then be used as a hub for the development of further offshoots (eg, Web-based tutorials, storage for a teaching library, publication of information about one's practice, and information gathering from a wide variety of sources). By learning the basics of hypertext markup language (HTML), the reader will be able to produce a simple and effective Web page that permits display of text, images, and multimedia files. The process of constructing a Web page can be divided into five steps: (a) creating a basic template with formatted text, (b) adding color, (c) importing images and multimedia files, (d) creating hyperlinks, and (e) uploading one's page to the Internet. This Web page may be used as the basis for a Web-based tutorial comprising text documents and image files already in one's possession. Finally, there are many commercially available packages for Web page design that require no knowledge of HTML.
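The five construction steps above can be made concrete with a toy page. The sketch below is purely illustrative (the filenames, colors, and text are invented, and the HTML is held in a Python string for convenience); step (e), uploading, is represented here by writing the file locally:

```python
# Minimal page covering steps (a)-(d): formatted text, color, an image, a hyperlink.
PAGE = """<!DOCTYPE html>
<html>
<head><title>Teaching Library</title></head>
<body style="background-color: #ffffff; color: #000080">
<h1>Case of the Week</h1>
<p><b>Findings:</b> focal consolidation in the right lower lobe.</p>
<img src="cxr_case1.jpg" alt="Chest radiograph">
<p><a href="tutorial.html">Web-based tutorial</a></p>
</body>
</html>
"""

def write_page(path):
    # Stand-in for step (e): in practice the file is uploaded to a web server.
    with open(path, "w") as f:
        f.write(PAGE)
    return path
```

Opening the written file in any browser displays the formatted text, image placeholder, and hyperlink described in the primer.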
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-07
... Deleting the Text of Rule 409(f)--NYSE Amex Equities and Adopting the Text of FINRA Rule 2232 June 30, 2011... proposes to delete the text of Rule 409(f)--NYSE Amex Equities and adopt the text of FINRA Rule 2232. The text of the proposed rule change is available at the Exchange, the Commission's Public Reference Room...
Layout-aware text extraction from full-text PDF of scientific articles.
Ramakrishnan, Cartic; Patnia, Abhishek; Hovy, Eduard; Burns, Gully Apc
2012-05-28
The Portable Document Format (PDF) is the most commonly used file format for online scientific publications. The absence of effective means to extract text from these PDF files in a layout-aware manner presents a significant challenge for developers of biomedical text mining or biocuration informatics systems that use published literature as an information source. In this paper we introduce the 'Layout-Aware PDF Text Extraction' (LA-PDFText) system to facilitate accurate extraction of text from PDF files of research articles for use in text mining applications. Our paper describes the construction and performance of an open source system that extracts text blocks from PDF-formatted full-text research articles and classifies them into logical units based on rules that characterize specific sections. The LA-PDFText system focuses only on the textual content of the research articles and is meant as a baseline for further experiments into more advanced extraction methods that handle multi-modal content, such as images and graphs. The system works in a three-stage process: (1) Detecting contiguous text blocks using spatial layout processing to locate and identify blocks of contiguous text, (2) Classifying text blocks into rhetorical categories using a rule-based method and (3) Stitching classified text blocks together in the correct order resulting in the extraction of text from section-wise grouped blocks. We show that our system can identify text blocks and classify them into rhetorical categories with Precision = 0.96, Recall = 0.89 and F1 = 0.91. We also present an evaluation of the accuracy of the block detection algorithm used in step 2. Additionally, we have compared the accuracy of the text extracted by LA-PDFText to the text from the Open Access subset of PubMed Central. We then compared this accuracy with that of the text extracted by the PDF2Text system, commonly used to extract text from PDF.
Finally, we discuss preliminary error analysis for our system and identify further areas of improvement. LA-PDFText is an open-source tool for accurately extracting text from full-text scientific articles. The release of the system is available at http://code.google.com/p/lapdftext/.
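The reported precision and recall combine into F1 as their harmonic mean; a minimal sketch (not the paper's own scoring code):

```python
def f1_score(precision, recall):
    # Harmonic mean of precision and recall; defined as 0 when both are 0.
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)
```

Note that the harmonic mean of the overall figures 0.96 and 0.89 is about 0.92; the reported F1 of 0.91 presumably reflects averaging over the rhetorical categories rather than the pooled precision/recall pair.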
Layout-aware text extraction from full-text PDF of scientific articles
2012-01-01
Background The Portable Document Format (PDF) is the most commonly used file format for online scientific publications. The absence of effective means to extract text from these PDF files in a layout-aware manner presents a significant challenge for developers of biomedical text mining or biocuration informatics systems that use published literature as an information source. In this paper we introduce the ‘Layout-Aware PDF Text Extraction’ (LA-PDFText) system to facilitate accurate extraction of text from PDF files of research articles for use in text mining applications. Results Our paper describes the construction and performance of an open source system that extracts text blocks from PDF-formatted full-text research articles and classifies them into logical units based on rules that characterize specific sections. The LA-PDFText system focuses only on the textual content of the research articles and is meant as a baseline for further experiments into more advanced extraction methods that handle multi-modal content, such as images and graphs. The system works in a three-stage process: (1) Detecting contiguous text blocks using spatial layout processing to locate and identify blocks of contiguous text, (2) Classifying text blocks into rhetorical categories using a rule-based method and (3) Stitching classified text blocks together in the correct order resulting in the extraction of text from section-wise grouped blocks. We show that our system can identify text blocks and classify them into rhetorical categories with Precision = 0.96, Recall = 0.89 and F1 = 0.91. We also present an evaluation of the accuracy of the block detection algorithm used in step 2. Additionally, we have compared the accuracy of the text extracted by LA-PDFText to the text from the Open Access subset of PubMed Central. We then compared this accuracy with that of the text extracted by the PDF2Text system, commonly used to extract text from PDF.
Finally, we discuss preliminary error analysis for our system and identify further areas of improvement. Conclusions LA-PDFText is an open-source tool for accurately extracting text from full-text scientific articles. The release of the system is available at http://code.google.com/p/lapdftext/. PMID:22640904
1:2,000,000-scale digital line graph data on CD-ROM
1995-01-01
Updated U.S. Geological Survey digital line graph (DLG) data collected at a scale of 1:2,000,000 are now available on two compact disc read-only memory (CD-ROM) discs. Each CD-ROM contains digital cartographic data for 49 States and the District of Columbia. The U.S. Virgin Islands, Puerto Rico, and Alaska will be ready within the next year. These DLG data were originally collected from maps published in 1970. Extensive revisions have been made and no data source more than 5 years old was used in this update. In addition, text files containing information such as place names and population have been added for the first time. The records in these text files can be related to corresponding features in the DLG data files. Metadata that comply with the Federal Geographic Data Committee Content Standards for Digital Geospatial Metadata are included for each category of DLG data.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-23
... Requirements, and Adopt New Rule Text That Is Substantially Similar to FINRA Rule 4360 February 16, 2012... adopt new rule text that is substantially similar to FINRA Rule 4360. The text of the proposed rule... rule change. The text of those statements may be examined at the places specified in Item IV below. The...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-21
... Effectiveness of a Proposed Rule Change To Amend the Text in the Exchange Fees Schedule February 14, 2013... amend the text in the Fees Schedule. The text of the proposed rule change is available on the Exchange's... comments it received on the proposed rule change. The text of these statements may be examined at the...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-30
... Related to a Dividend, Payment or Distribution, and To Make Related Clarifications to Rule Text May 23... distribution, and to make related clarifications to rule text. The text of the proposed rule change is... proposed rule change and discussed any comments it received on the proposed rule change. The text of these...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-19
... into ADAMS, which provides text and image files of the NRC's public documents. If you do not have... section, except that State, local governmental bodies, and Federally- recognized Indian tribes do not need... media. Participants may not submit paper copies of their filings unless they seek an exemption in...
75 FR 47624 - Sport Fishing and Boating Partnership Council
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-06
... Coordinator in both of the following formats: One hard copy with original signature, and one electronic copy via e- mail (acceptable file format: Adobe Acrobat PDF, WordPerfect, MS Word, MS PowerPoint, or Rich Text files in IBM-PC/Windows 98/2000/XP format). In order to attend this meeting, you must register by...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-27
... Organizations; The Chicago Stock Exchange, Inc.; Notice of Filing of a Proposed Rule Change To Enhance Quotation... amend its rules to enhance quotation requirements for market makers. The text of this proposed rule... rules to enhance minimum quotation requirements for market makers. Under the proposal, the Exchange will...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-03
... program for certain claims arising from the initial public offering (``IPO'') of Facebook, Inc. (``FB... description and text of the proposed rule change, at least five business days prior to the date of the filing... Commission's Public Reference Room, 100 F Street NE., Washington, DC 20549-1090, on official business days...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-09
... market conditions. The text of the proposed rule change is attached as Exhibit 5.\\3\\ \\3\\ The Commission... event of unusual market conditions. This is a competitive filing that is based on two recently approved... expiration if unusual market conditions exist. The exchanges amended their rules to permit the opening of...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-05
... Organizations; C2 Options Exchange, Incorporated; Notice of Filing and Immediate Effectiveness of a Proposed Rule Change To Amend Rule 8.2 Regarding Market-Maker Registration Cost June 28, 2012. Pursuant to... cost. The text of the proposed rule change is available on the Exchange's Web site ( http://www...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-18
... an assessment that evaluates the statistical and economic impact of Straddle States on liquidity and..., along with a brief description and text of the proposed rule change, at least five business days prior... Organizations; BOX Options Exchange LLC; Notice of Filing and Immediate Effectiveness of Proposed Rule Change To...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-17
...; (5) two vertical Kaplan turbine-generator units with a combined capacity of 3.0 megawatts; (6) a new.../ferconline.asp ) under the ``eFiling'' link. For a simpler method of submitting text only comments, click on ``eComment.'' For assistance, please contact FERC Online Support at FERCOnlineSupport.gov ; call toll...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-17
... Organizations; C2 Options Exchange, Incorporated; Notice of Filing of a Proposed Rule Change Relating to Complex... its Rules regarding complex order auctions. The text of the proposed rule change is available on the... basis, the Exchange may activate the electronic complex order request for responses (``RFR'') auction...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-04
... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-63979; File No. SR-Phlx-2011-21] Self.... Self-Regulatory Organization's Statement of the Terms of Substance of the Proposed Rule Change The... ``Phlx XL II'' to ``PHLX XL'' for branding purposes. The text of the proposed rule change is available on...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-01
... Effectiveness of Proposed Rule Change Regarding the Listing of Option Series With $1 Strike Prices January 25... by the Exchange. The Exchange filed the proposal as a ``non- controversial'' proposed rule change... amend its rules regarding the listing of $1 strike prices. The text of the rule proposal is available on...
48 CFR 1652.204-72 - Filing health benefit claims/court review of disputed claims.
Code of Federal Regulations, 2014 CFR
2014-10-01
... AND FORMS CONTRACT CLAUSES Texts of FEHBP Clauses 1652.204-72 Filing health benefit claims/court... time of its original decision. (g) Court review. (1) A suit to compel enrollment under § 890.102 of... enrollment decision. (2) A suit to review the legality of OPM's regulations under this part must be brought...
The Toolbox for Local and Global Plagiarism Detection
ERIC Educational Resources Information Center
Butakov, Sergey; Scherbinin, Vladislav
2009-01-01
Digital plagiarism is a problem for educators all over the world. There are many software tools on the market for uncovering digital plagiarism. Most of them can work only with text submissions. In this paper, we present a new architecture for a plagiarism detection tool that can work with many different kinds of digital submissions, from plain or…
Handling Math Expressions in Economics: Recoding Spreadsheet Teaching Tool of Growth Models
ERIC Educational Resources Information Center
Moro-Egido, Ana I.; Pedauga, Luis E.
2017-01-01
In the present paper, we develop a teaching methodology for economic theory. The main contribution of this paper relies on combining the interactive characteristics of spreadsheet programs such as Excel and Unicode plain-text linear format for mathematical expressions. The advantage of Unicode standard rests on its ease for writing and reading…
1982-06-01
undecorated, plain surfaced utilitarian vessel, with loop handles on some specimens. The exceptions to this stereotype are few. A rare example of lip...latter. All are mammalian remains but none are classifiable as to family or genus. Falk describes the samples from both 23 CL 274 and 276 as consisting
ERIC Educational Resources Information Center
Haack, Paul A.
An overview of early country school music and music education in the Mountain Plains region of America provides impressions gained from texts, journals, official records, and personal interviews. Music is portrayed as a socializer to engender community spirit, an enhancement of patriotism, a means to enculturate to the "American way of…
S-SPatt: simple statistics for patterns on Markov chains.
Nuel, Grégory
2005-07-01
S-SPatt allows the counting of pattern occurrences in text files and, assuming these texts are generated from a random Markovian source, the computation of the P-value of a given observation using a simple binomial approximation.
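The binomial approximation mentioned above can be sketched as follows (this is an illustration, not S-SPatt's implementation): given an expected per-trial occurrence probability p under the Markov model, the P-value of observing at least n_obs pattern occurrences in n_trials positions is the upper tail of a Binomial(n_trials, p) distribution:

```python
from math import comb

def binomial_pvalue(n_obs, n_trials, p_expected):
    # P(X >= n_obs) for X ~ Binomial(n_trials, p_expected): the probability
    # of seeing at least the observed count by chance alone.
    return sum(
        comb(n_trials, k) * p_expected**k * (1 - p_expected)**(n_trials - k)
        for k in range(n_obs, n_trials + 1)
    )
```

The direct summation above is fine for small n_trials; for long texts a numerically stable tail computation (e.g. via the regularized incomplete beta function) would be preferred.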
Manheim, Frank T.; Lane-Bostwick, Candice M.
1989-01-01
A comprehensive database of chemical and mineralogical properties for ferromanganese crusts collected throughout the Atlantic, Pacific, and Indian Oceans has been assembled from published and unpublished sources which provide collection and analytical information for these samples. These crusts, their chemical compositions and natural distribution, have been a topic of interest to scientific research, as well as to industrial and military applications. Unlike abyssal ferromanganese nodules, which form in areas of low disturbance and high sediment accumulation, crusts have been found to contain three to five times more cobalt than these nodules, and can be found on harder, steeper substrates too steep for permanent sediment accumulation. They have also been documented on seamounts and plateaus within the U.S. exclusive economic zone in both Pacific and Atlantic Oceans, and these are therefore of strategic importance to the United States Government, as well as to civilian mining and metallurgical industries. The data tables provided in this report were digitized and previously uploaded to the National Oceanic and Atmospheric Administration National Geophysical Data Center in 1991 for online distribution, and were provided in plain text format. The 2014 update to the original U.S. Geological Survey open-file report published in 1989 provides these data tables in a slightly reformatted version to make them easier to ingest into geographic information system software, converts them to shapefiles, and adds completed metadata written and associated with them.
Isotopica: a tool for the calculation and viewing of complex isotopic envelopes.
Fernandez-de-Cossio, Jorge; Gonzalez, Luis Javier; Satomi, Yoshinori; Betancourt, Lazaro; Ramos, Yassel; Huerta, Vivian; Amaro, Abel; Besada, Vladimir; Padron, Gabriel; Minamino, Naoto; Takao, Toshifumi
2004-07-01
The web application Isotopica has been developed as an aid to the interpretation of ions that contain naturally occurring isotopes in a mass spectrum. It allows the calculation of mass values and isotopic distributions based on molecular formulas, peptides/proteins, DNA/RNA, carbohydrate sequences or combinations thereof. In addition, Isotopica takes modifications of the input molecule into consideration using a simple and flexible language as a straightforward extension of the molecular formula syntax. This function is especially useful for biomolecules, which are often subjected to additional modifications other than normal constituents, such as the frequently occurring post-translational modification in proteins. The isotopic distribution of any molecule thus defined can be calculated by considering full widths at half maximum or mass resolution. The combined envelope of several overlapping isotopic distributions of a mixture of molecules can be determined after specifying each molecule's relative abundance. The results can be displayed graphically on a local PC using the Isotopica viewer, a standalone application that is downloadable from the sites below, as a complement to the client browser. The m/z and intensity values can also be obtained in the form of a plain ASCII text file. The software has proved to be useful for peptide mass fingerprinting and validating an observed isotopic ion distribution with reference to the theoretical one, even from a multi-component sample. The web server can be accessed at http://bioinformatica.cigb.edu.cu/isotopica and http://coco.protein.osaka-u.ac.jp/isotopica [correction].
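An isotopic envelope like those Isotopica computes can be built by convolving per-atom isotope distributions, one atom at a time. The sketch below is illustrative only (Isotopica's own algorithm also handles exact masses, resolution, and peak widths); the abundance values are approximate textbook figures:

```python
def convolve(a, b):
    # Combine two distributions indexed by nominal mass offset (+0, +1, ...).
    out = [0.0] * (len(a) + len(b) - 1)
    for i, pa in enumerate(a):
        for j, pb in enumerate(b):
            out[i + j] += pa * pb
    return out

# Approximate natural abundances by nominal mass offset, for illustration.
ISOTOPES = {
    "C": [0.9893, 0.0107],             # 12C, 13C
    "H": [0.99988, 0.00012],           # 1H, 2H
    "O": [0.99757, 0.00038, 0.00205],  # 16O, 17O, 18O
}

def envelope(formula):
    # formula: element -> atom count, e.g. {"C": 6, "H": 12, "O": 6} for glucose.
    dist = [1.0]
    for element, count in formula.items():
        for _ in range(count):
            dist = convolve(dist, ISOTOPES[element])
    return dist
```

For glucose this yields a dominant monoisotopic peak followed by a +1 peak of roughly 6-7% (mostly from the six carbons), matching the qualitative shape of a small-molecule envelope.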
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-21
... Proposed Rule Change Adopting the Text of Financial Industry Regulatory Authority Rule 5210, Which... The Exchange proposes to adopt the text of Financial Industry Regulatory Authority (``FINRA'') Rule... Rule 5210. The text of the proposed rule change is available at the Exchange, the Commission's Public...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-09
... Contracts Overlying 10 Shares of a Security (``Mini-Options Contracts'') and Implementing Rule Text... contracts'') and implement rule text necessary to distinguish mini-options contracts from option contracts overlying 100 shares of a security (``standard contracts''). The text of the proposed rule change is...
Text-interpreter language for flexible generation of patient notes and instructions.
Forker, T S
1992-01-01
An interpreted computer language has been developed along with a windowed user interface and multi-printer-support formatter to allow preparation of documentation of patient visits, including progress notes, prescriptions, excuses for work/school, outpatient laboratory requisitions, and patient instructions. Input is by trackball or mouse with little or no keyboard skill required. For clinical problems with specific protocols, the clinician can be prompted with problem-specific items of history, exam, and lab data to be gathered and documented. The language implements a number of text-related commands as well as branching logic and arithmetic commands. In addition to generating text, it is simple to implement arithmetic calculations such as weight-specific drug dosages; multiple branching decision-support protocols for paramedical personnel (or physicians); and calculation of clinical scores (e.g., coma or trauma scores) while simultaneously documenting the status of each component of the score. ASCII text files produced by the interpreter are available for computerized quality audit. Interpreter instructions are contained in text files users can customize with any text editor.
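The combination of text substitution and arithmetic over clinical data described above can be sketched as a tiny template interpreter. This is a hypothetical illustration, not the paper's language (the token syntax is invented, and eval is used only for brevity; a real interpreter would parse expressions safely):

```python
def render_note(template, ctx):
    # {name} substitutes a context value; {=expr} evaluates arithmetic over
    # the context, e.g. a weight-specific drug dosage.
    out = []
    i = 0
    while i < len(template):
        if template[i] == "{":
            j = template.index("}", i)
            token = template[i + 1:j]
            if token.startswith("="):
                out.append(str(eval(token[1:], {}, ctx)))  # arithmetic command
            else:
                out.append(str(ctx[token]))                # text substitution
            i = j + 1
        else:
            out.append(template[i])
            i += 1
    return "".join(out)
```

For example, a weight-based dosing line: render_note("Amoxicillin {=round(weight_kg * 40)} mg/day for {name}.", {"weight_kg": 12.5, "name": "J. Doe"}) documents the computed dose in the generated note text.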
MSL: Facilitating automatic and physical analysis of published scientific literature in PDF format
Ahmed, Zeeshan; Dandekar, Thomas
2018-01-01
Published scientific literature contains millions of figures, including information about the results obtained from different scientific experiments e.g. PCR-ELISA data, microarray analysis, gel electrophoresis, mass spectrometry data, DNA/RNA sequencing, diagnostic imaging (CT/MRI and ultrasound scans), and medicinal imaging like electroencephalography (EEG), magnetoencephalography (MEG), echocardiography (ECG), positron-emission tomography (PET) images. The importance of biomedical figures has been widely recognized in scientific and medical communities, as they play a vital role in providing major original data, experimental and computational results in concise form. One major challenge for implementing a system for scientific literature analysis is extracting and analyzing text and figures from published PDF files by physical and logical document analysis. Here we present a product line architecture based bioinformatics tool ‘Mining Scientific Literature (MSL)’, which supports the extraction of text and images by interpreting all kinds of published PDF files using advanced data mining and image processing techniques. It provides modules for the marginalization of extracted text based on different coordinates and keywords, visualization of extracted figures and extraction of embedded text from all kinds of biological and biomedical figures using applied Optical Character Recognition (OCR). Moreover, for further analysis and usage, it generates the system’s output in different formats including text, PDF, XML and images files. Hence, MSL is an easy to install and use analysis tool to interpret published scientific literature in PDF format. PMID:29721305
Shankleman, M; Sykes, C; Mandeville, K L; Di Costa, S; Yarrow, K
2015-01-01
To investigate whether standardised cigarette packaging increases the time spent looking at health warnings, regardless of the format of those warnings. A factorial (two pack styles x three warning types) within-subject experiment, with participants randomised to different orders of conditions, completed at a university in London, UK. Mock-ups of cigarette packets were presented to participants with their branded portion in either standardised (plain) or manufacturer-designed (branded) format. Health warnings were present on all packets, representing all three types currently in use in the UK: black & white text, colour text, or colour images with accompanying text. Gaze position was recorded using a specialised eye tracker, providing the main outcome measure, which was the mean proportion of a five-second viewing period spent gazing at the warning-label region of the packet. An opportunity sample of 30 (six male, mean age = 23) young adults met the following inclusion criteria: 1) not currently a smoker; 2) <100 lifetime cigarettes smoked; 3) gaze position successfully tracked for > 50% viewing time. These participants spent a greater proportion of the available time gazing at the warning-label region when the branded section of the pack was standardised (following current Australian guidelines) rather than containing the manufacturer's preferred design (mean difference in proportions = 0.078, 95% confidence interval 0.049 to 0.106, p < 0.001). There was no evidence that this effect varied based on the type of warning label (black & white text vs. colour text vs. colour image & text; interaction p = 0.295). During incidental viewing of cigarette packets, young adult never-smokers are likely to spend more time looking at health warnings if manufacturers are compelled to use standardised packaging, regardless of the warning design. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
Tobacco product developments in the Australian market in the 4 years following plain packaging.
Scollo, Michelle; Bayly, Megan; White, Sarah; Lindorff, Kylie; Wakefield, Melanie
2017-10-09
This paper aimed to identify continued and emerging trends in the Australian tobacco market following plain packaging implementation, over a period of substantial increases in tobacco taxes. Since 2012, our surveillance activities (including review of trade product and price lists, ingredient reports submitted by tobacco companies to government and monitoring of the retail environment) found several trends in the factory-made cigarette market. These include the continued release of extra-long and slim cigarettes and packs with bonus cigarettes, particularly in the mainstream and premium market segments; new menthol capsule products; other novel flavourings in cigarettes; filter innovations including recessed and firm filters; continued use of evocative and descriptive product names; the proliferation of the new super-value market segment; and umbrella branding, where new products are introduced within established brand families. Several similar trends were also observed within the smoking tobacco market. While not all of these trends were new to the Australian market at the time of plain packaging implementation, their continued and increased use is notable. Plain packaging legislation could be strengthened to standardise cigarette and pack size, restrict brand and variant names, and ban features such as menthol capsules and filter innovations that provide novelty value or that may provide false reassurance to smokers. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Full-text, Downloading, & Other Issues.
ERIC Educational Resources Information Center
Tenopir, Carol
1983-01-01
Issues having a possible impact on online search services in libraries are discussed including full text databases, front-end processors which translate user's input into the command language of an appropriate system, downloading to create personal files from commercial databases, and pricing. (EJS)
Knudsen, Keith L.; Noller, Jay S.; Sowers, Janet M.; Lettis, William R.
1997-01-01
This Open-File report is a digital geologic map database. This pamphlet serves to introduce and describe the digital data. There are no paper maps included in the Open-File report. The report does include, however, PostScript plot files containing the images of the geologic map sheets with explanations, as well as the accompanying text describing the geology of the area. For those interested in a paper plot of information contained in the database or in obtaining the PostScript plot files, please see the section entitled 'For Those Who Aren't Familiar With Digital Geologic Map Databases' below. This digital map database, compiled from previously unpublished data and new mapping by the authors, represents the general distribution of surficial deposits in the San Francisco Bay region. Together with the accompanying text file (sf_geo.txt or sf_geo.pdf), it provides current information on Quaternary geology and liquefaction susceptibility of the San Francisco, California, 1:100,000 quadrangle. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The scale of the source maps limits the spatial resolution (scale) of the database to 1:100,000 or smaller. The content and character of the database, as well as three methods of obtaining the database, are described below.
ATLAS software configuration and build tool optimisation
NASA Astrophysics Data System (ADS)
Rybkin, Grigory; Atlas Collaboration
2014-06-01
The ATLAS software code base comprises over 6 million lines of code organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers, and is used by more than 2500 physicists from over 200 universities and laboratories on 6 continents. To meet the challenge of configuring and building this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and from project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was build-time performance, which was optimised through several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at the package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of the CMT commands used for builds; and introduction of package-level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was optimisation of CMT commands in general, which made them approximately 2 times faster. CMT can also generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CernVM-FS.
The use of parallelism, caching and code optimisation significantly reduced software build time and environment setup time (by several times), increased the efficiency of multi-core computing resource utilisation, and considerably improved the software developer and user experience.
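The package-level build parallelism described above can be sketched roughly as follows. This is a hypothetical Python illustration, not CMT's implementation: packages are grouped into "waves" so that every package in a wave depends only on packages built in earlier waves, and each wave is built in parallel across the available processors. The package names and dependencies are invented.

```python
import os
from concurrent.futures import ThreadPoolExecutor

# package -> packages it depends on (which must be built first)
deps = {
    "Core": [],
    "Utils": ["Core"],
    "Reco": ["Core", "Utils"],
    "Analysis": ["Reco"],
}

def build_order_waves(deps):
    """Group packages into waves of mutually independent packages."""
    remaining = dict(deps)
    waves, built = [], set()
    while remaining:
        ready = [p for p, d in remaining.items() if all(x in built for x in d)]
        if not ready:
            raise ValueError("circular dependency among packages")
        waves.append(sorted(ready))
        built.update(ready)
        for p in ready:
            del remaining[p]
    return waves

def build(pkg):
    return f"built {pkg}"  # stand-in for launching the actual compile

if __name__ == "__main__":
    workers = os.cpu_count() or 1  # cf. CMT's NUMBER-OF-PROCESSORS default
    for wave in build_order_waves(deps):
        with ThreadPoolExecutor(max_workers=workers) as pool:
            print(list(pool.map(build, wave)))
```

Within a wave the builds are independent, so this captures the idea of parallelising independent packages while still serialising across dependency levels.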
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-07
... Proposed Rule Change Deleting the Text of NYSE Rule 409(f) and Adopting the Text of FINRA Rule 2232 and... of the Terms of Substance of the Proposed Rule Change The Exchange proposes (1) To delete the text of NYSE Rule 409(f) and adopt the text of FINRA Rule 2232 and (2) delete the Rule Interpretations to NYSE...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dolan, Daniel H.; Ao, Tommy
The Sandia Data Archive (SDA) format is a specific implementation of the HDF5 (Hierarchical Data Format version 5) standard. The format was developed for storing data in a universally accessible manner. SDA files may contain one or more data records, each associated with a distinct text label. Primitive records provide basic data storage, while compound records support more elaborate grouping. External records allow text/binary files to be carried inside an archive and later recovered. This report documents version 1.0 of the SDA standard. The information provided here is sufficient for reading from and writing to an archive. Although the format was originally designed for use in MATLAB, broader use is encouraged.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-29
...'') the identity of the firm entering a Directed Order until May 31, 2011. II. Self-Regulatory... change. The text of these statements may be examined at the places specified in Item IV below. The self... change to identify to a DMM the identity of the firm entering a Directed Order. The ISE filed this system...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-12
...-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of Proposed Rule... Defray the Costs of Routing Orders to Away Markets April 6, 2011. Pursuant to Section 19(b)(1) of the... routing fees to defray the costs of routing orders to away markets. The text of the proposed rule change...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-03
... meeting. Written statements should be supplied to the DFO in the following formats: One hard copy with original signature and one electronic copy via e-mail (acceptable file format: Adobe Acrobat PDF, MS Word, WordPerfect, MS PowerPoint, or Rich Text files in IBM-PC/Windows 98/2000/XP format). Submitters are...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-25
... their consideration. Written statements should be supplied to the DFO in the following formats: one hard copy with original signature, and one electronic copy via e-mail (acceptable file format: Adobe Acrobat PDF, WordPerfect, MS Word, MS PowerPoint, or Rich Text files in IBM-PC/ Windows 98/2000/XP format...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-29
... statements should be supplied to the DFO in the following formats: one hard copy with original signature, and one electronic copy via e-mail (acceptable file format: Adobe Acrobat PDF, WordPerfect, MS Word, MS PowerPoint, or Rich Text files in IBM-PC/Windows 98/2000/XP format). Submitters are asked to provide...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-30
... should be supplied to the DFO in the following formats: One hard copy with original signature and one electronic copy via e-mail (acceptable file format: Adobe Acrobat PDF, MS Word, WordPerfect, MS PowerPoint, or Rich Text files in IBM-PC/Windows 98/2000/XP format). Submitters are asked to provide electronic...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-21
... be supplied to the DFO in the following formats: One hard copy with original signature, and one electronic copy via e-mail (acceptable file format: Adobe Acrobat PDF, WordPerfect, MS Word, MS PowerPoint, or Rich Text files in IBM-PC/ Windows 98/2000/XP format). Submitters are requested to provide two...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-30
... supplied to the DFO in the following formats: One hard copy with original signature, and one electronic copy via e-mail (acceptable file format: Adobe Acrobat PDF, WordPerfect, MS Word, MS PowerPoint, or Rich Text files in IBM-PC/ Windows 98/2000/XP format). Submitters are requested to provide two versions...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-11
... supplied to the DFO in the following formats: one hard copy with original signature and one electronic copy via e-mail (acceptable file format: Adobe Acrobat PDF, MS Word, WordPerfect, MS PowerPoint, or Rich Text files in IBM-PC/Windows 98/2000/XP format). Submitters are asked to provide versions of each...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-25
... statements should be supplied to the DFO in the following formats: One hard copy with original signature and one electronic copy via e-mail (acceptable file format: Adobe Acrobat PDF, WordPerfect, MS Word, MS PowerPoint, or Rich Text files in IBM-PC/Windows 98/2000/XP format). Submitters are requested to provide...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-08
.... Written statements should be supplied to the DFO in the following formats: one hard copy with original signature, and one electronic copy via e-mail (acceptable file format: Adobe Acrobat PDF, WordPerfect, MS Word, MS PowerPoint, or Rich Text files in IBM-PC/Windows 98/2000/XP format). Submitters are asked to...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-01
... their consideration. Written statements should be supplied to the DFO in the following formats: one hard copy with original signature, and one electronic copy via e-mail (acceptable file format: Adobe Acrobat PDF, WordPerfect, MS Word, MS PowerPoint, or Rich Text files in IBM-PC/ Windows 98/2000/XP format...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-08
... Organizations; NASDAQ OMX BX; Notice of Filing and Immediate Effectiveness of Proposed Rule Change To Dissolve... dissolve the BOX Committee of the Board of Directors. The text of the proposed rule change is available at... Basis for, the Proposed Rule Change 1. Purpose The Exchange proposes to dissolve the BOX Committee of...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-06
... Effectiveness of Proposed Rule Change To Amend Its Minor Rule Violation Plan July 24, 2010. Pursuant to Section... CBOE Rule 17.50--Imposition of Fines for Minor Rule Violations. The text of the proposed rule change is... (``Commission'') approved a CBOE rule filing amending Rule 17.50-- Imposition of Fines for Minor Rule Violations...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-12
... vertical Kaplan turbine-generator units with a combined capacity of 2.5 megawatts; (5) a new 3-MVA.../ferconline.asp ) under the ``eFiling'' link. For a simpler method of submitting text only comments, click on... this project, including a copy of the application can be viewed or printed on the ``eLibrary'' link of...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-17
...-foot diameter penstock; (4) two vertical Kaplan turbine-generator units with a combined capacity of 7.0... ) under the ``eFiling'' link. For a simpler method of submitting text only comments, click on ``Quick... project, including a copy of the application can be viewed or printed on the ``eLibrary'' link of...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-12
... penstock; (4) two vertical Kaplan turbine- generator units with a combined capacity of 3.5 megawatts; (5) a.../ferconline.asp ) under the ``eFiling'' link. For a simpler method of submitting text only comments, click on... this project, including a copy of the application can be viewed or printed on the ``eLibrary'' link of...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-27
... Change To Codify Prices for Co-Location Services October 21, 2010. Pursuant to Section 19(b)(1) of the... (``Commission'') a proposed rule change to codify pricing for co- location services. The text of the proposed... for the Exchange's co-location services.\\3\\ This filing seeks to codify additional fees not included...
Banta, Edward R.; Provost, Alden M.
2008-01-01
This report documents HUFPrint, a computer program that extracts and displays information about model structure and hydraulic properties from the input data for a model built using the Hydrogeologic-Unit Flow (HUF) Package of the U.S. Geological Survey's MODFLOW program for modeling ground-water flow. HUFPrint reads the HUF Package and other MODFLOW input files, processes the data by hydrogeologic unit and by model layer, and generates text and graphics files useful for visualizing the data or for further processing. For hydrogeologic units, HUFPrint outputs such hydraulic properties as horizontal hydraulic conductivity along rows, horizontal hydraulic conductivity along columns, horizontal anisotropy, vertical hydraulic conductivity or anisotropy, specific storage, specific yield, and hydraulic-conductivity depth-dependence coefficient. For model layers, HUFPrint outputs such effective hydraulic properties as horizontal hydraulic conductivity along rows, horizontal hydraulic conductivity along columns, horizontal anisotropy, specific storage, primary direction of anisotropy, and vertical conductance. Text files tabulating hydraulic properties by hydrogeologic unit, by model layer, or in a specified vertical section may be generated. Graphics showing two-dimensional cross sections and one-dimensional vertical sections at specified locations also may be generated. HUFPrint reads input files designed for MODFLOW-2000 or MODFLOW-2005.
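The layer-effective horizontal properties mentioned above can be illustrated with the standard upscaling formulas for layered media: a thickness-weighted arithmetic mean for flow parallel to layering and a thickness-weighted harmonic mean for flow across it. This is a sketch of those textbook formulas, not HUFPrint's actual code, and the unit values are invented.

```python
def effective_kh(units):
    """Thickness-weighted arithmetic mean K: flow parallel to layering.

    units: list of (K, thickness) pairs for the hydrogeologic units
    intersected by one model layer.
    """
    total = sum(b for _, b in units)
    return sum(k * b for k, b in units) / total

def effective_kv(units):
    """Thickness-weighted harmonic mean K: flow perpendicular to layering."""
    total = sum(b for _, b in units)
    return total / sum(b / k for k, b in units)

# Invented example: three units (K in m/d, thickness in m) in one layer
units = [(10.0, 5.0), (0.1, 2.0), (25.0, 3.0)]
print(effective_kh(units))   # dominated by the permeable units
print(effective_kv(units))   # dominated by the low-K confining unit
```

The contrast between the two means is why a thin low-conductivity unit barely affects the layer's horizontal conductivity but controls its vertical conductance.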
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, S.; Bartoshesky, J.; Heimbuch, D.
1987-06-01
Precipitation and stream-water chemistry data were collected from three watersheds in the Coastal Plain region of Maryland during the period May 1984 through June 1985 in an attempt to determine the potential effects of acidic deposition on the chemistry of these streams. The study streams included Lyons Creek, Morgan Creek, and Granny Finley Branch; these streams were chosen based on their differential responses to storm events observed in a survey of Coastal Plain streams in the spring of 1983. Lyons Creek typically exhibited lower pH, acid-neutralizing capacity, and concentrations of base cations than observed in the other streams. Sulfate mass balances suggest that the soils in the Lyons Creek watershed also have less affinity for sulfur retention than do soils of the other watersheds. Acidic pulses were observed in all three streams during the spring months; however, the magnitude of these pulses was less than that observed in 1983. Modeling of the relationships between precipitation chemistry, watershed interactions, and stream chemistry suggests that precipitation acidity can influence stream-water acidity, depending upon hydrological conditions and availability of acid-neutralizing materials in the watersheds.
Image encryption using random sequence generated from generalized information domain
NASA Astrophysics Data System (ADS)
Xia-Yan, Zhang; Guo-Ji, Zhang; Xuan, Li; Ya-Zhou, Ren; Jie-Hua, Wu
2016-05-01
A novel image encryption method based on the random sequence generated from the generalized information domain and a permutation-diffusion architecture is proposed. The random sequence is generated by reconstruction from the generalized information file and discrete trajectory extraction from the data stream. The trajectory address sequence is used to generate a P-box to shuffle the plain image, while random sequences are treated as keystreams. A new factor, called the drift factor, is employed to accelerate and enhance the performance of the random sequence generator. An initial value is introduced to make the encryption method approximate a one-time pad. Experimental results show that the random sequences pass the NIST statistical test with a high ratio, and extensive analysis demonstrates that the new encryption scheme has superior security.
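A toy version of the permutation-diffusion architecture can make the structure concrete. This sketch is not the authors' scheme: Python's seeded PRNG stands in for the sequence derived from the generalized information domain, a shuffled index list plays the role of the P-box, and the per-message initial value chains the ciphertext during diffusion.

```python
import random

def _streams(key, iv, n):
    # One deterministic PRNG per (key, iv) pair; a stand-in for the
    # random sequence the paper derives from the generalized information file.
    rng = random.Random(key * 65536 + iv)
    perm = list(range(n))
    rng.shuffle(perm)                          # P-box: pixel-position shuffle
    keystream = [rng.randrange(256) for _ in range(n)]
    return perm, keystream

def encrypt(pixels, key, iv):
    perm, keystream = _streams(key, iv, len(pixels))
    shuffled = [pixels[i] for i in perm]       # permutation stage
    out, prev = [], iv & 0xFF
    for p, k in zip(shuffled, keystream):      # diffusion: chained XOR
        c = p ^ k ^ prev
        out.append(c)
        prev = c
    return out

def decrypt(cipher, key, iv):
    perm, keystream = _streams(key, iv, len(cipher))
    shuffled, prev = [], iv & 0xFF
    for c, k in zip(cipher, keystream):        # undo the diffusion
        shuffled.append(c ^ k ^ prev)
        prev = c
    pixels = [0] * len(cipher)
    for dst, src in enumerate(perm):           # invert the P-box
        pixels[src] = shuffled[dst]
    return pixels

plain = [52, 200, 7, 7, 99, 13]                # a tiny "image" as byte values
cipher = encrypt(plain, key=1234, iv=42)
assert decrypt(cipher, key=1234, iv=42) == plain
```

Because the initial value seeds both the streams and the diffusion chain, changing it changes the whole ciphertext, which is the sense in which such schemes approximate a one-time pad.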
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-11
... Amending Rule 472-- Equities, Which Addresses Communications With the Public, Adopting New Rule Text To... text to conform to the changes adopted by the Financial Industry Regulatory Authority, Inc. (``FINRA... ``JOBS Act'').\\4\\ The text of the proposed rule change is available on the Exchange's Web site at www...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-21
... Change Adopting the Text of Financial Industry Regulatory Authority Rule 5210, Which Prohibits the... Statement of the Terms of Substance of the Proposed Rule Change The Exchange proposes to adopt the text of... manipulative or deceptive quotations or transactions, as NYSE Arca Equities Rule 5210. The text of the proposed...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-14
... Proposed Rule Change To Delete Non-Operable Text Within Its Price List Applicable to Supplemental Liquidity... non-operable text within its Price List applicable to Supplemental Liquidity Providers (``SLPs''). The text of the proposed rule change is available on the Exchange's Web site at www.nyse.com , at the...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-11
... Rule Text To Conform to the Changes Adopted by the Financial Industry Regulatory Authority, Inc. for... public, to adopt new rule text to conform to the changes adopted by the Financial Industry Regulatory... Business Startups Act (the ``JOBS Act'').\\4\\ The text of the proposed rule change is available on the...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-22
... Adopting the Text of Financial Industry Regulatory Authority Rule 5210, Which Prohibits the Publication of... Terms of the Substance of the Proposed Rule Change The Exchange proposes to adopt the text of Financial... deceptive quotations or transactions, as NYSE Amex Equities Rule 5210. The text of the proposed rule change...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-02
... Eliminate Certain Rule Text Which Has Been Made Unnecessary Due to the Decommissioning of the OCC Hub June... text which has been made unnecessary due to the decommissioning of the Options Clearing Corporation (``OCC'') Hub. The text of the proposed rule change is available on BX's Web site, on the Commission's...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-23
... Deleting NYSE Amex Equities Rule 319, Which Addresses Fidelity Bond Requirements, and Adopt New Rule Text... 319, which addresses fidelity bond requirements, and adopt new rule text that is substantially similar to FINRA Rule 4360. The text of the proposed rule change is available at the Exchange, the Commission...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-13
... Proposed Rule Change To Cease Operating New York Block Exchange and Contemporaneously Delete the Text of... of the Proposed Rule Change The Exchange proposes to contemporaneously delete the text of Rule 1600, which governs NYBX functionality. The text of the proposed rule change is available on the Exchange's...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-27
... Effectiveness of Proposed Rule Change To Remove a Feature and Revise Outdated Text Regarding Certain Execution... Proposed Rule Change The Exchange is proposing to eliminate a feature and revise outdated text regarding certain of its execution rules. The text of the proposed rule change is available on CBOE's Web site at...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-02
... transparency and maintain clarity in the rules. The text of the proposed rule change is available on the... rule change and discussed any comments it received on the proposed rule change. The text of these... useful to explicitly define these terms within the rule text to reduce confusion. First, the Exchange...
Reprocessing of multi-channel seismic-reflection data collected in the Beaufort Sea
Agena, W.F.; Lee, Myung W.; Hart, P.E.
2000-01-01
Contained on this set of two CD-ROMs are stacked and migrated multi-channel seismic-reflection data for 65 lines recorded in the Beaufort Sea by the United States Geological Survey in 1977. All data were reprocessed by the USGS using updated processing methods resulting in improved interpretability. Each of the two CD-ROMs contains the following files: 1) 65 files containing the digital seismic data in standard, SEG-Y format; 2) 1 file containing navigation data for the 65 lines in standard SEG-P1 format; 3) an ASCII text file with cross-reference information for relating the sequential trace numbers on each line to cdp numbers and shotpoint numbers; 4) 2 small scale graphic images (stacked and migrated) of a segment of line 722 in Adobe Acrobat (R) PDF format; 5) a graphic image of the location map, generated from the navigation file; 6) PlotSeis, an MS-DOS Application that allows PC users to interactively view the SEG-Y files; 7) a PlotSeis documentation file; and 8) an explanation of the processing used to create the final seismic sections (this document).
2011-02-10
Designed for clinicians involved in the management of acutely ill patients, this book enables them to recognise quickly the appearance of life-threatening conditions. It also concentrates on the interpretation of plain chest and abdominal images.
Plain Speaking: A Theory and Grammar of Spontaneous Discourse.
1981-06-01
reveals surface linguistic phenomena that contradict traditional theories based on single sentence studies and longer texts artificially constructed...Excerpt 1, Chapter 1, to illustrate the importance of functional development and discernment. In the midst of discussing the case of two twins under study ...his social interactive behavior in kindergarten. Authority: Source: Study Method: Investigative filming of kids over time. Credentials: Excellent
The Study and Implementation of Text-to-Speech System for Agricultural Information
NASA Astrophysics Data System (ADS)
Zheng, Huoguo; Hu, Haiyan; Liu, Shihong; Meng, Hong
Broadcast and television coverage has increased to more than 98% in China. Information services delivered by radio offer wide coverage and low cost, and are easy for grass-roots farmers to accept. To let broadcast information services play a better role, and to address the shortage of information resources in rural areas, we researched and developed a text-to-speech system. The system includes two parts, a software subsystem and a hardware device, both of which can translate text into audio files. The software subsystem was implemented on top of third-party middleware, and the hardware subsystem was realized with microelectronics technology. Results indicate that the hardware performs better than the software. The system has been applied in Huailai, Hebei Province, where it has produced more than 8000 audio files as programming material for the local radio station.
Speeding up ontology creation of scientific terms
NASA Astrophysics Data System (ADS)
Bermudez, L. E.; Graybeal, J.
2005-12-01
An ontology is a formal specification of a controlled vocabulary. Ontologies are composed of classes (similar to categories), individuals (members of classes) and properties (attributes of the individuals). Having vocabularies expressed in a formal specification such as the Web Ontology Language (OWL) enables interoperability, because OWL can be fully processed by software programs. Two main, non-exclusive strategies exist for constructing an ontology: a top-down approach and a bottom-up approach. The former creates the top classes (main concepts) first and then finds the required subclasses and individuals. The latter starts from the individuals and then finds shared properties, promoting the creation of classes. At the Marine Metadata Interoperability (MMI) Initiative we used a bottom-up approach to create ontologies from simple vocabularies (those that are not expressed in a conceptual way). We found that the vocabularies were available in different formats (relational databases, plain files, HTML, XML, PDF) and were sometimes composed of thousands of terms, making ontology creation a very time-consuming activity. To expedite the conversion process we created a tool, VOC2OWL, that takes a vocabulary in a table-like structure (CSV or TAB format) and a conversion-property file, and automatically creates an ontology. We identified two basic structures of simple vocabularies: flat vocabularies (e.g., a phone directory) and hierarchical vocabularies (e.g., taxonomies). The property file defines a list of attributes for the conversion process for each structure type. The attributes include metadata information (title, description, subject, contributor, urlForMoreInformation), conversion flags (treatAsHierarchy, generateAutoIds) and other conversion information needed to create the ontology (columnForPrimaryClass, columnsToCreateClassesFrom, fileIn, fileOut, namespace, format).
We created more than 50 ontologies and generated more than 250,000 statements (or triples). These ontologies allowed domain experts to create 800 relations, enabling the inference of 2200 more relations among different vocabularies, at the MMI workshop "Advancing Domain Vocabularies" held in Boulder in August 2005.
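The bottom-up conversion described above (flat table plus property file in, triples out) can be sketched as follows. This is a hypothetical illustration, not VOC2OWL itself; the column names, namespace, and vocabulary rows are invented, and only the columnForPrimaryClass attribute from the property file is exercised.

```python
import csv
import io

# Invented stand-in for the conversion-property file
props = {"namespace": "http://example.org/vocab#",
         "columnForPrimaryClass": "category",
         "treatAsHierarchy": False}

# Invented flat vocabulary in CSV form
table = io.StringIO(
    "term,category,definition\n"
    "salinity,measurement,Amount of dissolved salt\n"
    "temperature,measurement,Degree of heat\n")

def vocab_to_triples(fp, props):
    """Bottom-up conversion: each row becomes an individual; the
    primary-class column promotes the creation of a class."""
    ns = props["namespace"]
    triples = []
    for row in csv.DictReader(fp):
        cls = ns + row[props["columnForPrimaryClass"]]
        ind = ns + row["term"]
        triples.append((cls, "rdf:type", "owl:Class"))
        triples.append((ind, "rdf:type", cls))            # individual -> class
        triples.append((ind, "rdfs:comment", row["definition"]))
    return triples

triples = vocab_to_triples(table, props)
for t in triples:
    print(t)
```

A real converter would serialize these as OWL/RDF (and handle hierarchies when treatAsHierarchy is set), but the class/individual/property split is the essential bottom-up step.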
The Cheetah Data Management System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kunz, P.F.; Word, G.B.
1991-03-01
Cheetah is a data management system based on the C programming language. The premise of Cheetah is that the 'banks' of FORTRAN-based systems should be 'structures' as defined by the C language. Cheetah is a system to manage these structures while preserving the use of the C language in its native form. For C structures managed by Cheetah, the user can use Cheetah utilities such as reading and writing, in a machine-independent form, both binary and text files to disk or over a network. Files written by Cheetah also contain a dictionary describing in detail the data contained in the file. Such information is intended to be used by interactive programs for presenting the contents of the file. Cheetah has been ported to many different operating systems with no operating-system-dependent switches.
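The idea of a self-describing, machine-independent file (a data dictionary travelling with the packed structures) can be sketched briefly. This is not Cheetah's format: it is a hypothetical Python illustration in which a JSON dictionary describes C-like struct members, and big-endian packing keeps the binary portion machine independent. The field names and values are invented.

```python
import json
import struct

fields = [("run", "i"), ("energy", "d")]       # C struct members: int, double
record = {"run": 42, "energy": 13.7}

def write_archive(fname, fields, record):
    fmt = ">" + "".join(f for _, f in fields)  # big-endian: machine independent
    dictionary = json.dumps({"fields": fields}).encode()
    with open(fname, "wb") as f:
        f.write(struct.pack(">I", len(dictionary)))   # dictionary length
        f.write(dictionary)                           # the data dictionary
        f.write(struct.pack(fmt, *(record[n] for n, _ in fields)))

def read_archive(fname):
    """A reader needs no prior knowledge of the structure: the
    dictionary in the file describes the data that follows."""
    with open(fname, "rb") as f:
        (dlen,) = struct.unpack(">I", f.read(4))
        meta = json.loads(f.read(dlen))
        fmt = ">" + "".join(f for _, f in meta["fields"])
        values = struct.unpack(fmt, f.read(struct.calcsize(fmt)))
        return dict(zip((n for n, _ in meta["fields"]), values))

write_archive("demo_archive.bin", fields, record)
print(read_archive("demo_archive.bin"))
```

As in Cheetah, an interactive browser could read only the dictionary to present the file's contents without compiling against the original structure definitions.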
Document Delivery from Full-Text Online Files: A Pilot Project.
ERIC Educational Resources Information Center
Gillikin, David P.
1990-01-01
Describes the Electronic Journal Retrieval Project (EJRP) developed at the University of Tennessee, Knoxville Libraries, to provide full-text journal articles from online systems. Highlights include costs of various search strategies; implications for library services; collection development and interlibrary loan considerations; and suggestions…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-06
...)(1) of the Securities Exchange Act of 1934 (``Act'') \\1\\ and Rule 19b-4 thereunder,\\2\\ notice is.... 78s(b)(1). \\2\\ 17 CFR 240.19b-4. I. Self-Regulatory Organization's Statement of the Terms of Substance... effective upon filing, the Exchange has designated these changes to be operative on August 2, 2010. The text...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-13
...\\ may also be known as Gold/Silver Index. The text of the proposed rule change is available on the...-Regulatory Organizations; Notice of Filing of Proposed Rule Change by NASDAQ OMX PHLX LLC To Expand the Number of Components in the PHLX Gold/Silver Sector\\SM\\ Known as XAU\\SM\\, on Which Options Are Listed and...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-30
... operative delay period contained in Exchange Act Rule 19b- 4(f)(6)(iii).\\4\\ The text of the proposed rule... concept of leverage is not novel to the markets, the Information Circular will be distributed to provide... system, and, in general to protect investors and the public interest. For the reasons noted in the filing...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-15
... Change To Correct a Typographical Error in Exchange Rule 1080 August 9, 2011. Pursuant to Section 19(b)(1... Rule 1080 (Phlx XL and XL II) to correct a typographical error. The text of the proposed rule change is... in subsection (m)(iii)(D) of Rule 1080. On July 13, 2011, the Exchange filed an immediately effective...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-07
... Deleting NYSE Amex Equities Rule 351(a)-(d) and Supplementary Material .10 and .13, Adopting the Text of... NYSE Amex Equities Rule 351(a)-(d) and Supplementary Material .10 and .13, adopt the text of FINRA Rule 4530, and make certain conforming changes. The text of the proposed rule change is available at the...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-24
... Proposed Rule Change Deleting the Text of NYSE Rule 92 and Adopting a New NYSE Rule 5320 That Is... Terms of Substance of the Proposed Rule Change The Exchange proposes to delete the text of NYSE Rule 92... same as Financial Industry Regulatory Authority (``FINRA'') Rule 5320. The text of the proposed rule...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-29
... To Include Text in Its Options Rules Governing the Use of Its Affiliate Broker-Dealer, Archipelago Securities LLC for Outbound Routing of Option Orders, and To Adopt Text in Its Options Rules To Permit the... Statement of the Terms of Substance of the Proposed Rule Change The Exchange proposes (1) To include text in...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-04
... Moving the Rule Text That Provides for Pegging on the Exchange From Supplementary Material .26 of Rule 70--Equities to Rule 13--Equities and Amending Such Text to (i) Permit Designated Market Maker Interest To Be... Proposed Rule Change The Exchange proposes to move the rule text that provides for pegging on the Exchange...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-12
... NYSE Arca Equities Rule 7.23 To Simplify Certain Aspects of the Text While Also Conforming Certain of... certain aspects of the text while also conforming certain of the percentages thereunder to the proposed changes to Rule 7.11. The text of the proposed rule change is available at the Exchange, the Commission's...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-24
... Deleting the Text of NYSE Amex Equities Rules 92, 513, 514 and Adopting New NYSE Amex Equities Rule 5320... the text of NYSE Amex Equities Rules 92, 513, and 514, which limit trading ahead of customer orders... Regulatory Authority (``FINRA'') Rule 5320. The text of the proposed rule change is available at the Exchange...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-12
... Aspects of the Text While Also Conforming Certain of the Percentages Thereunder to the Proposed Changes to... operates and amend Rule 4613(a) to simplify certain aspects of the text while also conforming certain of... breaker pilot pause. The text of the proposed rule change is below. Proposed new language is italicized...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-23
... Certain Obsolete Text April 17, 2013. Pursuant to Section 19(b)(1) \\1\\ of the Securities Exchange Act of... and Charges for Exchange Services (the ``Fee Schedule'') to remove certain obsolete text related to a.... The text of the proposed rule change is available on the Exchange's Web site at www.nyse.com , at the...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-26
... the Text of NYSE Amex Options Rule 993NY(b)(1)(B) To More Accurately Reflect the Regulatory Services... exchange (with the attendant obligations and conditions) and to clarify the text of NYSE Amex Options Rule... Exchange and the Financial Industry Regulatory Authority (``FINRA''). The text of the proposed rule change...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-12
... Rule 104--NYSE Amex Equities To Simplify Certain Aspects of the Text While Also Conforming Certain of... certain aspects of the text while also conforming certain of the percentages thereunder to the proposed changes to Rule 80C. The text of the proposed rule change is available at the Exchange, the Commission's...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-12
... Simplify Certain Aspects of the Text While Also Conforming Certain of the Percentages Thereunder to the... amend Rule 104 to simplify certain aspects of the text while also conforming certain of the percentages thereunder to the proposed changes to Rule 80C. The text of the proposed rule change is available at the...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-25
... Proposed Rule Change Revising Rule 61(a)(iii) To Harmonize the Existing Rule Text With the Recent Amendment... Rule Change The Exchange proposes to revise Rule 61(a)(iii) to harmonize the existing rule text with... the Consolidated Tape. The text of the proposed rule change is available on the Exchange's Web site at...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-03
... Proposed Rule Change Moving the Rule Text That Provides for Pegging on the Exchange From Supplementary Material .26 of NYSE Rule 70 to NYSE Rule 13 and Amending Such Text to (i) Permit Designated Market Maker... of the Terms of Substance of the Proposed Rule Change The Exchange proposes to move the rule text...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-26
... Clarify the Text of NYSE Arca Equities Rule 7.45(c)(1)(B) to More Accurately Reflect the Regulatory... attendant obligations and conditions) and to clarify the text of NYSE Arca Equities Rule 7.45(c)(1)(B) to... Financial Industry Regulatory Authority (``FINRA''). The text of the proposed rule change is available at...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-12
... 4613(a) To Simplify Certain Aspects of the Text While Also Conforming Certain of the Percentages... such rule operates and amend Rule 4613(a) to simplify certain aspects of the text while also conforming... proposed rule change and discussed any comments it received on the proposed rule change. The text of these...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-26
... Clarify the Text of NYSE Arca Options Rule 6.96(b)(1)(B) To More Accurately Reflect the Regulatory... attendant obligations and conditions) and to clarify the text of NYSE Arca Options Rule 6.96(b)(1)(B) to... Financial Industry Regulatory Authority (``FINRA''). The text of the proposed rule change is available at...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-18
... Change Adopting the Text of the FINRA Rule 7400 Series, the Order Audit Trail System (``OATS'') Rules... of the Terms of Substance of the Proposed Rule Change The Exchange proposes to adopt the text of the... changes. The text of the proposed rule change is available at the Exchange, the Commission's Public...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-07
... Text of FINRA Rule 4530, and Making Certain Conforming Changes June 30, 2011. Pursuant to Section 19(b... NYSE Rule 351(a)-(d) and Supplementary Material .10 and .13, adopt the text of FINRA Rule 4530, and make certain conforming changes. The text of the proposed rule change is available at the Exchange, the...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-29
... Change To Include Text in Its Options Rules Governing the Use of Its Affiliate Broker-Dealer, Archipelago Securities LLC for Outbound Routing of Option Orders, and To Adopt Text in Its Options Rules To Permit the... include text in its options rules governing the use of its affiliate broker-dealer, Archipelago Securities...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-10
...\\ The Exchange is not proposing to amend any rule text, but simply administering or enforcing an... Pilot). \\4\\ See Chapter VI, Section 5 regarding the Penny Pilot. The text of the proposed rule change is... comments it received on the proposed rule change. The text of these statements may be examined at the...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-10
... 3, 2010.\\3\\ The Exchange is not proposing to amend any rule text, but simply administering or... Pilot). \\4\\ See Rule 1034 regarding the Penny Pilot. The text of the proposed rule change is available... proposed rule change and discussed any comments it received on the proposed rule change. The text of these...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-22
... PHLX LLC Relating to Clarifying Amendments to the Rule Book August 16, 2011. Pursuant to Section 19(b... text. The Exchange also proposes to eliminate an unnecessary title in the Rule Book. The text of the... Rule Book. The various amendments relate to cross-references and text in several Rules that were not...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-19
... the message. Comments and suggestions should be provided in WordPerfect, Microsoft Word, PDF, or text file format. The full text of the interpretive rule is available at http://www1.eere.energy.gov.... The full text of the interpretive rule is available at http://www1.eere.energy.gov/buildings/appliance...
Supporting geoscience with graphical-user-interface Internet tools for the Macintosh
NASA Astrophysics Data System (ADS)
Robin, Bernard
1995-07-01
This paper describes a suite of Macintosh graphical-user-interface (GUI) software programs that can be used in conjunction with the Internet to support geoscience education. These software programs allow science educators to access and retrieve a large body of resources from an increasing number of network sites, taking advantage of the intuitive, simple-to-use Macintosh operating system. With these tools, educators can easily locate, download, and exchange not only text files but also sound resources, video movie clips, and software application files from their desktop computers. Another major advantage of these software tools is that they are available at no cost and may be distributed freely. The following GUI software tools are described, including examples of how they can be used in an educational setting: ∗ Eudora—an e-mail program ∗ NewsWatcher—a newsreader ∗ TurboGopher—a Gopher program ∗ Fetch—a software application for easy File Transfer Protocol (FTP) ∗ NCSA Mosaic—a worldwide hypertext browsing program. An explosive growth of online archives is currently underway as new electronic sites are continuously added to the Internet. Many of these resources may be of interest to science educators, who will find they can share not only ASCII text files but also graphic image files, sound resources, QuickTime movie clips, and hypermedia projects with colleagues around the world. These powerful yet simple-to-learn GUI software tools are providing a revolution in how knowledge can be accessed, retrieved, and shared.
[Effects of marshland reclamation on evapotranspiration in the Sanjiang Plain].
Jia, Zhi-jun; Zhang, Wen; Huang, Yao; Zhao, Xiao-song; Song, Chang-chun
2010-04-01
Extensive reclamation of marshland into cropland has had tremendous effects on the ecological environment in the Sanjiang Plain. Observations over marshland, rice paddy and soybean field were made with eddy covariance measuring systems from May to October in 2005, 2006 and 2007. The objective of this study was to identify the effects of the conversion of marshland to cropland on evapotranspiration in the Sanjiang Plain. The results showed that the diurnal variation curves of latent heat flux were single-peaked in marshland, rice paddy and soybean field. The daily maximum latent heat flux increased by 14%-130% in rice paddy in the three measuring years; in soybean field, however, it increased by 3%-77% in 2006 but decreased by 25%-40% in 2005 and 2007 compared with that in marshland. This difference was due to the change in leaf area index when marshland was reclaimed into cropland. The seasonal change of latent heat flux was identical for the three land use types. The daily averaged latent heat flux of rice paddy from May to October showed a 38%-53% increase compared with that of marshland, which resulted from the increases in net radiation and leaf area index. When marshland was reclaimed into soybean field, the variation of daily averaged latent heat flux depended primarily on precipitation. Precipitation was the main factor controlling evapotranspiration over soybean field, which was usually under conditions of soil water deficit. Drought caused an 11%-17% decrease in daily averaged latent heat flux over soybean field in 2005 and 2007, while sufficient precipitation caused a 22% increase in 2006, compared to marshland. Similarly, during the growing season from June to September, the total evapotranspiration of rice paddy increased by 24%-51% compared with that of marshland, while the total evapotranspiration of soybean field decreased by 19%-23% in 2005 and 2007 and increased by 19% in 2006.
It is concluded that evapotranspiration changes significantly when marshland is reclaimed into rice paddy or soybean field in the Sanjiang Plain. Compared to marshland, evapotranspiration is higher in rice paddy and in soybean field with sufficient precipitation, and lower in soybean field under drought. These changes are found to be highly related to the variations in net radiation, leaf area index and precipitation.
Brabb, Earl E.; Roberts, Sebastian; Cotton, William R.; Kropp, Alan L.; Wright, Robert H.; Zinn, Erik N.; Digital database by Roberts, Sebastian; Mills, Suzanne K.; Barnes, Jason B.; Marsolek, Joanna E.
2000-01-01
This publication consists of a digital map database on a geohazards web site, http://kaibab.wr.usgs.gov/geohazweb/intro.htm, this text, and 43 digital map images available for downloading at this site. The report is stored as several digital files, in ARC export (uncompressed) format for the database, and Postscript and PDF formats for the map images. Several of the source data layers for the images have already been released in other publications by the USGS and are available for downloading on the Internet. These source layers are not included in this digital database, but rather a reference is given for the web site where the data can be found in digital format. The exported ARC coverages and grids lie in UTM zone 10 projection. The pamphlet, which only describes the content and character of the digital map database, is included as Postscript, PDF, and ASCII text files and is also available on paper as USGS Open-File Report 00-127. The full versatility of the spatial database is realized by importing the ARC export files into ARC/INFO or an equivalent GIS. Other GIS packages, including MapInfo and ARCVIEW, can also use the ARC export files. The Postscript map image can be used for viewing or plotting in computer systems with sufficient capacity, and the considerably smaller PDF image files can be viewed or plotted in full or in part from Adobe ACROBAT software running on Macintosh, PC, or UNIX platforms.
Gibbs, Ann E.; Richmond, Bruce M.
2009-01-01
The Arctic Coastal Plain of northern Alaska, an area of strategic economic importance to the United States, is home to remote Native American communities and encompasses unique habitats of global significance. Coastal erosion along the Arctic coast is chronic and widespread; recent evidence suggests that erosion rates are among the highest in the world (up to ~16 m/yr) and may be accelerating. Coastal erosion adversely impacts energy-related infrastructure, natural shoreline habitats, and Native American communities. Climate change is thought to be a key component of recent environmental changes in the Arctic. Reduced sea-ice cover in the Arctic Ocean is one of the probable mechanisms responsible for increasing coastal exposure to wave attack and the resulting increase in erosion. Extended periods of permafrost melting and associated decrease in bluff cohesion and stability are another possible source of the increase in erosion. Several studies of selected areas on the Alaska coast document past shoreline positions and coastal change, but none have examined the entire North coast systematically. Results from these studies indicate high rates of coastal retreat that vary spatially along the coast. To address the need for a comprehensive and regionally consistent evaluation of shoreline change along the North coast of Alaska, the U.S. Geological Survey (USGS), as part of their Coastal and Marine Geology Program's (CMGP) National Assessment of Shoreline Change Study, is evaluating shoreline change from Peard Bay to the United States/Canadian border, using historical maps and photography and a standardized methodology that is consistent with other shoreline-change studies along the Nation's coastlines (for example, URL http://coastal.er.usgs.gov/shoreline-change/, last accessed March 2, 2009). This report contains photographs collected during an aerial-reconnaissance survey conducted in support of this study.
An accompanying ESRI ArcGIS shape file (and plain-text copy) indicates the position of the aircraft and time when each photograph was taken. The USGS-CMGP Field Activity ID for the survey is A-1-06-AK, and more information on the survey and how to view the photographs using Google Earth software is available online at: URL http://walrus.wr.usgs.gov/infobank/a/a106ak/html/a-1-06-ak.meta.html (last accessed March 2, 2009).
Draghi, Ferdinando; Gitto, Salvatore; Bortolotto, Chandra; Draghi, Anna Guja; Ori Belometti, Gioia
2017-02-01
Plantar fascia (PF) disorders commonly cause heel pain and disability in the general population. Imaging is often required to confirm diagnosis. This review article aims to provide simple and systematic guidelines for imaging assessment of PF disease, focussing on key findings detectable on plain radiography, ultrasound and magnetic resonance imaging (MRI). Sonographic characteristics of plantar fasciitis include PF thickening, loss of fibrillar structure, perifascial collections, calcifications and hyperaemia on Doppler imaging. Thickening and signal changes in the PF as well as oedema of adjacent soft tissues and bone marrow can be assessed on MRI. Radiographic findings of plantar fasciitis include PF thickening, cortical irregularities and abnormalities in the fat pad located deep below the PF. Plantar fibromatosis appears as well-demarcated, nodular thickenings that are iso-hypoechoic on ultrasound and show low-signal intensity on MRI. PF tears present with partial or complete fibre interruption on both ultrasound and MRI. Imaging description of further PF disorders, including xanthoma, diabetic fascial disease, foreign-body reactions and plantar infections, is detailed in the main text. Ultrasound and MRI should be considered as first- and second-line modalities for assessment of PF disorders, respectively. Indirect findings of PF disease can be ruled out on plain radiography. Teaching Points • PF disorders commonly cause heel pain and disability in the general population.• Imaging is often required to confirm diagnosis or reveal concomitant injuries.• Ultrasound and MRI respectively represent the first- and second-line modalities for diagnosis.• Indirect findings of PF disease can be ruled out on plain radiography.
NASA Astrophysics Data System (ADS)
Yoshimi, M.; Matsushima, S.; Ando, R.; Miyake, H.; Imanishi, K.; Hayashida, T.; Takenaka, H.; Suzuki, H.; Matsuyama, H.
2017-12-01
We conducted strong ground motion prediction for the active Beppu-Haneyama Fault zone (BHFZ), Kyushu island, southwestern Japan. Since the BHFZ runs through Oita and Beppu cities, strong ground motion as well as fault displacement may severely affect the cities. We constructed a 3-D velocity structure of a sedimentary basin, the Beppu Bay basin, through which the fault zone runs and where Oita and Beppu cities are located. The minimum shear-wave velocity of the 3-D model is 500 m/s. An additional 1-D structure is modeled for sites with softer sediment: the Holocene plain area. We observed, collected, and compiled data obtained from microtremor surveys, ground motion observations, boreholes, etc., including phase velocities and H/V ratios. A finer structure of the Oita Plain is modeled as a 250 m-mesh model, with an empirical relation among N-value, lithology, depth and Vs using borehole data, and then validated with the phase velocity data obtained by the dense microtremor array observation (Yoshimi et al., 2016). Synthetic ground motion has been calculated with a hybrid technique composed of a stochastic Green's function method (for the high-frequency waves), a 3-D finite difference method (for the low-frequency waves) and a 1-D amplification calculation. The fault geometry has been determined based on reflection surveys and the active fault map. The rake angles are calculated with a dynamic rupture simulation considering three fault segments under a stress field estimated from the source mechanisms of earthquakes around the faults (Ando et al., JpGU-AGU2017). Fault parameters such as the average stress drop, the size of asperities, etc. are determined based on an empirical relation proposed by Irikura and Miyake (2001). As a result, strong ground motion stronger than 100 cm/s is predicted on the hanging-wall side of the Oita Plain. This work is supported by the Comprehensive Research on the Beppu-Haneyama Fault Zone funded by the Ministry of Education, Culture, Sports, Science, and Technology (MEXT), Japan.
David R. Larsen; Ian Scott
2010-01-01
In the field of forestry, the outputs of forest growth models provide a wealth of detailed information that can often be difficult to analyze and interpret when presented either as plain-text summary tables or static stand visualizations. This paper describes the design and implementation of a cross-platform computer application for dynamic and interactive forest...
Charpentier, Ronald R.; Klett, T.R.; Obuch, R.C.; Brewton, J.D.
1996-01-01
This CD-ROM contains files in support of the 1995 USGS National assessment of United States oil and gas resources (DDS-30), which was published separately and summarizes the results of a 3-year study of the oil and gas resources of the onshore and state waters of the United States. The study describes about 560 oil and gas plays in the United States: confirmed and hypothetical, conventional and unconventional. A parallel study of the Federal offshore is being conducted by the U.S. Minerals Management Service. This CD-ROM contains files in multiple formats, so that almost any computer user can import them into word processors and spreadsheets. The tabular data include some tables not released in DDS-30. No proprietary data are released on this CD-ROM, but some tables of summary statistics from the proprietary files are provided. The complete text of DDS-30 is also available, as well as many figures. Also included are some of the programs used in the assessment, in source code and with supporting documentation. A companion CD-ROM (DDS-35) includes the map data and the same text data, but none of the tabular data or assessment programs.
Rep. Nunes, Devin [R-CA-21]
2009-06-26
House - 12/02/2009 Motion to Discharge Committee filed by Mr. Nunes. Petition No: 111-8. (All Actions) Notes: On 12/2/2009, a motion was filed to discharge the Committee on Natural Resources from consideration of H.R.3105. A discharge petition requires 218 signatures for further action. (Discharge Petition No. 111-8: text with signatures.) Tracker: This bill has the status Introduced.
Military Surviving Spouses Equity Act
Rep. Ortiz, Solomon P. [D-TX-27]
2009-01-28
House - 03/15/2010 Motion to Discharge Committee filed by Mr. Jones. Petition No: 111-10. (All Actions) Notes: On 3/15/2010, a motion was filed to discharge the Committee on Armed Services from consideration of H.R.775. A discharge petition requires 218 signatures for further action. (Discharge Petition No. 111-10: text with signatures.) Tracker: This bill has the status Introduced.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-30
...-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing of Proposed Rule Change Amending Rule 6.37A and Rule 6.64 March 23, 2010. Pursuant to Section 19(b)(1) \\1\\ of the Securities Exchange Act of 1934 (the... Change The Exchange proposes to amend Rule 6.37A and Rule 6.64. The text of the proposed rule change is...
Keep Terrorists Out of America Act
Rep. Boehner, John A. [R-OH-8]
2009-05-07
House - 11/18/2009 Motion to Discharge Committee filed by Mr. Hoekstra. Petition No: 111-7. (All Actions) Notes: On 11/18/2009, a motion was filed to discharge the Committee on Armed Services from consideration of H.R.2294. A discharge petition requires 218 signatures for further action. (Discharge Petition No. 111-7: text with signatures.) Tracker: This bill has the status Introduced.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-08
... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-63223; File No. SR-FINRA-2010-054] Self... interested persons. \\1\\ 15 U.S.C. 78s(b)(1). \\2\\ 17 CFR 240.19b-4. \\3\\ 17 CFR 240.19b-4(f)(6). I. Self... proposed rule change would not make any new changes to the text of FINRA rules. II. Self-Regulatory...
JOVIAL (J73) to Ada Translator.
1982-06-01
editors, file managers, and other APSE tools, the Translator will provide significant (though not total) automation of the conversion of J73 programs for use...global knowledge only of compool declarations; externals are not resolved until the compiled modules are linked. Creating a global data base during...translation (as shown in Figure 2-1) will require the job control, file management, and text editing capabilities which are provided by a typical
Rep. Carter, John R. [R-TX-31]
2009-01-28
House - 03/31/2009 Motion to Discharge Committee filed by Mr. Carter. Petition No: 111-2. (All Actions) Notes: On 3/31/2009, a motion was filed to discharge the Committee on Ways and Means from consideration of H.R.735. A discharge petition requires 218 signatures for further action. (Discharge Petition No. 111-2: text with signatures.) Tracker: This bill has the status Introduced.
Small Business Paperwork Mandate Elimination Act
Rep. Lungren, Daniel E. [R-CA-3]
2010-04-26
House - 09/15/2010 Motion to Discharge Committee filed by Mr. Lungren. Petition No: 111-13. (All Actions) Notes: On 9/15/2010, a motion was filed to discharge the Committee on Ways and Means from consideration of H.R.5141. A discharge petition requires 218 signatures for further action. (Discharge Petition No. 111-13: text with signatures.) Tracker: This bill has the status Introduced.
Code of Federal Regulations, 2010 CFR
2010-10-01
... transmitted online via the Internet at: http://fhwa-li.volpe.dot.gov or via American Standard Code Information... legal name 120 Text Legal Name B 19 138 Insured d/b/a name 60 Text Doing Business As Name If Different...
Database Management Systems: New Homes for Migrating Bibliographic Records.
ERIC Educational Resources Information Center
Brooks, Terrence A.; Bierbaum, Esther G.
1987-01-01
Assesses bibliographic databases as part of visionary text systems such as hypertext and scholars' workstations. Downloading is discussed in terms of the capability to search records and to maintain unique bibliographic descriptions, and relational database management systems, file managers, and text databases are reviewed as possible hosts for…
PDB explorer -- a web based algorithm for protein annotation viewer and 3D visualization.
Nayarisseri, Anuraj; Shardiwal, Rakesh Kumar; Yadav, Mukesh; Kanungo, Neha; Singh, Pooja; Shah, Pratik; Ahmed, Sheaza
2014-12-01
The PDB file format is a text format characterizing the three-dimensional structures of macromolecules held in the Protein Data Bank (PDB). Determined protein structures are often found in association with other molecules or ions such as nucleic acids, water, ions, and drug molecules, which can likewise be described in the PDB format and have been deposited in the PDB database. A PDB file is machine-generated and is not in a human-readable format; reading it requires a computational tool to interpret it. The objective of our present study is to develop free online software for the retrieval, visualization and reading of the annotation of a protein 3D structure available in the PDB database. The main aim is to present the PDB file in a human-readable format, i.e., the information in the PDB file is converted into readable sentences. It displays all possible information from a PDB file, including the 3D structure of that file. Programming and scripting languages such as Perl, CSS, JavaScript, Ajax, and HTML have been used for the development of PDB Explorer. The PDB Explorer directly parses the PDB file, calling methods for each parsed element: secondary-structure elements, atoms, coordinates, etc. PDB Explorer is freely available at http://www.pdbexplorer.eminentbio.com/home with no requirement of log-in.
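The fixed-column layout of PDB ATOM records makes the kind of parsing PDB Explorer performs straightforward to sketch. The Python fragment below is an illustration only (the actual tool is built with Perl, JavaScript, Ajax, CSS, and HTML); it parses one ATOM record by the column ranges defined in the PDB format specification and renders it as a readable sentence, in the spirit of the tool's "readable sentences" output:

```python
# Illustrative sketch of fixed-column PDB ATOM-record parsing; not the
# actual PDB Explorer code. Column ranges follow the PDB format spec.
def parse_atom_record(line):
    """Parse one ATOM/HETATM record into a dict of named fields."""
    return {
        "record": line[0:6].strip(),    # cols  1-6: record name
        "serial": int(line[6:11]),      # cols  7-11: atom serial number
        "name": line[12:16].strip(),    # cols 13-16: atom name
        "res_name": line[17:20].strip(),# cols 18-20: residue name
        "chain": line[21],              # col  22: chain identifier
        "res_seq": int(line[22:26]),    # cols 23-26: residue sequence no.
        "x": float(line[30:38]),        # cols 31-38: x coordinate
        "y": float(line[38:46]),        # cols 39-46: y coordinate
        "z": float(line[46:54]),        # cols 47-54: z coordinate
    }

def describe(atom):
    """Render a parsed record as a human-readable sentence."""
    return (f"Atom {atom['serial']} ({atom['name']}) belongs to residue "
            f"{atom['res_name']}{atom['res_seq']} in chain {atom['chain']} "
            f"at coordinates ({atom['x']}, {atom['y']}, {atom['z']}).")

record = "ATOM      1  N   MET A   1      38.198  19.582  28.998  1.00 49.32"
atom = parse_atom_record(record)
print(describe(atom))
```

A full viewer would loop over every line of the file and dispatch on the record name (HEADER, SEQRES, HELIX, ATOM, ...) to dedicated methods, as the abstract describes.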
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-17
... Change To Modify the Text of NSX Rule 11.15 To Clarify the Manner in Which Certain Orders are Routed by... Exchange, Inc. (``NSX''[supreg] or ``Exchange'') is proposing to modify the text of NSX Rule 11.15 to clarify the manner in which certain orders are routed by the Exchange to other market centers. The text of...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-17
... Proposed Rule Change Deleting NYSE Rules 132A, 132B, and 132C, Adopting the Text of the FINRA Rule 7400... Exchange proposes to delete NYSE Rules 132A, 132B, and 132C, adopt the text of the FINRA Rule 7400 Series, the Order Audit Trail System (``OATS'') Rules, and make certain conforming changes. The text of the...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-24
... Change Deleting the Text of NYSE Arca Equities Rules 6.16 and 6.16A, and Adopting New NYSE Arca Equities... Change The Exchange proposes to delete the text of NYSE Arca Equities Rules 6.16 and 6.16A, which limit... substantially the same as Financial Industry Regulatory Authority (``FINRA'') Rule 5320. The text of the...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-05
... Flow Providers and Clearing Members and Make a Conforming Change to the Current Text in the Fee... change to the current text in the Fee Schedule. The proposed change will be operative on July 1, 2012. The text of the proposed rule change is available on the Exchange's Web site at www.nyse.com , at the...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-25
... Revising Rule 61(a)(iii)--Equities To Harmonize the Existing Rule Text With the Recent Amendment to the CTA... the existing rule text with the recent amendment to the CTA Plan (and concordant change to the Nasdaq... text of the proposed rule change is available on the Exchange's Web site at www.nyse.com , at the...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-17
... Deleting NYSE Amex Equities Rules 132A, 132B, and 132C, Adopting the Text of the FINRA Rule 7400 Series... Equities Rules 132A, 132B, and 132C, adopt the text of the FINRA Rule 7400 Series, the Order Audit Trail System (``OATS'') Rules, and make certain conforming changes. The text of the proposed rule change is...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-26
... the Text of NYSE Rule 17(c)(2)(A)(ii) To More Accurately Reflect the Regulatory Services Agreement... obligations and conditions) and to clarify the text of NYSE Rule 17(c)(2)(A)(ii) to more accurately reflect... Authority (``FINRA''). The text of the proposed rule change is available at the Exchange, at the Exchange's...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-08
... Change Deleting the Text of NYSE Arca Equities Rule 5.2(b)(1) and Adopting New NYSE Arca Equities Rule... the text of NYSE Arca Equities Rule 5.2(b)(1) and adopt new NYSE Arca Equities Rule 5190 that is substantially the same as Financial Industry Regulatory Authority (``FINRA'') Rule 5190. The text of the...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-15
... Change Relating To Amend NYSE Arca Equities Rule 7.31(b) To Add Text Describing How Limit Orders Priced a... NYSE Arca Equities Rule 7.31(b) to add text describing how limit orders priced a specified percentage away from the national best bid or offer will be rejected by Exchange systems. The text of the proposed...
Lipid-converter, a framework for lipid manipulations in molecular dynamics simulations
Larsson, Per; Kasson, Peter M.
2014-01-01
Construction of lipid membrane and membrane protein systems for molecular dynamics simulations can be a challenging process. In addition, there are few available tools to extend existing studies by repeating simulations using other force fields and lipid compositions. To facilitate this, we introduce lipidconverter, a modular Python framework for exchanging force fields and lipid composition in coordinate files obtained from simulations. Force fields and lipids are specified by simple text files, making it easy to introduce support for additional force fields and lipids. The converter produces simulation input files that can be used for structural relaxation of the new membranes. PMID:25081234
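The force-field and lipid specifications are plain text files. As a sketch of that workflow, the following hypothetical two-column mapping format (illustrative only, not lipidconverter's actual syntax) shows how a simple text spec can drive atom renaming when exchanging force fields:

```python
def load_mapping(spec_text: str) -> dict:
    """Parse a simple two-column 'old_name new_name' mapping file, the kind
    of plain-text spec used to add force-field support. The column layout
    and ';' comment syntax here are illustrative, not the package's own."""
    mapping = {}
    for line in spec_text.splitlines():
        line = line.split(";", 1)[0].strip()   # drop comments and blanks
        if line:
            old, new = line.split()
            mapping[old] = new
    return mapping

def convert_atoms(atoms: list, mapping: dict) -> list:
    """Rename atoms, keeping any name the mapping does not cover."""
    return [mapping.get(a, a) for a in atoms]
```

New lipids or force fields are then supported by adding another such text file, with no code changes.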
Proof of cipher text ownership based on convergence encryption
NASA Astrophysics Data System (ADS)
Zhong, Weiwei; Liu, Zhusong
2017-08-01
Cloud storage systems save disk space and bandwidth through deduplication, but the use of this technology has attracted targeted security attacks: an attacker can obtain the original file simply by presenting its hash value to deceive the server into granting file ownership. To solve these security problems, and to meet the differing security requirements of files in a cloud storage system, an efficient, information-theoretically secure proof-of-ownership scheme is proposed. The scheme protects the data through convergent encryption and uses an improved block-level proof-of-ownership protocol, allowing block-level client-side deduplication and thereby achieving an efficient and secure cloud storage deduplication scheme.
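The two building blocks named above can be sketched briefly. In convergent encryption the key is derived from the plaintext itself, so identical files always encrypt to identical ciphertexts and remain deduplicable; in a block-level proof of ownership, the server challenges random block indices that only a holder of the full file can answer. The stream cipher below is a toy SHA-256 counter construction standing in for a real cipher such as AES, and all names are illustrative:

```python
import hashlib

def convergent_encrypt(data: bytes):
    """Toy convergent encryption: key = H(plaintext), so equal files yield
    equal ciphertexts (deduplicable). A SHA-256 counter keystream stands in
    for a real block cipher."""
    key = hashlib.sha256(data).digest()
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        out.extend(b ^ k for b, k in zip(data[i:i+32], block))
    return key, bytes(out)

def convergent_decrypt(key: bytes, ct: bytes) -> bytes:
    # The XOR keystream is symmetric, so decryption re-applies it.
    out = bytearray()
    for i in range(0, len(ct), 32):
        block = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        out.extend(b ^ k for b, k in zip(ct[i:i+32], block))
    return bytes(out)

def block_ownership_proof(data: bytes, challenge: list, block_size: int = 32):
    """Block-level proof of ownership: the server challenges random block
    indices; answering requires possession of the actual blocks, not just
    the file's overall hash."""
    return [hashlib.sha256(data[i*block_size:(i+1)*block_size]).hexdigest()
            for i in challenge]
```

Because the whole-file hash no longer suffices to answer a block challenge, the hash-only attack described in the abstract fails.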
Development of an indexed integrated neuroradiology reports for teaching file creation
NASA Astrophysics Data System (ADS)
Tameem, Hussain Z.; Morioka, Craig; Bennett, David; El-Saden, Suzie; Sinha, Usha; Taira, Ricky; Bui, Alex; Kangarloo, Hooshang
2007-03-01
The decrease in reimbursement rates for radiology procedures has placed even more pressure on radiology departments to increase their clinical productivity. Clinical faculty have less time for teaching residents, but with the advent and prevalence of an electronic environment that includes PACS, RIS, and HIS, there is an opportunity to create electronic teaching files for fellows, residents, and medical students. Experienced clinicians create these teaching files by selecting the most appropriate radiographic images and the clinical information relevant to each patient. Important cases are selected based on the difficulty of determining the diagnosis or the manifestation of rare diseases. This manual process of teaching file creation is time consuming and may not be practical under the pressure of increased demands on the radiologist. It is the goal of this research to automate teaching file creation: key images are selected manually, while key sections are extracted automatically from clinical reports and laboratory results. The text report is then indexed against two standard nomenclatures, UMLS and RadLex. Interesting teaching files can then be queried by the specific anatomy and findings within the clinical reports.
Chaos based video encryption using maps and Ikeda time delay system
NASA Astrophysics Data System (ADS)
Valli, D.; Ganesan, K.
2017-12-01
Chaos-based cryptosystems are an efficient means of achieving fast, highly secure multimedia encryption because of their elegant features, such as randomness, mixing, ergodicity, and sensitivity to initial conditions and control parameters. In this paper, two chaos-based cryptosystems are proposed: one uses a higher-dimensional 12D chaotic map, and the other is based on the Ikeda delay differential equation (DDE); both are suitable for designing a real-time secure symmetric video encryption scheme. These encryption schemes employ a substitution box (S-box) to diffuse the relationship between pixels of the plain video and the cipher video, along with diffusion of the current input pixel with the previous cipher pixel, called cipher block chaining (CBC). The proposed method enhances robustness against statistical, differential, and chosen/known plain-text attacks. Detailed analysis is carried out to demonstrate the security and uniqueness of the proposed scheme.
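The keystream-plus-CBC structure can be illustrated with a minimal sketch. The 1D logistic map below stands in for the paper's 12D map and Ikeda DDE, and the key parameters are illustrative; the point is only the shape of the scheme, in which a chaotic orbit supplies keystream bytes and each cipher pixel feeds into the next:

```python
def logistic_keystream(x0: float, r: float, n: int, burn: int = 100) -> list:
    """Keystream bytes from the logistic map x -> r*x*(1-x). A 1D map is a
    minimal stand-in for the higher-dimensional systems used in the paper."""
    x = x0
    for _ in range(burn):            # discard the transient
        x = r * x * (1 - x)
    out = []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)
    return out

def cbc_encrypt_pixels(pixels: list, key: tuple, iv: int = 0) -> list:
    """Chain each cipher pixel into the next (CBC-style diffusion), so a
    one-pixel change in the plain video propagates through the stream."""
    ks = logistic_keystream(key[0], key[1], len(pixels))
    prev, out = iv, []
    for p, k in zip(pixels, ks):
        c = (p ^ k ^ prev) & 0xFF
        out.append(c)
        prev = c
    return out

def cbc_decrypt_pixels(cipher: list, key: tuple, iv: int = 0) -> list:
    ks = logistic_keystream(key[0], key[1], len(cipher))
    prev, out = iv, []
    for c, k in zip(cipher, ks):
        out.append((c ^ k ^ prev) & 0xFF)
        prev = c
    return out
```

Sensitivity to the initial condition x0 means even a tiny key change produces an entirely different keystream, which is what makes such maps attractive for encryption.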
Lamy, Jean-Baptiste; Ugon, Adrien; Berthelot, Hélène
2016-01-01
Potential adverse effects (AEs) of drugs are described in their summary of product characteristics (SPCs), a textual document. Automatic extraction of AEs from SPCs is useful for detecting AEs and for building drug databases. However, this task is difficult because each AE is associated with a frequency that must be extracted and the presentation of AEs in SPCs is heterogeneous, consisting of plain text and tables in many different formats. We propose a taxonomy for the presentation of AEs in SPCs. We set up natural language processing (NLP) and table parsing methods for extracting AEs from texts and tables of any format, and evaluate them on 10 SPCs. Automatic extraction performed better on tables than on texts. Tables should be recommended for the presentation of the AEs section of the SPCs.
Powers, Richard B.
1993-01-01
This study provides brief discussions of the petroleum geology, play descriptions, and resource estimates of 220 individually assessed exploration plays in all 80 onshore geologic provinces within nine assessment regions of the continental United States in 1989; these 80 onshore provinces were assessed in connection with the determination of the Nation's estimated undiscovered resources of oil and gas. The present report covers the 25 provinces that make up Region 1, Alaska, and Region 2, Pacific Coast. It is our intention to issue Region 3, Colorado Plateau and Basin and Range, and Region 4, Rocky Mountains and Northern Great Plains, in book form as well. Regions 5 through 9 (West Texas and Eastern New Mexico, Gulf Coast, Midcontinent, Eastern Interior and Atlantic Coast) will be released individually, as Open-File Reports.
1996-05-01
8-7 COMPLETE TEXT OF THESIS ROCKET PROPULSION FUNDAMENTALS EXPERIMENTAL DATA (MICROSOFT EXCEL FILES) 4 ANALYSIS WORKSHEETS (MATHSOFT MATHCAD FILES...up and running. At ~413,000, this represents a very small investment considering it encompasses the entire program. Similar programs run at... investment would be needed along with over two man-years of effort. However, this is for the first flight article. Subsequent flight articles of identical
Cross-Matching of Very Large Catalogs
NASA Astrophysics Data System (ADS)
Martynov, M. V.; Bodryagin, D. V.
Modern astronomical catalogs and sky surveys, which contain billions of objects, belong to the class of "big data". Existing available services have limited functionality and do not include all required and available catalogs. The software package ACrId (Astronomical Cross Identification) for cross-matching large astronomical catalogs, which uses the HEALPix sphere-pixelization algorithm, the ReiserFS file system, and JSON-type text files for storage, has been developed at the Research Institution "Mykolaiv Astronomical Observatory".
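The bucketed cross-match idea can be sketched briefly. A plain latitude/longitude grid stands in here for HEALPix, and the tolerance and catalogs are illustrative; the point is that pixelating the sphere means each source is compared only against its neighboring cells, not against the whole second catalog:

```python
import math
from collections import defaultdict

def cross_match(cat_a, cat_b, radius_deg=0.001):
    """Naive bucketed cross-match of two (ra, dec) catalogs, in degrees.
    ACrId buckets with HEALPix; a flat lat/lon grid stands in here so the
    idea fits in a few lines: only 3x3 neighboring cells are compared."""
    cell = max(radius_deg, 1e-6)
    index = defaultdict(list)
    for j, (ra, dec) in enumerate(cat_b):
        index[(int(ra // cell), int(dec // cell))].append(j)
    matches = []
    for i, (ra, dec) in enumerate(cat_a):
        ci, cj = int(ra // cell), int(dec // cell)
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                for j in index.get((ci + di, cj + dj), []):
                    rb, db = cat_b[j]
                    # small-angle separation; adequate away from the poles
                    dra = (ra - rb) * math.cos(math.radians(dec))
                    if math.hypot(dra, dec - db) <= radius_deg:
                        matches.append((i, j))
    return matches
```

With billions of objects the bucketing is what makes the search tractable: the cost per source is bounded by the occupancy of nine cells rather than the size of the catalog.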
Rep. Miller, George [D-CA-11
2013-03-06
House - 02/26/2014 Motion to Discharge Committee filed by Mr. Bishop (NY). Petition No: 113-7. (All Actions) Notes: On 2/26/2014, a motion was filed to discharge the Committee on Education and the Workforce from the consideration of H.R.1010. A discharge petition requires 218 signatures for further action. (Discharge Petition No. 113-7: text with signatures.) Tracker: This bill has the status Introduced.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sorokine, Alexandre
2011-10-01
Simple Ontology Format (SOFT) library and file format specification provides a set of simple tools for developing and maintaining ontologies. The library, implemented as a perl module, supports parsing and verification of the files in SOFt format, operations with ontologies (adding, removing, or filtering of entities), and converting of ontologies into other formats. SOFT allows users to quickly create ontologies using only a basic text editor, verify it, and portray it in a graph layout system using customized styles.
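A minimal sketch of the edit-then-verify workflow that the library enables follows, using a hypothetical one-line-per-entity format (the real SOFT syntax, and the perl implementation, are richer than this):

```python
def parse_ontology(text: str) -> dict:
    """Parse a minimal 'entity: parent' text ontology into a dict.
    The line format and '#' comment syntax are hypothetical; they only
    illustrate editing an ontology with a basic text editor."""
    ontology = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()   # allow comments
        if not line:
            continue
        name, _, parent = (p.strip() for p in line.partition(":"))
        ontology[name] = parent or None
    return ontology

def verify(ontology: dict) -> list:
    """Verification pass: report entities whose parent is undefined."""
    return [n for n, p in ontology.items() if p and p not in ontology]
```

Filtering, merging, or converting to a graph-layout format then becomes a matter of walking the same dictionary.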
Providing for the consideration of legislation to reopen the Government.
Rep. Van Hollen, Chris [D-MD-8
2013-10-04
House - 10/12/2013 Motion to Discharge Committee filed by Mr. Van Hollen. Petition No: 113-5. (All Actions) Notes: On 10/12/2013, a motion was filed to discharge the Committee on Rules from the consideration of H.Res.372. A discharge petition requires 218 signatures for further action. (Discharge Petition No. 113-5: text with signatures.) Tracker: This bill has the status Introduced.
Rep. Van Hollen, Chris [D-MD-8
2012-02-09
House - 07/12/2012 Motion to Discharge Committee filed by Mr. Van Hollen. Petition No: 112-4. (All Actions) Notes: On 7/12/2012, a motion was filed to discharge the Committees on House Administration and the Judiciary from the consideration of H.R.4010. A discharge petition requires 218 signatures for further action. (Discharge Petition No. 112-4: text with signatures.) Tracker: This bill has the status Introduced.
Rep. Levin, Sander M. [D-MI-12
2012-07-30
House - 12/04/2012 Motion to Discharge Committee filed by Mr. Walz (MN). Petition No: 112-6. (All Actions) Notes: On 12/4/2012, a motion was filed to discharge the Committees on Ways and Means and the Budget from the consideration of H.R.15. A discharge petition requires 218 signatures for further action. (Discharge Petition No. 112-6: text with signatures.) Tracker: This bill has the status Introduced.
The Design and Usage of the New Data Management Features in NASTRAN
NASA Technical Reports Server (NTRS)
Pamidi, P. R.; Brown, W. K.
1984-01-01
Two new data management features were installed in the April 1984 release of NASTRAN: the Rigid Format Data Base and the READFILE capability. The Rigid Format Data Base is stored on external files in card-image format and can be easily maintained and expanded with standard text editors. This data base provides the user and the NASTRAN maintenance contractor with an easy means of changing a Rigid Format or of generating new Rigid Formats without unnecessary compilations and link editing of NASTRAN. Each Rigid Format entry in the data base contains the Direct Matrix Abstraction Program (DMAP), along with the associated restart, DMAP sequence subset, and substructure control flags. The READFILE capability allows a user to reference an external secondary file from the NASTRAN primary input file and to read data from this secondary file. There is no limit to the number of external secondary files that may be referenced and read.
ERIC Educational Resources Information Center
Qiao, Hong Liang; Sussex, Roland
1996-01-01
Presents methods for using the Longman Mini-Concordancer on tagged and parsed corpora rather than plain text corpora. The article discusses several aspects with models to be applied in the classroom as an aid to grammar learning. This paper suggests exercises suitable for teaching English to both native and nonnative speakers. (13 references)…
For and from Cyberspace: Conceptualizing Cyber Intelligence, Surveillance, and Reconnaissance
2012-12-01
intelligence. Cyber ISR, therefore, "requires the development of algorithms and visualization capabilities to make activities in the cyber domain... Pentagon, 19 January 2012), https://www.intelink.gov/inteldocs/action.php?kt_path_info=ktcore.actions.document.view&fDocumentId=1517681, defines...selected proxy servers, with successive levels of encryption and then decryption, before delivery to their final destination as plain text. W. Earl
Anna, L.O.
2009-01-01
The U.S. Geological Survey completed an assessment of the undiscovered oil and gas potential of the Powder River Basin in 2006. The assessment of undiscovered oil and gas used the total petroleum system concept, which includes mapping the distribution of potential source rocks and known petroleum accumulations and determining the timing of petroleum generation and migration. Geologically based, it focuses on source and reservoir rock stratigraphy, timing of tectonic events and the configuration of resulting structures, formation of traps and seals, and burial history modeling. The total petroleum system is subdivided into assessment units based on similar geologic characteristics and accumulation and petroleum type. In chapter 1 of this report, five total petroleum systems, eight conventional assessment units, and three continuous assessment units were defined, and the undiscovered oil and gas resources within each assessment unit were quantitatively estimated. Chapter 2 describes data used in support of the process being applied by the U.S. Geological Survey (USGS) National Oil and Gas Assessment (NOGA) project. Digital tabular data used in this report, and archival data that permit the user to perform further analyses, are available elsewhere on this CD-ROM. Readers may import the data into their own computers and software without transcribing it from the Portable Document Format files (.pdf files) of the text. Because of the number and variety of platforms and software available, graphical images are provided as .pdf files and tabular data are provided in raw form as tab-delimited text files (.tab files).
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-31
... Deleting Existing Rule Text Found in Rule 923NY(d)(1) to (4) Concerning the Number of ATP's Required by an... rule text found in Rule 923NY(d)(1) to (4) concerning the number of ATP's required by an NYSE Amex Options Market Maker to quote on the Exchange and move it to the Fee Schedule. The text of the proposed...
Reliable Electronic Text: The Elusive Prerequisite for a Host of Human Language Technologies
2010-09-30
is not always the case—for example, ligatures in Latin fonts, and glyphs in Arabic fonts (King, 2008; Carrier, 2009). This complexity, and others...such effects can render electronic text useless for natural language processing (NLP). Typically, file converters do not expose the details of the...the many component NLP technologies typically used inside information extraction and text categorization applications, such as tokenization, part-of
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-07
.... The text of the proposed rule change is available on the Exchange's Web site at http://nasdaqtrader... and discussed any comments it received on the proposed rule change. The text of these statements may... Mining Corporation (``NEM''); Palm, Inc. (``PALM''); Pfizer, Inc. (``PFE''); Potash Corp...
14 CFR 221.202 - The filing of tariffs and amendments to tariffs.
Code of Federal Regulations, 2010 CFR
2010-01-01
... time; (4) Proposed effective date; (5) Justification text; reference to geographic area and affected... part of official tariff”, in a manner acceptable to the Department. (c) A Historical File—which shall..., reference shall be made to that tariff containing the rule; (8) Rule text applicable to each fare at the...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-23
..., Incorporated (the ``Exchange'' or ``CBOE'') proposes to: (i) Make available historical Customized Option Pricing Service (``COPS'') data and (ii) revise the description of COPS. The text of the proposed rule.... The text of these statements may be examined at the places specified in Item IV below. The Exchange...
Chinese Communicating in the Culture Performance 2
ERIC Educational Resources Information Center
Walker, Galal
2005-01-01
This is the second text in a series of Mandarin Chinese learning texts. It continues with the theme of learning to communicate in various forms, especially with time and location, and the hanzi writing system is introduced. An MP3 file accompanies this book. Contents include: (1) Acknowledgments; (2) Introduction; (3) Unit Three, Time When:…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-19
... Exchange routing functionality and to describe available routing strategies in greater detail. The text of... received on the proposed rule change. The text of these statements may be examined at the places specified... Exchange routing functionality and to describe available routing strategies in greater detail. Also...
An Introduction to the Extensible Markup Language (XML).
ERIC Educational Resources Information Center
Bryan, Martin
1998-01-01
Describes Extensible Markup Language (XML), a subset of the Standard Generalized Markup Language (SGML) that is designed to make it easy to interchange structured documents over the Internet. Topics include Document Type Definition (DTD), components of XML, the use of XML, text and non-text elements, and uses for XML-coded files. (LRW)
NASA Astrophysics Data System (ADS)
Quintanar, Jessica; Khan, Shuhab D.; Fathy, Mohamed S.; Zalat, Abdel-Fattah A.
2013-11-01
The Pelusiac Branch was a distributary river in the Nile Delta that split off from the main trunk of the Nile River as it flowed toward the Mediterranean. At approximately 25 A.D., it was choked by sand and silt deposits from prograding beach accretion processes. The lower course of the river and its bifurcation point from the trunk of the Nile have been hypothesized based on ancient texts and maps, as well as previous research, but results have been inconsistent. Previous studies partly mapped the lower course of the Pelusiac River in the Plain of Tineh, east of the Suez Canal, but rapid urbanization related to the inauguration of the Peace Canal mega-irrigation project has covered any trace of the linear feature reported by these previous studies. The present study used multispectral remote sensing data from GeoEYE-1 and Landsat-TM to locate and accurately map the course of the defunct Pelusiac River within the Plain of Tineh. Remote sensing analysis identified a linear feature that is 135 m wide at its maximum and approximately 13 km long. It extends from the Pelusium ruins to the Suez Canal, just north of the Peace Canal. This remotely located linear feature corresponds to the path of the Pelusiac River during Roman times. Planform geomorphology was applied to determine the hydrological regime and paleodischarge of the river prior to its becoming defunct. Planform analysis derived a bankfull paleodischarge value of ~5700 m³ s⁻¹ and an average discharge of 650 m³ s⁻¹, using the reach average for the interpreted Pelusiac River. The derived values show a river distributary similar in discharge to the modern dammed Damietta branch. Field work completed in April of 2012 identified four sedimentary lithofacies of the upper formation on the plain, representing pro-delta, delta-front, and delta-plain depositional environments. Diatom and fossil mollusk samples were also identified that support coastal beach and lagoonal environments of deposition.
Measured section columns and a shoreline-parallel transect were also constructed to portray the paleogeography of the Mediterranean coastline in the Plain of Tineh at ~25 A.D.; they indicate that the sampled study area is the downdrift margin of an asymmetric delta with barrier-lagoon systems.
Displaying Annotations for Digitised Globes
NASA Astrophysics Data System (ADS)
Gede, Mátyás; Farbinger, Anna
2018-05-01
Thanks to the efforts of various globe-digitising projects, there are nowadays plenty of old globes that can be examined as 3D models on the computer screen. These globes usually contain many interesting details that an average observer would not discover at first glance. The authors developed a website that can display annotations for such digitised globes; the annotations help observers of a globe discover all its important, interesting details. Each annotation consists of a plain-text title, an HTML-formatted descriptive text, and a corresponding polygon, and is stored in KML format. The website is powered by the Cesium virtual globe engine.
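An annotation of the shape described (plain-text title, HTML description, polygon) is easy to emit programmatically. The sketch below builds a single KML Placemark with Python's standard XML library; the title, description, and ring coordinates are illustrative, and the real site's KML may carry additional fields:

```python
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"

def annotation_kml(title: str, description_html: str, ring: list) -> str:
    """Build one KML Placemark: a plain-text name, an HTML description,
    and a polygon outline. `ring` is a list of (lon, lat) pairs; the first
    point is repeated at the end to close the ring, as KML requires."""
    ET.register_namespace("", KML_NS)
    kml = ET.Element(f"{{{KML_NS}}}kml")
    pm = ET.SubElement(kml, f"{{{KML_NS}}}Placemark")
    ET.SubElement(pm, f"{{{KML_NS}}}name").text = title
    ET.SubElement(pm, f"{{{KML_NS}}}description").text = description_html
    poly = ET.SubElement(pm, f"{{{KML_NS}}}Polygon")
    outer = ET.SubElement(poly, f"{{{KML_NS}}}outerBoundaryIs")
    lr = ET.SubElement(outer, f"{{{KML_NS}}}LinearRing")
    coords = list(ring) + [ring[0]]            # close the ring
    ET.SubElement(lr, f"{{{KML_NS}}}coordinates").text = " ".join(
        f"{lon},{lat},0" for lon, lat in coords)
    return ET.tostring(kml, encoding="unicode")
```

A Cesium-based viewer can then load such files directly as KML data sources.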
ANLPS. Graphics Driver for PostScript Output
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engert, D.E.
1987-09-01
ANLPS is a PostScript graphics device driver for use with the proprietary CA TELLAGRAF, CUECHART, and DISSPLA products. The driver allows the user to create and send text and graphics output in Adobe Systems' PostScript page description language, which is accepted by many print devices. The PostScript output can be generated by TELLAGRAF 6.0 and DISSPLA 10.0. The files containing the PostScript output are sent to PostScript laser printers, such as the Apple LaserWriter. It is not necessary to initialize the printer, as the output for each plot is self-contained. All CA fonts are mapped to PostScript fonts (e.g. Swiss-Medium is mapped to Helvetica), and the mapping is easily changed. Hardware shading and hardware characters, area fill, and color are included. Auxiliary routines are provided which allow graphics files containing figures, logos, and diagrams to be merged with text files. The user can then position, scale, and rotate the figures on the output page in the reserved area specified.
Text to Speech (TTS) Capabilities for the Common Driver Trainer (CDT)
2010-10-01
harnessing in’leigle jalClpeno jocelyn linu ~ los angeles lottery margarine mathematlze mathematized mathematized meme memes memol...including Julie, Kate, and Paul . Based upon the names of the voices, it may be that the VoiceText capability is the technology being used currently on...DFTTSExportToFileEx(O, " Paul ", 1, 1033, "Testing the Digital Future Text-to-Speech SDK.", -1, -1, -1, -1, -1, DFTTS_ TEXT_ TYPE_ XML, "test.wav", 0, "", -1
Rowsell, Alison; Stuart, Beth; Hayter, Victoria; Little, Paul; Ganahl, Kristin; Müller, Gabriele; Doyle, Gerardine; Chang, Peter; Lyles, Courtney R; Nutbeam, Don; Yardley, Lucy
2017-01-01
Background Developing accessible Web-based materials to support diabetes self-management in people with lower levels of health literacy is a continuing challenge. Objective The objective of this international study was to develop a Web-based intervention promoting physical activity among people with type 2 diabetes to determine whether audiovisual presentation and interactivity (quizzes, planners, tailoring) could help to overcome the digital divide by making digital interventions accessible and effective for people with all levels of health literacy. This study also aimed to determine whether these materials can improve health literacy outcomes for people with lower levels of health literacy and also be effective for people with higher levels of health literacy. Methods To assess the impact of interactivity and audiovisual features on usage, engagement, and health literacy outcomes, we designed two versions of a Web-based intervention (one interactive and one plain-text version of the same content) to promote physical activity in people with type 2 diabetes. We randomly assigned participants from the United Kingdom, Austria, Germany, Ireland, and Taiwan to either an interactive or plain-text version of the intervention in English, German, or Mandarin. Intervention usage was objectively recorded by the intervention software. Self-report measures were taken at baseline and follow-up (immediately after participants viewed the intervention) and included measures of health literacy, engagement (website satisfaction and willingness to recommend the intervention to others), and health literacy outcomes (diabetes knowledge, enablement, attitude, perceived behavioral control, and intention to undertake physical activity). Results In total, 1041 people took part in this study. Of the 1005 who completed health literacy information, 268 (26.67%) had intermediate or low levels of health literacy. 
The interactive intervention overall did not produce better outcomes than the plain-text version. Participants in the plain-text intervention group viewed significantly more sections of the intervention (mean difference –0.47, 95% CI –0.64 to –0.30, P<.001), but this did not lead to better outcomes. Health literacy outcomes, including attitudes and intentions to engage in physical activity, significantly improved following the intervention for participants in both intervention groups. These improvements were similar across higher and lower health literacy levels and in all countries. Participants in the interactive intervention group acquired more diabetes knowledge (mean difference 0.80, 95% CI 0.65-0.94, P<.001). Participants from both groups reported high levels of website satisfaction and would recommend the website to others. Conclusions Following established practice for simple, clear design and presentation, and using a person-based approach to intervention development with in-depth iterative feedback from users, may be more important than interactivity and audiovisual presentation when developing accessible digital health interventions to improve health literacy outcomes. Trial Registration International Standard Randomized Controlled Trial Number (ISRCTN): 43587048; http://www.isrctn.com/ISRCTN43587048. (Archived by WebCite at http://www.webcitation.org/6nGhaP9bv) PMID:28115299
Wildfire Disaster Funding Act of 2014
Rep. Simpson, Michael K. [R-ID-2
2014-02-05
House - 07/11/2014 Motion to Discharge Committee filed by Mr. Peters (CA). Petition No: 113-10. (All Actions) Notes: On 7/11/2014, a motion was filed to discharge the Committees on the Budget, Agriculture, and Natural Resources from the consideration of H.R.3992. A discharge petition requires 218 signatures for further action. (Discharge Petition No. 113-10: text with signatures.) Tracker: This bill has the status Introduced.
2018-01-18
processing. Specifically, the method described herein uses wgrib2 commands along with a Python script or program to produce tabular text files that in...It makes use of software that is readily available and can be implemented on many computer systems combined with relatively modest additional...example), extracts appropriate information, and lists the extracted information in a readable tabular form. The Python script used here is described in
Ensuring Pay for Our Military Act of 2011
Rep. Gohmert, Louie [R-TX-1
2011-03-31
House - 07/14/2011 Motion to Discharge Committee filed by Mr. Gohmert. Petition No: 112-2. (All Actions) Notes: On 7/14/2011, a motion was filed to discharge the Committees on Armed Services and Transportation and Infrastructure from the consideration of H.R.1297. A discharge petition requires 218 signatures for further action. (Discharge Petition No. 112-2: text with signatures.) Tracker: This bill has the status Introduced.
GeNemo: a search engine for web-based functional genomic data.
Zhang, Yongqing; Cao, Xiaoyi; Zhong, Sheng
2016-07-08
A set of new data types has emerged from functional genomic assays, including ChIP-seq, DNase-seq, FAIRE-seq and others. The results are typically stored as genome-wide intensities (WIG/bigWig files) or functional genomic regions (peak/BED files). These data types present new challenges to big data science. Here, we present GeNemo, a web-based search engine for functional genomic data. GeNemo searches user-input data against online functional genomic datasets, including the entire collection of ENCODE and mouse ENCODE datasets. Unlike text-based search engines, GeNemo's searches are based on pattern matching of functional genomic regions. This distinguishes GeNemo from text or DNA sequence searches. The user can input any complete or partial functional genomic dataset, for example, a binding intensity file (bigWig) or a peak file. GeNemo reports any genomic regions, ranging from hundreds of bases to hundreds of thousands of bases, from any of the online ENCODE datasets that share similar functional (binding, modification, accessibility) patterns. This is enabled by a Markov Chain Monte Carlo-based maximization process, executed on up to 24 parallel computing threads. By clicking on a search result, the user can visually compare her/his data with the found datasets and navigate the identified genomic regions. GeNemo is available at www.genemo.org. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
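The region pattern-matching objective can be illustrated on toy intensity arrays. Exhaustive correlation search below stands in for GeNemo's MCMC maximization (which exists to avoid exactly this brute force at genome scale), and the data are illustrative:

```python
def best_match(query: list, track: list):
    """Slide a query intensity profile along a longer track and return the
    (correlation, offset) pair with the highest Pearson correlation.
    GeNemo optimizes a similar matching objective with MCMC; exhaustive
    search stands in here because the toy arrays are tiny."""
    n = len(query)

    def pearson(a, b):
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        va = sum((x - ma) ** 2 for x in a) ** 0.5
        vb = sum((y - mb) ** 2 for y in b) ** 0.5
        return cov / (va * vb) if va and vb else 0.0

    scores = [(pearson(query, track[i:i + n]), i)
              for i in range(len(track) - n + 1)]
    return max(scores)
```

Matching on intensity shape rather than sequence is what lets such a search find regions with similar binding or accessibility patterns regardless of the underlying DNA.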
RCHILD - an R-package for flexible use of the landscape evolution model CHILD
NASA Astrophysics Data System (ADS)
Dietze, Michael
2014-05-01
Landscape evolution models provide powerful approaches to numerically assess earth surface processes: to quantify rates of landscape change, infer sediment transfer rates, estimate sediment budgets, investigate the consequences of changes in external drivers on a geomorphic system, provide spatio-temporal interpolations between known landscape states, or test conceptual hypotheses. CHILD (Channel-Hillslope Integrated Landscape Development Model) is one of the most widely used models of landscape change, particularly for interacting tectonic and geomorphic processes. Running CHILD from the command line and working with the model output can be a rather awkward task (static model control via a text input file, only numeric output in text files). The package RCHILD is a collection of functions for the free statistical software R that help to use CHILD in a flexible, dynamic, and user-friendly way. The comprised functions allow creating maps, real-time scenes, animations, and further thematic plots from model output. The model input files can be modified dynamically, and hence (feedback-related) changes in external factors can be implemented iteratively. Output files can be written to common formats that can be readily imported into standard GIS software. This contribution presents the basic functionality of the model CHILD as visualised and modified by the package, gives a rough overview of the available functions, and uses application examples to illustrate the great potential of numeric modelling of geomorphologic processes.
GRIDGEN Version 1.0: a computer program for generating unstructured finite-volume grids
Lien, Jyh-Ming; Liu, Gaisheng; Langevin, Christian D.
2015-01-01
GRIDGEN is a computer program for creating layered quadtree grids for use with numerical models, such as the MODFLOW–USG program for simulation of groundwater flow. The program begins by reading a three-dimensional base grid, which can have variable row and column widths and spatially variable cell top and bottom elevations. From this base grid, GRIDGEN will continuously divide into four any cell intersecting user-provided refinement features (points, lines, and polygons) until the desired level of refinement is reached. GRIDGEN will then smooth, or balance, the grid so that no two adjacent cells, including overlying and underlying cells, differ by more than a user-specified level tolerance. Once these gridding processes are completed, GRIDGEN saves a tree structure file so that the layered quadtree grid can be quickly reconstructed as needed. Once a tree structure file has been created, GRIDGEN can then be used to (1) export the layered quadtree grid as a shapefile, (2) export grid connectivity and cell information as ASCII text files for use with MODFLOW–USG or other numerical models, and (3) intersect the grid with shapefiles of points, lines, or polygons, and save intersection output as ASCII text files and shapefiles. The GRIDGEN program is demonstrated by creating a layered quadtree grid for the Biscayne aquifer in Miami-Dade County, Florida, using hydrologic features to control where refinement is added.
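The core refinement step (divide into four any cell intersecting a refinement feature, here reduced to a single point) can be sketched as follows; the cell representation is illustrative, and GRIDGEN's smoothing/balancing pass and tree-structure file are omitted:

```python
def refine(cells: list, px: float, py: float, max_level: int) -> list:
    """One GRIDGEN-style quadtree pass: any cell containing the refinement
    point is split into four quarters until max_level is reached. Cells are
    (x, y, size, level) tuples with (x, y) the lower-left corner."""
    out = []
    for (x, y, s, lvl) in cells:
        if lvl < max_level and x <= px < x + s and y <= py < y + s:
            h = s / 2
            out.extend(refine(
                [(x, y, h, lvl + 1), (x + h, y, h, lvl + 1),
                 (x, y + h, h, lvl + 1), (x + h, y + h, h, lvl + 1)],
                px, py, max_level))
        else:
            out.append((x, y, s, lvl))
    return out
```

The subsequent balancing step would then re-split any cell whose neighbor is more than one level finer, which is what keeps the grid usable for a finite-volume solver such as MODFLOW–USG.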
Carrell, David; Malin, Bradley; Aberdeen, John; Bayer, Samuel; Clark, Cheryl; Wellner, Ben; Hirschman, Lynette
2013-01-01
Secondary use of clinical text is impeded by a lack of highly effective, low-cost de-identification methods. Both manual and automated methods for removing protected health information are known to leave behind residual identifiers. The authors propose a novel approach for addressing the residual identifier problem based on the theory of Hiding In Plain Sight (HIPS). HIPS relies on obfuscation to conceal residual identifiers: replacing the detected identifiers with realistic but synthetic surrogates should collectively render the few 'leaked' identifiers difficult to distinguish from the synthetic surrogates. The authors conducted a pilot study to test this theory on clinical narrative de-identified by an automated system. Test corpora included 31 oncology and 50 family practice progress notes read by two trained chart abstractors and an informaticist. Experimental results suggest approximately 90% of residual identifiers can be effectively concealed by the HIPS approach in text containing average and high densities of personal identifying information. This pilot test suggests HIPS is feasible but requires further evaluation: the results need to be replicated on larger corpora of diverse origin under a range of detection scenarios. Error analyses also suggest areas where surrogate generation techniques can be refined to improve efficacy. If these results generalize to existing high-performing de-identification systems with recall rates of 94-98%, HIPS could increase the effective de-identification rates of these systems to levels above 99% without further advancements in system recall. Additional and more rigorous assessment of the HIPS approach is warranted.
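The surrogate-replacement step at the heart of HIPS can be sketched briefly. The surrogate pools, span format, and example text below are illustrative, not taken from the study; the point is that every detected identifier is swapped for a realistic fake, so any identifier the detector missed then hides among the synthetic ones:

```python
import random

# Illustrative pools only; a real system draws from large, realistic lists.
SURROGATE_POOLS = {
    "NAME": ["Alice Morgan", "David Chen", "Maria Lopez"],
    "DATE": ["03/14/2011", "11/02/2012", "07/29/2010"],
}

def hide_in_plain_sight(text: str, spans: list, seed: int = 0) -> str:
    """Replace each detected identifier span with a same-type surrogate.
    `spans` is a list of (start, end, type) tuples from a de-identification
    system; replacing right-to-left keeps earlier offsets valid."""
    rng = random.Random(seed)
    for start, end, kind in sorted(spans, reverse=True):
        text = text[:start] + rng.choice(SURROGATE_POOLS[kind]) + text[end:]
    return text
```

A leaked real name in the output is then indistinguishable, to a reader, from the synthetic names surrounding it, which is the obfuscation effect the paper measures.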
Grauch, V.J.; Kucks, Robert P.
1997-01-01
This report presents principal facts for gravity stations collected along profiles near the Osgood Mountains and Slumbering Hills, north-central Nevada. These include (1) data collected near the Osgood Mountains by U.S. Geological Survey (USGS) personnel in the years 1989, 1990, and 1993; and (2) data released to the USGS by Battle Mountain Gold (now Battle Mountain Exploration) that were collected in 1989 near the Osgood Mountains and the Slumbering Hills. The digital data and the text of this report (with figures in separate files) can be downloaded via 'anonymous ftp' from a USGS system named greenwood.cr.usgs.gov (136.177.21.122). The files are located in a directory named /pub/open-file-reports/ofr-97-0085 and are described in an ASCII file named readme.txt. This information is also contained below in Table 1.
Desktop document delivery using portable document format (PDF) files and the Web.
Shipman, J P; Gembala, W L; Reeder, J M; Zick, B A; Rainwater, M J
1998-01-01
Desktop access to electronic full-text literature was rated one of the most desirable services in a client survey conducted by the University of Washington Libraries. The University of Washington Health Sciences Libraries (UW HSL) conducted a ten-month pilot test from August 1996 to May 1997 to determine the feasibility of delivering electronic journal articles via the Internet to remote faculty. Articles were scanned into Adobe Acrobat Portable Document Format (PDF) files and delivered to individuals using Multipurpose Internet Mail Extensions (MIME) standard e-mail attachments and the Web. Participants retrieved scanned articles and used the Adobe Acrobat Reader software to view and print files. The pilot test required a special programming effort to automate the client notification and file deletion processes. Test participants were satisfied with the pilot test despite some technical difficulties. Desktop delivery is now offered as a routine delivery method from the UW HSL. PMID:9681165
NASA Technical Reports Server (NTRS)
1981-01-01
The set of computer programs described allows for data definition, data input, and data transfer between the LSI-11 microcomputers and the VAX-11/780 minicomputer. Program VAXCOM allows for a simple method of textual file transfer from the LSI to the VAX. Program LSICOM allows for easy file transfer from the VAX to the LSI. Program TTY changes the LSI-11 operator's console to the LSI's printing device. Program DICTIN provides a means for defining a data set for input to either computer. Program DATAIN is a simple-to-operate data entry program capable of building data files on either machine. Program LEDITV is an extremely powerful, easy-to-use, line-oriented text editor. Program COPYSBF is designed to print out textual files on the line printer without character loss from FORTRAN carriage control or wide record transfer.
Beeman, William R.; Obuch, Raymond C.; Brewton, James D.
1996-01-01
This CD-ROM contains files in support of the 1995 USGS National assessment of United States oil and gas resources (DDS-30), which was published separately and summarizes the results of a 3-year study of the oil and gas resources of the onshore and state waters of the United States. The study describes about 560 oil and gas plays in the United States--confirmed and hypothetical, conventional and unconventional. A parallel study of the Federal offshore is being conducted by the U.S. Minerals Management Service. This CD-ROM contains files in multiple formats, so that almost any computer user can import them into word processors and mapping software packages. No proprietary data are released on this CD-ROM. The complete text of DDS-30 is also available, as well as many figures. A companion CD-ROM (DDS-36) includes the tabular data, the programs, and the same text data, but none of the map data.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-18
...-wide trading halt due to extraordinary market volatility. The text of the proposed rule change is below... rule change and discussed any comments it received on the proposed rule change. The text of these... are calculated by the Processors.\\9\\ When the National Best Bid (Offer) is below (above) the Lower...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-07
... 1000'') and specified Exchange Traded Products (``ETP'') to the pilot rule. The text of the proposed...\\ Changes are marked to the rule text that appears in the electronic manual of NASDAQ found at http... last-sale price disseminated by a network processor over a five minute rolling period measured...
77 FR 74116 - Extension of Tolerances for Emergency Exemptions (Multiple Chemicals)
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-13
.../text/text-idx?&c=ecfr&tpl=/ecfrbrowse/Title40/40tab_02.tpl . C. How can I file an objection or hearing... Management and Budget (OMB) has exempted these types of actions from review under Executive Order 12866.... 601 et seq.), do not apply. This final rule directly regulates growers, food processors, food handlers...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-01
... the Regulation NMS Plan to Address Extraordinary Market Volatility. The text of the proposed rule... text of these statements may be examined at the places specified in Item IV below. FINRA has prepared... thereunder.\\6\\ The Limit Up-Limit Down mechanism is intended to address the type of sudden price movements...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-06
... changes described herein are applicable to EDGA Members. The text of the proposed rule change is available... change and discussed any comments it received on the proposed rule change. The text of these statements... Liquidity Report is a data feed that contains all historical order information for orders routed to away...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-15
... options to away markets. The text of the proposed rule change is provided in Exhibit 5. The text of the... routing and executing certain orders in equity options to away markets. The Exchange proposes to amend Routing Fees for the following away markets: BATS Exchange, Inc. (``BATS''), BOX Options Exchange LLC...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-10
... Change To Reduce the Minimum Size of the Nominating and Governance Committee May 4, 2011. Pursuant to... size of the CBOE Nominating and Governance Committee. The text of the proposed amendments to CBOE's... proposed rule change and discussed any comments it received on the proposed rule change. The text of these...
Choe, Sanggil; Kim, Suncheun; Choi, Hyeyoung; Choi, Hwakyoung; Chung, Heesun; Hwang, Bangyeon
2010-06-15
Agilent GC-MS MSD Chemstation offers an automated library search report for toxicological screening using the total ion chromatogram (TIC) and mass spectra in normal mode. Numerous peaks appear in the chromatogram of a biological specimen such as blood or urine, and large migrating peaks often obscure small target peaks; in addition, target peaks of low abundance regularly give wrong library search results or low matching scores. As a result, the retention time and mass spectrum of every peak in the chromatogram have to be checked for relevance. These repeated actions are tedious and time-consuming for toxicologists. MSD Chemstation software operates using a number of macro files that give commands and instructions on how to work on and extract data from the chromatogram and spectra. These macro files are developed with the software's own compiler. All the original macro files can be modified, and new macro files can be added to the original software by users. To get more accurate results with a more convenient method and to save time in data analysis, we developed new macro files for report generation and inserted new menus in the Enhanced Data Analysis program. Toxicological screening reports generated by these new macro files are in text mode or graphic mode, and can be generated with three different automated subtraction options. Text reports have a Brief mode and a Full mode, and graphic reports have an option with or without mass spectra. Matched mass spectra and matching scores for detected compounds are printed in reports by the modified library-searching modules. We have also developed an independent application program named DrugMan. This program manages the drug groups, lists, and parameters in use in MSD Chemstation. The incorporation of DrugMan with the modified macro modules provides a powerful tool for toxicological screening and saves a great deal of valuable time in toxicological work.
An OpenMI Implementation of a Water Resources System using Simple Script Wrappers
NASA Astrophysics Data System (ADS)
Steward, D. R.; Aistrup, J. A.; Kulcsar, L.; Peterson, J. M.; Welch, S. M.; Andresen, D.; Bernard, E. A.; Staggenborg, S. A.; Bulatewicz, T.
2013-12-01
This team has developed an adaptation of the Open Modelling Interface (OpenMI) that utilizes Simple Script Wrappers. Code is made OpenMI compliant through organization within three modules that initialize, perform time steps, and finalize results. A configuration file specifies the variables a model expects to receive as input and those it will make available as output. An example is presented for groundwater, economic, and agricultural production models in the High Plains Aquifer region of Kansas. Our models use the programming environments in Scilab and Matlab, along with legacy Fortran code, and our Simple Script Wrappers can also use Python. These models are collectively run within this interdisciplinary framework from initial conditions into the future. It will be shown that by applying constraints to one model, the impact of changes on the water resources system may be assessed.
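The three-module convention the wrappers rely on can be sketched as a plain Python script. All function names, exchange-item names, and the toy drawdown response here are hypothetical, not the actual Simple Script Wrapper API:

```python
# Sketch of a script a Simple Script Wrapper might drive: one function
# each to initialize, perform a time step, and finalize results.
state = {}

def initialize(config):
    # `config` mirrors the configuration file declaring inputs/outputs
    state["head"] = config.get("initial_head_m", 100.0)

def perform_time_step(inputs):
    # value received from a coupled model (e.g. economic) each step
    pumping = inputs.get("pumping_m3", 0.0)
    state["head"] -= 0.01 * pumping        # toy drawdown response
    return {"head_m": state["head"]}       # exposed to the other models

def finalize():
    return dict(state)                     # write results, release resources

initialize({"initial_head_m": 50.0})
out = perform_time_step({"pumping_m3": 100.0})
final = finalize()
```

The wrapper framework calls these three entry points in order, passing the exchange items named in the configuration file, which is how Scilab, Matlab, Fortran, and Python models can all participate in the same coupled run.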
Korcok, M
1995-01-01
In the "war zones" of Texas, lawyers use billboards, television commercials and Yellow Pages advertisements to announce their availability to help the "unjustly injured," and medicolegal lawsuits are as common as the rain that sweeps in from the nearby Gulf of Mexico. Almost 75% of the suits are dismissed without award or settlement, since many are plainly frivolous. However, even these can mean torment for physicians, who have to hire lawyers, answer charges, collect paperwork, take time off work for depositions and consultations, and then worry about how insurers will react the next time premiums are due--even if they are cleared. Texas estimates that defensive medicine practised because of legal fears costs the state at least $702 million annually, spending that is bound to continue as long as one lawsuit is filed annually for every 5.3 doctors in the state. PMID:7553498
Capitation among Medicare beneficiaries.
Bazos, D A; Fisher, E S
1999-01-01
The Medicare program has promoted capitation as a way to contain costs. About 15% of Medicare beneficiaries nationwide are currently under capitation, but tremendous regional variation exists. This study examined the proportion of Medicare beneficiaries enrolled in risk-contract plans in individual states and in the 25 largest metropolitan areas in the United States, using Health Care Financing Administration data files. Medicare beneficiaries are most likely to be under capitation in Arizona (38%) and California (37%). Eight other states have capitation rates greater than 20%: Colorado, Florida, Rhode Island, Oregon, Washington, Pennsylvania, Massachusetts, and Nevada. Thirty states, largely in the Great Plains area and the southern United States, have capitation rates less than 10%. Four major metropolitan areas have market penetration rates greater than 40%: San Bernardino, California; San Diego, California; Phoenix, Arizona; and Miami, Florida. Little penetration exists outside of metropolitan areas. Capitation in Medicare is a regional and predominantly urban phenomenon.
Optical design of cipher block chaining (CBC) encryption mode by using digital holography
NASA Astrophysics Data System (ADS)
Gil, Sang Keun; Jeon, Seok Hee; Jung, Jong Rae; Kim, Nam
2016-03-01
We propose an optical design of cipher block chaining (CBC) encryption using the digital holographic technique, which has higher security than the conventional electronic method because of the analog-type randomized cipher text with a 2-D array. In this paper, an optical design of CBC encryption mode is implemented by a 2-step quadrature phase-shifting digital holographic encryption technique using orthogonal polarization. A block of plain text is encrypted with the encryption key by applying 2-step phase-shifting digital holography, and it is changed into cipher text blocks which are digital holograms. These ciphered digital holograms with the encrypted information are Fourier transform holograms and are recorded on CCDs with intensities quantized to 256 gray levels. The decryption is computed from these encrypted digital holograms of the cipher text, the same encryption key, and the previous cipher text. Results of computer simulations are presented to verify that the proposed method is feasible for a highly secure CBC encryption system.
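The chaining rule this optical design implements is the standard CBC recurrence, C_i = E_K(P_i ⊕ C_{i−1}) with C_0 the initialization vector, which is why each cipher block depends on the previous one. A minimal software sketch of the recurrence follows; the XOR "cipher" is purely a stand-in for the encryption primitive (it is not secure, and it is not the paper's holographic method — in a real system E would be AES, or here the phase-shifting holographic step):

```python
BLOCK = 8  # bytes per block (illustrative size)

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def toy_encrypt(block, key):
    # Stand-in block cipher: XOR with the key (NOT secure).
    return xor(block, key)

toy_decrypt = toy_encrypt  # XOR is its own inverse

def cbc_encrypt(plaintext, key, iv):
    assert len(plaintext) % BLOCK == 0 and len(key) == len(iv) == BLOCK
    prev, blocks = iv, []
    for i in range(0, len(plaintext), BLOCK):
        prev = toy_encrypt(xor(plaintext[i:i + BLOCK], prev), key)
        blocks.append(prev)               # C_i feeds the next block
    return b"".join(blocks)

def cbc_decrypt(ciphertext, key, iv):
    prev, blocks = iv, []
    for i in range(0, len(ciphertext), BLOCK):
        cur = ciphertext[i:i + BLOCK]
        blocks.append(xor(toy_decrypt(cur, key), prev))  # P_i = D_K(C_i) xor C_{i-1}
        prev = cur
    return b"".join(blocks)

key, iv = b"K" * BLOCK, b"IVIVIVIV"
pt = b"SAMEBLOKSAMEBLOK"          # two identical plaintext blocks
ct = cbc_encrypt(pt, key, iv)
```

Note that the two identical plaintext blocks encrypt to different cipher blocks — the randomizing effect of chaining that the paper exploits, since decrypting any block requires the key, that cipher block, and the previous one.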
7 CFR 650.25 - Flood-plain management.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 6 2010-01-01 2010-01-01 false Flood-plain management. 650.25 Section 650.25... Flood-plain management. Through proper planning, flood plains can be managed to reduce the threat to... encourages sound flood-plain management decisions by land users. (a) Policy—(1) General. NRCS provides...
7 CFR 650.25 - Flood-plain management.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 6 2011-01-01 2011-01-01 false Flood-plain management. 650.25 Section 650.25... Flood-plain management. Through proper planning, flood plains can be managed to reduce the threat to... encourages sound flood-plain management decisions by land users. (a) Policy—(1) General. NRCS provides...
Ronald E. Sosebee; David B. Wester
2007-01-01
The Southern High Plains of Texas (also known as the Llano Estacado) are in the southernmost subdivision of the High Plains section of the Great Plains Physiographic Province. Most of the Southern Great Plains consists of upland sites that were once grasslands dominated mostly by shortgrass plains that supported large herds of native buffalo. However, topographic...
Bankey, Viki; Grauch, V.J.S.; ,
2004-01-01
This report contains digital data, image files, and text files describing data formats and survey procedures for aeromagnetic data collected during a helicopter geophysical survey in northern New Mexico during October 2003. The survey covers the Town of Taos, Taos Pueblo, and surrounding communities in Taos County. Several derivative products from these data are also presented, including reduced-to-pole, horizontal gradient magnitude, and downward continued grids and images.
Bankey, Viki; Grauch, V.J.S.; ,
2004-01-01
This CD-ROM contains digital data, image files, and text files describing data formats and survey procedures for aeromagnetic data collected during a helicopter geophysical survey in southern Colorado during October 2003. The survey covers the town of Blanca and surrounding communities in Alamosa and Costilla Counties. Several derivative products from these data are also presented, including reduced-to-pole, horizontal gradient magnitude, and downward continued grids and images.
2014-12-01
format for the orientation of a body. It further recommends supporting data be stored in a text PCK. These formats are used by the SPICE system... INTRODUCTION These file formats were developed for and are used by the SPICE system, developed by the Navigation and Ancillary Information Facility (NAIF) of NASA's Jet Propulsion Laboratory (JPL). Most users will want to use either the SPICE libraries or CALCEPH, developed by the Institut de mécanique
Validation Results for LEWICE 2.0. [Supplement
NASA Technical Reports Server (NTRS)
Wright, William B.; Rutkowski, Adam
1999-01-01
Two CD-ROMs contain experimental ice shapes and code predictions used for validation of LEWICE 2.0 (see NASA/CR-1999-208690, CASI ID 19990021235). The data include ice shapes from both experiment and LEWICE, all of the input and output files for the LEWICE cases, JPG files of all plots generated, an electronic copy of the text of the validation report, and a Microsoft Excel(R) spreadsheet containing all of the quantitative measurements taken. The LEWICE source code and executable are not contained on the discs.
Advanced ADA Workshop Held in Biloxi, Mississippi on 24-27 January 1989
1989-01-27
Software Engineering (Break at 2:30) 6:30-8:00 Keesler AFB Reception Officers' Club WEDNESDAY - 25 JANUARY 9:00-12:00 Bldg 1002 Generics (Break at 10... TextIO.FileType: -- NO! procedure Wrong; -- problem is FileType is limited private Object Parameters A More Useful Example generic Control_Block : in out... control the precision used Float Type Parameters An Example generic type FloatType is digits <>; function Sqrt(X : FloatType) return FloatType
Rep. Blackburn, Marsha [R-TN-7
2009-01-09
House - 07/23/2009 Motion to Discharge Committee filed by Mrs. Blackburn. Petition No: 111-5. (All Actions) Notes: On 7/23/2009, a motion was filed to discharge the Committee on Energy and Commerce from consideration of H.R.391. A discharge petition requires 218 signatures for further action. (Discharge Petition No. 111-5: text with signatures.) Tracker: This bill has the status Introduced. Here are the steps for Status of Legislation:
Conversion of the Forces Mobilization Model (FORCEMOB) from FORTRAN to C
2015-08-01
[Unreadable scan residue of a process-memory-usage table omitted.] ...the C version of FORCEMOB is ready for operational use. ...without a graphical user interface (GUI): once run, FORCEMOB reads user-created input files, performs mathematical operations upon them, and outputs text
To eliminate automatic pay adjustments for Members of Congress, and for other purposes.
Rep. Latta, Robert E. [R-OH-5
2009-01-15
House - 03/23/2009 Motion to Discharge Committee filed by Mr. Latta. Petition No: 111-1. (All Actions) Notes: On 3/23/2009, a motion was filed to discharge the Committee on House Administration, and the Committee on Oversight and Government Reform from consideration of H.R.581. A discharge petition requires 218 signatures for further action. (Discharge Petition No. 111-1: text with... Tracker: This bill has the status Introduced. Here are the steps for Status of Legislation:
SciDB versus Spark: A Preliminary Comparison Based on an Earth Science Use Case
NASA Astrophysics Data System (ADS)
Clune, T.; Kuo, K. S.; Doan, K.; Oloso, A.
2015-12-01
We compare two Big Data technologies, SciDB and Spark, for performance, usability, and extensibility when applied to a representative Earth science use case. SciDB is a new-generation parallel distributed database management system (DBMS) based on the array data model that is capable of handling multidimensional arrays efficiently but requires lengthy data ingest prior to analysis, whereas Spark is a fast and general engine for large-scale data processing that can immediately process raw data files and thereby avoid the ingest process. Once data have been ingested, SciDB is very efficient in database operations such as subsetting. Spark, on the other hand, provides greater flexibility by supporting a wide variety of high-level tools, including DBMSs. For the performance aspect of this preliminary comparison, we configure Spark to operate directly on text or binary data files and thereby limit the need for additional tools. Arguably, a more appropriate comparison would involve exploring other configurations of Spark which exploit supported high-level tools, but that is beyond our current resources. To make the comparison as "fair" as possible, we export the arrays produced by SciDB into text files (or convert them to binary files) for intake by Spark and thereby avoid any additional file-processing penalties. The Earth science use case selected for this comparison is the identification and tracking of snowstorms in the NASA Modern Era Retrospective-analysis for Research and Applications (MERRA) reanalysis data. The identification portion of the use case flags all grid cells of the MERRA high-resolution hourly data that satisfy our criteria for a snowstorm, whereas the tracking portion connects flagged cells adjacent in time and space to form a snowstorm episode. We will report the results of our comparisons in this presentation.
Grechi, Daniele
2016-01-01
In March 2015, the Environmental Protection Agency of the Tuscany Region (Central Italy) and the Laboratory of Monitoring and Environmental Modelling published a report on the spatial representativeness of monitoring stations for Tuscan air quality, in which they supported the decommissioning of monitoring stations located in the Florentine Plain. The stations of Signa, Scandicci, and Firenze-Bassi, located in an area further south, were considered representative. Believing that the air quality of the Plain can be evaluated by these stations is a stretch. In this text the author shows the inconsistency of the Report's conclusion through correlation graphs comparing daily mean PM10 detected at the decommissioned stations and at the active ones, showing relevant differences between the reported values and the days on which the limits are exceeded. The discrepancy is due to the fact that the uncertainty of the theoretical estimates is greater than the differences recorded between the stations considered as references and the areas they are meant to represent. The Plain has a population of 150,000 and is subject to heavy environmental pressure, which will change with the urban works planned for the coming years. The population's legitimate request for analytical monitoring of air pollution could be met through the organization of participatory monitoring based on low-cost innovative tools.
Displaying Composite and Archived Soundings in the Advanced Weather Interactive Processing System
NASA Technical Reports Server (NTRS)
Barrett, Joe H., III; Volkmer, Matthew R.; Blottman, Peter F.; Sharp, David W.
2008-01-01
In a previous task, the Applied Meteorology Unit (AMU) developed spatial and temporal climatologies of lightning occurrence based on eight atmospheric flow regimes. The AMU created climatological, or composite, soundings of wind speed and direction, temperature, and dew point temperature at four rawinsonde observation stations at Jacksonville, Tampa, Miami, and Cape Canaveral Air Force Station, for each of the eight flow regimes. The composite soundings were delivered to the National Weather Service (NWS) Melbourne (MLB) office for display using the National version of the Skew-T Hodograph Analysis and Research Program (NSHARP) software. The NWS MLB requested that the AMU make the composite soundings available for display in the Advanced Weather Interactive Processing System (AWIPS), so they could be overlaid on current observed soundings. This will allow the forecasters to compare the current state of the atmosphere with climatology. This presentation describes how the AMU converted the composite soundings from NSHARP Archive format to Network Common Data Form (NetCDF) format so that the soundings could be displayed in AWIPS. NetCDF is a set of data formats, programming interfaces, and software libraries used to read and write scientific data files. In AWIPS, each meteorological data type, such as soundings or surface observations, has a unique NetCDF format. Each format is described by a NetCDF template file. Although NetCDF files are in binary format, they can be converted to a text format called the network Common data form Description Language (CDL). A software utility called ncgen is used to create a NetCDF file from a CDL file, while the ncdump utility is used to create a CDL file from a NetCDF file. AWIPS receives soundings in Binary Universal Form for the Representation of Meteorological data (BUFR) format (http://dss.ucar.edu/docs/formats/bufr/), and then decodes them into NetCDF format. Only two sounding files are generated in AWIPS per day.
One file contains all of the soundings received worldwide between 0000 UTC and 1200 UTC, and the other includes all soundings between 1200 UTC and 0000 UTC. In order to add the composite soundings into AWIPS, a procedure was created to configure, or localize, AWIPS. This involved modifying and creating several configuration text files. A unique four-character site identifier was created for each of the 32 soundings so each could be viewed separately. The first three characters were based on the site identifier of the observed sounding, while the last character was based on the flow regime. While researching the localization process for soundings, the AMU discovered a method of archiving soundings so old soundings would not get purged automatically by AWIPS. This method could provide an alternative way of localizing AWIPS for composite soundings. In addition, this would allow forecasters to use archived soundings in AWIPS for case studies. A test sounding file in NetCDF format was written in order to verify the correct format for soundings in AWIPS. After the file was viewed successfully in AWIPS, the AMU wrote a software program in the Tool Command Language/Tool Kit (Tcl/Tk) to convert the 32 composite soundings from NSHARP Archive to CDL format. The ncgen utility was then used to convert the CDL file to a NetCDF file. The NetCDF file could then be read and displayed in AWIPS.
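A CDL file is simply an ASCII description of dimensions, variables, and attributes that ncgen compiles into binary NetCDF (e.g. `ncgen -o sounding.nc sounding.cdl`; ncdump performs the reverse). A heavily abbreviated sketch of what such a file looks like follows — the dimension and variable names here are hypothetical and would in practice be copied from the site's AWIPS template file:

```
netcdf compositeSounding {
dimensions:
        recNum = UNLIMITED ;    // one record per sounding
        manLevel = 22 ;         // mandatory pressure levels (assumed count)
        staNameLen = 4 ;        // four-character site identifier
variables:
        float prMan(recNum, manLevel) ;
                prMan:units = "hectopascals" ;
        float tpMan(recNum, manLevel) ;
                tpMan:units = "kelvin" ;
        char staName(recNum, staNameLen) ;
data:
        staName = "XMRA" ;      // e.g. a composite for flow regime "A"
}
```

Because the format is plain text, the Tcl/Tk converter only has to emit blocks like this for each of the 32 composites and hand them to ncgen.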
Geologic Mapping of the V-36 Thetis Regio Quadrangle: 2008 Progress Report
NASA Technical Reports Server (NTRS)
Basilevsky, A. T.; Head, James W.
2008-01-01
As a result of mapping, eleven material stratigraphic units and three structural units have been identified and mapped. The material units include (from older to younger): tessera terrain material (tt), material of densely fractured plains (pdf), material of fractured and ridged plains (pfr), material of shield plains (psh), material of plains with wrinkle ridges (pwr), material of smooth plains of intermediate brightness (psi), material of radar-dark smooth plains (psd), material of lineated plains (pli), material of lobate plains (plo), material of craters having no radar-dark haloes (c1), and material of craters having clear dark haloes (c2). The morphologies and probably the nature of the material units in the study area are generally similar to those observed in other regions of Venus [2]. The youngest units are lobate plains (plo), which here typically look less lobate than in other areas of the planet. Close to them in age are smooth plains, which are indeed smooth and represented by the two varieties mentioned above. Lineated plains (pli) are densely fractured in a geometrically regular way. Plains with wrinkle ridges, being morphologically similar to those observed in other regions, here occupy unusually small areas. Shield plains (psh) are also not abundant here. Locally they show wrinkle ridging. Fractured and ridged plains (pfr), which in other regions form the so-called ridge belts, are observed as isolated clusters of ridged plains surrounded by other units. Densely fractured plains (pdf) are present in relatively small areas in association with coronae and corona-like features. Tessera terrain (tt) is dissected by structures oriented in two or more directions. Structures are so densely packed that the morphology (and thus nature) of the precursor terrain is not known. Structural units include tessera transitional terrain (ttt), fracture belts (fb) and rifted terrain (rt).
Tessera transitional terrain was first identified and mapped by [4] as areas of fractured and ridged plains (pfr) and densely fractured plains (pdf) deformed by transverse faults that made it formally resemble tessera terrain (tt). The obvious difference between units tt and ttt is the recognizable morphology of precursor terrain of unit ttt. Fracture belts are probably ancient rift zones [3]. Rifted terrain (rt), as in other regions of Venus, is so saturated with faults that according to the recommendation of [1, 5] it should be mapped as a structural unit.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-16
.../adams.html . From this page, the public can gain entry into ADAMS, which provides text and image files... Commission. Martin J. Virgilio, Deputy Executive Director for Reactor and Preparedness Programs. [FR Doc...
44 CFR 10.14 - Flood plains and wetlands.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Flood plains and wetlands. 10... Flood plains and wetlands. For any action taken by FEMA in a flood plain or wetland, the provisions of... Executive Order 11988, Flood Plain Management, and Executive Order 11990, Protection of Wetlands (44 CFR...
Using compressed images in multimedia education
NASA Astrophysics Data System (ADS)
Guy, William L.; Hefner, Lance V.
1996-04-01
The classic radiologic teaching file consists of hundreds, if not thousands, of films of various ages, housed in paper jackets with brief descriptions written on the jackets. The development of a good teaching file has been both time consuming and voluminous. Also, any radiograph to be copied was unavailable during the reproduction interval, inconveniencing other medical professionals needing to view the images at that time. These factors hinder motivation to copy films of interest. If a busy radiologist already has an adequate example of a radiological manifestation, it is unlikely that he or she will exert the effort to make a copy of another similar image even if a better example comes along. Digitized radiographs stored on CD-ROM offer marked improvement over copied-film teaching files. Our institution has several laser digitizers which are used to rapidly scan radiographs and produce high quality digital images which can then be converted into standard microcomputer (IBM, Mac, etc.) image formats. These images can be stored on floppy disks, hard drives, rewritable optical disks, recordable CD-ROM disks, or removable cartridge media. Most hospital computer information systems include radiology reports in their database. We demonstrate that the reports for the images included in the user's teaching file can be copied and stored on the same storage media as the images. The radiographic or sonographic image and the corresponding dictated report can then be 'linked' together. The description of the finding or findings of interest on the digitized image is thus electronically tethered to the image. This obviates the need to write much additional detail concerning the radiograph, saving time. In addition, the text on this disk can be indexed such that all files with user-specified features can be instantly retrieved and combined in a single report, if desired.
With the use of newer image compression techniques, hundreds of cases may be stored on a single CD-ROM depending on the quality of image required for the finding in question. This reduces the weight of a teaching file from that of a baby elephant to that of a single CD-ROM disc. Thus, with this method of teaching file preparation and storage the following advantages are realized: (1) Technically easier and less time consuming image reproduction. (2) Considerably less unwieldy and substantially more portable teaching files. (3) Novel ability to index files and then retrieve specific cases of choice based on descriptive text.
The Figure.tar.gz contains a directory for each WRF ensemble run. In these directories are *.csv files for each meteorology variable examined. These are comma-delimited text files that contain statistics for each observation site. Also provided is an R script that reads these files (the user would need to change the directory pointers), computes the variability of error and bias of the ensemble at each site, and plots these for reproduction of figure 3. This dataset is associated with the following publication: Gilliam, R., C. Hogrefe, J. Godowitch, S. Napelenok, R. Mathur, and S.T. Rao. Impact of inherent meteorology uncertainty on air quality model predictions. Journal of Geophysical Research-Atmospheres, American Geophysical Union, Washington, DC, USA, 120(23): 12,259-12,280, (2015).
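Since the per-site statistics are plain comma-delimited text, the bias and error computation can be approximated outside the provided R script as well. A minimal Python sketch, assuming hypothetical column names (`site`, `obs`, `model`) that the dataset description does not actually specify:

```python
import csv
import io
import math
import statistics

def site_bias_and_error(csv_text):
    """Compute per-site mean bias and RMSE of model vs. observation.

    Assumes hypothetical columns 'site', 'obs', 'model'; the real
    *.csv files in Figure.tar.gz may use different headers.
    """
    by_site = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        diff = float(row["model"]) - float(row["obs"])  # model error
        by_site.setdefault(row["site"], []).append(diff)
    return {
        site: {"bias": statistics.mean(d),
               "rmse": math.sqrt(statistics.mean(x * x for x in d))}
        for site, d in by_site.items()
    }
```

The same loop structure extends naturally to per-ensemble-member files by keying on (run, site) instead of site alone.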
Radiology teaching file cases on the World Wide Web.
Scalzetti, E M
1997-08-01
The presentation of a radiographic teaching file on the World Wide Web can be enhanced by attending to principles of web design. Chief among these are appropriate control of page layout, minimization of the time required to download a page from the remote server, and provision for navigation within and among the web pages that constitute the site. Page layout is easily accomplished by the use of tables; column widths can be fixed to maintain an acceptable line length for text. Downloading time is minimized by rigorous editing and by optimal compression of image files; beyond this, techniques like preloading of images and specification of image width and height are also helpful. Navigation controls should be clear, consistent, and readily available.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-11
.... By way of background, the PULSe workstation is a front-end order entry system designed for use with... Schedule as it relates to the PULSe workstation. The text of the proposed rule change is available on the... workstations. The Exchange is also proposing some non-substantive changes to the fees schedule text to clarify...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-12
... amend [sic] its rules relating to the Penny Pilot Program. The text of the rule proposal is available on... proposed rule change. The text of those statements may be examined at the places specified in Item IV below... Technology Select Sector XME SPDR S&P Metals & Mining SPDR Fund. ETF. AKS AK Steel Holding Corp... KGC...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-19
... Extraordinary Market Volatility submitted to the Commission pursuant to Rule 608 of Regulation NMS. The text of... comments it received on the proposed rule change. The text of those statements may be examined at the... calculated by the Processors.\\11\\ When the National Best Bid (Offer) is below (above) the Lower (Upper) Price...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-14
... (May 31, 2012), 77 FR 33498 (June 6, 2012) (the ``Limit Up-Limit Down Release''). The text of the... proposed rule change and discussed any comments it received on the proposed rule change. The text of these... calculated by the Processors.\\13\\ When the National Best Bid (Offer) is below (above) the Lower (Upper) Price...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-18
... volatility, as described further below. The text of the proposed rule change is set forth below. Proposed new... received on the proposed rule change. The text of these statements may be examined at the places specified... Upper Price Band for each NMS Stock are calculated by the Processors.\\9\\ When the National Best Bid...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-23
... unit-of-count methodology for the NYSE Arca Trades and BBO Services. The text of the proposed rule... change. The text of those statements may be examined at the places specified in Item IV below. The... type of NYSE Arca ``Market Data'' (i.e., NYSE Arca Last Sale Information or NYSE Arca BBO Information...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-18
... further below. The text of the proposed rule change is set forth below. Proposed new language is in... the proposed rule change. The text of these statements may be examined at the places specified in Item... Stock are calculated by the Processors.\\9\\ When the National Best Bid (Offer) is below (above) the Lower...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-06
... Delete Obsolete Text and To Clarify and Update the Description of the Allocation of Market and Limit... Orders in the Allocation of Orders in Closing Transactions, and (4) Making Other Technical and Conforming... delete obsolete text and to clarify and update the description of the allocation of market and limit...
Brumen, Bostjan; Heričko, Marjan; Sevčnikar, Andrej; Završnik, Jernej; Hölbl, Marko
2013-12-16
Medical data are gold mines for deriving the knowledge that could change the course of a single patient's life or even the health of the entire population. A data analyst needs to have full access to relevant data, but full access may be denied by privacy and confidentiality regulations governing medical data, especially when the data analyst is not affiliated with the data owner. Our first objective was to analyze the privacy and confidentiality issues and the associated regulations pertaining to medical data, and to identify technologies to properly address these issues. Our second objective was to develop a procedure to protect medical data in such a way that the outsourced analyst would be capable of doing analyses on protected data and the results would be comparable, if not the same, as if they had been done on the original data. Specifically, our hypothesis was that there would be no difference between the outsourced decision trees built on encrypted data and the ones built on original data. Using formal definitions, we developed an algorithm to protect medical data for outsourced analyses. The algorithm was applied to publicly available datasets (N=30) from the medical and life sciences fields. The analyses were performed on the original and the protected datasets and the results of the analyses were compared. Bootstrapped paired t tests for 2 dependent samples were used to test whether the mean differences in size, number of leaves, and accuracy of the original and the encrypted decision trees were significantly different. The decision trees built on encrypted data were virtually the same as those built on original data. Out of 30 datasets, 100% of the trees had identical accuracy. The size of a tree and the number of leaves was different only once (1/30, 3%, P=.19). The proposed algorithm encrypts a file with plain text medical data into an encrypted file with the data protected in such a way that external data analyses are still possible.
The results show that the results of analyses on original and on protected data are identical or comparably similar. The approach addresses the privacy and confidentiality issues that arise with medical data and is adherent to strict legal rules in the United States and Europe regarding the processing of the medical data.
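The hypothesis that encrypted and original data yield the same decision trees rests on a simple property: splits on a numeric attribute depend only on the ordering of its values, so any strictly increasing transformation of a column leaves the tree structure unchanged. The sketch below illustrates that property only; it is not the authors' published algorithm:

```python
import random

def monotone_encrypt(values):
    """Map each distinct numeric value to a random code with strictly
    increasing gaps, preserving order. Illustrative only -- not the
    paper's actual protection algorithm."""
    codes = {}
    key = 0.0
    for v in sorted(set(values)):
        key += random.uniform(1.0, 10.0)  # strictly positive increment
        codes[v] = key
    return [codes[v] for v in values]

def best_split_index(xs, ys):
    """Position of the best threshold split for binary labels ys,
    scored by misclassification count; a stand-in for the split
    criterion inside a decision-tree learner."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    best, best_err = 0, len(ys)
    for cut in range(1, len(xs)):
        left = [ys[order[i]] for i in range(cut)]
        right = [ys[order[i]] for i in range(cut, len(xs))]
        err = min(left.count(0), left.count(1)) + \
              min(right.count(0), right.count(1))
        if err < best_err:
            best, best_err = cut, err
    return best
```

Because `monotone_encrypt` preserves the sort order of the column, `best_split_index` returns the same split before and after encryption, which is the mechanism behind the identical tree accuracies reported above.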
2013-01-01
Background Medical data are gold mines for deriving the knowledge that could change the course of a single patient's life or even the health of the entire population. A data analyst needs to have full access to relevant data, but full access may be denied by privacy and confidentiality regulations governing medical data, especially when the data analyst is not affiliated with the data owner. Objective Our first objective was to analyze the privacy and confidentiality issues and the associated regulations pertaining to medical data, and to identify technologies to properly address these issues. Our second objective was to develop a procedure to protect medical data in such a way that the outsourced analyst would be capable of doing analyses on protected data and the results would be comparable, if not the same, as if they had been done on the original data. Specifically, our hypothesis was that there would be no difference between the outsourced decision trees built on encrypted data and the ones built on original data. Methods Using formal definitions, we developed an algorithm to protect medical data for outsourced analyses. The algorithm was applied to publicly available datasets (N=30) from the medical and life sciences fields. The analyses were performed on the original and the protected datasets and the results of the analyses were compared. Bootstrapped paired t tests for 2 dependent samples were used to test whether the mean differences in size, number of leaves, and accuracy of the original and the encrypted decision trees were significantly different. Results The decision trees built on encrypted data were virtually the same as those built on original data. Out of 30 datasets, 100% of the trees had identical accuracy. The size of a tree and the number of leaves was different only once (1/30, 3%, P=.19).
Conclusions The proposed algorithm encrypts a file with plain text medical data into an encrypted file with the data protected in such a way that external data analyses are still possible. The results show that the results of analyses on original and on protected data are identical or comparably similar. The approach addresses the privacy and confidentiality issues that arise with medical data and is adherent to strict legal rules in the United States and Europe regarding the processing of the medical data. PMID:24342053
A multilingual audiometer simulator software for training purposes.
Kompis, Martin; Steffen, Pascal; Caversaccio, Marco; Brugger, Urs; Oesch, Ivo
2012-04-01
A set of algorithms that allows a computer to determine the answers of simulated patients during pure tone and speech audiometry is presented. Based on these algorithms, a computer program for training in audiometry was written and found to be useful for teaching purposes. The aim was to develop a flexible audiometer simulator software as a teaching and training tool for pure tone and speech audiometry, both with and without masking. First, a set of algorithms that allows a computer to determine the answers of a simulated, hearing-impaired patient was developed. Then the software was implemented. Extensive use was made of simple, editable text files to define all texts in the user interface and all patient definitions. The software 'audiometer simulator' is available for free download. It can be used to train pure tone audiometry (both with and without masking), speech audiometry, measurement of the uncomfortable level, and simple simulation tests. Because text files are used, the user can alter or add patient definitions and all texts and labels shown on the screen. So far, English, French, German, and Portuguese user interfaces are available, and the user can choose between German and French speech audiometry.
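At its core, a simulated pure-tone patient reduces to a threshold lookup: the patient responds whenever the presented level reaches the hearing threshold at that frequency. A minimal sketch, assuming a hypothetical plain-text patient format (`frequency_Hz threshold_dB` per line) that is not the program's actual file format:

```python
def load_patient(text):
    """Parse 'frequency_Hz threshold_dB' lines into a threshold dict.
    The line format here is a hypothetical illustration of an
    editable text-file patient definition."""
    thresholds = {}
    for line in text.splitlines():
        line = line.split("#")[0].strip()  # allow comments
        if line:
            freq, thr = line.split()
            thresholds[int(freq)] = float(thr)
    return thresholds

def patient_answers(thresholds, freq_hz, level_db):
    """Simulated patient presses the response button when the pure
    tone is presented at or above threshold for that frequency."""
    return level_db >= thresholds.get(freq_hz, float("inf"))
```

A real simulator would add masking effects, false-positive responses, and speech material, but the editable-text-file design shown here is what lets instructors define new patients without touching code.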
[Textual research of Liu Wan-su's works on consumptive thirst].
Yang, Shi-Zhe; Zhang, Xian-Zhe
2007-07-01
Liu Wan-su's San xiao lun (On Three Consumptions) was the earliest extant monograph dealing with consumptive thirst in traditional Chinese medicine (TCM). Another book attributed to Liu Wan-su, Su wen bing ji qi yi bao ming ji (Collection for Preserving Life of Pathogenesis in Plain Questions), also included a section on consumptive thirst. However, on comparison, the descriptions in the two books were quite different, and it seemed unlikely that they were written by the same author. Based on textual research of the bibliography, it is hard to confirm the attribution. Further, comparing these texts on consumptive thirst with those in Huang di su wen xuan ming lun fang (Elucidated Prescriptions and Expositions of Huangdi's Plain Questions), an authentic book of Liu's, revealed a consistency between San xiao lun and Huang di su wen xuan ming lun fang. It is thus very unlikely that Su wen bing ji qi yi bao ming ji was written by Liu, given its obviously different writing style.
Near East/South Asia Report No. 2745.
1983-04-29
viction were not. This was one of the reasons for the congestion and people’s rush to buy more than they needed, reflecting a negative development ...34Plain and simple: they’ve not been neglected. A factory requesting to move to Samaria from a development town recently re- ceived a negative ...Byblos Recorded Highest Growth Among Top 10; Return of Dollar Had Negative Effect on Volume of Deposits and on Some Banks"] [Text] What were the
Chenier plain genesis explained by feedbacks between waves, mud, and sand
NASA Astrophysics Data System (ADS)
Nardin, William; Fagherazzi, Sergio
2017-04-01
Cheniers are sandy ridges parallel to the coast established by high-energy waves. Here we discuss the ontogeny of chenier plains through dimensional analysis and numerical results from the morphodynamic model Delft3D-SWAN. Our results show that wave energy and inner-shelf slope play an important role in the formation of chenier plains. In our numerical experiments, waves affect chenier plain development in three ways: by winnowing coarse sediment from the mudflat, by eroding mud, and by accumulating sand over the beach during extreme wave events. We further show that different sediment characteristics and wave climates can lead to three alternative coastal landscapes: strand plains, mudflats, or the more complex chenier plains. Low inner-shelf slopes are the most favorable for strand plain and chenier plain formation, while high slopes decrease the likelihood of mudflat development and preservation.
Chenier plain development: feedbacks between waves, mud and sand
NASA Astrophysics Data System (ADS)
Nardin, W.; Fagherazzi, S.
2015-12-01
Cheniers are sandy ridges parallel to the coast established by high-energy waves. Here we discuss the ontogeny of Chenier plains through dimensional analysis and numerical results from the morphodynamic model Delft3D-SWAN. Our results show that wave energy and shelf slope play an important role in the formation of Chenier plains. In our numerical experiments, waves affect Chenier plain development in three ways: by winnowing sediment from the mudflat, by eroding mud, and by accumulating sand over the beach during extreme wave events. We further show that different sediment characteristics and wave climates can lead to three alternative coastal landscapes: strand plains, mudflats, or the more complex Chenier plains. Low inner-shelf slopes are the most favorable for strand plain and Chenier plain formation, while high slopes decrease the likelihood of mudflat development and preservation.
Pycellerator: an arrow-based reaction-like modelling language for biological simulations.
Shapiro, Bruce E; Mjolsness, Eric
2016-02-15
We introduce Pycellerator, a Python library for reading Cellerator arrow notation from standard text files, converting it to differential equations, generating stand-alone Python solvers, and optionally running and plotting the solutions. All of the original Cellerator arrows, which represent reactions ranging from mass action, Michaelis-Menten-Henri (MMH), and gene regulation (GRN) to Monod-Wyman-Changeux (MWC), user-defined reactions, and enzymatic expansions (KMech), were previously represented with the Mathematica extended character set. These are now typed as reaction-like commands in ASCII text files that are read by Pycellerator, which includes a Python command line interface (CLI), a Python application programming interface (API), and an IPython notebook interface. The arrows are parsed by Pycellerator and translated into differential equations in Python, and Python code is automatically generated to solve the system. Time courses are produced by executing the auto-generated Python code. Users have full freedom to modify the solver and utilize the complete set of standard Python tools. The new libraries are completely independent of the old Cellerator software and do not require Mathematica. All software is available (GPL) from the GitHub repository at https://github.com/biomathman/pycellerator/releases. Details, including installation instructions and a glossary of acronyms and terms, are given in the Supplementary information. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
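The arrow-to-equation idea can be illustrated with a toy translator for mass-action reactions. This is not Pycellerator's actual parser or syntax for every arrow type; it assumes a simplified `'A + B -> C, k1'` form and emits symbolic right-hand sides as strings:

```python
def mass_action_odes(reactions):
    """Translate reactions like 'A + B -> C, k1' into mass-action ODE
    right-hand sides (as symbolic strings). A toy sketch of the
    arrow-to-ODE translation, not Pycellerator's real parser."""
    odes = {}
    for rxn in reactions:
        arrow, rate = (s.strip() for s in rxn.rsplit(",", 1))
        lhs, rhs = (s.strip() for s in arrow.split("->"))
        reactants = [s.strip() for s in lhs.split("+") if s.strip()]
        products = [s.strip() for s in rhs.split("+") if s.strip()]
        flux = "*".join([rate] + reactants)  # e.g. k1*A*B
        for sp in reactants:                 # consumed by the reaction
            odes.setdefault(sp, []).append(f"-{flux}")
        for sp in products:                  # produced by the reaction
            odes.setdefault(sp, []).append(f"+{flux}")
    return {sp: " ".join(terms) for sp, terms in odes.items()}
```

A real implementation would emit numeric rate functions for an ODE solver rather than strings, but the per-species bookkeeping of flux terms is the same.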
New generation of the multimedia search engines
NASA Astrophysics Data System (ADS)
Mijes Cruz, Mario Humberto; Soto Aldaco, Andrea; Maldonado Cano, Luis Alejandro; López Rodríguez, Mario; Rodríguez Vázqueza, Manuel Antonio; Amaya Reyes, Laura Mariel; Cano Martínez, Elizabeth; Pérez Rosas, Osvaldo Gerardo; Rodríguez Espejo, Luis; Flores Secundino, Jesús Abimelek; Rivera Martínez, José Luis; García Vázquez, Mireya Saraí; Zamudio Fuentes, Luis Miguel; Sánchez Valenzuela, Juan Carlos; Montoya Obeso, Abraham; Ramírez Acosta, Alejandro Álvaro
2016-09-01
Current search engines are based on methods that combine words (text-based search), which has been efficient until now. However, the Internet's growth brings more diversity of content with each passing day, and text-based searches are becoming limited, as much of the information on the Internet is found in different types of content denominated multimedia content (images, audio files, video files). What needs to be improved in current search engines is search content and precision, as well as an accurate display of the results the user expects. Any search can be made more precise by using more text parameters, but this does not improve the content or the speed of the search itself. One solution is to improve search engines through characterization of the content of multimedia files. In this article, an analysis of new-generation multimedia search engines is presented, focusing on the needs arising from new technologies. Multimedia content has become a central part of the flow of information in our daily life. This reflects the necessity of having multimedia search engines, as well as knowing the real tasks they must fulfill. Through this analysis, it is shown that few search engines can perform content-based searches. The research area of new-generation multimedia search engines is a multidisciplinary area in constant growth, generating tools that satisfy the different needs of new-generation systems.
Challenges to Standardization: A Case Study Using Coastal and Deep-Ocean Water Level Data
NASA Astrophysics Data System (ADS)
Sweeney, A. D.; Stroker, K. J.; Mungov, G.; McLean, S. J.
2015-12-01
Sea levels recorded at coastal stations and inferred from deep-ocean pressure observations at the seafloor are submitted for archive in multiple data and metadata formats. These formats include two forms of schema-less XML and a custom binary format accompanied by metadata in a spreadsheet. The authors report on efforts to use existing standards to make these data more discoverable and more useful beyond their initial use in detecting tsunamis. An initial review of data formats for sea level data around the globe revealed heterogeneity in presentation and content. In the absence of a widely used domain-specific format, we adopted the general model for structuring data and metadata expressed by the Network Common Data Form (netCDF). netCDF has been endorsed by the Open Geospatial Consortium, has the advantage of small size when compared to an equivalent plain-text representation, and provides a standard way of embedding metadata in the same file. We followed the orthogonal time-series profile of the Climate and Forecast discrete sampling geometries as the convention for structuring the data and describing metadata relevant for use. We adhered to the Attribute Convention for Data Discovery for capturing metadata to support user search. Beyond making it possible to structure data and metadata in a standard way, netCDF is supported by multiple software tools that provide programmatic cataloging, access, subsetting, and transformation to other formats. We will describe our successes and failures in adhering to existing standards and provide requirements for either augmenting existing conventions or developing new ones. Some of these enhancements are specific to sea level data, while others are applicable to time-series data in general.
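The CF orthogonal time-series layout and ACDD discovery attributes described above can be sketched in CDL, netCDF's plain-text notation. The variable names, attribute values, and dimensions below are illustrative assumptions, not the archive's actual schema:

```
netcdf water_level_sketch {        // illustrative CDL, not the real archive schema
dimensions:
    time = UNLIMITED ;             // single orthogonal time axis
    name_strlen = 32 ;
variables:
    double time(time) ;
        time:standard_name = "time" ;
        time:units = "seconds since 1970-01-01T00:00:00Z" ;
    char station_name(name_strlen) ;
        station_name:cf_role = "timeseries_id" ;   // CF discrete sampling geometry
    float water_level(time) ;
        water_level:units = "m" ;
        water_level:long_name = "sea level relative to station datum" ;

// global attributes: CF structure plus ACDD attributes for discovery
    :Conventions = "CF-1.6, ACDD-1.3" ;
    :featureType = "timeSeries" ;
    :title = "Coastal water level station time series" ;
    :summary = "One-paragraph description used for search and discovery" ;
}
```

Embedding both the structural convention (featureType, cf_role) and the discovery metadata (title, summary) in the same self-describing file is what lets generic netCDF tools catalog, subset, and convert the data without format-specific code.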
Ocean-Bottom Topography: The Divide between the Sohm and Hatteras Abyssal Plains.
Pratt, R M
1965-06-18
A compilation of precision echo soundings has delineated the complex topography between the Sohm and Hatteras abyssal plains off the Atlantic coast of the United States. At present the divide between the two plains is a broad, flat area about 4950 meters deep; however, the configuration of channels and depressions suggests spillage of turbidity currents from the Sohm Plain into the Hatteras Plain and a shifting of the divide toward the northeast. Hudson Canyon terminates in the divide area and has probably fed sediment into both plains.
NASA Technical Reports Server (NTRS)
Snyder, W. V.; Hanson, R. J.
1986-01-01
Text Exchange System (TES) exchanges and maintains organized textual information including source code, documentation, data, and listings. System consists of two computer programs and definition of format for information storage. Comprehensive program used to create, read, and maintain TES files. TES developed to meet three goals: First, easy and efficient exchange of programs and other textual data between similar and dissimilar computer systems via magnetic tape. Second, provide transportable management system for textual information. Third, provide common user interface, over wide variety of computing systems, for all activities associated with text exchange.
Public census data on CD-ROM at Lawrence Berkeley Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merrill, D.W.
The Comprehensive Epidemiologic Data Resource (CEDR) and Populations at Risk to Environmental Pollution (PAREP) projects, of the Information and Computing Sciences Division (ICSD) at Lawrence Berkeley Laboratory (LBL), are using public socio-economic and geographic data files which are available to CEDR and PAREP collaborators via LBL's computing network. At this time 70 CD-ROM diskettes (approximately 36 gigabytes) are on line via the Unix file server cedrcd.lbl.gov. Most of the files are from the US Bureau of the Census, and most pertain to the 1990 Census of Population and Housing. All the CD-ROM diskettes contain documentation in the form of ASCII text files. Printed documentation for most files is available for inspection at University of California Data and Technical Assistance (UC DATA), or the UC Documents Library. Many of the CD-ROM diskettes distributed by the Census Bureau contain software for PC-compatible computers for easily accessing the data. Shared access to the data is maintained through a collaboration among the CEDR and PAREP projects at LBL, UC DATA, and the UC Documents Library. Via the Sun Network File System (NFS), these data can be exported to Internet computers for direct access by the user's application program(s).
Muller, Ingrid; Rowsell, Alison; Stuart, Beth; Hayter, Victoria; Little, Paul; Ganahl, Kristin; Müller, Gabriele; Doyle, Gerardine; Chang, Peter; Lyles, Courtney R; Nutbeam, Don; Yardley, Lucy
2017-01-23
Developing accessible Web-based materials to support diabetes self-management in people with lower levels of health literacy is a continuing challenge. The objective of this international study was to develop a Web-based intervention promoting physical activity among people with type 2 diabetes to determine whether audiovisual presentation and interactivity (quizzes, planners, tailoring) could help to overcome the digital divide by making digital interventions accessible and effective for people with all levels of health literacy. This study also aimed to determine whether these materials can improve health literacy outcomes for people with lower levels of health literacy and also be effective for people with higher levels of health literacy. To assess the impact of interactivity and audiovisual features on usage, engagement, and health literacy outcomes, we designed two versions of a Web-based intervention (one interactive and one plain-text version of the same content) to promote physical activity in people with type 2 diabetes. We randomly assigned participants from the United Kingdom, Austria, Germany, Ireland, and Taiwan to either an interactive or plain-text version of the intervention in English, German, or Mandarin. Intervention usage was objectively recorded by the intervention software. Self-report measures were taken at baseline and follow-up (immediately after participants viewed the intervention) and included measures of health literacy, engagement (website satisfaction and willingness to recommend the intervention to others), and health literacy outcomes (diabetes knowledge, enablement, attitude, perceived behavioral control, and intention to undertake physical activity). In total, 1041 people took part in this study. Of the 1005 who completed health literacy information, 268 (26.67%) had intermediate or low levels of health literacy. The interactive intervention overall did not produce better outcomes than did the plain-text version. 
Participants in the plain-text intervention group looked at significantly more sections of the intervention (mean difference -0.47, 95% CI -0.64 to -0.30, P<.001), but this did not lead to better outcomes. Health literacy outcomes, including attitudes and intentions to engage in physical activity, significantly improved following the intervention for participants in both intervention groups. These improvements were similar across higher and lower health literacy levels and in all countries. Participants in the interactive intervention group had acquired more diabetes knowledge (mean difference 0.80, 95% CI 0.65-0.94, P<.001). Participants from both groups reported high levels of website satisfaction and would recommend the website to others. Following established practice for simple, clear design and presentation and using a person-based approach to intervention development, with in-depth iterative feedback from users, may be more important than interactivity and audiovisual presentations when developing accessible digital health interventions to improve health literacy outcomes. International Standard Randomized Controlled Trial Number (ISRCTN): 43587048; http://www.isrctn.com/ISRCTN43587048. (Archived by WebCite at http://www.webcitation.org/6nGhaP9bv). ©Ingrid Muller, Alison Rowsell, Beth Stuart, Victoria Hayter, Paul Little, Kristin Ganahl, Gabriele Müller, Gerardine Doyle, Peter Chang, Courtney R Lyles, Don Nutbeam, Lucy Yardley. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 23.01.2017.
Rea, A.H.; Becker, C.J.
1997-01-01
This compact disc contains 25 digital map data sets covering the State of Oklahoma that may be of interest to the general public, private industry, schools, and government agencies. Fourteen data sets are statewide. These data sets include: administrative boundaries; 104th U.S. Congressional district boundaries; county boundaries; latitudinal lines; longitudinal lines; geographic names; indexes of U.S. Geological Survey 1:100,000, and 1:250,000-scale topographic quadrangles; a shaded-relief image; Oklahoma State House of Representatives district boundaries; Oklahoma State Senate district boundaries; locations of U.S. Geological Survey stream gages; watershed boundaries and hydrologic cataloging unit numbers; and locations of weather stations. Eleven data sets are divided by county and are located in 77 county subdirectories. These data sets include: census block group boundaries with selected demographic data; city and major highways text; geographic names; land surface elevation contours; elevation points; an index of U.S. Geological Survey 1:24,000-scale topographic quadrangles; roads, streets, and address ranges; highway text; school district boundaries; streams, rivers, and lakes; and the public land survey system. All data sets are provided in a readily accessible format. Most data sets are provided in Digital Line Graph (DLG) format. The attributes for many of the DLG files are stored in related dBASE(R)-format files and may be joined to the data set polygon attribute or arc attribute tables using dBASE(R)-compatible software. (Any use of trade names in this publication is for descriptive purposes only and does not imply endorsement by the U.S. Government.) Point attribute tables are provided in dBASE(R) format only, and include the X and Y map coordinates of each point. Annotation (text plotted in map coordinates) is provided in AutoCAD Drawing Exchange Format (DXF) files. The shaded-relief image is provided in TIFF format.
All data sets except the shaded-relief image also are provided in ARC/INFO export-file format.
Hosseinifard, Seyed Javad; Mirzaei Aminiyan, Milad
One of the important purposes of hydrology is to ensure water supply in accordance with the quality criteria for agricultural, industrial, and drinking water uses. Groundwater is the main source of water supply in arid and semi-arid regions. This study was conducted to evaluate factors regulating groundwater quality in the Rafsanjan plain. A total of 1040 groundwater samples were randomly collected from different areas of Rafsanjan. Each sample was then analyzed for the major ions based on standard methods. The pH, SAR, EC, and TDS parameters and concentrations of Ca2+, Mg2+, and Na+ cations, and Cl-, [Formula: see text], [Formula: see text] and [Formula: see text] anions were measured. Boron concentration in each sample was also determined. Although the maximum and minimum values of EC and TDS were linked to the Anar-Beyaz area and Eastern Urban, respectively, the irrigation-water EC condition was critical in the study areas. The pH value in Western Urban was higher, and that in Anar-Beyaz lower, than in the other areas, but pH was at the optimal level in all the study areas. The results showed that the hazard state with respect to Mg was critical except in the Koshkoueiyeh and Anar-Beyaz areas, which are marginal for irrigation use with little harm with reference to Mg. From the results, it was concluded that the status of boron concentration in the study areas was critical. According to the hydrochemistry diagrams, the main groundwater type in the different study areas was NaCl. Groundwater quality was not appropriate for drinking use, and its status for agricultural practices was unsuitable in these areas.
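Of the water-quality parameters listed, SAR (sodium adsorption ratio) is a derived quantity; with ion concentrations in meq/L its standard definition is SAR = Na / sqrt((Ca + Mg)/2). A small sketch of that computation (the classification thresholds used in the study are not given in the abstract, so none are assumed here):

```python
import math

def sar(na_meq_l, ca_meq_l, mg_meq_l):
    """Sodium adsorption ratio with concentrations in meq/L:
    SAR = Na / sqrt((Ca + Mg) / 2)."""
    return na_meq_l / math.sqrt((ca_meq_l + mg_meq_l) / 2.0)
```

Because SAR scales linearly with sodium but only with the square root of the divalent cations, sodic hazard grows quickly in Na-dominated waters such as the NaCl-type groundwater reported here.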
River flood plains: Some observations on their formation
Wolman, M. Gordon; Leopold, Luna Bergere
1957-01-01
On many small rivers and most great rivers, the flood plain consists of channel and overbank deposits. The proportion of the latter is generally very small. Frequency studies indicate that the flood plains of many streams of different sizes flowing in diverse physiographic and climatic regions are subject to flooding about once a year. The uniform frequency of flooding of the flood-plain surface and the small amount of deposition observed in great floods (average 0.07 foot) support the conclusion that overbank deposition contributes only a minor part of the material constituting the flood plain. The relatively high velocities (1 to 4 fps) which can occur in overbank flows and the reduction in sediment concentration which often accompanies large floods may also help account for this. Although lateral migration of channels is important in controlling the elevation of the flood plain, rates of migration are extremely variable and alone cannot account for the uniform relation the flood-plain surface bears to the channel. Detailed studies of flood plains in Maryland and in North Carolina indicate that it is difficult to differentiate between channel and overbank deposits in a stratigraphic section alone. Because deposition on the flood plain does not continue indefinitely, the flood-plain surface can only be transformed into a terrace surface by some tectonic or climatic change which alters the regimen of the river and causes it to entrench itself below its established bed and associated flood plain. A terrace, then, is distinguished from a flood plain by the frequency with which each is overflowed.
ERIC Educational Resources Information Center
Vandermeulen, H.; DeWreede, R. E.
1983-01-01
Presents a histogram drawing program which sorts real numbers in up to 30 categories. Entered data are sorted and saved in a text file which is then used to generate the histogram. Complete Applesoft program listings are included. (JN)
George, Roy; Walsh, Laurence J
2010-04-01
To evaluate the temperature changes occurring on the apical third of root surfaces when erbium-doped yttrium aluminium garnet (Er:YAG) and erbium, chromium-doped yttrium scandium gallium garnet (Er,Cr:YSGG) laser energy was delivered with a tube-etched, laterally emitting conical tip and a conventional bare-design optical fiber tip. Thermal effects of root canal laser treatments on periodontal ligament cells and alveolar bone are of concern in terms of safety. A total of 64 single-rooted extracted teeth were prepared 1 mm short of the working length using rotary nickel-titanium Pro-Taper files to an apical size corresponding to a F5 Pro-Taper instrument. A thermocouple located 2 mm from the apex was used to record temperature changes arising from delivery of laser energy through laterally emitting conical tips or plain tips, using an Er:YAG or Er,Cr:YSGG laser. For the Er:YAG and Er,Cr:YSGG systems, conical fibers showed greater lateral emissions (452 ± 69% and 443 ± 64%) and correspondingly lower forward emissions (48 ± 5% and 49 ± 5%) than conventional plain-fiber tips. All four combinations of laser system and fiber design elicited temperature increases of less than 2.5 degrees C during lasing. The use of water irrigation completely attenuated the thermal effects of individual lasing cycles. Laterally emitting conical fiber tips can be used safely under defined conditions for intracanal irradiation without harmful thermal effects on the periodontal apparatus.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-22
... 1101A(b)(vi). The text of the proposed rule change is available on the Exchange's Web site at http... the proposed rule change. The text of these statements may be examined at the places specified in Item... Exchange's STO rules, is similar in practical effect to the noted OLPP subsection. In terms of the strike...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-03
... EDGA Members. The text of the proposed rule change is available on the Exchange's Internet Web site at... received on the proposed rule change. The text of these statements may be examined at the places specified... proposes to amend Rule 11.5(c)(15), the NBBO Offset Peg Order, to state that the order type will: (1) Only...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-09
...) to add a new order type, the NBBO Offset Peg Order, to the rule. The text of the proposed rule change... Change Relating to EDGX Rule 11.5 To Add a New Order Type October 2, 2012. Pursuant to Section 19(b)(1... and discussed any comments it received on the proposed rule change. The text of these statements may...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-03
... EDGX Members. The text of the proposed rule change is available on the Exchange's Internet Web site at... received on the proposed rule change. The text of these statements may be examined at the places specified... proposes to amend Rule 11.5(c)(15), the NBBO Offset Peg Order, to state that the order type will: (1) Only...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-09
...) to add a new order type, the NBBO Offset Peg Order, to the rule. The text of the proposed rule change... Change Relating to EDGA Rule 11.5 To Add a New Order Type October 2, 2012. Pursuant to Section 19(b)(1... and discussed any comments it received on the proposed rule change. The text of these statements may...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-27
... Exchange listed-securities. The text of the proposed rule change is available at the Exchange's Web site at... rule change. The text of these statements may be examined at the places specified in Item IV below. The..., the Proposed Rule Change 1. Purpose Introduction The Exchange proposes to add a new auction type to...
FEMA Database Requirements Assessment and Resource Directory Model.
1982-05-01
File (NRCM) -- Contains information on organizations that are information resources on virtually any subject. * NEW YORK TIMES ONLINE -- Full text...version of the New York Times. * Newsearch: The Daily Index (Newsearch) -- Daily indexing of the periodicals in Magazine Index, newspapers in National...* NEXIS -- Full text of all general and business news covered in a variety of newspapers, magazines and wire services. * Oceanic Abstracts
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-18
... proposes to amend the Exchange's Pricing Schedule at Section VII, C to update FINRA fees to mirror the text... its Pricing Schedule at Section VII, C entitled ``FINRA Fees'' to mirror the rules of the NASDAQ Stock... in the Pricing Schedule with text similar to that of NASDAQ Stock Market Rule 7003(a)(1)--(5) \\3\\ and...
User's manual for the Macintosh version of PASCO
NASA Technical Reports Server (NTRS)
Lucas, S. H.; Davis, Randall C.
1991-01-01
A user's manual for Macintosh PASCO is presented. Macintosh PASCO is an Apple Macintosh version of PASCO, an existing computer code for structural analysis and optimization of longitudinally stiffened composite panels. PASCO combines a rigorous buckling analysis program with a nonlinear mathematical optimization routine to minimize panel mass. Macintosh PASCO accepts the same input as mainframe versions of PASCO. As output, Macintosh PASCO produces a text file and mode shape plots in the form of Apple Macintosh PICT files. Only the user interface for Macintosh is discussed here.
Rep. Polis, Jared [D-CO-2
2014-07-22
House - 09/17/2014 Motion to Discharge Committee filed by Mr. Polis. Petition No: 113-11. (All Actions) Notes: On 9/17/2014, a motion was filed to discharge the Committee on Rules from the consideration of H.Res.678 a resolution providing for consideration of S.815. A discharge petition requires 218 signatures for further action. (Discharge Petition No. 113-11: text with signatures.) Tracker: This bill has the status Introduced.
DOE Office of Scientific and Technical Information (OSTI.GOV)
North, Michael J.
SchemaOnRead provides tools for implementing schema-on-read including a single function call (e.g., schemaOnRead("filename")) that reads text (TXT), comma separated value (CSV), raster image (BMP, PNG, GIF, TIFF, and JPG), R data (RDS), HDF5, NetCDF, spreadsheet (XLS, XLSX, ODS, and DIF), Weka Attribute-Relation File Format (ARFF), Epi Info (REC), Pajek network (PAJ), R network (NET), Hypertext Markup Language (HTML), SPSS (SAV), Systat (SYS), and Stata (DTA) files. It also recursively reads folders (e.g., schemaOnRead("folder")), returning a nested list of the contained elements.
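SchemaOnRead itself is an R package; the extension-dispatch idea behind `schemaOnRead("filename")` can be sketched in Python (a minimal illustration covering only a few text-based formats, with a hypothetical function name, not a port of the package):

```python
from pathlib import Path

def schema_on_read(path):
    """Infer the reader from the file extension (schema-on-read).

    Hypothetical Python analog of the R call schemaOnRead("filename");
    only CSV, JSON, and plain text are handled here for illustration.
    """
    ext = Path(path).suffix.lower()
    if ext == ".csv":
        import csv
        with open(path, newline="") as f:
            return list(csv.reader(f))
    if ext == ".json":
        import json
        with open(path) as f:
            return json.load(f)
    if ext == ".txt":
        with open(path) as f:
            return f.read()
    raise ValueError(f"no reader registered for {ext!r}")
```

A folder version would simply apply the same dispatch recursively, returning a nested list, as the package does.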
BamTools: a C++ API and toolkit for analyzing and managing BAM files.
Barnett, Derek W; Garrison, Erik K; Quinlan, Aaron R; Strömberg, Michael P; Marth, Gabor T
2011-06-15
Analysis of genomic sequencing data requires efficient, easy-to-use access to alignment results and flexible data management tools (e.g. filtering, merging, sorting, etc.). However, the enormous amount of data produced by current sequencing technologies is typically stored in compressed, binary formats that are not easily handled by the text-based parsers commonly used in bioinformatics research. We introduce a software suite for programmers and end users that facilitates research analysis and data management using BAM files. BamTools provides both the first C++ API publicly available for BAM file support as well as a command-line toolkit. BamTools was written in C++, and is supported on Linux, Mac OSX and MS Windows. Source code and documentation are freely available at http://github.org/pezmaster31/bamtools.
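The point that text-based parsers cannot handle compressed binary alignment data can be illustrated with a small check for the gzip magic bytes that BGZF-compressed BAM files begin with (a sketch of the file-format distinction, not part of BamTools):

```python
GZIP_MAGIC = b"\x1f\x8b"  # BGZF, the block-compression used by BAM, is a gzip variant

def looks_binary_bam_like(path):
    """Return True if the file starts with the gzip magic bytes.

    A line-oriented text parser would choke on such a file; tools like
    BamTools instead decompress the blocks and decode binary records.
    """
    with open(path, "rb") as f:
        return f.read(2) == GZIP_MAGIC
```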
Ryder, Robert T.; Harris, Anita G.; Repetski, John E.; Crangle, Robert D.; Ruppert, Leslie F.; Ryder, Robert T.
2014-01-01
This chapter is a re-release of U.S. Geological Survey Bulletin 1839-K, of the same title, by Ryder and others (1992; online version 2.0 revised and digitized by Robert D. Crangle, Jr., 2003). It consists of one file of the report text as it appeared in USGS Bulletin 1839-K and a second file containing the cross section, figures 1 and 2, and tables 1 and 2 on one oversized sheet; the second file was digitized in 2003 as version 2.0 and also includes the gamma-ray well log traces.
Incorporating uncertainty in RADTRAN 6.0 input files.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dennis, Matthew L.; Weiner, Ruth F.; Heames, Terence John
Uncertainty may be introduced into RADTRAN analyses by distributing input parameters. The MELCOR Uncertainty Engine (Gauntt and Erickson, 2004) has been adapted for use in RADTRAN to determine the parameter shape and the minimum and maximum of the distribution, to sample on the distribution, and to create an appropriate RADTRAN batch file. Coupling input parameters is not possible in this initial application. It is recommended that the analyst be very familiar with RADTRAN and able to edit or create a RADTRAN input file using a text editor before implementing the RADTRAN Uncertainty Analysis Module. Installation of the MELCOR Uncertainty Engine is required for incorporation of uncertainty into RADTRAN. Gauntt and Erickson (2004) provide installation instructions as well as a description and user guide for the uncertainty engine.
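The sample-a-distribution-and-emit-batch-inputs workflow described above can be sketched as follows (the parameter name and template syntax are hypothetical illustrations, not the actual RADTRAN input format or the MELCOR engine):

```python
import random

def write_sampled_inputs(template, param, low, high, n, seed=0):
    """Draw n uniform samples for one input parameter and render a
    batch of input-file texts from a template.

    A minimal sketch: real uncertainty engines support many shapes
    (uniform, normal, ...) and many coupled parameters; this one
    distributes a single uncoupled parameter uniformly.
    """
    rng = random.Random(seed)  # fixed seed for reproducible batches
    files = []
    for _ in range(n):
        value = rng.uniform(low, high)
        files.append(template.format(**{param: value}))
    return files
```

Each rendered text would then be written to its own file and run as one member of the batch.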
Plain Language to Communicate Physical Activity Information: A Website Content Analysis.
Paige, Samantha R; Black, David R; Mattson, Marifran; Coster, Daniel C; Stellefson, Michael
2018-04-01
Plain language techniques are health literacy universal precautions intended to enhance health care system navigation and health outcomes. Physical activity (PA) is a popular topic on the Internet, yet it is unknown if information is communicated in plain language. This study examined how plain language techniques are included in PA websites, and whether the use of plain language techniques varies according to search procedures (keyword, search engine) and website host source (government, commercial, educational/organizational). Three keywords ("physical activity," "fitness," and "exercise") were independently entered into three search engines (Google, Bing, and Yahoo) to locate a nonprobability sample of websites (N = 61). Fourteen plain language techniques were coded within each website to examine content formatting, clarity and conciseness, and multimedia use. Approximately half (M = 6.59; SD = 1.68) of the plain language techniques were included in each website. The keyword physical activity resulted in websites with fewer clear and concise plain language techniques (p < .05), whereas fitness resulted in websites with more clear and concise techniques (p < .01). Plain language techniques did not vary by search engine or website host source. Accessing PA information that is easy to understand and behaviorally oriented may remain a challenge for users. Transdisciplinary collaborations are needed to optimize plain language techniques while communicating online PA information.
Cryptanalysis of "an improvement over an image encryption method based on total shuffling"
NASA Astrophysics Data System (ADS)
Akhavan, A.; Samsudin, A.; Akhshani, A.
2015-09-01
In the past two decades, several image encryption algorithms based on chaotic systems had been proposed. Many of the proposed algorithms are meant to improve other chaos based and conventional cryptographic algorithms. Whereas, many of the proposed improvement methods suffer from serious security problems. In this paper, the security of the recently proposed improvement method for a chaos-based image encryption algorithm is analyzed. The results indicate the weakness of the analyzed algorithm against chosen plain-text.
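A chosen plain-text attack of the kind reported above can be illustrated on a toy cipher: if encryption amounts to XOR with a fixed keystream, then encrypting a chosen all-zero block leaks the keystream itself, which decrypts every other message (an illustrative sketch of the attack class, not the analyzed image-encryption algorithm):

```python
def xor_bytes(a, b):
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def recover_keystream(encrypt, block_len=16):
    """Chosen-plaintext attack on a fixed-keystream XOR cipher.

    Since C = P XOR K, choosing P = 0...0 gives C = K directly.
    """
    return encrypt(bytes(block_len))
```

This is why key- or plaintext-dependent keystreams are essential in chaos-based designs; a keystream independent of the plaintext falls to exactly this attack.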
1989-06-01
plains harvest mice, western harvest mice, northern grasshopper mice, prairie voles, meadow voles, Ord’s kangaroo rats, hispid pocket mice, and...both metabolically and from a manufacturing perspective. DDT and DDE are two of the more persistent OCPs (Edwards, 1972). This may be due in part to...DBPC, although toxic, does not bioaccumulate significantly, while isodrin, an analog of endrin, is converted to endrin by metabolic processes. Two
Kaltschmidt, Jens; Schmitt, Simon P W; Pruszydlo, Markus G; Haefeli, Walter E
2008-01-01
Electronic mailing systems (e-mail) are an important means to disseminate information within electronic networks. However, in large business communities including the hectic environment of hospitals it may be difficult to induce account holders to read the e-mail. In two mailings disseminated in a large university hospital we evaluated the impact of e-mail layout (three e-mail text versions, two e-mails with graphics) on the willingness of its approximately 6500 recipients to seek additional electronic information and open an integrated link. Overall access rates after 90 days were 21.1 and 23.5% with more than 70% of the respondents opening the link within 3 days. Differences between different layouts were large and artwork text, HTML text, animated GIF, and static image prompted 1.2, 1.7, 1.8, and 2.3 times more often access than the courier plain text message (p ≤ 0.001).
Kaltschmidt, Jens; Schmitt, Simon P.W.; Pruszydlo, Markus G.; Haefeli, Walter E.
2008-01-01
Electronic mailing systems (e-mail) are an important means to disseminate information within electronic networks. However, in large business communities including the hectic environment of hospitals it may be difficult to induce account holders to read the e-mail. In two mailings disseminated in a large university hospital we evaluated the impact of e-mail layout (three e-mail text versions, two e-mails with graphics) on the willingness of its ∼6500 recipients to seek additional electronic information and open an integrated link. Overall access rates after 90 days were 21.1 and 23.5% with more than 70% of the respondents opening the link within 3 days. Differences between different layouts were large and artwork text, HTML text, animated GIF, and static image prompted 1.2, 1.7, 1.8, and 2.3 times more often access than the courier plain text message (p ≤ 0.001). This study revealed that layout is a major determinant of the success of an information campaign. PMID:18096910
Momota, Ryusuke; Ohtsuka, Aiji
2018-01-01
Anatomy is the science and art of understanding the structure of the body and its components in relation to the functions of the whole-body system. Medicine is based on a deep understanding of anatomy, but quite a few introductory-level learners are overwhelmed by the sheer amount of anatomical terminology that must be understood, so they regard anatomy as a dull and dense subject. To help them learn anatomical terms in a more contextual way, we started a new open-source project, the Network of Anatomical Texts (NAnaTex), which visualizes relationships of body components by integrating text-based anatomical information using Cytoscape, a network visualization software platform. Here, we present a network of bones and muscles produced from literature descriptions. As this network is primarily text-based and does not require any programming knowledge, it is easy to implement new functions or provide extra information by making changes to the original text files. To facilitate collaborations, we deposited the source code files for the network into the GitHub repository ( https://github.com/ryusukemomota/nanatex ) so that anybody can participate in the evolution of the network and use it for their own non-profit purposes. This project should help not only introductory-level learners but also professional medical practitioners, who could use it as a quick reference.
Public-domain-software solution to data-access problems for numerical modelers
Jenter, Harry; Signell, Richard
1992-01-01
Unidata's network Common Data Form, netCDF, provides users with an efficient set of software for scientific-data storage, retrieval, and manipulation. The netCDF file format is machine-independent, direct-access, self-describing, and in the public domain, thereby alleviating many problems associated with accessing output from large hydrodynamic models. NetCDF has programming interfaces in both the Fortran and C computer languages, with an interface to C++ planned for release in the future. NetCDF also has an abstract data type that relieves users from understanding details of the binary file structure; data are written and retrieved by an intuitive, user-supplied name rather than by file position. Users are aided further by Unidata's inclusion of the Common Data Language, CDL, a printable text equivalent of the contents of a netCDF file. Unidata provides numerous operators and utilities for processing netCDF files. In addition, a number of public-domain and proprietary netCDF utilities from other sources are available at this time or will be available later this year. The U.S. Geological Survey has produced and is producing a number of public-domain netCDF utilities.
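The CDL text equivalent mentioned above can be approximated with a short sketch that renders dimensions and variable declarations in CDL-like notation (a deliberately simplified rendering, far less than real ncdump output, and not Unidata code):

```python
def to_cdl(name, dims, variables):
    """Render a minimal CDL-like text description of a dataset.

    dims: {dimension_name: size}
    variables: {var_name: (type_name, [dimension_names])}
    Real CDL also carries attributes and data sections; this sketch
    shows only the self-describing header idea.
    """
    lines = [f"netcdf {name} {{", "dimensions:"]
    for d, size in dims.items():
        lines.append(f"\t{d} = {size} ;")
    lines.append("variables:")
    for v, (dtype, vdims) in variables.items():
        lines.append(f"\t{dtype} {v}({', '.join(vdims)}) ;")
    lines.append("}")
    return "\n".join(lines)
```

The value of such a printable form is that a reader can inspect a binary file's structure without any knowledge of its byte layout.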
External-Compression Supersonic Inlet Design Code
NASA Technical Reports Server (NTRS)
Slater, John W.
2011-01-01
A computer code named SUPIN has been developed to perform aerodynamic design and analysis of external-compression, supersonic inlets. The baseline set of inlets includes axisymmetric pitot, two-dimensional single-duct, axisymmetric outward-turning, and two-dimensional bifurcated-duct inlets. The aerodynamic methods are based on low-fidelity analytical and numerical procedures. The geometric methods are based on planar geometry elements. SUPIN has three modes of operation: 1) generate the inlet geometry from an explicit set of geometry information, 2) size and design the inlet geometry and analyze the aerodynamic performance, and 3) compute the aerodynamic performance of a specified inlet geometry. The aerodynamic performance quantities include inlet flow rates, total pressure recovery, and drag. The geometry output from SUPIN includes inlet dimensions, cross-sectional areas, coordinates of planar profiles, and surface grids suitable for input to grid generators for analysis by computational fluid dynamics (CFD) methods. The input data file for SUPIN and the output file from SUPIN are text (ASCII) files. The surface grid files are output as formatted Plot3D or stereolithography (STL) files. SUPIN executes in batch mode and is available as a Microsoft Windows executable and Fortran95 source code with a makefile for Linux.
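The STL surface-grid output format mentioned above is a simple text format; a minimal ASCII STL writer sketches what such a file looks like (a generic illustration of the format, not SUPIN code):

```python
def write_ascii_stl(name, facets):
    """Render triangular facets as an ASCII STL string.

    facets: list of (normal, (v1, v2, v3)) where normal and each
    vertex are 3-tuples of floats. ASCII STL is just nested
    solid/facet/loop/vertex blocks.
    """
    lines = [f"solid {name}"]
    for (nx, ny, nz), verts in facets:
        lines.append(f"  facet normal {nx} {ny} {nz}")
        lines.append("    outer loop")
        for (x, y, z) in verts:
            lines.append(f"      vertex {x} {y} {z}")
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append(f"endsolid {name}")
    return "\n".join(lines)
```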
12 CFR 611.1217 - Plain language requirements.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 12 Banks and Banking 6 2011-01-01 2011-01-01 false Plain language requirements. 611.1217 Section 611.1217 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM ORGANIZATION Termination of System Institution Status § 611.1217 Plain language requirements. (a) Plain language presentation. All...
12 CFR 611.1217 - Plain language requirements.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Plain language requirements. 611.1217 Section 611.1217 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM ORGANIZATION Termination of System Institution Status § 611.1217 Plain language requirements. (a) Plain language presentation. All...
49 CFR 229.64 - Plain bearings.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 4 2010-10-01 2010-10-01 false Plain bearings. 229.64 Section 229.64 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION....64 Plain bearings. A plain bearing box shall contain visible free oil and may not be cracked to the...
78 FR 69124 - Trinity Adaptive Management Working Group; Public Meeting and Teleconference
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-18
... Recommendation, Bylaw discussion, 2014 Flow Alternatives, Status of Klamath fall flow release, Mining issues...Point, or rich text file). Registered speakers who wish to expand on their oral statements, or those who...
NASA TLX: software for assessing subjective mental workload.
Cao, Alex; Chintamani, Keshav K; Pandya, Abhilash K; Ellis, R Darin
2009-02-01
The NASA Task Load Index (TLX) is a popular technique for measuring subjective mental workload. It relies on a multidimensional construct to derive an overall workload score based on a weighted average of ratings on six subscales: mental demand, physical demand, temporal demand, performance, effort, and frustration level. A program for implementing a computerized version of the NASA TLX is described. The software version assists in simplifying collection, postprocessing, and storage of raw data. The program collects raw data from the subject and calculates the weighted (or unweighted) workload score, which is output to a text file. The program can also be tailored to a specific experiment using a simple input text file, if desired. The program was designed in Visual Studio 2005 and is capable of running on a Pocket PC with Windows CE or on a PC with Windows 2000 or higher. The NASA TLX program is available for free download.
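The weighted average described above uses the standard TLX scheme: each of the six subscales receives a weight equal to the number of times it was chosen in the 15 pairwise comparisons. A minimal sketch of the score computation (the formula is the published TLX procedure; this is not the described Visual Studio program):

```python
def tlx_score(ratings, weights=None):
    """Overall NASA TLX workload score.

    ratings: dict mapping the six subscale names to ratings (0-100).
    weights: dict of pairwise-comparison tallies (must sum to 15,
    the number of pairwise comparisons among six subscales); omit
    for the unweighted ("raw TLX") mean.
    """
    if weights is None:
        return sum(ratings.values()) / len(ratings)
    assert sum(weights.values()) == 15, "pairwise tallies must sum to 15"
    return sum(ratings[k] * weights[k] for k in ratings) / 15
```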
Open access for the non-English-speaking world: overcoming the language barrier
Fung, Isaac CH
2008-01-01
This editorial highlights the problem of the language barrier in scientific communication despite the recent success of the Open Access Movement. Four options for English-language journals to overcome the language barrier are suggested: 1) abstracts in alternative languages provided by authors, 2) Wiki open translation, 3) an international board of translator-editors, and 4) an alternative-language version of the journal. Emerging Themes in Epidemiology announces that, with immediate effect, it will accept translations of abstracts or full texts by authors as Additional files. Editorial note: In an effort towards overcoming the language barrier in scientific publication, ETE will accept translations of abstracts or the full text of published articles. Each translation should be submitted separately as an Additional File in PDF format. ETE will only peer review English-language versions. Therefore, translations will not be scrutinized in the review process and the responsibility for accurate translation rests with the authors. PMID:18173854
Bartos, Timothy T.; Hallberg, Laura L.
2011-01-01
The High Plains aquifer system, commonly called the High Plains aquifer in many publications, is a nationally important water resource that underlies a 111-million-acre area (173,000 square miles) in parts of eight States including Wyoming. Through irrigation of crops with groundwater from the High Plains aquifer system, the area that overlies the aquifer system has become one of the major agricultural regions in the world. In addition, the aquifer system also serves as the primary source of drinking water for most residents of the region. The High Plains aquifer system is one of the largest aquifers or aquifer systems in the world. The High Plains aquifer system underlies an area of 8,190 square miles in southeastern Wyoming. Including Laramie County, the High Plains aquifer system is present in parts of five counties in southeastern Wyoming. The High Plains aquifer system underlies 8 percent of Wyoming, and 5 percent of the aquifer system is located within the State. Based on withdrawals for irrigation, public supply, and industrial use in 2000, the High Plains aquifer system is the most utilized source of groundwater in Wyoming. With the exception of the Laramie Mountains in western Laramie County, the High Plains aquifer system is present throughout Laramie County. In Laramie County, the High Plains aquifer system is the predominant groundwater resource for agricultural (irrigation), municipal, industrial, and domestic uses. Withdrawal of groundwater for irrigation (primarily in the eastern part of the county) is the largest use of water from the High Plains aquifer system in Laramie County and southeastern Wyoming. Continued interest in groundwater levels in the High Plains aquifer system in Laramie County prompted a study by the U.S. Geological Survey in cooperation with the Wyoming State Engineer's Office to update the potentiometric-surface map of the aquifer system in Laramie County. 
Groundwater levels were measured in wells completed in the High Plains aquifer system from March to June 2009. The groundwater levels were used to construct a map of the potentiometric surface of the High Plains aquifer system. In addition, depth to water and estimated saturated-thickness maps of the aquifer system were constructed using the potentiometric-surface map.
Aboulbanine, Zakaria; El Khayati, Naïma
2018-04-13
The use of phase space in medical linear accelerator Monte Carlo (MC) simulations significantly improves the execution time and leads to results comparable to those obtained from full calculations. The classical representation of phase space stores directly the information of millions of particles, producing bulky files. This paper presents a virtual source model (VSM) based on a reconstruction algorithm, taking as input a compressed file of roughly 800 kb derived from phase space data freely available in the International Atomic Energy Agency (IAEA) database. This VSM includes two main components; primary and scattered particle sources, with a specific reconstruction method developed for each. Energy spectra and other relevant variables were extracted from IAEA phase space and stored in the input description data file for both sources. The VSM was validated for three photon beams: Elekta Precise 6 MV/10 MV and a Varian TrueBeam 6 MV. Extensive calculations in water and comparisons between dose distributions of the VSM and IAEA phase space were performed to estimate the VSM precision. The Geant4 MC toolkit in multi-threaded mode (Geant4-[mt]) was used for fast dose calculations and optimized memory use. Four field configurations were chosen for dose calculation validation to test field size and symmetry effects, [Formula: see text] [Formula: see text], [Formula: see text] [Formula: see text], and [Formula: see text] [Formula: see text] for squared fields, and [Formula: see text] [Formula: see text] for an asymmetric rectangular field. Good agreement in terms of [Formula: see text] formalism, for 3%/3 mm and 2%/3 mm criteria, for each evaluated radiation field and photon beam was obtained within a computation time of 60 h on a single WorkStation for a 3 mm voxel matrix. Analyzing the VSM's precision in high dose gradient regions, using the distance to agreement concept (DTA), showed also satisfactory results. 
In all investigated cases, the mean DTA was less than 1 mm in the build-up and penumbra regions. Regarding calculation efficiency, event processing is six times faster with Geant4-[mt] than with sequential Geant4 when running the same simulation code. The VSM developed for the widely used 6 MV/10 MV beams is a general concept that can easily be adapted to reconstruct comparable beam qualities for various linac configurations, facilitating its integration into MC treatment planning.
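The gamma formalism (e.g. 3%/3 mm) used for the dose comparisons above can be sketched in one dimension: each reference point passes if some evaluated point is simultaneously close in space and in dose, combined in quadrature (the standard Low et al. definition with global normalization to the reference maximum; an illustrative sketch, not the authors' code):

```python
import math

def gamma_1d(ref, evalu, spacing, dd_frac=0.03, dta_mm=3.0):
    """1D gamma index for two dose profiles on the same grid.

    ref, evalu: dose samples; spacing: grid spacing in mm.
    Dose difference is normalized to dd_frac * max(ref) (global
    normalization); distance is normalized to dta_mm. A point
    passes the criterion when its gamma value is <= 1.
    """
    dmax = max(ref)
    out = []
    for i, dr in enumerate(ref):
        best = math.inf
        for j, de in enumerate(evalu):
            dist = abs(i - j) * spacing
            ddiff = (de - dr) / (dd_frac * dmax)
            best = min(best, math.hypot(dist / dta_mm, ddiff))
        out.append(best)
    return out
```

Identical profiles give gamma = 0 everywhere; a profile shifted by less than the DTA criterion still passes even where the pointwise dose difference is large, which is the point of the formalism in high-gradient regions.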
Development of EnergyPlus Utility to Batch Simulate Building Energy Performance on a National Scale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valencia, Jayson F.; Dirks, James A.
2008-08-29
EnergyPlus is a simulation program that requires a large number of details to fully define and model a building. Hundreds or even thousands of lines in a text file are needed to run the EnergyPlus simulation, depending on the size of the building. Manually creating these files is a time-consuming process that would not be practical when trying to create input files for the thousands of buildings needed to simulate national building energy performance. To streamline the process of creating the input files for EnergyPlus, two methods were created to work in conjunction with the National Renewable Energy Laboratory (NREL) Preprocessor; this reduced the hundreds of inputs needed to define a building in EnergyPlus to a small set of high-level parameters. The first method uses Java routines to perform all of the preprocessing on a Windows machine, while the second method carries out all of the preprocessing on the Linux cluster using an in-house utility called Generalized Parametrics (GPARM). A comma-delimited (CSV) input file is created to define the high-level parameters for any number of buildings. Each method then takes this CSV file and uses the data entered for each parameter to populate an extensible markup language (XML) file used by the NREL Preprocessor to automatically prepare EnergyPlus input data files (idf) using automatic building routines and macro templates. Using a Linux utility called “make”, the idf files can then be automatically run through the Linux cluster and the desired data from each building can be aggregated into one table to be analyzed. Creating a large number of EnergyPlus input files results in the ability to batch simulate building energy performance and scale the results to national energy consumption estimates.
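The CSV-of-high-level-parameters-to-input-file step described above can be sketched as follows (the template and column names are illustrative, not real EnergyPlus idf syntax; the NREL Preprocessor and GPARM are not reproduced here):

```python
import csv
import io

# Hypothetical template; a real idf carries hundreds of such fields.
IDF_TEMPLATE = """! auto-generated (illustrative, not real idf syntax)
Building, {name};
FloorArea, {area_m2};
"""

def render_batch(csv_text, template=IDF_TEMPLATE):
    """Read one row of high-level parameters per building from CSV
    text and render an input-file body for each, keyed by name."""
    out = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        out[row["name"]] = template.format(**row)
    return out
```

Each rendered body would then be written to its own .idf file and dispatched to the cluster, e.g. via make.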
27 CFR 9.207 - Outer Coastal Plain.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Outer Coastal Plain. (a) Name. The name of the viticultural area described in this section is “Outer...,000 scale. (c) Boundary. The Outer Coastal Plain viticultural area includes all of Cumberland, Cape... Counties in the State of New Jersey. The boundary of the Outer Coastal Plain viticultural area is as...
U.S. Army Research Laboratory (ARL) Corporate Dari Document Transcription and Translation Guidelines
2012-10-01
text file format. 15. SUBJECT TERMS Transcription, Translation, guidelines, ground truth, Optical character recognition , OCR, Machine Translation, MT...foreign language into a target language in order to train, test, and evaluate optical character recognition (OCR) and machine translation (MT) embedded...graphic element and should not be transcribed. Elements that are not part of the primary text such as handwritten annotations or stamps should not be
The IATH ELAN Text-Sync Tool: A Simple System for Mobilizing ELAN Transcripts On- or Off-Line
ERIC Educational Resources Information Center
Dobrin, Lise M.; Ross, Douglas
2017-01-01
In this article we present the IATH ELAN Text-Sync Tool (ETST; see http://community.village.virginia.edu/etst), a series of scripts and workflow for playing ELAN files and associated audiovisual media in a web browser either on- or off-line. ELAN has become an indispensable part of documentary linguists' toolkit, but it is less than ideal for…
JOVIAL/Ada Microprocessor Study.
1982-04-01
Study Final Technical Report interesting feature of the nodes is that they provide multiple virtual terminals, so it is possible to monitor several...Terminal Interface Tasking Exception Handling A more elaborate system could allow such features as spooling, background jobs or multiple users. To a large...Another editor feature is the buffer. Buffers may hold small amounts of text or entire text objects. They allow multiple files to be edited simultaneously
Tool to assess contents of ARM surface meteorology network netCDF files
DOE Office of Scientific and Technical Information (OSTI.GOV)
Staudt, A.; Kwan, T.; Tichler, J.
The Atmospheric Radiation Measurement (ARM) Program, supported by the US Department of Energy, is a major program of atmospheric measurement and modeling designed to improve the understanding of processes and properties that affect atmospheric radiation, with a particular focus on the influence of clouds and the role of cloud radiative feedback in the climate system. The ARM Program will use three highly instrumented primary measurement sites. Deployment of instrumentation at the first site, located in the Southern Great Plains of the United States, began in May of 1992. The first phase of deployment at the second site, in the Tropical Western Pacific, is scheduled for late in 1995. The third site will be in the North Slope of Alaska and adjacent Arctic Ocean. To meet the scientific objectives of ARM, observations from the ARM sites are combined with data from other sources; these are called external data. Among these external data sets are surface meteorological observations from the Oklahoma Mesonet, a Kansas automated weather network, the Wind Profiler Demonstration Network (WPDN), and the National Weather Service (NWS) surface stations. Before combining these data with the Surface Meteorological Observations Station (SMOS) ARM data, it was necessary to assess the contents and quality of both the ARM and the external data sets. Since these data sets had previously been converted to netCDF format for use by the ARM Science Team, a tool was written to assess the contents of the netCDF files.
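The core of such a content assessment can be sketched as an inventory comparison. The variable names below are hypothetical SMOS-style fields, and in practice the "found" list would be read from the file itself (for example, via the netCDF4 library's `Dataset.variables` mapping); this sketch only shows the comparison step.

```python
def assess_contents(found, expected):
    """Compare the variables found in a data file against an expected
    inventory, reporting what is missing and what is unexpected."""
    missing = sorted(set(expected) - set(found))
    extra = sorted(set(found) - set(expected))
    return {"missing": missing, "extra": extra}

# Hypothetical variable lists for a surface meteorology file.
expected = ["base_time", "temp", "rh", "wspd", "wdir", "pres"]
found = ["base_time", "temp", "rh", "wspd", "precip"]

report = assess_contents(found, expected)
print(report)
```

Running the same check over every ARM and external file yields a per-file report of gaps and surprises, which is the kind of summary needed before merging data sets of differing provenance.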
2008-03-01
by plain fatigue and the process kept alternating, or finishing all fretting fatigue cycles first followed by plain fatigue...Phase Difference...component’s life. Figure 1.2 illustrates the process of combining fretting fatigue and plain fatigue using three parts. The first part of this figure
Pedrinha, Victor Feliz; Brandão, Juliana Melo da Silva; Pessoa, Oscar Faciola; Rodrigues, Patrícia de Almeida
2018-01-01
Advances in endodontics have enabled the evolution of file manufacturing processes, improving performance beyond that of conventional files. In the present study, systems manufactured using state-of-the-art methods and possessing special properties related to NiTi alloys ( i.e ., CM-Wire, M-Wire and R-Phase) were selected. The aim of this review was to provide a detailed analysis of the literature on the relationship between recently introduced NiTi files with different movement kinematics and shaping ability, apical extrusion of debris, and dentin defects in root canal preparations. From March 2016 to January 2017, electronic searches were conducted in the PubMed and SCOPUS databases for articles published since January 2010. In vitro studies performed on extracted human teeth and published in English were considered for this review. Based on the inclusion criteria, 71 papers were selected for the analysis of full-text copies. Specific analysis was performed on 45 articles describing the effects of reciprocating, continuous and adaptive movements on the WaveOne Gold, Reciproc, HyFlex CM and Twisted File Adaptive systems. A wide range of testing conditions and methodologies have been used to compare the systems. Due to the controversies among the results, the characteristics of the files used, such as their designs and alloys, are too inconsistent to determine the best approach.
NASA Astrophysics Data System (ADS)
Al-Mishwat, Ali T.
2016-05-01
PHASS99 is a FORTRAN program designed to retrieve and decode radiometric and other physical age information of igneous rocks contained in the international database IGBADAT (Igneous Base Data File). In the database, ages are stored in a proprietary format using mnemonic representations. The program can handle up to 99 ages per igneous rock specimen and caters to forty radiometric age systems. The radiometric age alphanumeric strings assigned to each specimen description in the database consist of four components: the numeric age and its exponential modifier, a four-character mnemonic method identification, a two-character mnemonic name of the analysed material, and the reference number in the rock group bibliography vector. For each specimen, the program searches for radiometric age strings, extracts them, parses them, decodes the different age components, and converts them to high-level English equivalents. IGBADAT and similarly-structured files are used for input. The output includes three files: a flat raw ASCII text file containing the retrieved radiometric age information, a generic spreadsheet-compatible file for importing data into spreadsheets, and an error file. PHASS99 builds on the older decoder program TSTPHA (Test Physical Age) and greatly expands its capabilities. PHASS99 is simple, user-friendly, fast, efficient, and does not require users to have knowledge of programming.
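The four-component decoding step can be sketched as below. Since the IGBADAT encoding is proprietary, the string layout, mnemonic codes, and lookup tables here are invented for illustration; only the four-part structure (numeric age with exponent, method mnemonic, material mnemonic, bibliography reference) follows the abstract.

```python
from collections import namedtuple

AgeRecord = namedtuple("AgeRecord", "age_ma method material ref")

# Hypothetical mnemonic tables; the real IGBADAT codes differ.
METHODS = {"KARG": "K-Ar", "RBSR": "Rb-Sr", "UPBZ": "U-Pb (zircon)"}
MATERIALS = {"WR": "whole rock", "BI": "biotite"}

def decode_age(s):
    """Decode a string like '1.25E2KARGWR03' into its four components:
    numeric age with exponential modifier, 4-char method mnemonic,
    2-char material mnemonic, and 2-digit bibliography reference.
    This fixed-width layout is an illustrative stand-in, not the
    proprietary IGBADAT format."""
    ref = int(s[-2:])                    # bibliography vector index
    material = MATERIALS[s[-4:-2]]       # analysed material
    method = METHODS[s[-8:-4]]           # radiometric age system
    age_ma = float(s[:-8])               # age in Ma, exponent included
    return AgeRecord(age_ma, method, material, ref)

print(decode_age("1.25E2KARGWR03"))
```

Converting the mnemonics to "high-level English equivalents", as PHASS99 does, then amounts to the dictionary lookups shown in `decode_age`.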
ZED- A LINE EDITOR FOR THE DEC VAX
NASA Technical Reports Server (NTRS)
Scott, P. J.
1994-01-01
The ZED editor for the DEC VAX is a simple, yet powerful line editor for text, program source code, and non-binary data. Line editors can be superior to screen editors in some cases, such as executing complex multiple or conditional commands, or editing over slow modem lines. ZED excels in the area of text processing by using procedure files. For example, such procedures can reformat a file of addresses or remove all comment lines from a FORTRAN program. In addition to command files, ZED also features versatile search qualifiers, global changes, conditionals, on-line help, hexadecimal mode, space compression, looping, logical combinations of search strings, journaling, visible control characters, and automatic detabbing. The ZED editor was originally developed at Cambridge University and has been continuously enhanced since 1976. Users of the Cambridge implementation have devised such elaborate ZED procedures as chess games, calculators, and programs for evaluating pi. This implementation of ZED strives to maintain the characteristics of the Cambridge editor. A complete ZED manual is included on the tape. ZED is written entirely in C for either batch or interactive execution on the DEC VAX under VMS 4.X and requires 80,896 bytes of memory. This program was released in 1988 and updated in 1989.
Kimbrow, Dustin R.
2014-01-01
Topographic survey data of areas on Dauphin Island on the Alabama coast were collected using a truck-mounted mobile terrestrial light detection and ranging system. This system is composed of a high frequency laser scanner in conjunction with an inertial measurement unit and a position and orientation computer to produce highly accurate topographic datasets. A global positioning system base station was set up on a nearby benchmark and logged vertical and horizontal position information during the survey for post-processing. Survey control points were also collected throughout the study area to determine residual errors. Data were collected 5 days after Hurricane Isaac made landfall in early September 2012 to document sediment deposits prior to clean-up efforts. Three data files in ASCII text format with the extension .xyz are included in this report, and each file is named according to both the acquisition date and the relative geographic location on Dauphin Island (for example, 20120903_Central.xyz). Metadata are also included for each of the files in both Extensible Markup Language with the extension .xml and ASCII text formats. These topographic data can be used to analyze the effects of storm surge on barrier island environments and also serve as a baseline dataset for future change detection analyses.
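Reading the report's ASCII `.xyz` files can be sketched as below. The whitespace-delimited three-column layout (easting, northing, elevation) is an assumption about the format; the actual files may carry headers or additional fields, and the sample coordinates are invented.

```python
def read_xyz(lines):
    """Parse ASCII .xyz topographic points, one 'x y z' triple per line.
    Lines with fewer than three whitespace-separated fields are skipped,
    which tolerates blank lines or short header rows."""
    points = []
    for line in lines:
        parts = line.split()
        if len(parts) >= 3:
            points.append(tuple(float(v) for v in parts[:3]))
    return points

# Hypothetical sample points (projected coordinates and elevation in meters).
sample = [
    "421873.12 3346750.44 1.87",
    "421874.03 3346751.10 1.92",
]
pts = read_xyz(sample)
print(len(pts), min(p[2] for p in pts), max(p[2] for p in pts))
```

A change-detection analysis of the kind the report anticipates would load two survey epochs this way and difference the elevations at matching locations.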
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-27
... considered, but were eliminated from detailed analysis include: conventional mining (whether by open pit or... Agencywide Documents and Management System (ADAMS), which provides text and image files of the NRC's public...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-12
... government's requirement for accessibility (508 PDF or HTML file). The awardee must provide descriptive text... schedule. Among the criteria used to evaluate the applications are indication of a clear understanding of...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-16
... files). The awardee must provide descriptive text interpreting all graphics, photos, graphs, and/or... implement a project of this size and scope. Review Considerations: Among the criteria used to evaluate the...
24 CFR 1715.50 - Advertising disclaimers; subdivisions registered and effective with HUD.
Code of Federal Regulations, 2011 CFR
2011-04-01
... disclaimer statement shall be displayed below the text of all printed material and literature used in... file with the Secretary. If the material or literature consists of more than one page, it shall appear...
24 CFR 1715.50 - Advertising disclaimers; subdivisions registered and effective with HUD.
Code of Federal Regulations, 2013 CFR
2013-04-01
... disclaimer statement shall be displayed below the text of all printed material and literature used in... file with the Secretary. If the material or literature consists of more than one page, it shall appear...