Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Torres, Aaron
2015-02-03
Techniques are provided for storing files in a parallel computing system using sub-files with semantically meaningful boundaries. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a plurality of sub-files. The method comprises the steps of obtaining a user specification of semantic information related to the file; providing the semantic information as a data structure description to a data formatting library write function; and storing the semantic information related to the file with one or more of the sub-files in one or more storage nodes of the parallel computing system. The semantic information provides a description of data in the file. The sub-files can be replicated based on semantically meaningful boundaries.
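A minimal sketch of the idea this abstract describes, assuming a Python/NumPy stand-in for the data formatting library: the application hands the write path a description of its record structure, and sub-files are cut on whole-record (semantically meaningful) boundaries, with that description stored alongside each sub-file. All names and the JSON sidecar format are illustrative, not from the patent.

```python
# Illustrative sketch only: sub-files split on whole-record boundaries,
# with the "semantic information" (field names/types) stored alongside.
import json
import numpy as np

record_dtype = np.dtype([("timestamp", "f8"), ("sensor_id", "i4"), ("value", "f8")])

def write_subfiles(records: np.ndarray, records_per_subfile: int, prefix: str) -> None:
    """Split a structured array into sub-files on record boundaries and
    store the structure description next to each sub-file."""
    for i in range(0, len(records), records_per_subfile):
        chunk = records[i:i + records_per_subfile]
        stem = f"{prefix}.{i // records_per_subfile:04d}"
        chunk.tofile(stem + ".dat")
        # Semantic description of one record travels with the sub-file.
        with open(stem + ".json", "w") as f:
            json.dump({"fields": record_dtype.descr, "records": len(chunk)}, f)

data = np.zeros(10_000, dtype=record_dtype)
write_subfiles(data, records_per_subfile=4096, prefix="run42")
```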
Tuning HDF5 subfiling performance on parallel file systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Byna, Suren; Chaarawi, Mohamad; Koziol, Quincey
Subfiling is a technique used on parallel file systems to reduce locking and contention issues when multiple compute nodes interact with the same storage target node. Subfiling provides a compromise between the single shared file approach that instigates the lock contention problems on parallel file systems and having one file per process, which results in generating a massive and unmanageable number of files. In this paper, we evaluate and tune the performance of the recently implemented subfiling feature in HDF5. Specifically, we explain the implementation strategy of the subfiling feature in HDF5, provide examples of using the feature, and evaluate and tune parallel I/O performance of this feature with the parallel file systems of the Cray XC40 system at NERSC (Cori), which include burst buffer storage and Lustre disk-based storage. We also evaluate I/O performance on the Cray XC30 system, Edison, at NERSC. Our results show a performance advantage of 1.2x to 6x with subfiling compared to writing a single shared HDF5 file. We present our exploration of configurations, such as the number of subfiles and the number of Lustre storage targets used in storing files, as optimization parameters to obtain superior I/O performance. Based on this exploration, we discuss recommendations for achieving good I/O performance as well as limitations of using the subfiling feature.
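HDF5's subfiling feature itself is configured through the library's virtual file driver layer; as a rough conceptual sketch only (not the actual HDF5 subfiling API), the grouping idea can be illustrated with mpi4py and an h5py build with parallel HDF5 support, where each group of ranks writes its slice of a global array to its own smaller file:

```python
# Conceptual sketch of subfiling (not HDF5's built-in subfiling VFD):
# ranks are grouped, and each group writes to its own smaller HDF5 file,
# trading one shared file for a handful of subfiles.
from mpi4py import MPI
import h5py
import numpy as np

comm = MPI.COMM_WORLD
ranks_per_subfile = 4                       # tunable, cf. the paper's exploration
group = comm.rank // ranks_per_subfile
subcomm = comm.Split(color=group, key=comm.rank)

local = np.full(1_000_000, comm.rank, dtype="f8")   # this rank's slice

# One shared file per group: fewer lock conflicts than a single shared
# file, far fewer files than file-per-process.
with h5py.File(f"output.{group:03d}.h5", "w", driver="mpio", comm=subcomm) as f:
    dset = f.create_dataset("data", (subcomm.size * local.size,), dtype="f8")
    offset = subcomm.rank * local.size
    dset[offset:offset + local.size] = local
```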
PASCAL Data Base File Description and Indexing Rules in Chemistry, Biology and Medicine.
ERIC Educational Resources Information Center
Gaillardin, R.; And Others
This report on the multidisciplinary PASCAL database describes the files and the indexing rules for chemistry, biology, and medicine. PASCAL deals with all aspects of chemistry within two subfiles whose combined yearly growth is about 100,000 references. The Biopascal file, organized in the two subfiles of Plant Science and Biology and Medicine,…
Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Torres, Aaron
2015-10-20
Techniques are provided for storing files in a parallel computing system using different resolutions. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a sub-file. The method comprises the steps of obtaining semantic information related to the file; generating a plurality of replicas of the file with different resolutions based on the semantic information; and storing the file and the plurality of replicas of the file in one or more storage nodes of the parallel computing system. The different resolutions comprise, for example, a variable number of bits and/or a different sub-set of data elements from the file. A plurality of the sub-files can be merged to reproduce the file.
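A toy illustration of the replica scheme in this abstract, assuming NumPy `.npy` files as the storage container (the patent does not specify one): the complete file is stored together with a reduced-precision replica (fewer bits per value) and a subsampled replica (a subset of data elements).

```python
# Illustrative only: full file plus two lower-resolution replicas.
import numpy as np

def store_with_replicas(path: str, data: np.ndarray) -> None:
    np.save(path + ".full.npy", data)                     # complete file
    np.save(path + ".f32.npy", data.astype(np.float32))   # fewer bits per value
    np.save(path + ".sub8.npy", data[::8])                # 1-in-8 subset of elements

data = np.random.default_rng(0).standard_normal(1 << 20)
store_with_replicas("checkpoint", data)
```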
ERIC Educational Resources Information Center
Center for Education Statistics (ED/OERI), Washington, DC.
The Financial Statistics machine-readable data file (MRDF) is a subfile of the larger Higher Education General Information Survey (HEGIS). It contains basic financial statistics for over 3,000 institutions of higher education in the United States and its territories. The data are arranged sequentially by institution, with institutional…
Reducing I/O variability using dynamic I/O path characterization in petascale storage systems
Son, Seung Woo; Sehrish, Saba; Liao, Wei-keng; ...
2016-11-01
In petascale systems with a million CPU cores, scalable and consistent I/O performance is becoming increasingly difficult to sustain, mainly because of I/O variability. This variability is caused by concurrently running processes/jobs competing for I/O or by a RAID rebuild when a disk drive fails. We present a mechanism that stripes across a selected subset of I/O nodes with the lightest workload at runtime to achieve the highest I/O bandwidth available in the system. In this paper, we propose a probing mechanism to enable application-level dynamic file striping to mitigate I/O variability. We also implement the proposed mechanism in the high-level I/O library that enables memory-to-file data layout transformation and allows transparent file partitioning using subfiling. Subfiling is a technique that partitions data into a set of smaller files and manages access to them, allowing the data to be treated as a single, regular file by users. Here, we demonstrate that our bandwidth probing mechanism can successfully identify temporally slower I/O nodes without noticeable runtime overhead. Experimental results on NERSC's systems also show that our approach isolates I/O variability effectively on shared systems and improves overall collective I/O performance with less variation.
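A toy sketch of the probing idea, with hypothetical mount points standing in for I/O nodes: time a small synchronous write to each candidate target and stripe the real output over the fastest subset.

```python
# Toy bandwidth probe; real implementations probe I/O nodes, not
# directories. Paths and sizes below are placeholders.
import os, time

def probe(targets, probe_bytes=4 << 20):
    block = os.urandom(probe_bytes)
    timings = []
    for t in targets:
        path = os.path.join(t, ".probe")
        start = time.perf_counter()
        with open(path, "wb") as f:
            f.write(block)
            f.flush()
            os.fsync(f.fileno())        # force the write to storage
        timings.append((time.perf_counter() - start, t))
        os.remove(path)
    return [t for _, t in sorted(timings)]     # fastest first

targets = [f"/mnt/ost{i}" for i in range(8)]   # hypothetical mount points
fastest = probe(targets)[:4]                   # stripe over the 4 lightest-loaded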
Storing files in a parallel computing system using list-based index to identify replica files
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faibish, Sorin; Bent, John M.; Tzelnic, Percy
Improved techniques are provided for storing files in a parallel computing system using a list-based index to identify file replicas. A file and at least one replica of the file are stored in one or more storage nodes of the parallel computing system. An index for the file comprises at least one list comprising a pointer to a storage location of the file and a storage location of the at least one replica of the file. The file comprises one or more of a complete file and one or more sub-files. The index may also comprise a checksum value for one or more of the file and the replica(s) of the file. The checksum value can be evaluated to validate the file and/or the file replica(s). A query can be processed using the list.
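A minimal sketch of such a list-based index, with an invented structure (the patent does not prescribe one): each entry lists the storage locations of the file and its replicas, plus a checksum used to validate whichever copy is read.

```python
# Illustrative replica index: a list of locations plus a checksum.
import hashlib
from dataclasses import dataclass

@dataclass
class FileIndex:
    locations: list[str]            # [primary, replica1, replica2, ...]
    sha256: str = ""

    def validate(self, data: bytes) -> bool:
        return hashlib.sha256(data).hexdigest() == self.sha256

index = FileIndex(locations=["/node3/f.dat", "/node7/f.dat"])
payload = b"example contents"
index.sha256 = hashlib.sha256(payload).hexdigest()

# A query walks the list until a copy validates.
assert index.validate(payload)
```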
Storing files in a parallel computing system based on user-specified parser function
Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Manzanares, Adam; Torres, Aaron
2014-10-21
Techniques are provided for storing files in a parallel computing system based on a user-specified parser function. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a parser from the distributed application for processing the plurality of files prior to storage; and storing one or more of the plurality of files in one or more storage nodes of the parallel computing system based on the processing by the parser. The plurality of files comprise one or more of a plurality of complete files and a plurality of sub-files. The parser can optionally store only those files that satisfy one or more semantic requirements of the parser. The parser can also extract metadata from one or more of the files and the extracted metadata can be stored with one or more of the plurality of files and used for searching for files.
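A hypothetical sketch of the parser hook described: the application supplies a callable that can reject files on semantic grounds and return metadata to be stored alongside accepted files. The interface is invented for illustration, not taken from the patent.

```python
# Illustrative parser hook: decide what gets stored and with what metadata.
import json, os, shutil

def my_parser(path: str):
    """Return extracted metadata, or None to skip storing this file."""
    size = os.path.getsize(path)
    if size == 0:                       # a semantic requirement: drop empty files
        return None
    return {"bytes": size, "kind": os.path.splitext(path)[1]}

def store(paths, parser, storage_dir):
    for p in paths:
        meta = parser(p)
        if meta is None:
            continue                    # parser rejected the file
        dest = os.path.join(storage_dir, os.path.basename(p))
        shutil.copy(p, dest)
        with open(dest + ".meta.json", "w") as f:
            json.dump(meta, f)          # searchable metadata stored alongside
```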
Storing files in a parallel computing system based on user or application specification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faibish, Sorin; Bent, John M.; Nick, Jeffrey M.
2016-03-29
Techniques are provided for storing files in a parallel computing system based on a user-specification. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a specification from the distributed application indicating how the plurality of files should be stored; and storing one or more of the plurality of files in one or more storage nodes of a multi-tier storage system based on the specification. The plurality of files comprise a plurality of complete files and/or a plurality of sub-files. The specification can optionally be processed by a daemon executing on one or more nodes in a multi-tier storage system. The specification indicates how the plurality of files should be stored, for example, identifying one or more storage nodes where the plurality of files should be stored.
Parallel file system with metadata distributed across partitioned key-value store
Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron
2017-09-19
Improved techniques are provided for storing metadata associated with a plurality of sub-files associated with a single shared file in a parallel file system. The shared file is generated by a plurality of applications executing on a plurality of compute nodes. A compute node implements a Parallel Log Structured File System (PLFS) library to store at least one portion of the shared file generated by an application executing on the compute node and metadata for the at least one portion of the shared file on one or more object storage servers. The compute node is also configured to implement a partitioned data store for storing a partition of the metadata for the shared file, wherein the partitioned data store communicates with partitioned data stores on other compute nodes using a message passing interface. The partitioned data store can be implemented, for example, using Multidimensional Data Hashing Indexing Middleware (MDHIM).
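A conceptual sketch of hash-partitioned metadata in the spirit of MDHIM (not its actual API), assuming mpi4py: each rank owns the keys that hash into its partition, and batched inserts are routed to their owners with an MPI collective.

```python
# Conceptual sketch only: partitioned key-value metadata over MPI.
from mpi4py import MPI
import zlib

comm = MPI.COMM_WORLD

def owner(key: str) -> int:
    # Stable hash (Python's built-in hash() is salted per process).
    return zlib.crc32(key.encode()) % comm.size

# Each rank records metadata for the portion of the shared file it wrote.
pending = [[] for _ in range(comm.size)]
key = f"shared.out/segment{comm.rank}"
pending[owner(key)].append((key, {"offset": comm.rank * 4096, "length": 4096}))

# Route every insert to the rank that owns its partition.
received = comm.alltoall(pending)
partition = {k: v for batch in received for k, v in batch}
# 'partition' now holds exactly the keys this rank is responsible for.
```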
Using Purpose-Built Functions and Block Hashes to Enable Small Block and Sub-file Forensics
2010-01-01
JPEGs. We tested precarve using the nps-2009-canon2-gen6 disk image (Garfinkel et al., 2009). The disk image was created with a 32 MB SD card and a ... analysis of n-grams in the fragment. [Figure residue: Fig. 1, usage of a 160 GB iPod as reported by iTunes 8.2.1 (6), as reported by the file system, and as computed with random sampling; the iTunes usage is actually in GiB, even though the program displays the "GB" label. Fig. 2.]
Artificial neural networks for modeling ammonia emissions released from sewage sludge composting
NASA Astrophysics Data System (ADS)
Boniecki, P.; Dach, J.; Pilarski, K.; Piekarska-Boniecka, H.
2012-09-01
The project was designed to develop, test and validate an original Neural Model describing ammonia emissions generated in composting sewage sludge. The composting mix was to include the addition of such selected structural ingredients as cereal straw, sawdust and tree bark. All of the created neural models contain 7 input variables (chemical and physical parameters of composting) and 1 output (ammonia emission). The data file was subdivided into three subfiles: the learning file (ZU) containing 330 cases, the validation file (ZW) containing 110 cases and the test file (ZT) containing 110 cases. The standard deviation ratios (for all 4 created networks) ranged from 0.193 to 0.218. For all of the selected models, the correlation coefficient reached the high values of 0.972-0.981. The results show that the predictive neural model describing ammonia emissions from composted sewage sludge is well suited for assessing such emissions. The sensitivity analysis of the model for the input variables of the process in question has shown that the key parameters describing ammonia emissions released in composting sewage sludge are pH and the carbon-to-nitrogen ratio (C:N).
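A rough analogue of the described setup, 7 inputs, 1 output, and a 330/110/110 split into learning (ZU), validation (ZW) and test (ZT) sets, using scikit-learn's MLP as a stand-in for the original neural tool and synthetic data in place of the composting measurements:

```python
# Stand-in reproduction of the data split and model shape; data is synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.uniform(size=(550, 7))          # 7 composting parameters (synthetic)
y = 2 * X[:, 0] + X[:, 1] + rng.normal(scale=0.05, size=550)  # emission proxy

X_zu, X_rest, y_zu, y_rest = train_test_split(X, y, train_size=330, random_state=0)
X_zw, X_zt, y_zw, y_zt = train_test_split(X_rest, y_rest, test_size=110, random_state=0)

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(X_zu, y_zu)
print("validation R^2:", model.score(X_zw, y_zw))
print("test R^2:      ", model.score(X_zt, y_zt))
```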
NASA Astrophysics Data System (ADS)
Baru, Chaitan; Nandigam, Viswanath; Krishnan, Sriram
2010-05-01
Increasingly, the geoscience user community expects modern IT capabilities to be available in service of their research and education activities, including the ability to easily access and process large remote sensing datasets via online portals such as GEON (www.geongrid.org) and OpenTopography (opentopography.org). However, serving such datasets via online data portals presents a number of challenges. In this talk, we will evaluate the pros and cons of alternative storage strategies for management and processing of such datasets using binary large object implementations (BLOBs) in database systems versus implementation in Hadoop files using the Hadoop Distributed File System (HDFS). The storage and I/O requirements for providing online access to large datasets dictate the need for declustering data across multiple disks, for capacity as well as bandwidth and response time performance. This requires partitioning larger files into a set of smaller files, and is accompanied by the concomitant requirement for managing large numbers of files. Storing these sub-files as BLOBs in a shared-nothing database implemented across a cluster provides the advantage that all the distributed storage management is done by the DBMS. Furthermore, subsetting and processing routines can be implemented as user-defined functions (UDFs) on these BLOBs and would run in parallel across the set of nodes in the cluster. On the other hand, there are both storage overheads and constraints, and software licensing dependencies created by such an implementation. Another approach is to store the files in an external filesystem with pointers to them from within database tables. The filesystem may be a regular UNIX filesystem, a parallel filesystem, or HDFS. In the HDFS case, HDFS would provide the file management capability, while the subsetting and processing routines would be implemented as Hadoop programs using the MapReduce model. Hadoop and its related software libraries are freely available. Another consideration is the strategy used for partitioning large data collections, and large datasets within collections, using round-robin, hash, or range partitioning methods. Each has different characteristics in terms of spatial locality of data and resultant degree of declustering of the computations on the data. Furthermore, we have observed that, in practice, there can be large variations in the frequency of access to different parts of a large data collection and/or dataset, thereby creating "hotspots" in the data. We will evaluate the ability of different approaches to deal effectively with such hotspots, and alternative strategies for mitigating them.
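The three partitioning strategies weighed in this talk can be stated as toy partition functions mapping a record or tile to one of n storage nodes; the trade-off between even spread and spatial locality (and hence hotspot behavior) is visible directly in the code. A sketch:

```python
# Toy versions of the three declustering strategies mentioned.
import zlib

def round_robin(seq_no: int, n: int) -> int:
    return seq_no % n                       # even spread, no spatial locality

def hash_part(key: str, n: int) -> int:
    return zlib.crc32(key.encode()) % n     # even spread, locality destroyed

def range_part(x: float, boundaries: list[float]) -> int:
    # Preserves spatial locality, but popular ranges become hotspots.
    for i, b in enumerate(boundaries):
        if x < b:
            return i
    return len(boundaries)

print(round_robin(17, 8), hash_part("tile_17", 8), range_part(42.5, [10.0, 50.0, 90.0]))
```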
Belief in an Afterlife: A National Survey.
ERIC Educational Resources Information Center
Klenow, Daniel J.; Bolin, Robert C.
1990-01-01
Examined factors affecting belief in afterlife. Data from 1978 subfile on National Opinion Research Center's General Social Survey showed that, controlling on frequency of church attendance and religious intensity, Protestants had highest incidence of belief in life after death, followed by Catholics, and then by Jews. Race, religion, and church…
SENSIT.FOR: A program for sensitometric reduction
NASA Astrophysics Data System (ADS)
Maury, A.; Marchal, J.
1984-09-01
A FORTRAN program for sensitometric evaluation of processes involved in hypering astronomical plates was written. It contains subroutines for full or quick description of the operation being done; choice of type of sensitogram; creation of 16 subfiles in the scan; density filtering; correction for area; specular PDS to diffuse ISO density calibration; and fog correction.
An Attempt to Compare EMCS with TOXLINE.
ERIC Educational Resources Information Center
Lorent, Jean-Pierre; Schirner, Hedi
1978-01-01
Sixteen queries, performed on the two systems, resulted in 247 unique relevant references from EMCS and 559 from TOXLINE. The overlap varied considerably, from 6 to 50 percent. It is remarkable that TOXLINE, with nine subfiles specially compiled for toxicology, can be supplemented by EMCS, which in the present study delivered 31 percent of all the…
MAIL LOG, program theory, volume 1. [Scout project automatic data system
NASA Technical Reports Server (NTRS)
Harris, D. K.
1979-01-01
The program theory used to obtain the software package, MAIL LOG, developed for the Scout Project Automatic Data System, SPADS, is described. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG data base consists of three main subfiles: (1) incoming and outgoing mail correspondence; (2) design information releases and reports; and (3) drawings and engineering orders. All subroutine descriptions, flowcharts, and MAIL LOG outputs are given and the data base design is described.
The Use of Binary Search Trees in External Distribution Sorting.
ERIC Educational Resources Information Center
Cooper, David; Lynch, Michael F.
1984-01-01
Suggests a new method of external distribution sorting, called tree partitioning, that involves use of a binary tree to split an incoming file into successively smaller partitions for internal sorting. The number of disc accesses during a tree-partitioning sort was calculated in a simulation using files extracted from British National Bibliography catalog files. (19…
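A toy version of the described tree-partitioning idea, using a sorted splitter array as an equivalent stand-in for the binary search tree: incoming records are routed to partitions small enough for internal sorting, and the sorted partitions concatenate into fully sorted output.

```python
# Toy tree-partitioning sort: splitters route records to partitions.
import bisect

def tree_partition(records, splitters):
    """splitters: sorted keys acting as the binary search tree."""
    partitions = [[] for _ in range(len(splitters) + 1)]
    for r in records:
        partitions[bisect.bisect_left(splitters, r)].append(r)
    return partitions

records = [9, 2, 14, 7, 1, 12, 5, 11]
parts = tree_partition(records, splitters=[5, 10])
result = [r for p in parts for r in sorted(p)]   # internal sort per partition
assert result == sorted(records)
```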
Privacy Impact Assessment for the External Compliance Program Discrimination Complaint Files
The External Compliance Program Discrimination Complaint Files System collects information on administrative complaints. Learn how this data will be collected in the system, how it will be used, access to the data, and the purpose of data collection.
1985-02-01
[OCR-garbled control-card listing: GET, NAST484/UN=SYSTEM. BEGIN, NAST464. PFL, 160000. REDUCE(-). LINK1. Figure A-1, Typical Control-Card ...] ... initiated via the LINK1 statement, in which the second term is the input data file. The permanent file name KMDM, shown in conjunction with local file
mrtailor: a tool for PDB-file preparation for the generation of external restraints.
Gruene, Tim
2013-09-01
Model building starting from, for example, a molecular-replacement solution with low sequence similarity introduces model bias, which can be difficult to detect, especially at low resolution. The program mrtailor removes low-similarity regions from a template PDB file according to sequence similarity between the target sequence and the template sequence and maps the target sequence onto the PDB file. The modified PDB file can be used to generate external restraints for low-resolution refinement with reduced model bias and can be used as a starting point for model building and refinement. The program can call ProSMART [Nicholls et al. (2012), Acta Cryst. D68, 404-417] directly in order to create external restraints suitable for REFMAC5 [Murshudov et al. (2011), Acta Cryst. D67, 355-367]. Both a command-line version and a GUI exist.
The Design and Usage of the New Data Management Features in NASTRAN
NASA Technical Reports Server (NTRS)
Pamidi, P. R.; Brown, W. K.
1984-01-01
Two new data management features are installed in the April 1984 release of NASTRAN. These two features are the Rigid Format Data Base and the READFILE capability. The Rigid Format Data Base is stored on external files in card image format and can be easily maintained and expanded by the use of standard text editors. This data base provides the user and the NASTRAN maintenance contractor with an easy means for making changes to a Rigid Format or for generating new Rigid Formats without unnecessary compilations and link editing of NASTRAN. Each Rigid Format entry in the data base contains the Direct Matrix Abstraction Program (DMAP), along with the associated restart, DMAP sequence subset and substructure control flags. The READFILE capability allows a user to reference an external secondary file from the NASTRAN primary input file and to read data from this secondary file. There is no limit to the number of external secondary files that may be referenced and read.
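As a generic illustration of a READFILE-style mechanism (not NASTRAN's actual implementation), a primary input file can be expanded by recursively splicing in the named secondary files:

```python
# Generic sketch of a READFILE-style include mechanism: any line of the
# form "READFILE <name>" in the primary input is replaced by the lines of
# that secondary file, recursively.
def expand(path: str) -> list[str]:
    out = []
    with open(path) as f:
        for line in f:
            if line.strip().upper().startswith("READFILE"):
                secondary = line.split(None, 1)[1].strip()
                out.extend(expand(secondary))   # no limit on nesting depth
            else:
                out.append(line)
    return out

# deck = expand("primary.dat")   # hypothetical input deck
```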
Checkpoint-Restart in User Space
DOE Office of Scientific and Technical Information (OSTI.GOV)
CRUISE implements a user-space file system that stores data in main memory and transparently spills over to other storage, like local flash memory or the parallel file system, as needed. CRUISE also exposes file contents for remote direct memory access, allowing external tools to copy files to the parallel file system in the background with reduced CPU interruption.
These tables may be defined within a separate ASCII text file (see Description and Format of BUFR Tables). ... the BUFR tables are usually read from an external ASCII text file (although it is also possible ...) ... reports. The ASCII text file is called /nwprod/fix/bufrtab.002 on the NCEP CCS machines.
Unbinding Transition of Probes in Single-File Systems
NASA Astrophysics Data System (ADS)
Bénichou, Olivier; Démery, Vincent; Poncet, Alexis
2018-02-01
Single-file transport, arising in quasi-one-dimensional geometries where particles cannot pass each other, is characterized by the anomalous dynamics of a probe, notably its response to an external force. In these systems, the motion of several probes submitted to different external forces, although relevant to mixtures of charged and neutral or active and passive objects, remains unexplored. Here, we determine how several probes respond to external forces. We rely on a hydrodynamic description of the symmetric exclusion process to obtain exact analytical results at long times. We show that the probes can either move as a whole, or separate into two groups moving away from each other. In between the two regimes, they separate with a different dynamical exponent, as t^{1/4}. This unbinding transition also occurs in several continuous single-file systems and is expected to be observable.
Özkocak, I; Taşkan, M M; Göktürk, H; Aytac, F; Karaarslan, E Şirin
2015-01-01
The aim of this study is to evaluate increases in temperature on the external root surface during endodontic treatment with different rotary systems. Fifty human mandibular incisors with a single root canal were selected. All root canals were instrumented using a size 20 Hedstrom file, and the canals were irrigated with 5% sodium hypochlorite solution. The samples were randomly divided into the following three groups of 15 teeth: Group 1: The OneShape Endodontic File no. 25; Group 2: The Reciproc Endodontic File no. 25; Group 3: The WaveOne Endodontic File no. 25. During the preparation, the temperature changes were measured in the middle third of the roots using a noncontact infrared thermometer. The temperature data were transferred from the thermometer to the computer and were observed graphically. Statistical analysis was performed using the Kruskal-Wallis analysis of variance at a significance level of 0.05. The increases in temperature caused by the OneShape file system were lower than those of the other files (P < 0.05). The WaveOne file showed the highest temperature increases. However, there were no significant differences between the Reciproc and WaveOne files. The single file rotary systems used in this study may be recommended for clinical use.
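The statistical step described, a Kruskal-Wallis test across the three groups at a 0.05 significance level, looks like this in Python with SciPy; the temperature values below are made up for illustration.

```python
# Kruskal-Wallis across three independent groups; data are invented.
from scipy.stats import kruskal

oneshape = [2.1, 2.4, 1.9, 2.2, 2.0]
reciproc = [3.5, 3.8, 3.2, 3.9, 3.6]
waveone  = [3.9, 4.2, 3.7, 4.4, 4.0]

stat, p = kruskal(oneshape, reciproc, waveone)
print(f"H = {stat:.2f}, p = {p:.4f}, significant at 0.05: {p < 0.05}")
```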
Investigation of At-Risk Patent Filings
ERIC Educational Resources Information Center
Livne, O.
2003-01-01
The author presents an investigation of patent-application filings made without external financial support, or "at-risk", based on inventions disclosed to the University of California from fiscal years 1991 to 2000. The success of the at-risk patent applications filed on these invention disclosures is examined from the perspective of…
Propagation of Data Dependency through Distributed Cooperating Processes
1988-09-01
[Table-of-contents fragments: The External Data Dependency Analyzer (EDDA), 12; The new EPL ..., 47; EDDA Patch Files for the Dining Philosophers Example (Figure 23), 49; Limitations ...] ... dependencies is evident. The EDDA derives external data dependencies by performing two levels of analysis
The DREO Elint Browser Utility (DEBU) reference manual
NASA Astrophysics Data System (ADS)
Ford, Barbara; Jones, David
1992-04-01
An electronic intelligence (ELINT) database browsing tool called DEBU has been developed that allows databases such as ELP, Kilting, EWIR, and AFEWC to be reviewed and analyzed from a user-friendly environment on a personal computer. DEBU's basic function is to allow users to examine the contents of user-selected subfiles of user-selected emitters of user-selected databases. DEBU augments this functionality with support for selecting (filtering) and combining subsets of emitters by user-selected attributes such as name, parameter type, or parameter value. DEBU provides facilities for examining histograms and x-y plots of selected parameters, for doing ambiguity analysis and mode level analysis, and for generating and printing a variety of reports. A manual is provided for users of DEBU, including descriptions and illustrations of menus and windows.
Mail LOG: Program operating instructions
NASA Technical Reports Server (NTRS)
Harris, D. K.
1979-01-01
The operating instructions for the software package, MAIL LOG, developed for the Scout Project Automatic Data System, SPADS, are provided. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG program has the following four modes of operation: (1) INPUT, putting new records into the data base; (2) REVISE, changing or modifying existing records in the data base; (3) SEARCH, finding special records existing in the data base; and (4) ARCHIVE, storing or putting away existing records in the data base. The output includes special printouts of records in the data base and results from the INPUT and SEARCH modes. The MAIL LOG data base consists of three main subfiles: incoming and outgoing mail correspondence; design information releases and reports; and drawings and engineering orders.
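A hypothetical sketch of the four-mode dispatch described (the original is FORTRAN on a PRIME 300; nothing below reflects its actual code):

```python
# Invented illustration of INPUT / REVISE / SEARCH / ARCHIVE dispatch
# over an in-memory record store.
records = {}
archive = {}

def input_mode(rec_id, rec):   records[rec_id] = rec
def revise_mode(rec_id, rec):  records[rec_id].update(rec)
def search_mode(pred):         return [r for r in records.values() if pred(r)]
def archive_mode(rec_id):      archive[rec_id] = records.pop(rec_id)

modes = {"INPUT": input_mode, "REVISE": revise_mode,
         "SEARCH": search_mode, "ARCHIVE": archive_mode}

modes["INPUT"]("DWG-001", {"subfile": "drawings", "title": "Engineering order 7"})
print(modes["SEARCH"](lambda r: r["subfile"] == "drawings"))
```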
2009-12-01
other services for early UNIX systems at Bell Labs. In many UNIX-based systems, the field added to the '/etc/passwd' file to carry GCOS ID information was ... charset, and external. struct options_main { /* Option flags */ opt_flags flags; /* Password files */ struct list_main *passwd; /* Password file ... object PASSWD. It is part of several other data structures. struct PASSWD { int id; char *login; char *passwd_hash; int UID
NASA Technical Reports Server (NTRS)
1993-01-01
All the options in the NASA VEGetation Workbench (VEG) make use of a database of historical cover types. This database contains results from experiments by scientists on a wide variety of different cover types. The learning system uses the database to provide positive and negative training examples of classes that enable it to learn distinguishing features between classes of vegetation. All the other VEG options use the database to estimate the error bounds involved in the results obtained when various analysis techniques are applied to the sample of cover type data that is being studied. In the previous version of VEG, the historical cover type database was stored as part of the VEG knowledge base. This database was removed from the knowledge base. It is now stored as a series of flat files that are external to VEG. An interface between VEG and these files was provided. The interface allows the user to select which files of historical data to use. The files are then read, and the data are stored in Knowledge Engineering Environment (KEE) units using the same organization of units as in the previous version of VEG. The interface also allows the user to delete some or all of the historical database units from VEG and load new historical data from a file. This report summarizes the use of the historical cover type database in VEG. It then describes the new interface to the files containing the historical data. It describes minor changes that were made to VEG to enable the externally stored database to be used. Test runs to test the operation of the new interface and also to test the operation of VEG using historical data loaded from external files are described. Task F was completed. A Sun cartridge tape containing the KEE and Common Lisp code for the new interface and the modified version of the VEG knowledge base was delivered to the NASA GSFC technical representative.
External-Compression Supersonic Inlet Design Code
NASA Technical Reports Server (NTRS)
Slater, John W.
2011-01-01
A computer code named SUPIN has been developed to perform aerodynamic design and analysis of external-compression, supersonic inlets. The baseline set of inlets includes axisymmetric pitot, two-dimensional single-duct, axisymmetric outward-turning, and two-dimensional bifurcated-duct inlets. The aerodynamic methods are based on low-fidelity analytical and numerical procedures. The geometric methods are based on planar geometry elements. SUPIN has three modes of operation: 1) generate the inlet geometry from an explicit set of geometry information, 2) size and design the inlet geometry and analyze the aerodynamic performance, and 3) compute the aerodynamic performance of a specified inlet geometry. The aerodynamic performance quantities include inlet flow rates, total pressure recovery, and drag. The geometry output from SUPIN includes inlet dimensions, cross-sectional areas, coordinates of planar profiles, and surface grids suitable for input to grid generators for analysis by computational fluid dynamics (CFD) methods. The input data file for SUPIN and the output file from SUPIN are text (ASCII) files. The surface grid files are output as formatted Plot3D or stereolithography (STL) files. SUPIN executes in batch mode and is available as a Microsoft Windows executable and Fortran95 source code with a makefile for Linux.
Research on the Perforating Algorithm Based on STL Files
NASA Astrophysics Data System (ADS)
Yuchuan, Han; Xianfeng, Zhu; Yunrui, Bai; Zhiwen, Wu
2018-04-01
In the process of making a medical personalized external fixation brace, the 3D data file should be perforated to increase air permeability and reduce weight. In this paper, a perforating algorithm for 3D STL files is proposed, which can cut holes, hollow out characters, and engrave decorative patterns on STL files. The perforating process is composed of three steps. Firstly, make the imaginary space surface intersect with the STL model, and reconstruct triangles at the intersection. Secondly, delete the triangular facets inside the space surface, making a hole in the STL model. Thirdly, triangulate the inner surface of the hole, thus completing the perforation. Choosing simple space equations, such as cylindrical and rectangular-prism equations, as perforating equations yields round and rectangular holes. Through the combination of different holes, lettering, decorative patterns, and other perforated results can be accomplished. Finally, an external fixation brace and an individual pen container were perforated using the algorithm, and the expected results were achieved, which proved the algorithm is feasible.
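A simplified sketch of the second step only, assuming the numpy-stl package: facets whose centroids fall inside a cylindrical perforating surface are deleted, without the intersection reconstruction and rim triangulation the full algorithm performs.

```python
# Simplified step two of the perforating algorithm: remove facets inside a
# cylinder. No re-intersection or rim triangulation is done here.
import numpy as np
from stl import mesh   # numpy-stl package

m = mesh.Mesh.from_file("brace.stl")          # hypothetical input model
centroids = m.vectors.mean(axis=1)            # (n, 3) triangle centroids

# Cylinder along z through (cx, cy) with radius r (illustrative values).
cx, cy, r = 10.0, 5.0, 2.0
keep = (centroids[:, 0] - cx) ** 2 + (centroids[:, 1] - cy) ** 2 > r ** 2

holed = mesh.Mesh(np.zeros(int(keep.sum()), dtype=mesh.Mesh.dtype))
holed.vectors[:] = m.vectors[keep]
holed.save("brace_perforated.stl")
```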
75 FR 166 - Postal Product Price Changes
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-04
... Notice includes three attachments: (1) A redacted version of the letter to the customer with the amended... contracts are based on objective, external factors and out of the Postal Service's discretion.\\2\\ Such objective, external factors are, in the case of the Global Direct contract filed in Docket No. CP2009-29...
ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (SILICON GRAPHICS VERSION)
NASA Technical Reports Server (NTRS)
Walters, D.
1994-01-01
The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. ELAS has as a standard the flexibility to process data elements exceeding 8 bits in length, including floating point (noninteger) elements and 16 or 32 bit integers. Thus it is able to analyze and process "non-standard" nonimage data. The VAX (ERL-10017) and Concurrent (ERL-10013) versions of ELAS 9.0 are written in FORTRAN and ASSEMBLER for DEC VAX series computers running VMS and Concurrent computers running MTM. The Sun (SSC-00019), Masscomp (SSC-00020), and Silicon Graphics (SSC-00021) versions of ELAS 9.0 are written in FORTRAN 77 and C-LANGUAGE for Sun4 series computers running SunOS, Masscomp computers running UNIX, and Silicon Graphics IRIS computers running IRIX. The Concurrent version requires at least 15 bit addressing and a direct memory access channel. The VAX and Concurrent versions of ELAS both require floating-point hardware, at least 1Mb of RAM, and approximately 70Mb of disk space. Both versions also require a COMTAL display device in order to display images. For the Sun, Masscomp, and Silicon Graphics versions of ELAS, the disk storage required is approximately 115Mb, and a minimum of 8Mb of RAM is required for execution. The Sun version of ELAS requires either the X-Window System Version 11 Revision 4 or Sun OpenWindows Version 2. The Masscomp version requires a GA1000 display device and the associated "gp" library. The Silicon Graphics version requires Silicon Graphics' GL library. ELAS display functions will not work with a monochrome monitor. 
The standard distribution medium for the VAX version (ERL10017) is a set of two 9-track 1600 BPI magnetic tapes in DEC VAX BACKUP format. This version is also available on a TK50 tape cartridge in DEC VAX BACKUP format. The standard distribution medium for the Concurrent version (ERL-10013) is a set of two 9-track 1600 BPI magnetic tapes in Concurrent BACKUP format. The standard distribution medium for the Sun version (SSC-00019) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Masscomp version, (SSC-00020) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Silicon Graphics version (SSC-00021) is a .25 inch streaming magnetic IRIS tape cartridge in UNIX tar format. Version 9.0 was released in 1991. Sun4, SunOS, and Open Windows are trademarks of Sun Microsystems, Inc. MIT X Window System is licensed by Massachusetts Institute of Technology.
ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (CONCURRENT VERSION)
NASA Technical Reports Server (NTRS)
Pearson, R. W.
1994-01-01
ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (SUN VERSION)
NASA Technical Reports Server (NTRS)
Walters, D.
1994-01-01
ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (MASSCOMP VERSION)
NASA Technical Reports Server (NTRS)
Walters, D.
1994-01-01
ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (DEC VAX VERSION)
NASA Technical Reports Server (NTRS)
Junkin, B. G.
1994-01-01
The NCAR Research Data Archive's Hybrid Approach for Data Discovery and Access
NASA Astrophysics Data System (ADS)
Schuster, D.; Worley, S. J.
2013-12-01
The NCAR Research Data Archive (RDA, http://rda.ucar.edu) maintains a variety of data discovery and access capabilities for its 600+ dataset collections to support the varying needs of a diverse user community. In-house developed and standards-based community tools offer services to more than 10,000 users annually. By number of users the largest group is external and accesses the RDA through web-based protocols; the internal NCAR HPC users are fewer in number, but typically access more data volume. This paper will detail the data discovery and access services maintained by the RDA to support both user groups, and show metrics that illustrate how the community is using the services. The distributed search capability enabled by standards-based community tools, such as Geoportal and an OAI-PMH access point that serves multiple metadata standards, provides pathways for external users to initially discover RDA holdings. From here, in-house developed web interfaces leverage primary discovery-level metadata databases that support keyword and faceted searches. Internal NCAR HPC users, or those familiar with the RDA, may go directly to the dataset collection of interest and refine their search based on rich file collection metadata. Multiple levels of metadata have proven to be invaluable for discovery within terabyte-sized archives composed of many atmospheric or oceanic levels, hundreds of parameters, and often numerous grid and time resolutions. Once users find the data they want, their access needs may vary as well. A THREDDS data server running on targeted dataset collections enables remote file access through OPeNDAP and other web-based protocols, primarily for external users. In-house developed tools give all users the capability to submit data subset extraction and format conversion requests through scalable, HPC-based delayed-mode batch processing. Users can monitor their RDA-based data processing progress and receive instructions on how to access the data when it is ready. External users are provided with RDA server-generated scripts to download the resulting request output. Similarly, they can download native dataset collection files or partial files using Wget or cURL based scripts supplied by the RDA server. Internal users can access the resulting request output or native dataset collection files directly from centralized file systems.
VizieR Online Data Catalog: Habitable zones around main-sequence stars (Kopparapu+, 2014)
NASA Astrophysics Data System (ADS)
Kopparapu, R. K.; Ramirez, R. M.; Schottelkotte, J.; Kasting, J. F.; Domagal-Goldman, S.; Eymet, V.
2017-08-01
Language: Fortran 90. Code tested under the following compilers/operating systems: ifort/CentOS Linux. Description of input data: no input necessary. Description of output data: output files HZs.dat and HZ_coefficients.dat. System requirements: no major system requirements; a Fortran compiler is necessary. Calls to external routines: none. Additional comments: none (1 data file).
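For context, the published parameterization behind these output tables (Kopparapu et al. 2014) expresses the effective stellar flux at each habitable-zone boundary as a quartic in the stellar temperature offset. A minimal sketch of that relation follows, with placeholder coefficients standing in for the tabulated values written to HZ_coefficients.dat.

```python
# Minimal sketch of the quartic habitable-zone parameterization of
# Kopparapu et al. (2014): Seff = S0 + a*T + b*T**2 + c*T**3 + d*T**4,
# with T = Teff - 5780 K. Coefficient values are placeholders; the real
# ones are tabulated in the code's HZ_coefficients.dat output file.
def seff(teff_k, s0, a, b, c, d):
    t = teff_k - 5780.0
    return s0 + a * t + b * t**2 + c * t**3 + d * t**4

def hz_distance_au(luminosity_lsun, seff_value):
    # Orbital distance of the boundary: d = sqrt(L / Seff), in AU for L in L_sun.
    return (luminosity_lsun / seff_value) ** 0.5
```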
Research by External Agencies or Individuals in AISD.
ERIC Educational Resources Information Center
Austin Independent School District, TX.
Abstracts of 34 research projects conducted in the Austin (Texas) Independent School District (AISD) are presented. A roster summarizing the projects by external researchers is also included. The roster shows, for each project, the project number, title, director, sponsor, schools where research was conducted, and whether a full report is on file.…
EPA is releasing a draft report "Next Generation Risk Assessment: I...
Standardizing Documentation of FITS Headers
NASA Astrophysics Data System (ADS)
Hourcle, Joseph
2014-06-01
Although the FITS file format [1] can be self-documenting, human intervention is often needed to read the headers and write the necessary transformations to make a given instrument team's data compatible with our preferred analysis package. External documentation may be needed to determine the meaning of coded values or unfamiliar acronyms. Different communities have interpreted keywords slightly differently. This has resulted in ambiguous fields such as DATE-OBS, which could be either the start or mid-point of an observation [2]. Conventions for placing units and additional information within the comments of a FITS card exist, but they require re-writing the FITS file. This operation can be quite costly for large archives, and should not be taken lightly when dealing with issues of digital preservation. We present what we believe is needed for a machine-actionable external file describing a given collection of FITS files. We seek comments from data producers, archives, and those writing software to help develop a single, useful, implementable standard. References: [1] Pence, et al. 2010, http://dx.doi.org/10.1051/0004-6361/201015362 [2] Rots, et al. (in preparation), http://hea-www.cfa.harvard.edu/~arots/TimeWCS/
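As a small illustration of the problem described, the sketch below inspects a FITS header with astropy (an assumption; the record names no toolkit). The file name is a placeholder, and note that nothing in the header itself says whether DATE-OBS marks the start or the mid-point of the observation: that is exactly what the proposed external description file would declare.

```python
# Minimal sketch: inspecting a FITS header without rewriting the file.
# "example.fits" is a placeholder; the interpretation of DATE-OBS (start
# vs. mid-point of the observation) cannot be recovered from the header
# alone, which motivates the machine-actionable external description.
from astropy.io import fits

with fits.open("example.fits") as hdul:
    hdr = hdul[0].header
    date_obs = hdr.get("DATE-OBS")      # ambiguous across communities
    note = hdr.comments["DATE-OBS"]     # units/notes sometimes live here
    print(date_obs, note)
```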
Restrepo-Pérez, Laura; Soler, Lluís; Martínez-Cisneros, Cynthia S.; Schmidt, Oliver G.
2014-01-01
We demonstrate that catalytic micromotors can be trapped in microfluidic chips containing chevron and heart-shaped structures. Despite the challenge presented by the reduced size of the traps, microfluidic chips with different trapping geometries can be fabricated via replica moulding. We prove that these microfluidic chips can capture micromotors without the need for any external mechanism to control their motion. PMID:24643940
Description of the process used to create the 1992 Hanford Mortality Study database
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, E.S.; Buchanan, J.A.; Holter, N.A.
1992-12-01
An updated and expanded database for the Hanford Mortality Study has been developed by PNL's Epidemiology and Biometry Department. The purpose of this report is to document this process. The primary sources of data were the Occupational Health History (OHH) files, maintained by the Hanford Environmental Health Foundation (HEHF) and including demographic data and job histories; the Hanford Mortality (HMO) files, also maintained by HEHF and including information on deaths of Hanford workers; the Occupational Radiation Exposure (ORE) files, maintained by PNL's Health Physics Department and containing data on external dosimetry; and a file of workers with confirmed internal depositions of radionuclides, also maintained by PNL's Health Physics Department. This report describes each of these files in detail, and also describes the many edits that were performed to address the consistency and accuracy of data within and between these files.
Risk Assessment Update: Russian Segment
NASA Technical Reports Server (NTRS)
Christiansen, Eric; Lear, Dana; Hyde, James; Bjorkman, Michael; Hoffman, Kevin
2012-01-01
BUMPER-II version 1.95j source code was provided to RSC-E and Khrunichev at the January 2012 MMOD TIM in Moscow. The MEMCxP and ORDEM 3.0 environments are implemented as external data files. NASA provided a sample ORDEM 3.0 ".key" & ".daf" environment file set for demonstration and for benchmarking the BUMPER-II v1.95j installation at the Jan-12 TIM. ORDEM 3.0 has been completed and is currently in beta testing. NASA will provide a preliminary set of ORDEM 3.0 ".key" & ".daf" environment files for the years 2012 through 2028. BUMPER output files produced using the new ORDEM 3.0 data files are intended for internal use only, not for requirements verification. Output files will contain the words "ORDEM FILE DESCRIPTION = PRELIMINARY VERSION: not for production." The projectile density term in many BUMPER-II ballistic limit equations will need to be updated. Cube demo scripts and output files delivered at the Jan-12 TIM have been updated for the new ORDEM 3.0 data files. Risk assessment results based on ORDEM 3.0 and MEM will be presented for the Russian Segment (RS) of ISS.
SIDS-to-ADF File Mapping Manual
NASA Technical Reports Server (NTRS)
McCarthy, Douglas; Smith, Matthew; Poirier, Diane; Smith, Charles A. (Technical Monitor)
2002-01-01
The "CFD General Notation System" (CGNS) consists of a collection of conventions, and conforming software, for the storage and retrieval of Computational Fluid Dynamics (CFD) data. It facilitates the exchange of data between sites and applications, and helps stabilize the archiving of aerodynamic data. This effort was initiated in order to streamline the procedures in exchanging data and software between NASA and its customers, but the goal is to develop CGNS into a National Standard for the exchange of aerodynamic data. The CGNS development team is comprised of members from Boeing Commercial Airplane Group, NASA-Ames, NASA-Langley, NASA-Lewis, McDonnell-Douglas Corporation (now Boeing-St. Louis), Air Force-Wright Lab., and ICEM-CFD Engineering. The elements of CGNS address all activities associated with the storage of data on external media and its movement to and from application programs. These elements include: 1) The Advanced Data Format (ADF) Database manager, consisting of both a file format specification and its I/O software, which handles the actual reading and writing of data from and to external storage media; 2) The Standard Interface Data Structures (SIDS), which specify the intellectual content of CFD data and the conventions governing naming and terminology; 3) The SIDS-to-ADF File Mapping conventions, which specify the exact location where the CFD data defined by the SIDS is to be stored within the ADF file(s); and 4) The CGNS Mid-level Library, which provides CFD-knowledgeable routines suitable for direct installation into application codes. The SIDS-toADF File Mapping Manual specifies the exact manner in which, under CGNS conventions, CFD data structures (the SIDS) are to be stored in (i.e., mapped onto) the file structure provided by the database manager (ADF). The result is a conforming CGNS database. Adherence to the mapping conventions guarantees uniform meaning and location of CFD data within ADF files, and thereby allows the construction of universal software to read and write the data.
JOVIAL (J73) to Ada Translator.
1982-06-01
editors, file managers, and other APSE [tools], the Translator will provide significant (though not total) automation of the conversion of J73 programs for use... global knowledge only of compool declarations; externals are not resolved until the compiled modules are linked. Creating a global data base during... translation (as shown in Figure 2-1) will require the job control, file management, and text editing capabilities which are provided by a typical
Outline of Toshiba Business Information Center
NASA Astrophysics Data System (ADS)
Nagata, Yoshihiro
Toshiba Business Information Center gathers and stores in-house and external business information used in common within the Toshiba Corp., and provides company-wide circulation, reference, and other services. The Center established a centralized information management system by employing decentralized computers, electronic file apparatus (30 cm laser discs), and other office automation equipment. Online retrieval through a LAN is available to search the stored documents, and a growing number of copying requests are processed from the electronic files. This paper describes the purpose of establishing the Center, its facilities, its management scheme, the systematization of the files, and the present status and plans of each information service.
An in vitro comparison of root canal transportation by Reciproc file with and without glide path.
Nazarimoghadam, Kiumars; Daryaeian, Mohammad; Ramazani, Nahid
2014-09-01
The aim of ideal canal preparation is to prevent iatrogenic aberrations such as transportation. The aim of this study was to evaluate root canal transportation by the Reciproc file with and without a glide path. Thirty acrylic-resin blocks with a curvature of 60° and size #10 (2% taper) were assigned to two groups (n = 15). In group 1, the glide path was created using stainless steel K-files sizes #10 and #15 at working length. In group 2, canals were prepared with the Reciproc file system at working length. Using digital imaging software (AutoCAD 2008), the pre-instrumentation and post-instrumentation digital images were superimposed, taking the landmarks as reference points. Then the radius of the internal and external curve of the specimens was calculated at three points, α, β, and γ (1 mm to apex as α, 3 mm to apex as β, and 5 mm to apex as γ). The data were statistically analyzed using the independent t-test and Mann-Whitney U test with SPSS version 16. The glide path was significant only for the external curve in the apical third of the canal, that is, 5 mm to apex (P = 0.005). In the other thirds, canal modification was not significant (P > 0.008). Canal transportation in the apical third of the canal seems to be significantly reduced when a glide path is performed using reciprocating files.
External Forces Affecting Higher Education. NACUBO Professional File. Vol. 7, No. 5.
ERIC Educational Resources Information Center
Bailey, Stephen K.
Out of the many external forces that influence college campuses, there are four that have had (or are likely to have) a major impact on the fortunes of higher education. The ways in which college and university officials and friends react to these forces can make an enormous difference to the future of higher education. The forces are: (1) Federal…
How to File a Complaint of Discrimination brochure
The U.S. Environmental Protection Agency's Office of Civil Rights (OCR) External Compliance and Complaints (ECC) Program is responsible for enforcing several civil rights laws which, together, prohibit discrimination.
NASA Astrophysics Data System (ADS)
Flomenbom, Ophir; Castañeda-Priego, Ramón; Peeters, François
2014-11-01
In this document, we present the Special Issue's projects; these include reviews and articles about mathematical solutions and formulations of single-file dynamics (SFD), as well as its computational modeling, experimental evidence, and value in explaining real-life phenomena. In particular, we introduce projects focusing on electron dynamics on liquid helium in channels with changing width, on the zig-zag configuration in files with longitudinal movement, on expanding files, on both heterogeneous and slow files, on files with external forces, and on the importance of the interaction potential shape for the particle dynamics along the file. Applications of SFD are of intrinsic value in the life sciences, biophysics, physics, and materials science, since they can explain a large diversity of many-body systems, e.g., biological channels, biological motors, membranes, crowding, and electron motion in proteins. These systems are discussed in all the projects that participate in this topical issue. This Special Issue should therefore intrigue, inspire, and scientifically advance young researchers as well as those scientists who actively work in this field.
2015-01-29
The Food and Drug Administration (FDA or the Agency) is issuing a final order to require the filing of premarket approval applications (PMA) for automated external defibrillator (AED) systems, which consist of an AED and those AED accessories necessary for the AED to detect and interpret an electrocardiogram and deliver an electrical shock (e.g., pad electrodes, batteries, adapters, and hardware keys for pediatric use).
Active Brownian particles escaping a channel in single file.
Locatelli, Emanuele; Baldovin, Fulvio; Orlandini, Enzo; Pierno, Matteo
2015-02-01
Active particles may happen to be confined in channels so narrow that they cannot overtake each other (single-file conditions). This interesting situation reveals nontrivial physical features as a consequence of the strong interparticle correlations developed in collective rearrangements. We consider a minimal two-dimensional model for active Brownian particles with the aim of studying the modifications introduced by activity with respect to the classical (passive) single-file picture. Depending on whether their motion is dominated by translational or rotational diffusion, we find that active Brownian particles in single file may arrange into clusters that are continuously merging and splitting (active clusters) or merely reproduce passive-motion paradigms, respectively. We show that activity conveys to self-propelled particles a strategic advantage for trespassing narrow channels against external biases (e.g., the gravitational field).
Securing the AliEn File Catalogue - Enforcing authorization with accountable file operations
NASA Astrophysics Data System (ADS)
Schreiner, Steffen; Bagnasco, Stefano; Sankar Banerjee, Subho; Betev, Latchezar; Carminati, Federico; Vladimirovna Datskova, Olga; Furano, Fabrizio; Grigoras, Alina; Grigoras, Costin; Mendez Lorenzo, Patricia; Peters, Andreas Joachim; Saiz, Pablo; Zhu, Jianlin
2011-12-01
The AliEn Grid Services, as operated by the ALICE Collaboration in its global physics analysis grid framework, is based on a central File Catalogue together with a distributed set of storage systems and the possibility to register links to external data resources. This paper describes several identified vulnerabilities in the AliEn File Catalogue access protocol regarding fraud and unauthorized file alteration and presents a more secure and revised design: a new mechanism, called LFN Booking Table, is introduced in order to keep track of access authorization in the transient state of files entering or leaving the File Catalogue. Due to a simplification of the original Access Envelope mechanism for xrootd-protocol-based storage systems, fundamental computational improvements of the mechanism were achieved as well as an up to 50% reduction of the credential's size. By extending the access protocol with signed status messages from the underlying storage system, the File Catalogue receives trusted information about a file's size and checksum and the protocol is no longer dependent on client trust. Altogether, the revised design complies with atomic and consistent transactions and allows for accountable, authentic, and traceable file operations. This paper describes these changes as part and beyond the development of AliEn version 2.19.
Pereira, Andre; Atri, Mostafa; Rogalla, Patrik; Huynh, Thien; O'Malley, Martin E
2015-11-01
The value of a teaching case repository in radiology training programs is immense. The allocation of resources for putting one together is a complex issue, given the factors that have to be coordinated: hardware, software, infrastructure, administration, and ethics. Costs may be significant, and cost-effective solutions are desirable. We chose the Medical Imaging Resource Center (MIRC) to build our teaching file; it is offered by RSNA for free. For the hardware, we chose the Raspberry Pi, developed by the Raspberry Pi Foundation: a small control board designed as a low-cost computer for schools and also used in alternative projects such as robotics and environmental data collection. Its performance and reliability as a file server were unknown to us. For the operating system, we chose Raspbian, a variant of Debian Linux, along with Apache (web server), MySQL (database server), and PHP, which enhance the functionality of the server. A USB hub and an external hard drive completed the setup. Installation of the software was smooth. The Raspberry Pi handled very well the task of hosting the teaching file repository for our division. Uptime was logged at 100%, and loading times were similar to other MIRC sites available online. We set up two servers (one for backup), each costing just below $200.00 including external storage and the USB hub. It is feasible to run RSNA's MIRC off a low-cost control board (Raspberry Pi). Performance and reliability are comparable to full-size servers for the intended purpose of hosting a teaching file within an intranet environment.
RCHILD - an R-package for flexible use of the landscape evolution model CHILD
NASA Astrophysics Data System (ADS)
Dietze, Michael
2014-05-01
Landscape evolution models provide powerful approaches to numerically assess earth surface processes, to quantify rates of landscape change, infer sediment transfer rates, estimate sediment budgets, investigate the consequences of changes in external drivers on a geomorphic system, provide spatio-temporal interpolations between known landscape states, or test conceptual hypotheses. CHILD (Channel-Hillslope Integrated Landscape Development Model) is one of the most-used models of landscape change, particularly in the context of interacting tectonic and geomorphic processes. Running CHILD from the command line and working with the model output can be a rather awkward task (static model control via a text input file, only numeric output in text files). The package RCHILD is a collection of functions for the free statistical software R that helps in using CHILD in a flexible, dynamic, and user-friendly way. The comprised functions allow creating maps, real-time scenes, animations, and further thematic plots from model output. The model input files can be modified dynamically and, hence, (feedback-related) changes in external factors can be implemented iteratively. Output files can be written to common formats that can be readily imported into standard GIS software. This contribution presents the basic functionality of the model CHILD as visualised and modified by the package. A rough overview of the available functions is given. Application examples help to illustrate the great potential of numeric modelling of geomorphologic processes.
NASA Technical Reports Server (NTRS)
Godfrey, Gary S.
2003-01-01
This project comprises an animation of the orbiter's mating to the external tank, an animation of the OMS pod installation on the orbiter, and a simulation of the landing gear mechanism at the Kennedy Space Center. A detailed storyboard was created to reflect each animation or simulation. Solid models were collected and translated into Pro/Engineer's prt and asm formats. These solid models included computer files of the orbiter, external tank, solid rocket booster, mobile launch platform, transporter, vehicle assembly building, OMS pod fixture, and landing gear. A repository of the above solid models was established. These solid models were translated into several formats. This repository contained the following files: stl for stereolithography, stp for neutral file work, shrinkwrap for compression, tiff for Photoshop work, jpeg for Internet use, and prt and asm for Pro/Engineer use. Solid models were created of the material handling sling, bay 3 platforms, and orbiter contact points. Animations were developed using mechanisms to reflect each storyboard. Every effort was made to build all models technically correct for engineering use. The result was an animated routine that could be used by NASA for training material handlers and uncovering engineering safety issues.
Griss, Johannes; Reisinger, Florian; Hermjakob, Henning; Vizcaíno, Juan Antonio
2012-03-01
Here we present the jmzReader library: a collection of Java application programming interfaces (APIs) to parse the most commonly used peak list and XML-based mass spectrometry (MS) data formats: DTA, MS2, MGF, PKL, mzXML, mzData, and mzML (based on the already existing API jmzML). The library is optimized to be used in conjunction with mzIdentML, the recently released standard data format for reporting protein and peptide identifications, developed by the HUPO Proteomics Standards Initiative (PSI). mzIdentML files do not contain spectrum data themselves but contain references to different kinds of external MS data files. As a key functionality, all parsers implement a common interface that supports the various methods used by mzIdentML to reference external spectra. Thus, when developing software for mzIdentML, programmers no longer have to support multiple MS data file formats but only this one interface. The library (which includes a viewer) is open source and, together with detailed documentation, can be downloaded from http://code.google.com/p/jmzreader/. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
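To make the parsing task concrete, here is a minimal reader for MGF, one of the peak-list formats listed above. This is an illustrative Python sketch, not jmzReader's Java API, and it assumes well-formed BEGIN IONS/END IONS blocks.

```python
# Illustrative only: a minimal MGF (Mascot Generic Format) reader, one of the
# peak-list formats the jmzReader library handles. Not jmzReader's API; just
# a sketch of the parsing task such libraries standardize behind one interface.
def read_mgf(path):
    spectra, current = [], None
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line == "BEGIN IONS":
                current = {"params": {}, "peaks": []}
            elif line == "END IONS":
                spectra.append(current)
                current = None
            elif current is not None and "=" in line:
                key, _, value = line.partition("=")   # e.g. TITLE=, PEPMASS=
                current["params"][key] = value
            elif current is not None and line:
                mz, intensity = line.split()[:2]      # "m/z intensity" pairs
                current["peaks"].append((float(mz), float(intensity)))
    return spectra
```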
The Information Quality Act: OMB’s Guidance and Initial Implementation
2004-08-19
Fiscal Year 2001. CRS-3 3 The Chamber of Commerce describes itself on its website as the world's largest not-for-profit business federation. See [http...resuscitation and the use of automated external defibrillators. OSHA agreed to do so. In another case, the Chamber of Commerce requested that EPA revise the...decision — the Department of Justice (DOJ) filed a brief recommending the dismissal of a lawsuit filed under the IQA by the Chamber of Commerce and the Salt
The TORSED method for construction of TORT boundary sources from external DORT flux files
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rhoades, W.A.
1993-08-01
The TORSED method provides a means of coupling cylindrical two-dimensional DORT fluxes or fluences to a three-dimensional TORT calculation in Cartesian geometry through construction of external boundary sources for TORT. This can be important for several reasons. The two-dimensional environment may be too large for TORT simulation. The two-dimensional environment may be truly cylindrical in nature, and thus better treated in that geometry. It may be desired to use a single environment calculation to study numerous local perturbations. In Section I the TORSED code is described in detail and the diverse demonstration problems that accompany the code distribution are discussed. In Section II, an updated discussion of the VISA code is given. VISA is required to preprocess the DORT files for use in TORSED. In Section III, the references are listed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hay, Tristan R.; Rishel, Jeremy P.
2013-09-30
The Air Pollutant Graphical Environmental Monitoring System (APGEMS) is used by the Hanford Emergency Operations Center (EOC) to provide refined plume modeling of releases involving radionuclides. The dose conversion factors (DCFs) used by APGEMS to convert air concentration to dose are stored in a file called HUDUFACT.dat; the DCFs are based primarily on ICRP 30, compiled in the late 1980s. This report updates the DCFs using more recent values reported in the Environmental Protection Agency's (EPA's) Federal Guidance Reports (FGR) 12 and 13. FGR 12 provides external exposure (air submersion) DCFs for radionuclides in air; FGR 13 provides DCFs for radionuclides from inhalation. DCFs were updated for only those radionuclides listed in the original HUDUFACT.dat file. Since FGR 13 provides inhalation dose conversion factors as a function of age, revised DCF files were created for APGEMS for each age group. The "adult" DCF file is the most relevant to compare to the original DCF file being used in APGEMS; these DCF values are compared in this report.
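As a rough illustration of how DCFs enter a dose estimate, the sketch below multiplies a time-integrated air concentration by a DCF, adding a breathing rate for the inhalation pathway. All numeric values are placeholders, not numbers from FGR 12/13 or HUDUFACT.dat.

```python
# Sketch of how dose conversion factors (DCFs) turn air concentration into
# dose, in the spirit of APGEMS' HUDUFACT.dat lookup. Values are placeholders.
BREATHING_RATE = 3.3e-4   # m^3/s, nominal adult rate (assumed, not from FGR 13)

def inhalation_dose_sv(integrated_conc_bq_s_per_m3, dcf_sv_per_bq):
    # Inhaled activity (Bq) = time-integrated concentration * breathing rate,
    # then the inhalation DCF converts activity to committed dose.
    return integrated_conc_bq_s_per_m3 * BREATHING_RATE * dcf_sv_per_bq

def submersion_dose_sv(integrated_conc_bq_s_per_m3, dcf_sv_m3_per_bq_s):
    # External (air submersion) dose is a direct DCF multiplication.
    return integrated_conc_bq_s_per_m3 * dcf_sv_m3_per_bq_s
```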
Madanecki, Piotr; Bałut, Magdalena; Buckley, Patrick G; Ochocka, J Renata; Bartoszewski, Rafał; Crossman, David K; Messiaen, Ludwine M; Piotrowski, Arkadiusz
2018-01-01
High-throughput technologies generate considerable amounts of data which often require bioinformatic expertise to analyze. Here we present the High-Throughput Tabular Data Processor (HTDP), a platform-independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering, and converting of data that is produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expressions, or command-line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease-predisposing variants in next generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merging, reduction, and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility, in terms of input file handling, provides long-term potential functionality in high-throughput analysis pipelines, as the program is not limited by the currently existing applications and data formats. HTDP is available as open source software (https://github.com/pmadanecki/htdp).
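HTDP itself is a GUI-driven Java program; to illustrate the kind of merge/filter task it automates, here is an equivalent sketch in Python with pandas. File names and column names are hypothetical.

```python
# Not HTDP itself (a GUI Java tool): a pandas sketch of the same kind of
# merge/filter on character-delimited column data, driven by an external
# criteria file. File and column names are hypothetical.
import pandas as pd

variants = pd.read_csv("variants.tsv", sep="\t")
criteria = pd.read_csv("external_criteria.tsv", sep="\t")  # itemized conditions

# Keep only variants whose gene appears in the external criteria file.
filtered = variants.merge(criteria[["gene"]].drop_duplicates(), on="gene")
filtered.to_csv("filtered_variants.tsv", sep="\t", index=False)
```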
Ground Processing of Data From the Mars Exploration Rovers
NASA Technical Reports Server (NTRS)
Wright, Jesse; Sturdevant, Kathryn; Noble, David
2006-01-01
A computer program implements the Earth side of the protocol that governs the transfer of data files generated by the Mars Exploration Rovers. It also provides tools for viewing data in these files and integrating data-product files into automated and manual processes. It reconstitutes files from telemetry data packets. Even if only one packet is received, metadata provide enough information to enable this program to identify and use partial data products. This software can generate commands to acknowledge received files and retransmit missed parts of files, or it can feed a manual process to make decisions about retransmission. The software uses an Extensible Markup Language (XML) data dictionary to provide a generic capability for displaying files of basic types, and uses external "plug-in" application programs to provide more sophisticated displays. This program makes data products available with very low latency, and can trigger automated actions when complete or partial products are received. The software is easy to install and use. The only system requirement for installing the software is a Java J2SE 1.4 platform. Several instances of the software can be executed simultaneously on the same machine.
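A generic sketch of the reconstitution step described above: payloads are placed at their offsets and uncovered bytes are counted so that a retransmission request, or a usable partial product, can be produced. The packet fields here are assumptions for illustration, not the actual MER downlink protocol.

```python
# Generic sketch of reconstituting a file from telemetry packets, in the
# spirit described above. Packet layout ("offset" + "payload" fields) and
# the known total length are assumptions, not the real MER protocol.
def reassemble(packets, total_length):
    data = bytearray(total_length)
    covered = bytearray(total_length)          # 1 where a byte has arrived
    for pkt in packets:                        # pkt: {"offset": int, "payload": bytes}
        off = pkt["offset"]
        data[off:off + len(pkt["payload"])] = pkt["payload"]
        covered[off:off + len(pkt["payload"])] = b"\x01" * len(pkt["payload"])
    missing = covered.count(0)                 # bytes to acknowledge as missing
    return bytes(data), missing                # partial products remain usable
```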
Hu, Ding; Xie, Shuqun; Yu, Donglan; Zheng, Zhensheng; Wang, Kuijian
2010-04-01
The development of an external counterpulsation (ECP) local area network system and an extensible markup language (XML)-based remote ECP medical information system conforming to the Digital Imaging and Communications in Medicine (DICOM) standard has been improving the digital interchangeability and shareability of ECP data. However, ECP therapy is a continuous, long-duration supervised process which generates a large volume of waveform data. In order to reduce the storage space and improve the transmission efficiency, the waveform data in the normative format of ECP data files have to be compressed. In this article, we introduce a compression algorithm combining template matching with an improved quick fitting of linear approximation distance thresholding (LADT), tailored to the characteristics of the enhanced external counterpulsation (EECP) waveform signal. The DICOM standard is used as the storage and transmission standard to make our system compatible with hospital information systems. According to the rules of transfer syntaxes, we defined a private transfer syntax for one-dimensional compressed waveform data and stored the EECP data in a DICOM file. Testing results indicate that the compressed, normative data can be correctly transmitted and displayed between EECP workstations in our EECP laboratory.
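For intuition, a simplified LADT-style compressor is sketched below: samples are discarded while every intermediate point stays within a distance threshold of the line joining the last kept sample to the current candidate. This is a naive illustration of the general idea, not the article's improved quick-fitting algorithm or its template matching.

```python
# Naive sketch of linear-approximation-distance-thresholding (LADT) style
# piecewise-linear waveform compression. Illustrative only; the article's
# improved quick-fitting variant is more efficient than this O(n^2) loop.
def ladt_compress(samples, threshold):
    kept, anchor = [0], 0                      # indices of retained samples
    for j in range(2, len(samples)):
        y0, yj = samples[anchor], samples[j]
        for k in range(anchor + 1, j):         # test all intermediate samples
            interp = y0 + (yj - y0) * (k - anchor) / (j - anchor)
            if abs(samples[k] - interp) > threshold:
                kept.append(j - 1)             # previous sample ends the segment
                anchor = j - 1
                break
    if kept[-1] != len(samples) - 1:
        kept.append(len(samples) - 1)          # always retain the last sample
    return [(i, samples[i]) for i in kept]     # (index, value) pairs to store
```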
NASA Technical Reports Server (NTRS)
2008-01-01
The Aquarius Radiometer, a subsystem of the Aquarius Instrument, required a data acquisition ground system to support calibration and radiometer performance assessment. To support calibration and compose performance assessments, we developed an automated system which uploaded raw data to an FTP server and saved raw and processed data to a database. This paper details the overall functionality of the Aquarius Instrument Science Data System (ISDS) and the individual electrical ground support equipment (EGSE) which produced data files that were infused into the ISDS. Real-time EGSEs include an ICDS simulator, calibration GSE, a LabVIEW-controlled power supply, and a chamber data acquisition system. The ICDS simulator serves as the test conductor's primary workstation, collecting radiometer housekeeping (HK) and science data and passing commands and HK telemetry collection requests to the radiometer. The calibration GSE (Radiometer Active Test Source) provides a choice from multiple source targets for the radiometer's external calibration. The power supply GSE, controlled by LabVIEW, provides real-time voltage and current monitoring of the radiometer. Finally, the chamber data acquisition system produces data reflecting chamber vacuum pressure, thermistor temperatures, AVG, and watts. Each GSE system produces text-based data files every two to six minutes and automatically copies the data files to the central archiver PC. The archiver PC stores the data files, schedules automated uploads of these files to an external FTP server, and accepts requests to copy all data files to the ISDS for offline data processing and analysis. The Aquarius Radiometer ISDS contains PHP and MATLAB programs to parse, process, and save all data to a MySQL database. Analysis tools (MATLAB programs) in the ISDS are capable of displaying radiometer science, telemetry, and auxiliary data in near real time, as well as performing data analysis and producing automated performance assessment reports of the Aquarius Radiometer.
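The automated upload step described above can be pictured with Python's standard ftplib; host, credentials, and paths below are placeholders, and the actual ISDS used PHP/MATLAB tooling rather than this sketch.

```python
# Sketch of an automated FTP upload step like the archiver PC's, using the
# Python standard library. Host, credentials, and paths are placeholders.
import ftplib
from pathlib import Path

def upload_new_files(local_dir, host, user, password, remote_dir):
    with ftplib.FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd(remote_dir)
        for path in Path(local_dir).glob("*.txt"):   # EGSEs emit text files
            with open(path, "rb") as fh:
                ftp.storbinary(f"STOR {path.name}", fh)
```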
Survey on Security Issues in File Management in Cloud Computing Environment
NASA Astrophysics Data System (ADS)
Gupta, Udit
2015-06-01
Cloud computing has pervaded every aspect of information technology in the past decade. With the advent of cloud networks, it has become easier to process the plethora of data generated by various devices in real time. The privacy of users' data is maintained by data centers around the world, and hence it has become feasible to operate on that data from lightweight portable devices. But with ease of processing comes the security aspect of the data. One such security aspect is secure file transfer, either internally within a cloud or externally from one cloud network to another. File management is central to cloud computing, and it is paramount to address the security concerns which arise out of it. This survey paper aims to elucidate the various protocols which can be used for secure file transfer and to analyze the ramifications of using each protocol.
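As one concrete example of the protocols such a survey covers, SFTP tunnels file transfers over SSH. A minimal sketch with the paramiko library follows; the host, account, and paths are placeholders.

```python
# One concrete option for secure file transfer between cloud hosts: SFTP over
# SSH via the paramiko library. Host, key, and paths are placeholders.
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # pin host keys in production
client.connect("cloud.example.com", username="svc", key_filename="id_ed25519")
sftp = client.open_sftp()
sftp.put("local/data.bin", "/incoming/data.bin")   # encrypted in transit
sftp.close()
client.close()
```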
Ioannidis, Vassilios; van Nimwegen, Erik; Stockinger, Heinz
2016-01-01
ISMARA (ismara.unibas.ch) automatically infers the key regulators and regulatory interactions from high-throughput gene expression or chromatin state data. However, given the large sizes of current next generation sequencing (NGS) datasets, data uploading times are a major bottleneck. Additionally, for proprietary data, users may be uncomfortable with uploading entire raw datasets to an external server. Both of these problems could be alleviated by providing a means by which users could pre-process their raw data locally, transferring only a small summary file to the ISMARA server. We developed a stand-alone client application that pre-processes large input files (RNA-seq or ChIP-seq data) on the user's computer for performing ISMARA analysis in a completely automated manner, including uploading of small processed summary files to the ISMARA server. This reduces file sizes by up to a factor of 1000, and upload times from many hours to mere seconds. The client application is available from ismara.unibas.ch/ISMARA/client. PMID:28232860
Software Security Knowledge: CWE. Knowing What Could Make Software Vulnerable to Attack
2011-05-01
...Buffer • CWE-642: External Control of Critical State Data • CWE-73: External Control of File Name or Path • CWE-426: Untrusted Search Path • CWE-94: Failure to Control Generation of Code (aka 'Code Injection') • CWE-494: Download of Code Without Integrity Check • CWE-404: Improper Resource
Trick Simulation Environment 07
NASA Technical Reports Server (NTRS)
Lin, Alexander S.; Penn, John M.
2012-01-01
The Trick Simulation Environment is a generic simulation toolkit used for constructing and running simulations. This release includes a Monte Carlo analysis simulation framework and a data analysis package. It produces all auto documentation in XML. Also, the software is capable of inserting a malfunction at any point during the simulation. Trick 07 adds variable server output options and error messaging and is capable of using and manipulating wide characters for international support. Wide character strings are available as a fundamental type for variables processed by Trick. A Trick Monte Carlo simulation uses a statistically generated, or predetermined, set of inputs to iteratively drive the simulation. Also, there is a framework in place for optimization and solution finding where developers may iteratively modify the inputs per run based on some analysis of the outputs. The data analysis package is capable of reading data from external simulation packages such as MATLAB and Octave, as well as the common comma-separated values (CSV) format used by Excel, without the use of external converters. The file formats for MATLAB and Octave were obtained from their documentation sets, and Trick maintains generic file readers for each format. XML tags store the fields in the Trick header comments. For header files, XML tags for structures and enumerations, and the members within are stored in the auto documentation. For source code files, XML tags for each function and the calling arguments are stored in the auto documentation. When a simulation is built, a top level XML file, which includes all of the header and source code XML auto documentation files, is created in the simulation directory. Trick 07 provides an XML to TeX converter. The converter reads in header and source code XML documentation files and converts the data to TeX labels and tables suitable for inclusion in TeX documents. A malfunction insertion capability allows users to override the value of any simulation variable, or call a malfunction job, at any time during the simulation. Users may specify conditions, use the return value of a malfunction trigger job, or manually activate a malfunction. The malfunction action may consist of executing a block of input file statements in an action block, setting simulation variable values, call a malfunction job, or turn on/off simulation jobs.
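As a small illustration of the data analysis side described above, a Trick-style CSV data log can be consumed with nothing more than the standard csv module. The file name and variable names below are hypothetical.

```python
# Illustrative sketch only: reading a CSV data log of the kind Trick's data
# analysis package consumes. The file name and column (variable) names are
# hypothetical, not guaranteed Trick conventions.
import csv

with open("log_vehicle.csv", newline="") as fh:
    rows = list(csv.DictReader(fh))            # header row names the variables

time = [float(r["sys.exec.out.time"]) for r in rows]   # assumed column name
alt = [float(r["dyn.vehicle.altitude"]) for r in rows] # assumed column name
print(f"{len(rows)} samples, final altitude {alt[-1]:.1f}")
```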
Beam Propagator for Weather Radars, Modules 1 and 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ortega, Edwin Campos
2013-10-08
This program simulates the beam propagation of weather radar pulses under particular and realistic atmospheric conditions (without using the assumption of standard refraction conditions). It consists of two modules: radiosondings_refract_index_many.pro (main module) and beam_propagation_function.pro (external function). For the main module, the code outputs, into a file, the beam height as a function of range. The radiosonde input files should already be available to the user; for example, radiosonde observation files can be obtained at http://weather.uwyo.edu/upperair/sounding.html or http://jervis.pyr.ec.gc.ca. The external function does the actual computation of beam propagation. It includes conditions of anomalous propagation and negative elevation angles. The equations used here were derived by Edwin Campos, based on the Snell-Descartes law of refraction, considering the Earth curvature. The program requires a compiler for the Interactive Data Language (IDL). Description and validation details have been published in the peer-reviewed scientific literature, as follows: Campos E. 2012. Estimating weather radar coverage over complex terrain, pp. 26-32, peer reviewed, in Weather Radar and Hydrology, edited by Moore RJ, Cole SJ and Illingworth AJ. International Association of Hydrological Sciences (IAHS) Press, IAHS Publ. 351. ISBN 978-1-907161-26-1.
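For contrast with the program's Snell-Descartes ray tracing, the conventional standard-refraction baseline (the effective-Earth-radius model with k = 4/3) computes beam height in closed form. The following is a sketch of that textbook formula, not of this program's algorithm.

```python
# Baseline for comparison: beam height under the standard-refraction
# (effective Earth radius, k = 4/3) model that the program above avoids.
# This is the common textbook formula, not the code's Snell-Descartes method.
import math

def beam_height_m(range_m, elev_deg, antenna_m=0.0, k=4.0 / 3.0, earth_r=6371000.0):
    re = k * earth_r                      # effective Earth radius
    elev = math.radians(elev_deg)
    return (math.sqrt(range_m**2 + re**2 + 2 * range_m * re * math.sin(elev))
            - re + antenna_m)
```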
Modern Techniques for Searching the Chemical Literature.
ERIC Educational Resources Information Center
Holm, Bart E.
The chemist's information needs are for current awareness, selective dissemination, and retrospective search services covering research, development, engineering, production, and marketing information, located internally or externally, contained in journals, patents, theses, reports, data files, and information services, or obtained from people. This paper is…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-06
... the feasibility of the Douglas County Wave and Tidal Energy Power Project, in the Pacific Ocean, off... gigawatt-hours (GWh) to 10.2 GWh. The OWCSS is operated by external wave action, which causes water to...
MAIL LOG, program theory, volume 2
NASA Technical Reports Server (NTRS)
Harris, D. K.
1979-01-01
Information relevant to the MAIL LOG program theory is documented. The L-files for mail correspondence, design information release/report, and the drawing/engineering order are given. In addition, sources for miscellaneous external routines and special support routines are documented along with a glossary of terms.
Advanced Data Format (ADF) Software Library and Users Guide
NASA Technical Reports Server (NTRS)
Smith, Matthew; Smith, Charles A. (Technical Monitor)
1998-01-01
The "CFD General Notation System" (CGNS) consists of a collection of conventions, and conforming software, for the storage and retrieval of Computational Fluid Dynamics (CFD) data. It facilitates the exchange of data between sites and applications, and helps stabilize the archiving of aerodynamic data. This effort was initiated in order to streamline the procedures in exchanging data and software between NASA and its customers, but the goal is to develop CGNS into a National Standard for the exchange of aerodynamic data. The CGNS development team is comprised of members from Boeing Commercial. Airplane Group, NASA-Ames, NASA-Langley, NASA-Lewis, McDonnell-Douglas Corporation (now Boeing-St. Louis), Air Force-Wright Lab., and ICEM-CFD Engineering. The elements of CGNS address all activities associated with the storage of data on external media and its movement to and from application programs. These elements include: 1) The Advanced Data Format (ADF) Database manager, consisting of both a file format specification and its 1/0 software, which handles the actual reading and writing of data from and to external storage media; 2) The Standard Interface Data Structures (SIDS), which specify the intellectual content of CFD data and the conventions governing naming and terminology; 3) The SIDS-to-ADF File Mapping conventions, which specify the exact location where the CFD data defined by the SIDS is to be stored within the ADF file(s); and 4) The CGNS Mid-level Library, which provides CFD-knowledgeable routines suitable for direct installation into application codes. The ADF is a generic database manager with minimal intrinsic capability. It was written for the purpose of storing large numerical datasets in an efficient, platform independent manner. To be effective, it must be used in conjunction with external agreements on how the data will be organized within the ADF database such defined by the SIDS. There are currently 34 user callable functions that comprise the ADF Core library and are described in the Users Guide. The library is written in C, but each function has a FORTRAN counterpart.
Digital impression-taking: Fundamentals and benefits in orthodontics.
Lecocq, Guillaume
2016-06-01
The digital era has burst into our offices in a big way. 3D camera technology has improved, enabling us to record our impressions and the occlusion in a digital format file. This file can then be used to make set-ups and manufacture orthodontic devices. Like any new technology, it needs to be studied and understood in order to grasp it fully and master the information and digital flow which can be generated between one's office and any external party involved in treatment, such as laboratories or other colleagues. Copyright © 2016 CEO. Published by Elsevier Masson SAS. All rights reserved.
Cánovas, Rodrigo; Moffat, Alistair; Turpin, Andrew
2016-12-15
Next generation sequencing machines produce vast amounts of genomic data. For the data to be useful, it is essential that they can be stored and manipulated efficiently. This work responds to the combined challenge of compressing genomic data while providing fast access to regions of interest, without necessitating decompression of whole files. We describe CSAM (Compressed SAM format), a compression approach offering lossless and lossy compression for SAM files. The structures and techniques proposed are suitable for representing SAM files, as well as supporting fast access to the compressed information. They generate more compact lossless representations than BAM, which is currently the preferred lossless compressed SAM-equivalent format, and are self-contained, that is, they do not depend on any external resources to compress or decompress SAM files. An implementation is available at https://github.com/rcanovas/libCSAM. Contact: canovas-ba@lirmm.fr. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
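The access pattern CSAM targets is the one BAM users know from indexed region queries. For illustration, here is the same pattern with pysam on a BAM file; the file and region are placeholders, and CSAM's own interface is documented at the GitHub link above.

```python
# The access pattern CSAM is designed to serve: fetch alignments overlapping
# a region without decompressing the whole file. Shown with pysam on a BAM
# file for illustration only; file name and region are placeholders, and a
# .bai index is required for fetch() to work.
import pysam

with pysam.AlignmentFile("sample.bam", "rb") as bam:
    for read in bam.fetch("chr1", 100000, 101000):   # region of interest
        print(read.query_name, read.reference_start)
```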
Perl-speaks-NONMEM (PsN)--a Perl module for NONMEM related programming.
Lindbom, Lars; Ribbing, Jakob; Jonsson, E Niclas
2004-08-01
The NONMEM program is the most widely used nonlinear regression software in population pharmacokinetic/pharmacodynamic (PK/PD) analyses. In this article we describe a programming library, Perl-speaks-NONMEM (PsN), intended for programmers who aim to use the computational capability of NONMEM in external applications. The library is object oriented and written in the programming language Perl. The classes of the library are built around NONMEM's data, model, and output files. The specification of the NONMEM model is easily set or changed through the model and data file classes, while the output from a model fit is accessed through the output file class. The classes have methods that help the programmer perform common repetitive tasks, e.g. summarising the output from a NONMEM run, setting the initial estimates of a model based on a previous run, or truncating values over a certain threshold in the data file. PsN creates a basis for the development of high-level software using NONMEM as the regression tool.
Bell, Andrew S; Bradley, Joseph; Everett, Jeremy R; Knight, Michelle; Loesel, Jens; Mathias, John; McLoughlin, David; Mills, James; Sharp, Robert E; Williams, Christine; Wood, Terence P
2013-05-01
The screening files of many large companies, including Pfizer, have grown considerably due to internal chemistry efforts, company mergers and acquisitions, external contracted synthesis, or compound purchase schemes. In order to screen the targets of interest in a cost-effective fashion, we devised an easy-to-assemble, plate-based diversity subset (PBDS) that represents almost the entire computed chemical space of the screening file whilst comprising only a fraction of the plates in the collection. In order to create this file, we developed new design principles for the quality assessment of screening plates: the Rule of 40 (Ro40) and a plate selection process that ensured excellent coverage of both library chemistry and legacy chemistry space. This paper describes the rationale, design, construction, and performance of the PBDS, which has evolved into the standard paradigm for singleton (one compound per well) high-throughput screening in Pfizer since its introduction in 2006.
78 FR 36549 - Sunshine Act; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-18
..., Office of External Affairs, (202) 942-1640. Dated: June 13, 2013. James B. Petri, Secretary, Federal Retirement Thrift Investment Board. [FR Doc. 2013-14524 Filed 6-14-13; 11:15 am] BILLING CODE 6760-01-P ... of Financial Management Report 5. FY 2013-2017 Strategic Plan Update
NetCDF4/HDF5 and Linked Data in the Real World - Enriching Geoscientific Metadata without Bloat
NASA Astrophysics Data System (ADS)
Ip, Alex; Car, Nicholas; Druken, Kelsey; Poudjom-Djomani, Yvette; Butcher, Stirling; Evans, Ben; Wyborn, Lesley
2017-04-01
NetCDF4 has become the dominant generic format for many forms of geoscientific data, leveraging (and constraining) the versatile HDF5 container format, while providing metadata conventions for interoperability. However, the encapsulation of detailed metadata within each file can lead to metadata "bloat", and difficulty in maintaining consistency where metadata is replicated to multiple locations. Complex conceptual relationships are also difficult to represent in simple key-value netCDF metadata. Linked Data provides a practical mechanism to address these issues by associating the netCDF files and their internal variables with complex metadata stored in Semantic Web vocabularies and ontologies, while complying with and complementing existing metadata conventions. One of the stated objectives of the netCDF4/HDF5 formats is that they should be self-describing: containing metadata sufficient for cataloguing and using the data. However, this objective can be regarded as only partially-met where details of conventions and definitions are maintained externally to the data files. For example, one of the most widely used netCDF community standards, the Climate and Forecasting (CF) Metadata Convention, maintains standard vocabularies for a broad range of disciplines across the geosciences, but this metadata is currently neither readily discoverable nor machine-readable. We have previously implemented useful Linked Data and netCDF tooling (ncskos) that associates netCDF files, and individual variables within those files, with concepts in vocabularies formulated using the Simple Knowledge Organization System (SKOS) ontology. NetCDF files contain Uniform Resource Identifier (URI) links to terms represented as SKOS Concepts, rather than plain-text representations of those terms, so we can use simple, standardised web queries to collect and use rich metadata for the terms from any Linked Data-presented SKOS vocabulary. Geoscience Australia (GA) manages a large volume of diverse geoscientific data, much of which is being translated from proprietary formats to netCDF at NCI Australia. This data is made available through the NCI National Environmental Research Data Interoperability Platform (NERDIP) for programmatic access and interdisciplinary analysis. The netCDF files contain both scientific data variables (e.g. gravity, magnetic or radiometric values), but also domain-specific operational values (e.g. specific instrument parameters) best described fully in formal vocabularies. Our ncskos codebase provides access to multiple stores of detailed external metadata in a standardised fashion. Geophysical datasets are generated from a "survey" event, and GA maintains corporate databases of all surveys and their associated metadata. It is impractical to replicate the full source survey metadata into each netCDF dataset so, instead, we link the netCDF files to survey metadata using public Linked Data URIs. These URIs link to Survey class objects which we model as a subclass of Activity objects as defined by the PROV Ontology, and we provide URI resolution for them via a custom Linked Data API which draws current survey metadata from GA's in-house databases. We have demonstrated that Linked Data is a practical way to associate netCDF data with detailed, external metadata. This allows us to ensure that catalogued metadata is kept consistent with metadata points-of-truth, and we can infer complex conceptual relationships not possible with netCDF key-value attributes alone.
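The pattern described above can be sketched in a few lines: a netCDF variable carries an attribute holding a SKOS concept URI, and a client resolves that URI over HTTP for machine-readable metadata. The attribute name, variable name, and URI below are illustrative assumptions, not the ncskos convention itself.

```python
# Sketch of the Linked Data pattern described above: a netCDF variable carries
# a URI attribute pointing at a SKOS Concept, which a client resolves for rich
# external metadata. Attribute name, variable name, and URI are illustrative
# placeholders, not the ncskos specification.
import requests
from netCDF4 import Dataset

ds = Dataset("survey_grid.nc")                        # hypothetical file
uri = ds.variables["mag_tmi"].getncattr("skos__concept_uri")
resp = requests.get(uri, headers={"Accept": "text/turtle"})  # content negotiation
print(resp.text[:400])                                # RDF describing the term
ds.close()
```

Because the file stores only a URI rather than replicated text, the detailed vocabulary entry can be corrected or enriched at its point of truth without touching, or bloating, the netCDF file itself.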
Father involvement in child welfare: Associations with changes in externalizing behavior.
Leon, Scott C; Jhe Bai, Grace; Fuller, Anne K
2016-05-01
Nonresident fathers can have a significant impact on children's behavioral outcomes. Unfortunately, the impact of nonresident father involvement on the behavioral outcomes of children with child welfare involvement has received scant attention in the literature, a limitation the current study sought to address. A sample of 333 children in state custody in Illinois between the ages of six and 13 participated and were assessed using the externalizing behavior scale of the Child and Adolescent Needs and Strengths (CANS) at regular intervals throughout their time in care. Father involvement was measured through a review of case files and interviews with child welfare workers. Growth trajectories were fit to children's externalizing behavior across time and were predicted using Time 1 characteristics. Father involvement, total non-father relative involvement, and gender (girls) was associated with lower baseline externalizing behavior and the African American children in the sample experienced higher baseline externalizing behavior. However, only Time 1 father involvement predicted slope trajectories after controlling for Time 1 externalizing behavior; more father involvement was associated with lower externalizing behavior trajectories. These results suggest that even in the unique and stressful context of child welfare, father involvement can be protective regarding children's externalizing behaviors. Copyright © 2016 Elsevier Ltd. All rights reserved.
Self-contained exothermic applicator and process
Koehmstedt, Paul L.
1984-01-01
An adhesive resin application system which requires no external heating apparatus, and which is operative in the absence of a reactive atmosphere, is disclosed. The system provides its own heat by employing an adhesive material containing reactants which react exothermally when electrically ignited. After ignition of the reactants, sufficient heat energy is liberated by the exothermic reaction either to plasticize a thermoplastic resin or to cure a thermosetting resin and thereby bond together two closely spaced objects. This application is a continuation-in-part of application Ser. No. 489,006, filed Apr. 27, 1983, which is a continuation-in-part of application Ser. No. 929,120, filed July 28, 1978, both now abandoned.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dolan, Daniel H.; Ao, Tommy
The Sandia Data Archive (SDA) format is a specific implementation of the HDF5 (Hierarchical Data Format version 5) standard. The format was developed for storing data in a universally accessible manner. SDA files may contain one or more data records, each associated with a distinct text label. Primitive records provide basic data storage, while compound records support more elaborate grouping. External records allow text/binary files to be carried inside an archive and later recovered. This report documents version 1.0 of the SDA standard. The information provided here is sufficient for reading from and writing to an archive. Although the format was originally designed for use in MATLAB, broader use is encouraged.
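A minimal sketch of how an SDA-style HDF5 archive might be inspected with h5py, assuming the archive exists and that each labeled record is a top-level object carrying a record-type attribute; the attribute name is an assumption, not taken from the SDA 1.0 specification.

```python
# Assumed layout: one top-level HDF5 object per labeled record, with a
# "RecordType" attribute (primitive/compound/external). Illustrative only.
import h5py

with h5py.File("archive.sda", "r") as f:
    for label in f:                       # each record has a distinct text label
        kind = f[label].attrs.get("RecordType", "unknown")
        print(label, kind)
```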
Sleep Better at Night...Back Up Your Data.
ERIC Educational Resources Information Center
Smith, Russell
1996-01-01
Discusses the need to back up computer files, and describes the technological evolution of back-up methods. Reviews tape drive and external hard drive back-up products offered by computer companies and presents back-up strategies to use with all back-up methods. A sidebar lists information on the reviewed products. (JMV)
76 FR 75922 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-05
... portfolios holding equity securities will bear an external cost burden of $1,000 per portfolio to prepare and... a small business investment company registered on Form N-5 (``Funds''), to file Form N- PX not later... comprised of 6,200 portfolios holding equity securities and 3,800 portfolios holding no equity securities...
Input Files and Procedures for Analysis of SMA Hybrid Composite Beams in MSC.Nastran and ABAQUS
NASA Technical Reports Server (NTRS)
Turner, Travis L.; Patel, Hemant D.
2005-01-01
A thermoelastic constitutive model for shape memory alloys (SMAs) and SMA hybrid composites (SMAHCs) was recently implemented in the commercial codes MSC.Nastran and ABAQUS. The model is implemented and supported within the core of the commercial codes, so no user subroutines or external calculations are necessary. The model and resulting structural analysis has been previously demonstrated and experimentally verified for thermoelastic, vibration and acoustic, and structural shape control applications. The commercial implementations are described in related documents cited in the references, where various results are also shown that validate the commercial implementations relative to a research code. This paper is a companion to those documents in that it provides additional detail on the actual input files and solution procedures and serves as a repository for ASCII text versions of the input files necessary for duplication of the available results.
External Dependencies-Driven Architecture Discovery and Analysis of Implemented Systems
NASA Technical Reports Server (NTRS)
Ganesan, Dharmalingam; Lindvall, Mikael; Ron, Monica
2014-01-01
A method for architecture discovery and analysis of implemented systems (AIS) is disclosed. The premise of the method is that architecture decisions are inspired and influenced by the external entities that the software system makes use of. Examples of such external entities are COTS components, frameworks, and ultimately even the programming language itself and its libraries. Traces of these architecture decisions can thus be found in the implemented software and are manifested in the way software systems use such external entities. While this fact is often ignored in contemporary reverse engineering methods, the AIS method actively leverages and makes use of the dependencies on external entities as a starting point for the architecture discovery. The AIS method is demonstrated using NASA's Space Network Access System (SNAS). The results show that, with abundant evidence, the method offers reusable and repeatable guidelines for discovering the architecture and locating potential risks (e.g. low testability, decreased performance) that are hidden deep in the implementation. The analysis is conducted by using external dependencies to identify, classify and review a minimal set of key source code files. Given the benefits of analyzing external dependencies as a way to discover architectures, it is argued that external dependencies deserve to be treated as first-class citizens during reverse engineering. The current structure of a knowledge base of external entities and analysis questions with strategies for getting answers is also discussed.
Database Migration for Command and Control
2002-11-01
[Flattened table excerpt: proprietary SQL; JDP private area; air defense data and defended asset list in Oracle 7.3.2 with an automated OLTP process; TADIL warnings in Oracle 7.3.2 exchanged via flat file as discrete transactions with data updates and near-real-time response required; mission data pulled via standard SQL; execution data in Oracle 7.3 with user updates; external interfaces; automatic/manual backup; messaging; proprietary (internal) replication; SQL web server.]
ERIC Educational Resources Information Center
Billups, Felice D.
2012-01-01
Institutional researchers (IRs) are often asked to conduct focus groups as an efficient way to address an institutional concern or problem. Typically, IR professionals depend on external consultants and specialists to conduct these group interviews for them; however, due to recent resource constraints (staffing, budgets), they are increasingly…
ERIC Educational Resources Information Center
Roessler, Richard T.; Neath, Jeanne; McMahon, Brian T.; Rumrill, Phillip D.
2007-01-01
Single-predictor and stepwise multinomial logistic regression analyses and an external validation were completed on 3,082 allegations of employment discrimination by adults with multiple sclerosis. Women filed two thirds of the allegations, and individuals between the ages of 31 and 50 filed the vast majority of discrimination charges (73%). Allegations…
NASA Astrophysics Data System (ADS)
Camilo, Ana E. F.; Grégio, André; Santos, Rafael D. C.
2016-05-01
Malware detection may be accomplished through the analysis of their infection behavior. To do so, dynamic analysis systems run malware samples and extract their operating system activities and network traffic. This traffic may represent malware accessing external systems, either to steal sensitive data from victims or to fetch other malicious artifacts (configuration files, additional modules, commands). In this work, we propose the use of visualization as a tool to identify compromised systems based on correlating malware communications in the form of graphs and finding isomorphisms between them. We produced graphs from over 6 thousand distinct network traffic files captured during malware execution and analyzed the existing relationships among malware samples and IP addresses.
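The graph-matching idea lends itself to a compact sketch: build one communication graph per sample from (sample, destination IP) edges and test pairs for isomorphism. The data below is invented; in the study the edges come from captured network traffic files.

```python
# Toy sketch of correlating malware communications as graphs and testing
# for isomorphism; edge lists here are fabricated for illustration.
import networkx as nx

def comm_graph(edges):
    g = nx.Graph()
    g.add_edges_from(edges)
    return g

g1 = comm_graph([("sampleA", "10.0.0.1"), ("sampleA", "10.0.0.2")])
g2 = comm_graph([("sampleB", "10.0.0.7"), ("sampleB", "10.0.0.9")])
print(nx.is_isomorphic(g1, g2))  # True: identical communication structure
```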
Transferable Output ASCII Data (TOAD) editor version 1.0 user's guide
NASA Technical Reports Server (NTRS)
Bingel, Bradford D.; Shea, Anne L.; Hofler, Alicia S.
1991-01-01
The Transferable Output ASCII Data (TOAD) editor is an interactive software tool for manipulating the contents of TOAD files. The TOAD editor is specifically designed to work with tabular data. Selected subsets of data may be displayed to the user's screen, sorted, exchanged, duplicated, removed, replaced, inserted, or transferred to and from external files. It also offers a number of useful features including on-line help, macros, a command history, an 'undo' option, variables, and a full complement of mathematical functions and conversion factors. Written in ANSI FORTRAN 77 and completely self-contained, the TOAD editor is very portable and has already been installed on SUN, SGI/IRIS, and CONVEX hosts.
Tool to assess contents of ARM surface meteorology network netCDF files
DOE Office of Scientific and Technical Information (OSTI.GOV)
Staudt, A.; Kwan, T.; Tichler, J.
The Atmospheric Radiation Measurement (ARM) Program, supported by the US Department of Energy, is a major program of atmospheric measurement and modeling designed to improve the understanding of processes and properties that affect atmospheric radiation, with a particular focus on the influence of clouds and the role of cloud radiative feedback in the climate system. The ARM Program will use three highly instrumented primary measurement sites. Deployment of instrumentation at the first site, located in the Southern Great Plains of the United States, began in May of 1992. The first phase of deployment at the second site in the Tropical Western Pacific is scheduled for late in 1995. The third site will be in the North Slope of Alaska and adjacent Arctic Ocean. To meet the scientific objectives of ARM, observations from the ARM sites are combined with data from other sources; these are called external data. Among these external data sets are surface meteorological observations from the Oklahoma Mesonet, a Kansas automated weather network, the Wind Profiler Demonstration Network (WPDN), and the National Weather Service (NWS) surface stations. Before combining these data with the Surface Meteorological Observations Station (SMOS) ARM data, it was necessary to assess the contents and quality of both the ARM and the external data sets. Since these data sets had previously been converted to netCDF format for use by the ARM Science Team, a tool was written to assess the contents of the netCDF files.
Catching errors with patient-specific pretreatment machine log file analysis.
Rangaraj, Dharanipathy; Zhu, Mingyao; Yang, Deshan; Palaniswaamy, Geethpriya; Yaddanapudi, Sridhar; Wooten, Omar H; Brame, Scott; Mutic, Sasa
2013-01-01
A robust, efficient, and reliable quality assurance (QA) process is highly desired for modern external beam radiation therapy treatments. Here, we report the results of a semiautomatic, pretreatment, patient-specific QA process based on dynamic machine log file analysis clinically implemented for intensity modulated radiation therapy (IMRT) treatments delivered by high energy linear accelerators (Varian 2100/2300 EX, Trilogy, iX-D, Varian Medical Systems Inc, Palo Alto, CA). The multileaf collimator (MLC) machine log files are called Dynalog by Varian. Using an in-house developed computer program called "Dynalog QA," we automatically compare the beam delivery parameters in the log files that are generated during pretreatment point dose verification measurements, with the treatment plan to determine any discrepancies in IMRT deliveries. Fluence maps are constructed and compared between the delivered and planned beams. Since clinical introduction in June 2009, 912 machine log file analysis QA procedures had been performed by the end of 2010. Among these, 14 errors causing dosimetric deviation were detected and required further investigation and intervention. These errors were the result of human operating mistakes, flawed treatment planning, and data modification during plan file transfer. Minor errors were also reported in 174 other log file analyses, some of which stemmed from false positives and unreliable results; the origins of these are discussed herein. It has been demonstrated that machine log file analysis is a robust, efficient, and reliable QA process capable of detecting errors originating from human mistakes, flawed planning, and data transfer problems. The possibility of detecting these errors is low using point and planar dosimetric measurements. Copyright © 2013 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
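The core of such a log-file QA check can be sketched as a comparison of planned versus delivered leaf positions against a tolerance. The CSV column names and the 1 mm tolerance below are assumptions for illustration; real Varian Dynalog files use a different columnar layout.

```python
# Illustrative sketch, not the Dynalog QA program: flag leaves whose
# delivered position deviates from the plan beyond a tolerance.
import csv

TOLERANCE_MM = 1.0  # assumed action threshold

def check_log(path):
    errors = []
    with open(path, newline="") as f:
        # assumed columns: leaf, planned_mm, delivered_mm
        for row in csv.DictReader(f):
            dev = abs(float(row["planned_mm"]) - float(row["delivered_mm"]))
            if dev > TOLERANCE_MM:
                errors.append((row["leaf"], dev))
    return errors
```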
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-16
... the firm controls access to market data and not for external interrogation devices or internal interrogation devices for which a vendor (and not the firm) controls access to market data. This program better... that its employees use and in respect of which the firm controls access to market data. The...
Optical mass memory system (AMM-13). AMM/DBMS interface control document
NASA Technical Reports Server (NTRS)
Bailey, G. A.
1980-01-01
The baseline for external interfaces of a 10^13-bit optical archival mass memory system (AMM-13) is established. The types of interfaces addressed include data transfer; AMM-13, Data Base Management System, NASA End-to-End Data System computer interconnect; data/control input and output interfaces; test input data source; file management; and facilities interface.
Cost Studies in Higher Education. The AIR Professional File, No. 7, Fall 1980.
ERIC Educational Resources Information Center
Hample, Stephen R.
A guide to cost studies in higher education is presented, with emphasis directed to the response of a four-year public institution to an externally mandated cost study. Cost studies are usually requested to guide budget allocations, either from an internal campus need or from outside pressures. Although much effort has been expended by various…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-30
... Change Relating to Fees for EdgeBook Attributed\\SM\\ January 23, 2013. Pursuant to Section 19(b)(1) of the... external distribution of EdgeBook Attributed\\SM\\, the Exchange's attributed book feed, and (ii) offer a new... SR-EDGX-2011-18,\\4\\ the Exchange made available the EDGX Book Feed (``EdgeBook Depth X\\SM\\'') to...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-30
... Change Relating to Fees for EdgeBook Attributed\\SM\\ January 23, 2012. Pursuant to Section 19(b)(1) of the... external distribution of EdgeBook Attributed\\SM\\, the Exchange's attributed book feed, and (ii) offer a new...-2011-19,\\4\\ the Exchange made available the EDGA Book Feed (``EdgeBook Depth A\\SM\\'') to Members and...
CGNS Mid-Level Software Library and Users Guide
NASA Technical Reports Server (NTRS)
Poirier, Diane; Smith, Charles A. (Technical Monitor)
1998-01-01
The "CFD General Notation System" (CGNS) consists of a collection of conventions, and conforming software, for the storage and retrieval of Computational Fluid Dynamics (CFD) data. It facilitates the exchange of data between sites and applications, and helps stabilize the archiving of aerodynamic data. This effort was initiated in order to streamline the procedures in exchanging data and software between NASA and its customers, but the goal is to develop CGNS into a National Standard for the exchange of aerodynamic data. The CGNS development team is comprised of members from Boeing Commercial Airplane Group, NASA-Ames, NASA-Langley, NASA-Lewis, McDonnell-Douglas Corporation (now Boeing-St. Louis), Air Force-Wright Lab., and ICEM-CFD Engineering. The elements of CGNS address all activities associated with the storage of data on external media and its movement to and from application programs. These elements include: - The Advanced Data Format (ADF) Database manager, consisting of both a file format specification and its I/O software, which handles the actual reading and writing of data from and to external storage media; - The Standard Interface Data Structures (SIDS), which specify the intellectual content of CFD data and the conventions governing naming and terminology; - The SIDS-to-ADF File Mapping conventions, which specify the exact location where the CFD data defined by the SIDS is to be stored within the ADF file(s); and - The CGNS Mid-level Library, which provides CFD-knowledgeable routines suitable for direct installation into application codes. The CGNS Mid-level Library was designed to ease the implementation of CGNS by providing developers with a collection of handy I/O functions. Since knowledge of the ADF core is not required to use this library, it will greatly facilitate the task of interfacing with CGNS. There are currently 48 user callable functions that comprise the Mid-level library and are described in the Users Guide. The library is written in C, but each function has a FORTRAN counterpart.
Poster - Thur Eve - 54: A software solution for ongoing DVH quality assurance in radiation therapy.
Annis, S-L; Zeng, G; Wu, X; Macpherson, M
2012-07-01
A program has been developed in MATLAB for use in quality assurance of treatment planning in radiation therapy. It analyzes patient DVH files and compiles dose volume data for review, trending, comparison and analysis. Patient DVH files are exported from the Eclipse treatment planning system and saved according to treatment site and date. Currently, analysis is available for four treatment sites (Prostate, Prostate Bed, Lung, and Upper GI), with two functions for data reporting and analysis: patient-specific and organ-specific. The patient-specific function loads one patient DVH file and reports the user-specified dose volume data of organs and targets. These data can be compiled to an external file for third-party analysis. The organ-specific function extracts a requested dose volume of an organ from the DVH files of a patient group and reports the statistics over this population. A graphical user interface is used to select clinical sites, functions and structures, and to input the user's requests. We have implemented this program in planning quality assurance at our center. The program has tracked the dosimetric improvement in GU sites after VMAT was implemented clinically. It has generated dose volume statistics for different groups of patients associated with technique or time range. This program allows reporting and statistical analysis of DVH files. It is an efficient tool for planning quality control in radiation therapy. © 2012 American Association of Physicists in Medicine.
KungFQ: a simple and powerful approach to compress fastq files.
Grassi, Elena; Di Gregorio, Federico; Molineris, Ivan
2012-01-01
Nowadays, storing data derived from deep sequencing experiments has become pivotal, and standard compression algorithms do not exploit their structure in a satisfying manner. A number of reference-based compression algorithms have been developed, but they are less adequate when approaching new species without fully sequenced genomes or nongenomic data. We developed a tool that takes advantage of fastq characteristics and encodes them in a binary format optimized to be further compressed with standard tools (such as gzip or lzma). The algorithm is straightforward and does not need any external reference file; it scans the fastq only once and has a constant memory requirement. Moreover, we added the possibility to perform lossy compression, losing some of the original information (IDs and/or qualities) but resulting in smaller files; it is also possible to define a quality cutoff under which corresponding base calls are converted to N. We achieve 2.82 to 7.77 compression ratios on various fastq files without losing information, and 5.37 to 8.77 when losing IDs, which are often not used in common analysis pipelines. In this paper, we compare the algorithm's performance with known tools, usually obtaining higher compression levels.
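The overall strategy is simple enough to sketch: scan the fastq once, optionally apply the lossy quality cutoff (bases under the cutoff become N), and hand the stream to a standard compressor. This toy version skips the binary re-encoding step that gives the real tool its edge and assumes Phred+33 qualities.

```python
# Simplified sketch of the single-pass, constant-memory idea; the real
# tool binary-encodes records before gzip/lzma, which is omitted here.
import gzip

def compress_fastq(src, dst, qual_cutoff=None):
    with open(src) as fin, gzip.open(dst, "wt") as fout:
        while True:
            block = [fin.readline() for _ in range(4)]  # id, seq, '+', quals
            if not block[0]:
                break
            if qual_cutoff is not None:  # lossy mode: mask low-quality calls
                seq = [b if ord(q) - 33 >= qual_cutoff else "N"
                       for b, q in zip(block[1].strip(), block[3].strip())]
                block[1] = "".join(seq) + "\n"
            fout.writelines(block)
```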
X-Antenna: A graphical interface for antenna analysis codes
NASA Technical Reports Server (NTRS)
Goldstein, B. L.; Newman, E. H.; Shamansky, H. T.
1995-01-01
This report serves as the user's manual for the X-Antenna code. X-Antenna is intended to simplify the analysis of antennas by giving the user graphical interfaces in which to enter all relevant antenna and analysis code data. Essentially, X-Antenna creates a Motif interface to the user's antenna analysis codes. A command-file allows new antennas and codes to be added to the application. The menu system and graphical interface screens are created dynamically to conform to the data in the command-file. Antenna data can be saved and retrieved from disk. X-Antenna checks all antenna and code values to ensure they are of the correct type, writes an output file, and runs the appropriate antenna analysis code. Volumetric pattern data may be viewed in 3D space with an external viewer run directly from the application. Currently, X-Antenna includes analysis codes for thin wire antennas (dipoles, loops, and helices), rectangular microstrip antennas, and thin slot antennas.
This database processes approximately 3,000 Notice of Arrival (NOA) reporting forms from importers and exporters of pesticide products. This is an electronic version of EPA Form 3540-1. The external user fills out the NOA and submits it electronically. The form is then processed by the Pesticides section and either approved or disapproved. The system then generates an Adobe PDF version of EPA Form 3540-1, with signature or disapproval, and emails it to the external user. The e-filing system eliminates the need for the Region to invest in paper, copying, storage and mailing expenses, while at the same time allowing the regulated community to conduct its business with us in a more expeditious manner.
Lau, Lee Min; Banning, Pam D; Monson, Kent; Knight, Elva; Wilson, Pat S; Shakib, Shaun C
2005-01-01
The Department of Defense (DoD) has used a common application, Composite Health Care System (CHCS), throughout all DoD facilities. However, the master files used to encode patient data in CHCS are not identical across DoD facilities. The encoded data is thus not interoperable from one DoD facility to another. To enable data interoperability in the next-generation system, CHCS II, and for the DoD to exchange laboratory results with external organizations such as the Veterans Administration (VA), the disparate master file codes for laboratory results are mapped to Logical Observation Identifier Names and Codes (LOINC) wherever possible. This paper presents some findings from our experience mapping DoD laboratory results to LOINC.
ArrayBridge: Interweaving declarative array processing with high-performance computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xing, Haoyuan; Floratos, Sofoklis; Blanas, Spyros
Scientists are increasingly turning to datacenter-scale computers to produce and analyze massive arrays. Despite decades of database research that extols the virtues of declarative query processing, scientists still write, debug and parallelize imperative HPC kernels even for the most mundane queries. This impedance mismatch has been partly attributed to the cumbersome data loading process; in response, the database community has proposed in situ mechanisms to access data in scientific file formats. Scientists, however, desire more than a passive access method that reads arrays from files. This paper describes ArrayBridge, a bi-directional array view mechanism for scientific file formats, that aims to make declarative array manipulations interoperable with imperative file-centric analyses. Our prototype implementation of ArrayBridge uses HDF5 as the underlying array storage library and seamlessly integrates into the SciDB open-source array database system. In addition to fast querying over external array objects, ArrayBridge produces arrays in the HDF5 file format just as easily as it can read from it. ArrayBridge also supports time travel queries from imperative kernels through the unmodified HDF5 API, and automatically deduplicates between array versions for space efficiency. Our extensive performance evaluation in NERSC, a large-scale scientific computing facility, shows that ArrayBridge exhibits statistically indistinguishable performance and I/O scalability to the native SciDB storage engine.
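The bi-directional view idea can be illustrated with plain h5py: the same HDF5 dataset is written once and then sliced in situ, which is the kind of external array object a database engine could expose without a bulk load. File and dataset names are illustrative; this is not ArrayBridge's actual API.

```python
# Sketch of reading/writing HDF5 arrays that imperative kernels and a
# declarative engine could share.
import h5py
import numpy as np

with h5py.File("arrays.h5", "w") as f:
    f.create_dataset("temperature", data=np.random.rand(1000, 1000))

with h5py.File("arrays.h5", "r") as f:
    tile = f["temperature"][0:100, 0:100]   # in situ slice, no bulk load
    print(tile.mean())
```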
Leroy, Agnès Marie Françoise; Bahia, Maria Guiomar de Azevedo; Ehrlacher, Alain; Buono, Vicente Tadeu Lopes
2012-08-01
To build a mathematical model describing the mechanical behavior of NiTi rotary files while they are rotating in a root canal. The file was modeled as a beam undergoing large transformations. The instrument was assumed to be rotating steadily in the root canal, and the geometry of the canal was considered a known parameter of the problem. The formulae of large-transformation mechanics then allowed the calculation of the Green-Lagrange strain field in the file. The non-linear mechanical behavior of NiTi was modeled as a continuous piecewise linear function, assuming that the material did not reach plastic deformation. Criteria locating the changes of behavior of NiTi were established, and the tension field in the file and the external efforts applied to it were calculated. The unknown variable of torsion was deduced from the equilibrium equation system using a Coulomb contact law, which solved the problem over a cycle of rotation. To verify that the model described reality well, three-point bending experiments were performed on superelastic NiTi wires, and their results were compared with the theoretical ones. The model closely tracked the empirical results over the range of bending angles of interest. Knowing the geometry of the root canal, one is now able to write the equations of the strain and stress fields in the endodontic instrument, and to quantify the impact of each macroscopic parameter of the problem on its response. This should be useful for predicting failure of the files under rotating bending fatigue, and for optimizing the geometry of the files. Copyright © 2012 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
D'Allegro, Mary-Lou; Paff, Lolita A.
2010-01-01
Most economic impact studies are prepared by external consultants at significant cost to an individual college, a higher education state system, or a set of institutions with similar Carnegie Classifications. This case study provides a detailed framework that academic institutions may use to derive economic impact estimates without hiring external…
HepML, an XML-based format for describing simulated data in high energy physics
NASA Astrophysics Data System (ADS)
Belov, S.; Dudko, L.; Kekelidze, D.; Sherstnev, A.
2010-10-01
In this paper we describe the HepML format and a corresponding C++ library developed for keeping a complete description of parton-level events in a unified and flexible form. HepML tags contain enough information to understand what kind of physics the simulated events describe and how the events have been prepared. A HepML block can be included into event files in the LHEF format. The structure of the HepML block is described by means of several XML Schemas. The Schemas define necessary information for the HepML block and how this information should be located within the block. The library libhepml is a C++ library intended for parsing and serialization of HepML tags, and for representing the HepML block in computer memory. The library is an API for external software. For example, Matrix Element Monte Carlo event generators can use the library for preparing and writing a header of an LHEF file in the form of HepML tags. In turn, Showering and Hadronization event generators can parse the HepML header and get the information in the form of C++ classes. libhepml can be used in C++, C, and Fortran programs. All necessary parts of HepML have been prepared and we present the project to the HEP community.
Program summary
Program title: libhepml
Catalogue identifier: AEGL_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGL_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU GPLv3
No. of lines in distributed program, including test data, etc.: 138 866
No. of bytes in distributed program, including test data, etc.: 613 122
Distribution format: tar.gz
Programming language: C++, C
Computer: PCs and workstations
Operating system: Scientific Linux CERN 4/5, Ubuntu 9.10
RAM: 1 073 741 824 bytes (1 GB)
Classification: 6.2, 11.1, 11.2
External routines: Xerces XML library (http://xerces.apache.org/xerces-c/), Expat XML Parser (http://expat.sourceforge.net/)
Nature of problem: Monte Carlo simulation in high energy physics is divided into several stages. Various programs exist for these stages. In this article we are interested in interfacing different Monte Carlo event generators via data files, in particular, Matrix Element (ME) generators and Showering and Hadronization (SH) generators. There is a widely accepted format for data files for such interfaces - the Les Houches Event Format (LHEF). Although the information kept in an LHEF file is enough for the proper working of SH generators, it is insufficient for understanding how events in the LHEF file have been prepared and which physical model has been applied. In this paper we propose an extension of the format for keeping additional information available in generators. We propose to add a new information block, marked up with XML tags, to the LHEF file. This block describes the events in the file in more detail. In particular, it stores information about the physical model, kinematical cuts, the generator, etc. This helps to make LHEF files self-documented. Certainly, HepML can be applied in a more general context, not in LHEF files only.
Solution method: In order to overcome the drawbacks of the original LHEF accord we propose to add a new information block of HepML tags. HepML is an XML-based markup language. We designed several XML Schemas for all tags in the language. Any HepML document should follow the rules of the Schemas. The language is equipped with a library for operation with HepML tags and documents.
This C++ library, called libhepml, consists of classes for HepML objects, which represent a HepML document in computer memory, parsing classes, serialization classes, and some auxiliary classes.
Restrictions: The software is adapted for solving the problems described in the article. There are no additional restrictions.
Running time: Tests have been done on a computer with an Intel(R) Core(TM)2 Solo, 1.4 GHz. Parsing of a HepML file: 6 ms (size of the HepML file is 12.5 Kb). Writing of a HepML block to file: 14 ms (file size 12.5 Kb). Merging of two HepML blocks and writing to file: 18 ms (file size 25.0 Kb).
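Since a HepML block is ordinary XML embedded in the LHEF header, pulling values out of it needs nothing beyond a standard XML parser. The tag names below are invented for illustration and do not follow the actual HepML Schemas.

```python
# Assumed-shape example of parsing a HepML-like XML header block; the
# element and attribute names are hypothetical, not HepML-conformant.
import xml.etree.ElementTree as ET

header = """<hepml>
  <generator name="ExampleGen" version="1.0"/>
  <cut variable="pT" min="20"/>
</hepml>"""

root = ET.fromstring(header)
gen = root.find("generator")
print(gen.get("name"), gen.get("version"))   # ExampleGen 1.0
```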
NASA Astrophysics Data System (ADS)
Rose, K.; Rowan, C.; Rager, D.; Dehlin, M.; Baker, D. V.; McIntyre, D.
2015-12-01
Multi-organizational research teams working jointly on projects often encounter problems with discovery, access to relevant existing resources, and data sharing due to large file sizes, inappropriate file formats, or other inefficient options that make collaboration difficult. The Energy Data eXchange (EDX) from the Department of Energy's (DOE) National Energy Technology Laboratory (NETL) is an evolving online research environment designed to overcome these challenges in support of DOE's fossil energy goals, while offering improved access to data-driven products of fossil energy R&D such as datasets, tools, and web applications. Development of EDX was initiated in 2011; it offers i) a means of better preserving NETL's research and development products for future access and re-use, ii) efficient, discoverable access to authoritative, relevant, external resources, and iii) an improved approach and tools to support secure, private collaboration and coordination between multi-organizational teams to meet DOE mission and goals. EDX presently supports fossil energy and SubTER Crosscut research activities, with an ever-growing user base. EDX is built on a heavily customized instance of the open-source platform Comprehensive Knowledge Archive Network (CKAN). EDX connects users to externally relevant data and tools by connecting to external data repositories built on different platforms, including other CKAN platforms (e.g. Data.gov). EDX does not download and repost data or tools that already have an online presence, as doing so leads to redundancy and even error. If a relevant resource is already hosted by another online entity, EDX points users to that external host using web services, inventoried URLs, and other methods. EDX also offers users private, secure collaboration capabilities custom built into the system. The team is presently working on version 3 of EDX, which will incorporate big-data analytical capabilities among other advanced features.
Innovation in the imaging perianal fistula: a step towards personalised medicine
Sahnan, Kapil; Adegbola, Samuel O.; Tozer, Philip J.; Patel, Uday; Ilangovan, Rajpandian; Warusavitarne, Janindra; Faiz, Omar D.; Hart, Ailsa L.; Phillips, Robin K. S.; Lung, Phillip F. C.
2018-01-01
Background: Perianal fistula is a topic that is hard both to understand and to teach. The key to understanding the treatment options and their likely success is deciphering the exact morphology of the tract(s) and the amount of sphincter involved. Our aim was to explore alternative platforms to better understand complex perianal fistulas through three-dimensional (3D) imaging and reconstruction. Methods: Digital Imaging and Communications in Medicine (DICOM) images of spectral attenuated inversion recovery magnetic resonance imaging (MRI) sequences were imported into validated open-source segmentation software. A specialist consultant gastrointestinal radiologist performed segmentation of the fistula and the internal and external sphincters. Segmented files were exported as stereolithography files. Cura (Ultimaker Cura 3.0.4) was used to prepare the files for printing on an Ultimaker 3 Extended 3D printer. Animations were created in collaboration with Touch Surgery™. Results: Three examples of 3D printed models demonstrating complex perianal fistula were created. The anatomical components are displayed in different colours: red: fistula tract; green: external anal sphincter and levator plate; blue: internal anal sphincter and rectum. One of the models was created to be split in half, to display the internal opening and to allow complexity in the intersphincteric space to be better evaluated. An animation of MRI fistulography of a trans-sphincteric fistula tract with a cephalad extension in the intersphincteric space was also created. Conclusion: MRI is the reference standard for assessment of perianal fistula, defining anatomy and guiding surgery. However, communication of findings between radiologist and surgeon remains challenging. The feasibility of 3D reconstruction of complex perianal fistulas is demonstrated, with the potential to improve surgical planning, communication with patients, and training. PMID:29854001
ATLAS software configuration and build tool optimisation
NASA Astrophysics Data System (ADS)
Rybkin, Grigory; Atlas Collaboration
2014-06-01
The ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories on 6 continents. To meet the challenge of configuring and building this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build-time performance, which was optimised through several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at the package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of CMT commands used for the build; and introduction of package-level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on CMT command optimisation in general, which made the commands approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation reduced software build time and environment setup time significantly (by several times), increased the efficiency of multi-core computing resource utilisation, and considerably improved the software developer and user experience.
Dias, Pedro G B Souza; Mello, Francisco De Assis Ganeo De; Vieira, Lelisberto Baldo
2016-06-09
A new genus and species of Luzarinae cricket (Grylloidea, Phalangopsidae) is described from the Chapada Diamantina National Park, Bahia State, northeast Brazil. Sishiniheia diamantina, n. gen. n. sp. is described based on characters of external morphology and male genitalia, and is characterized by reduced forewings (FWs), the absence of a stridulatory file, thick longitudinal venation, and thin, pointed, curved pseudepiphallic arms.
Development of a Taxonomy of Sociocultural Factors that Influence Decision Making
2015-02-01
from external sources, such as from receiving money or making good grades, or being promoted. According to the Self-Determination Theory (SDT...Press, 1906. Also available at http://www.gutenberg.org/files/29904/29904-h/29904-h.htm. Deci EL, Ryan RM. Intrinsic motivation and self ...Confidence) According to Bandura (1995), an individual's self-efficacy determines if a coping behavior will be initiated. For Bandura (1995), self
NASA Technical Reports Server (NTRS)
Mcmillan, J. D.
1976-01-01
A description of the input and output files and the data control cards for the altimeter residual computation (ARC) computer program is given. The program acts as the final altimeter preprocessor before the data is reformatted for external users. It calculates all parameters necessary for the computation of the altimeter observation residuals and the sea surface height. Mathematical models used for calculating tropospheric refraction, geoid height, tide height, ephemeris, and orbit geometry are described.
Substance Identification Information from EPA's Substance Registry
The Substance Registry Services (SRS) is the authoritative resource for basic information about substances of interest to the U.S. EPA and its state and tribal partners. Substances, particularly chemicals, can have many valid synonyms. For example, toluene, methyl benzene, and phenyl methane are commonly used names for the same chemical. EPA programs collect environmental data for this chemical using each of these names, plus others. This diversity leads to problems when a user is looking for programmatic data for toluene but is unaware that the data is stored under the synonym methyl benzene. For each substance, the SRS identifies the statutes, EPA programs, and organizations external to EPA that track or regulate that substance, along with the synonym used by each statute, EPA program or external organization. Besides standardized information for each chemical, such as the Chemical Abstracts Service name, the Chemical Abstracts Number, and the EPA Registry Name (the EPA standard name), the SRS also includes additional information, such as molecular weight and molecular formula. Additionally, an SRS Internal Tracking Number uniquely identifies each substance, enabling cross-walking between synonyms. EPA provides a large .ZIP file with the SRS data in CSV format, and a separate small metadata file in XML containing the field names and definitions.
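The Internal Tracking Number mechanism amounts to a synonym crosswalk, which a few lines of Python make concrete. The ITN value below is fabricated, not a real SRS identifier.

```python
# Toy crosswalk mirroring the SRS idea: every synonym maps to one
# internal tracking number (ITN), so any name reaches the same substance.
ITN_BY_NAME = {
    "toluene": 100001,        # fabricated ITN for illustration
    "methyl benzene": 100001,
    "phenyl methane": 100001,
}

def same_substance(a, b):
    return ITN_BY_NAME.get(a.lower()) == ITN_BY_NAME.get(b.lower())

print(same_substance("Toluene", "methyl benzene"))  # True
```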
iTOUGH2 Universal Optimization Using the PEST Protocol
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finsterle, S.A.
2010-07-01
iTOUGH2 (http://www-esd.lbl.gov/iTOUGH2) is a computer program for parameter estimation, sensitivity analysis, and uncertainty propagation analysis [Finsterle, 2007a, b, c]. iTOUGH2 contains a number of local and global minimization algorithms for automatic calibration of a model against measured data, or for the solution of other, more general optimization problems (see, for example, Finsterle [2005]). A detailed residual and estimation uncertainty analysis is conducted to assess the inversion results. Moreover, iTOUGH2 can be used to perform a formal sensitivity analysis, or to conduct Monte Carlo simulations for the examination of prediction uncertainties. iTOUGH2's capabilities are continually enhanced. As the name implies, iTOUGH2 is developed for use in conjunction with the TOUGH2 forward simulator for nonisothermal multiphase flow in porous and fractured media [Pruess, 1991]. However, iTOUGH2 provides FORTRAN interfaces for the estimation of user-specified parameters (see subroutine USERPAR) based on user-specified observations (see subroutine USEROBS). These user interfaces can be invoked to add new parameter or observation types to the standard set provided in iTOUGH2. They can also be linked to non-TOUGH2 models, i.e., iTOUGH2 can be used as a universal optimization code, similar to other model-independent, nonlinear parameter estimation packages such as PEST [Doherty, 2008] or UCODE [Poeter and Hill, 1998]. However, to make iTOUGH2's optimization capabilities available for use with an external code, the user is required to write some FORTRAN code that provides the link between the iTOUGH2 parameter vector and the input parameters of the external code, and between the output variables of the external code and the iTOUGH2 observation vector. While allowing for maximum flexibility, the coding requirement of this approach limits its applicability to those users with FORTRAN coding knowledge. To make iTOUGH2's capabilities accessible to many application models, the PEST protocol [Doherty, 2007] has been implemented into iTOUGH2. This protocol enables communication between the application (which can be a single 'black-box' executable or a script or batch file that calls multiple codes) and iTOUGH2. The concept requires that for the application model: (1) input is provided on one or more ASCII text input files; (2) output is returned on one or more ASCII text output files; (3) the model is run using a system command (executable or script/batch file); and (4) the model runs to completion without any user intervention. For each forward run invoked by iTOUGH2, select parameters cited within the application model input files are then overwritten with values provided by iTOUGH2, and select variables cited within the output files are extracted and returned to iTOUGH2. It should be noted that the core of iTOUGH2, i.e., its optimization routines and related analysis tools, remains unchanged; it is only the communication format between input parameters, the application model, and output variables that is borrowed from PEST. The interface routines have been provided by Doherty [2007]. The iTOUGH2-PEST architecture is shown in Figure 1. This manual contains installation instructions for the iTOUGH2-PEST module, and describes the PEST protocol as well as the input formats needed in iTOUGH2. Examples are provided that demonstrate the use of model-independent optimization and analysis using iTOUGH2.
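The four requirements of the PEST-style protocol reduce to a small driver loop: substitute parameter values into a text input file built from a template, run the application as a black box, and scrape observations from its text output. Everything below (file names, the "@name@" placeholder syntax, the executable) is an assumption for illustration, not the actual PEST template syntax.

```python
# Hedged sketch of a model-independent forward run in the PEST spirit.
import subprocess

def run_model(params, template="model.in.tpl"):
    text = open(template).read()
    for name, value in params.items():          # fill assumed @name@ slots
        text = text.replace(f"@{name}@", f"{value:.6e}")
    open("model.in", "w").write(text)
    subprocess.run(["./model", "model.in"], check=True)  # unattended run
    obs = {}
    for line in open("model.out"):               # assumed "name value" rows
        key, val = line.split()
        obs[key] = float(val)
    return obs
```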
SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (IBM VERSION)
NASA Technical Reports Server (NTRS)
Manteufel, R.
1994-01-01
The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.
SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (DEC VAX VERSION)
NASA Technical Reports Server (NTRS)
Merwarth, P. D.
1994-01-01
The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.
Linking netCDF Data with the Semantic Web - Enhancing Data Discovery Across Domains
NASA Astrophysics Data System (ADS)
Biard, J. C.; Yu, J.; Hedley, M.; Cox, S. J. D.; Leadbetter, A.; Car, N. J.; Druken, K. A.; Nativi, S.; Davis, E.
2016-12-01
Geophysical data communities are publishing large quantities of data across a wide variety of scientific domains, which overlap more and more. While netCDF is a common format for many of these communities, it is only one of a large number of data storage and transfer formats. One of the major challenges ahead is finding ways to leverage these diverse data sets to advance our understanding of complex problems. We describe a methodology for incorporating Resource Description Framework (RDF) triples into netCDF files, called netCDF-LD (netCDF Linked Data). NetCDF-LD explicitly connects the contents of netCDF files (both data and metadata) with external web-based resources, including vocabularies, standards definitions, and data collections, and through them, a whole host of related information. This approach also preserves and enhances the self-describing essence of the netCDF format and its metadata, whilst addressing the challenge of integrating various conventions into files. We present a case study illustrating how reasoning over RDF graphs can empower researchers to discover datasets across domain boundaries.
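One plausible rendering of the idea (not a normative netCDF-LD encoding) is a global prefix map plus prefixed attribute values, which keeps the file self-describing while pointing outward:

```python
# Illustrative only: a prefix attribute plus a prefixed term reference.
# Attribute names here are assumptions, not the netCDF-LD convention.
import netCDF4

with netCDF4.Dataset("obs.nc", "w") as ds:
    ds.prefix_cf = "http://purl.org/voc/cf/"      # hypothetical namespace
    ds.createDimension("time", None)
    t = ds.createVariable("air_temp", "f4", ("time",))
    t.standard_name_uri = "cf:air_temperature"    # resolves via the prefix
```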
Grid Computing Application for Brain Magnetic Resonance Image Processing
NASA Astrophysics Data System (ADS)
Valdivia, F.; Crépeault, B.; Duchesne, S.
2012-02-01
This work emphasizes the use of grid computing and web technology for automatic post-processing of brain magnetic resonance images (MRI) in the context of neuropsychiatric (Alzheimer's disease) research. Post-acquisition image processing is achieved through the interconnection of several individual processes into pipelines. Each process has input and output data ports, options and execution parameters, and performs single tasks such as: a) extracting individual image attributes (e.g. dimensions, orientation, center of mass), b) performing image transformations (e.g. scaling, rotation, skewing, intensity standardization, linear and non-linear registration), c) performing image statistical analyses, and d) producing the necessary quality control images and/or files for user review. The pipelines are built to perform specific sequences of tasks on the alphanumeric data and MRIs contained in our database. The web application is coded in PHP and allows the creation of scripts to create, store and execute pipelines and their instances either on our local cluster or on high-performance computing platforms. To run an instance on an external cluster, the web application opens a communication tunnel through which it copies the necessary files, submits the execution commands and collects the results. We present results of system tests for the processing of a set of 821 brain MRIs from the Alzheimer's Disease Neuroimaging Initiative study via a nonlinear registration pipeline composed of 10 processes. Our results show successful execution on both local and external clusters, and a 4-fold increase in performance when using the external cluster. However, the latter's performance does not scale linearly, as queue waiting times and execution overhead increase with the number of tasks to be executed.
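The pipeline abstraction described here (processes with input and output ports chained in sequence) can be sketched as a driver that threads a file path through a list of commands; the command names below are placeholders, not the study's actual processes.

```python
# Conceptual pipeline runner: each stage consumes the previous output
# file and produces a new one. Commands are hypothetical placeholders.
import subprocess

def run_pipeline(stages, src):
    current = src
    for cmd, out in stages:
        subprocess.run(cmd + [current, out], check=True)
        current = out
    return current

# Example wiring (the placeholder executables would have to exist):
# run_pipeline([(["scale_mri"], "s.mnc"), (["register_mri"], "r.mnc")],
#              "subject.mnc")
```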
Schreiber, Richard; Sittig, Dean F; Ash, Joan; Wright, Adam
2017-09-01
In this report, we describe 2 instances in which expert use of an electronic health record (EHR) system interfaced to an external clinical laboratory information system led to unintended consequences wherein 2 patients failed to have laboratory tests drawn in a timely manner. In both events, user actions combined with the lack of an acknowledgment message describing the order cancellation from the external clinical system were the root causes. In 1 case, rapid, near-simultaneous order entry was the culprit; in the second, astute order management by a clinician, unaware of the lack of proper 2-way interface messaging from the external clinical system, led to the confusion. Although testing had shown that the laboratory system would cancel duplicate laboratory orders, it was thought that duplicate alerting in the new order entry system would prevent such events. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
WE-G-213CD-03: A Dual Complementary Verification Method for Dynamic Tumor Tracking on Vero SBRT.
Poels, K; Depuydt, T; Verellen, D; De Ridder, M
2012-06-01
To use complementary cine EPID and gimbals log file analysis for in-vivo tracking accuracy monitoring. A clinical prototype of dynamic tracking (DT) was installed on the Vero SBRT system. This prototype version allowed tumor tracking by gimballed linac rotations using an internal-external correspondence model. The DT prototype software allowed detailed logging of all applied gimbals rotations during tracking. The integration of an EPID on the Vero system allowed the acquisition of cine EPID images during DT. We quantified the tracking error on cine EPID (E-EPID) by subtracting the target center (fiducial marker detection) and the field centroid. Dynamic gimbals log file information was combined with orthogonal x-ray verification images to calculate the in-vivo tracking error (E-kVLog). The correlation between E-kVLog and E-EPID was calculated for validation of the gimbals log file. Further, we investigated the sensitivity of the log file tracking error by introducing predefined systematic tracking errors. As an application, we calculated the gimbals log file tracking error for dynamic hidden target tests to investigate gravity effects and the decoupling of gimbals rotation from gantry rotation. Finally, the clinical accuracy of dynamic tracking was evaluated by calculating complementary cine EPID and log file tracking errors. A strong correlation was found between the log file and cine EPID tracking error distributions during concurrent measurements (R=0.98). The gimbals log files were sensitive enough to detect a systematic tracking error down to 0.5 mm. Dynamic hidden target tests showed no gravity influence on tracking performance and a high degree of decoupling between gimbals and gantry rotation during dynamic arc dynamic tracking. Submillimetric agreement between the clinical complementary tracking error measurements was found. Redundancy between the internal gimbals log file, x-ray verification images, and complementary independent cine EPID images was implemented to monitor the accuracy of gimballed tumor tracking on Vero SBRT. Research was financially supported by the Flemish government (FWO), the Hercules Foundation and BrainLAB AG. © 2012 American Association of Physicists in Medicine.
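Numerically, the EPID tracking error is just the target centre minus the field centroid, and validation of the log file reduces to correlating the two error series. The synthetic arrays below stand in for real measurements.

```python
# Synthetic-data sketch of the E-EPID / E-kVLog comparison described above.
import numpy as np

rng = np.random.default_rng(0)
target_centre = rng.normal(0.0, 0.5, 200)                # mm, cine EPID
field_centroid = target_centre + rng.normal(0.0, 0.1, 200)
e_epid = target_centre - field_centroid                  # EPID tracking error

e_log = e_epid + rng.normal(0.0, 0.05, 200)              # log-file estimate
r = np.corrcoef(e_epid, e_log)[0, 1]
print(f"correlation R = {r:.2f}")
```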
Kfir, A; Elkes, D; Pawar, A; Weissman, A; Tsesis, I
2017-01-01
The objective of this study is to determine the potential for microcracks in the radicular dentin of first maxillary premolars using three different mechanized endodontic instrumentation systems. Eighty extracted maxillary first premolars with two root canals and no externally visible microcracks were selected. Root canal instrumentation was performed with either the ProTaper file system, the WaveOne primary file, or the self-adjusting file (SAF). Teeth with intact roots served as controls. The roots were cut into segments and examined with an intensive, small-diameter light source that was applied diagonally to the entire periphery of the root slice under ×20 magnification; the presence of microcracks and fractures was recorded. Pearson's chi-square method was used for statistical analysis, and significance was set at p < 0.05. Microcracks were present in 30 and 20 % of roots treated with the ProTaper and WaveOne systems, respectively, while no microcracks were present in the roots treated with the SAF (p = 0.008 and p = 0.035, respectively). Intact teeth presented with cracks in 5 % of the roots. The intensive, small-diameter light source revealed microcracks that could not be detected when using the microscope's light alone. Within the limitations of this study, it could be concluded that mechanized root canal instrumentation with the ProTaper and WaveOne systems in maxillary first premolars causes microcracks in the radicular dentin, while the use of the SAF file causes no such microcracks. Rotary and reciprocating files with large tapers may cause microcracks in the radicular dentin of maxillary first premolars. Less aggressive methods should be considered for these teeth.
VizieR Online Data Catalog: Planck Sunyaev-Zeldovich sources (PSZ2) (Planck+, 2016)
NASA Astrophysics Data System (ADS)
Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Arnaud, M.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Barrena, R.; Bartlett, J. G.; Bartolo, N.; Battaner, E.; Battye, R.; Benabed, K.; Benoit, A.; Benoit-Levy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bikmaev, I.; Bohringer, H.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Bucher, M.; Burenin, R.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Carvalho, P.; Catalano, A.; Challinor, A.; Chamballu, A.; Chary, R.-R.; Chiang, H. C.; Chon, G.; Christensen, P. R.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Combet, C.; Comis, B.; Couchot, F.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Dahle, H.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; De Rosa, A.; de Zotti, G.; Delabrouille, J.; Desert, F.-X.; Dickinson, C.; Diego, J. M.; Dolag, K.; Dole, H.; Donzelli, S.; Dore, O.; Douspis, M.; Ducout, A.; Dupac, X.; Efstathiou, G.; Eisenhardt, P. R. M.; Elsner, F.; Ensslin, T. A.; Eriksen, H. K.; Falgarone, E.; Fergusson, J.; Feroz, F.; Ferragamo, A.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Frejsel, A.; Galeotta, S.; Galli, S.; Ganga, K.; Genova-Santos, R. T.; Giard, M.; Giraud-Heraud, Y.; Gjerlow, E.; Gonzalez-Nuevo, J.; Gorski, K. M.; Grainge, K. J. B.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Hanson, D.; Harrison, D. L.; Hempel, A.; Henrot-Versille, S.; Hernandez-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Jin, T.; Jones, W. C.; Juvela, M.; Keihanen, E.; Keskitalo, R.; Khamitov, I.; Kisner, T. S.; Kneissl, R.; Knoche, J.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Leonardi, R.; Lesgourgues, J.; Levrier, F.; Liguori, M.; Lilje, P. B.; Linden-Vornle, M.; Lopez-Caniego, M.; Lubin, P. M.; Macias-Perez, J. F.; Maggio, G.; Maino, D.; Mak, D. S. Y.; Mandolesi, N.; Mangilli, A.; Martin, P. G.; Martinez-Gonzalez, E.; Masi, S.; Matarrese, S.; Mazzotta, P.; McGehee, P.; Mei, S.; Melchiorri, A.; Melin, J.-B.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Miville-Deschenes, M.-A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nastasi, A.; Nati, F.; Natoli, P.; Netterfield, C. B.; Norgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; Olamaie, M.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Pasian, F.; Patanchon, G.; Pearson, T. J.; Perdereau, O.; Perotto, L.; Perrott, Y. C.; Perrotta, F.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Pratt, G. W.; Prezeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Reach, W. T.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Roudier, G.; Rozo, E.; Rubino-Martin, J. A.; Rumsey, C.; Rusholme, B.; Rykoff, E. S.; Sandri, M.; Santos, D.; Saunders, R. D. E.; Savelainen, M.; Savini, G.; Schammel, M. P.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Shimwell, T. W.; Spence, R. L. D.; Stanford, S. A.; Stern, D.; Stolyarov, V.; Stompor, R.; Streblyanska, A.; Sudiwala, R.; Sunyaev, R.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. 
A.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tramonte, D.; Tristram, M.; Tucci, M.; Tuovinen, J.; Umana, G.; Valenziano, L.; Valiviita, J.; van Tent, B.; Vielva, P.; Villa, F.; Wade, L. A.; Wandelt, B. D.; Wehus, I. K.; White, S. D. M.; Wright, E. L.; Yvon, D.; Zacchei, A.; Zonca, A.
2017-01-01
Three pipelines are used to detect SZ clusters: two independent implementations of the Matched Multi-Filter (MMF1 and MMF3), and PowellSnakes (PwS). The main catalogue is constructed as the union of the catalogues from the three detection methods. The completeness and reliability of the catalogues have been assessed through internal and external validation as described in section 4 of the paper. (5 data files).
Hruban, L; Janků, P; Jordánová, K; Gerychová, R; Huser, M; Ventruba, P; Roztočil, A
2017-01-01
Evaluation of the success rate and safety of external cephalic version after 36 weeks of gestation: a retrospective analysis. Department of Obstetrics and Gynecology, Masaryk University, University Hospital Brno. A retrospective analysis of external cephalic version attempts performed on a group of 638 singleton breech pregnancies after 36 weeks of gestation in the years 2003-2016 at the Department of Gynecology and Obstetrics, Masaryk University, Brno. The effectiveness, the number and type of complications, the mode of delivery, and the perinatal outcomes were recorded. The success rate of external cephalic version from breech to cephalic presentation was 47.8% (305 cases). After a successful external cephalic version, 238 patients (78.0%) gave birth vaginally; after an unsuccessful version, 130 patients (39.0%) gave birth vaginally. The rate of serious complications did not exceed 0.9% and did not affect perinatal outcomes. Emergency cesarean deliveries related to the external cephalic version occurred in 6 cases (2 placental abruptions, 4 abnormal cardiotocography findings); the fetal outcome was good in all of these cases. No fetal death in connection with the external version occurred in our series. Spontaneous discharge of amniotic fluid within 24 hours of the procedure occurred in 5 cases (0.8%), and spontaneous onset of labor within 24 hours of the procedure occurred in 5 cases (0.8%). An umbilical artery pH < 7.00 occurred in 2 cases in the group with a successful external version and in 9 cases in the group with an unsuccessful version. A 5-minute Apgar score < 5 occurred in 1 case in each group. External cephalic version of the fetus in breech presentation after the 36th week of pregnancy is an effective and safe alternative for women who fear vaginal breech delivery. Performing external cephalic version can reduce the rate of elective cesarean sections due to breech presentation at term.
NASA Astrophysics Data System (ADS)
Zaborowicz, M.; Przybył, J.; Koszela, K.; Boniecki, P.; Mueller, W.; Raba, B.; Lewicki, A.; Przybył, K.
2014-04-01
The aim of the project was to develop software that extracts the characteristics of a greenhouse tomato from its image. Data gathered during image analysis and processing were used to build learning sets for artificial neural networks. The program can process pictures in JPEG format, acquire statistical information about each picture, and export it to an external file. The software is intended for batch analysis of the collected research material, with the obtained information saved as a CSV file. The program computes 33 independent parameters to describe each tested image. The application is dedicated to the processing and image analysis of greenhouse tomatoes, but it can also be used to analyze other fruits and vegetables of spherical shape.
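As an illustration of the kind of batch pipeline the abstract describes (the authors' own code is not given in the record), the sketch below computes a few per-channel statistics for each JPEG in a folder and writes them to a CSV file. The folder name, file pattern, and the particular statistics are assumptions; the real program extracts 33 parameters.

```python
import csv
import glob
import numpy as np
from PIL import Image

def image_stats(path):
    """Compute a few illustrative per-channel statistics for one image."""
    arr = np.asarray(Image.open(path).convert("RGB"), dtype=float)
    stats = {"file": path}
    for i, ch in enumerate("RGB"):
        stats[f"mean_{ch}"] = arr[..., i].mean()
        stats[f"std_{ch}"] = arr[..., i].std()
    return stats

# Batch-analyze every JPEG in a (hypothetical) folder and save one CSV row each.
rows = [image_stats(p) for p in sorted(glob.glob("tomatoes/*.jpg"))]
with open("tomato_stats.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
```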
The NSO FTS database program and archive (FTSDBM)
NASA Technical Reports Server (NTRS)
Lytle, D. M.
1992-01-01
Data from the NSO Fourier transform spectrometer is being re-archived from half inch tape onto write-once compact disk. In the process, information about each spectrum and a low resolution copy of each spectrum is being saved into an on-line database. FTSDBM is a simple database management program in the NSO external package for IRAF. A command language allows the FTSDBM user to add entries to the database, delete entries, select subsets from the database based on keyword values including ranges of values, create new database files based on these subsets, make keyword lists, examine low resolution spectra graphically, and make disk number/file number lists. Once the archive is complete, FTSDBM will allow the database to be efficiently searched for data of interest to the user and the compact disk format will allow random access to that data.
Torshabi, Ahmad Esmaili; Nankali, Saber
2016-01-01
In external beam radiotherapy, one of the most common and reliable methods for patient geometrical setup and/or predicting the tumor location is the use of external markers. In this study, the main challenge is increasing the accuracy of patient setup by investigating external marker locations. Since the location of each external marker may yield a different patient setup accuracy, it is important to assess different locations of external markers using appropriate selection algorithms. To do this, two commercially available algorithms, (a) canonical correlation analysis (CCA) and (b) principal component analysis (PCA), were proposed as input selection algorithms. They work on the basis of maximum correlation coefficient and minimum variance between given datasets. The proposed input selection algorithms work in combination with an adaptive neuro-fuzzy inference system (ANFIS) as a correlation model that gives patient positioning information as output. The proposed algorithms supply the input file of the ANFIS correlation model. The dataset required for this study was prepared by means of a NURBS-based 4D XCAT anthropomorphic phantom that can model the shape and structure of complex organs in the human body along with motion information of dynamic organs. Moreover, a database of four real patients undergoing radiation therapy for lung cancer was utilized for validation of the proposed strategy. The final results demonstrate that the input selection algorithms can reasonably select specific external markers in those areas of the thorax region where the root mean square error (RMSE) of the ANFIS model has minimum values. It is also found that the selected marker locations lie mostly in those areas where surface point motion has a large amplitude and a high correlation. PACS number(s): 87.55.km, 87.55.N PMID:27929479
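A minimal sketch of variance- and correlation-based input selection in the spirit described above, using scikit-learn's PCA. The file names, array shapes, and the combined ranking rule are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np
from sklearn.decomposition import PCA

# Rows are time samples; columns are candidate external-marker traces.
markers = np.load("marker_traces.npy")   # hypothetical file, shape (T, M)
tumor = np.load("tumor_trace.npy")       # hypothetical file, shape (T,)

pca = PCA(n_components=1).fit(markers)   # dominant (respiratory) mode
loading = np.abs(pca.components_[0])     # variance contribution per marker

# Correlation of each marker trace with the tumor trace.
corr = np.array([abs(np.corrcoef(markers[:, j], tumor)[0, 1])
                 for j in range(markers.shape[1])])

# Keep the few markers that score high on both criteria as model inputs.
best = np.argsort(loading * corr)[::-1][:3]
print("selected marker columns:", best)
```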
NEAMS-IPL MOOSE Midyear Framework Activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Permann, Cody; Alger, Brian; Peterson, John
The MOOSE Framework is a modular pluggable framework for building complex simulations. The ability to add new objects with custom syntax is a core capability that makes MOOSE a powerful platform for coupling multiple applications together within a single environment. The creation of a new, more standardized JSON syntax output improves the external interfaces for generating graphical components or for validating input file syntax. The design of this interface and the requirements it satisfies are covered in this short report.
Cue generation: How learners flexibly support future retrieval.
Tullis, Jonathan G; Benjamin, Aaron S
2015-08-01
The successful use of memory requires us to be sensitive to the cues that will be present during retrieval. In many situations, we have some control over the external cues that we will encounter. For instance, learners create shopping lists at home to help remember what items to later buy at the grocery store, and they generate computer file names to help remember the contents of those files. Generating cues in the service of later cognitive goals is a complex task that lies at the intersection of metacognition, communication, and memory. In this series of experiments, we investigated how and how well learners generate external mnemonic cues. Across 5 experiments, learners generated a cue for each target word in a to-be-remembered list and received these cues during a later cued recall test. Learners flexibly generated cues in response to different instructional demands and study list compositions. When generating mnemonic cues, as compared to descriptions of target items, learners produced cues that were more distinct than mere descriptions and consequently elicited greater cued recall performance than those descriptions. When learners were aware of competing targets in the study list, they generated mnemonic cues with smaller cue-to-target associative strength but that were even more distinct. These adaptations led to fewer confusions among competing targets and enhanced cued recall performance. These results provide another example of the metacognitively sophisticated tactics that learners use to effectively support future retrieval.
Lidierth, Malcolm
2005-02-15
This paper describes software that runs in the Spike2 for Windows environment and provides a versatile tool for generating stimuli during data acquisition from the 1401 family of interfaces (CED, UK). A graphical user interface (GUI) is used to provide dynamic control of stimulus timing. Both single stimuli and trains of stimuli can be generated. The pulse generation routines make use of programmable variables within the interface and allow these to be rapidly changed during an experiment. The routines therefore provide the ease-of-use associated with external, stand-alone pulse generators. Complex stimulus protocols can be loaded from an external text file, and facilities are included to create these files through the GUI. The software consists of a Spike2 script that runs in the host PC and accompanying routines, written in the 1401 sequencer control code, that run in the 1401 interface. Handshaking between the PC and the interface card is built into the routines and provides for full integration of sampling, analysis, and stimulus generation during an experiment. Control of the 1401 digital-to-analogue converters is also provided; this allows control of stimulus amplitude as well as timing, and also provides a sample-and-hold feature that may be used to remove DC offsets and drift from recorded data.
Lee, Woonghee; Kim, Jin Hae; Westler, William M; Markley, John L
2011-06-15
PONDEROSA (Peak-picking Of Noe Data Enabled by Restriction of Shift Assignments) accepts input information consisting of a protein sequence, backbone and sidechain NMR resonance assignments, and 3D-NOESY ((13)C-edited and/or (15)N-edited) spectra, and returns assignments of NOESY crosspeaks, distance and angle constraints, and a reliable NMR structure represented by a family of conformers. PONDEROSA incorporates and integrates external software packages (TALOS+, STRIDE and CYANA) to carry out different steps in the structure determination. PONDEROSA implements internal functions that identify and validate NOESY peak assignments and assess the quality of the calculated three-dimensional structure of the protein. The robustness of the analysis results from PONDEROSA's hierarchical processing steps that involve iterative interaction among the internal and external modules. PONDEROSA supports a variety of input formats: SPARKY assignment table (.shifts) and spectrum file formats (.ucsf), XEASY proton file format (.prot), and NMR-STAR format (.star). To demonstrate the utility of PONDEROSA, we used the package to determine 3D structures of two proteins: human ubiquitin and Escherichia coli iron-sulfur scaffold protein variant IscU(D39A). The automatically generated structural constraints and ensembles of conformers were as good as or better than those determined previously by much less automated means. The program, in the form of binary code along with tutorials and reference manuals, is available at http://ponderosa.nmrfam.wisc.edu/.
Dragly, Svenn-Arne; Hobbi Mobarhan, Milad; Lepperød, Mikkel E.; Tennøe, Simen; Fyhn, Marianne; Hafting, Torkel; Malthe-Sørenssen, Anders
2018-01-01
Natural sciences generate an increasing amount of data in a wide range of formats developed by different research groups and commercial companies. At the same time there is a growing desire to share data along with publications in order to enable reproducible research. Open formats have publicly available specifications which facilitate data sharing and reproducible research. Hierarchical Data Format 5 (HDF5) is a popular open format widely used in neuroscience, often as a foundation for other, more specialized formats. However, drawbacks related to HDF5's complex specification have initiated a discussion for an improved replacement. We propose a novel alternative, the Experimental Directory Structure (Exdir), an open specification for data storage in experimental pipelines which amends drawbacks associated with HDF5 while retaining its advantages. HDF5 stores data and metadata in a hierarchy within a complex binary file which, among other things, is not human-readable, not optimal for version control systems, and lacks support for easy access to raw data from external applications. Exdir, on the other hand, uses file system directories to represent the hierarchy, with metadata stored in human-readable YAML files, datasets stored in binary NumPy files, and raw data stored directly in subdirectories. Furthermore, storing data in multiple files makes it easier to track for version control systems. Exdir is not a file format in itself, but a specification for organizing files in a directory structure. Exdir uses the same abstractions as HDF5 and is compatible with the HDF5 Abstract Data Model. Several research groups are already using data stored in a directory hierarchy as an alternative to HDF5, but no common standard exists. This complicates and limits the opportunity for data sharing and development of common tools for reading, writing, and analyzing data. Exdir facilitates improved data storage, data sharing, reproducible research, and novel insight from interdisciplinary collaboration. With the publication of Exdir, we invite the scientific community to join the development to create an open specification that will serve as many needs as possible and as a foundation for open access to and exchange of data. PMID:29706879
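The on-disk layout that Exdir standardizes, directories for the hierarchy, YAML for metadata, and binary NumPy files for datasets, can be illustrated with plain NumPy and PyYAML. Note that the actual exdir Python package offers an h5py-like API; the directory and attribute names below are assumptions for illustration only.

```python
import os
import numpy as np
import yaml  # PyYAML

def create_dataset(root, group, name, data, meta=None):
    """Write an Exdir-style entry: a directory hierarchy holding a binary
    NumPy file for the data and a human-readable YAML file for metadata."""
    path = os.path.join(root, group, name)
    os.makedirs(path, exist_ok=True)
    np.save(os.path.join(path, "data.npy"), data)
    with open(os.path.join(path, "attributes.yaml"), "w") as f:
        yaml.safe_dump(meta or {}, f)

# Hypothetical example: one LFP channel stored under session1.exdir/lfp/channel_0/
create_dataset("session1.exdir", "lfp", "channel_0",
               np.random.randn(30000), {"sampling_rate_hz": 30000})
```

Because every dataset is an ordinary file in an ordinary directory, version control systems and external tools can read the raw data without any special library.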
Software Management for the NOνA Experiment
NASA Astrophysics Data System (ADS)
Davies, G. S.; Davies, J. P.; C Group; Rebel, B.; Sachdev, K.; Zirnstein, J.
2015-12-01
The NOνA software (NOνASoft) is written in C++ and built on the Fermilab Computing Division's art framework, which uses the ROOT analysis software. NOνASoft makes use of more than 50 external software packages, is developed by more than 50 developers, and is used by more than 100 physicists from over 30 universities and laboratories on three continents. The software builds are handled by Fermilab's custom version of Software Release Tools (SRT), a UNIX-based software management system for large, collaborative projects that is used by several experiments at Fermilab. The system provides software version control with SVN configured in a client-server mode and is based on code originally developed by the BaBar collaboration. In this paper, we present efforts towards distributing the NOνA software via the CernVM File System (CVMFS). We also describe our recent work to use a CMake build system and Jenkins, the open-source continuous integration system, for NOνASoft.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bosler, Peter
Stride Search provides a flexible tool for detecting storms or other extreme climate events in high-resolution climate data sets saved on uniform latitude-longitude grids in standard NetCDF format. Users provide the software a quantitative description of a meteorological event they are interested in; the software searches a data set for locations in space and time that meet the user's description. In its first stage, Stride Search performs a spatial search of the data set at each timestep by dividing a search domain into circular sectors of constant geodesic radius. Data from a NetCDF file are read into memory for each circular search sector. If the data meet or exceed a set of storm identification criteria (defined by the user), a storm is recorded to a linked list. Finally, the linked list is examined, duplicate detections of the same storm are removed, and the results are written to an output file. The first stage's output file is read by a second program that builds storm tracks. Additional identification criteria may be applied at this stage to further classify storms. Storm tracks are the software's ultimate output, and routines are provided for formatting that output for various external software libraries for plotting and tabulating data.
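A toy version of the first-stage spatial search, assuming a single vorticity-threshold criterion on a uniform latitude-longitude grid; the variable names and the criterion stand in for the user-defined storm description.

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def geodesic_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between points given in degrees (haversine)."""
    p1, l1, p2, l2 = map(np.radians, (lat1, lon1, lat2, lon2))
    h = (np.sin((p2 - p1) / 2) ** 2
         + np.cos(p1) * np.cos(p2) * np.sin((l2 - l1) / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * np.arcsin(np.sqrt(h))

def search_timestep(lats, lons, vorticity, centers, radius_km, threshold):
    """Scan circular sectors of constant geodesic radius; record a detection
    wherever the field inside a sector meets the (user-defined) criterion."""
    glon, glat = np.meshgrid(lons, lats)
    hits = []
    for clat, clon in centers:
        mask = geodesic_km(glat, glon, clat, clon) <= radius_km
        if mask.any() and vorticity[mask].max() >= threshold:
            hits.append((clat, clon, float(vorticity[mask].max())))
    return hits  # duplicates from overlapping sectors would be merged later
```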
Mitigating Errors in External Respiratory Surrogate-Based Models of Tumor Position
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malinowski, Kathleen T.; Fischell Department of Bioengineering, University of Maryland, College Park, MD; McAvoy, Thomas J.
2012-04-01
Purpose: To investigate the effect of tumor site, measurement precision, tumor-surrogate correlation, training data selection, model design, and interpatient and interfraction variations on the accuracy of external marker-based models of tumor position. Methods and Materials: CyberKnife Synchrony system log files comprising synchronously acquired positions of external markers and the tumor from 167 treatment fractions were analyzed. The accuracy of Synchrony, ordinary-least-squares regression, and partial-least-squares regression models for predicting the tumor position from the external markers was evaluated. The quantity and timing of the data used to build the predictive model were varied. The effects of tumor-surrogate correlation and the precision in both the tumor and the external surrogate position measurements were explored by adding noise to the data. Results: The tumor position prediction errors increased during the duration of a fraction. Increasing the training data quantities did not always lead to more accurate models. Adding uncorrelated noise to the external marker-based inputs degraded the tumor-surrogate correlation models by 16% for partial-least-squares and 57% for ordinary-least-squares. External marker and tumor position measurement errors led to tumor position prediction changes 0.3-3.6 times the magnitude of the measurement errors, varying widely with model algorithm. The tumor position prediction errors were significantly associated with the patient index but not with the fraction index or tumor site. Partial-least-squares was as accurate as Synchrony and more accurate than ordinary-least-squares. Conclusions: The accuracy of surrogate-based inferential models of tumor position was affected by all the investigated factors, except for the tumor site and fraction index.
VirGO: A Visual Browser for the ESO Science Archive Facility
NASA Astrophysics Data System (ADS)
Hatziminaoglou, Evanthia; Chéreau, Fabien
2009-03-01
VirGO is the next generation Visual Browser for the ESO Science Archive Facility (SAF) developed in the Virtual Observatory Project Office. VirGO enables astronomers to discover and select data easily from millions of observations in a visual and intuitive way. It allows real-time access and the graphical display of a large number of observations by showing instrumental footprints and image previews, as well as their selection and filtering for subsequent download from the ESO SAF web interface. It also permits the loading of external FITS files or VOTables, as well as the superposition of Digitized Sky Survey images to be used as background. All data interfaces are based on Virtual Observatory (VO) standards that allow access to images and spectra from external data centres, and interaction with the ESO SAF web interface or any other VO applications.
Development of a HACS User Interface Module.
1981-09-30
Integration of external metadata into the Earth System Grid Federation (ESGF)
NASA Astrophysics Data System (ADS)
Berger, Katharina; Levavasseur, Guillaume; Stockhause, Martina; Lautenschlager, Michael
2015-04-01
International projects with high data volumes usually disseminate their data in a federated data infrastructure, e.g., the Earth System Grid Federation (ESGF). The ESGF aims to make the geographically distributed data seamlessly discoverable and accessible. Additional data-related information is currently collected and stored in separate repositories by each data provider. This scattered but useful information is not, or only partly, available to ESGF users. Examples of such additional information systems are ES-DOC/metafor for model and simulation information, IPSL's versioning information, CHARMe for user annotations, DKRZ's quality information, and data citation information. The ESGF Quality Control working team (esgf-qcwt) aims to integrate these valuable pieces of additional information into the ESGF in order to make them available to users and data archive managers by (i) integrating external information into the ESGF portal, (ii) integrating links to external information objects into the ESGF metadata index, e.g. by the use of PIDs (Persistent IDentifiers), and (iii) automating the collection of external information during the ESGF data publication process. For the sixth phase of CMIP (Coupled Model Intercomparison Project), the ESGF metadata index is to be enriched by additional information on data citation, file version, etc. This information will support users directly and can be automatically exploited by higher-level services (human and machine readability).
Understanding and Analyzing Latency of Near Real-time Satellite Data
NASA Astrophysics Data System (ADS)
Han, W.; Jochum, M.; Brust, J.
2016-12-01
Acquiring and disseminating time-sensitive satellite data in a timely manner is a major concern for researchers and decision makers in weather forecasting, severe weather warning, disaster and emergency response, environmental monitoring, and other fields. Understanding and analyzing the latency of near real-time satellite data helps to trace the whole data transmission flow, identify possible issues, and better connect data providers and users. The STAR (Center for Satellite Applications and Research of NOAA) Central Data Repository (SCDR) is a central repository that acquires, manipulates, and disseminates various types of near real-time satellite datasets to internal and external users. In this system, important timestamps, including observation beginning/end, processing, uploading, downloading, and ingestion, are retrieved and organized in the database, so the duration of each transmission phase can be determined easily. The open-source NoSQL database MongoDB was selected to manage the timestamp information because of its dynamic schema and its aggregation and data-processing features. A user-friendly interface was developed to visualize and characterize the latency interactively. Taking the Himawari-8 HSD (Himawari Standard Data) file as an example, the data transmission phases, including creating the HSD file from satellite observations, uploading the file to HimawariCloud, updating the file link on the webpage, and downloading and ingesting the file into SCDR, are derived from these timestamps. Latencies can be viewed by time period, day of week, or hour of day in chart or table format, and anomalous latencies can be detected and reported through the user interface. Latency analysis gives data providers and users actionable insight into how to improve the transmission of near real-time satellite data and enhance its acquisition and management.
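A sketch of how per-phase latencies might be derived from such timestamps with MongoDB's aggregation framework via pymongo; the database, collection, and field names are hypothetical.

```python
from pymongo import MongoClient

coll = MongoClient()["scdr"]["himawari_hsd"]  # hypothetical database/collection

# Compute per-phase latencies (in ms) from stored timestamps and average
# them by hour of day; the timestamp field names here are illustrative.
pipeline = [
    {"$project": {
        "hour": {"$hour": "$observation_end"},
        "upload_ms": {"$subtract": ["$uploaded", "$observation_end"]},
        "ingest_ms": {"$subtract": ["$ingested", "$downloaded"]},
    }},
    {"$group": {
        "_id": "$hour",
        "avg_upload_ms": {"$avg": "$upload_ms"},
        "avg_ingest_ms": {"$avg": "$ingest_ms"},
    }},
    {"$sort": {"_id": 1}},
]
for row in coll.aggregate(pipeline):
    print(row)  # one summary document per hour of day
```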
Maynard, Charles; Trivedi, Ranak; Nelson, Karin; Fihn, Stephan D
2018-03-26
The association between disability and cause of death in Veterans with service-connected disabilities has not been studied. The objective of this study was to compare age at death, military service and disability characteristics, including disability rating, and cause of death by year of birth. We also examined cause of death for specific service-connected conditions. This study used information from the VETSNET file, which is a snapshot of selected items from the Veterans Benefits Administration corporate database. We also used the National Death Index (NDI) for Veterans which is part of the VA Suicide Data Repository. In VETSNET, there were 758,324 Veterans who had a service-connected condition and died between the years 2004 and 2014. Using the scrambled social security number to link the two files resulted in 605,493 (80%) deceased Veterans. Age at death, sex, and underlying cause of death were obtained from the NDI for Veterans and military service characteristics and types of disability were acquired from VETSNET. We constructed age categories corresponding to period of service; birth years 1938 and earlier corresponded to Korea and World War II ("oldest"), birth years 1939-1957 to the Vietnam era ("middle"), and birth years 1958 and later to post Vietnam, Gulf War, and the more recent conflicts in Iraq and Afghanistan ("youngest"). Sixty-two percent were in the oldest age category, 34% in the middle group, and 4% in the youngest one. The overall age at death was 75 ± 13 yr. Only 1.6% of decedents were women; among women 25% were in the youngest age group, while among men only 4% were in the youngest group. Most decedents were enlisted personnel, and 60% served in the U.S. Army. Nearly 61% had a disability rating of >50% and for the middle age group 54% had a disability rating of 100%. The most common service-connected conditions were tinnitus, hearing loss, and post-traumatic stress disorder (PTSD). In the oldest group, nearly half of deaths were due to cancer or cardiovascular conditions and <2% were due to external causes. In the youngest group, cardiovascular disease and cancer accounted for about 1/3 of deaths, whereas external causes or deaths due to accidents, suicide, or assault accounted for nearly 33% of deaths. For Veterans with service-connected PTSD or major depression; 6.5% of deaths were due to external causes whereas for Veterans without these conditions, only 3.1% were due to external causes. The finding of premature death due to external causes in the youngest age group as well as the finding of higher proportions of external causes in those with PTSD or major depression should be of great concern to those who care for Veterans.
The mechanical forces in katydid sound production
NASA Astrophysics Data System (ADS)
Xiao, Huaping; Chiu, Cheng-Wei; Zhou, Yan; He, Xingliang; Epstein, Ben; Liang, Hong
2013-10-01
Katydids and crickets generate their characteristic calling sound by rubbing their wings together. The mechanisms of the rubbing force, however, have not been extensively studied, and how the mechanical force changes with external parameters (speed and applied load) during stridulation has not been reported. Our current study aims to investigate the mechanical forces of katydid stridulation. Four pairs of files and plectrums from a katydid, which are responsible for the katydid's sound production, were examined with a specially designed experimental configuration. Due to the asymmetric nature of the wing motion in opening and closing, the contact between the plectrum and file resembles that of a ratchet. Multiple frequencies were generated during experimental wing rubbing, so that a calling-like sound was produced. Results showed that the morphology of the plectrum/file contact has significant effects on the mechanical forces induced on the wings and the resulting sound production. The roles of the mechanical forces include sound generation, tone modification, and energy consumption. The findings in this work reveal how the mechanical force varies with sliding speed and applied load. The frequency and amplitude of the sound wave produced in the tribo-test are close to those under natural conditions. By mimicking the microstructure of the plectrum and file teeth, acoustic instruments with a high mechanical energy conversion rate can be developed. Our results provide new approaches for the design and improvement of micro-machines for acoustic applications, as well as for hybrid robotic systems.
An expert system shell for inferring vegetation characteristics
NASA Technical Reports Server (NTRS)
Harrison, P. Ann; Harrison, Patrick R.
1993-01-01
The NASA VEGetation Workbench (VEG) is a knowledge based system that infers vegetation characteristics from reflectance data. VEG is described in detail in several references. The first generation version of VEG was extended. In the first year of this contract, an interface to a file of unknown cover type data was constructed. An interface that allowed the results of VEG to be written to a file was also implemented. A learning system that learned class descriptions from a data base of historical cover type data and then used the learned class descriptions to classify an unknown sample was built. This system had an interface that integrated it into the rest of VEG. The VEG subgoal PROPORTION.GROUND.COVER was completed and a number of additional techniques that inferred the proportion ground cover of a sample were implemented. This work was previously described. The work carried out in the second year of the contract is described. The historical cover type database was removed from VEG and stored as a series of flat files that are external to VEG. An interface to the files was provided. The framework and interface for two new VEG subgoals that estimate the atmospheric effect on reflectance data were built. A new interface that allows the scientist to add techniques to VEG without assistance from the developer was designed and implemented. A prototype Help System that allows the user to get more information about each screen in the VEG interface was also added to VEG.
NASA Technical Reports Server (NTRS)
Fong, Danny; Odell, Dorice; Barry, Peter; Abrahamian, Tomik
2008-01-01
This software provides internal, automated searching of GIDEP (Government-Industry Data Exchange Program) Alert data imported from the GIDEP government Web site. The batching tool allows the import of a single parts list in tab-delimited text format into the local JPL GIDEP database. Delimiters are removed from every part number, and both the original part numbers with delimiters and the newly generated list without them are compared. The two lists are run against the GIDEP imports, and any matches are output. This feature only works with Netscape 2.0 or greater, or Internet Explorer 4.0 or greater. The user selects the browser button to choose a text file to import. When the submit button is pressed, the script imports alerts from the text file into the local JPL GIDEP database. This batch tool provides complete in-house control over exported material and data for automated batch matching. The batching tool can match the parts list against database tables and yields results that aid further research and analysis. This provides more control over GIDEP information for metrics, and reports information not provided by the government site. The software yields results quickly and gives more control over external data from the government site, making it possible to generate reports not available from the external source. There is enough space to store years of data. The program relates to risk identification and management with regard to projects and GIDEP alert information encompassing flight parts for space exploration.
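The delimiter-stripping comparison described above can be sketched as follows; the file layout (part number in the first tab-delimited column) and the function names are assumptions.

```python
import re

def normalize(part_number):
    """Strip delimiters (dashes, slashes, spaces, dots) from a part number."""
    return re.sub(r"[^A-Za-z0-9]", "", part_number).upper()

def match_alerts(parts_file, alert_numbers):
    """Compare both the original and the delimiter-stripped part numbers
    against the imported alert list, mirroring the batching tool's idea."""
    alerts = {normalize(a) for a in alert_numbers}
    matches = []
    with open(parts_file) as f:          # tab-delimited parts list
        for line in f:
            original = line.split("\t")[0].strip()
            if original in alert_numbers or normalize(original) in alerts:
                matches.append(original)
    return matches
```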
SOSPEX, an interactive tool to explore SOFIA spectral cubes
NASA Astrophysics Data System (ADS)
Fadda, Dario; Chambers, Edward T.
2018-01-01
We present SOSPEX (SOFIA SPectral EXplorer), an interactive tool to visualize and analyze spectral cubes obtained with the FIFI-LS and GREAT instruments onboard the SOFIA Infrared Observatory. This software package is written in Python 3 and is available either through GitHub or Anaconda. Through this GUI it is possible to explore directly the spectral cubes produced by the SOFIA pipeline and archived in the SOFIA Science Archive. Spectral cubes are visualized showing their spatial and spectral dimensions in two different windows. By selecting a part of the spectrum, the flux from the corresponding slice of the cube is visualized in the spatial window. On the other hand, it is possible to define apertures on the spatial window to show the corresponding spectral energy distribution in the spectral window. Flux isocontours can be overlapped on external images in the spatial window, while line names, atmospheric transmission, or external spectra can be overplotted on the spectral window. Atmospheric models with specific parameters can be retrieved, compared to the spectra, and applied to the uncorrected FIFI-LS cubes in the cases where the standard values give unsatisfactory results. Subcubes can be selected and saved as FITS files by cropping or cutting the original cubes. Lines and continuum can be fitted in the spectral window, saving the results in JSON files which can be reloaded later. Finally, in the case of spatially extended observations, it is possible to compute spectral moments as a function of position to obtain velocity dispersion maps or velocity diagrams.
An Interactive, Design and Educational Tool for Supersonic External-Compression Inlets
NASA Technical Reports Server (NTRS)
Benson, Thomas J.
1994-01-01
A workstation-based interactive design tool called VU-INLET was developed for the inviscid flow in rectangular, supersonic, external-compression inlets. VU-INLET solves for the flow conditions from free stream, through the supersonic compression ramps, across the terminal normal shock region and the subsonic diffuser to the engine face. It calculates the shock locations, the capture streamtube, and the additive drag of the inlet. The inlet geometry can be modified using a graphical user interface and the new flow conditions recalculated interactively. Free stream conditions and engine airflow can also be interactively varied and off-design performance evaluated. Flow results from VU-INLET can be saved to a file for a permanent record, and a series of help screens make the simulator easy to learn and use. This paper will detail the underlying assumptions of the models and the numerical methods used in the simulator.
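For context, the terminal-normal-shock step that such an inlet tool must evaluate follows the standard compressible-flow relations; the minimal sketch below is a generic illustration, not VU-INLET's code.

```python
def normal_shock(m1, gamma=1.4):
    """Downstream Mach number and static pressure ratio across a normal
    shock, from the standard one-dimensional gas-dynamics relations."""
    m2 = ((1 + 0.5 * (gamma - 1) * m1**2)
          / (gamma * m1**2 - 0.5 * (gamma - 1))) ** 0.5
    p_ratio = 1 + 2 * gamma / (gamma + 1) * (m1**2 - 1)
    return m2, p_ratio

# For a Mach 2 free stream: subsonic exit (~0.577) and p2/p1 = 4.5.
print(normal_shock(2.0))
```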
NASA Astrophysics Data System (ADS)
Yao, K. L.; Li, Y. C.; Sun, X. Z.; Liu, Q. M.; Qin, Y.; Fu, H. H.; Gao, G. Y.
2005-10-01
By using the density matrix renormalization group (DMRG) method for the one-dimensional (1D) Hubbard model, we have studied the von Neumann entropy of a quantum system, which describes the entanglement between the system block and the rest of the chain. It is found that there is a close relation between the entanglement entropy and the properties of the system. Hole-doping can alter the charge-charge and spin-spin interactions, resulting in charge polarization along the chain. By comparing the results before and after the doping, we find that doping favors an increase of the von Neumann entropy and thus also favors the exchange of information along the chain. Furthermore, we calculated the spin and entropy distributions in an external magnetic field. It is confirmed that both the charge-charge and the spin-spin interactions affect the exchange of information along the chain, making the entanglement entropy redistribute.
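The quantity being computed, the von Neumann entropy of the system block, follows directly from the eigenvalues of the reduced density matrix; a small illustrative sketch, independent of any particular DMRG implementation:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho log rho), evaluated from the eigenvalues of the
    reduced density matrix of the system block."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]                 # drop numerically zero weights
    return float(-np.sum(w * np.log(w)))

# A maximally entangled two-level block gives S = log(2) ~ 0.693.
rho = np.diag([0.5, 0.5])
print(von_neumann_entropy(rho))
```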
Brownian Dynamics simulations of model colloids in channel geometries and external fields
NASA Astrophysics Data System (ADS)
Siems, Ullrich; Nielaba, Peter
2018-04-01
We review the results of Brownian Dynamics simulations of colloidal particles in external fields confined in channels. Super-paramagnetic Brownian particles are well-suited two-dimensional model systems for a variety of problems on different length scales, ranging from pedestrians walking through a bottleneck to ions passing through ion channels in living cells. In such systems, confinement into channels can have a great influence on the diffusion and transport properties. In particular, we discuss the crossover from single-file diffusion in a narrow channel to diffusion in the extended two-dimensional system. For this purpose, a new algorithm for computing the mean square displacement (MSD) on logarithmic time scales is presented. In a different study, interacting colloidal particles were dragged over a washboard potential while additionally confined in a two-dimensional micro-channel. In this system, kink and anti-kink solitons determine the depinning process of the particles from the periodic potential.
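A minimal sketch of evaluating the MSD at logarithmically spaced lag times, which is the idea behind the algorithm mentioned above (the paper's actual algorithm is not reproduced here):

```python
import numpy as np

def msd_log(positions, n_lags=30):
    """Mean square displacement at logarithmically spaced lag times,
    avoiding the cost of evaluating every lag at long times."""
    T = len(positions)
    lags = np.unique(np.logspace(0, np.log10(T - 1), n_lags).astype(int))
    msd = [np.mean(np.sum((positions[lag:] - positions[:-lag]) ** 2, axis=1))
           for lag in lags]
    return lags, np.array(msd)

# A free 2D Brownian trajectory: the MSD should grow linearly with lag.
steps = np.random.normal(size=(100000, 2))
lags, msd = msd_log(np.cumsum(steps, axis=0))
```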
Xu, Fang; Wallace, Robyn C.; Garvin, William; Greenlund, Kurt J.; Bartoli, William; Ford, Derek; Eke, Paul; Town, G. Machell
2016-01-01
Public health researchers have used a class of statistical methods to calculate prevalence estimates for small geographic areas with few direct observations. Many researchers have used Behavioral Risk Factor Surveillance System (BRFSS) data as a basis for their models. The aims of this study were to 1) describe a new BRFSS small area estimation (SAE) method and 2) investigate the internal and external validity of the BRFSS SAEs it produced. The BRFSS SAE method uses 4 data sets (the BRFSS, the American Community Survey Public Use Microdata Sample, Nielsen Claritas population totals, and the Missouri Census Geographic Equivalency File) to build a single weighted data set. Our findings indicate that internal and external validity tests were successful across many estimates. The BRFSS SAE method is one of several methods that can be used to produce reliable prevalence estimates in small geographic areas. PMID:27418213
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Malley, Daniel; Vesselinov, Velimir V.
MADSpython (Model analysis and decision support tools in Python) is a Python code that streamlines the process of using data and models for analysis and decision support with the code MADS. MADS is open-source code developed at LANL and written in C/C++ (MADS; http://mads.lanl.gov; LA-CC-11-035). MADS can work with external models of arbitrary complexity as well as built-in models of flow and transport in porous media. The Python scripts in MADSpython facilitate the generation of the input and output files needed by MADS as well as by the external simulators, which include FEHM and PFLOTRAN. MADSpython enables a number of data- and model-based analyses including model calibration, sensitivity analysis, uncertainty quantification, and decision analysis. MADSpython will be released under the GPL v3 license and will be distributed as a Git repo at gitlab.com and github.com. The MADSpython manual and documentation will be posted at http://madspy.lanl.gov.
Fernández, José M; Valencia, Alfonso
2004-10-12
Downloading the information stored in relational databases into XML and other flat formats is a common task in bioinformatics. This periodic dumping of information requires considerable CPU time, disk, and memory resources. YAdumper has been developed as a purpose-specific tool to handle the complete, structured download of information from relational databases. YAdumper is a Java application that organizes database extraction following an XML template based on an external Document Type Declaration. Compared with other non-native alternatives, YAdumper substantially reduces memory requirements and considerably improves writing performance.
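The general pattern, pulling rows from a relational database and emitting XML, can be sketched in a few lines. YAdumper itself is a Java application driven by a DTD-based template, so the database, table, and element names below are purely illustrative.

```python
import sqlite3
import xml.etree.ElementTree as ET

conn = sqlite3.connect("proteins.db")        # hypothetical database
root = ET.Element("proteins")

# Stream rows from a (hypothetical) table into XML elements.
for acc, seq in conn.execute("SELECT accession, sequence FROM protein"):
    entry = ET.SubElement(root, "protein", accession=acc)
    ET.SubElement(entry, "sequence").text = seq

ET.ElementTree(root).write("dump.xml", encoding="utf-8", xml_declaration=True)
```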
Saada: A Generator of Astronomical Database
NASA Astrophysics Data System (ADS)
Michel, L.
2011-11-01
Saada transforms a set of heterogeneous FITS files or VOTables of various categories (images, tables, spectra, etc.) into a powerful database deployed on the Web. Databases are located on your host and stay independent of any external server. This job doesn't require writing code. Saada can mix data of various categories in multiple collections. Data collections can be linked to each other, creating relevant browsing paths and allowing data-mining-oriented queries. Saada supports four VO services (spectra, images, sources, and TAP). Data collections can be published immediately after the deployment of the Web interface.
Extremal Theory for Stochastic Processes.
1985-11-01
[Scanned abstract largely garbled; recoverable fragments:] ... a Poisson process whose intensity measure has the Laplace transform $L(f) = \exp(-\lambda(1 - e^{-f}))$, whereas a compound Poisson process has the Laplace transform (2.3.1) ... (see Example 2.2.4 as an illustration of this). The result is a clustering of exceedances, leading to a compounding of events in the limiting point process.
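Given the garbling, the following is a hedged reconstruction of the standard point-process forms the passage appears to quote; the exact notation of the report is not recoverable from the scan.

```latex
% Laplace functional of a Poisson process N with intensity measure \lambda:
\[
  L_N(f) = \exp\!\Big(-\!\int \big(1 - e^{-f(x)}\big)\,\lambda(dx)\Big),
\]
% whereas a compound Poisson process, with point multiplicities distributed
% according to \pi_j, has
\[
  L(f) = \exp\!\Big(-\!\int \sum_{j \ge 1} \big(1 - e^{-j f(x)}\big)\,
         \pi_j \,\lambda(dx)\Big).
\]
```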
External Data and Attribute Hyperlink Programs for Promis*e(Registered Trademark)
NASA Technical Reports Server (NTRS)
Derengowski, Rich; Gruel, Andrew
2001-01-01
External Data and Attribute Hyperlink are computer programs that can be added to Promis*e(trademark) which is a commercial software system that automates routine tasks in the design (including drawing schematic diagrams) of electrical control systems. The programs were developed under the Stennis Space Center's (SSC) Dual Use Technology Development Program to provide capabilities for SSC's BMCS configuration management system which uses Promis*e(trademark). The External Data program enables the storage and management of information in an external database linked to a drawing. Changes can be made either in the database or on the drawing. Information that originates outside Promis*e(trademark) can be stored in custom fields that can be added to the database. Although this information is not available in Promis*e(trademark) printed drawings, it can be associated with symbols in the drawings, and can be retrieved through the drawings when the software is running. The Attribute Hyperlink program enables the addition of hyperlink information as attributes of symbols. This program enables the formation of a direct hyperlink between a schematic diagram and an Internet site or a file on a compact disk, on the user's hard drive, or on another computer on a network to which the user's computer is connected. The user can then obtain information directly related to the part (e.g., maintenance, or troubleshooting information) associated with the hyperlink.
Garcia, Andres; Evans, James W.
2017-04-03
In this paper, we consider a variety of diffusion-mediated processes occurring within linear nanopores, but which involve coupling to an equilibrated external fluid through adsorption and desorption. By determining adsorption and desorption rates through a set of tailored simulations, and by exploiting a spatial Markov property of the models, we develop a formulation for performing efficient pore-only simulations of these processes. Coupling to the external fluid is described exactly through appropriate nontrivial boundary conditions at the pore openings. This formalism is applied to analyze the following: (i) tracer counter permeation (TCP) where different labeled particles adsorb into opposite ends of the pore and establish a nonequilibrium steady state; (ii) tracer exchange (TE) with exchange of differently labeled particles within and outside the pore; (iii) catalytic conversion reactions where a reactant in the external fluid adsorbs into the pore and converts to a product which may desorb. The TCP analysis also generates a position-dependent generalized tracer diffusion coefficient, the form of which controls behavior in the TE and catalytic conversion processes. We focus on the regime of single-file diffusion within the pore which produces the strongest correlations and largest deviations from mean-field type behavior. Finally, behavior is quantified precisely via kinetic Monte Carlo simulations but is also captured with appropriate analytic treatments.
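A toy kinetic Monte Carlo model of the single-file regime with adsorption and desorption only at the pore openings; this is a simplification of the models treated in the paper, and the rates and lattice size are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def single_file_kmc(L=50, steps=200000, p_ads=0.1, p_des=0.1):
    """Toy lattice model of single-file diffusion in a pore coupled to an
    external fluid: particles hop to empty neighbor sites (no passing) and
    adsorb/desorb only at the two pore openings."""
    occ = np.zeros(L, dtype=bool)
    for _ in range(steps):
        i = rng.integers(-1, L + 1)           # -1 and L address the reservoirs
        if i in (-1, L):                      # adsorption attempt at an end
            end = 0 if i == -1 else L - 1
            if not occ[end] and rng.random() < p_ads:
                occ[end] = True
        elif occ[i]:
            if i in (0, L - 1) and rng.random() < p_des:
                occ[i] = False                # desorption into the fluid
            else:
                j = i + rng.choice((-1, 1))   # blocked hops are rejected
                if 0 <= j < L and not occ[j]:
                    occ[i], occ[j] = False, True
    return occ.mean()                         # steady-state coverage

print(single_file_kmc())
```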
Mass storage technology in networks
NASA Astrophysics Data System (ADS)
Ishii, Katsunori; Takeda, Toru; Itao, Kiyoshi; Kaneko, Reizo
1990-08-01
Trends and features of mass storage subsystems in networks are surveyed and their key technologies spotlighted. Storage subsystems are becoming increasingly important in new network systems in which communications and data processing are systematically combined. These systems require a new class of high-performance mass-information storage in order to effectively utilize their processing power. The requirements of high transfer rates, high transaction rates, and large storage capacities, coupled with high functionality, fault tolerance, and flexibility in configuration, are major challenges in storage subsystems. Recent progress in optical disk technology has improved the performance of on-line external memories based on optical disk drives, which are competing with mid-range magnetic disks. Optical disks are more effective than magnetic disks for low-traffic random-access files storing multimedia data that require large capacity, such as in archival use and in information distribution by ROM disks. Finally, image-coded document file servers for local-area-network use that employ 130-mm rewritable magneto-optical disk subsystems are demonstrated.
Design and Execution of make-like, distributed Analyses based on Spotify’s Pipelining Package Luigi
NASA Astrophysics Data System (ADS)
Erdmann, M.; Fischer, B.; Fischer, R.; Rieger, M.
2017-10-01
In high-energy particle physics, workflow management systems are primarily used as tailored solutions in dedicated areas such as Monte Carlo production. However, physicists performing data analyses are usually required to steer their individual workflows manually, which is time-consuming and often leads to undocumented relations between particular workloads. We present a generic analysis design pattern that copes with the sophisticated demands of end-to-end HEP analyses and provides a make-like execution system. It is based on the open-source pipelining package Luigi which was developed at Spotify and enables the definition of arbitrary workloads, so-called Tasks, and the dependencies between them in a lightweight and scalable structure. Further features are multi-user support, automated dependency resolution and error handling, central scheduling, and status visualization in the web. In addition to already built-in features for remote jobs and file systems like Hadoop and HDFS, we added support for WLCG infrastructure such as LSF and CREAM job submission, as well as remote file access through the Grid File Access Library. Furthermore, we implemented automated resubmission functionality, software sandboxing, and a command line interface with auto-completion for a convenient working environment. For the implementation of a $t\bar{t}H$ cross section measurement, we created a generic Python interface that provides programmatic access to all external information such as datasets, physics processes, statistical models, and additional files and values. In summary, the setup enables the execution of the entire analysis in a parallelized and distributed fashion with a single command.
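A minimal Luigi example of the Task/requires/output pattern the setup builds on; the task names and file targets here are invented, and the real analysis adds remote targets, sandboxing, and WLCG job submission on top of this.

```python
import luigi

class Selection(luigi.Task):
    dataset = luigi.Parameter()

    def output(self):
        return luigi.LocalTarget(f"selected_{self.dataset}.txt")

    def run(self):
        with self.output().open("w") as f:
            f.write(f"selected events for {self.dataset}\n")

class Histogram(luigi.Task):
    dataset = luigi.Parameter()

    def requires(self):
        # Dependencies are declared here and resolved automatically.
        return Selection(dataset=self.dataset)

    def output(self):
        return luigi.LocalTarget(f"hist_{self.dataset}.txt")

    def run(self):
        with self.input().open() as fin, self.output().open("w") as fout:
            fout.write("histogram of " + fin.read())

if __name__ == "__main__":
    # Completed targets are skipped on re-runs, giving make-like behavior.
    luigi.build([Histogram(dataset="ttH_2016")], local_scheduler=True)
```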
CHARACTERIZATION OF EXPOSURES TO WORKERS COVERED UNDER THE U.S. ENERGY EMPLOYEES COMPENSATION ACT
Neton, James W.
2015-01-01
Since the mid-1940s, hundreds of thousands of workers have been engaged in nuclear weapons-related activities for the U.S. Department of Energy (DOE) and its predecessor agencies. In 2000, Congress promulgated the Energy Employees Occupational Illness Compensation Program Act of 2000 (EEOICPA), which provides monetary compensation and medical benefits to certain energy employees who have developed cancer. Under Part B of EEOICPA, the National Institute for Occupational Safety and Health (NIOSH) is required to estimate radiation doses for those workers who have filed a claim, or whose survivors have filed a claim, under Part B of the Act. To date, over 39,000 dose reconstructions have been completed for workers from more than 200 facilities. These reconstructions have included assessment of both internal and external exposure at all major DOE facilities, as well as at a large number of private companies [known as Atomic Weapons Employer (AWE) facilities in the Act] that engaged in contract work for the DOE and its predecessor agencies. To complete these dose reconstructions, NIOSH has captured and reviewed thousands of historical documents related to site operations and worker/workplace monitoring practices at these facilities. Using the data collected and reviewed pursuant to NIOSH’s role under EEOICPA, this presentation will characterize historical internal and external exposures received by workers at DOE and AWE facilities. To the extent possible, use will be made of facility specific coworker models to highlight changes in exposure patterns over time. In addition, the effects that these exposures have on compensation rates for workers are discussed. PMID:24378500
The SCALE Verified, Archived Library of Inputs and Data - VALID
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, William BJ J; Rearden, Bradley T
The Verified, Archived Library of Inputs and Data (VALID) at ORNL contains high quality, independently reviewed models and results that improve confidence in analysis. VALID is developed and maintained according to a procedure of the SCALE quality assurance (QA) plan. This paper reviews the origins of the procedure and its intended purpose, the philosophy of the procedure, some highlights of its implementation, and the future of the procedure and associated VALID library. The original focus of the procedure was the generation of high-quality models that could be archived at ORNL and applied to many studies. The review process associated with model generation minimized the chances of errors in these archived models. Subsequently, the scope of the library and procedure was expanded to provide high quality, reviewed sensitivity data files for deployment through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE). Sensitivity data files for approximately 400 such models are currently available. The VALID procedure and library continue fulfilling these multiple roles. The VALID procedure is based on the quality assurance principles of ISO 9001 and nuclear safety analysis. Some of these key concepts include: independent generation and review of information, generation and review by qualified individuals, use of appropriate references for design data and documentation, and retrievability of the models, results, and documentation associated with entries in the library. Some highlights of the detailed procedure are discussed to provide background on its implementation and to indicate limitations of data extracted from VALID for use by the broader community. Specifically, external users of data generated within VALID must take responsibility for ensuring that the files are used within the QA framework of their organization and that use is appropriate. The future plans for the VALID library include expansion to include additional experiments from the IHECSBE, to include experiments from areas beyond criticality safety, such as reactor physics and shielding, and to include application models. In the future, external SCALE users may also obtain qualification under the VALID procedure and be involved in expanding the library. The VALID library provides a pathway for the criticality safety community to leverage modeling and analysis expertise at ORNL.
Sandia Advanced MEMS Design Tools, Version 2.2.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yarberry, Victor; Allen, James; Lantz, Jeffery
2010-01-19
The Sandia National Laboratories Advanced MEMS Design Tools, Version 2.2.5, is a collection of menus, prototype drawings, and executables that provide significant productivity enhancements when using AutoCAD to design MEMS components. This release is designed for AutoCAD 2000i, 2002, or 2004 and is supported under Windows NT 4.0, Windows 2000, or XP. SUMMiT V (Sandia Ultra-planar Multi-level MEMS Technology) is a 5-level surface-micromachine fabrication technology, which customers internal and external to Sandia can access to fabricate prototype MEMS devices. This CD contains an integrated set of electronic files that: a) describe the SUMMiT V fabrication process; b) facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library). New features in this version: AutoCAD 2004 support has been added. SafeExplode: a new feature that explodes blocks without affecting polylines (avoids exploding polylines into objects that are ignored by the DRC and visualization tools). Layer control menu: a pull-down menu for selecting layers to isolate, freeze, or thaw. Updated tools: a check has been added to catch invalid block names. DRC features: added username/password validation and a method to update the user's password. SNL_DRC_WIDTH: a value to control the width of the DRC error lines. SNL_BIAS_VALUE: a value used to offset selected geometry. SNL_PROCESS_NAME: a value to specify the process name. Documentation changes: the documentation has been updated to include the new features. While there exist some files on the CD that are used in conjunction with the software package AutoCAD, these files are not intended for use independent of the CD. Note that the customer must purchase his/her own copy of AutoCAD to use with these files.
Chow, James C.L.; Grigorov, Grigor N.; Yazdani, Nuri
2006-01-01
A custom‐made computer program, SWIMRT, to construct “multileaf collimator (MLC) machine” file for intensity‐modulated radiotherapy (IMRT) fluence maps was developed using MATLAB® and the sliding window algorithm. The user can either import a fluence map with a graphical file format created by an external treatment‐planning system such as Pinnacle3 or create his or her own fluence map using the matrix editor in the program. Through comprehensive calibrations of the dose and the dimension of the imported fluence field, the user can use associated image‐processing tools such as field resizing and edge trimming to modify the imported map. When the processed fluence map is suitable, a “MLC machine” file is generated for our Varian 21 EX linear accelerator with a 120‐leaf Millennium MLC. This machine file is transferred to the MLC console of the LINAC to control the continuous motions of the leaves during beam irradiation. An IMRT field is then irradiated with the 2D intensity profiles, and the irradiated profiles are compared to the imported or modified fluence map. This program was verified and tested using film dosimetry to address the following uncertainties: (1) the mechanical limitation due to the leaf width and maximum traveling speed, and (2) the dosimetric limitation due to the leaf leakage/transmission and penumbra effect. Because the fluence map can be edited, resized, and processed according to the requirement of a study, SWIMRT is essential in studying and investigating the IMRT technique using the sliding window algorithm. Using this program, future work on the algorithm may include redistributing the time space between segmental fields to enhance the fluence resolution, and readjusting the timing of each leaf during delivery to avoid small fields. Possible clinical utilities and examples for SWIMRT are given in this paper. PACS numbers: 87.53.Kn, 87.53.St, 87.53.Uv PMID:17533330
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, F.; Brown, K.; Flach, G.
The goal of the Cementitious Barriers Partnership (CBP) is to develop a reasonable and credible set of software tools to predict the structural, hydraulic, and chemical performance of cement barriers used in nuclear applications over extended time frames (greater than 100 years for operating facilities and greater than 1000 years for waste management). The simulation tools will be used to evaluate and predict the behavior of cementitious barriers used in near surface engineered waste disposal systems including waste forms, containment structures, entombments, and environmental remediation. These cementitious materials are exposed to dynamic environmental conditions that cause changes in material properties via (i) aging, (ii) chloride attack, (iii) sulfate attack, (iv) carbonation, (v) oxidation, and (vi) primary constituent leaching. A set of state-of-the-art software tools has been selected as a starting point to capture these important aging and degradation phenomena. Integration of existing software developed by the CBP partner organizations was determined to be the quickest method of meeting the CBP goal of providing a computational tool that improves the prediction of the long-term behavior of cementitious materials. These partner codes were selected based on their maturity and ability to address the problems outlined above. The GoldSim Monte Carlo simulation program (GTG 2010a, GTG 2010b) was chosen as the code integration platform (Brown & Flach 2009b). GoldSim (current Version 10.5) is a Windows-based graphical object-oriented computer program that provides a flexible environment for model development (Brown & Flach 2009b). The linking of GoldSim to external codes has previously been successfully demonstrated (Eary 2007, Mattie et al. 2007). GoldSim is capable of performing deterministic and probabilistic simulations and of modeling radioactive decay and constituent transport. As part of the CBP project, a general Dynamic Link Library (DLL) interface was developed to link GoldSim with external codes (Smith III et al. 2010). The DLL uses a list of code inputs provided by GoldSim to create an input file for the external application, runs the external code, and returns a list of outputs (read from files created by the external application) back to GoldSim. In this way GoldSim provides: (1) a unified user interface to the applications, (2) the capability of coupling selected codes in a synergistic manner, and (3) the capability of performing probabilistic uncertainty analysis with the codes. GoldSim is made available by the GoldSim Technology Group as a free 'Player' version that allows running but not editing GoldSim models. The player version makes the software readily available to a wider community of users who wish to use the CBP application but do not have a license for GoldSim.
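The wrapper pattern this abstract describes (the integrator hands a list of inputs to an interface layer, which drives the external code and returns its outputs) can be sketched generically. The following is a minimal illustration only, not the CBP DLL; the executable name "partner_code" and both file formats are hypothetical placeholders.

```python
# Sketch of the input-file/run/output-file round trip described above.
import subprocess
from pathlib import Path

def run_external_code(inputs, workdir="run01", exe="partner_code"):
    work = Path(workdir)
    work.mkdir(exist_ok=True)
    # 1. Create an input file from the caller-supplied list of inputs.
    (work / "model.inp").write_text(
        "\n".join(f"{name} = {value}" for name, value in inputs.items()))
    # 2. Run the external application and wait for completion.
    subprocess.run([exe, "model.inp"], cwd=work, check=True)
    # 3. Read the outputs the external code wrote and return them as a list.
    return [float(v) for v in (work / "model.out").read_text().split()]
```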
Multi-centre audit of VMAT planning and pre-treatment verification.
Jurado-Bruggeman, Diego; Hernández, Victor; Sáez, Jordi; Navarro, David; Pino, Francisco; Martínez, Tatiana; Alayrach, Maria-Elena; Ailleres, Norbert; Melero, Alejandro; Jornet, Núria
2017-08-01
We performed a multi-centre intercomparison of VMAT dose planning and pre-treatment verification. The aims were to analyse the dose plans in terms of dosimetric quality and deliverability, and to validate whether in-house pre-treatment verification results agreed with those of an external audit. The nine participating centres encompassed different machines, equipment, and methodologies. Two mock cases (prostate and head and neck) were planned using one and two arcs. A plan quality index was defined to compare the plans and different complexity indices were calculated to check their deliverability. We compared gamma index pass rates using the centre's equipment and methodology to those of an external audit (global 3D gamma, absolute dose differences, 10% of maximum dose threshold). Log-file analysis was performed to look for delivery errors. All centres fulfilled the dosimetric goals but plan quality and delivery complexity were heterogeneous and uncorrelated, depending on the manufacturer and the planner's methodology. Pre-treatment verifications results were within tolerance in all cases for gamma 3%-3mm evaluation. Nevertheless, differences between the external audit and in-house measurements arose due to different equipment or methodology, especially for 2%-2mm criteria with differences up to 20%. No correlation was found between complexity indices and verification results amongst centres. All plans fulfilled dosimetric constraints, but plan quality and complexity did not correlate and were strongly dependent on the planner and the vendor. In-house measurements cannot completely replace external audits for credentialing. Copyright © 2017 Elsevier B.V. All rights reserved.
Momentum Management Tool for Low-Thrust Missions
NASA Technical Reports Server (NTRS)
Swenka, Edward R.; Smith, Brett A.; Vanelli, Charles A.
2010-01-01
A momentum management tool was designed for the Dawn low-thrust interplanetary spacecraft en route to the asteroids Vesta and Ceres, in an effort to better understand the early creation of the solar system. Momentum must be managed to ensure the spacecraft has enough control authority to perform necessary turns and hold a fixed inertial attitude against external torques. Along with torques from solar pressure and gravity-gradients, ion-propulsion engines produce a torque about the thrust axis that must be countered by the four reaction wheel assemblies (RWA). MomProf is a ground operations tool built to address these concerns. The momentum management tool was developed during initial checkout and early cruise, and has been refined to accommodate a wide range of momentum-management issues. With every activity or sequence, wheel speeds and momentum state must be checked to avoid undesirable conditions and use of consumables. MomProf was developed to operate in the MATLAB environment. All data are loaded into MATLAB as a structure to provide consistent access to all inputs by individual functions within the tool. Used in its most basic application, the Dawn momentum tool uses the basic principle of angular momentum conservation, computing momentum in the body frame, and RWA wheel speeds, for all given orientations in the input file. MomProf was designed specifically to be able to handle the changing external torques and frequent desaturations. Incorporating significant external torques adds complexity since there are various external torques that act under different operational modes.
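The basic bookkeeping described here (body-frame momentum computed from RWA wheel speeds via angular momentum conservation, with a redundant four-wheel set) can be sketched in a few lines. This is a rough illustration only; the tool itself is MATLAB-based, and the wheel axes and inertia below are invented placeholders, not Dawn's actual configuration.

```python
import numpy as np

# Hypothetical RWA spin-axis unit vectors in the body frame (4 wheels, 3x4)
# and an assumed common wheel inertia in kg*m^2.
AXES = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
                 [0.577, 0.577, 0.577]]).T
I_WHEEL = 0.16

def body_momentum(wheel_speeds_rad_s):
    """Angular momentum stored in the wheels, expressed in the body frame."""
    return AXES @ (I_WHEEL * np.asarray(wheel_speeds_rad_s))

def wheel_speeds_for(h_body):
    """Least-squares wheel speeds realizing a desired body-frame momentum
    (the 4-wheel set is redundant, so use the pseudo-inverse)."""
    return np.linalg.pinv(AXES) @ np.asarray(h_body) / I_WHEEL
```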
Zhang, Hui; Ren, Ji-Xia; Kang, Yan-Li; Bo, Peng; Liang, Jun-Yu; Ding, Lan; Kong, Wei-Bao; Zhang, Ji
2017-08-01
Toxicological testing associated with developmental toxicity endpoints is very expensive, time consuming and labor intensive. Thus, developing alternative approaches for developmental toxicity testing is an important and urgent task in the drug development field. In this investigation, the naïve Bayes classifier was applied to develop a novel prediction model for developmental toxicity. The established prediction model was evaluated by internal 5-fold cross validation and an external test set. The overall prediction accuracies for the internal 5-fold cross validation of the training set and the external test set were 96.6% and 82.8%, respectively. In addition, four simple descriptors and some representative substructures of developmental toxicants were identified. We therefore hope the established in silico prediction model can be used as an alternative method for toxicological assessment. The molecular information obtained could afford a deeper understanding of developmental toxicants and provide guidance for medicinal chemists working in drug discovery and lead optimization. Copyright © 2017 Elsevier Inc. All rights reserved.
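The evaluation scheme described (internal 5-fold cross validation plus an external test set) is a standard pattern; a minimal sketch with scikit-learn follows, using random placeholder descriptor data rather than the paper's dataset or its specific naïve Bayes variant.

```python
# Sketch of internal 5-fold CV plus external-test-set evaluation.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(200, 4)), rng.integers(0, 2, 200)
X_test, y_test = rng.normal(size=(60, 4)), rng.integers(0, 2, 60)

clf = GaussianNB()
cv_acc = cross_val_score(clf, X_train, y_train, cv=5).mean()  # internal 5-fold CV
test_acc = clf.fit(X_train, y_train).score(X_test, y_test)    # external test set
print(f"5-fold CV accuracy: {cv_acc:.3f}, external test accuracy: {test_acc:.3f}")
```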
Development of a mini-mobile digital radiography system by using wireless smart devices.
Jeong, Chang-Won; Joo, Su-Chong; Ryu, Jong-Hyun; Lee, Jinseok; Kim, Kyong-Woo; Yoon, Kwon-Ha
2014-08-01
Current technology trends in digital radiography (DR) are toward systems using portable smart devices for patient-centered care. We aimed to develop a mini-mobile DR system using smart devices with wireless connection to medical information systems. We developed a mini-mobile DR system consisting of an X-ray source and a Complementary Metal-Oxide Semiconductor (CMOS) sensor based on a flat panel detector for small-field diagnostics in patients; it can be used for examinations that are difficult to perform with a fixed traditional device. We also designed a method for the embedded systems used in the development of portable DR systems. The external interface used the fast and stable IEEE 802.11n wireless protocol, and we adapted the device for connections with the Picture Archiving and Communication System (PACS) and smart devices. The smart device could display images on an external monitor other than the monitor in the DR system. The communication modules, main control board, and external interface supporting smart devices were implemented. Further, a smart viewer based on the external interface was developed to display image files on various smart devices. In addition, operators can reduce their radiation dose by using remote smart devices. The system is integrated with smart devices that can provide X-ray imaging services anywhere, permitting image observation on a smart device from a remote location through the external interface. We evaluated the response time of the mini-mobile DR system in comparison with mobile PACS. The experimental results show that our system outperforms conventional mobile PACS in this regard.
Kim, Hyeon-Cheol; Lee, Min-Ho; Yum, Jiwan; Versluis, Antheunis; Lee, Chan-Joo; Kim, Byung-Min
2010-07-01
Nickel-titanium (NiTi) rotary files can produce cleanly tapered canal shapes with a low tendency to transport the canal lumen. Because NiTi instruments are generally perceived to carry a high fracture risk during use, new designs have been marketed to lower fracture risk. However, these design variations may also alter the forces on a root during instrumentation and increase dentinal defects that predispose a root to fracture. This study compared the stress conditions during rotary instrumentation in a curved root for three NiTi file designs. Stresses were calculated using finite element (FE) analysis. FE models of ProFile (Dentsply Maillefer, Ballaigues, Switzerland; U-shaped cross-section and constant 6% tapered shaft), ProTaper Universal (Dentsply; convex triangular cross-section with notch and progressive-taper shaft), and LightSpeed LSX (Lightspeed Technology, Inc, San Antonio, TX; noncutting round shaft) were rotated within a curved root canal. The stress and strain conditions resulting from the simulated shaping action were evaluated in the apical root dentin. ProTaper Universal induced the highest von Mises stress concentration in the root dentin and had the highest tensile and compressive principal strain components at the external root surface. The calculated stress values for ProTaper Universal, which had the largest shaft taper, approached the strength properties of dentin. LightSpeed generated the lowest stresses. The stiffer file designs generated higher stress concentrations in the apical root dentin during shaping of the curved canal, which raises the risk of dentinal defects that may lead to apical root cracking. Thus, stress levels during shaping and fracture susceptibility after shaping vary with instrument design. Copyright 2010 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
Data Flow for the TERRA-REF project
NASA Astrophysics Data System (ADS)
Kooper, R.; Burnette, M.; Maloney, J.; LeBauer, D.
2017-12-01
The Transportation Energy Resources from Renewable Agriculture Phenotyping Reference Platform (TERRA-REF) program aims to identify crop traits that are best suited to producing high-energy sustainable biofuels and match those plant characteristics to their genes to speed the plant breeding process. One tool used to achieve this goal is a high-throughput phenotyping robot outfitted with sensors and cameras to monitor the growth of 1.25 acres of sorghum. Data types range from hyperspectral imaging to 3D reconstructions and thermal profiles, all at 1mm resolution. This system produces thousands of daily measurements with high spatiotemporal resolution. The team at NCSA processes, annotates, organizes and stores the massive amounts of data produced by this system - up to 5 TB per day. Data from the sensors is streamed to a local gantry-cache server. The standardized sensor raw data stream is automatically and securely delivered to NCSA using Globus Connect service. Once files have been successfully received by the Globus endpoint, the files are removed from the gantry-cache server. As each dataset arrives or is created the Clowder system automatically triggers different software tools to analyze each file, extract information, and convert files to a common format. Other tools can be triggered to run after all required data is uploaded. For example, a stitched image of the entire field is created after all images of the field become available. Some of these tools were developed by external collaborators based on predictive models and algorithms, others were developed as part of other projects and could be leveraged by the TERRA project. Data will be stored for the lifetime of the project and is estimated to reach 10 PB over 3 years. The Clowder system, BETY and other systems will allow users to easily find data by browsing or searching the extracted information.
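The trigger pattern described (each arriving file fires type-specific processing, and a whole-field job runs only once all expected inputs are present) can be sketched schematically. This is illustrative dispatch logic, not the Clowder API; the extractor names are invented.

```python
# Sketch of per-file extractor dispatch plus an "all inputs present" trigger.
from pathlib import Path

EXTRACTORS = {".bin": ["hyperspectral_to_netcdf"],  # hypothetical extractor names
              ".jpg": ["stitch_candidate"]}
received = set()

def on_file_arrival(path, expected_for_field):
    """Call for each new file; expected_for_field is the set of filenames
    needed before the full-field job (e.g. stitching) may run."""
    received.add(Path(path).name)
    for extractor in EXTRACTORS.get(Path(path).suffix, []):
        print(f"trigger {extractor} on {path}")   # convert / extract metadata
    if expected_for_field <= received:            # all field images present
        print("trigger full-field stitching")
```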
Characterization of exposures to workers covered under the U.S. Energy Employees Compensation Act.
Neton, James W
2014-02-01
Since the mid-1940s, hundreds of thousands of workers have been engaged in nuclear weapons-related activities for the U.S. Department of Energy (DOE) and its predecessor agencies. In 2000, Congress promulgated the Energy Employees Occupational Illness Compensation Program Act of 2000 (EEOICPA), which provides monetary compensation and medical benefits to certain energy employees who have developed cancer. Under Part B of EEOICPA, the National Institute for Occupational Safety and Health (NIOSH) is required to estimate radiation doses for those workers who have filed a claim, or whose survivors have filed a claim, under Part B of the Act. To date, over 39,000 dose reconstructions have been completed for workers from more than 200 facilities. These reconstructions have included assessment of both internal and external exposure at all major DOE facilities, as well as at a large number of private companies [known as Atomic Weapons Employer (AWE) facilities in the Act] that engaged in contract work for the DOE and its predecessor agencies. To complete these dose reconstructions, NIOSH has captured and reviewed thousands of historical documents related to site operations and worker/workplace monitoring practices at these facilities. Using the data collected and reviewed pursuant to NIOSH's role under EEOICPA, this presentation will characterize historical internal and external exposures received by workers at DOE and AWE facilities. To the extent possible, use will be made of facility specific coworker models to highlight changes in exposure patterns over time. In addition, the effects that these exposures have on compensation rates for workers are discussed.Introduction of Characterization of Exposures to Workers (Video 1:59, http://links.lww.com/HP/A3).
An Accurate and Dynamic Computer Graphics Muscle Model
NASA Technical Reports Server (NTRS)
Levine, David Asher
1997-01-01
A computer based musculo-skeletal model was developed at the University in the departments of Mechanical and Biomedical Engineering. This model accurately represents human shoulder kinematics. The result of this model is the graphical display of bones moving through an appropriate range of motion based on inputs of EMGs and external forces. The need existed to incorporate a geometric muscle model in the larger musculo-skeletal model. Previous muscle models did not accurately represent muscle geometries, nor did they account for the kinematics of tendons. This thesis covers the creation of a new muscle model for use in the above musculo-skeletal model. This muscle model was based on anatomical data from the Visible Human Project (VHP) cadaver study. Two-dimensional digital images from the VHP were analyzed and reconstructed to recreate the three-dimensional muscle geometries. The recreated geometries were smoothed, reduced, and sliced to form data files defining the surfaces of each muscle. The muscle modeling function opened these files during run-time and recreated the muscle surface. The modeling function applied constant volume limitations to the muscle and constant geometry limitations to the tendons.
An XML-based Generic Tool for Information Retrieval in Solar Databases
NASA Astrophysics Data System (ADS)
Scholl, Isabelle F.; Legay, Eric; Linsolas, Romain
This paper presents the current architecture of the `Solar Web Project' now in its development phase. This tool will provide scientists interested in solar data with a single web-based interface for browsing distributed and heterogeneous catalogs of solar observations. The main goal is to have a generic application that can be easily extended to new sets of data or to new missions with a low level of maintenance. It is developed with Java and XML is used as a powerful configuration language. The server, independent of any database scheme, can communicate with a client (the user interface) and several local or remote archive access systems (such as existing web pages, ftp sites or SQL databases). Archive access systems are externally described in XML files. The user interface is also dynamically generated from an XML file containing the window building rules and a simplified database description. This project is developed at MEDOC (Multi-Experiment Data and Operations Centre), located at the Institut d'Astrophysique Spatiale (Orsay, France). Successful tests have been conducted with other solar archive access systems.
BOREAS HYD-2 Estimated Snow Water Equivalent (SWE) from Microwave Measurements
NASA Technical Reports Server (NTRS)
Powell, Hugh; Chang, Alfred T. C.; Hall, Forrest G. (Editor); Knapp, David E. (Editor); Smith, David E. (Technical Monitor)
2000-01-01
The surface meteorological data collected at the Boreal Ecosystem-Atmosphere Study (BOREAS) tower and ancillary sites are being used as inputs to an energy balance model to monitor the amount of snow storage in the boreal forest region. The BOREAS Hydrology (HYD)-2 team used Snow Water Equivalent (SWE) derived from an energy balance model and in situ observed SWE to compare the SWE inferred from airborne and spaceborne microwave data, and to assess the accuracy of microwave retrieval algorithms. The major external measurements that are needed are snowpack temperature profiles, in situ snow areal extent, and SWE data. The data in this data set were collected during February 1994 and cover portions of the Southern Study Area (SSA), Northern Study Area (NSA), and the transect areas. The data are available from BORIS as comma-delimited tabular ASCII files. The SWE data are available from the Earth Observing System Data and Information System (EOSDIS) Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC). The data files are available on a CD-ROM (see document number 20010000884).
Martinez-Cuezva, Alberto; Pastor, Aurelia; Cioncoloni, Giacomo; Orenes, Raul-Angel; Alajarin, Mateo; Symes, Mark D.
2015-01-01
A cyclic network of chemical reactions has been conceived for exchanging the dynamic behaviour of di(acylamino)pyridine-based rotaxanes and surrogates. X-ray diffraction studies revealed the intercomponent interactions in these interlocked compounds and were consistent with those found in solution by dynamic NMR experiments. This particular binding site was incorporated into a molecular shuttle enabled for accessing two states with an outstanding positional discrimination through chemical manipulation. Furthermore, the ability of the di(acylamino)pyridine domain to associate with external binders with a complementary array of HB donor and acceptor sites was exploited for the advance of an unprecedented electrochemical switch operating through a reversible anion radical recognition process. PMID:28706682
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pelaia, II, Thomas A.
2014-06-05
It is common for facilities to have a lobby with a display loop while also requiring an option for guided tours. Existing solutions have required expensive hardware and awkward software. Our solution is relatively low cost, as it runs on an iPad connected to an external monitor, and our software provides an intuitive touch interface. The media files are downloaded from a web server onto the device, allowing a mobile option (e.g. displays at conferences). Media may include arbitrary sequences of images, movies or PDF documents. Tour guides can select different tracks of slides to display, and the presentation will return to the default loop after a timeout.
NASA Astrophysics Data System (ADS)
Rahman, Md Mushfiqur; Lei, Yu; Kalantzis, Georgios
2018-01-01
Quality Assurance (QA) for a medical linear accelerator (linac) is one of the primary concerns in external beam radiation therapy. Continued advancements in clinical accelerators and computer control technology make QA procedures more complex and time consuming, often requiring dedicated software together with specific phantoms. To ameliorate that matter, we introduce QALMA (Quality Assurance for Linac with MATLAB), a MATLAB toolkit which aims to simplify the quantitative analysis of linac QA, including Star-Shot analysis, the Picket Fence test, the Winston-Lutz test, Multileaf Collimator (MLC) log file analysis and verification of the light & radiation field coincidence test.
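Of the listed tests, the picket-fence analysis lends itself to a compact illustration: locate the bright "picket" stripes in an imager array and convert their centres to millimetres. The following is a minimal sketch under stated assumptions (vertical stripes, a profile that starts below threshold), not QALMA's MATLAB code, and the pixel scale is a placeholder.

```python
# Sketch of locating picket (MLC gap) positions in a 2D image array.
import numpy as np

def picket_positions(image, mm_per_px=0.5):
    profile = image.mean(axis=0)                 # average across leaf pairs
    bright = profile > 0.5 * profile.max()       # bright picket regions
    edges = np.flatnonzero(np.diff(bright.astype(int)))
    starts, stops = edges[::2], edges[1::2]      # assumes alternating rise/fall
    centres_px = (starts + stops) / 2.0          # centre of each picket
    return centres_px * mm_per_px                # compare against nominal gaps

# positions = picket_positions(np.load("epid_image.npy"))  # hypothetical input
```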
A high performance hierarchical storage management system for the Canadian tier-1 centre at TRIUMF
NASA Astrophysics Data System (ADS)
Deatrich, D. C.; Liu, S. X.; Tafirout, R.
2010-04-01
We describe in this paper the design and implementation of Tapeguy, a high performance non-proprietary Hierarchical Storage Management (HSM) system which is interfaced to dCache for efficient tertiary storage operations. The system has been successfully implemented at the Canadian Tier-1 Centre at TRIUMF. The ATLAS experiment will collect a large amount of data (approximately 3.5 Petabytes each year). An efficient HSM system will play a crucial role in the success of the ATLAS Computing Model, which is driven by intensive large-scale data analysis activities that will be performed on the Worldwide LHC Computing Grid infrastructure continuously. Tapeguy is Perl-based. It controls and manages data and tape libraries. Its architecture is scalable and includes Dataset Writing control, a Read-back Queuing mechanism and I/O tape drive load balancing, as well as on-demand allocation of resources. A central MySQL database records metadata information for every file and transaction (for audit and performance evaluation), as well as an inventory of library elements. Tapeguy Dataset Writing was implemented to group files which are close in time and of similar type. Optional dataset path control dynamically allocates tape families and assigns tapes to them. Tape flushing is based on various strategies: time, threshold or external callback mechanisms. Tapeguy Read-back Queuing reorders all read requests by using an elevator algorithm, avoiding unnecessary tape loading and unloading. Implementation of priorities will guarantee file delivery to all clients in a timely manner.
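The read-back reordering idea is simple to illustrate: group queued requests by tape and sort by position on tape, so each tape is mounted once and read in a single sweep. This is a minimal sketch of the elevator-style ordering, not Tapeguy's Perl implementation.

```python
# Sketch of elevator-style reordering of tape read requests.
def elevator_order(requests):
    """requests: iterable of (tape_id, position_on_tape, filename) tuples."""
    return sorted(requests, key=lambda r: (r[0], r[1]))

queue = [("T07", 310, "f3"), ("T02", 55, "f1"), ("T07", 12, "f2")]
print(elevator_order(queue))  # T02 first, then T07 read front-to-back
```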
MolabIS--an integrated information system for storing and managing molecular genetics data.
Truong, Cong V C; Groeneveld, Linn F; Morgenstern, Burkhard; Groeneveld, Eildert
2011-10-31
Long-term sample storage, tracing of data flow and data export for subsequent analyses are of great importance in genetics studies. Therefore, molecular labs do need a proper information system to handle an increasing amount of data from different projects. We have developed a molecular labs information management system (MolabIS). It was implemented as a web-based system allowing the users to capture original data at each step of their workflow. MolabIS provides essential functionality for managing information on individuals, tracking samples and storage locations, capturing raw files, importing final data from external files, searching results, accessing and modifying data. Further important features are options to generate ready-to-print reports and convert sequence and microsatellite data into various data formats, which can be used as input files in subsequent analyses. Moreover, MolabIS also provides a tool for data migration. MolabIS is designed for small-to-medium sized labs conducting Sanger sequencing and microsatellite genotyping to store and efficiently handle a relatively large amount of data. MolabIS not only helps to avoid time consuming tasks but also ensures the availability of data for further analyses. The software is packaged as a virtual appliance which can run on different platforms (e.g. Linux, Windows). MolabIS can be distributed to a wide range of molecular genetics labs since it was developed according to a general data model. Released under GPL, MolabIS is freely available at http://www.molabis.org.
THERMINATOR 2: THERMal heavy IoN generATOR 2
NASA Astrophysics Data System (ADS)
Chojnacki, Mikołaj; Kisiel, Adam; Florkowski, Wojciech; Broniowski, Wojciech
2012-03-01
We present an extended version of THERMINATOR, a Monte Carlo event generator dedicated to studies of the statistical production of particles in relativistic heavy-ion collisions. The package is written in C++ and uses the CERN ROOT data-analysis environment. The largely increased functionality of the code contains the following main features: 1) The possibility of input of any shape of the freeze-out hypersurface and the expansion velocity field, including the 3+1-dimensional profiles, in particular those generated externally with various hydrodynamic codes. 2) The hypersurfaces may have variable thermal parameters, which allow studies departing significantly from the mid-rapidity region where the baryon chemical potential becomes large. 3) We include a library of standard sets of hypersurfaces and velocity profiles describing the RHIC Au + Au data at √s_NN = 200 GeV for various centralities, as well as those anticipated for the LHC Pb + Pb collisions at √s_NN = 5.5 TeV. 4) A separate code, FEMTO-THERMINATOR, is provided to carry out the analysis of the pion-pion femtoscopic correlations which are an important source of information concerning the size and expansion of the system. 5) We also include several useful scripts that carry out auxiliary tasks, such as obtaining an estimate of the number of elastic collisions after the freeze-out, counting of particles flowing back into the fireball and violating causality (typically very few), or visualizing various results: the particle p_T-spectra, the elliptic flow coefficients, and the HBT correlation radii.
Program summary
Program title: THERMINATOR 2
Catalogue identifier: ADXL_v2_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXL_v2_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 423 444
No. of bytes in distributed program, including test data, etc.: 2 854 602
Distribution format: tar.gz
Programming language: C++ with the CERN ROOT libraries, BASH shell
Computer: Any with a C++ compiler and the CERN ROOT environment, ver. 5.26 or later; tested with Intel Core2 Duo CPU E8400 @ 3 GHz, 4 GB RAM
Operating system: Linux Ubuntu 10.10 x64 (gcc 4.4.5), ROOT 5.26; Linux Ubuntu 11.04 x64 (gcc Ubuntu/Linaro 4.5.2-8ubuntu4), ROOT 5.30/00 (compiled from source); Linux CentOS 5.2 (gcc Red Hat 4.1.2-42), ROOT 5.30/00 (compiled from source); Mac OS X 10.6.8 (i686-apple-darwin10-g++-4.2.1), ROOT 5.30/00 (for Mac OS X 10.6 x86-64 with gcc 4.2.1); cygwin-1.7.9-1 (gcc gcc4-g++-4.3.4-4), ROOT 5.30/00 (for cygwin gcc 4.3)
RAM: 30 MB (therm2_events), 150 MB (therm2_femto)
Classification: 11.2
Catalogue identifier of previous version: ADXL_v1_0
Journal reference of previous version: Comput. Phys. Comm. 174 (2006) 669
External routines: CERN ROOT (http://root.cern.ch/drupal/)
Does the new version supersede the previous version?: Yes
Nature of problem: Particle production via statistical hadronization in relativistic heavy-ion collisions.
Solution method: Monte Carlo simulation, analyzed with ROOT.
Reasons for new version: The increased functionality of the code contains the following important features. The input of any shape of the freeze-out hypersurface and the expansion velocity field, including the 3+1-dimensional profiles, in particular those generated externally with the various popular hydrodynamic codes. The hypersurfaces may have variable thermal parameters, which allows for studies departing significantly from the mid-rapidity region. We include a library of standard sets of hypersurfaces and velocity profiles describing the RHIC Au + Au and the LHC Pb + Pb data. A separate code, FEMTO-THERMINATOR, is provided to carry out the analysis of femtoscopic correlations.
Summary of revisions: THERMINATOR 2 incorporates major revisions to encompass the enhanced functionality. Classes: the Integrator class has been expanded and a new subgroup of classes defined. Model and abstract class: these classes are responsible for the physical models of the freeze-out process. The functionality and readability of the code have been substantially increased by implementing each freeze-out model in a different class. The Hypersurface class was added to handle the input from hydrodynamic codes. The hydro input is passed to the program as a lattice of the freeze-out hypersurface; that information is stored in the .xml files. Input: THERMINATOR 2 programs are now controlled by *.ini type files. The program parameters and the freeze-out model parameters are now in separate ini files. Output: the event files generated by the therm2_events program are not backward compatible with the previous version. The event*.root file structure was expanded with two new TTree structures. From the particle entry it is possible to back-trace the whole cascade. Event text output is now optional. The ROOT macros produce the *.eps figures with physics results, e.g. the pT-spectra, the elliptic-flow coefficient, rapidity distributions, etc. The THERMINATOR HBT package creates the ROOT files femto*.root (therm2_femto) and hbtfit*.root (therm2_hbtfit). Directory structure: the directory structure has been reorganized. Source code resides in the build directory. The freeze-out model input files, event files and ROOT macros are stored separately. The THERMINATOR 2 system, after installation, is able to run on a cluster.
Scripts: the package contains a few BASH scripts helpful when running; e.g., on a cluster the whole system can be executed via a single script.
Additional comments: Typical data file size (default configuration): 45 MB/500 events; 35 MB/correlation file (one k bin); 45 kB/fit file (projections and fits).
Running time: default configuration at 3 GHz: primordial multiplicities 70 min (calculated only once per case); 8 min/500 events; 10 min to draw all figures; 25 min/one k bin in the HBT analysis with 5000 events.
BiGG: a Biochemical Genetic and Genomic knowledgebase of large scale metabolic reconstructions
2010-01-01
Background: Genome-scale metabolic reconstructions under the Constraint Based Reconstruction and Analysis (COBRA) framework are valuable tools for analyzing the metabolic capabilities of organisms and interpreting experimental data. As the number of such reconstructions and analysis methods increases, there is a greater need for data uniformity and ease of distribution and use. Description: We describe BiGG, a knowledgebase of Biochemically, Genetically and Genomically structured genome-scale metabolic network reconstructions. BiGG integrates several published genome-scale metabolic networks into one resource with standard nomenclature, which allows components to be compared across different organisms. BiGG can be used to browse model content, visualize metabolic pathway maps, and export SBML files of the models for further analysis by external software packages. Users may follow links from BiGG to several external databases to obtain additional information on genes, proteins, reactions, metabolites and citations of interest. Conclusions: BiGG addresses a need in the systems biology community to have access to high quality curated metabolic models and reconstructions. It is freely available for academic use at http://bigg.ucsd.edu. PMID:20426874
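Consuming such an SBML export in an external package can be as simple as the following sketch with python-libsbml; the model filename is a placeholder for a file downloaded from BiGG.

```python
# Sketch of reading a BiGG SBML export with python-libsbml.
import libsbml

doc = libsbml.readSBML("e_coli_core.xml")  # hypothetical downloaded model
if doc.getNumErrors() > 0:
    doc.printErrors()
model = doc.getModel()
print(model.getId(), "-", model.getNumReactions(), "reactions,",
      model.getNumSpecies(), "species")
```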
VirGO: A Visual Browser for the ESO Science Archive Facility
NASA Astrophysics Data System (ADS)
Chéreau, Fabien
2012-04-01
VirGO is the next generation Visual Browser for the ESO Science Archive Facility developed by the Virtual Observatory (VO) Systems Department. It is a plug-in for the popular open source software Stellarium adding capabilities for browsing professional astronomical data. VirGO gives astronomers the possibility to easily discover and select data from millions of observations in a new visual and intuitive way. Its main feature is to perform real-time access and graphical display of a large number of observations by showing instrumental footprints and image previews, and to allow their selection and filtering for subsequent download from the ESO SAF web interface. It also allows the loading of external FITS files or VOTables, the superimposition of Digitized Sky Survey (DSS) background images, and the visualization of the sky in a 'real life' mode as seen from the main ESO sites. All data interfaces are based on Virtual Observatory standards which allow access to images and spectra from external data centers, and interaction with the ESO SAF web interface or any other VO applications supporting the PLASTIC messaging system.
Concerted orientation induced unidirectional water transport through nanochannels.
Wan, Rongzheng; Lu, Hangjun; Li, Jinyuan; Bao, Jingdong; Hu, Jun; Fang, Haiping
2009-11-14
The dynamics of water inside nanochannels is of great importance for biological activities as well as for the design of molecular sensors, devices, and machines, particularly for sea water desalination. When confined in specially sized nanochannels, water molecules form a single-file structure with concerted dipole orientations, which collectively flip between the directions along and against the nanotube axis. In this paper, by using molecular dynamics simulations, we observed a net flux along the dipole orientation, without any application of an external electric field or external pressure difference, during the time periods when the dipoles of the molecules were concertedly oriented along or against the nanotube axis. We found that this unidirectional water transport resulted from the asymmetric potential of water-water interaction along the nanochannel, which originated from the concerted dipole orientation of the water molecules that breaks the symmetry of the water orientation distribution along the channel within a finite time period. This finding suggests a new mechanism for achieving high-flux water transport, which may be useful for nanotechnology and biological applications.
Variable polarity plasma arc welding on the Space Shuttle external tank
NASA Technical Reports Server (NTRS)
Nunes, A. C., Jr.; Bayless, E. O., Jr.; Jones, C. S., III; Munafo, P. M.; Biddle, A. P.; Wilson, W. A.
1984-01-01
Variable polarity plasma arc (VPPA) techniques used at NASA's Marshall Space Flight Center for the fabrication of the Space Shuttle External Tank are presented. The high plasma arc jet velocities of 300-2000 m/s are produced by heating the plasma gas as it passes through a constraining orifice, with the plasma arc torch becoming a miniature jet engine. As compared to the GTA jet, the VPPA has the following advantages: (1) it is less sensitive to contamination, (2) it produces a more symmetrical fusion zone, and (3) it achieves greater joint penetration. The VPPA welding system is computerized, operating with a microprocessor to set welding variables in accordance with set-point inputs, controlling the manipulator and wire feeder as well as torch control and power supply. Some other advantages of the VPPA welding technique are: reduction in weld repair costs by elimination of porosity; reduction of joint preparation costs through elimination of the need to scrape or file faying surfaces; reduction in depeaking costs; and eventual reduction of the 100-percent X-ray inspection requirements. The paper includes a series of schematic and block diagrams.
NASA Astrophysics Data System (ADS)
Kashina, M. A.; Alabuzhev, A. A.
2018-02-01
The dynamics of an incompressible fluid drop under a non-uniform electric field are considered. The drop is bounded axially by two parallel solid plates, and the case of heterogeneous plates is investigated. The external electric field acts as an external force that drives motion of the contact line. We assume alternating current, with an AC field amplitude that is a spatially non-uniform function. In equilibrium, the drop has the form of a circular cylinder, and the equilibrium contact angle is 0.5π. In order to describe this contact line motion, the modified Hocking boundary condition is applied: the velocity of the contact line is proportional to the deviation of the contact angle and to the speed of the fast relaxation processes, whose frequency is proportional to twice the frequency of the electric field. The Hocking parameter depends on the polar angle, i.e. the coefficient of interaction between the plate and the fluid (the contact line) is a function of the plate coordinates. This function is expanded in a series of Laplace operator eigenfunctions.
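Read literally, the boundary condition described can be transcribed as follows. This is a hedged reconstruction with assumed symbols (ζ for the free-surface deviation, α for the polar angle, Λ for the angle-dependent Hocking parameter, ω for the AC field frequency), not a formula quoted from the paper.

```latex
% Modified Hocking condition as described in the abstract (symbols assumed):
% contact-line velocity proportional to the contact-angle deviation, with a
% fast-relaxation term oscillating at twice the AC field frequency.
\[
  \frac{\partial \zeta}{\partial t}
  \;=\; \Lambda(\alpha)\,\frac{\partial \zeta}{\partial n}
  \;+\; a(\alpha)\cos(2\omega t),
  \qquad \Lambda = \Lambda(\alpha)\ \text{(polar-angle dependent)}.
\]
```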
Mazurenko, Olena; Hearld, Larry R; Menachemi, Nir
Physician e-mail communication, with patients and other providers, is one of the cornerstones of effective care coordination but varies significantly across physicians. A physician's external environment may contribute to such variations by enabling or constraining the physician's ability to adopt innovations such as health information technology (HIT) that can be used to support e-mail communication. The aim of the study was to examine whether the relationship between the external environment and physician e-mail communication with patients and other providers is mediated by the practice's HIT availability. The data were obtained from the Health Tracking Physician Survey (2008) and the Area Resource File (2008). Cross-sectional multivariable subgroup path analysis was used to investigate the mediating role of HIT availability across 2,850 U.S. physicians. Solo physicians' perceptions about malpractice were associated with lower odds of e-mail communication with patients and other providers (OR = 0.97, p < .05), as compared to group and hospital practices, even when mediated by HIT availability. Subgroup analyses indicated that different types of practices are responsive to different dimensions of the external environment. Specifically, solo practitioners were more responsive to the availability of resources in their environment, with per capita income associated with a lower likelihood of physician e-mail communication (OR = 0.99, p < .01). In contrast, physicians working in group practices were more responsive to the complexity of their environment, with a physician's perception of practicing in environments with higher malpractice risks associated with greater information technology availability, which in turn was associated with a greater likelihood of communicating via e-mail with patients (OR = 1.02, p < .05) and other physicians (OR = 1.03, p < .001). The association between physician e-mail communication and the external environment is mediated by the practice's HIT availability. Efforts to improve physician e-mail communication and HIT adoption may need to reflect the varied perceptions of different types of practices.
3D for Geosciences: Interactive Tangibles and Virtual Models
NASA Astrophysics Data System (ADS)
Pippin, J. E.; Matheney, M.; Kitsch, N.; Rosado, G.; Thompson, Z.; Pierce, S. A.
2016-12-01
Point cloud processing provides a method of studying and modelling geologic features relevant to geoscience systems and processes. Here, software including Skanect, MeshLab, Blender, PDAL, and PCL are used in conjunction with 3D scanning hardware, including a Structure scanner and a Kinect camera, to create and analyze point cloud images of small scale topography, karst features, tunnels, and structures at high resolution. This project successfully scanned internal karst features ranging from small stalactites to large rooms, as well as an external waterfall feature. For comparison purposes, multiple scans of the same object were merged into single object files both automatically, using commercial software, and manually using open source libraries and code. Files in .ply format were manually converted into numeric data sets and analyzed for regions common to multiple files in order to match them together. We can assume a numeric process would be more powerful and efficient than the manual method; however, it could lack useful features that GUIs provide. The digital models have applications in mining as an efficient replacement for topography tasks such as measuring distances and areas. Additionally, it is possible to make simulation models such as drilling templates and calculations related to 3D spaces. Advantages of the methods described here include the relatively quick time to obtain data and the easy transport of the equipment. With regard to open-pit mining, georeferencing precise 3D images of large surfaces to interactive maps would provide a high-value tool. The digital 3D images obtained from scans may be saved as printable files to create tangible 3D-printed models based on scientific information, as well as digital "worlds" that can be navigated virtually. The data, models, and algorithms explored here can be used to convey complex scientific ideas to a range of professionals and audiences.
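The .ply-to-numeric conversion mentioned above is straightforward to automate. A minimal sketch for ASCII PLY files follows, assuming x, y, z are the first three vertex properties (header handling is simplified).

```python
# Sketch of turning an ASCII .ply point cloud into a numeric array.
import numpy as np

def read_ascii_ply(path):
    with open(path) as f:
        n_vertices = 0
        for line in f:                                # scan the header
            if line.startswith("element vertex"):
                n_vertices = int(line.split()[-1])
            if line.strip() == "end_header":
                break
        pts = [list(map(float, f.readline().split()[:3]))
               for _ in range(n_vertices)]
    return np.array(pts)                              # shape (n_vertices, 3)
```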
NASA Astrophysics Data System (ADS)
Lee, Choonik; Jung, Jae Won; Pelletier, Christopher; Pyakuryal, Anil; Lamart, Stephanie; Kim, Jong Oh; Lee, Choonsik
2015-03-01
Organ dose estimation for retrospective epidemiological studies of late effects in radiotherapy patients involves two challenges: radiological images to represent patient anatomy are not usually available for patient cohorts who were treated years ago, and efficient dose reconstruction methods for large-scale patient cohorts are not well established. In the current study, we developed methods to reconstruct organ doses for radiotherapy patients by using a series of computational human phantoms coupled with a commercial treatment planning system (TPS) and a radiotherapy-dedicated Monte Carlo transport code, and performed illustrative dose calculations. First, we developed methods to convert the anatomy and organ contours of the pediatric and adult hybrid computational phantom series to Digital Imaging and Communications in Medicine (DICOM)-image and DICOM-structure files, respectively. The resulting DICOM files were imported to a commercial TPS for simulating radiotherapy and dose calculation for in-field organs. The conversion process was validated by comparing electron densities relative to water and organ volumes between the hybrid phantoms and the DICOM files imported in TPS, which showed agreements within 0.1 and 2%, respectively. Second, we developed a procedure to transfer DICOM-RT files generated from the TPS directly to a Monte Carlo transport code, x-ray Voxel Monte Carlo (XVMC) for more accurate dose calculations. Third, to illustrate the performance of the established methods, we simulated a whole brain treatment for the 10 year-old male phantom and a prostate treatment for the adult male phantom. Radiation doses to selected organs were calculated using the TPS and XVMC, and compared to each other. Organ average doses from the two methods matched within 7%, whereas maximum and minimum point doses differed up to 45%. The dosimetry methods and procedures established in this study will be useful for the reconstruction of organ dose to support retrospective epidemiological studies of late effects in radiotherapy patients.
Monte Carlo simulation of electrothermal atomization on a desktop personal computer
NASA Astrophysics Data System (ADS)
Histen, Timothy E.; Güell, Oscar A.; Chavez, Iris A.; Holcombe, James A.
1996-07-01
Monte Carlo simulations have been applied to electrothermal atomization (ETA) using a tubular atomizer (e.g. graphite furnace) because of the complexity of the geometry, heating, molecular interactions, etc. The intense computational time needed to accurately model ETA often limited its effective implementation to supercomputers. However, with the advent of more powerful desktop processors, this is no longer the case. A C-based program has been developed that can be used under Windows™ or DOS. With this program, basic parameters such as furnace dimensions, sample placement and furnace heating, and kinetic parameters such as activation energies for desorption and adsorption, can be varied to show the dependence of the absorbance profile on these parameters. Even data such as the time-dependent spatial distribution of analyte inside the furnace can be collected. The DOS version also permits input of external temperature-time data to permit comparison of simulated profiles with experimentally obtained absorbance data. The run-time versions are provided along with the source code. This article is an electronic publication in Spectrochimica Acta Electronica (SAE), the electronic section of Spectrochimica Acta Part B (SAB). The hardcopy text is accompanied by a diskette with a program (PC format), data files and text files.
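A toy sketch of the simulated mechanism (Arrhenius-activated desorption from the furnace wall followed by a gas-phase random walk, with loss out the tube ends) conveys the flavor of such a model. All parameter values below are illustrative assumptions, not those of the published program.

```python
# Toy Monte Carlo of desorption and gas-phase transport in a tubular atomizer.
import math, random

E_DES, R = 2.0e5, 8.314     # assumed activation energy (J/mol), gas constant

def simulate(n_atoms=5000, ramp_K_per_s=1500.0, dt=1e-3, length=1.0):
    signal = []
    atoms = [None] * n_atoms                      # None = still on the wall
    for step in range(2000):
        T = 300.0 + ramp_K_per_s * step * dt      # linear furnace heating
        p_des = 1 - math.exp(-1e9 * math.exp(-E_DES / (R * T)) * dt)
        for i, x in enumerate(atoms):
            if x is None and random.random() < p_des:
                atoms[i] = length / 2             # desorb at the tube centre
            elif x is not None and 0 <= x <= length:
                atoms[i] = x + random.gauss(0, 0.01)  # gas-phase random walk
        # in-tube gas-phase population plays the role of an absorbance profile
        signal.append(sum(1 for x in atoms if x is not None and 0 <= x <= length))
    return signal
```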
Template-based combinatorial enumeration of virtual compound libraries for lipids.
Sud, Manish; Fahy, Eoin; Subramaniam, Shankar
2012-09-25
A variety of software packages are available for the combinatorial enumeration of virtual libraries for small molecules, starting from specifications of core scaffolds with attachments points and lists of R-groups as SMILES or SD files. Although SD files include atomic coordinates for core scaffolds and R-groups, it is not possible to control 2-dimensional (2D) layout of the enumerated structures generated for virtual compound libraries because different packages generate different 2D representations for the same structure. We have developed a software package called LipidMapsTools for the template-based combinatorial enumeration of virtual compound libraries for lipids. Virtual libraries are enumerated for the specified lipid abbreviations using matching lists of pre-defined templates and chain abbreviations, instead of core scaffolds and lists of R-groups provided by the user. 2D structures of the enumerated lipids are drawn in a specific and consistent fashion adhering to the framework for representing lipid structures proposed by the LIPID MAPS consortium. LipidMapsTools is lightweight, relatively fast and contains no external dependencies. It is an open source package and freely available under the terms of the modified BSD license.
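The enumeration strategy (lipid abbreviations resolved against pre-defined templates and chain lookups, rather than user-supplied scaffolds and R-groups) can be illustrated schematically. The templates and chain strings below are simplified placeholders, not LIPID MAPS structures or LipidMapsTools code.

```python
# Sketch of template-based enumeration from lipid abbreviations.
import itertools

TEMPLATES = {"PC": "head(PC)-sn1({sn1})-sn2({sn2})"}   # placeholder template
CHAINS = {"16:0": "CCCCCCCCCCCCCCCC(=O)",              # placeholder chains
          "18:1": "CCCCCCCC/C=C\\CCCCCCCC(=O)"}

def enumerate_lipids(cls, sn1_list, sn2_list):
    for sn1, sn2 in itertools.product(sn1_list, sn2_list):
        yield (f"{cls}({sn1}/{sn2})",
               TEMPLATES[cls].format(sn1=CHAINS[sn1], sn2=CHAINS[sn2]))

for abbrev, structure in enumerate_lipids("PC", ["16:0"], ["16:0", "18:1"]):
    print(abbrev, "->", structure)
```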
Moretti, Rocco; Lyskov, Sergey; Das, Rhiju; Meiler, Jens; Gray, Jeffrey J
2018-01-01
The Rosetta molecular modeling software package provides a large number of experimentally validated tools for modeling and designing proteins, nucleic acids, and other biopolymers, with new protocols being added continually. While freely available to academic users, external usage is limited by the need for expertise in the Unix command line environment. To make Rosetta protocols available to a wider audience, we previously created a web server called Rosetta Online Server that Includes Everyone (ROSIE), which provides a common environment for hosting web-accessible Rosetta protocols. Here we describe a simplification of the ROSIE protocol specification format, one that permits easier implementation of Rosetta protocols. Whereas the previous format required creating multiple separate files in different locations, the new format allows specification of the protocol in a single file. This new, simplified protocol specification has more than doubled the number of Rosetta protocols available under ROSIE. These new applications include pKa determination, lipid accessibility calculation, ribonucleic acid redesign, protein-protein docking, protein-small molecule docking, symmetric docking, antibody docking, cyclic toxin docking, critical binding peptide determination, and mapping small molecule binding sites. ROSIE is freely available to academic users at http://rosie.rosettacommons.org. © 2017 The Protein Society.
Dedicated computer system AOTK for image processing and analysis of horse navicular bone
NASA Astrophysics Data System (ADS)
Zaborowicz, M.; Fojud, A.; Koszela, K.; Mueller, W.; Górna, K.; Okoń, P.; Piekarska-Boniecka, H.
2017-07-01
The aim of the research was to develop the dedicated application AOTK (pol. Analiza Obrazu Trzeszczki Kopytowej) for image processing and analysis of the horse navicular bone. The application was produced using specialized software, namely Visual Studio 2013 and the .NET platform. The algorithms of image processing and analysis were implemented using the AForge.NET libraries. The implemented algorithms enable accurate extraction of the characteristics of navicular bones and saving the data to external files. Modules implemented in AOTK allow calculation of user-selected distances and a preliminary assessment of the structural preservation of the examined objects. The application interface is designed in a way that ensures the user the best possible view of the analyzed images.
Dendrimer-magnetic nanostructure: a Monte Carlo simulation
NASA Astrophysics Data System (ADS)
Jabar, A.; Masrour, R.
2017-11-01
In this paper, the magnetic properties of the ternary mixed-spin (σ, S, q) Ising model on a dendrimer nanostructure are studied using Monte Carlo simulations. The ground state phase diagrams of the dendrimer nanostructure with the ternary mixed-spin σ = 1/2, S = 1 and q = 3/2 Ising model are found. The variation of the thermal total and partial magnetizations with the different exchange interactions, the external magnetic fields and the crystal fields has also been studied. The reduced critical temperatures have been deduced. The magnetic hysteresis cycles have been discussed, and the corresponding magnetic coercive field values have been deduced. Multiple hysteresis cycles are found. The dendrimer nanostructure has several applications in medicine.
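The simulation method is standard Metropolis sampling over mixed spin sets. A compact sketch on an arbitrary neighbour graph (a stand-in for the dendrimer topology) is given below; the Hamiltonian terms and parameter values are illustrative assumptions, not the paper's.

```python
# Sketch of Metropolis updates for a mixed-spin (sigma, S, q) Ising system.
import math, random

SPIN_SETS = {"sigma": [-0.5, 0.5], "S": [-1.0, 0.0, 1.0],
             "q": [-1.5, -0.5, 0.5, 1.5]}

def metropolis_step(spins, kinds, neighbors, J=1.0, h=0.0, D=0.0, T=1.0):
    """One attempted update of H = -J*sum_<ij> s_i s_j - h*sum s_i - D*sum s_i^2."""
    i = random.randrange(len(spins))
    new = random.choice(SPIN_SETS[kinds[i]])
    local = sum(spins[j] for j in neighbors[i])
    dE = -(J * local + h) * (new - spins[i]) - D * (new**2 - spins[i]**2)
    if dE <= 0 or random.random() < math.exp(-dE / T):
        spins[i] = new

# Tiny example: a central sigma site bonded to three q sites (one "node").
kinds = ["sigma", "q", "q", "q"]
neighbors = [[1, 2, 3], [0], [0], [0]]
spins = [random.choice(SPIN_SETS[k]) for k in kinds]
for _ in range(10000):
    metropolis_step(spins, kinds, neighbors, T=0.5)
print(spins)
```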
Burn, K W; Daffara, C; Gualdrini, G; Pierantoni, M; Ferrari, P
2007-01-01
The question of Monte Carlo simulation of radiation transport in voxel geometries is addressed. Patched versions of the MCNP and MCNPX codes are developed aimed at transporting radiation both in the standard geometry mode and in the voxel geometry treatment. The patched code reads an unformatted FORTRAN file derived from DICOM format data and uses special subroutines to handle voxel-to-voxel radiation transport. The various phases of the development of the methodology are discussed together with the new input options. Examples are given of employment of the code in internal and external dosimetry and comparisons with results from other groups are reported.
Linking multiple biodiversity informatics platforms with Darwin Core Archives
2014-01-01
Abstract We describe an implementation of the Darwin Core Archive (DwC-A) standard that allows for the exchange of biodiversity information contained within the Scratchpads virtual research environment with external collaborators. Using this single archive file Scratchpad users can expose taxonomies, specimen records, species descriptions and a range of other data to a variety of third-party aggregators and tools (currently Encyclopedia of Life, eMonocot Portal, CartoDB, and the Common Data Model) for secondary use. This paper describes our technical approach to dynamically building and validating Darwin Core Archives for the 600+ Scratchpad user communities, which can be used to serve the diverse data needs of all of our content partners. PMID:24723785
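For orientation, a Darwin Core Archive is essentially a zip file holding one or more data tables plus a meta.xml descriptor that maps columns to Darwin Core terms. A minimal, simplified sketch in Python (real archives, such as those produced by Scratchpads, normally also carry an EML metadata file):

    # Minimal Darwin Core Archive: a zip with a data table and a meta.xml
    # descriptor mapping columns to Darwin Core terms (EML file omitted).
    import zipfile

    META = """<archive xmlns="http://rs.tdwg.org/dwc/text/">
      <core rowType="http://rs.tdwg.org/dwc/terms/Occurrence"
            fieldsTerminatedBy="\\t" linesTerminatedBy="\\n" ignoreHeaderLines="1">
        <files><location>occurrence.txt</location></files>
        <id index="0"/>
        <field index="1" term="http://rs.tdwg.org/dwc/terms/scientificName"/>
      </core>
    </archive>"""

    table = "id\tscientificName\n1\tQuercus robur\n"
    with zipfile.ZipFile("dwca.zip", "w") as z:
        z.writestr("meta.xml", META)
        z.writestr("occurrence.txt", table)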
An analytic approach to sunset diagrams in chiral perturbation theory: Theory and practice
NASA Astrophysics Data System (ADS)
Ananthanarayan, B.; Bijnens, Johan; Ghosh, Shayan; Hebbar, Aditya
2016-12-01
We demonstrate the use of several code implementations of the Mellin-Barnes method available in the public domain to derive analytic expressions for the sunset diagrams that arise in the two-loop contribution to the pion mass and decay constant in three-flavoured chiral perturbation theory. We also provide results for all possible two mass configurations of the sunset integral, and derive a new one-dimensional integral representation for the one mass sunset integral with arbitrary external momentum. Thoroughly annotated Mathematica notebooks are provided as ancillary files in the Electronic Supplementary Material to this paper, which may serve as pedagogical supplements to the methods described in this paper.
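For reference, the two-loop sunset integral discussed here is commonly written as follows (normalization conventions vary between papers; this is one common Minkowski-space form):

    \[
    H\left(m_1^2, m_2^2, m_3^2; p^2\right) =
    \frac{1}{i^2}\int \frac{d^d k}{(2\pi)^d}\,\frac{d^d l}{(2\pi)^d}\,
    \frac{1}{\left(k^2 - m_1^2\right)\left(l^2 - m_2^2\right)\left((k + l - p)^2 - m_3^2\right)}
    \]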
NASA Technical Reports Server (NTRS)
2005-01-01
KENNEDY SPACE CENTER, FLA. In the Vehicle Assembly Building at NASA's Kennedy Space Center, a digital still camera has been mounted in the External Tank (ET) umbilical well on the aft end of Space Shuttle Discovery. The camera is being used to obtain and downlink high-resolution images of the disconnect point on the ET following ET separation from the orbiter after launch. The Kodak camera will record 24 images, at one frame per 1.5 seconds, on a flash memory card. After orbital insertion, the crew will transfer the images from the memory card to a laptop computer. The files will then be downloaded through the Ku-band system to the Mission Control Center in Houston for analysis.
NASA Technical Reports Server (NTRS)
2005-01-01
KENNEDY SPACE CENTER, FLA. In the Vehicle Assembly Building at NASA's Kennedy Space Center, workers check the digital still camera they will mount in the External Tank (ET) umbilical well on the aft end of Space Shuttle Discovery. The camera is being used to obtain and downlink high-resolution images of the disconnect point on the ET following the tank's separation from the orbiter after launch. The Kodak camera will record 24 images, at one frame per 1.5 seconds, on a flash memory card. After orbital insertion, the crew will transfer the images from the memory card to a laptop computer. The files will then be downloaded through the Ku-band system to the Mission Control Center in Houston for analysis.
NASA Technical Reports Server (NTRS)
2005-01-01
KENNEDY SPACE CENTER, FLA. In the Vehicle Assembly Building at NASA's Kennedy Space Center, a worker mounts a digital still camera in the External Tank (ET) umbilical well on the aft end of Space Shuttle Discovery. The camera is being used to obtain and downlink high-resolution images of the disconnect point on the ET following the ET separation from the orbiter after launch. The Kodak camera will record 24 images, at one frame per 1.5 seconds, on a flash memory card. After orbital insertion, the crew will transfer the images from the memory card to a laptop computer. The files will then be downloaded through the Ku-band system to the Mission Control Center in Houston for analysis.
NASA Technical Reports Server (NTRS)
2005-01-01
KENNEDY SPACE CENTER, FLA. In the Vehicle Assembly Building at NASA's Kennedy Space Center, workers prepare a digital still camera they will mount in the External Tank (ET) umbilical well on the aft end of Space Shuttle Discovery. The camera is being used to obtain and downlink high-resolution images of the disconnect point on the ET following its separation from the orbiter after launch. The Kodak camera will record 24 images, at one frame per 1.5 seconds, on a flash memory card. After orbital insertion, the crew will transfer the images from the memory card to a laptop computer. The files will then be downloaded through the Ku-band system to the Mission Control Center in Houston for analysis.
IVS contribution to the next ITRF
NASA Astrophysics Data System (ADS)
Bachmann, Sabine; Messerschmitt, Linda; Thaller, Daniela
2015-04-01
Generating the contribution of the International VLBI Service (IVS) to the next ITRF (ITRF2013 or ITRF2014) was the main task of the IVS Combination Center at the Federal Agency for Cartography and Geodesy (BKG, Germany) in 2014. Starting with ITRF2005, the IVS contribution to the ITRF has been an intra-technique combined solution using multiple individual contributions from different institutions. For the upcoming ITRF, ten international institutions submitted data files for a combined solution. The data files contain 24-hour VLBI sessions from the late 1970s until the end of 2014 in SINEX file format, containing datum-free normal equations with station coordinates and Earth Orientation Parameters (EOP). All contributions have to meet the IVS standards for ITRF contribution in order to guarantee a consistent combined solution. In the course of the generation of the intra-technique combined solution, station coordinate time series for each station as well as a Terrestrial Reference Frame based on the contributed VLBI data (VTRF) were generated and analyzed. Preliminary results using data until the end of 2013 show a scale factor of -0.47 ppb resulting from a 7-parameter Helmert transformation of the VTRF w.r.t. ITRF2008, which is comparable to the scale factor determined in the previous ITRF generation. An internal comparison of the EOPs between the combined solution and the individual contributions, as well as external comparisons of the EOP series, were carried out in order to assure a consistent quality of the EOPs. The data analyses, the combination procedure and the results of the combined solution for station coordinates and EOP will be presented.
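The 7-parameter Helmert transformation mentioned here relates two reference frames through three translations, three small rotations, and a scale factor. A minimal sketch in Python under the usual small-angle approximation (illustrative only, not the IVS combination software; the station coordinates are fictitious):

    # Small-angle 7-parameter Helmert transformation (illustrative sketch).
    import numpy as np

    def helmert(x, t, scale_ppb, rx, ry, rz):
        """Map coordinates x (N x 3, metres) into the target frame.
        t: translations (3,) in metres; scale_ppb: scale offset in parts per
        billion; rx, ry, rz: small rotation angles in radians."""
        s = 1.0 + scale_ppb * 1e-9
        R = np.array([[1.0, -rz,  ry],
                      [ rz, 1.0, -rx],
                      [-ry,  rx, 1.0]])
        return t + s * (x @ R.T)

    # e.g. applying only the -0.47 ppb scale found between the VTRF and ITRF2008
    x = np.array([[4075539.6, 931735.3, 4801629.4]])    # fictitious station (m)
    print(helmert(x, t=np.zeros(3), scale_ppb=-0.47, rx=0.0, ry=0.0, rz=0.0))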
CIF2Cell: Generating geometries for electronic structure programs
NASA Astrophysics Data System (ADS)
Björkman, Torbjörn
2011-05-01
The CIF2Cell program generates the geometrical setup for a number of electronic structure programs based on the crystallographic information in a Crystallographic Information Framework (CIF) file. The program will retrieve the space group number, Wyckoff positions and crystallographic parameters, make a sensible choice for Bravais lattice vectors (primitive or principal cell) and generate all atomic positions. Supercells can be generated and alloys are handled gracefully. The code currently has output interfaces to the electronic structure programs ABINIT, CASTEP, CPMD, Crystal, Elk, Exciting, EMTO, Fleur, RSPt, Siesta and VASP.
Program summary
Program title: CIF2Cell
Catalogue identifier: AEIM_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIM_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU GPL version 3
No. of lines in distributed program, including test data, etc.: 12 691
No. of bytes in distributed program, including test data, etc.: 74 933
Distribution format: tar.gz
Programming language: Python (versions 2.4-2.7)
Computer: Any computer that can run Python (versions 2.4-2.7)
Operating system: Any operating system that can run Python (versions 2.4-2.7)
Classification: 7.3, 7.8, 8
External routines: PyCIFRW [1]
Nature of problem: Generate the geometrical setup of a crystallographic cell for a variety of electronic structure programs from data contained in a CIF file.
Solution method: The CIF file is parsed using routines contained in the library PyCIFRW [1], and crystallographic as well as bibliographic information is extracted. The program then generates the principal cell from symmetry information, crystal parameters, space group number and Wyckoff sites. Reduction to a primitive cell is then performed, and the resulting cell is output to suitably named files along with documentation of the information source generated from any bibliographic information contained in the CIF file. If the space group symmetries are not present in the CIF file, the program falls back on internal tables, so only the minimal input of space group, crystal parameters and Wyckoff positions is required. Additional key features are handling of alloys and supercell generation.
Additional comments: Currently implements support for the following general purpose electronic structure programs: ABINIT [2,3], CASTEP [4], CPMD [5], Crystal [6], Elk [7], exciting [8], EMTO [9], Fleur [10], RSPt [11], Siesta [12] and VASP [13-16].
Running time: The examples provided in the distribution take only seconds to run.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lv, Mei; Liu, Zengrong; He, Bing
In previous studies, we reported molecular dynamics (MD) simulations showing that single-file water wires confined inside Y-shaped single-walled carbon nanotubes (Y-SWNTs) held strong and robust capability to convert and multiply charge signals [Y. S. Tu, P. Xiu, R. Z. Wan, J. Hu, R. H. Zhou, and H. P. Fang, Proc. Natl. Acad. Sci. U.S.A. 106, 18120 (2009); Y. Tu, H. Lu, Y. Zhang, T. Huynh, and R. Zhou, J. Chem. Phys. 138, 015104 (2013)]. It is fascinating to see whether the signal multiplication can be realized by other kinds of polar molecules with larger dipole moments (which make the experimental realization easier). In this article, we use MD simulations to study the urea-mediated signal conversion and multiplication with Y-SWNTs. We observe that when a Y-SWNT with an external charge of magnitude 1.0 e (the model of a signal at the single-electron level) is solvated in 1 M urea solutions, urea can induce drying of the Y-SWNT and fill its interiors in single-file, forming Y-shaped urea wires. The external charge can effectively control the dipole orientation of the urea wire inside the main channel (i.e., the signal can be readily converted), and this signal can further be multiplied into 2 (or more) output signals by modulating dipole orientations of urea wires in bifurcated branch channels of the Y-SWNT. This remarkable signal transduction capability arises from the strong dipole-induced ordering of urea wires under extreme confinement. We also discuss the advantage of urea as compared with water in the signal multiplication, as well as the robustness and biological implications of our findings. This study provides the possibility for multiplying signals by using urea molecules (or other polar organic molecules) with Y-shaped nanochannels and might also help understand the mechanism behind signal conduction in both physical and biological systems.
Orbiter/External Tank Mate 3-D Solid Modeling
NASA Technical Reports Server (NTRS)
Godfrey, G. S.; Brandt, B.; Rorden, D.; Kapr, F.
2004-01-01
This research and development project presents an overview of the work completed while attending a summer 2004 American Society for Engineering Education/National Aeronautics and Space Administration (ASEE/NASA) Faculty Fellowship. This fellowship was completed at the Kennedy Space Center, Florida. The scope of the project was to complete parts, assemblies, and drawings that could be used by Ground Support Equipment (GSE) personnel to simulate situations and scenarios commonplace to the space shuttle Orbiter/External Tank (ET) Mate (50004). This mate takes place in the Vehicle Assembly Building (VAB). These simulations could then be used by NASA engineers as decision-making tools. During the summer of 2004, parts were created that defined the Orbiter/ET structural interfaces. Emphasis was placed upon assemblies that included the Orbiter/ET forward attachment (EO-1), aft left thrust strut (EO-2), aft right tripod support structure (EO-3), and crossbeam and aft feedline/umbilical supports. These assemblies are used to attach the Orbiter to the ET. The Orbiter/ET Mate assembly was then used to compare and analyze clearance distances using different Orbiter hang angles. It was found that a 30-minute arc change in Orbiter hang angle changed the clearance between the bipod strut and the Orbiter yoke fitting by 8.11 inches. A 3-D solid model library was established as a result of this project, containing parts, assemblies, and drawings translated into several formats: stl for stereolithography, stp for neutral file work, shrinkwrap for compression, tiff for Photoshop work, jpeg for Internet use, and prt and asm for Pro/Engineer use. This library was made available to NASA engineers so that they could access its contents to make angle, load, and clearance analysis studies. These decision-making tools may be used by Pro/Engineer users and non-users.
NASA Astrophysics Data System (ADS)
Alexander, A.; DeBlois, F.; Stroian, G.; Al-Yahya, K.; Heath, E.; Seuntjens, J.
2007-07-01
Radiotherapy research lacks a flexible computational research environment for Monte Carlo (MC) and patient-specific treatment planning. The purpose of this study was to develop a flexible software package on low-cost hardware with the aim of integrating new patient-specific treatment planning with MC dose calculations suitable for large-scale prospective and retrospective treatment planning studies. We designed the software package 'McGill Monte Carlo treatment planning' (MMCTP) for the research development of MC and patient-specific treatment planning. The MMCTP design consists of a graphical user interface (GUI), which runs on a simple workstation connected through standard secure-shell protocol to a cluster for lengthy MC calculations. Treatment planning information (e.g., images, structures, beam geometry properties and dose distributions) is converted into a convenient MMCTP local file storage format designated the McGill RT format. MMCTP features include (a) DICOM_RT, RTOG and CADPlan CART format imports; (b) 2D and 3D visualization views for images, structure contours, and dose distributions; (c) contouring tools; (d) DVH analysis and dose matrix comparison tools; (e) external beam editing; (f) MC transport calculation from beam source to patient geometry for photon and electron beams. The MC input files, which are prepared from the beam geometry properties and patient information (e.g., images and structure contours), are uploaded and run on a cluster using shell commands controlled from the MMCTP GUI. The visualization, dose matrix operation and DVH tools offer extensive options for plan analysis and comparison between MC plans and plans imported from commercial treatment planning systems. The MMCTP GUI provides a flexible research platform for the development of patient-specific MC treatment planning for photon and electron external beam radiation therapy. The impact of this tool lies in the fact that it allows for systematic, platform-independent, large-scale MC treatment planning for different treatment sites. Patient recalculations were performed to validate the software and ensure proper functionality.
LCG MCDB—a knowledgebase of Monte-Carlo simulated events
NASA Astrophysics Data System (ADS)
Belov, S.; Dudko, L.; Galkin, E.; Gusev, A.; Pokorski, W.; Sherstnev, A.
2008-02-01
In this paper we report on the LCG Monte-Carlo Data Base (MCDB) and the software which has been developed to operate it. The main purpose of the LCG MCDB project is to provide a storage and documentation system for sophisticated event samples simulated for the LHC collaborations by experts. In many cases, modern Monte-Carlo simulation of physical processes requires expert knowledge of Monte-Carlo generators or a significant amount of CPU time to produce the events. MCDB is a knowledgebase mainly dedicated to accumulating simulated events of this type. The main motivation behind LCG MCDB is to make sophisticated MC event samples available to various physics groups. All the data in MCDB are accessible in several convenient ways. LCG MCDB is being developed within the CERN LCG Application Area Simulation project.
Program summary
Program title: LCG Monte-Carlo Data Base
Catalogue identifier: ADZX_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZX_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU General Public Licence
No. of lines in distributed program, including test data, etc.: 30 129
No. of bytes in distributed program, including test data, etc.: 216 943
Distribution format: tar.gz
Programming language: Perl
Computer: CPU: Intel Pentium 4, RAM: 1 Gb, HDD: 100 Gb
Operating system: Scientific Linux CERN 3/4
RAM: 1 073 741 824 bytes (1 Gb)
Classification: 9
External routines: perl >= 5.8.5; Perl modules DBD-mysql >= 2.9004, File::Basename, GD::SecurityImage, GD::SecurityImage::AC, Linux::Statistics, XML::LibXML > 1.6, XML::SAX, XML::NamespaceSupport; Apache HTTP Server >= 2.0.59; mod_auth_external >= 2.2.9; edg-utils-system RPM package; gd >= 2.0.28; rpm package CASTOR-client >= 2.1.2-4; arc-server (optional)
Nature of problem: Often, different groups of experimentalists prepare similar samples of particle collision events, or turn to the same group of authors of Monte-Carlo (MC) generators to prepare the events. For example, the same MC samples of Standard Model (SM) processes can be employed for investigations either in SM analyses (as a signal) or in searches for new phenomena in Beyond the Standard Model analyses (as a background). If the samples are made publicly available and equipped with comprehensive documentation, cross checks of both the samples themselves and the physical models applied can be sped up. Some event samples require substantial computing resources to prepare, so central storage of the samples avoids wasting researcher time and computing resources on preparing the same events many times.
Solution method: Creation of a special knowledgebase (MCDB) designed to keep event samples for the LHC experimental and phenomenological community. The knowledgebase is realized as a separate web server (http://mcdb.cern.ch). All event samples are kept on tapes at CERN. Documentation describing the events is the main content of MCDB. Users can browse the knowledgebase, read and comment on articles (documentation), and download event samples. Authors can upload new event samples, create new articles, and edit their own articles.
Restrictions: The software is adapted to solve the problems described in the article; there are no additional restrictions.
Unusual features: The software provides a framework to store and document large files with a flexible authentication and authorization system. Different external storage systems with large capacity can be used to keep the files. The web content management system provides all of the necessary interfaces for the authors of the files, end users and administrators.
Running time: Real-time operation.
References: [1] The main LCG MCDB server, http://mcdb.cern.ch/. [2] P. Bartalini, L. Dudko, A. Kryukov, I.V. Selyuzhenkov, A. Sherstnev, A. Vologdin, LCG Monte-Carlo data base, hep-ph/0404241. [3] J.P. Baud, B. Couturier, C. Curran, J.D. Durand, E. Knezo, S. Occhetti, O. Barring, CASTOR: status and evolution, cs.oh/0305047.
VirGO: A Visual Browser for the ESO Science Archive Facility
NASA Astrophysics Data System (ADS)
Chéreau, F.
2008-08-01
VirGO is the next generation Visual Browser for the ESO Science Archive Facility developed by the Virtual Observatory (VO) Systems Department. It is a plug-in for the popular open source software Stellarium, adding capabilities for browsing professional astronomical data. VirGO gives astronomers the possibility to easily discover and select data from millions of observations in a new visual and intuitive way. Its main feature is to perform real-time access and graphical display of a large number of observations by showing instrumental footprints and image previews, and to allow their selection and filtering for subsequent download from the ESO SAF web interface. It also allows the loading of external FITS files or VOTables, the superimposition of Digitized Sky Survey (DSS) background images, and the visualization of the sky in a 'real life' mode as seen from the main ESO sites. All data interfaces are based on Virtual Observatory standards, which allow access to images and spectra from external data centers, and interaction with the ESO SAF web interface or any other VO applications supporting the PLASTIC messaging system. The main website for VirGO is at http://archive.eso.org/cms/virgo.
Computational studies of steering nanoparticles with magnetic gradients
NASA Astrophysics Data System (ADS)
Aylak, Sultan Suleyman
Magnetic Resonance Imaging (MRI) guided nanorobotic systems that could perform diagnostic, curative, and reconstructive treatments in the human body at the cellular and subcellular level in a controllable manner have recently been proposed. The concept of an MRI-guided nanorobotic system is based on the use of an MRI scanner to induce the required external driving forces to guide magnetic nanocapsules to a specific target. However, the maximum magnetic gradient specifications of existing clinical MRI systems are not capable of driving magnetic nanocapsules against the blood flow. This thesis presents the visualization of nanoparticles inside a blood vessel, a Graphical User Interface (GUI) for updating a file of initial parameters and demonstrating the simulation of the particles, and C++ code for computing magnetic and fluidic forces. The visualization and GUI were designed using the Virtual Reality Modeling Language (VRML), MATLAB and C#. The addition of software for the MRI-guided nanorobotic system provides simulation results. Preliminary simulation results demonstrate that an external magnetic field causes aggregation of nanoparticles while they flow in the vessel. This is a promising result, in accordance with similar experimental results, and encourages further investigation of nanoparticle-based self-assembly structures for use in nanorobotic drug delivery.
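The underlying force balance can be sketched in a few lines: a magnetically saturated particle feels a gradient force of roughly F = V·Ms·∇B, opposed by Stokes drag 6πηrv. An order-of-magnitude comparison in Python (all parameter values are illustrative assumptions, not those of the thesis):

    # Order-of-magnitude comparison: magnetic gradient force vs Stokes drag
    # for a magnetite nanoparticle in blood (all values are illustrative).
    import math

    r = 250e-9                       # particle radius (m)
    V = 4.0 / 3.0 * math.pi * r**3   # particle volume (m^3)
    Ms = 4.8e5                       # saturation magnetization of magnetite (A/m)
    grad_B = 0.04                    # clinical MRI gradient, ~40 mT/m (T/m)
    eta = 3.5e-3                     # blood viscosity (Pa s)
    v = 0.1                          # relative flow speed (m/s)

    F_mag = V * Ms * grad_B                # gradient force on a saturated particle
    F_drag = 6.0 * math.pi * eta * r * v   # Stokes drag
    print(f"F_mag  = {F_mag:.2e} N")
    print(f"F_drag = {F_drag:.2e} N")      # drag dominates by orders of magnitude

With these numbers the drag exceeds the magnetic force by roughly six orders of magnitude, which is consistent with the thesis's observation that clinical gradients cannot drive nanocapsules against the blood flow.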
Thermo Scientific Sulfur Dioxide Analyzer Instrument Handbook
DOE Office of Scientific and Technical Information (OSTI.GOV)
Springston, S. R.
The Sulfur Dioxide Analyzer measures sulfur dioxide based on absorbance of UV light at one wavelength by SO2 molecules, which then decay to a lower energy state by emitting UV light at a longer wavelength. Specifically, SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort Software. However, this software is somewhat cumbersome and inflexible. BNL has written an interface program in National Instruments LabView that both controls the Model 43i-TLE Analyzer and queries the unit for all measurement and housekeeping data. The LabView vi (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS and described more fully in Section 6.0 below.
Abad-Gallegos, M; Arnabat-Domínguez, J; España-Tost, A; Berini-Aytés, L; Gay-Escoda, C
2009-12-01
A study was made to determine the temperature increment at the dental root surface following Er,Cr:YSGG laser irradiation of the root canal. Human canines and incisors previously instrumented to K-file size ISO 30 were used. Irradiation was carried out with glass fiber endodontic tips measuring 200 µm in diameter, especially designed for insertion in the root canal. The teeth were irradiated at 1 and 2 W for 30 seconds, without water spray or air, applying a continuous circular movement (approximately 2 mm/sec) in the apico-coronal direction. At the 1 W power setting, the mean temperature increment was 3.84 degrees C, versus 5.01 degrees C at 2 W. In all cases the difference between the mean value obtained after irradiation and the mean baseline temperature proved statistically significant (p < 0.05). Application of the Er,Cr:YSGG laser gives rise to a statistically significant temperature increment at the external root surface, though this increment is probably clinically irrelevant, since it appears too small to damage the tissues (periodontal ligament and alveolar bone) in proximity to the treated tooth.
Deaths on board ships assisted by the Centro Internazionale Radio Medico in the last 25 years.
Grappasonni, Iolanda; Petrelli, Fabio; Amenta, Francesco
2012-07-01
Data on occupational diseases of seafarers and on causes of death during their career are sparse. The causes of deaths on board ships assisted by Centro Internazionale Radio Medico (CIRM), the Italian Telemedical Maritime Assistance Service (TMAS), were reviewed by examining 29,146 files of patients treated from 1986 to 2010. In these 25 years, 383 deaths occurred (1.31%). Diseases of the circulation were the most frequent, followed by external causes such as accidents and violence, infectious and parasitic diseases, alcohol and drug addiction, and respiratory system diseases. Cardiovascular and external causes were the principal causes of death among seafarers. This investigation is the first study of the causes of death on board ships based on data from a maritime telemedical centre that assisted the seafarers while they were alive or immediately after their death. The fact that diseases of the circulatory system are the first cause of death of sailing seafarers deserves specific initiatives. These should include campaigns for adequate lifestyles and the availability on ships of medical devices useful for diagnostic purposes, resuscitation, and verification of death. Copyright © 2012 Elsevier Ltd. All rights reserved.
Implications for a Wireless, External Device System to Study Electrocorticography
Rotermund, David; Pistor, Jonas; Hoeffmann, Janpeter; Schellenberg, Tim; Boll, Dmitriy; Tolstosheeva, Elena; Gauck, Dieter; Stemmann, Heiko; Peters-Drolshagen, Dagmar; Kreiter, Andreas Kurt; Schneider, Martin; Paul, Steffen; Lang, Walter; Pawelzik, Klaus Richard
2017-01-01
Implantable neuronal interfaces to the brain are an important keystone for future medical applications. However, entering this field of research is difficult since such an implant requires components from many different areas of technology. Since the complete avoidance of wires is important due to the risk of infections and other long-term problems, means for wirelessly transmitting data and energy are a necessity which adds to the requirements. In recent literature, many high-tech components for such implants are presented with remarkable properties. However, these components are typically not freely available for such a system. Every group needs to re-develop their own solution. This raises the question if it is possible to create a reusable design for an implant and its external base-station, such that it allows other groups to use it as a starting point. In this article, we try to answer this question by presenting a design based exclusively on commercial off-the-shelf components and studying the properties of the resulting system. Following this idea, we present a fully wireless neuronal implant for simultaneously measuring electrocorticography signals at 128 locations from the surface of the brain. All design files are available as open source. PMID:28375161
Sequanix: a dynamic graphical interface for Snakemake workflows.
Desvillechabrol, Dimitri; Legendre, Rachel; Rioualen, Claire; Bouchier, Christiane; van Helden, Jacques; Kennedy, Sean; Cokelaer, Thomas
2018-06-01
We designed a PyQt graphical user interface, Sequanix, aimed at democratizing the use of Snakemake pipelines in the NGS space and beyond. By default, Sequanix includes the Sequana NGS pipelines (Snakemake format) (http://sequana.readthedocs.io), and it is also capable of loading any external Snakemake pipeline. New users can easily and visually edit the configuration files of expert-validated pipelines and can interactively execute these production-ready workflows. Sequanix will be useful both to Snakemake developers in exposing their pipelines and to a wide audience of users. Source on http://github.com/sequana/sequana, bio-containers on http://bioconda.github.io and Singularity Hub (http://singularity-hub.org). dimitri.desvillechabrol@pasteur.fr or thomas.cokelaer@pasteur.fr. Supplementary data are available at Bioinformatics online.
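A Snakemake pipeline of the kind Sequanix exposes is a plain-text Snakefile plus a configuration file whose keys the GUI lets users edit. A minimal hypothetical example (not one of the Sequana pipelines):

    # Snakefile (hypothetical minimal pipeline; config keys are editable in a GUI)
    configfile: "config.yaml"        # e.g. contains: samples: [A, B]

    SAMPLES = config["samples"]

    rule all:
        input:
            expand("results/{sample}.counts.txt", sample=SAMPLES)

    rule count_reads:
        input:
            "data/{sample}.fastq"
        output:
            "results/{sample}.counts.txt"
        shell:
            "wc -l {input} > {output}"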
Birney, E; Andrews, D; Bevan, P; Caccamo, M; Cameron, G; Chen, Y; Clarke, L; Coates, G; Cox, T; Cuff, J; Curwen, V; Cutts, T; Down, T; Durbin, R; Eyras, E; Fernandez-Suarez, X M; Gane, P; Gibbins, B; Gilbert, J; Hammond, M; Hotz, H; Iyer, V; Kahari, A; Jekosch, K; Kasprzyk, A; Keefe, D; Keenan, S; Lehvaslaiho, H; McVicker, G; Melsopp, C; Meidl, P; Mongin, E; Pettett, R; Potter, S; Proctor, G; Rae, M; Searle, S; Slater, G; Smedley, D; Smith, J; Spooner, W; Stabenau, A; Stalker, J; Storey, R; Ureta-Vidal, A; Woodwark, C; Clamp, M; Hubbard, T
2004-01-01
The Ensembl (http://www.ensembl.org/) database project provides a bioinformatics framework to organize biology around the sequences of large genomes. It is a comprehensive and integrated source of annotation of large genome sequences, available via interactive website, web services or flat files. As well as being one of the leading sources of genome annotation, Ensembl is an open source software engineering project to develop a portable system able to handle very large genomes and associated requirements. The facilities of the system range from sequence analysis to data storage and visualization and installations exist around the world both in companies and at academic sites. With a total of nine genome sequences available from Ensembl and more genomes to follow, recent developments have focused mainly on closer integration between genomes and external data.
3-D reservoir characterization of the House Creek oil field, Powder River Basin, Wyoming
Higley, Debra K.; Pantea, Michael P.; Slatt, Roger M.
1997-01-01
This CD-ROM is intended to serve a broad audience. An important purpose is to explain geologic and geochemical factors that control petroleum production from the House Creek Field. This information may serve as an analog for other marine-ridge sandstone reservoirs. The 3-D slide and movie images are tied to explanations and 2-D geologic and geochemical images to visualize geologic structures in three dimensions, explain the geologic significance of porosity/permeability distribution across the sandstone bodies, and tie this to petroleum production characteristics in the oil field. Movies, text, images including scanning electron photomicrographs (SEM), thin-section photomicrographs, and data files can be copied from the CD-ROM for use in external mapping, statistical, and other applications.
NASA Technical Reports Server (NTRS)
2004-01-01
KENNEDY SPACE CENTER, FLA. In the Orbiter Processing Facility, United Space Alliance worker Craig Meyer fits an External Tank (ET) digital still camera in the right-hand liquid oxygen umbilical well on Space Shuttle Atlantis. NASA is pursuing use of the camera, beginning with the Shuttle's Return To Flight, to obtain and downlink high-resolution images of the ET following separation of the ET from the orbiter after launch. The Kodak camera will record 24 images, at one frame per 1.5 seconds, on a flash memory card. After orbital insertion, the crew will transfer the images from the memory card to a laptop computer. The files will then be downloaded through the Ku-band system to the Mission Control Center in Houston for analysis.
NASA Technical Reports Server (NTRS)
2004-01-01
KENNEDY SPACE CENTER, FLA. In the Orbiter Processing Facility, an External Tank (ET) digital still camera is positioned into the right-hand liquid oxygen umbilical well on Space Shuttle Atlantis to determine if it fits properly. NASA is pursuing use of the camera, beginning with the Shuttle's Return To Flight, to obtain and downlink high-resolution images of the ET following separation of the ET from the orbiter after launch. The Kodak camera will record 24 images, at one frame per 1.5 seconds, on a flash memory card. After orbital insertion, the crew will transfer the images from the memory card to a laptop computer. The files will then be downloaded through the Ku-band system to the Mission Control Center in Houston for analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Myronakis, M; Cai, W; Dhou, S
Purpose: To design a comprehensive open-source, publicly available, graphical user interface (GUI) to facilitate the configuration, generation, processing and use of the 4D Extended Cardiac-Torso (XCAT) phantom. Methods: The XCAT phantom includes over 9000 anatomical objects as well as respiratory, cardiac and tumor motion. It is widely used for research studies in medical imaging and radiotherapy. The phantom generation process involves the configuration of a text script to parameterize the geometry, motion, and composition of the whole body and objects within it, and to generate simulated PET or CT images. To avoid the need for manual editing or script writing, our MATLAB-based GUI uses slider controls, drop-down lists, buttons and graphical text input to parameterize and process the phantom. Results: Our GUI can be used to: a) generate parameter files; b) generate the voxelized phantom; c) combine the phantom with a lesion; d) display the phantom; e) produce average and maximum intensity images from the phantom output files; f) incorporate irregular patient breathing patterns; and g) generate DICOM files containing phantom images. The GUI provides local help information using tool-tip strings on the currently selected phantom, minimizing the need for external documentation. The DICOM generation feature is intended to simplify the process of importing the phantom images into radiotherapy treatment planning systems or other clinical software. Conclusion: The GUI simplifies and automates the use of the XCAT phantom for imaging-based research projects in medical imaging or radiotherapy. This has the potential to accelerate research conducted with the XCAT phantom and to ease the learning curve for new users. This tool does not include the XCAT phantom software itself. We would like to acknowledge funding from MRA, Varian Medical Systems Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schaumberg, Andrew
The Omics Tools package provides several small trivial tools for work in genomics. This single portable package, the omics.jar file, is a toolbox that works in any Java-based environment, including PCs, Macs, and supercomputers. The number of tools is expected to grow. One tool (called cmsearch.hadoop or cmsearch.local) calls the external cmsearch program to predict non-coding RNA in a genome. The cmsearch program is part of the third-party Infernal package; Omics Tools does not contain Infernal, which may be installed separately. The cmsearch.hadoop subtool requires Apache Hadoop and runs on a supercomputer, whereas cmsearch.local does not and runs on a server. Omics Tools does not contain Hadoop; Hadoop may be installed separately. The other tools (cmgbk, cmgff, fastats, pal, randgrp, randgrpr, randsub) do not interface with third-party tools. Omics Tools is written in the Java and Scala programming languages. Invoking the help command shows the currently available tools, as shown below:
schaumbe@gpint06:~/proj/omics$ java -jar omics.jar help
Known commands are:
cmgbk : compare cmsearch and GenBank Infernal hits
cmgff : compare hits among two GFF (version 3) files
cmsearch.hadoop : find Infernal hits in a genome, on your supercomputer
cmsearch.local : find Infernal hits in a genome, on your workstation
fastats : FASTA stats, e.g. # bases, GC content
pal : stem-loop motif detection by palindromic sequence search (code stub)
randgrp : random subsample without replacement, of groups
randgrpr : random subsample with replacement, of groups (fast)
randsub : random subsample without replacement, of file lines
For more help regarding a particular command, use:
java -jar omics.jar command help
Usage: java -jar omics.jar command args
Chansirinukor, Wunpen; Maher, Christopher G; Latimer, Jane; Hush, Julia
2005-01-01
Retrospective design. To compare the responsiveness and test-retest reliability of the Functional Rating Index and the 18-item version of the Roland-Morris Disability Questionnaire in detecting change in disability in patients with work-related low back pain. Many low back pain-specific disability questionnaires are available, including the Functional Rating Index and the 18-item version of the Roland-Morris Disability Questionnaire. No previous study has compared the responsiveness and reliability of these questionnaires. Files of patients who had been treated for work-related low back pain at a physical therapy clinic were reviewed, and those containing initial and follow-up Functional Rating Index and 18-item Roland-Morris Disability Questionnaires were selected. The responsiveness of both questionnaires was compared using two different methods. First, using the assumption that patients receiving treatment improve over time, various responsiveness coefficients were calculated. Second, using change in work status as an external criterion to identify improved and nonimproved patients, Spearman's rho and receiver operating characteristic curves were calculated. Reliability was estimated from the subset of patients who reported no change in their condition over this period and expressed with the intraclass correlation coefficient and the minimal detectable change. One hundred and forty-three patient files were retrieved. The responsiveness coefficients for the Functional Rating Index were greater than for the 18-item Roland-Morris Disability Questionnaire. The intraclass correlation coefficient values for both questionnaires calculated from 96 patient files were similar, but the minimal detectable change for the Functional Rating Index was less than for the 18-item Roland-Morris Disability Questionnaire. The Functional Rating Index seems preferable to the 18-item Roland-Morris Disability Questionnaire for use in clinical trials and clinical practice.
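The minimal detectable change reported in such studies is conventionally derived from the test-retest ICC and the score standard deviation through the standard error of measurement, SEM = SD·sqrt(1 - ICC) and MDC95 = 1.96·sqrt(2)·SEM. A small Python sketch with illustrative numbers (not the paper's data):

    # Standard error of measurement and 95% minimal detectable change
    # from test-retest reliability (input values are illustrative).
    import math

    def mdc95(sd, icc):
        """MDC95 = 1.96 * sqrt(2) * SEM, with SEM = SD * sqrt(1 - ICC)."""
        sem = sd * math.sqrt(1.0 - icc)
        return 1.96 * math.sqrt(2.0) * sem

    print(mdc95(sd=15.0, icc=0.90))    # hypothetical 0-100 questionnaire scores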
Publications of Western Earth Surface Processes Team 2001
Powell, II; Graymer, R.W.
2002-01-01
The Western Earth Surface Processes Team (WESPT) of the U.S. Geological Survey (USGS) conducts geologic mapping and related topical earth-science studies in the Western United States. This work is focused on areas where modern geologic maps and associated earth-science data are needed to address key societal and environmental issues, such as ground-water quality, landslides and other potential geologic hazards, and land-use decisions. Areas of primary emphasis in 2001 included southern California, the San Francisco Bay region, the Pacific Northwest, and the Las Vegas urban corridor. The team has its headquarters in Menlo Park, California, and maintains smaller field offices at several other locations in the Western United States. The results of research conducted by the WESPT are released to the public as a variety of databases, maps, text reports, and abstracts, both through the internal publication system of the USGS and in diverse external publications such as scientific journals and books. This report lists publications of the WESPT released in 2001, as well as additional 1999 and 2000 publications that were not included in the previous list (USGS Open-File Report 00–215 and USGS Open-File Report 01–198). Most of the publications listed were authored or coauthored by WESPT staff. The list also includes some publications authored by non-USGS cooperators with the WESPT, as well as some authored by USGS staff outside the WESPT in cooperation with WESPT projects. Several of the publications listed are available on the World Wide Web; for these, URL addresses are provided. Many of these web publications are USGS Open-File Reports that contain large digital databases of geologic map and related information.
Sharing electronic structure and crystallographic data with ETSF_IO
NASA Astrophysics Data System (ADS)
Caliste, D.; Pouillon, Y.; Verstraete, M. J.; Olevano, V.; Gonze, X.
2008-11-01
We present a library of routines whose main goal is to read and write exchangeable files (NetCDF file format) storing electronic structure and crystallographic information. It is based on the specification agreed upon within the European Theoretical Spectroscopy Facility (ETSF); accordingly, this library is nicknamed ETSF_IO. The purpose of this article is to give both an overview of the ETSF_IO library and a closer look at its usage. ETSF_IO is designed to be robust and easy to use, close to Fortran read and write routines. To facilitate its adoption, a complete documentation of the input and output arguments of the routines is available in the package, as well as six tutorials explaining in detail various possible uses of the library routines.
Program summary
Catalogue identifier: AEBG_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEBG_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU Lesser General Public License
No. of lines in distributed program, including test data, etc.: 63 156
No. of bytes in distributed program, including test data, etc.: 363 390
Distribution format: tar.gz
Programming language: Fortran 95
Computer: All systems with a Fortran95 compiler
Operating system: All systems with a Fortran95 compiler
Classification: 7.3, 8
External routines: NetCDF, http://www.unidata.ucar.edu/software/netcdf
Nature of problem: Store and exchange electronic structure data and crystallographic data independently of the computational platform, language and generating software.
Solution method: Implement a library based both on the NetCDF file format and an open specification (http://etsf.eu/index.php?page=standardization).
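Since ETSF files are ordinary NetCDF, they can be inspected from any NetCDF binding, not only from the Fortran library itself; a short Python example with the netCDF4 package (the file name is hypothetical):

    # ETSF files are plain NetCDF, so any NetCDF binding can read them;
    # shown here with the Python netCDF4 package (file name hypothetical).
    from netCDF4 import Dataset

    with Dataset("silicon_density.nc") as ds:
        print(list(ds.dimensions))                # dimensions from the ETSF spec
        for name, var in ds.variables.items():
            print(name, var.dimensions, var.shape)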
Zarei, Mina; Javidi, Maryam; Erfanian, Mahdi; Lomee, Mahdi; Afkhami, Farzaneh
2013-01-01
Cleaning and shaping is one of the most important phases in root canal therapy. Various rotary NiTi systems minimize accidents and facilitate the shaping process. Today's NiTi files are used with air-driven and electric handpieces. This study compared canal centering after instrumentation with the ProTaper system using the Endo IT electric torque-control motor and the NSK air-driven handpiece. This ex vivo randomized controlled trial involved 26 mesial mandibular root canals with 10 to 35° curvature. The roots were randomly divided into 2 groups of 13 canals each. The roots were mounted in an endodontic cube with acrylic resin, sectioned horizontally at 2, 6 and 10 mm from the apex and then reassembled. The canals were instrumented according to the manufacturer's instructions using ProTaper rotary files and either an electric torque-control motor (group 1) or an air-driven handpiece (group 2). Photographs of the cross-sections were taken before and after instrumentation, and image analysis was performed using Photoshop software. The centering ability and canal transportation were also evaluated. Repeated-measures analysis and independent t-tests provided statistical analysis of canal transportation. The difference in the rate of transportation toward internal or external walls between the two groups was not statistically significant (p = 0.62). Comparison of the rate of transportation between sections within one group was also not significant (p = 0.28). Use of a rotary NiTi file with either an electric torque-control motor or an air-driven handpiece had no effect on canal centering. NiTi rotary instruments can be used with air-driven motors without any considerable changes in root canal anatomy, although expertise on the part of the clinician is required.
Chhabra, Sanjay; Yadav, Seema; Talwar, Sangeeta
2014-05-01
The study was aimed at acquiring a better understanding of C-shaped canal systems in mandibular second molar teeth through a clinical approach using sophisticated techniques such as the surgical operating microscope and cone beam computed tomography (CBCT). A total of 42 extracted mandibular second molar teeth with fused roots and longitudinal grooves were collected randomly from a native Indian population. The pulp chamber floors of all specimens were examined under the surgical operating microscope and classified into four types (Min's method). Subsequently, the samples were subjected to CBCT scanning after insertion of K-files size #10 or 15 into each canal orifice and evaluated using the cross-sectional and 3-dimensional images in consultation with a dental radiologist, so as to obtain more accurate results. The minimum distance between the external root surface at the groove and the initial file placed in the canal was also measured at different levels and statistically analyzed. Out of 42 teeth, the maximum number of samples (15) belonged to the Type-II category. A total of 100 files were inserted in 86 orifices of the various types of specimens. Evaluation of the CBCT scan images of the teeth revealed that a total of 21 canals were missing completely or partially at different levels. The mean values for the minimum thickness were highest at the coronal level, followed by the middle and apical third levels, in all categories. The lowest values were obtained for teeth of the Type-III category at all three levels. The present study revealed anatomical variations of the C-shaped canal system in mandibular second molars. The prognosis of such complex canal anatomies can be improved by the simultaneous employment of modern techniques such as the surgical operating microscope and CBCT.
Programming PHREEQC calculations with C++ and Python a comparative study
Charlton, Scott R.; Parkhurst, David L.; Muller, Mike
2011-01-01
The new IPhreeqc module provides an application programming interface (API) to facilitate coupling of other codes with the U.S. Geological Survey geochemical model PHREEQC. Traditionally, loose coupling of PHREEQC with other applications required methods to create PHREEQC input files, start external PHREEQC processes, and process PHREEQC output files. IPhreeqc eliminates most of this effort by providing direct access to PHREEQC capabilities through a component object model (COM), a library, or a dynamically linked library (DLL). Input and calculations can be specified through internally programmed strings, and all data exchange between an application and the module can occur in computer memory. This study compares simulations programmed in C++ and Python that are tightly coupled with IPhreeqc modules to the traditional simulations that are loosely coupled to PHREEQC. The study compares performance, quantifies effort, and evaluates lines of code and the complexity of the design. The comparisons show that IPhreeqc offers a more powerful and simpler approach for incorporating PHREEQC calculations into transport models and other applications that need to perform PHREEQC calculations. The IPhreeqc module facilitates the design of coupled applications and significantly reduces run times. Even a moderate knowledge of one of the supported programming languages allows more efficient use of PHREEQC than the traditional loosely coupled approach.
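A minimal sketch of the tightly coupled style in Python, assuming the phreeqpy binding to the IPhreeqc module (the database path and input block are illustrative):

    # Tightly coupled PHREEQC run via the IPhreeqc module, using the phreeqpy
    # binding; the database path and input block are illustrative.
    from phreeqpy.iphreeqc.phreeqc_dll import IPhreeqc

    phc = IPhreeqc()
    phc.load_database("phreeqc.dat")
    phc.run_string("""
        SOLUTION 1
            temp 25
            pH   7.0
            Na   10
            Cl   10
        SELECTED_OUTPUT
            -pH true
        END
    """)
    print(phc.get_selected_output_array())   # results exchanged in memory, no files

The key design point, as the abstract notes, is that input strings and results stay in computer memory, avoiding the write-input-file, spawn-process, parse-output-file cycle of the loosely coupled approach.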
Okazaki, Masato; Pander, Piotr; Higginbotham, Heather; Monkman, Andrew P.
2017-01-01
Novel U-shaped donor–acceptor–donor (D–A–D) π-conjugated multi-functional molecules comprising dibenzo[a,j]phenazine (DBPHZ) as an acceptor and phenothiazines (PTZ) as donors have been developed. Most importantly, the D–A–D compounds exhibit not only distinct tricolor-changeable mechanochromic luminescence (MCL) properties but also efficient thermally activated delayed fluorescence (TADF). Quantum chemical calculations, X-ray diffraction analysis, and systematic studies on the photophysical properties indicated that the “two-conformation-switchable” PTZ units play a highly important role in achieving multi-color-changing MCL. Time-resolved photophysical measurements revealed that the developed D–A–D compounds also exhibit efficient orange-TADF. Furthermore, organic light-emitting diode (OLED) devices fabricated with the new TADF emitters have achieved high external quantum efficiencies (EQEs) up to 16.8%, which significantly exceeds the theoretical maximum (∼5%) of conventional fluorescent emitters. PMID:28553504
Experiment Software and Projects on the Web with VISPA
NASA Astrophysics Data System (ADS)
Erdmann, M.; Fischer, B.; Fischer, R.; Geiser, E.; Glaser, C.; Müller, G.; Rieger, M.; Urban, M.; von Cube, R. F.; Welling, C.
2017-10-01
The Visual Physics Analysis (VISPA) project defines a toolbox for accessing software via the web. It is based on the latest web technologies and provides a powerful extension mechanism that enables it to interface a wide range of applications. Beyond basic applications such as a code editor, a file browser, or a terminal, it meets the demands of sophisticated experiment-specific use cases that focus on physics data analyses and typically require a high degree of interactivity. As an example, we developed a data inspector that is capable of browsing interactively through the event content of several data formats, e.g., MiniAOD, which is utilized by the CMS collaboration. The VISPA extension mechanism can also be used to embed external web-based applications that benefit from dynamic allocation of user-defined computing resources via SSH. For example, by wrapping the JSROOT project, ROOT files located on any remote machine can be inspected directly through a VISPA server instance. We introduced domains that combine groups of users and role-based permissions. Thereby, tailored projects are enabled, e.g. for teaching, where access to students' homework is restricted to a team of tutors, or for experiment-specific data that may only be accessible to members of the collaboration. We present the extension mechanism including corresponding applications and give an outlook onto the new permission system.
Publications of the Western Earth Surface Processes Team 2002
Powell, Charles; Graymer, R.W.
2003-01-01
The Western Earth Surface Processes Team (WESPT) of the U.S. Geological Survey (USGS) conducts geologic mapping and related topical earth science studies in the western United States. This work is focused on areas where modern geologic maps and associated earth-science data are needed to address key societal and environmental issues such as ground-water quality, landslides and other potential geologic hazards, and land-use decisions. Areas of primary emphasis in 2001 included southern California, the San Francisco Bay region, the Pacific Northwest, and the Las Vegas urban corridor. The team has its headquarters in Menlo Park, California, and maintains smaller field offices at several other locations in the western United States. The results of research conducted by the WESPT are released to the public as a variety of databases, maps, text reports, and abstracts, both through the internal publication system of the USGS and in diverse external publications such as scientific journals and books. This report lists publications of the WESPT released in 2002 as well as additional 1998 and 2001 publications that were not included in the previous list (USGS Open-File Report 00-215, USGS Open-File Report 01-198, and USGS Open-File Report 02-269). Most of the publications listed were authored or coauthored by WESPT staff. The list also includes some publications authored by non-USGS cooperators with the WESPT, as well as some authored by USGS staff outside the WESPT in cooperation with WESPT projects. Several of the publications listed are available on the World Wide Web; for these, URL addresses are provided. Many of these web publications are USGS open-file reports that contain large digital databases of geologic map and related information. Information on ordering USGS publications can be found on the World Wide Web or by calling 1-888-ASK-USGS. The U.S. Geological Survey's web server for geologic information in the western United States is located at http://geology.wr.usgs.gov. More information about the WESPT is available online at the team website.
Resources for comparing the speed and performance of medical autocoders.
Berman, Jules J
2004-06-15
Concept indexing is a popular method for characterizing medical text, and is one of the most important early steps in many data mining efforts. Concept indexing differs from simple word or phrase indexing because concepts are typically represented by a nomenclature code that binds a medical concept to all equivalent representations. A concept search on the term renal cell carcinoma would be expected to find occurrences of hypernephroma and renal carcinoma (concept equivalents). The purpose of this study is to provide freely available resources to compare speed and performance among different autocoders. These tools consist of: 1) a public domain autocoder written in Perl (a free and open source programming language that installs on any operating system); 2) a nomenclature database derived from the unencumbered subset of the publicly available Unified Medical Language System; 3) a large corpus of autocoded output derived from a publicly available medical text. A simple lexical autocoder was written that parses plain text into a listing of all 1-, 2-, 3-, and 4-word strings contained in the text, assigning a nomenclature code to text strings that match terms in the nomenclature. The nomenclature used is the unencumbered subset of the 2003 Unified Medical Language System (UMLS). The unencumbered subset of UMLS was reduced to exclude homonymous one-word terms and proper names, resulting in a term/code data dictionary containing about a half million medical terms. The Online Mendelian Inheritance in Man (OMIM), a 92+ Megabyte publicly available medical opus, was used as sample medical text for the autocoder. The autocoding Perl script is remarkably short, consisting of just 38 command lines. The 92+ Megabyte OMIM file was completely autocoded in 869 seconds on a 2.4 GHz processor (less than 10 seconds per Megabyte of text). The autocoded output file (9,540,442 bytes) contains 367,963 coded terms from OMIM and is distributed with this manuscript. A public domain Perl script is provided that can parse through plain-text files of any length, matching concepts against an external nomenclature. The script and associated files can be used freely to compare the speed and performance of autocoding software.
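The core of such a lexical autocoder is compact: slide a window of one to four words over the text and look each phrase up in the nomenclature. A Python rendering of that idea (the published script is in Perl; the toy dictionary below stands in for the half-million-term UMLS subset, and the codes are for illustration):

    # Minimal lexical autocoder: match all 1-4 word phrases against a
    # term -> code dictionary (toy dictionary; codes for illustration only).
    import re

    nomenclature = {
        "renal cell carcinoma": "C0007134",
        "hypernephroma": "C0007134",   # concept equivalents share one code
    }

    def autocode(text, max_words=4):
        words = re.findall(r"[a-z0-9]+", text.lower())
        hits = []
        for i in range(len(words)):
            for n in range(1, max_words + 1):
                phrase = " ".join(words[i:i + n])
                if phrase in nomenclature:
                    hits.append((phrase, nomenclature[phrase]))
        return hits

    print(autocode("Hypernephroma is a synonym of renal cell carcinoma."))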
Steinbach, Gábor; Kaňa, Radek
2016-04-01
Photosynthesis research employs several biophysical methods, including the detection of fluorescence. Even though fluorescence is a key method to detect photosynthetic efficiency, it has not been applied/adapted to single-cell confocal microscopy measurements to examine photosynthetic microorganisms. Experiments with photosynthetic cells may require automation to perform a large number of measurements with different parameters, especially concerning light conditions. However, while commercial microscopes support custom protocols (through the Time Controller offered by Olympus or the Experiment Designer offered by Zeiss), these are often unable to provide special set-ups and connections to external devices (e.g., for irradiation). Our new system combining an Arduino microcontroller with the Cell⊕Finder software was developed for controlling Olympus FV1000 and FV1200 confocal microscopes and the attached hardware modules. Our software/hardware solution offers (1) a text file-based macro language to control the imaging functions of the microscope; (2) programmable control of several external hardware devices (light sources, thermal controllers, actuators) during imaging via the Arduino microcontroller; (3) the Cell⊕Finder software with an ergonomic user environment, a fast selection method for biologically important cells and a precise positioning feature that reduces unwanted bleaching of the cells by the scanning laser. Cell⊕Finder can be downloaded from http://www.alga.cz/cellfinder. The system was applied to study changes in fluorescence intensity in Synechocystis sp. PCC6803 cells under long-term illumination. Thus, we were able to describe the kinetics of phycobilisome decoupling. Microscopy data showed that phycobilisome decoupling appears slowly after long-term (>1 h) exposure to high light.
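As a hedged illustration of the hardware-control idea only (not the Cell⊕Finder protocol, whose command set is not given here), switching an Arduino-attached light source between acquisitions from Python over a serial link could look roughly like this; the port name and the LIGHT commands are assumptions:

```python
import time
import serial  # pyserial

PORT = "/dev/ttyACM0"          # assumption: the Arduino enumerates here

with serial.Serial(PORT, 9600, timeout=1) as board:
    time.sleep(2)              # most Arduino boards reset when the port opens
    board.write(b"LIGHT 1\n")  # hypothetical command: illumination ON
    time.sleep(60)             # illumination interval between acquisitions
    board.write(b"LIGHT 0\n")  # hypothetical command: illumination OFF
```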
Kocher, Katharina; Kowalski, Piotr; Kolokitha, Olga-Elpis; Katsaros, Christos; Fudalej, Piotr S
2016-05-01
To determine whether judgment of nasolabial esthetics in cleft lip and palate (CLP) is influenced by overall facial attractiveness. Experimental study. University of Bern, Switzerland. Seventy-two fused images (36 of boys, 36 of girls) were constructed. Each image comprised (1) the nasolabial region of a treated child with complete unilateral CLP (UCLP) and (2) the external facial features, i.e., the face with masked nasolabial region, of a noncleft child. Photographs of the nasolabial region of six boys and six girls with UCLP representing a wide range of esthetic outcomes, i.e., from very good to very poor appearance, were randomly chosen from a sample of 60 consecutively treated patients in whom nasolabial esthetics had been rated in a previous study. Photographs of external facial features of six boys and six girls without UCLP with various esthetics were randomly selected from patients' files. Eight lay raters evaluated the fused images using a 100-mm visual analogue scale. Method reliability was assessed by reevaluation of fused images after >1 month. A regression model was used to analyze which elements of facial esthetics influenced the perception of nasolabial appearance. Method reliability was good. A regression analysis demonstrated that only the appearance of the nasolabial area affected the esthetic scores of fused images (coefficient = -11.44; P < .001; R² = 0.464). The appearance of the external facial features did not influence perceptions of fused images. Cropping facial images for assessment of nasolabial appearance in CLP seems unnecessary. Instead, esthetic evaluation can be performed on images of full faces.
Tanaka, Shinobu; Hayashi, Shigeki; Fukushima, Satoshi; Yasuki, Tsuyoshi
2013-01-01
This article describes the chest injury risk reduction effect of shoulder restraints using finite element (FE) models of the worldwide harmonized side impact dummy (WorldSID) and the Total Human Model for Safety (THUMS) in an FE model of a 32 km/h oblique pole side impact. This research used an FE model of a mid-sized vehicle equipped with various combinations of curtain shield air bags, torso air bags, and shoulder restraint air bags. As occupant models, the AM50 WorldSID and THUMS AM50 Version 4 were used for comparison. The research investigated the effect of the shoulder restraint air bag on chest injury by comparing cases with and without a shoulder side air bag. The maximum external force to the chest was reduced by the shoulder restraint air bag in both WorldSID and THUMS, reducing chest injury risk as measured by the amount of rib deflection, the number of rib fractures, and the rib deflection ratio. However, it was also determined that the external force to the shoulder should be limited to the chest injury threshold, because the external shoulder force is transmitted to the chest via the arm in the case of WorldSID and via the scapula in the case of THUMS. Because these results show the effect of the shoulder restraint air bag on chest injury risk, the vent hole size of the shoulder restraint air bag was varied to produce different reaction forces and to investigate the relationship between the external force to the shoulder and the risk of chest injury. In the case of THUMS, an external shoulder force of 1.8 kN or more from the shoulder restraint air bag was necessary to help prevent rib fracture. Increasing the external force applied to the shoulder up to 6.2 kN (the maximum force used in this study) did not induce any rib or clavicle fractures in THUMS. When shoulder forces of 1.8 to 6.2 kN, as generated by the shoulder restraint air bag in THUMS, were applied to the WorldSID, the shoulder deflection ranged from 35 to 68 mm and the shoulder force ranged from 1.8 to 2.3 kN. In the test configuration used, a shoulder restraint using the air bag helps reduce chest injury risk by lowering the maximum magnitude of external force to the shoulder and chest. To help reduce rib fracture risk in THUMS, the shoulder restraint air bag was expected to generate a force of 3.7 kN with a minimum rib deflection ratio. This corresponds to a shoulder rib deflection of 60 mm and a shoulder load of 2.2 kN in WorldSID. Supplemental materials are available for this article. Go to the publisher's online edition of Traffic Injury Prevention to view the supplemental file.
ARC: An open-source library for calculating properties of alkali Rydberg atoms
NASA Astrophysics Data System (ADS)
Šibalić, N.; Pritchard, J. D.; Adams, C. S.; Weatherill, K. J.
2017-11-01
We present an object-oriented Python library for the computation of properties of highly-excited Rydberg states of alkali atoms. These include single-body effects such as dipole matrix elements, excited-state lifetimes (radiative and black-body limited) and Stark maps of atoms in external electric fields, as well as two-atom interaction potentials accounting for dipole and quadrupole coupling effects valid at both long and short range for arbitrary placement of the atomic dipoles. The package is cross-referenced to precise measurements of atomic energy levels and features extensive documentation to facilitate rapid upgrade or expansion by users. This library has direct application in the field of quantum information and quantum optics which exploit the strong Rydberg dipolar interactions for two-qubit gates, robust atom-light interfaces and simulating quantum many-body physics, as well as the field of metrology using Rydberg atoms as precise microwave electrometers. Program Files doi:http://dx.doi.org/10.17632/hm5n8w628c.1 Licensing provisions: BSD-3-Clause Programming language: Python 2.7 or 3.5, with C extension External Routines: NumPy [1], SciPy [1], Matplotlib [2] Nature of problem: Calculating atomic properties of alkali atoms including lifetimes, energies, Stark shifts and dipole-dipole interaction strengths using matrix elements evaluated from radial wavefunctions. Solution method: Numerical integration of radial Schrödinger equation to obtain atomic wavefunctions, which are then used to evaluate dipole matrix elements. Properties are calculated using second order perturbation theory or exact diagonalisation of the interaction Hamiltonian, yielding results valid even at large external fields or small interatomic separation. Restrictions: External electric field fixed to be parallel to quantisation axis. Supplementary material: Detailed documentation (.html), and Jupyter notebook with examples and benchmarking runs (.html and .ipynb). [1] T.E. Oliphant, Comput. Sci. Eng. 9, 10 (2007). http://www.scipy.org/. [2] J.D. Hunter, Comput. Sci. Eng. 9, 90 (2007). http://matplotlib.org/.
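A brief usage sketch of the library follows, based on its published documentation; the method names are believed correct for recent releases, but exact signatures and defaults may differ between versions:

```python
from arc import Rubidium   # pip install ARC-Alkali-Rydberg-Calculator

atom = Rubidium()
# Radiative lifetime of the 30 S_{1/2} Rydberg state (seconds, 0 K)
print(atom.getStateLifetime(30, 0, 0.5))
# Transition wavelength 30 S_{1/2} -> 30 P_{1/2} (metres)
print(atom.getTransitionWavelength(30, 0, 0.5, 30, 1, 0.5))
```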
Effect of the Rhinoplasty Technique and Lateral Osteotomy on Periorbital Edema and Ecchymosis.
Kiliç, Caner; Tuncel, Ümit; Cömert, Ela; Şencan, Ziya
2015-07-01
The present study aimed to compare edema and ecchymosis in the early and late postoperative periods following the application of different surgical techniques (open and endonasal) and different types of lateral osteotomy (internal and external). The files and photographs of a total of 120 patients whose records were regularly maintained/updated and who underwent septorhinoplasty operations with the same surgeon were retrospectively evaluated. Sixty-nine (57.5%) patients were women and 51 (42.5%) were men. The patients were divided into 4 groups according to the operations they underwent: Group I: open technique septorhinoplasty + internal/continuous lateral osteotomy; Group II: endonasal rhinoplasty + internal/continuous lateral osteotomy; Group III: open technique septorhinoplasty + external/perforating lateral osteotomy; and Group IV: endonasal rhinoplasty + external/perforating lateral osteotomy. Postoperative edema and ecchymosis, and lateral nasal wall mucosal damage caused by osteotomy, were evaluated. Postoperative second day edema and ecchymosis scores were statistically significantly better in patients in Group II compared with the patients in Group I (P = 0.010 and P = 0.004, respectively). Postoperative first day edema and postoperative seventh day ecchymosis scores were statistically significantly better in the patients in Group IV compared with the patients in Group III (P = 0.025 and P = 0.011, respectively). Intraoperative bleeding was similar in all groups. The nasal tip was more flexible in patients who underwent closed technique rhinoplasty. Unilateral mucosal damage occurred in 3 patients (4%) with internal lateral osteotomy, whereas no mucosal damage was present in patients with external osteotomy. The difference in the rate of edema and ecchymosis in the early postoperative period between the closed technique rhinoplasty and the open surgical approach was statistically significant, whereas osteotomy did not cause a significant difference. According to these results, the authors suggest endonasal surgery to prevent the development of edema and ecchymosis, whereas the choice of lateral osteotomy should depend on the experience of the surgeon.
NASA Technical Reports Server (NTRS)
2004-01-01
KENNEDY SPACE CENTER, FLA. - In the Orbiter Processing Facility, from left, United Space Alliance workers Loyd Turner, Craig Meyer and Erik Visser prepare to conduct a fit check of an External Tank (ET) digital still camera in the right-hand liquid oxygen umbilical well on Space Shuttle Atlantis. NASA is pursuing use of the camera, beginning with the Shuttle’s Return To Flight, to obtain and downlink high-resolution images of the ET following separation of the ET from the orbiter after launch. The Kodak camera will record 24 images, at one frame per 1.5 seconds, on a flash memory card. After orbital insertion, the crew will transfer the images from the memory card to a laptop computer. The files will then be downloaded through the Ku-band system to the Mission Control Center in Houston for analysis.
NASA Technical Reports Server (NTRS)
2004-01-01
KENNEDY SPACE CENTER, FLA. - In the Orbiter Processing Facility, from left, United Space Alliance workers Loyd Turner, Craig Meyer and Erik Visser conduct a fit check of an External Tank (ET) digital still camera in the right-hand liquid oxygen umbilical well on Space Shuttle Atlantis. NASA is pursuing use of the camera, beginning with the Shuttle’s Return To Flight, to obtain and downlink high-resolution images of the ET following separation of the ET from the orbiter after launch. The Kodak camera will record 24 images, at one frame per 1.5 seconds, on a flash memory card. After orbital insertion, the crew will transfer the images from the memory card to a laptop computer. The files will then be downloaded through the Ku-band system to the Mission Control Center in Houston for analysis.
Blonigen, Daniel M; Patrick, Christopher J; Douglas, Kevin S; Poythress, Norman G; Skeem, Jennifer L; Lilienfeld, Scott O; Edens, John F; Krueger, Robert F
2010-03-01
Research to date has revealed divergent relations across factors of psychopathy measures with criteria of internalizing (INT; anxiety, depression) and externalizing (EXT; antisocial behavior, substance use). However, failure to account for method variance and suppressor effects has obscured the consistency of these findings across distinct measures of psychopathy. Using a large correctional sample, the current study employed a multimethod approach to psychopathy assessment (self-report, interview and file review) to explore convergent and discriminant relations between factors of psychopathy measures and latent criteria of INT and EXT derived from the Personality Assessment Inventory (Morey, 2007). Consistent with prediction, scores on the affective-interpersonal factor of psychopathy were negatively associated with INT and negligibly related to EXT, whereas scores on the social deviance factor exhibited positive associations (moderate and large, respectively) with both INT and EXT. Notably, associations were highly comparable across the psychopathy measures when accounting for method variance (in the case of EXT) and when assessing for suppressor effects (in the case of INT). Findings are discussed in terms of implications for clinical assessment and evaluation of the validity of interpretations drawn from scores on psychopathy measures. PsycINFO Database Record (c) 2010 APA, all rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Yu; Zhao, Yan-Jiao; Huang, Ji-Ping
2012-07-01
The detection of macromolecular conformation is particularly important in many physical and biological applications. Here we theoretically explore a method for achieving this detection by probing the electricity of sequential charged segments of macromolecules. Our analysis is based on molecular dynamics simulations, and we investigate a single file of water molecules confined in a half-capped single-walled carbon nanotube (SWCNT) with an external electric charge of +e or -e (e is the elementary charge). The charge is located in the vicinity of the cap of the SWCNT and along the centerline of the SWCNT. We reveal the picosecond timescale for the re-orientation (namely, from one direction to the other) of the water molecules in response to a switch in the charge signal, -e → +e or +e → -e. Our results are well understood by taking into account the electrical interactions between the water molecules and between the water molecules and the external charge. Because such signals of re-orientation can be magnified and transported according to Tu et al. [2009 Proc. Natl. Acad. Sci. USA 106 18120], it becomes possible to record fingerprints of electric signals arising from sequential charged segments of a macromolecule, which are expected to be useful for recognizing the conformations of some particular macromolecules.
Measurement of time delay for a prospectively gated CT simulator.
Goharian, M; Khan, R F H
2010-04-01
For the management of mobile tumors, respiratory gating is the ideal option, both during imaging and during therapy. The major advantage of respiratory gating during imaging is that it is possible to create a single artifact-free CT data set during a selected phase of the patient's breathing cycle. The purpose of the present work is to describe a simple technique to measure the time delay during acquisition of a prospectively gated CT. The time delay of a Philips Brilliance BigBore (Philips Medical Systems, Madison, WI) scanner attached to a Varian Real-Time Position Management (RPM) system (Varian Medical Systems, Palo Alto, CA) was measured. Two methods were used to measure the CT time delay: using a motion phantom and using a recorded data file from the RPM system. In the first technique, a rotating wheel phantom was altered by placing two plastic balls on its axis and rim, respectively. For a desired gate, the relative positions of the balls were measured from the acquired CT data and converted into corresponding phases. The phase difference was calculated between the measured phases and the desired phases. Using the period of motion, the phase difference was converted into a time delay. The Varian RPM system provides an external breathing signal; it also records a transistor-transistor logic (TTL) 'X-Ray ON' status signal from the CT scanner in a text file. The TTL 'X-Ray ON' signal indicates the start of CT image acquisition. Thus, knowledge of the start time of CT acquisition, combined with the real-time phase and amplitude data from the external respiratory signal, provides time-stamping of all images in an axial CT scan. The TTL signal with time-stamp was used to calculate when (during the breathing cycle) a slice was recorded. Using the two approaches, the time delay between the prospective gating signal and the CT simulator was determined to be 367 ± 40 ms. The delay requires corrections both at image acquisition and while setting gates for the treatment delivery; otherwise the simulation and treatment may not be correlated with the patient's breathing.
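The wheel-phantom arithmetic reduces to converting a phase offset into time using the motion period. A worked toy example, with invented numbers chosen only to land near the reported delay:

```python
period_s = 4.0          # phantom rotation / breathing period (illustrative)
phase_desired = 0.33    # phase the gate was set to (fraction of a cycle)
phase_measured = 0.42   # phase reconstructed from the ball positions in CT

# phase offset (wrapped into one cycle) times the period gives the delay
delay_s = ((phase_measured - phase_desired) % 1.0) * period_s
print(f"time delay ~ {delay_s * 1000:.0f} ms")   # -> 360 ms for these numbers
```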
AgdbNet – antigen sequence database software for bacterial typing
Jolley, Keith A; Maiden, Martin CJ
2006-01-01
Background Bacterial typing schemes based on the sequences of genes encoding surface antigens require databases that provide a uniform, curated, and widely accepted nomenclature of the variants identified. Due to the differences in typing schemes, imposed by the diversity of genes targeted, creating these databases has typically required the writing of one-off code to link the database to a web interface. Here we describe agdbNet, widely applicable web database software that facilitates simultaneous BLAST querying of multiple loci using either nucleotide or peptide sequences. Results Databases are described by XML files that are parsed by a Perl CGI script. Each database can have any number of loci, which may be defined by nucleotide and/or peptide sequences. The software is currently in use on at least five public databases for the typing of Neisseria meningitidis, Campylobacter jejuni and Streptococcus equi and can be set up to query internal isolate tables or suitably-configured external isolate databases, such as those used for multilocus sequence typing. The style of the resulting website can be fully configured by modifying stylesheets and through the use of customised header and footer files that surround the output of the script. Conclusion The software provides a rapid means of setting up customised Internet antigen sequence databases. The flexible configuration options enable typing schemes with differing requirements to be accommodated. PMID:16790057
Plotting and Analyzing Data Trends in Ternary Diagrams Made Easy
NASA Astrophysics Data System (ADS)
John, Cédric M.
2004-04-01
Ternary plots are used in many fields of science to characterize a system based on three components. Triangular plotting is thus useful to a broad audience in the Earth sciences and beyond. Unfortunately, it is typically the most expensive commercial software packages that offer the option to plot data in ternary diagrams, and they lack features that are paramount to the geosciences, such as the ability to plot data directly into a standardized diagram and the ability to analyze temporal and stratigraphic trends within this diagram. To address these issues, δPlot was developed with a strong emphasis on ease of use, community orientation, and availability free of charge. This "freeware" supports a fully graphical user interface where data can be imported as text files, or by copying and pasting. A plot is automatically generated, and any standard diagram can be selected for plotting in the background using a simple pull-down menu. Standard diagrams are stored in an external database of PDF files that currently holds some 30 diagrams covering different fields of the Earth sciences. Using any drawing software supporting PDF, one can easily produce new standard diagrams to be used with δPlot by simply adding them to the library folder. An independent column of values, commonly stratigraphic depths or ages, can be used to sort the data sets.
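δPlot itself is a GUI application, but the mapping underlying any ternary plot is a simple barycentric-to-Cartesian conversion; a minimal sketch with illustrative sample data:

```python
import matplotlib.pyplot as plt

def ternary_xy(a, b, c):
    """Map components (a, b, c) onto the unit triangle:
    a -> (0, 0), b -> (1, 0), c -> (0.5, sqrt(3)/2)."""
    s = float(a + b + c)
    b, c = b / s, c / s                    # normalise to a + b + c = 1
    return 0.5 * (2 * b + c), (3 ** 0.5 / 2) * c

samples = [(70, 20, 10), (40, 40, 20), (10, 30, 60)]   # e.g. sand/silt/clay
xs, ys = zip(*(ternary_xy(*s) for s in samples))
plt.plot(xs, ys, "o-")    # connecting points in order shows a stratigraphic trend
plt.show()
```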
Home Care Providers to the Rescue: A Novel First-Responder Programme
Hansen, Steen M.; Brøndum, Stig; Thomas, Grethe; Rasmussen, Susanne R.; Kvist, Birgitte; Christensen, Anette; Lyng, Charlotte; Lindberg, Jan; Lauritsen, Torsten L. B.; Lippert, Freddy K.; Torp-Pedersen, Christian; Hansen, Poul A.
2015-01-01
Aim To describe the implementation of a novel first-responder programme in which home care providers equipped with automated external defibrillators (AEDs) were dispatched in parallel with existing emergency medical services in the event of a suspected out-of-hospital cardiac arrest (OHCA). Methods We evaluated a one-year prospective study that trained home care providers in performing cardiopulmonary resuscitation (CPR) and using an AED in cases of suspected OHCA. Data were collected from cardiac arrest case files, case files from each provider dispatch and a survey among dispatched providers. The study was conducted in a rural district in Denmark. Results Home care providers were dispatched to 28 of the 60 OHCAs that occurred in the study period. In ten cases the providers arrived before the ambulance service and subsequently performed CPR. AED analysis was executed in three cases and shock was delivered in one case. For 26 of the 28 cases, the cardiac arrest occurred in a private home. Ninety-five per cent of the providers who had been dispatched to a cardiac arrest reported feeling prepared for managing the initial resuscitation, including use of AED. Conclusion Home care providers are suited to act as first-responders in predominantly rural and residential districts. Future follow-up will allow further evaluation of home care provider arrivals and patient survival. PMID:26509532
Su, Jiaye; Guo, Hongxia
2011-01-25
The transport of water molecules through nanopores is not only crucial to biological activities but also useful for designing novel nanofluidic devices. Despite considerable effort and progress that has been made, a controllable and unidirectional water flow is still difficult to achieve and the underlying mechanism is far from being understood. In this paper, using molecular dynamics simulations, we systematically investigate the effects of an external electric field on the transport of single-file water molecules through a carbon nanotube (CNT). We find that the orientation of water molecules inside the CNT can be well-tuned by the electric field and is strongly coupled to the water flux. This orientation-induced water flux is energetically due to the asymmetrical water-water interaction along the CNT axis. The wavelike water density profiles are disturbed under strong field strengths. The frequency of flipping for the water dipoles decreases as the field strength is increased, and the flipping events vanish completely for relatively large field strengths. Most importantly, a critical field strength E_c related to the water flux is found. The water flux increases as E is increased for E ≤ E_c, while it is almost unchanged for E > E_c. Thus, the electric field offers a means of controlling unidirectional water flow, which may have some biological applications and provides a route for designing efficient nanopumps.
Vallejo, J.; Viciano-Chumillas, M.; Castro, I.; Amorós, P.; Déniz, M.; Ruiz-Pérez, C.; Yuste-Vivas, C.; Krzystek, J.; Julve, M.; Lloret, F.
2017-01-01
A vast impact on molecular nanoscience can be achieved using simple transition metal complexes as dynamic chemical systems to perform specific and selective tasks under the control of an external stimulus that switches “ON” and “OFF” their electronic properties. While the interest in single-ion magnets (SIMs) lies in their potential applications in information storage and quantum computing, the switching of their slow magnetic relaxation associated with host–guest processes is insufficiently explored. Herein, we report a unique example of a mononuclear cobalt(II) complex in which geometrical constraints are the cause of easy and reversible water coordination and its release. As a result, a reversible and selective colour and SIM behaviour switch occurs between a “slow-relaxing” deep red anhydrous material (compound 1) and its “fast-relaxing” orange hydrated form (compound 2). The combination of this optical and magnetic switching in this new class of vapochromic and thermochromic SIMs offers fascinating possibilities for designing multifunctional molecular materials. PMID:28580105
The beam stop array method to measure object scatter in digital breast tomosynthesis
NASA Astrophysics Data System (ADS)
Lee, Haeng-hwa; Kim, Ye-seul; Park, Hye-Suk; Kim, Hee-Joung; Choi, Jae-Gu; Choi, Young-Wook
2014-03-01
Scattered radiation is inevitably generated in the object. The distribution of the scattered radiation is influenced by object thickness, field size, object-to-detector distance, and primary energy. One way to measure scatter intensities involves measuring the signal detected under the shadow of the lead discs of a beam-stop array (BSA). The scatter measured by the BSA includes not only the scattered radiation within the object (object scatter) but also scatter from external sources. The components of the external scatter source include the X-ray tube, detector, collimator, X-ray filter, and the BSA itself. Excluding this background scattered radiation can be adapted to different scanner geometries by simple parameter adjustments, without prior knowledge of the scanned object. In this study, a method using the BSA to differentiate scatter in the phantom (object scatter) from the external background was used. Furthermore, this method was applied to the BSA algorithm to correct the object scatter. In order to confirm the background scattered radiation, we obtained scatter profiles and scatter fraction (SF) profiles in the directions perpendicular to the chest wall edge (CWE) with and without scattering material. The scatter profiles with and without the scattering material were similar in the region between 127 mm and 228 mm from the chest wall. This result indicated that the scatter measured by the BSA included background scatter. Moreover, the BSA algorithm with the proposed method could correct the object scatter, because the total radiation profiles after object scatter correction corresponded to the original image in the region between 127 mm and 228 mm from the chest wall. As a result, the BSA method to measure object scatter could be used to remove background scatter. This method can be applied to different scanner geometries after background scatter correction. In conclusion, the BSA algorithm with the proposed method is effective for correcting object scatter.
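As a toy illustration of the quantities involved (all numbers invented): under each lead disc only scatter is recorded, the adjacent open field records primary plus scatter, and a disc measurement taken without the phantom estimates the external background to subtract:

```python
S_disc_phantom = 120.0   # signal under a lead disc, phantom in place
S_disc_empty   = 35.0    # same disc, no phantom: external (background) scatter
T_open         = 900.0   # total signal in open pixels next to the disc (P + S)

object_scatter = S_disc_phantom - S_disc_empty   # background-corrected scatter
sf = S_disc_phantom / T_open                     # scatter fraction SF = S / (P + S)
print(f"object scatter = {object_scatter:.0f}, SF = {sf:.3f}")
```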
DOT National Transportation Integrated Search
2001-02-01
The Minnesota data system includes the following basic files: Accident data (Accident File, Vehicle File, Occupant File); Roadlog File; Reference Post File; Traffic File; Intersection File; Bridge (Structures) File; and RR Grade Crossing File. For ea...
Scabies: Workplace Frequently Asked Questions (FAQs)
FastStats: Chronic Liver Disease and Cirrhosis
MaMiCo: Transient multi-instance molecular-continuum flow simulation on supercomputers
NASA Astrophysics Data System (ADS)
Neumann, Philipp; Bian, Xin
2017-11-01
We present extensions of the macro-micro-coupling tool MaMiCo, which was designed to couple continuum fluid dynamics solvers with discrete particle dynamics. To enable local extraction of smooth flow field quantities, especially on rather short time scales, sampling over an ensemble of molecular dynamics simulations is introduced. We provide details on these extensions, including the transient coupling algorithm, open boundary forcing, and multi-instance sampling. Furthermore, we validate the coupling in Couette flow using different particle simulation software packages and particle models, i.e. molecular dynamics and dissipative particle dynamics. Finally, we demonstrate the parallel scalability of the molecular-continuum simulations by using up to 65 536 compute cores of the supercomputer Shaheen II located at KAUST. Program Files doi:http://dx.doi.org/10.17632/w7rgdrhb85.1 Licensing provisions: BSD 3-clause Programming language: C, C++ External routines/libraries: For compiling: SCons, MPI (optional) Subprograms used: ESPResSo, LAMMPS, ls1 mardyn, waLBerla For installation procedures of the MaMiCo interfaces, see the README files in the respective code directories located in coupling/interface/impl. Journal reference of previous version: P. Neumann, H. Flohr, R. Arora, P. Jarmatz, N. Tchipev, H.-J. Bungartz. MaMiCo: Software design for parallel molecular-continuum flow simulations, Computer Physics Communications 200: 324-335, 2016 Does the new version supersede the previous version?: Yes. The functionality of the previous version is completely retained in the new version. Nature of problem: Coupled molecular-continuum simulation for multi-resolution fluid dynamics: parts of the domain are resolved by molecular dynamics or another particle-based solver whereas large parts are covered by a mesh-based CFD solver, e.g. a lattice Boltzmann automaton. Solution method: We couple existing MD and CFD solvers via MaMiCo (macro-micro coupling tool). Data exchange and coupling algorithmics are abstracted and incorporated in MaMiCo. Once an algorithm is set up in MaMiCo, it can be used and extended, even if other solvers are used (as soon as the respective interfaces are implemented/available). Reasons for the new version: We have incorporated a new algorithm to simulate transient molecular-continuum systems and to automatically sample data over multiple MD runs that can be executed simultaneously (on, e.g., a compute cluster). MaMiCo has further been extended by an interface to incorporate boundary forcing to account for open molecular dynamics boundaries. Besides support for coupling with various MD and CFD frameworks, the new version contains a test case that allows molecular-continuum Couette flow simulations to be run out of the box. No external tools or simulation codes are required anymore. However, the user is free to switch from the included MD simulation package to LAMMPS. For details on how to run the transient Couette problem, see the file README in the folder coupling/tests (remark on MaMiCo V1.1). Summary of revisions: Open boundary forcing; multi-instance MD sampling; support for transient molecular-continuum systems Restrictions: Currently, only single-centered systems are supported. For access to the LAMMPS-based implementation of DPD boundary forcing, please contact Xin Bian, xin.bian@tum.de. Additional comments: Please see the file license_mamico.txt for further details regarding distribution and advertising of this software.
Copernicus POD Service Operations
NASA Astrophysics Data System (ADS)
Fernandez, Jaime; Escobar, Diego; Ayuga, Francisco; Peter, Heike; Femenias, Pierre
2015-12-01
The Copernicus POD (Precise Orbit Determination) Service is part of the Copernicus PDGS Ground Segment of the Sentinel missions. A GMV-led consortium operates the Copernicus POD Service (CPOD) and is in charge of generating precise orbital products and auxiliary data files for use in the processing chains of the respective Sentinel PDGS (Payload Data Ground Segment). This paper describes the CPOD Service, presents its current status operating Sentinel-1A, and shows its readiness to support the incoming Sentinel-2A and, in particular, Sentinel-3A Commissioning Phases, with special emphasis on the Calibration and Validation (Cal/Val) activities to be performed during the Commissioning Phase. It is then shown how the quality of the orbital products is guaranteed through external validation activities and the role of the Copernicus POD QWG (Quality Working Group).
NASA Technical Reports Server (NTRS)
Witkop, D. L.; Dale, B. J.; Gellin, S.
1991-01-01
The programming aspects of SFENES are described in the User's Manual. The information presented is provided for the installation programmer. It is sufficient to fully describe the general program logic and required peripheral storage. All element-generated data are stored externally to reduce the required memory allocation. A separate section is devoted to the description of these files, thereby permitting the optimization of Input/Output (I/O) time through efficient buffer descriptions. Individual subroutine descriptions are presented along with the complete Fortran source listings. A short description of the major control, computation, and I/O phases is included to aid in obtaining an overall familiarity with the program's components. Finally, a discussion of the suggested overlay structure, which allows the program to execute with a reasonable amount of memory, is presented.
The Ensembl genome database project.
Hubbard, T; Barker, D; Birney, E; Cameron, G; Chen, Y; Clark, L; Cox, T; Cuff, J; Curwen, V; Down, T; Durbin, R; Eyras, E; Gilbert, J; Hammond, M; Huminiecki, L; Kasprzyk, A; Lehvaslaiho, H; Lijnzaad, P; Melsopp, C; Mongin, E; Pettett, R; Pocock, M; Potter, S; Rust, A; Schmidt, E; Searle, S; Slater, G; Smith, J; Spooner, W; Stabenau, A; Stalker, J; Stupka, E; Ureta-Vidal, A; Vastrik, I; Clamp, M
2002-01-01
The Ensembl (http://www.ensembl.org/) database project provides a bioinformatics framework to organise biology around the sequences of large genomes. It is a comprehensive source of stable automatic annotation of the human genome sequence, with confirmed gene predictions that have been integrated with external data sources, and is available as either an interactive web site or as flat files. It is also an open source software engineering project to develop a portable system able to handle very large genomes and associated requirements from sequence analysis to data storage and visualisation. The Ensembl site is one of the leading sources of human genome sequence annotation and provided much of the analysis for publication by the international human genome project of the draft genome. The Ensembl system is being installed around the world in both companies and academic sites on machines ranging from supercomputers to laptops.
Developments in capture-γ libraries for nonproliferation applications
NASA Astrophysics Data System (ADS)
Hurst, A. M.; Firestone, R. B.; Sleaford, B. W.; Bleuel, D. L.; Basunia, M. S.; Bečvář, F.; Belgya, T.; Bernstein, L. A.; Carroll, J. J.; Detwiler, B.; Escher, J. E.; Genreith, C.; Goldblum, B. L.; Krtička, M.; Lerch, A. G.; Matters, D. A.; McClory, J. W.; McHale, S. R.; Révay, Zs.; Szentmiklosi, L.; Turkoglu, D.; Ureche, A.; Vujic, J.
2017-09-01
The neutron-capture reaction is fundamental for identifying and analyzing the γ-ray spectrum from an unknown assembly because it provides unambiguous information on the neutron-absorbing isotopes. Nondestructive-assay applications may exploit this phenomenon passively, for example, in the presence of spontaneous-fission neutrons, or actively where an external neutron source is used as a probe. There are known gaps in the Evaluated Nuclear Data File libraries corresponding to neutron-capture γ-ray data that otherwise limit transport-modeling applications. In this work, we describe how new thermal neutron-capture data are being used to improve information in the neutron-data libraries for isotopes relevant to nonproliferation applications. We address this problem by providing new experimentally-deduced partial and total neutron-capture reaction cross sections and then evaluate these data by comparison with statistical-model calculations.
Commissioning of a CERN Production and Analysis Facility Based on xrootd
NASA Astrophysics Data System (ADS)
Campana, Simone; van der Ster, Daniel C.; Di Girolamo, Alessandro; Peters, Andreas J.; Duellmann, Dirk; Coelho Dos Santos, Miguel; Iven, Jan; Bell, Tim
2011-12-01
The CERN facility hosts the Tier-0 of the four LHC experiments, but as part of WLCG it also offers a platform for production activities and user analysis. The CERN CASTOR storage technology has been extensively tested and utilized for LHC data recording and for exporting data to external sites according to the experiments' computing models. On the other hand, to accommodate Grid data processing activities and, more importantly, chaotic user analysis, it was realized that additional functionality was needed, including a different throttling mechanism for file access. This paper describes the xrootd-based CERN production and analysis facility for the ATLAS experiment and in particular the experiment use case and data access scenario, the xrootd redirector setup on top of the CASTOR storage system, the commissioning of the system, and real-life experience with data processing and data analysis.
Modeling of rolling element bearing mechanics. Theoretical manual
NASA Technical Reports Server (NTRS)
Merchant, David H.; Greenhill, Lyn M.
1994-01-01
This report documents the theoretical basis for the Rolling Element Bearing Analysis System (REBANS) analysis code, which determines the quasistatic response to external loads or displacements of three types of high-speed rolling element bearings: angular contact ball bearings, duplex angular contact ball bearings, and cylindrical roller bearings. The model includes the effects of bearing ring and support structure flexibility. It comprises two main programs: the Preprocessor for Bearing Analysis (PREBAN), which creates the input files for the main analysis program, and Flexibility Enhanced Rolling Element Bearing Analysis (FEREBA), the main analysis program. A companion report addresses the input instructions for and features of the computer codes. REBANS extends the capabilities of the SHABERTH (Shaft and Bearing Thermal Analysis) code to include race and housing flexibility, including such effects as dead band and preload springs.
Fluid breakup in carbon nanotubes: An explanation of ultrafast ion transport
NASA Astrophysics Data System (ADS)
Gao, Xiang; Zhao, Tianshou; Li, Zhigang
2017-09-01
Ultrafast ion transport in carbon nanotubes (CNTs) has been experimentally observed, but the underlying mechanism is unknown. In this work, we investigate ion transport in CNTs through molecular dynamics (MD) simulations. It is found that the flow in CNTs undergoes a transition from the passage of a continuous liquid chain to the transport of isolated ion-water clusters as the CNT length or the external electric field strength is increased. The breakup of the liquid chain in CNTs greatly reduces the resistance caused by the hydrogen bonds of water and significantly enhances the ionic mobility, which explains the two-orders-of-magnitude enhancement of ionic conductance in CNTs reported in the literature. A theoretical criterion for fluid breakup is proposed, which agrees well with MD results. The fluid breakup phenomenon provides new insights into enhancing ion transport in nanoconfinements.
NASA Astrophysics Data System (ADS)
Schumacher, F.; Friederich, W.
2015-12-01
We present the modularized software package ASKI, which is a flexible and extendable toolbox for seismic full waveform inversion (FWI) as well as sensitivity or resolution analysis operating on the sensitivity matrix. It utilizes established wave propagation codes for solving the forward problem and offers an alternative to the monolithic, inflexible and hard-to-modify codes that have typically been written for solving inverse problems. It is available under the GPL at www.rub.de/aski. The Gauss-Newton FWI method for 3D-heterogeneous elastic earth models is based on waveform sensitivity kernels and can be applied to inverse problems at various spatial scales in both Cartesian and spherical geometries. The kernels are derived in the frequency domain from Born scattering theory as the Fréchet derivatives of linearized full waveform data functionals, quantifying the influence of elastic earth model parameters on the particular waveform data values. As an important innovation, we keep two independent spatial descriptions of the earth model - one for solving the forward problem and one representing the inverted model updates. Thereby we account for the independent needs of spatial model resolution of the forward and inverse problems, respectively. Due to pre-integration of the kernels over the (in general much coarser) inversion grid, storage requirements for the sensitivity kernels are dramatically reduced. ASKI can be flexibly extended to other forward codes by providing it with specific interface routines that contain knowledge about forward code-specific file formats and auxiliary information provided by the new forward code. In order to sustain flexibility, the ASKI tools communicate via file output/input; thus, large storage capacities need to be accessible in a convenient way. Storing the complete sensitivity matrix to file, however, permits the scientist full manual control over each step in a customized procedure of sensitivity/resolution analysis and full waveform inversion.
NASA Astrophysics Data System (ADS)
Appel, Marius; Lahn, Florian; Pebesma, Edzer; Buytaert, Wouter; Moulds, Simon
2016-04-01
Today's amount of freely available data requires scientists to spend large parts of their work on data management. This is especially true in environmental sciences when working with large remote sensing datasets, such as those obtained from earth-observation satellites like the Sentinel fleet. Many frameworks like SpatialHadoop or Apache Spark address the scalability but target programmers rather than data analysts, and are not dedicated to imagery or array data. In this work, we use the open-source data management and analytics system SciDB to bring large earth-observation datasets closer to analysts. Its underlying data representation as multidimensional arrays fits naturally to earth-observation datasets, distributes storage and computational load over multiple instances by multidimensional chunking, and also enables efficient time-series based analyses, which is usually difficult using file- or tile-based approaches. Existing interfaces to R and Python furthermore allow for scalable analytics with relatively little learning effort. However, interfacing SciDB and file-based earth-observation datasets that come as tiled temporal snapshots requires a lot of manual bookkeeping during ingestion, and SciDB natively only supports loading data from CSV-like and custom binary formatted files, which currently limits its practical use in earth-observation analytics. To make it easier to work with large multi-temporal datasets in SciDB, we developed software tools that enrich SciDB with earth observation metadata and allow working with commonly used file formats: (i) the SciDB extension library scidb4geo simplifies working with spatiotemporal arrays by adding relevant metadata to the database, and (ii) the Geospatial Data Abstraction Library (GDAL) driver implementation scidb4gdal allows ingesting and exporting remote sensing imagery from and to a large number of file formats. Using added metadata on temporal resolution and coverage, the GDAL driver supports time-based ingestion of imagery into existing multi-temporal SciDB arrays. While our SciDB plugin works directly in the database, the GDAL driver has been specifically developed using a minimal amount of external dependencies (i.e., cURL). Source code for both tools is available from GitHub [1]. We present these tools in a case study that demonstrates the ingestion of multi-temporal tiled earth-observation data into SciDB, followed by a time-series analysis using R and SciDBR. Through the exclusive use of open-source software, our approach supports reproducibility in scalable large-scale earth-observation analytics. In the future, these tools can be used in an automated way to let scientists work only on ready-to-use SciDB arrays, significantly reducing the data management workload for domain scientists. [1] https://github.com/mappl/scidb4geo and https://github.com/mappl/scidb4gdal
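A hedged usage sketch of opening a SciDB array through GDAL's Python bindings; the SCIDB connection-string layout follows the scidb4gdal documentation, while the host, credentials, and array name here are placeholders:

```python
from osgeo import gdal

# placeholder connection string; requires the scidb4gdal driver to be built in
ds = gdal.Open("SCIDB:array=mod_ndvi host=https://localhost:8083 "
               "user=scidb password=scidb")
if ds is not None:
    # basic raster metadata, read exactly as for any other GDAL dataset
    print(ds.RasterXSize, ds.RasterYSize, ds.RasterCount)
```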
High-performance mass storage system for workstations
NASA Technical Reports Server (NTRS)
Chiang, T.; Tang, Y.; Gupta, L.; Cooperman, S.
1993-01-01
Reduced Instruction Set Computer (RISC) workstations and Personal Computers (PCs) are very popular tools for office automation, command and control, scientific analysis, database management, and many other applications. However, when running Input/Output (I/O) intensive applications, the RISC workstations and PCs are often overburdened with the tasks of collecting, staging, storing, and distributing data. Also, even with standard high-performance peripherals and storage devices, the I/O function can still be a common bottleneck. Therefore, the high-performance mass storage system, developed by Loral AeroSys' Independent Research and Development (IR&D) engineers, can offload a RISC workstation of I/O related functions and provide high-performance I/O functions and external interfaces. The high-performance mass storage system has the capabilities to ingest high-speed real-time data, perform signal or image processing, and stage, archive, and distribute the data. This mass storage system uses a hierarchical storage structure, thus reducing the total data storage cost while maintaining high I/O performance. The high-performance mass storage system is a network of low-cost parallel processors and storage devices. The nodes in the network have special I/O functions such as: SCSI controller, Ethernet controller, gateway controller, RS232 controller, IEEE488 controller, and digital/analog converter. The nodes are interconnected through high-speed direct memory access links to form a network. The topology of the network is easily reconfigurable to maximize system throughput for various applications. This high-performance mass storage system takes advantage of a 'busless' architecture for maximum expandability. The mass storage system consists of magnetic disks, a WORM optical disk jukebox, and an 8mm helical scan tape to form a hierarchical storage structure. Commonly used files are kept on the magnetic disks for fast retrieval. The optical disks are used as archive media, and the tapes are used as backup media. The storage system is managed by the IEEE mass storage reference model-based UniTree software package. The UniTree software keeps track of all files in the system, automatically migrates lesser-used files to archive media, and stages files back when needed by the system. The user can access the files without knowledge of their physical location. The high-performance mass storage system developed by Loral AeroSys will significantly boost the system I/O performance and reduce the overall data storage cost. This storage system provides a highly flexible and cost-effective architecture for a variety of applications (e.g., real-time data acquisition with a signal and image processing requirement, long-term data archiving and distribution, and image analysis and enhancement).
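As a toy sketch of the hierarchy's migration idea only (UniTree's actual policy engine is not shown; the thresholds and the file catalogue are invented), selecting least-recently-used files for archiving might look like:

```python
import time

catalogue = {  # path -> (size in bytes, last access time); invented entries
    "/work/run1.dat": (8_000_000_000, time.time() - 90 * 86400),
    "/work/run2.dat": (2_000_000_000, time.time()),
}

def select_for_migration(catalogue, idle_days=30):
    """Pick files untouched for `idle_days`: candidates for optical/tape media."""
    cutoff = time.time() - idle_days * 86400
    return [path for path, (_, atime) in catalogue.items() if atime < cutoff]

print(select_for_migration(catalogue))   # -> ['/work/run1.dat']
```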
Computing Gravitational Fields of Finite-Sized Bodies
NASA Technical Reports Server (NTRS)
Quadrelli, Marco
2005-01-01
A computer program utilizes the classical theory of gravitation, implemented by means of the finite-element method, to calculate the near gravitational fields of bodies of arbitrary size, shape, and mass distribution. The program was developed for application to a spacecraft and to floating proof masses and associated equipment carried by the spacecraft for detecting gravitational waves. The program can calculate steady or time-dependent gravitational forces, moments, and gradients thereof. Bodies external to a proof mass can be moving around the proof mass and/or deformed under thermoelastic loads. An arbitrarily shaped proof mass is represented by a collection of parallelepiped elements. The gravitational force and moment acting on each parallelepiped element of a proof mass, including those attributable to the self-gravitational field of the proof mass, are computed exactly from the closed-form equation for the gravitational potential of a parallelepiped. The gravitational field of an arbitrary distribution of mass external to a proof mass can be calculated either by summing the fields of suitably many point masses or by higher-order Gauss-Legendre integration over all elements surrounding the proof mass that are part of a finite-element mesh. This computer program is compatible with more general finite-element codes, such as NASTRAN, because it is configured to read a generic input data file containing the detailed description of the finite-element mesh.
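For the point-mass option, the field is the Newtonian sum g(r) = -G Σ m_i (r - r_i)/|r - r_i|³; a minimal NumPy sketch with two illustrative masses:

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def field_from_points(r, masses, positions):
    """Newtonian field at r from a set of point masses at given positions."""
    g = np.zeros(3)
    for m, ri in zip(masses, positions):
        d = r - ri
        g += -G * m * d / np.linalg.norm(d) ** 3
    return g

# two 10 kg point masses bracketing the origin; evaluate the field on the x-axis
print(field_from_points(np.array([0.5, 0.0, 0.0]),
                        [10.0, 10.0],
                        [np.array([-1.0, 0.0, 0.0]),
                         np.array([2.0, 0.0, 0.0])]))
```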
Martins, Renata Cristófani; Buchalla, Cassia Maria
2015-01-01
To prepare a dictionary in Portuguese for use in Iris and to evaluate its completeness for coding causes of death. Initially, a dictionary with all illnesses and injuries was created based on the International Classification of Diseases - tenth revision (ICD-10) codes. This dictionary was based on two sources: the electronic file of ICD-10 volume 1 and the data from the Thesaurus of the International Classification of Primary Care (ICPC-2). Then, a death certificate sample from the Program of Improvement of Mortality Information in São Paulo (PRO-AIM) was coded manually and by Iris version V4.0.34, and the causes of death were compared. Whenever Iris was not able to code the causes of death, adjustments were made in the dictionary. Iris was able to code all causes of death in 94.4% of death certificates, but only 50.6% were directly coded, without adjustments. Among death certificates that the software was unable to fully code, 89.2% had a diagnosis of external causes (chapter XX of ICD-10). This group of causes of death showed less agreement when comparing the coding by Iris to the manual one. The software performed well, but it needs adjustments and improvement of its dictionary. In upcoming versions of the software, its developers are working to solve the problem with external causes of death.
DMFS: A Data Migration File System for NetBSD
NASA Technical Reports Server (NTRS)
Studenmund, William
1999-01-01
I have recently developed dmfs, a Data Migration File System, for NetBSD. This file system is based on the overlay file system, which is discussed in a separate paper, and provides kernel support for the data migration system being developed by my research group here at NASA/Ames. The file system utilizes an underlying file store to provide the file backing, and coordinates user and system access to the files. It stores its internal meta data in a flat file, which resides on a separate file system. Our data migration system provides archiving and file migration services. System utilities scan the dmfs file system for recently modified files, and archive them to two separate tape stores. Once a file has been doubly archived, files larger than a specified size will be truncated to that size, potentially freeing up large amounts of the underlying file store. Some sites will choose to retain none of the file (deleting its contents entirely from the file system) while others may choose to retain a portion, for instance a preamble describing the remainder of the file. The dmfs layer coordinates access to the file, retaining user-perceived access and modification times, file size, and restricting access to partially migrated files to the portion actually resident. When a user process attempts to read from the non-resident portion of a file, it is blocked and the dmfs layer sends a request to a system daemon to restore the file. As more of the file becomes resident, the user process is permitted to begin accessing the now-resident portions of the file. For simplicity, our data migration system divides a file into two portions, a resident portion followed by an optional non-resident portion. Also, a file is in one of three states: fully resident, fully resident and archived, and (partially) non-resident and archived. For a file which is only partially resident, any attempt to write or truncate the file, or to read a non-resident portion, will trigger a file restoration. Truncations and writes are blocked until the file is fully restored so that a restoration which only partially succeeds does not leave the file in an indeterminate state with portions existing only on tape and other portions only in the disk file system. We chose layered file system technology as it permits us to focus on the data migration functionality, and permits end system administrators to choose the underlying file store technology. We chose the overlay layered file system instead of the null layer for two reasons: first to permit our layer to better preserve meta data integrity and second to prevent even root processes from accessing migrated files. This is achieved as the underlying file store becomes inaccessible once the dmfs layer is mounted. We are quite pleased with how the layered file system has turned out. Of the 45 vnode operations in NetBSD, 20 (forty-four percent) required no intervention by our file layer - they are passed directly to the underlying file store. Of the twenty-five we do intercept, nine (such as vop_create()) are intercepted only to ensure meta data integrity. Most of the functionality was concentrated in five operations: vop_read, vop_write, vop_getattr, vop_setattr, and vop_fcntl. The first four are the core operations for controlling access to migrated files and preserving the user experience. vop_fcntl, a call generated for a certain class of fcntl codes, provides the command channel used by privileged user programs to communicate with the dmfs layer.
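A toy user-space model of the access rule described above (plain Python rather than kernel code; the real dmfs blocks callers while the restore daemon stages data back, and unblocks reads progressively as data arrives):

```python
class MigratedFile:
    """States: fully resident; resident and archived; partially resident and archived."""
    def __init__(self, size, resident):
        self.size = size            # logical file size
        self.resident = resident    # length of the resident leading portion

    def _restore(self):
        print("requesting the restore daemon to stage the file back from tape")
        self.resident = self.size   # simplified: restore completes at once

    def read(self, offset, length):
        if offset + length > self.resident:
            self._restore()         # reads past the resident prefix block first
        return (offset, length)     # stand-in for the returned bytes

    def write(self, offset, data):
        if self.resident < self.size:
            self._restore()         # writes/truncates wait for a full restore
        self.size = max(self.size, offset + len(data))

f = MigratedFile(size=10_000_000, resident=4096)  # preamble retained on disk
f.read(0, 1024)        # served from the resident portion, no restore needed
f.read(500_000, 4096)  # triggers restoration before the read completes
```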
Effect of reciprocating file motion on microcrack formation in root canals: an SEM study.
Ashwinkumar, V; Krithikadatta, J; Surendran, S; Velmurugan, N
2014-07-01
To compare dentinal microcrack formation whilst using Ni-Ti hand K-files, ProTaper hand and rotary files and the WaveOne reciprocating file. One hundred and fifty mandibular first molars were selected. Thirty teeth were left unprepared and served as controls, and the remaining 120 teeth were divided into four groups. Ni-Ti hand K-files, ProTaper hand files, ProTaper rotary files and WaveOne Primary reciprocating files were used to prepare the mesial canals. Roots were then sectioned 3, 6 and 9 mm from the apex, and the cut surface was observed under a scanning electron microscope (SEM) and checked for the presence of dentinal microcracks. The control and Ni-Ti hand K-file groups were not associated with microcracks. In roots prepared with ProTaper hand files, ProTaper rotary files and WaveOne Primary reciprocating files, dentinal microcracks were present. There was a significant difference between the control/Ni-Ti hand K-file groups and the ProTaper hand file/ProTaper rotary file/WaveOne Primary reciprocating file groups (P < 0.001), with ProTaper rotary files producing the most microcracks. No significant difference was observed between teeth prepared with ProTaper hand files and WaveOne Primary reciprocating files. ProTaper rotary files were associated with significantly more microcracks than ProTaper hand files and WaveOne Primary reciprocating files. Ni-Ti hand K-files did not produce microcracks at any level inside the root canals. © 2013 International Endodontic Journal. Published by John Wiley & Sons Ltd.
Data Processing Factory for the Sloan Digital Sky Survey
NASA Astrophysics Data System (ADS)
Stoughton, Christopher; Adelman, Jennifer; Annis, James T.; Hendry, John; Inkmann, John; Jester, Sebastian; Kent, Steven M.; Kuropatkin, Nickolai; Lee, Brian; Lin, Huan; Peoples, John, Jr.; Sparks, Robert; Tucker, Douglas; Vanden Berk, Dan; Yanny, Brian; Yocum, Dan
2002-12-01
The Sloan Digital Sky Survey (SDSS) data handling presents two challenges: large data volume and timely production of spectroscopic plates from imaging data. A data processing factory, using technologies both old and new, handles this flow. Distribution to end users is via disk farms, to serve corrected images and calibrated spectra, and a database, to efficiently process catalog queries. For distribution of modest amounts of data from Apache Point Observatory to Fermilab, scripts use rsync to update files, while larger data transfers are accomplished by shipping magnetic tapes commercially. All data processing pipelines are wrapped in scripts to address consecutive phases: preparation, submission, checking, and quality control. We constructed the factory by chaining these pipelines together while using an operational database to hold processed imaging catalogs. The science database catalogs all imaging and spectroscopic objects, with pointers to the various external files associated with them. Diverse computing systems address particular processing phases. UNIX computers handle tape reading and writing, as well as calibration steps that require access to a large amount of data with relatively modest computational demands. Commodity CPUs process steps that require access to a limited amount of data with more demanding computational requirements. Disk servers optimized for cost per Gbyte serve terabytes of processed data, while servers optimized for disk read speed run SQLServer software to process queries on the catalogs. This factory produced data for the SDSS Early Data Release in June 2001, and it is currently producing Data Release One, scheduled for January 2003.
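The phase structure of the wrapped pipelines can be pictured with a small sketch; the four phase names come from the abstract, while the function signatures and run identifier are hypothetical:

```c
/* Toy illustration of chaining pipeline phases; not the SDSS scripts. */
#include <stdio.h>

typedef int (*phase_fn)(const char *run);

static int prepare(const char *run) { printf("prepare %s\n", run); return 0; }
static int submit (const char *run) { printf("submit %s\n",  run); return 0; }
static int check  (const char *run) { printf("check %s\n",   run); return 0; }
static int quality(const char *run) { printf("qc %s\n",      run); return 0; }

int main(void) {
    phase_fn phases[] = { prepare, submit, check, quality };
    const char *run = "run-756";   /* hypothetical imaging run ID */
    for (size_t i = 0; i < sizeof phases / sizeof phases[0]; i++)
        if (phases[i](run) != 0) {  /* a failed phase stops the chain */
            fprintf(stderr, "phase %zu failed\n", i);
            return 1;
        }
    return 0;
}
```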
Rao, M S Rama; Shameem, Abdul; Nair, Rashmi; Ghanta, Sureshbabu; Thankachan, Rekha P; Issac, Johnson K
2013-07-01
The aim of the present study was to compare the remaining dentin thickness (RDT) in the mesiobuccal root of mandibular first molars at 3 and 7 mm from the anatomic apex after instrumentation with ProTaper, Light Speed LSX, K3 and M2, and to compare it with that of K-files. In this study, 60 extracted, untreated human mandibular first molars with fully formed apices, with curvature less than 35° and no root resorption were used. Prepared specimens were cut horizontally at 3 and 7 mm short of the anatomic apex. The least dentin thickness from canal to external root surface was observed under 3× magnification and recorded using the Clemax measuring tool, and the sections were reassembled. Group I, instrumentation with ProTaper; group II, instrumentation with K3; group III, instrumentation with Light Speed LSX; group IV, instrumentation with M2; and group V, instrumentation with K-files; RDT was then measured. Results showed that group V removed less dentin than all other groups, while the other instrumentation techniques removed almost equal amounts of dentin apically. Cleaning and shaping of the root canal space involves the elimination of pathogenic contents as well as attaining a uniform specific shape. However, the RDT following the use of various intraradicular procedures is an important factor to be considered as an iatrogenic cause that may result in root fracture. To avoid this, newer rotary instruments are being introduced.
Rizvi, Sanam Shahla; Chung, Tae-Sun
2010-01-01
Flash memory has become a widespread storage medium for modern wireless devices because of its effective characteristics like non-volatility, small size, light weight, fast access speed, shock resistance, high reliability and low power consumption. Sensor nodes are highly resource constrained in terms of limited processing speed, runtime memory, persistent storage, communication bandwidth and finite energy. Therefore, for wireless sensor networks supporting sense, store, merge and send schemes, an efficient and reliable file system that respects these sensor-node constraints is highly desirable. In this paper, we propose a novel log-structured external NAND flash memory based file system, called Proceeding to Intelligent service oriented memorY Allocation for flash based data centric Sensor devices in wireless sensor networks (PIYAS). This is the extended version of our previously proposed PIYA [1]. The main goals of the PIYAS scheme are to achieve instant mounting and a reduced SRAM footprint by keeping the memory-mapping information very small, and to provide high query response throughput by allocating memory to sensor data according to network business rules. The scheme intelligently samples and stores the raw data and provides high in-network data availability by keeping the aggregate data for a longer period of time than any other scheme has done before. We propose effective garbage collection and wear-leveling schemes as well. The experimental results show that PIYAS is an optimized memory management scheme allowing high performance for wireless sensor networks.
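A minimal sketch of the log-structured flash idea that PIYAS builds on (out-of-place page writes located through a small in-RAM mapping table); the sizes and names below are illustrative assumptions, not PIYAS's actual layout:

```c
/* Toy log-structured flash model: data pages are appended sequentially
 * and located through a small in-RAM logical-to-physical mapping. */
#include <stdio.h>
#include <string.h>

#define PAGE_SIZE 256
#define NUM_PAGES 64

static char flash[NUM_PAGES][PAGE_SIZE]; /* simulated NAND pages */
static int  next_free = 0;               /* log head: next page to write */
static int  map[NUM_PAGES];              /* logical page -> physical page */

/* Out-of-place write: append to the log and update the mapping; the
 * superseded physical page is left for the garbage collector. */
static int write_page(int logical, const char *data) {
    if (next_free == NUM_PAGES) return -1;  /* log full: GC needed */
    memcpy(flash[next_free], data, PAGE_SIZE);
    map[logical] = next_free;
    return next_free++;
}

static const char *read_page(int logical) {
    return flash[map[logical]];
}

int main(void) {
    char buf[PAGE_SIZE] = "sample sensor reading";
    write_page(3, buf);
    printf("logical 3 -> \"%s\"\n", read_page(3));
    return 0;
}
```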
Huertas, María F; Gonzalez, Juliana; Camacho, Sandra; Sarralde, Ana L; Rodríguez, Adriana
2017-04-01
Dentistry is interested in identifying and controlling adverse events, understood as involuntary injuries to the patient during dental care. The aim of this study was to analyze the adverse events reported to the Office of the Clinical Director at the School of Dentistry at Pontificia Universidad Javeriana (Colombia) during 2011-2012. It was an observational, descriptive study that evaluated 227 dental clinical records of patients who filed a complaint with the Office of the Clinical Director. Of these, 43 were adverse events and were used as the basis for this study. Of the 16,060 patients who received care during 2011-2012, 0.26% (43) filed a complaint involving an adverse event, of which 97.7% were considered preventable. Most of these (76.18%, n = 32) occurred during clinical management of treatments in different specialties, 9.5% (4) were the result of deficient external dental laboratory quality, and 14.32% (6) were due to failure in document management, soft tissue injury, misdiagnosis and swallowing foreign objects. Of the patients involved, 65.2% (28) received care from postgraduate students, with the highest number of cases in the Oral Rehabilitation speciality. The occurrence of adverse events during dental care indicates the need for information about their origin in order to establish protection barriers and prevent their incidence, particularly in the educational area under the student dental clinic service model. Sociedad Argentina de Investigación Odontológica.
Duffing revisited: phase-shift control and internal resonance in self-sustained oscillators
NASA Astrophysics Data System (ADS)
Arroyo, Sebastián I.; Zanette, Damián H.
2016-01-01
We address two aspects of the dynamics of the forced Duffing oscillator which are relevant to the technology of micromechanical devices and, at the same time, have intrinsic significance for the field of nonlinear oscillating systems. First, we study the stability of periodic motion when the phase shift between the external force and the oscillation is controlled - contrary to the standard case, where the control parameter is the frequency of the force. Phase-shift control is the operational configuration under which self-sustained oscillators - and, in particular, micromechanical oscillators - provide a frequency reference useful for timekeeping. We show that, contrary to the standard forced Duffing oscillator, under phase-shift control oscillations are stable over the whole resonance curve, and provide approximate analytical expressions for the time dependence of the oscillation amplitude and frequency during transients. Second, we analyze a model for the internal resonance between the main Duffing oscillation mode and a higher-harmonic mode of a vibrating solid bar clamped at its two ends. We focus on the stabilization of the oscillation frequency when the resonance takes place, and present preliminary experimental results that illustrate the phenomenon. This synchronization process has been proposed to counteract the undesirable frequency-amplitude interdependence in nonlinear timekeeping micromechanical devices. Supplementary material in the form of one pdf file and one gif file available from the Journal web page at http://dx.doi.org/10.1140/epjb/e2015-60517-3
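For orientation, a standard form of the forced Duffing oscillator together with the phase-shift-control condition described above (generic notation as a sketch; the paper's own scalings may differ):

```latex
\ddot{x} + \frac{\omega_0}{Q}\,\dot{x} + \omega_0^2\,x + \beta x^3
  = F_0 \cos\phi_f(t),
\qquad
\phi_f(t) = \phi(t) + \phi_0 ,
```

where \(\phi(t)\) is the instantaneous phase of the oscillation itself, so the forcing is held at a fixed phase shift \(\phi_0\) from the oscillation rather than being driven at an externally imposed phase \(\omega t\).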
Clinical Documents: Attribute-Values Entity Representation, Context, Page Layout And Communication
Lovis, Christian; Lamb, Alexander; Baud, Robert; Rassinoux, Anne-Marie; Fabry, Paul; Geissbühler, Antoine
2003-01-01
This paper presents how acquisition, storage and communication of clinical documents are implemented at the University Hospitals of Geneva. Careful attention has been given to user interfaces, in order to support complex layouts, spell checking, and template management with automatic prefilling, so as to facilitate acquisition. A dual architecture has been developed for storage, using an attribute-value entity unified database and a consolidated, patient-centered, layout-respectful file-based storage, providing both representational power and speed of access. This architecture allows great flexibility to store a continuum of data types, from simple typed values up to complex clinical reports. Finally, communication is entirely based on HTTP-XML internally, and an HL-7 CDA V2 interface is currently being studied for external communication. Some of the problems encountered, mostly concerning the typology of documents and the ontology of clinical attributes, are discussed. PMID:14728202
Nature of the electromagnetic force between classical magnetic dipoles
NASA Astrophysics Data System (ADS)
Mansuripur, Masud
2017-09-01
The Lorentz force law of classical electrodynamics states that the force F exerted by the magnetic induction B on a particle of charge q moving with velocity V is given by F = qV × B. Since this force is orthogonal to the direction of motion, the magnetic field is said to be incapable of performing mechanical work. Yet there is no denying that a permanent magnet can readily perform mechanical work by pushing/pulling on another permanent magnet or by attracting pieces of magnetizable material such as scrap iron or iron filings. We explain this apparent contradiction by examining the magnetic Lorentz force acting on an Amperian current loop, which is the model for a magnetic dipole. We then extend the discussion by analyzing the Einstein-Laub model of magnetic dipoles in the presence of external magnetic fields.
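For orientation, the standard textbook force expressions that frame this comparison (stated generically; the paper's full analysis of the two dipole models is more subtle):

```latex
\mathbf{F}_{\text{Amperian}} = \nabla\left(\mathbf{m}\cdot\mathbf{B}\right),
\qquad
\mathbf{F}_{\text{Gilbertian}} = \left(\mathbf{m}\cdot\nabla\right)\mathbf{B},
```

for a point dipole of moment m in an external field B. The two expressions agree wherever the external field is curl-free, but in general differ by the term m × (∇ × B).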
The State of Social Media Policies in Higher Education
Pomerantz, Jeffrey; Hank, Carolyn; Sugimoto, Cassidy R.
2015-01-01
This paper presents an analysis of the current state of development of social media policies at institutions of higher education. Content analysis of social media policies for all institutions listed in the Carnegie Classification Data File revealed that less than one-quarter of institutions had an accessible social media policy. Analysis was done by institution and campus unit, finding that social media policies were most likely to appear at doctorate-granting institutions and in health, athletics, and library units. Policies required that those affiliated with the institution post appropriate content, represent the unit appropriately, and moderate conversations with coworkers and external agencies. This analysis may inform the development and revision of social media policies across the field of higher education, taking into consideration the rapidly changing landscape of social media, issues of academic freedom, and notions of interoperability with policies at the unit and campus levels. PMID:26017549
Ensembl 2002: accommodating comparative genomics.
Clamp, M; Andrews, D; Barker, D; Bevan, P; Cameron, G; Chen, Y; Clark, L; Cox, T; Cuff, J; Curwen, V; Down, T; Durbin, R; Eyras, E; Gilbert, J; Hammond, M; Hubbard, T; Kasprzyk, A; Keefe, D; Lehvaslaiho, H; Iyer, V; Melsopp, C; Mongin, E; Pettett, R; Potter, S; Rust, A; Schmidt, E; Searle, S; Slater, G; Smith, J; Spooner, W; Stabenau, A; Stalker, J; Stupka, E; Ureta-Vidal, A; Vastrik, I; Birney, E
2003-01-01
The Ensembl (http://www.ensembl.org/) database project provides a bioinformatics framework to organise biology around the sequences of large genomes. It is a comprehensive source of stable automatic annotation of human, mouse and other genome sequences, available as either an interactive web site or as flat files. Ensembl also integrates manually annotated gene structures from external sources where available. As well as being one of the leading sources of genome annotation, Ensembl is an open source software engineering project to develop a portable system able to handle very large genomes and associated requirements. These range from sequence analysis to data storage and visualisation, and installations exist around the world at both companies and academic sites. With both human and mouse genome sequences available and more vertebrate sequences to follow, many of the recent developments in Ensembl have focused on developing automatic comparative genome analysis and visualisation.
Coding conventions and principles for a National Land-Change Modeling Framework
Donato, David I.
2017-07-14
This report establishes specific rules for writing computer source code for use with the National Land-Change Modeling Framework (NLCMF). These specific rules consist of conventions and principles for writing code primarily in the C and C++ programming languages. Collectively, these coding conventions and coding principles create an NLCMF programming style. In addition to detailed naming conventions, this report provides general coding conventions and principles intended to facilitate the development of high-performance software implemented with code that is extensible, flexible, and interoperable. Conventions for developing modular code are explained in general terms and also enabled and demonstrated through the appended templates for C++ base source-code and header files. The NLCMF limited-extern approach to module structure, code inclusion, and cross-module access to data is both explained in the text and then illustrated through the module templates. Advice on the use of global variables is provided.
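As a rough illustration of a limited-extern module layout in C (the module name, routines, and threshold below are hypothetical; the report's own appended templates are more elaborate and cover C++ as well):

```c
/* landcover.h -- the module's deliberately small external interface;
 * only these declarations are visible to other modules. (The module
 * name and routines are invented, not taken from the NLCMF templates.) */
#ifndef LANDCOVER_H
#define LANDCOVER_H

extern int lc_cell_count;       /* the one cross-module datum exposed */

void lc_initialize(int rows, int cols);
int  lc_classify(double ndvi);

#endif

/* landcover.c -- everything not declared above stays static, so
 * internal state cannot be reached from any other module. */
#include "landcover.h"

int lc_cell_count = 0;

static double threshold = 0.3;  /* module-private; no extern leakage */

void lc_initialize(int rows, int cols) { lc_cell_count = rows * cols; }

int lc_classify(double ndvi) { return ndvi > threshold ? 1 : 0; }
```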
NASA Technical Reports Server (NTRS)
McGuire, Robert E.; Candey, Robert M.
2007-01-01
SPDF now supports a broad range of data, user services and other activities. These include: CDAWeb current multi-mission data graphics, listings, file subsetting and supersetting by time and parameters; SSCWeb and 3-D Java client orbit graphics, listings and conjunction queries; OMNIWeb 1/5/60 minute interplanetary parameters at Earth; product-level SPASE descriptions of data including holdings of nssdcftp; VSPO SPASE-based heliophysics-wide product site finding and data use; standard Data format Translation Webservices (DTWS); metrics software and others. These data and services are available through standard user and application webservices interfaces, so middleware services such as the Heliophysics VxOs, and externally-developed clients or services, can readily leverage our data and capabilities. Beyond a short summary of the above, we will conduct the talk as a conversation on evolving VxO needs and our planned approach to leveraging such existing and ongoing services.
The Biological Reference Repository (BioR): a rapid and flexible system for genomics annotation.
Kocher, Jean-Pierre A; Quest, Daniel J; Duffy, Patrick; Meiners, Michael A; Moore, Raymond M; Rider, David; Hossain, Asif; Hart, Steven N; Dinu, Valentin
2014-07-01
The Biological Reference Repository (BioR) is a toolkit for annotating variants. BioR stores public and user-specific annotation sources in indexed JSON-encoded flat files (catalogs). The BioR toolkit provides the functionality to combine and retrieve annotation from these catalogs via the command-line interface. Several catalogs from commonly used annotation sources and instructions for creating user-specific catalogs are provided. Commands from the toolkit can be combined with other UNIX commands for advanced annotation processing. We also provide instructions for the development of custom annotation pipelines. The package is implemented in Java and makes use of external tools written in Java and Perl. The toolkit can be executed on Mac OS X 10.5 and above or any Linux distribution. The BioR application, quickstart, and user guide documents and many biological examples are available at http://bioinformaticstools.mayo.edu. © The Author 2014. Published by Oxford University Press.
NASA Astrophysics Data System (ADS)
Ferland, G. J.; Chatzikos, M.; Guzmán, F.; Lykins, M. L.; van Hoof, P. A. M.; Williams, R. J. R.; Abel, N. P.; Badnell, N. R.; Keenan, F. P.; Porter, R. L.; Stancil, P. C.
2017-10-01
We describe the 2017 release of the spectral synthesis code Cloudy, summarizing the many improvements to the scope and accuracy of the physics which have been made since the previous release. Exporting the atomic data into external data files has enabled many new large datasets to be incorporated into the code. The use of the complete datasets is not realistic for most calculations, so we describe the limited subset of data used by default, which predicts significantly more lines than the previous release of Cloudy. This version is nevertheless faster than the previous release, as a result of code optimizations. We give examples of the accuracy limits using small models, and the performance requirements of large complete models. We summarize several advances in the H- and He-like iso-electronic sequences and use our complete collisional-radiative models to establish the densities where the coronal and local thermodynamic equilibrium approximations work.
Genomic mutation consequence calculator.
Major, John E
2007-11-15
The genomic mutation consequence calculator (GMCC) is a tool that will reliably and quickly calculate the consequence of arbitrary genomic mutations. GMCC also reports supporting annotations for the specified genomic region. The particular strength of the GMCC is that it works in genomic space, not simply in spliced transcript space as some similar tools do. Within gene features, GMCC can report the effects on splice sites, UTRs and coding regions in all isoforms affected by the mutation. A considerable number of genomic annotations are also reported, including genomic conservation score, known SNPs, COSMIC mutations, disease associations and others. The manual interface also offers link-outs to various external databases and resources. In batch mode, GMCC returns a csv file which can easily be parsed by the end user. GMCC is intended to support the many tumor resequencing efforts, but can be useful to any study investigating genomic mutations.
Knowledge Interaction Design for Creative Knowledge Work
NASA Astrophysics Data System (ADS)
Nakakoji, Kumiyo; Yamamoto, Yasuhiro
This paper describes our approach for the development of application systems for creative knowledge work, particularly for early stages of information design tasks. Being a cognitive tool serving as a means of externalization, an application system affects how the user is engaged in the creative process through its visual interaction design. Knowledge interaction design described in this paper is a framework where a set of application systems for different information design domains are developed based on an interaction model, which is designed for a particular model of a thinking process. We have developed two sets of application systems using the knowledge interaction design framework: one includes systems for linear information design, such as writing, movie-editing, and video-analysis; the other includes systems for network information design, such as file-system navigation and hypertext authoring. Our experience shows that the resulting systems encourage users to follow a certain cognitive path through graceful user experience.
Analysis and Exchange of Multimedia Laboratory Data Using the Brain Database
Wertheim, Steven L.
1990-01-01
Two principal goals of the Brain Database are: 1) to support laboratory data collection and analysis of multimedia information about the nervous system and 2) to support exchange of these data among researchers and clinicians who may be physically distant. This has been achieved by an implementation of experimental and clinical records within a relational database. An Image Series Editor has been created that provides a graphical interface to these data for the purposes of annotation, quantification and other analyses. Cooperating laboratories each maintain their own copies of the Brain Database to which they may add private data. Although the data in a given experimental or patient record will be distributed among many tables and external image files, the user can treat each record as a unit that can be extracted from the local database and sent to a distant colleague.
Masuya, Yoshihiro; Baba, Katsuaki
2016-01-01
A new process has been developed for the palladium(II)-catalyzed synthesis of dibenzothiophene derivatives via the cleavage of C–H and C–S bonds. In contrast to the existing methods for the synthesis of this scaffold by C–H functionalization, this new catalytic C–H/C–S coupling method does not require an external stoichiometric oxidant or reactive functionalities such as C–X or S–H, allowing its application to the synthesis of elaborate π-systems. Notably, the product-forming step of this reaction is an oxidative addition rather than a reductive elimination, making this reaction mechanistically uncommon. PMID:28660030
Modeling of rolling element bearing mechanics. Computer program user's manual
NASA Technical Reports Server (NTRS)
Greenhill, Lyn M.; Merchant, David H.
1994-01-01
This report provides the user's manual for the Rolling Element Bearing Analysis System (REBANS) analysis code, which determines the quasistatic response to external loads or displacements of three types of high-speed rolling element bearings: angular contact ball bearings, duplex angular contact ball bearings, and cylindrical roller bearings. The model includes the effects of bearing ring and support structure flexibility. It comprises two main programs: the Preprocessor for Bearing Analysis (PREBAN), which creates the input files for the main analysis program, and Flexibility Enhanced Rolling Element Bearing Analysis (FEREBA), the main analysis program. This report addresses input instructions for and features of the computer codes. A companion report addresses the theoretical basis for the computer codes. REBANS extends the capabilities of the SHABERTH (Shaft and Bearing Thermal Analysis) code to include race and housing flexibility, including such effects as dead band and preload springs.
RINGMesh: A programming library for developing mesh-based geomodeling applications
NASA Astrophysics Data System (ADS)
Pellerin, Jeanne; Botella, Arnaud; Bonneau, François; Mazuyer, Antoine; Chauvin, Benjamin; Lévy, Bruno; Caumon, Guillaume
2017-07-01
RINGMesh is a C++ open-source programming library for manipulating discretized geological models. It is designed to ease the development of applications and workflows that use discretized 3D models. It is neither a geomodeler nor a meshing software. RINGMesh implements functionalities to read discretized surface-based or volumetric structural models and to check their validity. The models can then be exported in various file formats. RINGMesh provides data structures to represent geological structural models, defined either by their discretized boundary surfaces and/or by discretized volumes. A programming interface allows the development of new geomodeling methods and the integration of external software. The goal of RINGMesh is to help researchers to focus on the implementation of their specific method rather than on tedious tasks common to many applications. The documented code is open-source and distributed under the modified BSD license. It is available at https://www.ring-team.org/index.php/software/ringmesh.
GEM at 10: a decade's experience with the Guideline Elements Model.
Hajizadeh, Negin; Kashyap, Nitu; Michel, George; Shiffman, Richard N
2011-01-01
The Guideline Elements Model (GEM) was developed in 2000 to organize the information contained in clinical practice guidelines using XML and to represent guideline content in a form that can be understood by human readers and processed by computers. In this work, we systematically reviewed the literature to better understand how GEM was being used, potential barriers to its use, and suggestions for improvement. Fifty external and twelve internally produced publications were identified and analyzed. GEM was used most commonly for modeling and ontology creation. Other investigators applied GEM for knowledge extraction and data mining, for clinical decision support, and for guideline generation. The GEM Cutter software, used to mark up guidelines for translation into XML, has been downloaded 563 times since 2000. Although many investigators found GEM to be valuable, others critiqued its failure to clarify guideline semantics, difficulties in markup, and the fact that GEM files are not usually executable.
The synchronous orbit magnetic field data set
NASA Technical Reports Server (NTRS)
Mcpherron, R. L.
1979-01-01
The magnetic field at synchronous orbit is the result of superposition of fields from many sources such as the earth, the magnetopause, the geomagnetic tail, the ring current and field-aligned currents. In addition, seasonal changes in the orientation of the earth's dipole axis cause significant changes in each of the external sources. The main reasons why the synchronous orbit magnetic field data set is a potentially valuable resource are outlined. The primary reason why synchronous magnetic field data have not been used more extensively in magnetic field modeling is the presence of absolute errors in the measured fields. Nevertheless, there exists a reasonably large collection of synchronous orbit magnetic field data. Some of these data can be useful in quantitative modeling of the earth's magnetic field. A brief description is given of the spacecraft, the magnetometers, the standard graphical data displays, and the digital data files.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Staley, Martin
2017-09-20
This high-performance ray tracing library provides very fast rendering; compact code; type flexibility through C++ "generic programming" techniques; and ease of use via an application programming interface (API) that operates independently of any GUI, on-screen display, or other enclosing application. Kip supports constructive solid geometry (CSG) models based on a wide variety of built-in shapes and logical operators, and also allows for user-defined shapes and operators to be provided. Additional features include basic texturing; input/output of models using a simple human-readable file format and with full error checking and detailed diagnostics; and support for shared data parallelism. Kip is written in pure, ANSI standard C++; is entirely platform independent; and is very easy to use. As a C++ "header only" library, it requires no build system, configuration or installation scripts, wizards, non-C++ preprocessing, makefiles, shell scripts, or external libraries.
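CSG evaluation reduces to boolean combinations of point-membership tests on primitive shapes; the following generic sketch illustrates only that principle, and is not Kip's actual API or data model:

```c
/* Generic CSG point-membership illustration: union, intersection and
 * difference of two spheres, evaluated as boolean combinations. */
#include <stdio.h>

struct vec { double x, y, z; };

static int in_sphere(struct vec p, struct vec c, double r) {
    double dx = p.x - c.x, dy = p.y - c.y, dz = p.z - c.z;
    return dx*dx + dy*dy + dz*dz <= r*r;  /* squared-distance test */
}

int main(void) {
    struct vec p = { 0.5, 0.0, 0.0 };          /* query point */
    struct vec a = { 0, 0, 0 }, b = { 1, 0, 0 };

    int in_a = in_sphere(p, a, 0.8);
    int in_b = in_sphere(p, b, 0.8);

    /* CSG logical operators are boolean combinations of the
     * membership tests on the primitive shapes. */
    printf("union:        %d\n", in_a || in_b);
    printf("intersection: %d\n", in_a && in_b);
    printf("difference:   %d\n", in_a && !in_b);
    return 0;
}
```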
The RCSB Protein Data Bank: views of structural biology for basic and applied research and education
Rose, Peter W.; Prlić, Andreas; Bi, Chunxiao; Bluhm, Wolfgang F.; Christie, Cole H.; Dutta, Shuchismita; Green, Rachel Kramer; Goodsell, David S.; Westbrook, John D.; Woo, Jesse; Young, Jasmine; Zardecki, Christine; Berman, Helen M.; Bourne, Philip E.; Burley, Stephen K.
2015-01-01
The RCSB Protein Data Bank (RCSB PDB, http://www.rcsb.org) provides access to 3D structures of biological macromolecules and is one of the leading resources in biology and biomedicine worldwide. Our efforts over the past 2 years focused on enabling a deeper understanding of structural biology and providing new structural views of biology that support both basic and applied research and education. Herein, we describe recently introduced data annotations including integration with external biological resources, such as gene and drug databases, new visualization tools and improved support for the mobile web. We also describe access to data files, web services and open access software components to enable software developers to more effectively mine the PDB archive and related annotations. Our efforts are aimed at expanding the role of 3D structure in understanding biology and medicine. PMID:25428375
Slow Controls Using the Axiom M5235BCC
NASA Astrophysics Data System (ADS)
Hague, Tyler
2008-10-01
The Forward Vertex Detector group at PHENIX plans to adopt the Axiom M5235 Business Card Controller for use as slow controls. It is also being evaluated for slow controls on FermiLab e906. This controller features the Freescale MCF5235 microprocessor and has three parallel buses: the MCU port, the BUS port, and the enhanced Time Processing Unit (eTPU) port. The BUS port uses a chip select module with three external chip selects to communicate with peripherals; this will be used to communicate with and configure Field Programmable Gate Arrays (FPGAs). The controller also has an Ethernet port which can use several different protocols, such as TCP and UDP; this will be used to transfer files with computers on a network. The M5235 Business Card Controller will be placed in a VME crate along with a VME card and a Spartan-3 FPGA.
Small file aggregation in a parallel computing system
Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Zhang, Jingwang
2014-09-02
Techniques are provided for small file aggregation in a parallel computing system. An exemplary method for storing a plurality of files generated by a plurality of processes in a parallel computing system comprises aggregating the plurality of files into a single aggregated file; and generating metadata for the single aggregated file. The metadata comprises an offset and a length of each of the plurality of files in the single aggregated file. The metadata can be used to unpack one or more of the files from the single aggregated file.
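The metadata scheme can be illustrated with a toy sketch: pack several small buffers into one aggregated file while recording each one's offset and length (the file names and layout below are invented for illustration; the patent does not prescribe this code):

```c
/* Toy model of small-file aggregation: write buffers back-to-back into
 * one aggregated file, recording per-file offset/length metadata so
 * individual files can later be unpacked. */
#include <stdio.h>
#include <string.h>

struct file_meta {
    char name[32];
    long offset;   /* where the small file begins in the aggregate */
    long length;   /* its size in bytes */
};

int main(void) {
    const char *names[] = { "rank0.out", "rank1.out", "rank2.out" };
    const char *data[]  = { "alpha", "bravo-bravo", "c" };
    struct file_meta meta[3];
    FILE *agg = fopen("aggregate.bin", "wb");
    if (!agg) return 1;

    long offset = 0;
    for (int i = 0; i < 3; i++) {
        long len = (long)strlen(data[i]);
        fwrite(data[i], 1, (size_t)len, agg);
        snprintf(meta[i].name, sizeof meta[i].name, "%s", names[i]);
        meta[i].offset = offset;   /* metadata lets us unpack later */
        meta[i].length = len;
        offset += len;
    }
    fclose(agg);

    for (int i = 0; i < 3; i++)
        printf("%s: offset=%ld length=%ld\n",
               meta[i].name, meta[i].offset, meta[i].length);
    return 0;
}
```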
Usage analysis of user files in UNIX
NASA Technical Reports Server (NTRS)
Devarakonda, Murthy V.; Iyer, Ravishankar K.
1987-01-01
Presented is a user-oriented analysis of short-term file usage in a 4.2 BSD UNIX environment. The key aspect of this analysis is a characterization of users and files, which is a departure from the traditional approach of analyzing file references. Two characterization measures are employed: accesses-per-byte (combining fraction of a file referenced and number of references) and file size. This new approach is shown to distinguish differences in files as well as users, which can be used in efficient file system design and in creating realistic test workloads for simulations. A multi-stage gamma distribution is shown to closely model the file usage measures. Even though overall file sharing is small, some files belonging to a bulletin board system are accessed by many users, simultaneously and otherwise. Over 50% of users referenced files owned by other users, and over 80% of all files were involved in such references. Based on the differences in files and users, suggestions to improve the system performance were also made.
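For reference, the gamma density underlying the multi-stage fit mentioned above, written in its generic textbook form (the paper's fitted stage parameters are not reproduced, and the weighted combination is one plausible reading of "multi-stage"):

```latex
f(x; k, \theta) = \frac{x^{k-1} e^{-x/\theta}}{\Gamma(k)\,\theta^{k}},
\quad x > 0;
\qquad
f_{\mathrm{multi}}(x) = \sum_{i=1}^{n} p_i \, f(x; k_i, \theta_i),
\quad \sum_{i=1}^{n} p_i = 1 .
```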
VizieR Online Data Catalog: Atlas of Galactic Neutral Hydrogen (Hartmann+, 1997)
NASA Astrophysics Data System (ADS)
Hartmann, D.; Burton, W. B.
1999-04-01
The Leiden/Dwingeloo HI survey mapped the 21-cm spectral line emission over the entire sky above declinations of -30 degrees using a grid spacing of ~ 0.5 degree and a velocity sampling of ~ 1.03 km/s. The useful velocity (V_lsr) range is from -450 to +400 km/s. The Atlas presents calibrated spectra in units of brightness temperature. Using interpolation and averaging, the authors have placed their data on an evenly-spaced grid in Galactic coordinates (l,b). A detailed discussion of the instrument and calibration procedures is provided in the published Atlas. The average sensitivity level of the survey is 0.07 K (1-sigma, rms). This sensitivity level depends critically on the success of the stray-radiation correction as discussed in Hartmann et al. (1996A&AS..119..115H). In that discussion, several caveats are offered regarding the removal of stray radiation, in particular that component which might be due to reflection from the ground. Some instances have been found where there are residuals which are clearly larger than the mean accuracy quoted as representative of the Leiden/Dwingeloo survey. Users of the data are reminded that the stray-radiation correction was applied conservatively, ensuring that no overestimate was calculated and removed, thereby yielding spurious negative intensities. A specific example of remaining spurious emission is evident towards the North Galactic Pole, a direction notoriously difficult to observe. All spectra taken towards b=+90 degrees should, of course, be identical, no matter the longitude or the orientation of the telescope with respect to the ground or to the meridian. Because the sky was sampled in 5x5 degree boxes, a spectrum was recorded at b=+90 degrees for every Nx5 degrees (N=0..72) in longitude. The spectra in the final dataset were interpolated between these measured spectra to yield a 0.5x0.5 degree grid. So, only every 10th spectrum at this extreme latitude corresponds to an observed spectrum. Comparing all spectra at b=+90 reveals differences which are larger than expected. The origin of this discrepancy is currently unknown. There is also an instrumental effect which reveals itself as correlated noise, showing a pattern which alternates sign at adjacent channels when the very lowest levels of intensity are examined. This effect is due to an offset in the DAS autocorrelator used as the backend in the Leiden/Dwingeloo survey. The presence of this artifact becomes noticeable only after averaging 50 or more spectra. Although a Hanning convolution of the data would eliminate this effect, it would also degrade the velocity resolution; as the correlated noise is noticeable only at very low levels (about 15 mK), well below the mean rms sensitivity of the survey itself, the original spectra have not been Hanning smoothed. Excepted are those spectra which suffered from sinc interference. These spectra were Hanning smoothed to enable the elimination of the interference spike. Dr. Lloyd Higgs has compared the HI spectra made with the DRAO 26-m telescope in support of the Canadian Galactic Plane Survey with those of the Leiden Dwingeloo Survey, and has pointed out what are evidently calibration problems in a small number of isolated LDS spectra. Either Hartmann, Burton, or Higgs could provide additional information. The Leiden/Dwingeloo HI survey is intended primarily for studies of the interstellar gas associated with our own Galaxy. 
There are, however, a small number of spectra in which 'contaminating' signatures from known external galaxies are present. Detections of roughly 50 such external galaxies were made; refer to table 4 of the Atlas for a list. The HI spectra from the Leiden/Dwingeloo survey are archived as 721 files. Each file is in FITS image format, and maps the 21-cm brightness temperature at a fixed Galactic longitude for an evenly-spaced rectangular grid of (Galactic latitude, velocity) points. There is one FITS file for every 0.5 degree in Galactic longitude in the "fits" subdirectory. In addition to the 721 (b,v) FITS files, there is an (l,b) FITS image named TOTAL_HI.FIT, which contains the integrated intensity map over the velocity range -450 km/s <= V_lsr <= +400 km/s. The map units are in [K.km/s] and the FITS header contains comments regarding the conversion to column densities. Included as a visual aid is the GIF image file total_hi.gif, which depicts the velocity-integrated map. The data were originally distributed on a CD-ROM enclosed with the Atlas of Galactic Neutral Hydrogen (reference given above). The CD also contains animations of velocity slices through the data cube. (1 data file).
VizieR Online Data Catalog: ND2 rotational spectrum (Melosso+,
NASA Astrophysics Data System (ADS)
Melosso, M.; Degli Esposti, C.; Dore, L.
2018-01-01
Files used with the SPFIT/SPCAT program suite. There are 8 files of supplementary material, including a ReadMe created by the AAS data editors. The text files are as follows: 1_Explan.txt, information on the content of the other files; 2ND2.fit, the output file of the fit of spectroscopic data used in the present study; 3ND2.lin, the corresponding line file; 4ND2.par, the corresponding parameter file; 5ND2.cat, the output file of the prediction made with the parameters determined in this study; 6ND2.var, the corresponding parameter file; 7ND2.int, the corresponding intensity file. (1 data file).
33 CFR 148.246 - When is a document considered filed and where should I file it?
Code of Federal Regulations, 2010 CFR
2010-07-01
... filed and where should I file it? 148.246 Section 148.246 Navigation and Navigable Waters COAST GUARD... Formal Hearings § 148.246 When is a document considered filed and where should I file it? (a) If a document to be filed is submitted by mail, it is considered filed on the date it is postmarked. If a...
Please Move Inactive Files Off the /projects File System
High-Performance Computing | NREL
2018-01-11
The /projects file system is a shared resource. This year this has created a space crunch: the file system is now about 90% full, and we need your help.
5 CFR 1201.4 - General definitions.
Code of Federal Regulations, 2010 CFR
2010-01-01
... commercial or personal delivery, or by electronic filing (e-filing) in accordance with § 1201.14. (j) Date of... the document was delivered to the commercial delivery service. The date of filing by e-filing is the date of electronic submission. (m) Electronic filing (e-filing). Filing and receiving documents in...
12 CFR 303.8 - Public access to filing.
Code of Federal Regulations, 2010 CFR
2010-01-01
... portions of a filing (the public file) until 180 days following final disposition of a filing. Following the 180-day period, non-confidential portions of an application file will be made available in accordance with § 303.8(c). The public file generally consists of portions of the filing, supporting data...
12 CFR 303.8 - Public access to filing.
Code of Federal Regulations, 2011 CFR
2011-01-01
... portions of a filing (the public file) until 180 days following final disposition of a filing. Following the 180-day period, non-confidential portions of an application file will be made available in accordance with § 303.8(c). The public file generally consists of portions of the filing, supporting data...
A linked GeoData map for enabling information access
Powell, Logan J.; Varanka, Dalia E.
2018-01-10
Overview: The Geospatial Semantic Web (GSW) is an emerging technology that uses the Internet for more effective knowledge engineering and information extraction. Among the aims of the GSW are to structure the semantic specifications of data to reduce ambiguity and to link those data more efficiently. The data are stored as triples, the basic data unit in graph databases, which are similar to the vector data model of geographic information systems (GIS); that is, a node-edge-node model that forms a graph of semantically related information. The GSW is supported by emerging technologies such as linked geospatial data, described below, that enable it to store and manage geographical data that require new cartographic methods for visualization. This report describes a map that can interact with linked geospatial data using a simulation of a data query approach called the browsable graph to find information that is semantically related to a subject of interest, visualized using the Data Driven Documents (D3) library. Such a semantically enabled map functions as a map knowledge base (MKB) (Varanka and Usery, 2017). An MKB differs from a database in an important way. The central element of a triple, alternatively called the edge or property, is composed of a logic formalization that structures the relation between the first and third parts, the nodes or objects. Node-edge-node represents the graphic form of the triple, and the subject-property-object terms represent the data structure. Object classes connect to build a federated graph, similar to a network in visual form. Because the triple property is a logical statement (a predicate), the data graph represents logical propositions or assertions accepted to be true about the subject matter. These logical formalizations can be manipulated to calculate new triples, representing inferred logical assertions, from the existing data. To demonstrate an MKB system, a technical proof-of-concept is developed that uses geographically attributed Resource Description Framework (RDF) serializations of linked data for mapping. The proof-of-concept focuses on accessing triple data from visual elements of a geographic map as the interface to the MKB. The map interface is embedded with other essential functions such as SPARQL Protocol and RDF Query Language (SPARQL) data query endpoint services and reasoning capabilities of Apache Marmotta (Apache Software Foundation, 2017). An RDF database of the Geographic Names Information System (GNIS), which contains official names of domestic features in the United States, was linked to a county data layer from The National Map of the U.S. Geological Survey. The county data are part of a broader Government Units theme offered to the public as Esri shapefiles. The shapefile used to draw the map itself was converted to a geographic-oriented JavaScript Object Notation (JSON) (GeoJSON) format and linked through various properties with a linked geodata version of the GNIS database called "GNIS–LD" (Butler and others, 2016; B. Regalia and others, University of California-Santa Barbara, written commun., 2017). The GNIS–LD files originated in Terse RDF Triple Language (Turtle) format but were converted to a JSON format specialized in linked data, "JSON–LD" (Beckett and Berners-Lee, 2011; Sporny and others, 2014). The GNIS–LD database is composed of roughly three predominant triple data graphs: Features, Names, and History. The graphs include a set of namespace prefixes used by each of the attributes.
Predefining the prefixes made the conversion to the JSON–LD format simple to complete, because Turtle and JSON–LD are variant specifications of the basic RDF concept. To convert a shapefile into GeoJSON format, capturing the geospatial coordinate geometry objects, an online converter, Mapshaper, was used (Bloch, 2013). To convert the Turtle files, a custom converter written in Java reconstructs the files by parsing each grouping of attributes belonging to one subject and pasting the data into a new file that follows the syntax of JSON–LD. Additionally, the Features file contained its own set of geometries, which were exported into a separate JSON–LD file along with their elevation values to form a fourth file, named "features-geo.json." Extracted data from external files can be represented in HyperText Markup Language (HTML) path objects. The goal was to import multiple JSON–LD files using this approach.
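A minimal sketch of the node-edge-node (subject-property-object) unit and a "browsable graph" style lookup over it; the triples below are invented stand-ins, and the real system queries RDF stores through SPARQL rather than C structs:

```c
/* Toy triple store illustrating the subject-property-object model and
 * a simple query for everything semantically related to one subject. */
#include <stdio.h>
#include <string.h>

struct triple {
    const char *subject;
    const char *property;
    const char *object;
};

int main(void) {
    /* Hypothetical GNIS-flavored assertions, not real GNIS-LD data. */
    struct triple graph[] = {
        { "gnis:1654975", "rdfs:label",       "Lake Placid"  },
        { "gnis:1654975", "gnis:featureType", "Lake"         },
        { "gnis:1654975", "geo:within",       "county:Essex" },
        { "county:Essex", "rdfs:label",       "Essex County" },
    };
    const char *want = "gnis:1654975";

    /* "Browsable graph": list every edge leaving the chosen node. */
    for (size_t i = 0; i < sizeof graph / sizeof graph[0]; i++)
        if (strcmp(graph[i].subject, want) == 0)
            printf("%s -- %s --> %s\n",
                   graph[i].subject, graph[i].property, graph[i].object);
    return 0;
}
```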
Shahi, Shahriar; Rahimi, Saeed; Shiezadeh, Vahab; Ashasi, Habib; Abdolrahimi, Majid; Foroughreyhani, Mohammad
2012-01-01
Aim: The aim of the present study was to electrochemically evaluate the corrosion resistance of RaCe and Mtwo files after repeated sterilization and preparation procedures. Study Design: A total of 450 rotary files were used. In the working groups, 72 files of each file type were distributed into 4 groups. RaCe and Mtwo files were used to prepare one root canal of the mesial root of extracted human mandibular first molars. The procedure was repeated to prepare 2 to 8 canals. The following irrigation solutions were used: group 1, RaCe files with 2.5% NaOCl; group 2, RaCe files with normal saline; group 3, Mtwo files with 2.5% NaOCl; and group 4, Mtwo files with normal saline in the manner described. In the autoclave groups, 72 files of each file type were evenly distributed into 2 groups; these files underwent sterilization cycles without being used for root canal preparation. Nine new unused files of each file type served as controls. The instruments were then sent for corrosion assessment. Mann-Whitney U and Wilcoxon tests were used for independent and dependent groups, respectively. Results: Statistical analysis indicated significant differences in the corrosion resistance of files in the working and autoclave groups between the RaCe and Mtwo file types (p<0.001). Conclusions: The corrosion resistance of #25, #30, and #35 Mtwo files is significantly higher than that of RaCe files of similar sizes. Key words: Corrosion, NiTi instruments, autoclave, RaCe, Mtwo. PMID:22143690
Permanent-File-Validation Utility Computer Program
NASA Technical Reports Server (NTRS)
Derry, Stephen D.
1988-01-01
Errors in files detected and corrected during operation. Permanent File Validation (PFVAL) utility computer program provides CDC CYBER NOS sites with mechanism to verify integrity of permanent file base. Locates and identifies permanent file errors in Mass Storage Table (MST) and Track Reservation Table (TRT), in permanent file catalog entries (PFC's) in permit sectors, and in disk sector linkage. All detected errors written to listing file and system and job day files. Program operates by reading system tables, catalog track, permit sectors, and disk linkage bytes to validate expected and actual file linkages. Used extensively to identify and locate errors in permanent files and enable online correction, reducing computer-system downtime.
Zebra: A striped network file system
NASA Technical Reports Server (NTRS)
Hartman, John H.; Ousterhout, John K.
1992-01-01
The design of Zebra, a striped network file system, is presented. Zebra applies ideas from log-structured file system (LFS) and RAID research to network file systems, resulting in a network file system that has scalable performance, uses its servers efficiently even when its applications are using small files, and provides high availability. Zebra stripes file data across multiple servers, so that the file transfer rate is not limited by the performance of a single server. High availability is achieved by maintaining parity information for the file system. If a server fails, its contents can be reconstructed using the contents of the remaining servers and the parity information. Zebra differs from existing striped file systems in the way it stripes file data: Zebra does not stripe on a per-file basis; instead it stripes the stream of bytes written by each client. Clients write to the servers in units called stripe fragments, which are analogous to segments in an LFS. Stripe fragments contain file blocks that were written recently, without regard to which file they belong. This method of striping has numerous advantages over per-file striping, including increased server efficiency, efficient parity computation, and elimination of parity updates.
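The parity mechanism is the familiar RAID-style XOR across stripe fragments; below is a minimal sketch of loss and reconstruction under simplified assumptions (fixed fragment size, one parity fragment per stripe, contents invented for illustration):

```c
/* Demonstrates rebuilding a lost stripe fragment from the surviving
 * fragments plus XOR parity -- the recovery property Zebra relies on. */
#include <stdio.h>
#include <string.h>

#define FRAG_SIZE 8
#define NUM_DATA  3

int main(void) {
    unsigned char frag[NUM_DATA][FRAG_SIZE] = {
        "client0", "client1", "client2"   /* toy fragment contents */
    };
    unsigned char parity[FRAG_SIZE] = {0};

    /* Parity fragment: byte-wise XOR of all data fragments. */
    for (int i = 0; i < NUM_DATA; i++)
        for (int j = 0; j < FRAG_SIZE; j++)
            parity[j] ^= frag[i][j];

    /* Simulate losing fragment 1, then rebuild it from the others. */
    unsigned char rebuilt[FRAG_SIZE];
    memcpy(rebuilt, parity, FRAG_SIZE);
    for (int i = 0; i < NUM_DATA; i++)
        if (i != 1)
            for (int j = 0; j < FRAG_SIZE; j++)
                rebuilt[j] ^= frag[i][j];

    printf("rebuilt fragment 1: %s\n", (const char *)rebuilt);
    return 0;
}
```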
Code of Federal Regulations, 2011 CFR
2011-01-01
... citations, postdeparture filing citations, AES downtime filing citation, exemption or exclusion legends. The... citations, AES downtime filing citation, exemption or exclusion legends required in § 30.4(e) to the...) Postal exports. The proof of filing citations, postdeparture filing citations, AES downtime filing...
Code of Federal Regulations, 2013 CFR
2013-01-01
... citations, postdeparture filing citations, AES downtime filing citation, exemption or exclusion legends. The... citations, AES downtime filing citation, exemption or exclusion legends required in § 30.4(e) to the...) Postal exports. The proof of filing citations, postdeparture filing citations, AES downtime filing...
Code of Federal Regulations, 2012 CFR
2012-01-01
... citations, postdeparture filing citations, AES downtime filing citation, exemption or exclusion legends. The... citations, AES downtime filing citation, exemption or exclusion legends required in § 30.4(e) to the...) Postal exports. The proof of filing citations, postdeparture filing citations, AES downtime filing...
Code of Federal Regulations, 2014 CFR
2014-01-01
... citations, postdeparture filing citations, AES downtime filing citation, exemption or exclusion legends. The... citations, AES downtime filing citation, exemption or exclusion legends required in § 30.4(e) to the...) Postal exports. The proof of filing citations, postdeparture filing citations, AES downtime filing...
Code of Federal Regulations, 2010 CFR
2010-01-01
... citations, postdeparture filing citations, AES downtime filing citation, exemption or exclusion legends. The... citations, AES downtime filing citation, exemption or exclusion legends required in § 30.4(e) to the...) Postal exports. The proof of filing citations, postdeparture filing citations, AES downtime filing...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-03
...] Wisconsin Public Service Corporation; Notices of Intent To File License Applications, Filing of Pre-Application Documents (PAD), Commencement of Pre-Filing Processes and Scoping, Request for Comments on the...: Notices of Intent to File License Applications for Two New Licenses and Commencing the Pre-filing Process...
47 CFR 1.10008 - What are IBFS file numbers?
Code of Federal Regulations, 2013 CFR
2013-10-01
... Random Selection International Bureau Filing System § 1.10008 What are IBFS file numbers? (a) We assign...) For a description of file number information, see The International Bureau Filing System File Number... 47 Telecommunication 1 2013-10-01 2013-10-01 false What are IBFS file numbers? 1.10008 Section 1...
47 CFR 1.10008 - What are IBFS file numbers?
Code of Federal Regulations, 2010 CFR
2010-10-01
... Bureau Filing System § 1.10008 What are IBFS file numbers? (a) We assign file numbers to electronic... information, see The International Bureau Filing System File Number Format Public Notice, DA-04-568 (released... 47 Telecommunication 1 2010-10-01 2010-10-01 false What are IBFS file numbers? 1.10008 Section 1...
47 CFR 1.10008 - What are IBFS file numbers?
Code of Federal Regulations, 2012 CFR
2012-10-01
... Random Selection International Bureau Filing System § 1.10008 What are IBFS file numbers? (a) We assign...) For a description of file number information, see The International Bureau Filing System File Number... 47 Telecommunication 1 2012-10-01 2012-10-01 false What are IBFS file numbers? 1.10008 Section 1...
47 CFR 1.10008 - What are IBFS file numbers?
Code of Federal Regulations, 2011 CFR
2011-10-01
... Bureau Filing System § 1.10008 What are IBFS file numbers? (a) We assign file numbers to electronic... information, see The International Bureau Filing System File Number Format Public Notice, DA-04-568 (released... 47 Telecommunication 1 2011-10-01 2011-10-01 false What are IBFS file numbers? 1.10008 Section 1...
47 CFR 1.10008 - What are IBFS file numbers?
Code of Federal Regulations, 2014 CFR
2014-10-01
... Random Selection International Bureau Filing System § 1.10008 What are IBFS file numbers? (a) We assign...) For a description of file number information, see The International Bureau Filing System File Number... 47 Telecommunication 1 2014-10-01 2014-10-01 false What are IBFS file numbers? 1.10008 Section 1...
12 CFR 1780.9 - Filing of papers.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Filing of papers. 1780.9 Section 1780.9 Banks... papers. (a) Filing. Any papers required to be filed shall be addressed to the presiding officer and filed... Director or the presiding officer. All papers filed by electronic media shall also concurrently be filed in...
12 CFR 1780.9 - Filing of papers.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 12 Banks and Banking 7 2011-01-01 2011-01-01 false Filing of papers. 1780.9 Section 1780.9 Banks... papers. (a) Filing. Any papers required to be filed shall be addressed to the presiding officer and filed... Director or the presiding officer. All papers filed by electronic media shall also concurrently be filed in...
VizieR Online Data Catalog: Opacities from the Opacity Project (Seaton+, 1995)
NASA Astrophysics Data System (ADS)
Seaton, M. J.; Yan, Y.; Mihalas, D.; Pradhan, A. K.
1997-08-01
1 CODES

1.1 Code rop.for
This code reads opacity files written in standard OP format. Its main purpose is to provide documentation on the contents of the files. This code, like the other codes provided, prompts for the name of the file (or files) to be read. The file names read in response to the prompt may have up to 128 characters.

1.2 Code opfit.for
This code reads opacity files in standard OP format, and provides for interpolation of opacities to any required values of temperature and mass-density. The method used is described in OPF. The code prompts for the name of a file giving all required control parameters. As an example, the file opfit.dat is provided (users will need to change directory names and file names). The use of opfit.for is illustrated using opfit.dat. Most users will probably want to adapt opfit.for for use as a subroutine in other codes. Timings for DEC 7000 ALPHA: 0.3 sec for data read and initialisations; then 0.0007 sec for each temperature-density point. Users who like OPAL formats should note that opfit.for has a facility to produce files of OP data in OPAL-type formats.

1.3 Code ixz.for
This code provides for interpolations to any required values of X and Z. See IXZ. It prompts for the name of a file giving all required control parameters. An example of such a file is provided, ixz.dat (the user will need to change directory and file names). The output files have names s92INT.'nnn'. The user specifies the first value of nnn, and the number of files to be produced.

2 DATA FILES

2.1 Data files for solar metal-mix
Data for solar metal-mix s92 as defined in SYMP. These files are from version 2 runs of December 1994 (see IXZ for details on Version 2). There are 213 files with names s92.'nnn', 'nnn'=201 to 413. Each file occupies 83762 bytes. The file s92.version2 gives values of X (hydrogen mass-fraction) and Z (metals mass-fraction) for each value of 'nnn'. The user can get s92.version2, select the values of 'nnn' required, then get the required files s92.'nnn'. The user can see the file in ftp, displayed on the screen, by typing "get s92.version2 -". The files s92.'nnn' can be used with opfit.for to obtain opacities for any required value of temperature and mass density. Files for other metal-mixtures will be added in due course. Send requests to mjs@star.ucl.ac.uk.

2.2 Files for interpolation in X and Z
The data files have names s92xz.'mmm', where 'mmm'=001 to 096. They differ from the standard OP files (such as s92.'nnn' --- section 2.1 above) in that they contain information giving derivatives of opacities with respect to X and Z. Each file s92xz.'mmm' occupies 148241 bytes. The interpolations to any required values of X and Z are made using ixz.for. Timings: on DEC 7000 ALPHA, 2.16 sec for each new-mixture file. For interpolations to some specified values of X and Z, one requires just 4 files s92xz.'mmm'. Most users will not require the complete set of files s92xz.'mmm'. The file s92xz.index includes a table (starting on line 3) giving values, for each 'mmm' file, of x,y,z (abundances by number-fractions) and X,Y,Z (abundances by mass-fractions). Users are advised to get the file s92xz.index, select values of 'mmm' for the files required, then get those files. The files produced by ixz.for are in standard OP format and can be used with opfit.for to obtain opacities for any required values of temperature and mass density.
3 RECOMMENDED PROCEDURE FOR USE OF OPACITY FILES
(1) Get the file s92.version2. (2) If the values of X and Z you require are available in the files s92.'nnn' then get those files. (3) If not, get the file s92xz.index. (4) Select from s92xz.index the values of 'mmm' which cover the range of X and Z in which you are interested. Get those files and use ixz.for to generate files for your exact required values of X and Z. (5) Note that the exact abundance mixtures used are specified in each file (see rop.for). Also each run of opfit.for produces a table of abundances. (6) If you want a metal-mix different from that of s92, contact mjs@star.ucl.ac.uk.

4 FUTURE DEVELOPMENTS
(1) Data for the calculation of radiative forces are provided as the CDS catalog
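The selection step in the recommended procedure (matching requested X and Z against the index of available files) is easy to automate. A minimal, hedged sketch in Python follows; the whitespace-separated "nnn X Z" layout assumed for s92.version2 is purely illustrative, not the documented format, and a non-exact match would still be refined with ixz.for.

```python
# Hedged sketch: selecting an OP opacity file s92.'nnn' by (X, Z).
# The real s92.version2 layout is not reproduced here; a simple
# whitespace-separated "nnn X Z" line format is assumed for illustration.
def load_index(path="s92.version2"):
    index = {}
    with open(path) as fh:
        for line in fh:
            parts = line.split()
            if len(parts) >= 3 and parts[0].isdigit():
                index[parts[0]] = (float(parts[1]), float(parts[2]))
    return index

def closest_file(index, x_req, z_req):
    # Pick the 'nnn' whose (X, Z) pair is nearest the requested values;
    # if no exact match exists, ixz.for would be used to interpolate.
    nnn = min(index, key=lambda k: (index[k][0] - x_req) ** 2
                                   + (index[k][1] - z_req) ** 2)
    return "s92." + nnn

# Example: find the file for X=0.70, Z=0.02
# print(closest_file(load_index(), 0.70, 0.02))
```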
Accessing files in an Internet: The Jade file system
NASA Technical Reports Server (NTRS)
Peterson, Larry L.; Rao, Herman C.
1991-01-01
Jade is a new distributed file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file systems may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.
Accessing files in an internet - The Jade file system
NASA Technical Reports Server (NTRS)
Rao, Herman C.; Peterson, Larry L.
1993-01-01
Jade is a new distributed file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file systems may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.
Twin-tailed fail-over for fileservers maintaining full performance in the presence of a failure
Coteus, Paul W.; Gara, Alan G.; Giampapa, Mark E.; Heidelberger, Philip; Steinmacher-Burow, Burkhard D.
2008-02-12
A method for maintaining full performance of a file system in the presence of a failure is provided. The file system having N storage devices, where N is an integer greater than zero and N primary file servers where each file server is operatively connected to a corresponding storage device for accessing files therein. The file system further having a secondary file server operatively connected to at least one of the N storage devices. The method including: switching the connection of one of the N storage devices to the secondary file server upon a failure of one of the N primary file servers; and switching the connections of one or more of the remaining storage devices to a primary file server other than the failed file server as necessary so as to prevent a loss in performance and to provide each storage device with an operating file server.
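The switching step described in the abstract amounts to reattaching orphaned storage devices to an operating server. A minimal sketch of the idea, assuming a simple device-to-server attachment map (all names here are invented):

```python
# Hedged sketch of the twin-tailed fail-over idea: on a primary server
# failure, reattach its storage device to the secondary server so every
# device keeps an operating file server. The rebalance policy for the
# remaining devices is left out; this only shows the switch itself.
def fail_over(attachments, failed, secondary):
    """attachments: dict mapping storage device -> current file server."""
    for device, server in attachments.items():
        if server == failed:
            attachments[device] = secondary  # switch the orphaned device
    return attachments

attach = {"disk0": "fs0", "disk1": "fs1", "disk2": "fs2"}
print(fail_over(attach, failed="fs1", secondary="fs_spare"))
# {'disk0': 'fs0', 'disk1': 'fs_spare', 'disk2': 'fs2'}
```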
Registered File Support for Critical Operations Files at (Space Infrared Telescope Facility) SIRTF
NASA Technical Reports Server (NTRS)
Turek, G.; Handley, Tom; Jacobson, J.; Rector, J.
2001-01-01
The SIRTF Science Center's (SSC) Science Operations System (SOS) has to contend with nearly one hundred critical operations files via comprehensive file management services. The management is accomplished via the registered file system (otherwise known as TFS) which manages these files in a registered file repository composed of a virtual file system accessible via a TFS server and a file registration database. The TFS server provides controlled, reliable, and secure file transfer and storage by registering all file transactions and meta-data in the file registration database. An API is provided for application programs to communicate with TFS servers and the repository. A command line client implementing this API has been developed as a client tool. This paper describes the architecture, current implementation, but more importantly, the evolution of these services based on evolving community use cases and emerging information system technology.
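A registered file repository of this kind pairs every transfer with a database record of the transaction and its metadata. The following hedged sketch mimics that pattern; the sqlite table and field names are illustrative assumptions, not the actual TFS schema or API.

```python
# Hedged sketch of a registered-file-repository transaction: copy the
# file into the repository and record the transaction plus metadata in
# a registration database. Table and field names are invented.
import sqlite3, hashlib, shutil, time, os

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE registry (name TEXT, sha256 TEXT, owner TEXT, ts REAL)")

def register_file(repo_dir, src_path, owner):
    with open(src_path, "rb") as fh:
        digest = hashlib.sha256(fh.read()).hexdigest()
    dest = os.path.join(repo_dir, os.path.basename(src_path))
    shutil.copyfile(src_path, dest)                  # controlled transfer
    db.execute("INSERT INTO registry VALUES (?, ?, ?, ?)",
               (dest, digest, owner, time.time()))   # register the transaction
    db.commit()
    return digest
```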
Data handling with SAM and art at the NOvA experiment
Aurisano, A.; Backhouse, C.; Davies, G. S.; ...
2015-12-23
During operations, NOvA produces between 5,000 and 7,000 raw files per day with peaks in excess of 12,000. These files must be processed in several stages to produce fully calibrated and reconstructed analysis files. In addition, many simulated neutrino interactions must be produced and processed through the same stages as data. To accommodate the large volume of data and Monte Carlo, production must be possible both on the Fermilab grid and on off-site farms, such as the ones accessible through the Open Science Grid. To handle the challenge of cataloging these files and to facilitate their off-line processing, we have adopted the SAM system developed at Fermilab. SAM indexes files according to metadata, keeps track of each file's physical locations, provides dataset management facilities, and facilitates data transfer to off-site grids. To integrate SAM with Fermilab's art software framework and the NOvA production workflow, we have developed methods to embed metadata into our configuration files, art files, and standalone ROOT files. A module in the art framework propagates the embedded information from configuration files into art files, and from input art files to output art files, allowing us to maintain a complete processing history within our files. Embedding metadata in configuration files also allows configuration files indexed in SAM to be used as inputs to Monte Carlo production jobs. Further, SAM keeps track of the input files used to create each output file. Parentage information enables the construction of self-draining datasets which have become the primary production paradigm used at NOvA. In this study we will present an overview of SAM at NOvA and how it has transformed the file production framework used by the experiment.
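The parentage mechanism that makes "self-draining" datasets possible can be pictured with a toy catalog: a file becomes eligible for processing once all of its parents have been processed. A hedged sketch; the field names echo the description above, not the actual SAM schema.

```python
# Hedged sketch of SAM-style metadata with parentage. Outputs record the
# inputs that produced them, so a dataset "drains" as parents complete.
catalog = {}

def declare(name, parents=(), **metadata):
    catalog[name] = {"parents": list(parents), "meta": metadata}

declare("raw_001.root", detector="NOvA", tier="raw")
declare("reco_001.root", parents=["raw_001.root"], tier="reco")

def drainable(dataset, processed):
    # Files not yet done whose parents have all been processed.
    return [f for f in dataset
            if f not in processed
            and all(p in processed for p in catalog[f]["parents"])]

print(drainable(["raw_001.root", "reco_001.root"],
                processed={"raw_001.root"}))  # -> ['reco_001.root']
```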
76 FR 49761 - Combined Notice of Filings #1
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-11
... Operator, Inc. submits tariff filing per 35.13(a)(2)(iii: Filing of Notice of Succession to Interconnection.... submits tariff filing per 35.13(a)(2)(iii: Filing of Notice of Succession of ITC Midwest to be effective.... submits tariff filing per 35.13(a)(2)(iii: Notice of Succession to be effective 10/4/2011. Filed Date: 08...
76 FR 50210 - Combined Notice of Filings #
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-12
..., Inc. submits tariff filing per 35.13(a)(2)(iii: Filing of Notice of Succession to be effective 10/5..., Inc. submits tariff filing per 35.13(a)(2)(iii: Filing of Notice of Succession to be effective 10/5..., Inc. submits tariff filing per 35.13(a)(2)(iii: Filing of Notice of Succession to be effective 10/5...
77 FR 23708 - Combined Notice of Filings #2
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-20
... Company submits tariff filing per 35.13(a)(2)(iii: 7--20120413 OPCo OATT Conc to be effective 1/1/2012... tariff filing per 35: ER12-247 Compliance Filing to be effective 4/20/2011. Filed Date: 4/13/12... Compliance Filing to be effective 4/20/2011. Filed Date: 4/13/12. Accession Number: 20120413-5145. Comments...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-15
.... Electronic Submissions (E-Filing) All documents filed in NRC adjudicatory proceedings, including a request... accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires... requirements of E-Filing, at least ten (10) days prior to the filing deadline, the participant should...
VizieR Online Data Catalog: Infrared Arcturus Atlas (Hinkle+ 1995)
NASA Astrophysics Data System (ADS)
Hinkle, K.; Wallace, L.; Livingston, W.
1996-01-01
The atlas is contained in 310 spectral files, a list of line identifications, plus a file containing a list of the files and unobserved spectral regions. The spectral file names are in the form 'abnnnnn' where 'nnnnn' denotes the spectral region, e.g. file 'ab4300' contains spectra for the 4300-4325 cm-1 range. The atomic and molecular line identifications are in files 'appendix.a' and 'appendix.b', and repeated with a uniform format in file 'lines'. The file 'appendix.c' is a book-keeping device used to correlate the plot pages and spectral files with frequency. See the author-supplied description in 'readme.dat' for more information. (311 data files).
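The file-naming convention is regular enough to compute: 'ab4300' covers 4300-4325 cm-1, which suggests 25 cm-1 bins. A small hedged sketch; the fixed bin width is inferred from the single example in the record, not verified against the full atlas.

```python
# Hedged sketch: map a wavenumber to the atlas file covering it,
# assuming fixed 25 cm-1 bins inferred from the 'ab4300' example.
def spectral_file(wavenumber_cm1, bin_width=25):
    start = int(wavenumber_cm1 // bin_width) * bin_width
    return "ab%d" % start

print(spectral_file(4310))  # -> ab4300
```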
Chaniotis, A
2016-08-01
To report the clinical and radiographic treatment outcome of an immature replanted mandibular incisor with severe inflammatory external root resorption following a single-step regenerative approach. A 7-year-old female patient was referred 1 week following an extrusion injury to her mandibular central incisor (tooth 31). There was a history of an avulsion injury to the same tooth 6 months previously, after which the tooth had been replanted following 20 min of extra-oral time. On clinical examination, all teeth were asymptomatic and there was an arch wire splint placed on the mandibular incisors. Radiographic examination revealed severe inflammatory external root resorption of tooth 31. A diagnosis of necrotic pulp and asymptomatic apical periodontitis was made. Under local anaesthesia and rubber dam isolation, an access cavity was prepared. The canal was irrigated using 6% NaOCl solution delivered through the EndoVac negative pressure irrigation system (EndoVac, Axis/SybronEndo, Coppell, TX, USA). A 17% EDTA solution was used for 5 min followed by a final rinse of sterile water. The periapical tissues were probed using a K-file, and bleeding was induced. A blood clot was allowed to form, filling the entire canal. A thick plug of MTA was placed in direct contact with the blood clot. The tooth was restored with composite resin. All procedures were performed in a single visit. The splint was removed 2 weeks later. Recall examination after 24 months revealed healthy soft tissues with normal periodontal probing and mobility. The 24-month radiographic evaluation revealed healing of the severe inflammatory external root resorption and continued root development/dentine wall thickening of the apical third. No signs of ankylosis or significant discoloration were present. © 2015 International Endodontic Journal. Published by John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
Steinbrenner, John P.; Chawner, John R.
1992-01-01
GRIDGEN is a government domain software package for interactive generation of multiple block grids around general configurations. Though it has been freely available since 1989, it has not been widely embraced by the internal flow community due to a misconception that it was designed for external flow use only. In reality GRIDGEN has always worked for internal flow applications, and ongoing GRIDGEN enhancements are increasing the quality of, and the efficiency with which, grids for external and internal flow problems may be constructed. The software consists of four codes used to perform the four steps of the grid generation process. GRIDBLOCK is first used to decompose the flow domain into a collection of component blocks and then to establish interblock connections and flow solver boundary conditions. GRIDGEN2D is then used to generate surface grids on the outer shell of each component block. GRIDGEN3D generates grid points on the interior of each block, and finally GRIDVUE3D is used to inspect the resulting multiple block grid. Three of these codes (GRIDBLOCK, GRIDGEN2D, and GRIDVUE3D) are highly interactive and graphical in nature, and currently run on Silicon Graphics, Inc., and IBM RS/6000 workstations. The lone batch code (GRIDGEN3D) may be run on any of several Unix-based platforms. Surface grid generation in GRIDGEN2D is being improved with the addition of higher order surface definitions (NURBS and parametric surfaces input in IGES format and bicubic surfaces input in PATRAN Neutral File format) and double precision mathematics. In addition, two types of automation have been added to GRIDGEN2D that reduce the learning curve slope for new users and eliminate work for experienced users. Volume grid generation using GRIDGEN3D has been improved via the addition of an advanced hybrid control function formulation that provides both orthogonality and clustering control at the block faces and clustering control on the block interior.
Parikh, Priti P; Minning, Todd A; Nguyen, Vinh; Lalithsena, Sarasi; Asiaee, Amir H; Sahoo, Satya S; Doshi, Prashant; Tarleton, Rick; Sheth, Amit P
2012-01-01
Research on the biology of parasites requires a sophisticated and integrated computational platform to query and analyze large volumes of data, representing both unpublished (internal) and public (external) data sources. Effective analysis of an integrated data resource using knowledge discovery tools would significantly aid biologists in conducting their research, for example, through identifying various intervention targets in parasites and in deciding the future direction of ongoing as well as planned projects. A key challenge in achieving this objective is the heterogeneity between the internal lab data, usually stored as flat files, Excel spreadsheets or custom-built databases, and the external databases. Reconciling the different forms of heterogeneity and effectively integrating data from disparate sources is a nontrivial task for biologists and requires a dedicated informatics infrastructure. Thus, we developed an integrated environment using Semantic Web technologies that may provide biologists the tools for managing and analyzing their data, without the need for acquiring in-depth computer science knowledge. We developed a semantic problem-solving environment (SPSE) that uses ontologies to integrate internal lab data with external resources in a Parasite Knowledge Base (PKB), which has the ability to query across these resources in a unified manner. The SPSE includes Web Ontology Language (OWL)-based ontologies, experimental data with its provenance information represented using the Resource Description Format (RDF), and a visual querying tool, Cuebee, that features integrated use of Web services. We demonstrate the use and benefit of SPSE using example queries for identifying gene knockout targets of Trypanosoma cruzi for vaccine development. Answers to these queries involve looking up multiple sources of data, linking them together and presenting the results. The SPSE facilitates parasitologists in leveraging the growing, but disparate, parasite data resources by offering an integrative platform that utilizes Semantic Web techniques, while keeping their workload increase minimal.
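The unified querying that the SPSE performs over RDF-represented data can be pictured with a tiny example. A hedged sketch assuming the rdflib package is available; the graph, predicate names, and query are invented for illustration and are not PKB or Cuebee artifacts.

```python
# Hedged sketch of a unified RDF query in the spirit of the SPSE:
# data from different sources lands in one graph, and a single SPARQL
# query retrieves knockout candidates. All URIs here are made up.
import rdflib

turtle = """
@prefix ex: <http://example.org/> .
ex:geneA ex:organism ex:Tcruzi ; ex:knockoutCandidate true .
ex:geneB ex:organism ex:Tcruzi ; ex:knockoutCandidate false .
"""

g = rdflib.Graph()
g.parse(data=turtle, format="turtle")

rows = g.query("""
    PREFIX ex: <http://example.org/>
    SELECT ?gene WHERE {
        ?gene ex:organism ex:Tcruzi ;
              ex:knockoutCandidate true .
    }""")
for (gene,) in rows:
    print(gene)  # -> http://example.org/geneA
```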
High-Resolution Ultra Low Power, Integrated Aftershock and Microzonation System
NASA Astrophysics Data System (ADS)
Passmore, P.; Zimakov, L. G.
2012-12-01
Rapid Aftershock Mobilization plays an essential role in the understanding of both the focal mechanism and the rupture propagation caused by strong earthquakes. A quick assessment of the data provides a unique opportunity to study the dynamics of the entire earthquake process in-situ. Aftershock study also provides practical information for local authorities regarding the post-earthquake activity, which is very important in order to conduct the necessary actions for public safety in the area affected by the strong earthquake. Refraction Technology, Inc. has developed a self-contained, fully integrated Aftershock System, model 160-03, providing the customer simple and quick deployment during aftershock emergency mobilization and microzonation studies. The 160-03 has no external cables or peripheral equipment for command/control and operation in the field. The 160-03 contains three major components integrated in one case: a) a 24-bit resolution state-of-the-art low power ADC with CPU and lid interconnect boards; b) a power source; and c) three-component 2 Hz sensors (two horizontal and one vertical), and a built-in ±4g accelerometer. Optionally, 1 Hz sensors can be built into the 160-03 system at the customer's request. The self-contained rechargeable battery pack provides power autonomy of up to 7 days during data acquisition at 200 sps on three continuous weak-motion and three triggered strong-motion recording channels. For longer power autonomy, the 160-03 Aftershock System battery pack can be charged from an external source (solar power system). The data in the field are recorded to a built-in swappable USB flash drive. The 160-03 configuration is fixed based on a configuration file stored on the system, so no external command/control interface is required for parameter setup in the field. For visual control of the system performance in the field, the 160-03 has a built-in LED display which indicates the system's recording status as well as the status of the hot-swappable USB drive and battery. The detailed specifications and performance are presented and discussed.
Fall-Related Injuries in Community-Dwelling Older Adults in Qom Province, Iran, 2010-2012.
Gilasi, Hamid Reza; Soori, Hamid; Yazdani, Shahram; Taheri Tenjani, Parisa
2015-03-01
Falls and related injuries are common health problems in the elderly. Fractures, brain and internal organ injuries and death are the common consequences of the falls, which result in dependence, decreased self-efficacy, fear of falling, depression, restricted daily activities, hospitalization and admission to the nursing home and impose costs on the individual and the society. The purpose of this study was to determine the types of fall-related injuries and the related risk factors in the elderly population of Qom province, Iran. This retrospective study was performed on 424 elderly people (65 years and over) referred to Shahid Beheshti Hospital, Qom, Iran, due to falls between 2010 and 2012. The ICD-10 codes of external causes of injury from w00 to w19 related to falls were selected from the health information system of the hospital and demographic variables of the patients and external causes of falls were extracted after accessing the files of the patients. Data were analyzed using SPSS version 18 (SPSS Inc., USA). The duration of hospital stay and its relationship with underlying variables were investigated using t test and ANOVA. The level of significance was considered P < 0.05. Among 424 elderly people, 180 cases (42.45%) were male and the mean age of the patients was 78.65 ± 7.70 years. Fall on the same level from slipping, tripping, and stumbling was the most common external cause with 291 victims (68.60%), and hip fracture in 121 patients (29.00%), intertrochanteric fracture in 112 patients (26.90%), and traumatic brain injury in 51 patients (12.20%) were the most common causes of hospital stay. The mean hospital stay was 7.33 ± 3.63 days. Lower limb fracture and traumatic brain injury were the most common causes of hospitalization, which resulted in the longest hospital stay and highest hospitalization costs in the elderly.
NASA Astrophysics Data System (ADS)
Olszewski, R.; Pillich-Kolipińska, A.; Fiedukowicz, A.
2013-12-01
Implementation of the INSPIRE Directive in Poland requires not only legal transposition but also development of a number of technological solutions. One such task, associated with the creation of the Spatial Information Infrastructure in Poland, is the development of a complex georeference database model. Significant funding for the GBDOT project enables development of the national basic topographical database as a multiresolution database (MRDB). Effective implementation of this type of database requires developing procedures for generalization of geographic information (generalization of the digital landscape model - DLM), which, treating the TOPO10 component as the only source for creation of the TOPO250 component, will allow keeping conceptual and classification consistency between those database elements. To carry out this task, the implementation of the system concept prepared previously for the Head Office of Geodesy and Cartography is required. Such a system will execute the generalization process using constraint-based modelling and will keep topological relationships between the objects as well as between the object classes. Full implementation of the designed generalization system requires running comprehensive tests which would help with its calibration and with the parameterization of the generalization procedures (related to the character of the generalized area). Parameterization of this process will allow the criteria for selecting specific objects, the simplification algorithms, and the operation order to be determined. Tests using generalization process parameters differentiated according to the character of the area are now the priority issue. Parameters are delivered to the system in the form of XML files, which, with the help of a dedicated tool, are generated from spreadsheet files (XLS) filled in by the user. Using an XLS file makes entering and modifying the parameters easier. Among the other elements defined by the external parametric files are: criteria of object selection, metric parameters of generalization algorithms (e.g. simplification or aggregation), and the operations' sequence. Testing on trial areas of diverse character will allow the rules of the generalization process, and its parameterization with the proposed tool, to be developed within the multiresolution reference database. The authors have attempted to develop a parameterization of the generalization process for a number of different trial areas. Generalizing the results will contribute to the development of a holistic system of generalized reference data stored in the national geodetic and cartographic resources.
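The external parametric files described above are straightforward to consume. A hedged sketch of reading selection criteria and an ordered operation sequence from XML; the tag and attribute names are invented, not taken from the actual system.

```python
# Hedged sketch: parse generalization parameters (selection criteria,
# ordered operations) from an external XML file. Layout is invented.
import xml.etree.ElementTree as ET

xml_text = """
<generalization area="mountainous">
  <selection objectClass="building" minArea="250"/>
  <operation order="1" name="simplify" tolerance="10"/>
  <operation order="2" name="aggregate" distance="15"/>
</generalization>
"""

root = ET.fromstring(xml_text)
params = {
    "area": root.get("area"),
    "selection": [s.attrib for s in root.iter("selection")],
    "operations": sorted((o.attrib for o in root.iter("operation")),
                         key=lambda a: int(a["order"])),
}
print(params["operations"][0]["name"])  # -> simplify
```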
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gunter, Dan; Lee, Jason; Stoufer, Martin
2003-03-28
The NetLogger Toolkit is designed to monitor, under actual operating conditions, the behavior of all the elements of the application-to-application communication path in order to determine exactly where time is spent within a complex system. Using NetLogger, distributed application components are modified to produce timestamped logs of "interesting" events at all the critical points of the distributed system. Events from each component are correlated, which allows one to characterize the performance of all aspects of the system and network in detail. The NetLogger Toolkit itself consists of four components: an API and library of functions to simplify the generation of application-level event logs, a set of tools for collecting and sorting log files, an event archive system, and a tool for visualization and analysis of the log files. In order to instrument an application to produce event logs, the application developer inserts calls to the NetLogger API at all the critical points in the code, then links the application with the NetLogger library. All the tools in the NetLogger Toolkit share a common log format, and assume the existence of accurate and synchronized system clocks. NetLogger messages can be logged using an easy-to-read text based format based on the IETF-proposed ULM format, or a binary format that can still be used through the same API but that is several times faster and smaller, with performance comparable to or better than binary message formats such as MPI, XDR, SDDF-Binary, and PBIO. The NetLogger binary format is both highly efficient and self-describing, thus optimized for the dynamic message construction and parsing of application instrumentation. NetLogger includes an "activation" API that allows NetLogger logging to be turned on, off, or modified by changing an external file. This is useful for activating logging in daemons/services (e.g., the GridFTP server). The NetLogger reliability API provides the ability to specify backup logging locations and to periodically try to reconnect a broken TCP pipe. A typical use for this is to store data on local disk while the network is down. An event archiver can log one or more incoming NetLogger streams to a local disk file (netlogd) or to a mySQL database (netarchd). We have found exploratory, visual analysis of the log event data to be the most useful means of determining the causes of performance anomalies. The NetLogger Visualization tool, nlv, has been developed to provide a flexible and interactive graphical representation of system-level and application-level events.
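The instrumentation pattern is easy to picture: one timestamped event line per critical point, in a ULM-like text format. A minimal sketch follows; it is not the actual NetLogger API, and the field names only loosely follow the ULM convention.

```python
# Hedged sketch of NetLogger-style instrumentation: write a timestamped
# "interesting event" line at each critical point of the code, in a
# ULM-like key=value text format. Not the real NetLogger library.
import datetime, socket

def log_event(fh, event, **fields):
    ts = datetime.datetime.utcnow().isoformat() + "Z"
    parts = ["DATE=" + ts, "HOST=" + socket.gethostname(),
             "PROG=demo", "NL.EVNT=" + event]
    parts += ["%s=%s" % (k, v) for k, v in fields.items()]
    fh.write(" ".join(parts) + "\n")

with open("demo.netlog", "w") as fh:
    log_event(fh, "transfer.start", FILE="data.bin", SIZE=1048576)
    # ... the monitored operation happens here ...
    log_event(fh, "transfer.end", FILE="data.bin")
```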
Runwien: a text-based interface for the WIEN package
NASA Astrophysics Data System (ADS)
Otero de la Roza, A.; Luaña, Víctor
2009-05-01
A new text-based interface for WIEN2k, the full-potential linearized augmented plane-waves (FPLAPW) program, is presented. This code provides an easy to use, yet powerful way of generating arbitrarily large sets of calculations. Thus, properties over a potential energy surface and WIEN2k parameter exploration can be calculated using a simple input text file. This interface also provides new capabilities to the WIEN2k package, such as the calculation of elastic constants on hexagonal systems or the automatic gathering of relevant information. Additionally, runwien is modular, flexible and intuitive.
Program summary:
Program title: runwien
Catalogue identifier: AECM_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AECM_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GPL version 3
No. of lines in distributed program, including test data, etc.: 62 567
No. of bytes in distributed program, including test data, etc.: 610 973
Distribution format: tar.gz
Programming language: gawk (with locale POSIX or similar)
Computer: All running Unix, Linux
Operating system: Unix, GNU/Linux
Classification: 7.3
External routines: WIEN2k (http://www.wien2k.at/), GAWK (http://www.gnu.org/software/gawk/), rename by L. Wall, a Perl script which renames files, modified by R. Barker to check for the existence of target files, gnuplot (http://www.gnuplot.info/)
Subprograms used: Cat Id: ADSY_v1_0/AECB_v1_0, Title: GIBBS/CRITIC, Reference: CPC 158 (2004) 57/CPC 999 (2009) 999
Nature of problem: Creation of a text-based, batch-oriented interface for the WIEN2k package.
Solution method: WIEN2k solves the Kohn-Sham equations of a solid using the FPLAPW formalism. Runwien interprets an input file containing the description of the geometry and structure of the solid and drives the execution of the WIEN2k programs. The input is simplified thanks to the default values of the WIEN2k parameters known to runwien.
Additional comments: Designed for WIEN2k versions 06.4, 07.2, 08.2, and 08.3.
Running time: For the test case (TiC), a single geometry takes 5 to 10 minutes on a typical desktop PC (Intel Pentium 4, 3.4 GHz, 1 GB RAM). The full example, including the calculation of the elastic constants and the equation of state, takes 9 hours and 32 minutes.
The Jade File System. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Rao, Herman Chung-Hwa
1991-01-01
File systems have long been the most important and most widely used form of shared permanent storage. File systems in traditional time-sharing systems, such as Unix, support a coherent sharing model for multiple users. Distributed file systems implement this sharing model in local area networks. However, most distributed file systems fail to scale from local area networks to an internet. Four characteristics of scalability were recognized: size, wide area, autonomy, and heterogeneity. Owing to size and wide area, techniques such as broadcasting, central control, and central resources, which are widely adopted by local area network file systems, are not adequate for an internet file system. An internet file system must also support the notion of autonomy because an internet is made up of a collection of independent organizations. Finally, heterogeneity is the nature of an internet file system, not only because of its size, but also because of the autonomy of the organizations in an internet. The Jade File System, which provides a uniform way to name and access files in the internet environment, is presented. Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Because of autonomy, Jade is designed under the restriction that the underlying file systems may not be modified. In order to avoid the complexity of maintaining an internet-wide, global name space, Jade permits each user to define a private name space. In Jade's design, we pay careful attention to avoiding unnecessary network messages between clients and file servers in order to achieve acceptable performance. Jade's name space supports two novel features: (1) it allows multiple file systems to be mounted under one directory; and (2) it permits one logical name space to mount other logical name spaces. A prototype of Jade was implemented to examine and validate its design. The prototype consists of interfaces to the Unix File System, the Sun Network File System, and the File Transfer Protocol.
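The two novel name-space features (several file systems mounted under one directory, and logical name spaces mounting one another) can be modeled with a small mount table. A hedged sketch, with a longest-prefix resolution rule assumed purely for illustration:

```python
# Hedged sketch of a Jade-style private, logical name space: multiple
# backends may be mounted at one prefix, and a logical name space can
# itself be mounted. Resolution walks the mount table longest-prefix
# first. All names are invented.
class NameSpace:
    def __init__(self):
        self.mounts = {}          # logical prefix -> list of backends

    def mount(self, prefix, backend):
        self.mounts.setdefault(prefix, []).append(backend)

    def resolve(self, path):
        for prefix in sorted(self.mounts, key=len, reverse=True):
            if path.startswith(prefix):
                rest = path[len(prefix):]
                results = []
                for b in self.mounts[prefix]:
                    if isinstance(b, NameSpace):
                        results += b.resolve(rest)   # nested logical space
                    else:
                        results.append((b, rest))
                return results
        return []

ns = NameSpace()
ns.mount("/bin", "ufs://hostA")   # two file systems under one directory
ns.mount("/bin", "nfs://hostB")
print(ns.resolve("/bin/ls"))
# [('ufs://hostA', '/ls'), ('nfs://hostB', '/ls')]
```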
25 CFR 580.5 - What happens if I file late or fail to file?
Code of Federal Regulations, 2013 CFR
2013-04-01
... 25 Indians 2 2013-04-01 2013-04-01 false What happens if I file late or fail to file? 580.5 Section 580.5 Indians NATIONAL INDIAN GAMING COMMISSION, DEPARTMENT OF THE INTERIOR APPEAL PROCEEDINGS... What happens if I file late or fail to file? (a) Failure to file an appeal within the time provided...
25 CFR 580.5 - What happens if I file late or fail to file?
Code of Federal Regulations, 2014 CFR
2014-04-01
... 25 Indians 2 2014-04-01 2014-04-01 false What happens if I file late or fail to file? 580.5 Section 580.5 Indians NATIONAL INDIAN GAMING COMMISSION, DEPARTMENT OF THE INTERIOR APPEAL PROCEEDINGS... What happens if I file late or fail to file? (a) Failure to file an appeal within the time provided...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-25
... before the issuance of any amendment. IV. Electronic Submissions (E-Filing) All documents filed in NRC... be filed in accordance with the NRC E-Filing rule (72 FR 49139; August 28, 2007). The E-Filing... the procedural requirements of E-Filing, at least ten 10 days prior to the filing deadline, the...
76 FR 12950 - Combined Notice of Filings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-09
....204: EnCana Marketing Negotiated Rate Agreement Amendment to be effective 2/24/2011. Filed Date: 03/01... submits tariff filing per 154.403(d)(2): Fuel Filing 2011 to be effective 4/1/2011. Filed Date: 03/01/2011... submits tariff filing per 154.403(d)(2): Fuel Tracker 2011 to be effective 4/1/2011. Filed Date: 03/01...
Arkansas and Louisiana Aeromagnetic and Gravity Maps and Data - A Website for Distribution of Data
Bankey, Viki; Daniels, David L.
2008-01-01
This report contains digital data, image files, and text files describing data formats for aeromagnetic and gravity data used to compile the State aeromagnetic and gravity maps of Arkansas and Louisiana. The digital files include grids, images, ArcInfo, and Geosoft compatible files. In some of the data folders, ASCII files with the extension 'txt' describe the format and contents of the data files. Read the 'txt' files before using the data files.
University of Massachusetts Marine Renewable Energy Center Waverider Buoy Data
Lohrenz, Steven
2015-10-07
The compressed (.zip) file contains Datawell MK-III Directional Waverider binary and unpacked data files as well as a description of the data and manuals for the instrumentation. The data files are contained in the two directories within the zip file, ''Apr_July_2012'' and ''Jun_Sept_2013''. Time series and summary data were recorded in the buoy to binary files with extensions '.RDT' and '.SDT', respectively. These are located in the subdirectories 'Data_Raw' in each of the top-level deployment directories. '.RDT' files contain 3 days of time series (at 1.28 Hz) in 30 minute "bursts". Each '.SDT' file contains summary statistics for the month indicated computed at half-hour intervals for each burst. Each deployment directory also contains a description (in 'File.list') of the Datawell binary data files, and a figure ('Hs_vs_yearday') showing the significant wave height associated with each .RDT file (decoded from the filename). The corresponding unpacked Matlab .mat files are contained in the subdirectories 'Data_Mat'. These files have the extension '.mat' but use the root filename of the source .RDT and .SDT files.
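Given the deployment layout above (raw .RDT bursts and .SDT summaries under Data_Raw), pairing the two file types is a simple directory scan. A hedged sketch; matching by shared root filename is an assumption, not Datawell documentation.

```python
# Hedged sketch: find roots that have both a .RDT (time-series burst)
# file and a .SDT (half-hourly summary) file in a Data_Raw directory.
import os

def pair_files(raw_dir):
    roots = {}
    for name in os.listdir(raw_dir):
        root, ext = os.path.splitext(name)
        if ext.upper() in (".RDT", ".SDT"):
            roots.setdefault(root, set()).add(ext.upper())
    return [r for r, exts in roots.items() if exts == {".RDT", ".SDT"}]

# pair_files("Apr_July_2012/Data_Raw") would list roots having both files.
```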
Automated Assignment of MS/MS Cleavable Cross-Links in Protein 3D-Structure Analysis
NASA Astrophysics Data System (ADS)
Götze, Michael; Pettelkau, Jens; Fritzsche, Romy; Ihling, Christian H.; Schäfer, Mathias; Sinz, Andrea
2015-01-01
CID-MS/MS cleavable cross-linkers hold an enormous potential for an automated analysis of cross-linked products, which is essential for conducting structural proteomics studies. The created characteristic fragment ion patterns can easily be used for an automated assignment and discrimination of cross-linked products. To date, there are only a few software solutions available that make use of these properties, but none allows for an automated analysis of cleavable cross-linked products. The MeroX software fills this gap and presents a powerful tool for protein 3D-structure analysis in combination with MS/MS cleavable cross-linkers. We show that MeroX allows an automatic screening of characteristic fragment ions, considering static and variable peptide modifications, and effectively scores different types of cross-links. No manual input is required for a correct assignment of cross-links and false discovery rates are calculated. The self-explanatory graphical user interface of MeroX provides easy access for an automated cross-link search platform that is compatible with commonly used data file formats, enabling analysis of data originating from different instruments. The combination of an MS/MS cleavable cross-linker with a dedicated software tool for data analysis provides an automated workflow for 3D-structure analysis of proteins. MeroX is available at
Realization of a Complex Control & Diagnosis System on Simplified Hardware
NASA Astrophysics Data System (ADS)
Stetter, R.; Swamy Prasad, M.
2015-11-01
Energy is an important factor in today's industrial environment. Pump systems account for about 20% of the total industrial electrical energy consumption. Several studies show that with proper monitoring, control and maintenance, the efficiency of pump systems can be increased. Controlling pump systems with intelligent systems can help to reduce a pump's energy consumption by up to one third of its original consumption. The research in this paper was carried out within the scope of a research project involving the modelling and simulation of pump systems. This paper focuses on the future implementation of modelling capabilities in PLCs (programmable logic controllers). The whole project aims to use the pump itself as the sensor rather than introducing external sensors into the system, which would increase the cost considerably. One promising approach for an economic and robust industrial implementation of this intelligence is the use of PLCs. PLCs can be simulated in multiple ways; in this project, Codesys was chosen for several reasons which are explained in this paper. The first part of this paper explains the modelling of the pump itself, the process load of the asynchronous motor with a control system, and the simulation possibilities of the motor in Codesys. The second part describes the simulation and testing of the realized system. The third part elaborates on the Codesys system structure and the interfacing of the system with external files. The final part compares the result with an earlier Matlab/SIMULINK model and original test data.
Social inequalities in mortality by cause among men and women in France.
Saurel-Cubizolles, M-J; Chastang, J-F; Menvielle, G; Leclerc, A; Luce, D
2009-03-01
The aim of this study was to compare inequalities in mortality (all causes and by cause) by occupational group and educational level between men and women living in France in the 1990s. Data were analysed from a permanent demographic sample currently including about one million people. The French Institute of Statistics (INSEE) follows the subjects and collects demographic, social and occupational information from the census schedules and vital status forms. Causes of death were obtained from the national file of the French Institute of Health and Medical Research (INSERM). A relative index of inequality (RII) was calculated to quantify inequalities as a function of educational level and occupational group. Overall all-cause mortality, mortality due to cancer, mortality due to cardiovascular disease and mortality due to external causes (accident, suicide, violence) were considered. Overall, social inequalities were found to be wider among men than among women, for all-cause mortality, cancer mortality and external-cause mortality. However, this trend was not observed for cardiovascular mortality, for which the social inequalities were greater for women than for men, particularly for mortality due to ischaemic cardiac diseases. This study provides evidence for persistent social inequalities in mortality in France, in both men and women. These findings highlight the need for greater attention to social determinants of health. The reduction of cardiovascular disease mortality in low educational level groups should be treated as a major public health priority.
Code of Federal Regulations, 2010 CFR
2010-01-01
... identity when filing documents and serving participants electronically through the E-Filing system, and... transmitted electronically from the E-Filing system to the submitter confirming receipt of electronic filing... presentation of the docket and a link to its files. E-Filing System means an electronic system that receives...
Code of Federal Regulations, 2010 CFR
2010-04-01
...) Personnel and medical files and similar files the disclosure of which would constitute an unwarranted invasion of personal privacy; (g) Investigatory files (including security investigation files and files...
Code of Federal Regulations, 2011 CFR
2011-04-01
...) Personnel and medical files and similar files the disclosure of which would constitute an unwarranted invasion of personal privacy; (g) Investigatory files (including security investigation files and files...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-16
... posting CSV file samples. Order No. 770 revised the process for filing EQRs. Pursuant to Order No. 770, one of the new processes for filing allows EQRs to be filed using an XML file. The XML schema that is needed to file EQRs in this manner is now posted on the Commission's Web site at http://www.ferc.gov/docs...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-06
... system the time used by the Commission to mark officially the time that eFilings and eTariff submissions... timely. The time display will assist users in ensuring that their filings are timely filed, i.e., are... electronic submissions in lieu of paper using the eFiling link at http://www.ferc.gov . Also, Filing...
Design of housing file box of fire academy based on RFID
NASA Astrophysics Data System (ADS)
Li, Huaiyi
2018-04-01
This paper presents a design scheme for an intelligent file box based on RFID. The advantages of the RFID file box over the traditional file box are compared and analyzed, and the feasibility of the RFID file box design is analyzed based on the actual situation of our university. After introducing the shape and structure design of the intelligent file box, the paper discusses the working process of the file box, and explains in detail the internal communication principle of the RFID file box and the realization of the control system. The application of the RFID-based file box will greatly improve the efficiency of our school's archives management.
Register file soft error recovery
Fleischer, Bruce M.; Fox, Thomas W.; Wait, Charles D.; Muff, Adam J.; Watson, III, Alfred T.
2013-10-15
Register file soft error recovery including a system that includes a first register file and a second register file that mirrors the first register file. The system also includes an arithmetic pipeline for receiving data read from the first register file, and error detection circuitry to detect whether the data read from the first register file includes corrupted data. The system further includes error recovery circuitry to insert an error recovery instruction into the arithmetic pipeline in response to detecting the corrupted data. The inserted error recovery instruction replaces the corrupted data in the first register file with a copy of the data from the second register file.
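The recovery flow in this abstract (detect on read, then substitute the mirror's copy) can be mimicked in a few lines of software. A hedged sketch: parity stands in for the unspecified error-detection circuitry, and the class only models the hardware behavior, it is not the patented circuit.

```python
# Hedged sketch of the mirrored-register-file recovery scheme: reads are
# checked; on detected corruption, a recovery step copies the mirror's
# value back over the corrupted entry.
def parity(word):
    return bin(word).count("1") & 1

class MirroredRegFile:
    def __init__(self, n):
        self.primary = [(0, parity(0))] * n   # (value, stored parity)
        self.mirror = [0] * n

    def write(self, i, value):
        self.primary[i] = (value, parity(value))
        self.mirror[i] = value                # mirror tracks every write

    def read(self, i):
        value, p = self.primary[i]
        if parity(value) != p:                # corruption detected
            value = self.mirror[i]            # "error recovery instruction"
            self.primary[i] = (value, parity(value))
        return value

rf = MirroredRegFile(4)
rf.write(2, 0b1011)
rf.primary[2] = (0b1001, parity(0b1011))      # inject a soft error
print(rf.read(2) == 0b1011)                   # -> True, recovered
```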
Code of Federal Regulations, 2010 CFR
2010-04-01
... medical files and similar files the disclosure of which would constitute a clearly unwarranted invasion of personal privacy; (g) Investigatory files (including security investigation files and files concerning the...
Code of Federal Regulations, 2011 CFR
2011-04-01
... medical files and similar files the disclosure of which would constitute a clearly unwarranted invasion of personal privacy; (g) Investigatory files (including security investigation files and files concerning the...
49 CFR 564.5 - Information filing; agency processing of filings.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 6 2010-10-01 2010-10-01 false Information filing; agency processing of filings... HIGHWAY TRAFFIC SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION REPLACEABLE LIGHT SOURCE INFORMATION (Eff. until 12-01-12) § 564.5 Information filing; agency processing of filings. (a) Each manufacturer...
Wahlgren, Carl-Fredrik; Edelbring, Samuel; Fors, Uno; Hindbeck, Hans; Ståhle, Mona
2006-01-01
Background: Most of the many computer resources used in clinical teaching of dermatology and venereology for medical undergraduates are information-oriented and focus mostly on finding a "correct" multiple-choice alternative or free-text answer. We wanted to create an interactive computer program, which facilitates not only factual recall but also clinical reasoning.
Methods: Through continuous interaction with students, a new computerised interactive case simulation system, NUDOV, was developed. It is based on authentic cases and contains images of real patients, actors and healthcare providers. The student selects a patient and proposes questions for medical history, examines the skin, and suggests investigations, diagnosis, differential diagnoses and further management. Feedback is given by comparing the user's own suggestions with those of a specialist. In addition, a log file of the student's actions is recorded. The program includes a large number of images, video clips and Internet links. It was evaluated with a student questionnaire and by randomising medical students to conventional teaching (n = 85) or conventional teaching plus NUDOV (n = 31) and comparing the results of the two groups in a final written examination.
Results: The questionnaire showed that 90% of the NUDOV students stated that the program facilitated their learning to a large/very large extent, and 71% reported that extensive working with authentic computerised cases made it easier to understand and learn about diseases and their management. The layout, user-friendliness and feedback concept were judged as good/very good by 87%, 97%, and 100%, respectively. Log files revealed that the students, in general, worked with each case for 60–90 min. However, the intervention group did not score significantly better than the control group in the written examination.
Conclusion: We created a computerised case simulation program allowing students to manage patients in a non-linear format supporting the clinical reasoning process. The student gets feedback through comparison with a specialist, eliminating the need for external scoring or correction. The model also permits discussion of case processing, since all transactions are stored in a log file. The program was highly appreciated by the students, but did not significantly improve their performance in the written final examination. PMID:16907972
75 FR 80851 - Records Schedules; Availability and Request for Comments
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-23
..., including highway bridge and tunnel correspondence, designs and plans, geotechnical and hydraulic files... files, asphalt and pavement research files, statewide contract files, delineation files, recycling and...
78 FR 46936 - Combined Notice of Filings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-02
...: Filings Instituting Proceedings Docket Numbers: RP13-1103-000. Applicants: Northern Border Pipeline Company. Description: ACA Filing 2013 to be effective 10/1/2013. Filed Date: 7/25/13. Accession Number... Gas Transmission System. Description: ACA Filing 2013 to be effective 10/1/2013. Filed Date: 7/25/13...
77 FR 56833 - Combined Notice of Filings #1
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-14
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 1 Take notice... revisions compliance filing to be effective 9/1/2012. Filed Date: 8/29/12. Accession Number: 20120830-5007... Company. Description: SEGCO 2012 PBOP Filing to be effective 1/1/2012. Filed Date: 8/29/12. Accession...
77 FR 35371 - Combined Notice of Filings #1
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-13
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 1 Take notice.... Applicants: Duke Energy Miami Fort, LLC. Description: MBR Filing to be effective 10/1/2012. Filed Date: 6/5...-000. Applicants: Duke Energy Piketon, LLC. Description: MBR Filing to be effective 10/1/2012. Filed...
47 CFR 2.1205 - Filing of required declaration.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Interference § 2.1205 Filing of required declaration. (a) For points of entry where electronic filing with... electronic filing with Customs is available, submit the following information to Customs when filing the... Customs for electronic filing. (i) The terms under which the device is being imported, as indicated by...
18 CFR 385.2003 - Specifications (Rule 2003).
Code of Federal Regulations, 2010 CFR
2010-04-01
... paper. (c) Filing via the Internet. (1) All documents filed under this Chapter may be filed via the Internet except those listed by the Secretary. Except as otherwise specifically provided in this Chapter, filing via the Internet is in lieu of other methods of filing. Internet filings must be made in...
47 CFR 1.10006 - Is electronic filing mandatory?
Code of Federal Regulations, 2011 CFR
2011-10-01
... 47 Telecommunication 1 2011-10-01 2011-10-01 false Is electronic filing mandatory? 1.10006 Section... International Bureau Filing System § 1.10006 Is electronic filing mandatory? Electronic filing is mandatory for... System (IBFS) form is available. Applications for which an electronic form is not available must be filed...
47 CFR 61.14 - Method of filing publications.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 47 Telecommunication 3 2014-10-01 2014-10-01 false Method of filing publications. 61.14 Section 61...) TARIFFS Rules for Electronic Filing § 61.14 Method of filing publications. (a) Publications filed... date of a publication received by the Electronic Tariff Filing System will be determined by the date...
Collaborative, Trust-Based Security Mechanisms for a National Utility Intranet
2007-09-01
time_message_created … username bearnold operation_type copy from_file C:\etc\passwd\MPLpw.txt from_file_data_type ND //network data … time_message_created … username bearnold operation_type paste from_file C:\etc\passwd\MPLpw.txt //logon server password file … from_file_data_type ND from_file_caveat restricted-release to_file F:\Copy of C:\etc\passwd\MPLpw.txt //removable drive //end message data
1981-12-01
reading a file either saved in a previous session or created as a result of the internal execution save file (described later). LOAD PFN LOADS...command is used to make new data retrievals. READ PFN DIRECT ENTRY FROM A PREVIOUSLY SAVED FILE This command bypasses the conventional terminal entry by...INTERNAL SAVE FILE This command accesses a file created using the internal execution save file output option. Loading a file results in entering the
1983-12-01
MAIN ORG=NFGVB1.3266P //COPY PROC FILE=, MEM= // EXEC PGM=IEBGENER //SYSPRINT DD SYSOUT=A //SYSIN DD DUMMY //SYSUT1 DD...COPY,FILE=1,MEM=FL027 // EXEC COPY,FILE=2,MEM=A411IN // EXEC COPY,FILE=3,MEM=VWIN // EXEC COPY,FILE=4,MEM=A411A01...EXEC COPY,FILE=5,MEM=INTERE // EXEC COPY,FILE=6,MEM=A411PS // EXEC COPY,FILE=7,MEM=A411P1 // EXEC COPY,FILE
Treelink: data integration, clustering and visualization of phylogenetic trees.
Allende, Christian; Sohn, Erik; Little, Cedric
2015-12-29
Phylogenetic trees are central to a wide range of biological studies. In many of these studies, tree nodes need to be associated with a variety of attributes. For example, in studies concerned with viral relationships, tree nodes are associated with epidemiological information, such as location, age and subtype. Gene trees used in comparative genomics are usually linked with taxonomic information, such as functional annotations and events. A wide variety of tree visualization and annotation tools have been developed in the past; however, none of them are intended for an integrative and comparative analysis. Treelink is platform-independent software for linking datasets and sequence files to phylogenetic trees. The application allows an automated integration of datasets to trees for operations such as classifying a tree based on a field or showing the distribution of selected data attributes in branches and leaves. Genomic and proteomic sequences can also be linked to the tree and extracted from internal and external nodes. A novel clustering algorithm to simplify trees and display the most divergent clades was also developed, where validation can be achieved using the data integration and classification function. Integrated geographical information allows ancestral character reconstruction for phylogeographic plotting based on parsimony and likelihood algorithms. Our software can successfully integrate phylogenetic trees with different data sources, and perform operations to differentiate and visualize those differences within a tree. File support includes the most popular formats such as newick and csv. Exporting visualizations as images, cluster outputs and genomic sequences is supported. Treelink is available as a web and desktop application at http://www.treelinkapp.com.
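Treelink's core operation, joining a CSV dataset to tree leaves by name so attributes can drive classification and coloring, reduces to a simple join. A hedged sketch; the mini tree structure and CSV columns are invented for illustration and are not Treelink's internals.

```python
# Hedged sketch: attach CSV attributes to tree leaves by name, then
# group leaves by an attribute (the "classify by field" operation).
import csv, io

csv_text = "name,subtype,location\nvirA,B,Peru\nvirB,C,Chile\n"
attrs = {row["name"]: row for row in csv.DictReader(io.StringIO(csv_text))}

# Tiny tree as (name, children) tuples instead of a parsed newick file.
tree = ("root", [("virA", []), ("clade1", [("virB", [])])])

def annotate(node):
    name, children = node
    return (name, attrs.get(name, {}), [annotate(c) for c in children])

def leaves_by(field, node, out=None):
    out = {} if out is None else out
    name, data, children = node
    if not children and field in data:
        out.setdefault(data[field], []).append(name)
    for c in children:
        leaves_by(field, c, out)
    return out

print(leaves_by("subtype", annotate(tree)))  # {'B': ['virA'], 'C': ['virB']}
```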
Publications of the Western Earth Surface Processes Team 2000
Powell, Charles L.; Stone, Paul
2001-01-01
The Western Earth Surface Processes Team (WESPT) of the U.S. Geological Survey (USGS) conducts geologic mapping and related topical earth science studies in the western United States. This work is focused on areas where modern geologic maps and associated earth-science data are needed to address key societal and environmental issues such as ground-water quality, potential geologic hazards, and land-use decisions. Areas of primary emphasis in 2000 included southern California, the San Francisco Bay region, the Pacific Northwest, the Las Vegas urban corridor, and selected National Park lands. The team has its headquarters in Menlo Park, California, and maintains smaller field offices at several other locations in the western United States. The results of research conducted by the WESPT are released to the public as a variety of databases, maps, text reports, and abstracts, both through the internal publication system of the USGS and in diverse external publications such as scientific journals and books. This report lists publications of the WESPT released in 2000 as well as additional 1999 publications that were not included in the previous list (USGS Open-File Report 00-215). Most of the publications listed were authored or coauthored by WESPT staff. The list also includes some publications authored by non-USGS cooperators with the WESPT, as well as some authored by USGS staff outside the WESPT in cooperation with WESPT projects. Several of the publications listed are available on the World Wide Web; for these, URL addresses are provided. Many of these Web publications are USGS open-file reports that contain large digital databases of geologic map and related information.
[Digital teaching archive. Concept, implementation, and experiences in a university setting].
Trumm, C; Dugas, M; Wirth, S; Treitl, M; Lucke, A; Küttner, B; Pander, E; Clevert, D-A; Glaser, C; Reiser, M
2005-08-01
Film-based teaching files require a substantial investment in human, logistic, and financial resources. The combination of computer and network technology facilitates the workflow integration of distributing radiologic teaching cases within an institution (intranet) or via the World Wide Web (Internet). A digital teaching file (DTF) should include the following basic functions: image import from different sources and of different formats, editing of imported images, uniform case classification, quality control (peer review), controlled access for different user groups (in-house and external), and an efficient retrieval strategy. The Portable Network Graphics (PNG) image format is especially suitable for DTFs because of several features: pixel support, 2D interlacing, gamma correction, and lossless compression. The American College of Radiology (ACR) "Index for Radiological Diagnoses" is hierarchically organized and thus an ideal classification system for a DTF. Computer-based training (CBT) in radiology is described in numerous publications, from supplementing traditional learning methods to certified education via the Internet. The attractiveness of a CBT application can be increased by the integration of graphical and interactive elements, but this makes workflow integration of daily case input more difficult. Our DTF was built with established Internet instruments and integrated into a heterogeneous PACS/RIS environment. It facilitates a quick transfer (DICOM_Send) of selected images to the DTF at the time of interpretation, and provides access to the DTF application at any time anywhere within the university hospital intranet using a standard web browser. A DTF is a small but important building block in an institutional strategy of knowledge management.
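Since the record singles out lossless PNG as the storage format, here is a minimal sketch of the image-export step of such a teaching file. It assumes pydicom and Pillow are available and the DICOM uses an uncompressed transfer syntax; the file names and the naive window-free rescaling are illustrative only.

```python
# DICOM in, lossless 8-bit PNG out (window/level handling omitted).
import numpy as np
import pydicom
from PIL import Image

ds = pydicom.dcmread("case001.dcm")          # hypothetical teaching case
pixels = ds.pixel_array.astype(np.float32)

# Naive rescale to 8-bit grayscale; PNG then stores it losslessly.
pixels -= pixels.min()
if pixels.max() > 0:
    pixels *= 255.0 / pixels.max()

Image.fromarray(pixels.astype(np.uint8)).save("case001.png")
```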
Schnipper, Jeffrey Lawrence; Messler, Jordan; Ramos, Pedro; Kulasa, Kristen; Nolan, Ann; Rogers, Kendall
2014-01-01
Background: Insulin is a top source of adverse drug events in the hospital, and glycemic control is a focus of improvement efforts across the country. Yet, the majority of hospitals have no data to gauge their performance on glycemic control, hypoglycemia rates, or hypoglycemic management. Current tools to outsource glucometrics reports are limited in availability or function. Methods: Society of Hospital Medicine (SHM) faculty designed and implemented a web-based data and reporting center that calculates glucometrics on blood glucose data files securely uploaded by users. Unit labels, care type (critical care, non–critical care), and unit type (eg, medical, surgical, mixed, pediatrics) are defined on upload allowing for robust, flexible reporting. Reports for any date range, care type, unit type, or any combination of units are available on demand for review or downloading into a variety of file formats. Four reports with supporting graphics depict glycemic control, hypoglycemia, and hypoglycemia management by patient day or patient stay. Benchmarking and performance ranking reports are generated periodically for all hospitals in the database. Results: In all, 76 hospitals have uploaded at least 12 months of data for non–critical care areas and 67 sites have uploaded critical care data. Critical care benchmarking reveals wide variability in performance. Some hospitals achieve top quartile performance in both glycemic control and hypoglycemia parameters. Conclusions: This new web-based glucometrics data and reporting tool allows hospitals to track their performance with a flexible reporting system, and provides them with external benchmarking. Tools like this help to establish standardized glucometrics and performance standards. PMID:24876426
Maynard, Greg; Schnipper, Jeffrey Lawrence; Messler, Jordan; Ramos, Pedro; Kulasa, Kristen; Nolan, Ann; Rogers, Kendall
2014-07-01
Insulin is a top source of adverse drug events in the hospital, and glycemic control is a focus of improvement efforts across the country. Yet, the majority of hospitals have no data to gauge their performance on glycemic control, hypoglycemia rates, or hypoglycemic management. Current tools to outsource glucometrics reports are limited in availability or function. Society of Hospital Medicine (SHM) faculty designed and implemented a web-based data and reporting center that calculates glucometrics on blood glucose data files securely uploaded by users. Unit labels, care type (critical care, non-critical care), and unit type (eg, medical, surgical, mixed, pediatrics) are defined on upload allowing for robust, flexible reporting. Reports for any date range, care type, unit type, or any combination of units are available on demand for review or downloading into a variety of file formats. Four reports with supporting graphics depict glycemic control, hypoglycemia, and hypoglycemia management by patient day or patient stay. Benchmarking and performance ranking reports are generated periodically for all hospitals in the database. In all, 76 hospitals have uploaded at least 12 months of data for non-critical care areas and 67 sites have uploaded critical care data. Critical care benchmarking reveals wide variability in performance. Some hospitals achieve top quartile performance in both glycemic control and hypoglycemia parameters. This new web-based glucometrics data and reporting tool allows hospitals to track their performance with a flexible reporting system, and provides them with external benchmarking. Tools like this help to establish standardized glucometrics and performance standards. © 2014 Diabetes Technology Society.
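A minimal sketch of one glucometric such a reporting center could compute from an uploaded blood glucose file: the percentage of patient-days with at least one hypoglycemic value, taken here as below 70 mg/dL. The pandas-based code and the column names of the hypothetical CSV upload are assumptions, not the SHM system's actual implementation.

```python
# Percent of patient-days with any glucose value below 70 mg/dL.
import pandas as pd

df = pd.read_csv("glucose_upload.csv", parse_dates=["date"])
by_day = df.groupby(["patient_id", df["date"].dt.date])["glucose_mgdl"]
hypo_days = by_day.min().lt(70)         # True where the day had hypoglycemia
print(f"Hypoglycemia on {100 * hypo_days.mean():.1f}% of patient-days")
```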
A new insight into the oscillation characteristics of endosonic files used in dentistry.
Lea, S C; Walmsley, A D; Lumley, P J; Landini, G
2004-05-21
The aim of this study was to assess the oscillation characteristics of unconstrained endosonic files using a scanning laser vibrometer (SLV). Factors investigated included file vibration frequency and node/antinode location, as well as the variation in file displacement amplitude with increasing generator power setting. A 30 kHz Mini Piezon generator (Electro-Medical Systems, Switzerland) was used in conjunction with a #15 and a #35 K-file. Each file was fixed in position with the long axis of the file perpendicular to the SLV camera head. The laser from the SLV was scanned over the length of the oscillating file for generator power settings 1 to 5 (minimum to half power). Measurements were repeated ten times. The fundamental vibration frequency for both files was 27.50 kHz. Scans of each file showed the positions of nodes/antinodes along the file length. The #15 file demonstrated no significant variation in its mean maximum displacement amplitude with increasing generator power, except at power setting 5, where a decrease in displacement amplitude was observed. The #35 file showed a general increase in mean maximum displacement amplitude with increasing power setting, except at power setting 4, where a 65% decrease in displacement amplitude occurred. In conclusion, scanning laser vibrometry is an effective method for assessing endosonic file vibration characteristics. The SLV was able to demonstrate that (unloaded) file vibration displacement amplitude does not increase linearly with increasing generator power. Further work is being performed on a greater variety of files and generators. Vibration characteristics of files under various loads and varying degrees of constraint should also be investigated.
Analyzing endosonic root canal file oscillations: an in vitro evaluation.
Lea, Simon C; Walmsley, A Damien; Lumley, Philip J
2010-05-01
Passive ultrasonic irrigation may be used to improve bacterial reduction within the root canal. The technique relies on a small file being driven to oscillate freely within the canal and activating an irrigant solution through biophysical forces such as microstreaming. There is limited information regarding a file's oscillation patterns when operated while surrounded by fluid, as is the case within a root canal. Files of different sizes (#10 and #30, 27 mm and 31 mm) were connected to an ultrasound generator via a 120° file holder. Files were immersed in a water bath, and a laser vibrometer was set up with measurement lines superimposed over the files. The laser vibrometer was scanned over the oscillating files. Measurements were repeated 10 times for each file/power setting used. File mode shapes comprise a series of nodes/antinodes, with thinner, longer files producing more antinodes. The maximum vibration occurred at the free end of the file. Increasing generator power had no significant effect on this maximum amplitude (p > 0.20). Maximum displacement amplitudes were 17 to 22 μm (#10 file, 27 mm), 15 to 21 μm (#10 file, 31 mm), 6 to 9 μm (#30 file, 27 mm), and 5 to 7 μm (#30 file, 31 mm) across all power settings. Antinodes occurring along the remaining file length were significantly larger at generator power 1 than at powers 2 through 5 (p < 0.03). At higher generator powers, energy delivered to the file is dissipated in unwanted vibration, resulting in reduced vibration displacement amplitudes. This may reduce the occurrence of the biophysical forces necessary to maximize the technique's effectiveness. Copyright (c) 2010 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-13
... and Customs Enforcement, Customs and Border Protection--001 Alien File, Index, and National File... Services, Immigration and Customs Enforcement, and Customs and Border Protection--001 Alien File, Index... border protection processes. The Alien File (A-File), Index, and National File Tracking System of Records...
76 FR 30331 - Combined Notice of Filings No. 1
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-25
...: Docket Numbers: RP11-2102-000. Applicants: Big Sandy Pipeline, LLC. Description: Big Sandy Pipeline, LLC submits tariff filing per 154.204: Big Sandy Negotiated Rate Agreement Filing to be effective 6/1/2011... tariff filing per 154.204: Filing to Remove Expired Agreements to be effective 6/12/2011. Filed Date: 05...
29 CFR 24.103 - Filing of retaliation complaint.
Code of Federal Regulations, 2011 CFR
2011-07-01
... be reduced to writing by OSHA. If a complainant is not able to file the complaint in English, the complaint may be filed in any language. (c) Place of Filing. The complaint should be filed with the OSHA... resides or was employed, but may be filed with any OSHA officer or employee. Addresses and telephone...
39 CFR 3001.10 - Form and number of copies of documents.
Code of Federal Regulations, 2014 CFR
2014-07-01
... document filed with the Commission must be submitted through Filing Online by an account holder, unless a... Filing Online. (3) The form of documents filed as library references is governed by § 3001.31(b)(2)(iv). (4) Documents filed online must satisfy Filing Online system compatibility requirements specified by...
39 CFR 3001.10 - Form and number of copies of documents.
Code of Federal Regulations, 2013 CFR
2013-07-01
... document filed with the Commission must be submitted through Filing Online by an account holder, unless a... Filing Online. (3) The form of documents filed as library references is governed by § 3001.31(b)(2)(iv). (4) Documents filed online must satisfy Filing Online system compatibility requirements specified by...
29 CFR 4007.3 - Filing requirement; method of filing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 29 Labor 9 2010-07-01 2010-07-01 false Filing requirement; method of filing. 4007.3 Section 4007.3... PREMIUMS § 4007.3 Filing requirement; method of filing. (a) In general. The estimation, determination... Web site (http://www.pbgc.gov). Subject to the provisions of § 4007.13, the plan administrator of each...
76 FR 43679 - Filing via the Internet; Notice of Additional File Formats for efiling
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-21
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM07-16-000] Filing via the Internet; Notice of Additional File Formats for efiling Take notice that the Commission has added to its list of acceptable file formats the four-character file extensions for Microsoft Office 2007/2010...
Code of Federal Regulations, 2010 CFR
2010-04-01
... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Filing date. 217.16 Section 217.16 Employees... LUMP SUM Filing An Application § 217.16 Filing date. An application filed in a manner and form acceptable to the Board is officially filed with the Board on the earliest of the following dates: (a) On the...
18 CFR 385.2001 - Filings (Rule 2001).
Code of Federal Regulations, 2010 CFR
2010-04-01
... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Filings (Rule 2001... Filings in Proceedings Before the Commission § 385.2001 Filings (Rule 2001). (a) Filings with the... filing via the Internet pursuant to Rule 2003 through the links provided at http://www.ferc.gov. Note to...
49 CFR 1104.6 - Timely filing required.
Code of Federal Regulations, 2010 CFR
2010-10-01
... offers next day delivery to Washington, DC. If the e-filing option is chosen (for those pleadings and documents that are appropriate for e-filing, as determined by reference to the information on the Board's Web site), then the e-filed pleading or document is timely filed if the e-filing process is completed...
10 CFR 110.89 - Filing and service.
Code of Federal Regulations, 2010 CFR
2010-01-01
...: Rulemakings and Adjudications Staff or via the E-Filing system, following the procedure set forth in 10 CFR 2.302. Filing by mail is complete upon deposit in the mail. Filing via the E-Filing system is completed... residence with some occupant of suitable age and discretion; (2) Following the requirements for E-Filing in...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-04
... filings must be submitted in accordance with the Commission's electronic tariff filing (eTariff... eTariff Filing Title field and in the Description field in eFiling. DATES: Effective on October 4... electronic tariff filing (eTariff) requirements in Electronic Tariff Filings, Order No. 714, FERC Stats...
49 CFR 1104.1 - Address, identification, and electronic filing option.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 2 of 4” and so forth). (e) Persons filing pleadings and documents with the Board have the option of electronically filing (e-filing) certain types of pleadings and documents instead of filing paper copies. Details regarding the types of pleadings and documents eligible for e-filing, the procedures to be followed, and...
75 FR 70733 - Combined Notice of Filings #1
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-18
... Corporation submits tariff filing per 35: 2010-10-29 CAISO MSG Transition Cost Compliance to be effective 11.... Description: Alcan Power Marketing, Inc. submits tariff filing per 35.12: Baseline Filing to be effective 11/1... Compliance Filing to be effective 9/30/2010. Filed Date: 10/29/2010. Accession Number: 20101029-5211. Comment...
Long-Term file activity patterns in a UNIX workstation environment
NASA Technical Reports Server (NTRS)
Gibson, Timothy J.; Miller, Ethan L.
1998-01-01
As mass storage technology becomes more affordable for sites smaller than supercomputer centers, understanding their file access patterns becomes crucial for developing systems to store rarely used data on tertiary storage devices such as tapes and optical disks. This paper presents a new way to collect and analyze file system statistics for UNIX-based file systems. The collection system runs in user-space and requires no modification of the operating system kernel. The statistics package provides details about file system operations at the file level: creations, deletions, modifications, etc. The paper analyzes four months of file system activity on a university file system. The results confirm previously published results gathered from supercomputer file systems, but differ in several important areas. Files in this study were considerably smaller than those at supercomputer centers, and they were accessed less frequently. Additionally, the long-term creation rate on workstation file systems is sufficiently low so that all data more than a day old could be cheaply saved on a mass storage device, allowing the integration of time travel into every file system.
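A user-space sketch in the spirit of this collection approach: no kernel modification, just file-level metadata gathered by walking the tree. The root path and the two summary statistics are placeholders, and the paper's actual package recorded per-operation events (creations, deletions, modifications) rather than this point-in-time scan.

```python
# User-space collection of file-level statistics, with no kernel changes.
import os
import time

now = time.time()
sizes, idle_days = [], []
for dirpath, dirnames, filenames in os.walk("/home"):
    for name in filenames:
        try:
            st = os.stat(os.path.join(dirpath, name))
        except OSError:
            continue                    # file vanished or is unreadable
        sizes.append(st.st_size)
        idle_days.append((now - st.st_atime) / 86400)

if sizes:
    print(f"{len(sizes)} files, mean size {sum(sizes) / len(sizes):.0f} bytes")
    print(f"mean days since last access: {sum(idle_days) / len(idle_days):.1f}")
```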
Torsion and bending properties of shape memory and superelastic nickel-titanium rotary instruments.
Ninan, Elizabeth; Berzins, David W
2013-01-01
Recently introduced into the market are shape memory nickel-titanium (NiTi) rotary files. The objective of this study was to investigate the torsion and bending properties of shape memory files (CM Wire, HyFlex CM, and Phoenix Flex) and compare them with conventional (ProFile ISO and K3) and M-Wire (GT Series X and ProFile Vortex) NiTi files. Sizes 20, 30, and 40 (n = 12/size/taper) of 0.02 taper CM Wire, Phoenix Flex, K3, and ProFile ISO and 0.04 taper HyFlex CM, ProFile ISO, GT Series X, and Vortex were tested in torsion and bending per ISO 3630-1 guidelines by using a torsiometer. All data were statistically analyzed by analysis of variance and the Tukey-Kramer test (P = .05) to determine any significant differences between the files. Significant interactions were present between the factors of size and file type. Variability in maximum torque values was noted among the shape memory file brands, which sometimes exhibited the greatest or the least torque depending on brand, size, and taper. In general, the shape memory files showed a high angle of rotation before fracture but were not statistically different from some of the other files. However, the shape memory files were more flexible, as evidenced by significantly lower bending moments (P < .008). Shape memory files show greater flexibility compared with several other NiTi rotary file brands. Copyright © 2013 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
77 FR 38279 - Combined Notice of Filings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-27
.... Description: CO2 Gas Quality Settlement Filing of Wyoming Interstate Company, LLC. Filed Date: 6/11/12.... Description: Fuel Filing to be effective 7/1/2012. Filed Date: 6/20/12. Accession Number: 20120620-5118...
30 CFR 865.12 - Procedures for filing an application for review of discrimination.
Code of Federal Regulations, 2011 CFR
2011-07-01
... for filing an application for review of discrimination. (a) Who may file. Any employee, or any... violation of § 865.11(a) of this part may file an application for review. For the purpose of these... alleged discrimination. (b) Where to file. The employee or representative may file the application for...
30 CFR 865.12 - Procedures for filing an application for review of discrimination.
Code of Federal Regulations, 2010 CFR
2010-07-01
... for filing an application for review of discrimination. (a) Who may file. Any employee, or any... violation of § 865.11(a) of this part may file an application for review. For the purpose of these... alleged discrimination. (b) Where to file. The employee or representative may file the application for...
77 FR 34943 - Combined Notice of Filings #2
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-12
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 2 Take notice... Texas North Company submits tariff filing per 35.13(a)(2)(iii: TNC-Texas New Mexico Power Amd. 3 to IA... filing per 35.13(a)(2)(iii: Reactive Filing to be effective 12/31/9998. Filed Date: 6/5/12. Accession...
77 FR 74839 - Combined Notice of Filings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-18
..., LP. Description: National Grid LNG, LP submits tariff filing per 154.203: Adoption of NAESB Version 2... with Order to Amend NAESB Version 2.0 Filing to be effective 12/1/2012. Filed Date: 12/11/12. Accession...: Refile to comply with Order on NAESB Version 2.0 Filing to be effective 12/1/2012. Filed Date: 12/11/12...
39 CFR 3001.10 - Form and number of copies of documents.
Code of Federal Regulations, 2012 CFR
2012-07-01
... document filed with the Commission must be submitted through Filing Online by an account holder, unless a... Filing Online. (3) The form of documents filed as library references is governed by § 3001.31(b)(2)(iv... filed both online and in hardcopy form pursuant to paragraph (b) of this section. (5) Documents filed...
39 CFR 3001.10 - Form and number of copies of documents.
Code of Federal Regulations, 2011 CFR
2011-07-01
... document filed with the Commission must be submitted through Filing Online by an account holder, unless a... Filing Online. (3) The form of documents filed as library references is governed by § 3001.31(b)(2)(iv... filed both online and in hardcopy form pursuant to paragraph (b) of this section. (5) Documents filed...
48 CFR 6101.2 - Filing cases; time limits for filing; notice of docketing; consolidation [Rule 2].
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 7 2012-10-01 2012-10-01 false Filing cases; time limits... 6101.2 Filing cases; time limits for filing; notice of docketing; consolidation [Rule 2]. (a) Filing... name, address, telephone number, facsimile machine number, and e-mail address, if available, of the...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-21
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket Nos. RM07-16-000; RM01-5-000; RM12-3-000] Filing via the Internet; Electronic Tariff Filings; Revisions to Electric Quarterly Report Filing Process; Notice of Technical Conference Take notice that on April 16, 2013, the staff of the...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-04
...-5-000] Revisions to Procedural Regulations Governing Filing, Indexing and Service by Oil Pipelines, Electronic Tariff Filings; Notice of Changes to eTariff Part 341 Type of Filing Codes Order No. 780... available eTariff Type of Filing Codes (TOFC) will be modified as follows: \\2\\ \\1\\ Filing, Indexing and...
5 CFR 1201.22 - Filing an appeal and responses to appeals.
Code of Federal Regulations, 2010 CFR
2010-01-01
... commercial or personal delivery, by facsimile, by mail, or by electronic filing under § 1201.14. (e) Filing a... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Filing an appeal and responses to appeals....22 Filing an appeal and responses to appeals. (a) Place of filing. Appeals, and responses to those...
Code of Federal Regulations, 2010 CFR
2010-07-01
... document shall be filed. (e) Filing date. (1) Except for the documents listed in paragraph (e)(2) of this... 29 Labor 9 2010-07-01 2010-07-01 false Filing. 2200.8 Section 2200.8 Labor Regulations Relating to... § 2200.8 Filing. (a) What to file. All papers required to be served on a party or intervenor, except for...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-27
... NRC E-filing system. Requests for a hearing and petitions for leave to intervene should be filed in.../ . IV. Electronic Submissions (E-Filing) All documents filed in NRC adjudicatory proceedings, including... NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires participants to submit...
22 CFR 123.22 - Filing, retention, and return of export licenses and filing of export information.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 22 Foreign Relations 1 2014-04-01 2014-04-01 false Filing, retention, and return of export licenses and filing of export information. 123.22 Section 123.22 Foreign Relations DEPARTMENT OF STATE....22 Filing, retention, and return of export licenses and filing of export information. (a) Any export...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-12
... Systems, Inc.; Notice of Intent To File License Application, Filing of Pre-Application Document, and Approving Use of the Traditional Licensing Process a. Type of Filing: Notice of Intent to File License...: November 11, 2012. d. Submitted by: Aquenergy Systems, Inc., a fully owned subsidiary of Enel Green Power...
49 CFR 31.26 - Filing, form, and service of papers.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 1 2011-10-01 2011-10-01 false Filing, form, and service of papers. 31.26 Section... Filing, form, and service of papers. (a) Filing and form. (1) A party filing any document under this part... paper filed in the proceeding shall contain a caption setting forth the title of the action, the case...
6 CFR 13.26 - Filing, form and service of papers.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 6 Domestic Security 1 2011-01-01 2011-01-01 false Filing, form and service of papers. 13.26... CIVIL REMEDIES § 13.26 Filing, form and service of papers. (a) Filing and form. (1) Documents filed with the ALJ will include an original and two copies. (2) Every pleading and paper filed in the proceeding...
12 CFR 908.25 - Filing of papers.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 12 Banks and Banking 7 2011-01-01 2011-01-01 false Filing of papers. 908.25 Section 908.25 Banks... RULES OF PRACTICE AND PROCEDURE IN HEARINGS ON THE RECORD General Rules § 908.25 Filing of papers. (a) Filing. Any papers required to be filed shall be addressed to the presiding officer and filed with the...
12 CFR 908.25 - Filing of papers.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Filing of papers. 908.25 Section 908.25 Banks... RULES OF PRACTICE AND PROCEDURE IN HEARINGS ON THE RECORD General Rules § 908.25 Filing of papers. (a) Filing. Any papers required to be filed shall be addressed to the presiding officer and filed with the...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-28
..., Inc.; Notice of Intent To File License Application, Filing of Pre-Application Document, and Approving... Application and Request to Use the Traditional Licensing Process. b. Project No.: 14432-000. c. Date Filed... Endangered Species Act. m. Archon filed a Pre-Application Document (PAD) with the Commission, pursuant to 18...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-10
... Hydro LLC; Notice of Intent To File License Application, Filing of Pre-Application Document (PAD... Application for a New License and Commencing Pre-filing Process. b. Project No.: 2531-067. c. Dated Filed... Commission a Pre-Application Document (PAD; including a proposed process plan and schedule), pursuant to 18...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-10
..., LLC; Notice of Intent To File License Application, Filing of Pre-Application Document, and Approving... Application and Request to Use the Traditional Licensing Process. b. Project No.: 13755-001. c. Date Filed.... m. Free Flow Power filed a Pre-Application Document (PAD; including a proposed process plan and...
Musale, P K; Mujawar, S A V
2014-04-01
This in vitro study aimed to compare rotary ProFile, ProTaper and Hero Shaper files with hand K-files in terms of shaping ability, cleaning efficacy, preparation time and instrument distortion in primary molars. Sixty extracted primary mandibular second molars were divided into four equal groups: Group I, K-file; Group II, ProFile; Group III, ProTaper file; and Group IV, Hero Shaper file. The shaping ability was determined by comparing pre- and post-instrumentation CBCT scans, and the data were analysed with the SPSS program using the Chi-square test. Cleaning efficacy was evaluated by the degree of India ink removal from the canal walls under stereomicroscopy. Instrumentation times were calculated for each tooth, and instrument distortion was visually checked and duly noted. The cleaning efficacy and instrumentation time were compared using ANOVA with Tukey's correction. Instrument distortion was analysed using the Chi-square test. The canal taper was significantly more conical for rotary files than for K-files by the Chi-square test (p < 0.05). Cleaning efficacy of the rotary files, with average scores of 0.68 (Group II), 0.48 (Group III) and 0.58 (Group IV), was significantly better than that of K-files (Group I, 0.93) (p < 0.05). Mean instrumentation time with K-files (20.7 min) was significantly higher than with rotary files (Group II, 8.9; Group III, 5.6; Group IV, 8.1 min) (p < 0.05). Instrument distortion was observed in Group I (4.3%), while none of the rotary files were distorted. Rotary files prepared more conical canals in primary teeth than manual instruments. Reduced preparation time with rotary files enhances patient cooperation, especially in young children.
Cyclic fatigue of three types of rotary nickel-titanium files in a dynamic model.
Yao, James H; Schwartz, Scott A; Beeson, Thomas J
2006-01-01
The cyclic fatigue resistance of three types of nickel-titanium rotary files was compared in a model using reciprocating axial movement. The influence of file size and taper was also investigated, and fracture patterns were examined under SEM. The 10 experimental groups consisted of ProFiles, K3s, and RaCe files, size 25 in .04 and .06 tapers, as well as ProFiles and K3s, size 40 in .04 and .06 tapers. Each file was rotated freely at 300 rpm inside a stainless steel tube with a 60° angle and a 5-mm radius of curvature. A continuous 3 mm oscillating axial motion was applied at 1 cycle per second by attaching an electric dental handpiece to the most inferior load cell of an Instron machine using a custom-made jig. The number of rotations to failure was determined and analyzed using analysis of variance and Tukey's post hoc tests. Overall, K3 25/.04 files were significantly more resistant to cyclic fatigue than any other group in this study. In the 25/.04 category, K3s were significantly more resistant to failure than ProFiles and RaCe files. Also in the same category, ProFiles significantly outlasted RaCe files. In the 25/.06 group, K3s and ProFiles were significantly more resistant to failure than RaCe files, but K3s were not significantly different from ProFiles. In the 40/.04 and 40/.06 groups, K3s were significantly more resistant to cyclic fatigue than ProFiles. SEM observations demonstrated mostly a ductile mode of fracture. The results suggest that different cross-sectional designs, diameters, and tapers all contribute to a nickel-titanium instrument's vulnerability to cyclic failure.
Las Palmeras Molecular Dynamics: A flexible and modular molecular dynamics code
NASA Astrophysics Data System (ADS)
Davis, Sergio; Loyola, Claudia; González, Felipe; Peralta, Joaquín
2010-12-01
Las Palmeras Molecular Dynamics (LPMD) is a highly modular and extensible molecular dynamics (MD) code using interatomic potential functions. LPMD is able to perform equilibrium MD simulations of bulk crystalline solids, amorphous solids and liquids, as well as non-equilibrium MD (NEMD) simulations such as shock wave propagation, projectile impacts, cluster collisions, shearing, deformation under load, heat conduction, heterogeneous melting, among others, which involve unusual MD features like non-moving atoms and walls, unstoppable atoms with constant velocity, and external forces like electric fields. LPMD is written in C++ as a compromise between efficiency and clarity of design, and its architecture is based on separate components or plug-ins, implemented as modules which are loaded on demand at runtime. The advantage of this architecture is the ability to link together the desired components involved in the simulation in different ways at runtime, using a user-friendly control file language which describes the simulation workflow. As an added bonus, the plug-in API (Application Programming Interface) makes it possible to use the LPMD components to analyze data coming from other simulation packages, convert between input file formats, apply different transformations to saved MD atomic trajectories, and visualize dynamical processes either in real time or as a post-processing step. Individual components, such as a new potential function, a new integrator, a new file format, new properties to calculate, new real-time visualizers, and even a new algorithm for handling neighbor lists, can be easily coded, compiled and tested within LPMD by virtue of its object-oriented API, without the need to modify the rest of the code. LPMD already includes several pair potential functions such as Lennard-Jones, Morse, Buckingham, MCY and the harmonic potential, as well as embedded-atom model (EAM) functions such as the Sutton-Chen and Gupta potentials. Integrators to choose from include Euler (if only for demonstration purposes), Verlet and Velocity Verlet, Leapfrog and Beeman, among others. Electrostatic forces are treated as another potential function, by default using the plug-in implementing the Ewald summation method.
Program summary
Program title: LPMD
Catalogue identifier: AEHG_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHG_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU General Public License version 3
No. of lines in distributed program, including test data, etc.: 509 490
No. of bytes in distributed program, including test data, etc.: 6 814 754
Distribution format: tar.gz
Programming language: C++
Computer: 32-bit and 64-bit workstations
Operating system: UNIX
RAM: Minimum 1024 bytes
Classification: 7.7
External routines: zlib, OpenGL
Nature of problem: Study of the statistical mechanics and thermodynamics of condensed matter systems, as well as the kinetics of non-equilibrium processes in the same systems.
Solution method: Equilibrium and non-equilibrium molecular dynamics methods, Monte Carlo methods.
Restrictions: Rigid molecules are not supported; neither are polarizable atoms or chemical bonds (proteins).
Unusual features: The program is able to change the temperature of the simulation cell, the pressure, cut regions of the cell, and color the atoms by properties, even during the simulation. It is also possible to fix the positions and/or velocities of groups of atoms. Atoms and some physical properties can be visualized during the simulation.
Additional comments: The program does not only perform molecular dynamics and Monte Carlo simulations; it is also able to filter and manipulate atomic configurations, read and write different file formats, convert between them, and evaluate different structural and dynamical properties.
Running time: 50 seconds for a 1000-step simulation of 4000 argon atoms, running on a single 2.67 GHz Intel processor.
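LPMD's plug-in loading is implemented in C++. Purely as a generic illustration of the load-on-demand pattern the record describes, and emphatically not LPMD's actual API or control file syntax, a sketch in Python with hypothetical module and hook names:

```python
# A control file maps roles to modules, imported and wired up at runtime.
import importlib

control = {
    "potential": "plugins.lennard_jones",
    "integrator": "plugins.velocity_verlet",
}

components = {}
for role, module_name in control.items():
    module = importlib.import_module(module_name)  # loaded only when named
    components[role] = module.create()             # each plug-in exposes create()

# The simulation loop talks to duck-typed interfaces, so swapping a
# component means editing the control file, not recompiling the core.
forces = components["potential"]       # expected to provide .compute(...)
stepper = components["integrator"]     # expected to provide .step(...)
```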
Collective operations in a file system based execution model
Shinde, Pravin; Van Hensbergen, Eric
2013-02-12
A mechanism is provided for group communications using a MULTI-PIPE synthetic file system. A master application creates a multi-pipe synthetic file in the MULTI-PIPE synthetic file system, the master application indicating a multi-pipe operation to be performed. The master application then writes a header-control block of the multi-pipe synthetic file specifying at least one of a multi-pipe synthetic file system name, a message type, a message size, a specific destination, or a specification of the multi-pipe operation. Any other application participating in the group communications then opens the same multi-pipe synthetic file. A MULTI-PIPE file system module then implements the multi-pipe operation as identified by the master application. The master application and the other applications then either read or write operation messages to the multi-pipe synthetic file and the MULTI-PIPE synthetic file system module performs appropriate actions.
Collective operations in a file system based execution model
Shinde, Pravin; Van Hensbergen, Eric
2013-02-19
A mechanism is provided for group communications using a MULTI-PIPE synthetic file system. A master application creates a multi-pipe synthetic file in the MULTI-PIPE synthetic file system, the master application indicating a multi-pipe operation to be performed. The master application then writes a header-control block of the multi-pipe synthetic file specifying at least one of a multi-pipe synthetic file system name, a message type, a message size, a specific destination, or a specification of the multi-pipe operation. Any other application participating in the group communications then opens the same multi-pipe synthetic file. A MULTI-PIPE file system module then implements the multi-pipe operation as identified by the master application. The master application and the other applications then either read or write operation messages to the multi-pipe synthetic file and the MULTI-PIPE synthetic file system module performs appropriate actions.
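A rough sketch of the group-communication protocol described in these two records, assuming the MULTI-PIPE synthetic file system is mounted at a hypothetical path. The JSON header stands in for the header-control block; the patents do not fix an on-disk format here.

```python
# Master writes a header-control block, then messages; peers read the same
# synthetic file and the file system module performs the actual operation.
import json

PIPE = "/mnt/multipipe/bcast0"          # hypothetical synthetic file

# Master application: create the multi-pipe file and write the header.
with open(PIPE, "w") as f:
    header = {"op": "broadcast", "msg_type": "text",
              "msg_size": 64, "dest": "all"}
    f.write(json.dumps(header) + "\n")
    f.write("step 42 complete\n")       # an operation message

# Any participating application: open the same file and read messages.
with open(PIPE) as f:
    header = json.loads(f.readline())
    for message in f:
        print(header["op"], "->", message.strip())
```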
Smartfiles: An OO approach to data file interoperability
NASA Technical Reports Server (NTRS)
Haines, Matthew; Mehrotra, Piyush; Vanrosendale, John
1995-01-01
Data files for scientific and engineering codes typically consist of a series of raw data values whose descriptions are buried in the programs that interact with these files. In this situation, making even minor changes in the file structure, or sharing files between programs (interoperability), can only be done after careful examination of the data file and the I/O statements of the programs interacting with it. In short, scientific data files lack self-description, and other self-describing data techniques are not always appropriate or useful for scientific data files. By applying an object-oriented methodology to data files, we can add the intelligence required to improve data interoperability and provide an elegant mechanism for supporting complex, evolving, or multidisciplinary applications, while still supporting legacy codes. As a result, scientists and engineers should be able to share datasets with far greater ease, simplifying multidisciplinary applications and greatly facilitating remote collaboration between scientists.
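As a rough illustration of self-description in general (not the Smartfiles design itself), the sketch below prepends a small schema header to the raw values so that a reader can interpret the file without consulting the writer's source code. The field names and file name are invented.

```python
# Self-describing data file: schema header first, raw records after.
import json
import struct

schema = {"fields": [["time", "d"], ["pressure", "d"], ["mach", "d"]]}
rows = [(0.0, 101325.0, 0.30), (0.1, 101200.0, 0.31)]

with open("run1.dat", "wb") as f:
    header = json.dumps(schema).encode()
    f.write(struct.pack("<I", len(header)))        # header length, little-endian
    f.write(header)
    fmt = "<" + "".join(t for _, t in schema["fields"])
    for row in rows:
        f.write(struct.pack(fmt, *row))

# Any reader recovers the record layout from the header alone.
with open("run1.dat", "rb") as f:
    (hlen,) = struct.unpack("<I", f.read(4))
    layout = json.loads(f.read(hlen))
    print("columns:", [name for name, _ in layout["fields"]])
```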
DMFS: A Data Migration File System for NetBSD
NASA Technical Reports Server (NTRS)
Studenmund, William
2000-01-01
I have recently developed DMFS, a Data Migration File System, for NetBSD. This file system provides kernel support for the data migration system being developed by my research group at NASA/Ames. The file system utilizes an underlying file store to provide the file backing, and coordinates user and system access to the files. It stores its internal metadata in a flat file, which resides on a separate file system. This paper will first describe our data migration system to provide a context for DMFS, then it will describe DMFS. It also will describe the changes to NetBSD needed to make DMFS work. Then it will give an overview of the file archival and restoration procedures, and describe how some typical user actions are modified by DMFS. Lastly, the paper will present simple performance measurements which indicate that there is little performance loss due to the use of the DMFS layer.
Storage of sparse files using parallel log-structured file system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bent, John M.; Faibish, Sorin; Grider, Gary
A sparse file is stored without holes by storing a data portion of the sparse file using a parallel log-structured file system; and generating an index entry for the data portion, the index entry comprising a logical offset, physical offset and length of the data portion. The holes can be restored to the sparse file upon a reading of the sparse file. The data portion can be stored at a logical end of the sparse file. Additional storage efficiency can optionally be achieved by (i) detecting a write pattern for a plurality of the data portions and generating a single patterned index entry for the plurality of the patterned data portions; and/or (ii) storing the patterned index entries for a plurality of the sparse files in a single directory, wherein each entry in the single directory comprises an identifier of a corresponding sparse file.
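A toy sketch of the hole-free layout described: data portions are appended to a log file and each one receives an index entry (logical offset, physical offset, length), from which holes are restored as zeros on read. File names and the in-memory index are illustrative, and the patterned-index optimization is omitted.

```python
# Log-structured storage of a sparse file without holes.
index = []                              # (logical_off, physical_off, length)

with open("sparse.log", "wb") as log:
    def write_portion(logical_off, data):
        physical_off = log.tell()       # always append, so no holes are stored
        log.write(data)
        index.append((logical_off, physical_off, len(data)))

    write_portion(0, b"header")
    write_portion(1 << 30, b"tail")     # the ~1 GiB hole costs no space

def read_range(logical_off, length, path="sparse.log"):
    """Read a logical range, restoring holes as zero bytes."""
    out = bytearray(length)
    with open(path, "rb") as log:
        for l_off, p_off, n in index:
            lo = max(l_off, logical_off)
            hi = min(l_off + n, logical_off + length)
            if lo < hi:
                log.seek(p_off + (lo - l_off))
                out[lo - logical_off:hi - logical_off] = log.read(hi - lo)
    return bytes(out)

assert read_range(0, 6) == b"header"
```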
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sjaardema, Gregory
2010-08-06
Conjoin is a code for joining, sequentially in time, multiple exodusII database files. It is used to create a single results or restart file from multiple results or restart files, which typically arise as the result of multiple restarted analyses. The resulting output file will be the union of the input files, with a status variable indicating the status of each element at the various time planes. Typical uses are combining multiple exodusII files arising from a restarted analysis, or combining multiple exodusII files arising from a finite element analysis with dynamic topology changes.
75 FR 62522 - Combined Notice of Filings No. 3
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-12
... filing per 154.203: NAESB EDI Form Filing to be effective 11/1/2010. Filed Date: 09/30/2010. Accession....9 EDI Form to be effective 11/1/2010. Filed Date: 09/30/2010. Accession Number: 20100930-5348...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-03
... Commission strongly encourages electronic filing, documents may also be paper-filed. To paper-file, mail an... needed please contact Mr. David Pryor, Senior Environmental Scientist--California State Parks, at dpryor...
76 FR 21727 - Combined Notice of Filings No. 2
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-18
... 154.205(b): EnCana Amendment Filing to be effective 3/11/2011. Filed Date: 03/29/2011. Accession...-Wide Rate Case Motion Filing to be effective 4/1/2011. Filed Date: 04/01/2011. Accession Number...
How to Handle the Avalanche of Online Documentation.
ERIC Educational Resources Information Center
Nolan, Maureen P.
1981-01-01
The method of handling the printed documentation associated with online information retrieval, which is described, involves the use of a series of separate but related files: database files, system files, network files, index sheets, and equipment files. (FM)
75 FR 49923 - Combined Notice of Filings #1
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-16
... filing per 35.12: KCP&L-GMO Baseline Filing (Market-Based Volume 28) to be effective 8/2/2010. Filed Date...: KCP&L Greater Missouri Operations Company submits tariff filing per 35.12: GMO Volume 33 (Cost-Based...
17 CFR 274.402 - Form ID, uniform application for access codes to file on EDGAR.
Code of Federal Regulations, 2010 CFR
2010-04-01
... for access codes to file on EDGAR. 274.402 Section 274.402 Commodity and Securities Exchanges... Forms for Electronic Filing § 274.402 Form ID, uniform application for access codes to file on EDGAR..., filing agent or training agent to log on to the EDGAR system, submit filings, and change its CCC. (d...
17 CFR 274.402 - Form ID, uniform application for access codes to file on EDGAR.
Code of Federal Regulations, 2011 CFR
2011-04-01
... for access codes to file on EDGAR. 274.402 Section 274.402 Commodity and Securities Exchanges... Forms for Electronic Filing § 274.402 Form ID, uniform application for access codes to file on EDGAR..., filing agent or training agent to log on to the EDGAR system, submit filings, and change its CCC. (d...
Fallon, Nevada FORGE Thermal-Hydrological-Mechanical Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blankenship, Doug; Sonnenthal, Eric
Archive contains thermal-mechanical simulation input/output files. Included are files which fall into the following categories: (1) spreadsheets with various input parameter calculations; (2) final simulation inputs; (3) native-state thermal-hydrological model input file folders; (4) native-state thermal-hydrological-mechanical model input files; and (5) THM model stimulation cases. See the 'File Descriptions.xlsx' resource below for additional information on individual files.
78 FR 9902 - Combined Notice of Filings #1
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-12
... Communications Agreement 5th Revised to be effective 4/1/2013. Filed Date: 1/30/13. Accession Number: 20130130... G549 Amended Filing to be effective 9/13/2012. Filed Date: 1/31/13. Accession Number: 20130131-5003.... Description: Compliance Filing per 1/8/2013 Order in Docket No. ER13-347-000 to be effective 4/1/2013. Filed...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-03
... Friends Fund XIX, LLC; Notice of Intent To File License Application, Filing of Pre-Application Document.... Date Filed: August 7, 2012. d. Submitted By: Lock + Hydro Friends Fund XIX, LLC. e. Name of Project.... Lock + Hydro Friends Fund XIX, LLC filed its request to use the Traditional Licensing Process on August...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-28
... November 27, 2012. IV. Electronic Submissions (E-Filing) All documents filed in NRC adjudicatory... accordance with the NRC E-Filing rule (72 FR 49139; August 28, 2007). The E-Filing process requires... requirements of E-Filing, at least 10 days prior to the filing deadline, the participant should contact the...
77 FR 66601 - Electronic Tariff Filings; Notice of Change to eTariff Type of Filing Codes
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-06
... Tariff Filings; Notice of Change to eTariff Type of Filing Codes Take notice that, effective November 18, 2012, the list of available eTariff Type of Filing Codes (TOFC) will be modified to include a new TOFC... Energy's regulations. Tariff records included in such filings will be automatically accepted to be...
Code of Federal Regulations, 2010 CFR
2010-07-01
... the Chief may designate. (e) Filing procedures. In order to file an appeal under this section, an... interested party in response to an appeal must be filed within 15 days after the close of the appeal filing... filing an appeal; however, when the filing period would expire on a Saturday, Sunday, or Federal holiday...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-26
... Commission by June 25, 2012. IV. Electronic Submissions (E-Filing) All documents filed in NRC adjudicatory... accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires... requirements of E-Filing, at least 10 days prior to the filing deadline, the participant should contact the...
ERIC Educational Resources Information Center
East Texas State Univ., Commerce. Occupational Curriculum Lab.
Nineteen units on filing, office machines, and general office clerical occupations are presented in this teacher's guide. The unit topics include indexing, alphabetizing, and filing (e.g., business names); labeling and positioning file folders and guides; establishing a correspondence filing system; utilizing charge-out and follow-up file systems;…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-07
... can file your comments electronically using the eFiling feature located on the Commission's Web site ( www.ferc.gov ) under the Documents & Filings link. With eFiling, you can provide comments in a variety of formats by attaching them as a file with your submission. New eFiling users must first create an...
76 FR 36529 - Combined Notice of Filings #1
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-22
... submits tariff filing per 35: 2011-06-03 CAISO's Tariff Waiver Filing to be effective N/A. Filed Date: 06..., L.L.C. submits tariff filing per 35.13(a)(2)(iii: Queue No. W2-083; Original Service Agreement No... No. W2-088; Original Service Agreement No. 2929 to be effective 5/10/2011. Filed Date: 06/03/2011...
FGGE/ERBM tape specification and shipping letter description
NASA Technical Reports Server (NTRS)
Han, D.; Lo, H.
1983-01-01
The Nimbus-7 FGGE/ERBM tape contains 27 ERB parameters which are extracted and reformatted from the Nimbus-7 ERB-MATRIX tape. There are four types of files on an FGGE/ERBM tape: a test file; a tape-header file, which describes the data set characteristics and the contents of the tape; a grid-descriptor file, which contains the information on the ERB scanning channel target numbers and their associated latitude limits and longitude intervals; and one or more data files. A single end-of-file (EOF) tape mark is written after each file, and two EOF marks are written after the last data file on the tape.
Index files for Belle II - very small skim containers
NASA Astrophysics Data System (ADS)
Sevior, Martin; Bloomfield, Tristan; Kuhr, Thomas; Ueda, I.; Miyake, H.; Hara, T.
2017-10-01
The Belle II experiment[1] employs the ROOT file format[2] for recording data and is investigating the use of “index files” to reduce the size of data skims. These files contain pointers to the locations of interesting events within the total Belle II data set and reduce the size of data skims by two orders of magnitude. We implement this scheme on the Belle II grid by recording the parent file metadata and the event location within the parent file. While the scheme works, it is substantially slower than a normal sequential read of standard skim files using default ROOT file parameters. We investigate the performance of the scheme by adjusting the “splitLevel” and “autoflushsize” parameters of the ROOT files in the parent data files.
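A sketch of how such an index-file skim can be consumed: each row points at a parent file and an entry number inside it. It assumes uproot is installed; the file names, tree name, and index contents are hypothetical.

```python
# Fetch individual events via (parent file, entry number) pointers.
import uproot

skim = [("parent_0001.root", 17),
        ("parent_0001.root", 4242),
        ("parent_0007.root", 3)]        # (parent file, event location)

for parent, entry in skim:
    with uproot.open(parent) as f:
        tree = f["tree"]                # hypothetical tree name
        event = tree.arrays(entry_start=entry, entry_stop=entry + 1)
        print(parent, entry, len(event))
```

Reading single scattered entries like this is exactly the access pattern the abstract reports as substantially slower than a sequential scan, which motivates tuning the parent files' split and flush parameters.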
Fail-over file transfer process
NASA Technical Reports Server (NTRS)
Semancik, Susan K. (Inventor); Conger, Annette M. (Inventor)
2005-01-01
The present invention provides a fail-over file transfer process to handle data file transfer when the transfer is unsuccessful in order to avoid unnecessary network congestion and enhance reliability in an automated data file transfer system. If a file cannot be delivered after attempting to send the file to a receiver up to a preset number of times, and the receiver has indicated the availability of other backup receiving locations, then the file delivery is automatically attempted to one of the backup receiving locations up to the preset number of times. Failure of the file transfer to one of the backup receiving locations results in a failure notification being sent to the receiver, and the receiver may retrieve the file from the location indicated in the failure notification when ready.
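The retry-then-fail-over policy described reads naturally as a small loop. A minimal sketch under the patent's description; send() and notify_failure() are hypothetical hooks standing in for the actual transfer and notification mechanisms.

```python
# Try the primary up to a preset count, then each listed backup, then notify.
MAX_ATTEMPTS = 3

def transfer(path, primary, backups, send, notify_failure):
    locations = [primary] + list(backups)   # backups only if receiver listed any
    for location in locations:
        for _ in range(MAX_ATTEMPTS):
            if send(path, location):        # True on a successful delivery
                return location             # receiver can retrieve the file here
    notify_failure(path, tried=locations)
    return None
```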
75 FR 81594 - Combined Notice of Filings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-28
... Pipeline Company submits tariff filing per 154.204: RP11-20 TOC Update to be effective 10/1/2010. Filed... filing per 154.204: RP11-1474 TOC Update to be effective 11/1/2010. Filed Date: 12/16/2010. Accession...
Forensic Analysis of Compromised Computers
NASA Technical Reports Server (NTRS)
Wolfe, Thomas
2004-01-01
Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and puts out the aforementioned text file. The format of the text file is designed to enable the submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting files and examination of such characteristics of files as ownership, time of creation, and time of most recent access, all of which characteristics are among the data included in the text file.
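A Python rendering of roughly what the script is described as collecting (the original is PERL, and this sketch is not it): per-file ownership, size, and timestamps down to a user-chosen number of subtree levels, written as tab-separated text that a spreadsheet can import.

```python
# Usage: walker.py <root> <output.tsv> <levels>
import csv
import os
import sys

root, out_path, max_depth = sys.argv[1], sys.argv[2], int(sys.argv[3])

with open(out_path, "w", newline="") as out:
    writer = csv.writer(out, delimiter="\t")
    writer.writerow(["path", "uid", "gid", "size", "ctime", "atime"])
    for dirpath, dirnames, filenames in os.walk(root):
        depth = dirpath[len(root):].count(os.sep)
        if depth >= max_depth:
            dirnames[:] = []            # stop descending past the limit
            continue
        for name in filenames:          # directories themselves are skipped
            p = os.path.join(dirpath, name)
            try:
                st = os.lstat(p)
            except OSError:
                continue
            writer.writerow([p, st.st_uid, st.st_gid, st.st_size,
                             st.st_ctime, st.st_atime])
```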
Diop, Amadou; Maurel, Nathalie; Oiknine, Michel; Patoor, Etienne; Machtou, Pierre
2009-04-01
We proposed a new testing setup and in vitro experimental procedure allowing the analysis of the forces, torque, and file displacements during the preparation of root canals using nickel-titanium rotary endodontic files. We applied it to the preparation of 20 fresh-frozen cadaveric teeth using ProTaper files (Dentsply Maillefer, Ballaigues, Switzerland), according to a clinically used sequence. During the preparations, a clinical hand motion was performed by an endodontist, and we measured the applied torque around the file axis as well as the involved three-dimensional forces and three-dimensional file displacements. Such a biomechanical procedure is useful to better understand the working conditions of the files in terms of loads and displacements. It could be used to analyze the effects of various mechanical and geometric parameters on the files' behavior and to obtain data for modelling purposes. Finally, it could contribute to studies aiming to improve file design in order to reduce the risk of file fracture.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-30
... file your comments electronically using the eFiling feature on the Commission's Web site ( www.ferc.gov ) under the link to Documents and Filings. With eFiling, you can provide comments in a variety of formats by attaching them as a file with your submission. New eFiling users must first create an account by...
20 CFR 404.621 - What happens if I file after the first month I meet the requirements for benefits?
Code of Federal Regulations, 2010 CFR
2010-04-01
... failure to file was due to a physical or mental condition is stated in § 404.322. (e) Filing after death... ADMINISTRATION FEDERAL OLD-AGE, SURVIVORS AND DISABILITY INSURANCE (1950- ) Filing of Applications and Other Forms Effective Filing Period of Application § 404.621 What happens if I file after the first month I...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-20
....315(c), must be filed in accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E... with the procedural requirements of E-Filing, at least ten (10) days prior to the filing deadline, the... the NRC in accordance with the E-Filing rule, the participant must file the document using the NRC's...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-04
... on a project; (2) You can file your comments electronically using the eFiling feature located on the Commission's Web site ( www.ferc.gov ) under the Documents & Filings link. With eFiling, you can provide comments in a variety of formats by attaching them as a file with your submission. New eFiling users must...
47 CFR 61.14 - Method of filing publications.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 47 Telecommunication 3 2012-10-01 2012-10-01 false Method of filing publications. 61.14 Section 61.14 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES (CONTINUED) TARIFFS Rules for Electronic Filing § 61.14 Method of filing publications. (a) Publications filed...
48 CFR 4.803 - Contents of contract files.
Code of Federal Regulations, 2012 CFR
2012-10-01
... ADMINISTRATIVE MATTERS Government Contract Files 4.803 Contents of contract files. The following are examples of the records normally contained, if applicable, in contract files: (a) Contracting office contract file...) Justifications and approvals, determinations and findings, and associated documents. (3) Evidence of availability...
NASA Technical Reports Server (NTRS)
Pinelli, Thomas E.; Kennedy, John M.; White, Terry F.
1992-01-01
A telephone survey of U.S. aerospace engineers and scientists who were on the Society of Automotive Engineers (SAE) mailing list was conducted between August 14 and 26, 1991. The survey was undertaken to obtain information on the daily work activities of aerospace engineers and scientists, to measure various practices used by aerospace engineers and scientists to obtain STI, and to ask aerospace engineers and scientists about their use of electronic networks. Co-workers were found to be important sources of information. Co-workers are used to obtain technical information because the information they have is relevant, not because co-workers are accessible. As technical uncertainty increases, so does the need for information internal and external to the organization. Electronic networks enjoy widespread use within the aerospace community. These networks are accessible, and they are used to contact people at remote sites. About 80 percent of the respondents used electronic mail, file transfer, and information or data retrieval from commercial or in-house databases.
A self-defining hierarchical data system
NASA Technical Reports Server (NTRS)
Bailey, J.
1992-01-01
The Self-Defining Data System (SDS) is a system which allows the creation of self-defining hierarchical data structures in a form which allows the data to be moved between different machine architectures. Because the structures are self-defining, they can be used for communication between independent modules in a distributed system. Unlike disk-based hierarchical data systems such as Starlink's HDS, SDS works entirely in memory and is very fast. Data structures are created and manipulated as internal dynamic structures in memory managed by SDS itself. A structure may then be exported into a caller-supplied memory buffer in a defined external format. This structure can be written to a file or sent as a message to another machine. It remains static in structure until it is reimported into SDS. SDS is written in portable C and has been run on a number of different machine architectures. Structures are portable between machines, with SDS looking after conversion of byte order, floating point format, and alignment. A Fortran-callable version is also available for some machines.
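As a loose illustration of the self-defining idea described above (not the actual SDS C API, whose names are not given here), the sketch below packs data together with a small header that records its name, type, and element count in a fixed byte order, so a receiver on a different architecture can decode the buffer using only what the buffer itself carries. All names and the header layout are hypothetical.

```python
import struct

# Hypothetical self-describing buffer: a fixed-layout header names the data
# and records its element type and count; '<' pins little-endian byte order
# so the buffer is portable across machine architectures.
TYPES = {"f8": (0, "d"), "i4": (1, "i")}   # type code and struct format
CODES = {code: fmt for code, fmt in TYPES.values()}

def export(name, typecode, values):
    """Pack values into a self-describing, architecture-independent buffer."""
    code, fmt = TYPES[typecode]
    header = struct.pack("<16sBI", name.encode(), code, len(values))
    return header + struct.pack("<%d%s" % (len(values), fmt), *values)

def reimport(buf):
    """Decode a buffer using only the description it carries."""
    name, code, n = struct.unpack_from("<16sBI", buf)
    values = struct.unpack_from("<%d%s" % (n, CODES[code]), buf,
                                struct.calcsize("<16sBI"))
    return name.rstrip(b"\x00").decode(), list(values)

buf = export("temps", "f8", [1.5, 2.5, 3.5])
print(reimport(buf))   # ('temps', [1.5, 2.5, 3.5])
```

A real SDS structure is hierarchical rather than flat, but the portability mechanism sketched here (an explicit byte order plus an embedded description) is the same in spirit.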
Factors and processes causing accelerated decomposition in human cadavers - An overview.
Zhou, Chong; Byard, Roger W
2011-01-01
Artefactually enhanced putrefactive and autolytic changes may be misinterpreted as indicating a prolonged postmortem interval and throw doubt on the veracity of witness statements. A review of files from Forensic Science SA and of the literature revealed a number of external and internal factors that may be responsible for accelerating these processes. Exogenous factors included exposure to elevated environmental temperatures, both outdoors and indoors, exacerbated by increased humidity or fires. Indoor situations involved exposure to central heating, hot water, saunas and electric blankets. Deaths within motor vehicles were also characterized by enhanced decomposition. Failure to quickly or adequately refrigerate bodies may also lead to early decomposition. Endogenous factors included fever, infections, illicit and prescription drugs, obesity and insulin-dependent diabetes mellitus. When these factors or conditions are identified at autopsy, less significance should therefore be attached to changes of decomposition as markers of time since death. Copyright © 2010 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
The role of facial appearance on CEO selection after firm misconduct.
Gomulya, David; Wong, Elaine M; Ormiston, Margaret E; Boeker, Warren
2017-04-01
[Correction Notice: An Erratum for this article was reported in Vol 102(4) of Journal of Applied Psychology (see record 2017-10684-001). The wrong figure files were used. All versions of this article have been corrected.] We investigate a particular aspect of CEO successor trustworthiness that may be critically important after a firm has engaged in financial misconduct. Specifically, drawing on prior research that suggests that facial appearance is one critical way in which trustworthiness is signaled, we argue that leaders who convey integrity, a component of trustworthiness, will be more likely to be selected as successors after financial restatement. We predict that such appointments garner more positive reactions by external observers such as investment analysts and the media because these CEOs are perceived as having greater integrity. In an archival study of firms that have announced financial restatements, we find support for our predictions. These findings have implications for research on CEO succession, leadership selection, facial appearance, and firm misconduct. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Rose, Peter W; Prlić, Andreas; Bi, Chunxiao; Bluhm, Wolfgang F; Christie, Cole H; Dutta, Shuchismita; Green, Rachel Kramer; Goodsell, David S; Westbrook, John D; Woo, Jesse; Young, Jasmine; Zardecki, Christine; Berman, Helen M; Bourne, Philip E; Burley, Stephen K
2015-01-01
The RCSB Protein Data Bank (RCSB PDB, http://www.rcsb.org) provides access to 3D structures of biological macromolecules and is one of the leading resources in biology and biomedicine worldwide. Our efforts over the past 2 years focused on enabling a deeper understanding of structural biology and providing new structural views of biology that support both basic and applied research and education. Herein, we describe recently introduced data annotations including integration with external biological resources, such as gene and drug databases, new visualization tools and improved support for the mobile web. We also describe access to data files, web services and open access software components to enable software developers to more effectively mine the PDB archive and related annotations. Our efforts are aimed at expanding the role of 3D structure in understanding biology and medicine. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
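As a hedged illustration of the programmatic access to data files mentioned above, the following sketch downloads a coordinate file from the RCSB PDB file server; the URL pattern is an assumption based on the archive's public download service and is not quoted from the article.

```python
import urllib.request

# Illustrative only: the download URL pattern below is assumed, not taken
# from the article; substitute any valid 4-character PDB identifier.
pdb_id = "4HHB"
url = f"https://files.rcsb.org/download/{pdb_id}.pdb"
with urllib.request.urlopen(url) as resp:
    pdb_text = resp.read().decode("utf-8")

# Trivial sanity check: count ATOM records in the retrieved entry.
n_atoms = sum(1 for line in pdb_text.splitlines() if line.startswith("ATOM"))
print(pdb_id, "ATOM records:", n_atoms)
```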
Siquieroli, Ana Carolina S; Vieira, Carlos U; Carvalho-Zilse, Gislene A; Goulart, Luiz R; Kerr, Warwick E; Bonetti, Ana M
2009-01-01
In colonies of Melipona scutellaris Latreille, 1811, workers can be found with four abdominal nerve ganglia, a morphological characteristic of the queen. It is hypothesized that these workers, called intercastes or phenocopies, are phenotypically worker-like but genotypically identical to queens with respect to this specific trait. Workers with the same number of ganglia as queens seem to be intercastes between queens and workers. Our objective was to analyze the mRNA profiles of workers, queens, and intercastes of M. scutellaris through DDRT-PCR. Three hundred (300) pupae with white eyes were collected and externally identified according to the number of abdominal nerve ganglia: workers (5 ganglia), queens (4 ganglia) and intercastes (4 ganglia). The analysis identified differentially expressed transcripts that were present only in workers, but absent in intercastes and queens, confirming the hypothesis by demonstrating the environmental effect on the queen genotype that generated phenotype-like workers.
SeqDepot: streamlined database of biological sequences and precomputed features.
Ulrich, Luke E; Zhulin, Igor B
2014-01-15
Assembling and/or producing integrated knowledge of sequence features continues to be an onerous and redundant task despite a large number of existing resources. We have developed SeqDepot, a novel database that focuses solely on two primary goals: (i) assimilating known primary sequences with predicted feature data and (ii) providing the most simple and straightforward means to procure and readily use this information. Access to >28.5 million sequences and 300 million features is provided through a well-documented and flexible RESTful interface that supports fetching specific data subsets, bulk queries, visualization and searching by MD5 digests or external database identifiers. We have also developed an HTML5/JavaScript web application exemplifying how to interact with SeqDepot, and Perl/Python scripts for use with local processing pipelines. Freely available on the web at http://seqdepot.net/. REST access via http://seqdepot.net/api/v1. Database files and scripts may be downloaded from http://seqdepot.net/download.
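The MD5-keyed REST interface mentioned above suggests a usage pattern along the following lines; the '/aseqs/<digest>' resource path is a guess for illustration only, so consult the documented API at http://seqdepot.net/api/v1 for the actual endpoint names.

```python
import hashlib
import urllib.request

# Sketch of an MD5-keyed lookup. The resource path below is hypothetical;
# only the API base URL comes from the abstract.
sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
digest = hashlib.md5(sequence.encode()).hexdigest()
url = f"http://seqdepot.net/api/v1/aseqs/{digest}"
try:
    with urllib.request.urlopen(url) as resp:
        print(resp.read().decode("utf-8"))
except Exception as exc:   # e.g. HTTP 404 if the sequence is unknown
    print("lookup failed:", exc)
```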
Next Generation Space Telescope Integrated Science Module Data System
NASA Technical Reports Server (NTRS)
Schnurr, Richard G.; Greenhouse, Matthew A.; Jurotich, Matthew M.; Whitley, Raymond; Kalinowski, Keith J.; Love, Bruce W.; Travis, Jeffrey W.; Long, Knox S.
1999-01-01
The data system for the Next Generation Space Telescope (NGST) Integrated Science Module (ISIM) is the primary data interface between the spacecraft, telescope, and science instrument systems. This poster includes block diagrams of the ISIM data system and its components derived during the pre-phase A Yardstick feasibility study. The poster details the hardware and software components used to acquire and process science data for the Yardstick instrument complement, and depicts the baseline external interfaces to science instruments and other systems. This baseline data system is a fully redundant, high-performance computing system. Each redundant computer contains three 150-MHz PowerPC processors. All processors execute a commercially available real-time multi-tasking operating system supporting preemptive multi-tasking, file management, and network interfaces. The six processors in the system are networked together. The spacecraft interface baseline is an extension of the network that links the six processors. The final selection of processor buses, processor chips, network interfaces, and high-speed data interfaces will be made during mid-2002.
Spin and charge thermopower effects in the ferromagnetic graphene junction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vahedi, Javad, E-mail: javahedi@gmail.com; Center for Theoretical Physics of Complex Systems, Institute for Basic Science; Barimani, Fattaneh
2016-08-28
Using a wave function matching approach and employing the Landauer-Büttiker formula, a ferromagnetic graphene junction with a temperature gradient across the system is studied. We calculate the thermally induced charge and spin currents as well as the thermoelectric voltage (Seebeck effect) in the linear and nonlinear regimes. Our calculations revealed that, due to electron-hole symmetry, the charge Seebeck coefficient of an undoped magnetic graphene junction is an odd function of chemical potential while the spin Seebeck coefficient is an even function, regardless of the temperature gradient and junction length. We have also found that by accurately tuning external parameters, namely the exchange field and gate voltage, the temperature gradient across the junction drives a pure spin current without an accompanying charge current. Another important characteristic of thermoelectric transport, the thermally induced current in the nonlinear regime, is also examined. Our main finding is that, with increasing thermal gradient applied to the junction, the spin and charge thermovoltages decrease and even become zero for nonzero temperature bias.
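For orientation, the charge Seebeck coefficient in the linear-response Landauer picture is conventionally written as below; this is a standard textbook form and one common spin-resolved convention, not an equation quoted from the paper.

```latex
% Linear-response charge Seebeck coefficient in the Landauer picture:
% \mathcal{T}(E) is the transmission, f the Fermi function, \mu the
% chemical potential, T the temperature, e the elementary charge.
S = -\frac{1}{eT}\,
    \frac{\int dE\,\mathcal{T}(E)\,(E-\mu)\left(-\partial f/\partial E\right)}
         {\int dE\,\mathcal{T}(E)\left(-\partial f/\partial E\right)}

% One common convention for the spin-resolved case, with per-spin
% coefficients S_\uparrow and S_\downarrow:
S_c = \frac{S_\uparrow + S_\downarrow}{2}\,, \qquad
S_s = \frac{S_\uparrow - S_\downarrow}{2}
```

On a plausible reading of the abstract, electron-hole symmetry mirrors the spin-resolved transmissions about the Dirac point, which is what makes S_c odd and S_s even in the chemical potential.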
The RCSB protein data bank: integrative view of protein, gene and 3D structural information
Rose, Peter W.; Prlić, Andreas; Altunkaya, Ali; Bi, Chunxiao; Bradley, Anthony R.; Christie, Cole H.; Costanzo, Luigi Di; Duarte, Jose M.; Dutta, Shuchismita; Feng, Zukang; Green, Rachel Kramer; Goodsell, David S.; Hudson, Brian; Kalro, Tara; Lowe, Robert; Peisach, Ezra; Randle, Christopher; Rose, Alexander S.; Shao, Chenghua; Tao, Yi-Ping; Valasatava, Yana; Voigt, Maria; Westbrook, John D.; Woo, Jesse; Yang, Huangwang; Young, Jasmine Y.; Zardecki, Christine; Berman, Helen M.; Burley, Stephen K.
2017-01-01
The Research Collaboratory for Structural Bioinformatics Protein Data Bank (RCSB PDB, http://rcsb.org), the US data center for the global PDB archive, makes PDB data freely available to all users, from structural biologists to computational biologists and beyond. New tools and resources have been added to the RCSB PDB web portal in support of a ‘Structural View of Biology.’ Recent developments have improved the User experience, including the high-speed NGL Viewer that provides 3D molecular visualization in any web browser, improved support for data file download and enhanced organization of website pages for query, reporting and individual structure exploration. Structure validation information is now visible for all archival entries. PDB data have been integrated with external biological resources, including chromosomal position within the human genome; protein modifications; and metabolic pathways. PDB-101 educational materials have been reorganized into a searchable website and expanded to include new features such as the Geis Digital Archive. PMID:27794042
JAva GUi for Applied Research (JAGUAR) v 3.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
JAGUAR is a Java software tool for automatically rendering a graphical user interface (GUI) from a structured input specification. It is designed as a plug-in to the Eclipse workbench to enable users to create, edit, and externally execute analysis application input decks and then view the results. JAGUAR serves as a GUI for Sandia's DAKOTA software toolkit for optimization and uncertainty quantification. It will include problem (input deck) set-up, option specification, analysis execution, and results visualization. Through the use of wizards, templates, and views, JAGUAR helps users navigate the complexity of DAKOTA's complete input specification. JAGUAR is implemented in Java, leveraging Eclipse extension points and the Eclipse user interface. JAGUAR parses a DAKOTA NIDR input specification and presents the user with linked graphical and plain-text representations of problem set-up and option specification for DAKOTA studies. After the data have been input by the user, JAGUAR generates one or more input files for DAKOTA, executes DAKOTA, and captures and interprets the results.
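A hedged sketch of the generate-execute-capture loop that JAGUAR automates is given below; the deck approximates DAKOTA's keyword-block input style, and the driver name and file paths are hypothetical.

```python
import subprocess

# Hypothetical sketch of what JAGUAR automates: write a DAKOTA input deck,
# run DAKOTA on it, and capture the output. The deck approximates DAKOTA's
# keyword-block style; real decks generated from the NIDR specification
# are more elaborate.
deck = """\
method
  sampling
    samples = 20
variables
  uniform_uncertain = 1
    lower_bounds 0.0
    upper_bounds 1.0
    descriptors 'x1'
interface
  analysis_drivers 'my_driver'   # hypothetical user analysis script
responses
  response_functions = 1
  no_gradients
  no_hessians
"""
with open("study.in", "w") as fh:
    fh.write(deck)

result = subprocess.run(["dakota", "-i", "study.in"],
                        capture_output=True, text=True)
print(result.stdout[-500:])   # the tail usually summarizes the study
```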
Satta, G; Atzeni, A; McHugh, T D
2017-02-01
Whole genome sequencing (WGS) has the potential to revolutionize the diagnosis of Mycobacterium tuberculosis infection but the lack of bioinformatic expertise among clinical microbiologists is a barrier for adoption. Software products for analysis should be simple, free of charge, able to accept data directly from the sequencer (FASTQ files) and to provide the basic functionalities all-in-one. The main aim of this narrative review is to provide a practical guide for the clinical microbiologist, with little or no practical experience of WGS analysis, with a specific focus on software products tailor-made for M. tuberculosis analysis. With sequencing performed by an external provider, it is now feasible to implement WGS analysis in the routine clinical practice of any microbiology laboratory, with the potential to detect resistance weeks before traditional phenotypic culture methods, but the clinical microbiologist should be aware of the limitations of this approach. Copyright © 2016 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
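For context on the FASTQ files mentioned above: each sequencing read occupies four lines (identifier, bases, a '+' separator, and per-base qualities). A minimal reader, with a hypothetical file name, might look like this sketch:

```python
import gzip

def read_fastq(path):
    """Yield (identifier, sequence, quality) triples from a FASTQ file.
    Sequencer output is often gzip-compressed, hence the '.gz' handling."""
    opener = gzip.open if path.endswith(".gz") else open
    with opener(path, "rt") as fh:
        while True:
            header = fh.readline().rstrip()
            if not header:          # end of file
                return
            seq = fh.readline().rstrip()
            fh.readline()           # '+' separator line
            qual = fh.readline().rstrip()
            yield header[1:], seq, qual

# Example with a hypothetical run file: count reads and mean read length.
n = total = 0
for _, seq, _ in read_fastq("sample_R1.fastq.gz"):
    n += 1
    total += len(seq)
print(n, "reads, mean length", total / max(n, 1))
```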
Devale, Madhuri R; Mahesh, M C; Bhandary, Shreetha
2017-01-01
Introduction: Stresses generated during root canal instrumentation have been reported to cause apical cracks. The smaller, less pronounced defects like cracks can later propagate into vertical root fracture when the tooth is subjected to repeated stresses from endodontic or restorative procedures. Aim: This study evaluated the occurrence of apical cracks with stainless steel hand files and rotary NiTi RaCe and K3 files at two different instrumentation lengths. Materials and Methods: In the present in vitro study, 60 mandibular premolars were mounted in resin blocks with simulated periodontal ligament. The apical 3 mm of the root surfaces were exposed and stained using India ink. Preoperative images of the root apices were obtained at 100x magnification using a stereomicroscope. The teeth were divided into six groups of 10 each. The first two groups were instrumented with stainless steel files, the next two groups with rotary NiTi RaCe files and the last two groups with rotary NiTi K3 files. The instrumentation was carried out to the apical foramen (Working Length, WL) and to 1 mm short of the apical foramen (WL-1) with each file system. After root canal instrumentation, postoperative images of the root apices were obtained. Preoperative and postoperative images were compared and the occurrence of cracks was recorded. Descriptive statistical analysis and Chi-square tests were used to analyze the results. Results: Apical root cracks were seen in 30%, 35% and 20% of teeth instrumented with K-files, RaCe files and K3 files respectively. There was no statistical significance among the three instrumentation systems in the formation of apical cracks (p=0.563). Apical cracks were seen in 40% and 20% of teeth instrumented with K-files, 60% and 10% of teeth with RaCe files and 40% and 0% of teeth with K3 files at WL and WL-1 respectively. For the groups instrumented with hand files there was no statistical significance in the number of cracks at WL and WL-1 (p=0.628), but for teeth instrumented with RaCe files and K3 files significantly more cracks were seen at WL than at WL-1 (p=0.057 for RaCe files and p=0.087 for K3 files). Conclusion: There was no statistically significant difference between stainless steel hand files and rotary files in terms of crack formation. Instrumentation length had a significant effect on the formation of cracks when rotary files were used. Using rotary instruments 1 mm short of the apical foramen resulted in less crack formation, but there was no statistically significant difference in the number of cracks formed with hand files at the two instrumentation levels. PMID:28274036
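The overall chi-square comparison reported above can be reconstructed from the stated percentages: 30%, 35% and 20% cracked roots out of 20 teeth per file system give counts of 6, 7 and 4. A sketch using SciPy, assuming those counts, reproduces the reported p-value:

```python
from scipy.stats import chi2_contingency

# Counts reconstructed from the abstract: of 20 roots per file system,
# 30% (6), 35% (7) and 20% (4) showed apical cracks.
table = [
    [6, 14],   # K-files:  cracked, intact
    [7, 13],   # RaCe
    [4, 16],   # K3
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, df = {dof}, p = {p:.3f}")   # p ≈ 0.563
```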
Hilfer, Paul B; Bergeron, Brian E; Mayerchak, Michael J; Roberts, Howard W; Jeansonne, Billie G
2011-01-01
Novel nickel-titanium rotary files made with proprietary manufacturing techniques have recently been marketed. The purpose of this study was to assess the effects of multiple autoclave cycles on the cyclic fatigue of GT Series X files (Dentsply Tulsa Dental Specialties, Tulsa, OK) and Twisted Files (SybronEndo, Orange, CA). A jig using a 5-mm radius curve with 90° of maximum file flexure was used to induce cyclic fatigue failure. Files (n = 10) representing each experimental group (GT Series X 20/.04 and 20/.06; Twisted Files 25/.04 and 25/.06) were first tested to establish baseline mean cycles to failure (MCF). Experimental groups (n = 20) were then cycled to 25% of the established baseline MCF and then autoclaved. Additional autoclaving was accomplished at 50% and 75% of MCF, followed by continual testing until failure. Control groups (n = 20) underwent the same procedures except that autoclaving was not performed. The GT Series X (20/.04 and 20/.06) files showed no significant difference (p = 0.918 and p = 0.096, respectively) in MCF for experimental versus control files. Twisted Files (25/.04) showed no significant difference (p = 0.432) in MCF between experimental and control groups. However, the Twisted Files (25/.06) experimental group showed a significantly lower (p = 0.0175) MCF compared with the controls. Under the conditions of this evaluation, autoclave sterilization significantly decreased the cyclic fatigue resistance of one of the four file groups tested. Repeated autoclaving significantly reduced the MCF of 25/.06 Twisted Files; however, 25/.04 Twisted Files and both GT Series X files tested were not significantly affected by the same conditions. Published by Elsevier Inc.
Impact of heat treatments on the fatigue resistance of different rotary nickel-titanium instruments.
Braga, Lígia Carolina Moreira; Faria Silva, Ana Cristina; Buono, Vicente Tadeu Lopes; de Azevedo Bahia, Maria Guiomar
2014-09-01
The aim of this study was to assess the influence of M-Wire (Dentsply Tulsa Dental Specialties, Tulsa, OK) and controlled memory technologies on the fatigue resistance of rotary nickel-titanium (NiTi) files by comparing files made using these 2 technologies with conventional NiTi files. Files with a similar cross-sectional design and diameter were chosen for the study: new 30/.06 files of the EndoWave (EW; J. Morita Corp, Osaka, Japan), HyFlex (HF; Coltene/Whaledent, Inc, Cuyahoga Falls, OH), ProFile Vortex (PV; Dentsply Tulsa Dental Specialties, Tulsa, OK), and Typhoon (TYP; Clinician's Choice Dental Products, New Milford, CT) systems together with ProTaper Universal F2 instruments (PTU F2; Dentsply Maillefer, Ballaigues, Switzerland). The compositions and transformation temperatures of the instruments were analyzed using X-ray energy-dispersive spectroscopy and differential scanning calorimetry, whereas the mean file diameter values at 3 mm from the tip (D3) were measured using image analysis software. The average number of cycles to failure was determined using a fatigue test device. X-ray energy-dispersive spectroscopy analysis showed that, on average, all the instruments exhibited the same chemical composition, namely, 51% Ni-49% Ti. The PV, TYP, and HF files exhibited increased transformation temperatures. The PTU F2, PV, and TYP files had similar D3 values, which were less than those of the EW and HF files. The average number of cycles to failure was 150% higher for the TYP files compared with the PV files and 390% higher for the HF files compared with the EW files. M-Wire and controlled memory technologies increase the fatigue resistance of rotary NiTi files. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
Johnson, M A; Primack, P D; Loushine, R J; Craft, D W
1997-01-01
Ninety-two new endodontic files were randomly assigned to five groups with varying parameters of contamination, cleaning method, and sterilization (steam or chemical). Files were instrumented in bovine teeth to accumulate debris and a known contaminant, Bacillus stearothermophilus. Positive controls produced growth on both T-soy agar plates and in T-soy broth. Negative controls and experimental files (some with heavy debris) failed to produce growth. The results showed that there was no significant difference between contaminated files that were not cleaned before sterilization and contaminated files that were cleaned before sterilization. Bioburden present on endodontic files does not appear to affect the sterilization process.
14 CFR 221.195 - Requirement for filing printed material.
Code of Federal Regulations, 2010 CFR
2010-01-01
... (AVIATION PROCEEDINGS) ECONOMIC REGULATIONS TARIFFS Electronically Filed Tariffs § 221.195 Requirement for filing printed material. (a) Any tariff, or revision thereto, filed in paper format which accompanies....190(b). Further, such paper tariff, or revision thereto, shall be filed in accordance with the...
High-Speed Numeric Function Generator Using Piecewise Quadratic Approximations
2007-09-01
application; User specifies the function to approximate. % % This program turns the function provided into an inline function... PRIMARY = < primary file 1> < primary file 2> #SECONDARY = <secondary file 1> <secondary file 2> #CHIP2 = <file to compile to user chip
18 CFR 35.7 - Electronic filing requirements.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Electronic filing... § 35.7 Electronic filing requirements. (a) General rule. All filings made in proceedings initiated... declarations or statements and electronic signatures. (c) Format requirements for electronic filing. The...
18 CFR 35.7 - Electronic filing requirements.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Electronic filing... § 35.7 Electronic filing requirements. (a) General rule. All filings made in proceedings initiated... declarations or statements and electronic signatures. (c) Format requirements for electronic filing. The...