Sample records for archive network cran

  1. RxnSim: a tool to compare biochemical reactions.

    PubMed

    Giri, Varun; Sivakumar, Tadi Venkata; Cho, Kwang Myung; Kim, Tae Yong; Bhaduri, Anirban

    2015-11-15

    Quantitative assessment of chemical reaction similarity aids database searches, classification of reactions and identification of candidate enzymes. Most methods evaluate reaction similarity based on chemical transformation patterns. We describe a tool, RxnSim, which computes reaction similarity based on the molecular signatures of participating molecules. The tool is able to compare reactions based on similarities of substrates and products in addition to their transformation. It allows masking of user-defined chemical moieties for weighted similarity computations. RxnSim is implemented in R and is freely available from the Comprehensive R Archive Network, CRAN (http://cran.r-project.org/web/packages/RxnSim/). Contact: anirban.b@samsung.com or ty76.kim@samsung.com. Supplementary data are available at Bioinformatics online.
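
    A minimal usage sketch in R (hedged: the function name rs.compute and its format argument are recalled from the RxnSim manual, not from this abstract, and the reaction-SMILES strings are invented examples):

      library(RxnSim)

      # Two illustrative esterification reactions in reaction-SMILES form
      rxnA <- "CC(=O)O.CO>>CC(=O)OC.O"
      rxnB <- "CCC(=O)O.CO>>CCC(=O)OC.O"

      # rs.compute() is assumed to score reaction similarity from the
      # molecular signatures of the participating substrates and products
      rs.compute(rxnA, rxnB, format = "rsmi")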

  2. A Vignette (User's Guide) for “An R Package for Statistical ...

    EPA Pesticide Factsheets

    StatCharrms is a graphical user front-end for ease of use in analyzing data generated from OCSPP 890.2200, the Medaka Extended One Generation Reproduction Test (MEOGRT), and OCSPP 890.2300, the Larval Amphibian Gonad Development Assay (LAGDA). The analyses StatCharrms is capable of performing are: the Rao-Scott adjusted Cochran-Armitage test for trend By Slices (RSCABS), a standard Cochran-Armitage test for trend By Slices (SCABS), the mixed-effects Cox proportional hazards model, the Jonckheere-Terpstra step-down trend test, the Dunn test, one-way ANOVA, weighted ANOVA, mixed-effects ANOVA, repeated-measures ANOVA, and the Dunnett test. This document provides a User's Manual (termed a vignette by the Comprehensive R Archive Network (CRAN)) for the previously created R-code tool StatCharrms (Statistical analysis of Chemistry, Histopathology, and Reproduction endpoints using Repeated measures and Multi-generation Studies). The StatCharrms R code has been publicly available directly from EPA staff since the approval of OCSPP 890.2200 and 890.2300, and is now also publicly available on CRAN.

  3. The kinship2 R package for pedigree data.

    PubMed

    Sinnwell, Jason P; Therneau, Terry M; Schaid, Daniel J

    2014-01-01

    The kinship2 package is restructured from the previous kinship package. Existing features are now enhanced and new features added for handling pedigree objects. Pedigree plotting features have been updated to display features on complex pedigrees while adhering to pedigree plotting standards. Kinship matrices can now be calculated for the X chromosome. Other methods have been added to subset and trim pedigrees while maintaining the pedigree structure. We make the kinship2 package available for R on the Comprehensive R Archive Network (CRAN), where data management is built in and other packages can use the pedigree object.
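
    A short sketch of the core workflow (hedged: pedigree() and kinship() are the package's documented entry points as I recall them; the toy family below is invented):

      library(kinship2)

      # Toy family: founders 1, 2 and 4; person 3 is a child of 1 and 2;
      # person 5 is a child of 3 and 4 (1 = male, 2 = female)
      id    <- c(1, 2, 3, 4, 5)
      dadid <- c(NA, NA, 1, NA, 3)
      momid <- c(NA, NA, 2, NA, 4)
      sex   <- c(1, 2, 1, 2, 2)

      ped <- pedigree(id = id, dadid = dadid, momid = momid, sex = sex)
      plot(ped)                      # plots to pedigree drawing standards

      kmat <- kinship(ped)           # autosomal kinship matrix
      # kinship(ped, chrtype = "X")  # X-chromosome version, per the abstract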

  4. ePCR: an R-package for survival and time-to-event prediction in advanced prostate cancer, applied to real-world patient cohorts.

    PubMed

    Laajala, Teemu D; Murtojärvi, Mika; Virkki, Arho; Aittokallio, Tero

    2018-06-15

    Prognostic models are widely used in clinical decision-making, such as risk stratification and tailoring treatment strategies, with the aim to improve patient outcomes while reducing overall healthcare costs. While prognostic models have been adopted into clinical use, benchmarking their performance has been difficult due to the lack of open clinical datasets. The recent DREAM 9.5 Prostate Cancer Challenge carried out an extensive benchmarking of prognostic models for metastatic Castration-Resistant Prostate Cancer (mCRPC), based on multiple cohorts of open clinical trial data. We make available an open-source implementation of the top-performing model, ePCR, along with an extended toolbox for its further re-use and development, and demonstrate how best to apply the implemented model to real-world data cohorts of advanced prostate cancer patients. The open-source R package ePCR and its reference documentation are available at the Comprehensive R Archive Network (CRAN): https://CRAN.R-project.org/package=ePCR. An R vignette provides step-by-step examples of ePCR usage. Supplementary data are available at Bioinformatics online.

  5. Analysis of high-throughput biological data using their rank values.

    PubMed

    Dembélé, Doulaye

    2018-01-01

    High-throughput biological technologies are routinely used to generate gene expression profiling or cytogenetics data. To achieve high performance, methods available in the literature have become more specialized and often require substantial computational resources. Here, we propose a new versatile method based on data-ordering rank values. We use linear algebra and the Perron-Frobenius theorem, and extend a method presented earlier for searching for differentially expressed genes to the detection of recurrent copy number aberrations. A result derived from the proposed method is a one-sample Student's t-test based on rank values. The proposed method is, to our knowledge, the only one that applies to both gene expression profiling and cytogenetics datasets. This new method is fast, deterministic, and requires a low computational load. Probabilities are associated with genes to allow a statistically significant subset selection in the dataset. Stability scores are also introduced as quality parameters. Performance and comparative analyses were carried out using real datasets. The proposed method can be accessed through an R package available from the CRAN (Comprehensive R Archive Network) website: https://cran.r-project.org/web/packages/fcros.
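
    A hedged sketch (fcros() as the package's main routine, and its argument names, are assumptions recalled from the package manual; the expression table is simulated):

      library(fcros)

      # Simulated example: 1,000 genes, 4 control and 4 test arrays;
      # data.frame() names the expression columns X1..X8
      xdata <- data.frame(gene = paste0("g", 1:1000),
                          matrix(rnorm(8000), nrow = 1000))
      cont <- paste0("X", 1:4)   # control column names
      test <- paste0("X", 5:8)   # test column names

      # Rank-based fold-change analysis; argument defaults are assumed
      res <- fcros(xdata, cont, test, log2.opt = 0, trim.opt = 0.25)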

  6. minet: An R/Bioconductor package for inferring large transcriptional networks using mutual information.

    PubMed

    Meyer, Patrick E; Lafitte, Frédéric; Bontempi, Gianluca

    2008-10-29

    This paper presents the R/Bioconductor package minet (version 1.1.6), which provides a set of functions to infer mutual information networks from a dataset. Once fed with a microarray dataset, the package returns a network where nodes denote genes, edges model statistical dependencies between genes, and the weight of an edge quantifies the statistical evidence of a specific (e.g., transcriptional) gene-to-gene interaction. Four different entropy estimators are made available in the package (empirical, Miller-Madow, Schurmann-Grassberger and shrink), as well as four different inference methods, namely relevance networks, ARACNE, CLR and MRNET. The package also integrates accuracy assessment tools, such as F-scores, PR curves and ROC curves, to compare the inferred network with a reference one. The package minet provides a series of tools for inferring transcriptional networks from microarray data. It is freely available from the Comprehensive R Archive Network (CRAN) as well as from the Bioconductor website.
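
    A minimal inference sketch (hedged: syn.data is the synthetic dataset I recall shipping with minet; any samples-by-genes table would do):

      library(minet)

      data(syn.data)   # assumed bundled synthetic expression dataset

      # MRNET inference with an empirical MI estimator on
      # equal-frequency-discretized data
      net <- minet(syn.data, method = "mrnet",
                   estimator = "mi.empirical", disc = "equalfreq")

      # net is a weighted adjacency matrix: each weight quantifies the
      # statistical evidence for a gene-to-gene interaction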

  7. TCIApathfinder: an R client for The Cancer Imaging Archive REST API.

    PubMed

    Russell, Pamela; Fountain, Kelly; Wolverton, Dulcy; Ghosh, Debashis

    2018-06-05

    The Cancer Imaging Archive (TCIA) hosts publicly available de-identified medical images of cancer from over 25 body sites and over 30,000 patients. Over 400 published studies have utilized freely available TCIA images. Images and metadata are available for download through a web interface or a REST API. Here we present TCIApathfinder, an R client for the TCIA REST API. TCIApathfinder wraps API access in user-friendly R functions that can be called interactively within an R session or easily incorporated into scripts. Functions are provided to explore the contents of the large database and to download image files. TCIApathfinder provides easy access to TCIA resources in the highly popular R programming environment. TCIApathfinder is freely available under the MIT license as a package on CRAN (https://cran.r-project.org/web/packages/TCIApathfinder/index.html) and at https://github.com/pamelarussell/TCIApathfinder.
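
    A brief sketch (hedged: the wrapper names below follow my recollection of the package documentation, and an API key is assumed to be configured as the package instructs; the series UID is a placeholder):

      library(TCIApathfinder)

      # Explore the archive contents
      cols <- get_collection_names()
      pats <- get_patient_info(collection = "TCGA-BRCA")

      # Download the image files of one series to a local directory
      save_image_series(series_instance_uid = "1.2.840...",
                        out_dir = "images")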

  8. Performance Analysis of Optical Mobile Fronthaul for Cloud Radio Access Networks

    NASA Astrophysics Data System (ADS)

    Zhang, Jiawei; Xiao, Yuming; Li, Hui; Ji, Yuefeng

    2017-10-01

    Cloud radio access networks (C-RAN) separate the baseband units (BBUs) of conventional base stations into a centralized pool that connects to remote radio heads (RRHs) through the mobile fronthaul. The mobile fronthaul is a new network segment of C-RAN, designed to transport digital sampling data between BBU and RRH. Optical transport networks that provide large bandwidth and low latency are a promising fronthaul solution. In this paper, we discuss several optical transport networks that are candidates for the mobile fronthaul and analyze their performance, including the number of wavelengths used, round-trip latency, and wavelength utilization.

  9. polymapR - linkage analysis and genetic map construction from F1 populations of outcrossing polyploids.

    PubMed

    Bourke, Peter M; van Geest, Geert; Voorrips, Roeland E; Jansen, Johannes; Kranenburg, Twan; Shahin, Arwa; Visser, Richard G F; Arens, Paul; Smulders, Marinus J M; Maliepaard, Chris

    2018-05-02

    Polyploid species carry more than two copies of each chromosome, a condition found in many of the world's most important crops. Genetic mapping in polyploids is more complex than in diploid species, resulting in a lack of available software tools. These are needed if we are to realise all the opportunities offered by modern genotyping platforms for genetic research and breeding in polyploid crops. polymapR is an R package for genetic linkage analysis and integrated genetic map construction from bi-parental populations of outcrossing autopolyploids. It can currently analyse triploid, tetraploid and hexaploid marker datasets and is applicable to various crops, including potato, leek, alfalfa, blueberry, chrysanthemum, sweet potato and kiwifruit. It can detect, estimate and correct for preferential chromosome pairing, and has been tested on high-density marker datasets from potato, rose and chrysanthemum, generating high-density integrated linkage maps in all of these crops. polymapR is freely available under the GNU General Public License from the Comprehensive R Archive Network (CRAN) at http://cran.r-project.org/package=polymapR. Contact: Chris Maliepaard (chris.maliepaard@wur.nl) or Roeland E. Voorrips (roeland.voorrips@wur.nl). Supplementary data are available at Bioinformatics online.

  10. gkmSVM: an R package for gapped-kmer SVM

    PubMed Central

    Ghandi, Mahmoud; Mohammad-Noori, Morteza; Ghareghani, Narges; Lee, Dongwon; Garraway, Levi; Beer, Michael A.

    2016-01-01

    Summary: We present a new R package for training gapped-kmer SVM classifiers for DNA and protein sequences. We describe an improved algorithm for kernel matrix calculation that speeds run time by about 2- to 5-fold over our original gkmSVM algorithm. This package supports several sequence kernels, including gkmSVM, kmer-SVM, the mismatch kernel and the wildcard kernel. Availability and Implementation: The gkmSVM package is freely available through the Comprehensive R Archive Network (CRAN), for Linux, Mac OS and Windows platforms. The C++ implementation is available at www.beerlab.org/gkmsvm. Contact: mghandi@gmail.com or mbeer@jhu.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153639

  11. gkmSVM: an R package for gapped-kmer SVM.

    PubMed

    Ghandi, Mahmoud; Mohammad-Noori, Morteza; Ghareghani, Narges; Lee, Dongwon; Garraway, Levi; Beer, Michael A

    2016-07-15

    We present a new R package for training gapped-kmer SVM classifiers for DNA and protein sequences. We describe an improved algorithm for kernel matrix calculation that speeds run time by about 2- to 5-fold over our original gkmSVM algorithm. This package supports several sequence kernels, including gkmSVM, kmer-SVM, the mismatch kernel and the wildcard kernel. The gkmSVM package is freely available through the Comprehensive R Archive Network (CRAN), for Linux, Mac OS and Windows platforms. The C++ implementation is available at www.beerlab.org/gkmsvm. Contact: mghandi@gmail.com or mbeer@jhu.edu. Supplementary data are available at Bioinformatics online.
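
    A hedged sketch of the three-step workflow common to both records above (the function names gkmsvm_kernel, gkmsvm_trainCV and gkmsvm_classify are recalled from the package documentation; the FASTA file names are placeholders):

      library(gkmSVM)

      # 1. Gapped-kmer kernel matrix for positive/negative sequence sets
      gkmsvm_kernel(posfile = "pos.fa", negfile = "neg.fa",
                    kernelfile = "kernel.txt")

      # 2. Train an SVM with cross-validation on the precomputed kernel
      gkmsvm_trainCV(kernelfile = "kernel.txt",
                     posfile = "pos.fa", negfile = "neg.fa",
                     svmfnprfx = "model")

      # 3. Score new sequences with the trained model
      gkmsvm_classify(seqfile = "test.fa", svmfnprfx = "model",
                      outfile = "scores.txt")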

  12. Growthcurver: an R package for obtaining interpretable metrics from microbial growth curves.

    PubMed

    Sprouffske, Kathleen; Wagner, Andreas

    2016-04-19

    Plate readers can measure the growth curves of many microbial strains in a high-throughput fashion. The hundreds of absorbance readings collected simultaneously for hundreds of samples create technical hurdles for data analysis. Growthcurver summarizes the growth characteristics of microbial growth curve experiments conducted in a plate reader. The data are fitted to a standard form of the logistic equation, and the parameters have clear interpretations in terms of population-level characteristics such as doubling time, carrying capacity, and growth rate. Growthcurver is an easy-to-use R package available for installation from the Comprehensive R Archive Network (CRAN). The source code is available under the GNU General Public License and can be obtained from GitHub (Sprouffske K, Growthcurver source code, 2016).
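
    A minimal sketch (hedged: SummarizeGrowth() and the bundled growthdata example table are recalled from the package vignette):

      library(growthcurver)

      d <- growthdata   # assumed example plate-reader data: 'time' + wells

      fit <- SummarizeGrowth(data_t = d$time, data_n = d$A1)

      fit$vals$k       # carrying capacity
      fit$vals$r       # intrinsic growth rate
      fit$vals$t_gen   # doubling time
      plot(fit)        # logistic fit drawn over the raw readings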

  13. GOplot: an R package for visually combining expression data with functional analysis.

    PubMed

    Walter, Wencke; Sánchez-Cabo, Fátima; Ricote, Mercedes

    2015-09-01

    Despite the plethora of methods available for the functional analysis of omics data, obtaining a comprehensive yet detailed understanding of the results remains challenging. This is mainly due to the lack of publicly available tools for the visualization of this type of information. Here we present an R package called GOplot, based on ggplot2, for enhanced graphical representation. Our package takes the output of any general enrichment analysis and generates plots at different levels of detail: from a general overview to identify the most enriched categories (bar plot, bubble plot) to a more detailed view displaying different types of information for molecules in a given set of categories (circle plot, chord plot, cluster plot). The package provides deeper insight into omics data and allows scientists to generate insightful plots with only a few lines of code, to easily communicate their findings. The R package GOplot is available via CRAN, the Comprehensive R Archive Network: http://cran.r-project.org/web/packages/GOplot. The shiny web application of the Venn diagram can be found at https://wwalter.shinyapps.io/Venn/. A detailed manual of the package with sample figures can be found at https://wencke.github.io/. Contact: fscabo@cnic.es or mricote@cnic.es.
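
    A hedged sketch of the overview-to-detail workflow (the bundled EC example data and the circle_dat/GOBubble/chord_dat/GOChord functions are recalled from the GOplot vignette):

      library(GOplot)

      data(EC)   # assumed example: enrichment results plus expression data

      # Combine enrichment categories with gene-level logFC values
      circ <- circle_dat(EC$david, EC$genelist)

      GOBubble(circ, labels = 3)   # overview of most enriched categories

      # Detailed view: genes versus a chosen set of categories
      chord <- chord_dat(circ, EC$genes, EC$process)
      GOChord(chord)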

  14. Pathway enrichment analysis approach based on topological structure and updated annotation of pathway.

    PubMed

    Yang, Qian; Wang, Shuyuan; Dai, Enyu; Zhou, Shunheng; Liu, Dianming; Liu, Haizhou; Meng, Qianqian; Jiang, Bin; Jiang, Wei

    2017-08-16

    Pathway enrichment analysis has been widely used to identify cancer risk pathways and contributes to elucidating the mechanism of tumorigenesis. However, most of the existing approaches use outdated pathway information and neglect the complex gene interactions within a pathway. Here, we first briefly review the existing, widely used pathway enrichment analysis approaches, and then propose a novel topology-based pathway enrichment analysis (TPEA) method, which integrates topological properties and global upstream/downstream positions of genes in pathways. We compared TPEA with four widely used pathway enrichment analysis tools, including the database for annotation, visualization and integrated discovery (DAVID), gene set enrichment analysis (GSEA), centrality-based pathway enrichment (CePa) and signaling pathway impact analysis (SPIA), by analyzing six gene expression profiles of three tumor types (colorectal cancer, thyroid cancer and endometrial cancer). As a result, we identified several well-known cancer risk pathways that could not be obtained by the existing tools, and the results of TPEA were more stable than those of the other tools when analyzing different datasets of the same cancer. Ultimately, we developed an R package to implement TPEA, which can update KEGG pathway information online and is available at the Comprehensive R Archive Network (CRAN): https://cran.r-project.org/web/packages/TPEA/.

  15. MendelianRandomization: an R package for performing Mendelian randomization analyses using summarized data.

    PubMed

    Yavorska, Olena O; Burgess, Stephen

    2017-12-01

    MendelianRandomization is a software package for the R open-source software environment that performs Mendelian randomization analyses using summarized data. The core functionality is to implement the inverse-variance weighted, MR-Egger and weighted median methods for multiple genetic variants. Several options are available to the user, such as the use of robust regression, fixed- or random-effects models and the penalization of weights for genetic variants with heterogeneous causal estimates. Extensions to these methods, such as allowing for variants to be correlated, can be chosen if appropriate. Graphical commands allow summarized data to be displayed in an interactive graph, or causal estimates from multiple methods to be plotted for comparison. Although the main method of data entry is directly by the user, there is also an option for allowing summarized data to be incorporated from the PhenoScanner database of genotype-phenotype associations. We hope to develop this feature in future versions of the package. The R software environment is available for download from https://www.r-project.org/. The MendelianRandomization package can be downloaded from the Comprehensive R Archive Network (CRAN) within R, or directly from https://cran.r-project.org/web/packages/MendelianRandomization/. Both R and the MendelianRandomization package are released under GNU General Public Licenses (GPL-2|GPL-3).
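
    A short sketch of the summarized-data workflow (hedged: mr_input(), the analysis functions and the bundled LDL-C/CHD example vectors are recalled from the package vignette):

      library(MendelianRandomization)

      # Bundled example vectors (assumed): per-variant associations with
      # LDL-cholesterol (exposure) and coronary heart disease (outcome)
      mr_obj <- mr_input(bx = ldlc, bxse = ldlcse,
                         by = chdlodds, byse = chdloddsse)

      mr_ivw(mr_obj)          # inverse-variance weighted estimate
      mr_egger(mr_obj)        # MR-Egger regression
      mr_allmethods(mr_obj)   # causal estimates from multiple methods
      mr_plot(mr_obj)         # interactive display of the summarized data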

  16. An R package for the design, analysis and operation of reservoir systems

    NASA Astrophysics Data System (ADS)

    Turner, Sean; Ng, Jia Yi; Galelli, Stefano

    2016-04-01

    We present a new R package - named "reservoir" - which has been designed for rapid and easy routing of runoff through storage. The package comprises well-established tools for capacity design (e.g., the sequent peak algorithm), performance analysis (storage-yield-reliability and reliability-resilience-vulnerability analysis) and release policy optimization (Stochastic Dynamic Programming). Operating rules can be optimized for water supply, flood control and amenity objectives, as well as for maximum hydropower production. Storage-depth-area relationships are in-built, allowing users to incorporate evaporation from the reservoir surface. We demonstrate the capabilities of the software for global studies using thousands of reservoirs from the Global Reservoir and Dam (GRanD) database fed by historical monthly inflow time series from a 0.5 degree gridded global runoff dataset. The package is freely available through the Comprehensive R Archive Network (CRAN).
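
    A hedged sketch (Rippl() for sequent-peak capacity design and rrv() for reliability-resilience-vulnerability analysis are assumed entry points matching the tools named above; the inflow series is synthetic):

      library(reservoir)

      # Synthetic 20-year monthly inflow series (volumetric units)
      set.seed(7)
      Q <- ts(rlnorm(240, meanlog = 4, sdlog = 0.6),
              start = c(1995, 1), frequency = 12)

      # Storage required to meet a constant draft of 80% of mean inflow
      Rippl(Q, target = 0.8 * mean(Q))

      # Performance analysis for a fixed capacity (same units as Q)
      rrv(Q, capacity = 1500, target = 0.8 * mean(Q))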

  17. ontologyX: a suite of R packages for working with ontological data.

    PubMed

    Greene, Daniel; Richardson, Sylvia; Turro, Ernest

    2017-04-01

    Ontologies are widely used constructs for encoding and analyzing biomedical data, but the absence of simple and consistent tools has made exploratory and systematic analysis of such data unnecessarily difficult. Here we present three packages which aim to simplify such procedures. The ontologyIndex package enables arbitrary ontologies to be read into R, supports representation of ontological objects by native R types, and provides a parsimonious set of performant functions for querying ontologies. ontologySimilarity and ontologyPlot extend ontologyIndex with functionality for straightforward visualization and semantic similarity calculations, including statistical routines. ontologyIndex, ontologyPlot and ontologySimilarity are all available on the Comprehensive R Archive Network at https://cran.r-project.org/web/packages/. Contact: Daniel Greene, dg333@cam.ac.uk. Supplementary data are available at Bioinformatics online.
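
    A small sketch centred on ontologyIndex (hedged: the bundled hpo object and the helper names are recalled from the package documentation; the HPO term IDs are illustrative):

      library(ontologyIndex)

      data(hpo)   # assumed bundled copy of the Human Phenotype Ontology

      terms <- c("HP:0000118", "HP:0001873")

      # Ontological objects are plain R types: named lists and vectors
      hpo$name[terms]             # term labels
      get_ancestors(hpo, terms)   # ancestors via 'is_a' relations
      minimal_set(hpo, terms)     # drop terms implied by other terms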

  18. MPTinR: analysis of multinomial processing tree models in R.

    PubMed

    Singmann, Henrik; Kellen, David

    2013-06-01

    We introduce MPTinR, a software package developed for the analysis of multinomial processing tree (MPT) models. MPT models represent a prominent class of cognitive measurement models for categorical data with applications in a wide variety of fields. MPTinR is the first software for the analysis of MPT models in the statistical programming language R, providing a modeling framework that is more flexible than standalone software packages. MPTinR also introduces important features such as (1) the ability to calculate the Fisher information approximation measure of model complexity for MPT models, (2) the ability to fit models for categorical data outside the MPT model class, such as signal detection models, (3) a function for model selection across a set of nested and nonnested candidate models (using several model selection indices), and (4) multicore fitting. MPTinR is available from the Comprehensive R Archive Network at http://cran.r-project.org/web/packages/MPTinR/.
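
    A compact sketch (hedged: fit.mpt() accepting a textConnection, and the blank-line-separated "easy" model format, are recalled from the MPTinR documentation; the one-high-threshold model and data are invented):

      library(MPTinR)

      # One-high-threshold recognition model, one equation per response
      # category; a blank line separates the old-item and new-item trees
      model <- "
      r + (1 - r) * b
      (1 - r) * (1 - b)

      b
      1 - b
      "

      # Frequencies: hits, misses, false alarms, correct rejections
      data <- c(75, 25, 30, 70)

      fit.mpt(data, textConnection(model))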

  19. Sybil - efficient constraint-based modelling in R.

    PubMed

    Gelius-Dietrich, Gabriel; Desouki, Abdelmoneim Amer; Fritzemeier, Claus Jonathan; Lercher, Martin J

    2013-11-13

    Constraint-based analyses of metabolic networks are widely used to simulate the properties of genome-scale metabolic networks. Publicly available implementations tend to be slow, impeding large scale analyses such as the genome-wide computation of pairwise gene knock-outs, or the automated search for model improvements. Furthermore, available implementations cannot easily be extended or adapted by users. Here, we present sybil, an open source software library for constraint-based analyses in R; R is a free, platform-independent environment for statistical computing and graphics that is widely used in bioinformatics. Among other functions, sybil currently provides efficient methods for flux-balance analysis (FBA), MOMA, and ROOM that are about ten times faster than previous implementations when calculating the effect of whole-genome single gene deletions in silico on a complete E. coli metabolic model. Due to the object-oriented architecture of sybil, users can easily build analysis pipelines in R or even implement their own constraint-based algorithms. Based on its highly efficient communication with different mathematical optimisation programs, sybil facilitates the exploration of high-dimensional optimisation problems on small time scales. Sybil and all its dependencies are open source. Sybil and its documentation are available for download from the Comprehensive R Archive Network (CRAN).
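
    A hedged FBA sketch (the bundled Ec_core model and the optimizeProb()/oneGeneDel() calls are recalled from the sybil documentation; an installed solver interface such as glpkAPI is assumed):

      library(sybil)

      data(Ec_core)     # assumed bundled E. coli core metabolic model

      # Flux-balance analysis: maximize the model's objective function
      sol <- optimizeProb(Ec_core)
      lp_obj(sol)       # optimal objective value (e.g. biomass flux)

      # In silico single-gene deletions across the whole model
      del <- oneGeneDel(Ec_core)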

  20. Multi-service small-cell cloud wired/wireless access network based on tunable optical frequency comb

    NASA Astrophysics Data System (ADS)

    Xiang, Yu; Zhou, Kun; Yang, Liu; Pan, Lei; Liao, Zhen-wan; Zhang, Qiang

    2015-11-01

    In this paper, we demonstrate a novel multi-service wired/wireless integrated access architecture for the cloud radio access network (C-RAN) based on a radio-over-fiber passive optical network (RoF-PON) system, which utilizes scalable multiple-frequency millimeter-wave (MF-MMW) generation based on a tunable optical frequency comb (TOFC). In the baseband unit (BBU) pool, the generated optical comb lines are modulated into wired, RoF and WiFi/WiMAX signals, respectively. The multi-frequency RoF signals are generated by beating the optical comb line pairs in the small cell. The WiFi/WiMAX signals are demodulated after passing through a band pass filter (BPF) and a band stop filter (BSF), respectively, whereas the wired signal can be received directly. The feasibility and scalability of the proposed multi-service wired/wireless integrated C-RAN are confirmed by simulation.

  21. NEAT: an efficient network enrichment analysis test.

    PubMed

    Signorelli, Mirko; Vinciotti, Veronica; Wit, Ernst C

    2016-09-05

    Network enrichment analysis is a powerful method that integrates gene enrichment analysis with the information on relationships between genes provided by gene networks. Existing tests for network enrichment analysis deal only with undirected networks; they can be computationally slow and are based on normality assumptions. We propose NEAT, a test for network enrichment analysis. The test is based on the hypergeometric distribution, which naturally arises as the null distribution in this context. NEAT can be applied not only to undirected networks, but to directed and partially directed networks as well. Our simulations indicate that NEAT is considerably faster than alternative resampling-based methods, and that its capacity to detect enrichments is at least as good as that of alternative tests. We discuss applications of NEAT to network analyses in yeast by testing for enrichment of the Environmental Stress Response target gene set with GO Slim and KEGG functional gene sets, and also by inspecting associations between functional sets themselves. NEAT is a flexible and efficient test for network enrichment analysis that aims to overcome some limitations of existing resampling-based tests. The method is implemented in the R package neat, which can be freely downloaded from CRAN (https://cran.r-project.org/package=neat).
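
    A hedged sketch (the neat() signature below follows my recollection of the package manual; the toy network and gene sets are invented):

      library(neat)

      genes <- paste0("g", 1:6)

      # Toy undirected network as an adjacency matrix
      adj <- matrix(0, 6, 6, dimnames = list(genes, genes))
      edges <- rbind(c(1, 2), c(1, 3), c(2, 4), c(4, 5), c(5, 6))
      adj[edges] <- 1
      adj[edges[, 2:1]] <- 1

      alist <- list(setA = c("g1", "g2"))         # e.g. a target gene set
      blist <- list(setB = c("g4", "g5", "g6"))   # e.g. a functional set

      # Hypergeometric test for enrichment of links between A and B
      neat(alist = alist, blist = blist, network = adj,
           nettype = "undirected", nodes = genes, alpha = 0.05)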

  22. Calibrated Multivariate Regression with Application to Neural Semantic Basis Discovery.

    PubMed

    Liu, Han; Wang, Lie; Zhao, Tuo

    2015-08-01

    We propose a calibrated multivariate regression method named CMR for fitting high dimensional multivariate regression models. Compared with existing methods, CMR calibrates regularization for each regression task with respect to its noise level so that it simultaneously attains improved finite-sample performance and tuning insensitiveness. Theoretically, we provide sufficient conditions under which CMR achieves the optimal rate of convergence in parameter estimation. Computationally, we propose an efficient smoothed proximal gradient algorithm with a worst-case numerical rate of convergence O(1/ϵ), where ϵ is a pre-specified accuracy of the objective function value. We conduct thorough numerical simulations to illustrate that CMR consistently outperforms other high dimensional multivariate regression methods. We also apply CMR to solve a brain activity prediction problem and find that it is as competitive as a handcrafted model created by human experts. The R package camel implementing the proposed method is available on the Comprehensive R Archive Network http://cran.r-project.org/web/packages/camel/.
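
    A hedged sketch (camel.cmr() as the CMR fitting routine in the camel package is an assumption from the abstract's description; the data are simulated with task-specific noise levels, the setting CMR's calibration targets):

      library(camel)

      set.seed(1)
      n <- 100; d <- 50; m <- 10
      X <- matrix(rnorm(n * d), n, d)       # design matrix
      B <- matrix(0, d, m); B[1:5, ] <- 1   # sparse coefficients

      # Each regression task gets its own noise level
      E <- sweep(matrix(rnorm(n * m), n, m), 2,
                 seq(0.5, 3, length.out = m), "*")
      Y <- X %*% B + E

      fit <- camel.cmr(X, Y)   # assumed main fitting routine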

  23. Multiple hot-deck imputation for network inference from RNA sequencing data.

    PubMed

    Imbert, Alyssa; Valsesia, Armand; Le Gall, Caroline; Armenise, Claudia; Lefebvre, Gregory; Gourraud, Pierre-Antoine; Viguerie, Nathalie; Villa-Vialaneix, Nathalie

    2018-05-15

    Network inference provides a global view of the relations existing between gene expression in a given transcriptomic experiment (often only for a restricted list of chosen genes). However, it is still a challenging problem: even if the cost of sequencing techniques has decreased over the last years, the number of samples in a given experiment is still (very) small compared to the number of genes. We propose a method to increase the reliability of the inference when RNA-seq expression data have been measured together with an auxiliary dataset that can provide external information on gene expression similarity between samples. Our statistical approach, hd-MI, is based on imputation for samples without available RNA-seq data that are considered as missing data but are observed on the secondary dataset. hd-MI can improve the reliability of the inference for missing rates up to 30% and provides more stable networks with a smaller number of false positive edges. From a biological point of view, hd-MI was also found relevant for inferring networks from RNA-seq data acquired in adipose tissue during a nutritional intervention in obese individuals. In these networks, novel links between genes were highlighted, as well as an improved comparability between the two steps of the nutritional intervention. Software and sample data are available as an R package, RNAseqNet, that can be downloaded from the Comprehensive R Archive Network (CRAN). Contact: alyssa.imbert@inra.fr or nathalie.villa-vialaneix@inra.fr. Supplementary data are available at Bioinformatics online.

  24. Fabrication of superconducting magnetic shields [Réalisation d'écrans magnétiques supraconducteurs]

    NASA Astrophysics Data System (ADS)

    Lainée, F.; Kormann, R.

    1992-02-01

    Low-field and low-frequency shielding properties of YBCO magnetic shields are measured at 77 K. They compare favourably with the shielding properties of mumetal shields. Therefore, high-T_c superconducting magnetic shields can already be used to shield small volumes. The case of magnetic shields for large volumes is also discussed.

  25. LEAP: constructing gene co-expression networks for single-cell RNA-sequencing data using pseudotime ordering.

    PubMed

    Specht, Alicia T; Li, Jun

    2017-03-01

    To construct gene co-expression networks based on single-cell RNA-sequencing data, we present an algorithm called LEAP, which utilizes the estimated pseudotime of the cells to find gene co-expression that involves time delay. The R package LEAP is available on CRAN. Contact: jun.li@nd.edu. Supplementary data are available at Bioinformatics online.
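
    A hedged sketch (MAC_counter() as LEAP's main function, and its max_lag_prop argument, are recalled from the package manual; the expression matrix is simulated):

      library(LEAP)

      # Genes in rows, cells in columns, with columns already sorted by
      # estimated pseudotime (LEAP's expected input)
      expr <- matrix(rnorm(50 * 200), nrow = 50,
                     dimnames = list(paste0("gene", 1:50), NULL))

      # Maximum absolute correlation over lags up to 1/3 of the series
      results <- MAC_counter(data = expr, max_lag_prop = 1/3)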

  26. ShinyKGode: an interactive application for ODE parameter inference using gradient matching.

    PubMed

    Wandy, Joe; Niu, Mu; Giurghita, Diana; Daly, Rónán; Rogers, Simon; Husmeier, Dirk

    2018-07-01

    Mathematical modelling based on ordinary differential equations (ODEs) is widely used to describe the dynamics of biological systems, particularly in systems and pathway biology. Often the kinetic parameters of these ODE systems are unknown and have to be inferred from the data. Approximate parameter inference methods based on gradient matching (which do not require performing computationally expensive numerical integration of the ODEs) have gained popularity in recent years, but many implementations are difficult to run without expert knowledge. Here, we introduce ShinyKGode, an interactive web application to perform fast parameter inference on ODEs using gradient matching. ShinyKGode can be used to infer ODE parameters on simulated and observed data using gradient matching. Users can easily load their own models in Systems Biology Markup Language format, and a set of pre-defined ODE benchmark models is provided in the application. Inferred parameters are visualized alongside diagnostic plots to assess convergence. The R package for ShinyKGode can be installed through the Comprehensive R Archive Network (CRAN). Installation instructions, as well as tutorial videos and source code, are available at https://joewandy.github.io/shinyKGode. Supplementary data are available at Bioinformatics online.

  27. Cranberry (Vaccinium macrocarpon) protects against doxorubicin-induced cardiotoxicity in rats.

    PubMed

    Elberry, Ahmed A; Abdel-Naim, Ashraf B; Abdel-Sattar, Essam A; Nagy, Ayman A; Mosli, Hisham A; Mohamadin, Ahmed M; Ashour, Osama M

    2010-05-01

    Doxorubicin (DOX) is a widely used cancer chemotherapeutic agent. However, it generates free oxygen radicals that result in serious dose-limiting cardiotoxicity. Supplementation with berries has been proven effective in reducing oxidative stress associated with several ailments. The aim of the current study was to investigate the potential protective effect of cranberry extract (CRAN) against DOX-induced cardiotoxicity in rats. CRAN was given orally to rats (100 mg/kg/day for 10 consecutive days) and DOX (15 mg/kg; i.p.) was administered on the seventh day. CRAN protected against DOX-induced increased mortality and ECG changes. It significantly inhibited DOX-provoked glutathione (GSH) depletion and accumulation of oxidized glutathione (GSSG), malondialdehyde (MDA), and protein carbonyls in cardiac tissues. The reductions of cardiac activities of catalase (CAT), superoxide dismutase (SOD), glutathione peroxidase (GSH-Px) and glutathione reductase (GR) were significantly mitigated. Elevation of cardiac myeloperoxidase (MPO) activity in response to DOX treatment was significantly hampered. Pretreatment with CRAN significantly guarded against the DOX-induced rise of serum lactate dehydrogenase (LDH), creatine phosphokinase (CK), creatine kinase-MB (CK-MB) and troponin I levels. CRAN alleviated histopathological changes in the hearts of rats treated with DOX. In conclusion, CRAN protects against DOX-induced cardiotoxicity in rats. This can be attributed, at least in part, to CRAN's antioxidant activity.

  28. LakeMetabolizer: An R package for estimating lake metabolism from free-water oxygen using diverse statistical models

    USGS Publications Warehouse

    Winslow, Luke; Zwart, Jacob A.; Batt, Ryan D.; Dugan, Hilary; Woolway, R. Iestyn; Corman, Jessica; Hanson, Paul C.; Read, Jordan S.

    2016-01-01

    Metabolism is a fundamental process in ecosystems that crosses multiple scales of organization from individual organisms to whole ecosystems. To improve sharing and reuse of published metabolism models, we developed LakeMetabolizer, an R package for estimating lake metabolism from in situ time series of dissolved oxygen, water temperature, and, optionally, additional environmental variables. LakeMetabolizer implements 5 different metabolism models with diverse statistical underpinnings: bookkeeping, ordinary least squares, maximum likelihood, Kalman filter, and Bayesian. Each of these 5 metabolism models can be combined with 1 of 7 models for computing the coefficient of gas exchange across the air–water interface (k). LakeMetabolizer also features a variety of supporting functions that compute conversions and implement calculations commonly applied to raw data prior to estimating metabolism (e.g., oxygen saturation and optical conversion models). These tools have been organized into an R package that contains example data, example use-cases, and function documentation. The release package version is available on the Comprehensive R Archive Network (CRAN), and the full open-source GPL-licensed code is freely available for examination and extension online. With this unified, open-source, and freely available package, we hope to improve access and facilitate the application of metabolism in studies and management of lentic ecosystems.
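
    A hedged sketch of two helper calculations from the package (o2.at.sat.base() and k.cole.base() are recalled from the documentation); the full estimation is then driven by the metab() function named in the comments:

      library(LakeMetabolizer)

      # Dissolved-oxygen saturation (mg/L) from water temperature (deg C)
      o2.at.sat.base(temp = c(10, 15, 20))

      # Gas-exchange coefficient k600 from 10-m wind speed (m/s), using
      # the Cole & Caraco wind-based model (one of the 7 k models)
      k.cole.base(wnd = c(2, 5, 8))

      # Full workflow (assumed call): metab(ts.data, method = "mle"),
      # where ts.data is a prepared time-series table and method is one
      # of "bookkeep", "ols", "mle", "kalman" or "bayesian"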

  29. Retrograde air escape via the nasolacrimal system: a previously unrecognized complication of continuous positive airway pressure in the management of obstructive sleep apnea.

    PubMed

    Singh, Narinder Pal; Walker, Robbie James Eades; Cowan, Fiona; Davidson, Arthur Craig; Roberts, David Newton

    2014-05-01

    Continuous positive airway pressure (CPAP) is the gold standard treatment for moderate to severe obstructive sleep apnoea (OSA). Eye-related side effects of CPAP are commonly attributed to a poorly sealed mask, allowing leaked air to blow over the eye. We present 3 cases where attended polysomnography (A-PSG) demonstrated CPAP-associated retrograde air escape via the nasolacrimal system (CRANS) in the absence of any mask leaks. Symptoms included dry eye, epiphora, air escape from the medial canthus, and eyelid flutter. Symptoms were controlled with a variety of surgical and nonsurgical techniques. CRANS represents a previously undescribed clinical entity. CRANS may be responsible for some CPAP-related eye side effects and possibly for rarer secondary eye complications, including conjunctivitis and corneal ulceration. CRANS should be suspected in any patient on CPAP complaining of eye symptoms. CRANS may be diagnosed through careful observation during A-PSG and confirmed by performing a "saline bubble test." Management options include nonsurgical (mask alternatives, humidification, nasopharyngeal airway) and surgical techniques (nasal airway surgery, inferior turbinate out-fracture and adhesion, injection of bulking agent around Hasner's valve).

  30. Optical RRH working in an all-optical fronthaul network

    NASA Astrophysics Data System (ADS)

    Zakrzewski, Zbigniew

    2017-12-01

    The paper presents an example of an optical RRH (Remote Radio Head) design equipped with photonic components for direct connection to an all-optical network. The features an all-optical network can fulfil in support of future 5G mobile networks are indicated. The demand for optical bandwidth in fronthaul/midhaul distribution network links operating in D-RoF and A-RoF formats is analyzed. The increase in demand is due to the very large traffic generated by optical massive-MIMO RRHs/RRUs, which will operate as an Active Distributed Antenna System (A-DAS). An exemplary next-generation mobile network that utilizes the O-RRH and an all-optical backbone is presented. All components of the presented network operate in the Centralized/Cloud Radio Access Network (C-RAN) architecture, controlled with the use of OpenFlow (OF).

  31. Quantum key distribution in multicore fibre for secure radio access networks

    NASA Astrophysics Data System (ADS)

    Llorente, Roberto; Provot, Antoine; Morant, Maria

    2018-01-01

    Broadband access in the optical domain usually focuses on providing pervasive, cost-effective, high-bitrate communication in a given area. Nowadays, it is also of utmost interest to provide secure communication to the customers in that area. Wireless access networks rely on the optical domain for both the fronthaul and backhaul of the radio access network (C-RAN). Multicore fiber (MCF) has been proposed as a promising candidate for the optical medium of choice in next-generation wireless. The capacity demand of next-generation 5G networks makes high-capacity optical solutions, such as space-division multiplexing of different signals over MCF media, attractive. This work addresses secure MCF communication supporting C-RAN architectures. The paper proposes using one core of the MCF to securely transport an optical quantum key encoding, together with end-to-end wireless signals transmitted in the remaining cores in radio-over-fiber (RoF). The RoF wireless signals are suitable for radio access fronthaul and backhaul. The theoretical principle and a simulation analysis of quantum key distribution (QKD) are presented in this paper. The potential impact of optical RoF transmission crosstalk impairments is assessed experimentally considering different cellular signals on the remaining optical cores of the MCF. The experimental results report fronthaul performance over a four-core optical fiber with RoF transmission of full-standard CDMA signals providing 3.5G services in one core, HSPA+ signals providing 3.9G services in the second core, and 3GPP LTE-Advanced signals providing 4G services in the third core, with the QKD signal allocated in the fourth core.

  32. Influence of interactions between retaining walls on the calculation of passive earth pressure [Influence des interactions entre écrans de soutènement sur le calcul de la butée]

    NASA Astrophysics Data System (ADS)

    Magnan, Jean-Pierre; Meyer, Grégory

    2018-05-01

    The mobilization of passive earth pressure in front of a retaining wall involves a large volume of soil, over a distance greater than the embedment depth and dependent on the calculation parameters. The article reviews the calculation methods used to evaluate passive pressure, with emphasis on the distance needed for the passive failure mechanism to develop freely. It then evaluates, in several different ways, the effect of the interaction between two walls placed face to face on either side of an excavation. The recommended method for calculating the mobilizable passive pressure is a finite-element analysis with reduced shear-strength parameters in the zone where the passive pressure will develop. This approach makes it possible to determine corrective factors to apply to the passive pressure computed for an isolated wall, as a function of the ratio of the distance between the walls to their embedment depth.

  33. pubmed.mineR: an R package with text-mining algorithms to analyse PubMed abstracts.

    PubMed

    Rani, Jyoti; Shah, A B Rauf; Ramachandran, Srinivasan

    2015-10-01

    The PubMed literature database is a valuable source of information for scientific research. It is rich in biomedical literature, with more than 24 million citations. Data-mining of voluminous literature is a challenging task. Although several text-mining algorithms have been developed in recent years with a focus on data visualization, they have limitations: they can be slow, are rigid, and are not available as open source. We have developed an R package, pubmed.mineR, wherein we have combined the advantages of existing algorithms, overcome their limitations, and offer user flexibility and links with other packages in Bioconductor and the Comprehensive R Archive Network (CRAN) in order to expand the user's capabilities for executing multifaceted approaches. Three case studies are presented, namely 'Evolving role of diabetes educators', 'Cancer risk assessment' and 'Dynamic concepts on disease and comorbidity', to illustrate the use of pubmed.mineR. The package generally runs fast, with small elapsed times on regular workstations, even on large corpus sizes and with compute-intensive functions. pubmed.mineR is available at http://cran.r-project.org/web/packages/pubmed.mineR.
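
    A hedged sketch (readabs(), word_atomizations() and gene_atomization() are recalled from the pubmed.mineR documentation; the input file is a placeholder for abstracts saved from PubMed in text format):

      library(pubmed.mineR)

      abs <- readabs("pubmed_result.txt")   # load saved PubMed abstracts

      words <- word_atomizations(abs)   # word-frequency breakdown
      genes <- gene_atomization(abs)    # tabulated gene mentions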

  34. CATREG SOFTWARE FOR CATEGORICAL REGRESSION ANALYSIS

    EPA Science Inventory

    CatReg is a computer program, written in the R (http://cran.r-project.org) programming language, to support the conduct of exposure-response analyses by toxicologists and health scientists. CatReg can be used to perform categorical regressi...

  35. Evaluating variability and uncertainty separately in microbial quantitative risk assessment using two R packages.

    PubMed

    Pouillot, Régis; Delignette-Muller, Marie Laure

    2010-09-01

    Quantitative risk assessment has emerged as a valuable tool to enhance the scientific basis of regulatory decisions in the food safety domain. This article introduces the use of two new computing resources (R packages) specifically developed to help risk assessors in their projects. The first package, "fitdistrplus", gathers tools for choosing and fitting a parametric univariate distribution to a given dataset. The data may be continuous or discrete. Continuous data may be right-, left- or interval-censored, as is frequently obtained with analytical methods, with the possibility of various censoring thresholds within the dataset. Bootstrap procedures then allow the assessor to evaluate and model the uncertainty around the parameters and to transfer this information into a quantitative risk assessment model. The second package, "mc2d", helps to build and study two-dimensional (or second-order) Monte Carlo simulations in which the estimation of variability and uncertainty in the risk estimates is separated. This package easily allows the transfer of separated variability and uncertainty along a chain of conditional mathematical and probabilistic models. The usefulness of these packages is illustrated through a risk assessment of hemolytic uremic syndrome in children linked to the presence of Escherichia coli O157:H7 in ground beef. These R packages are freely available at the Comprehensive R Archive Network (cran.r-project.org).
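
    A short sketch of the fitdistrplus side of the workflow (fitdist() and bootdist() are documented functions; the concentration data are simulated, and the mc2d step is only pointed to in the comments):

      library(fitdistrplus)

      set.seed(42)
      conc <- rlnorm(50, meanlog = 1, sdlog = 0.8)   # simulated data

      fit <- fitdist(conc, "lnorm")   # fit a parametric distribution
      summary(fit)
      plot(fit)                       # density, CDF, Q-Q and P-P checks

      # Bootstrap parameter uncertainty, ready to transfer into a
      # two-dimensional Monte Carlo model (e.g. built with mc2d)
      boot <- bootdist(fit, niter = 500)
      summary(boot)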

  36. CatReg Software for Categorical Regression Analysis (Jul 2012)

    EPA Science Inventory

    CatReg is a computer program, written in the R (http://cran.r-project.org) programming language, to support the conduct of exposure-response analyses by toxicologists and health scientists. CatReg can be used to perform categorical regressi...

  37. CatReg Software for Categorical Regression Analysis (Nov 2006)

    EPA Science Inventory

    CatReg is a computer program, written in the R (http://cran.r-project.org) programming language, to support the conduct of exposure-response analyses by toxicologists and health scientists. CatReg can be used to perform categorical regressi...

  38. CatReg Software for Categorical Regression Analysis (Feb 2011)

    EPA Science Inventory

    CatReg is a computer program, written in the R (http://cran.r-project.org) programming language, to support the conduct of exposure-response analyses by toxicologists and health scientists. CatReg can be used to perform categorical regressi...

  39. CORM: An R Package Implementing the Clustering of Regression Models Method for Gene Clustering

    PubMed Central

    Shi, Jiejun; Qin, Li-Xuan

    2014-01-01

    We report a new R package implementing the clustering of regression models (CORM) method for clustering genes using gene expression data and provide data examples illustrating each clustering function in the package. The CORM package is freely available at CRAN from http://cran.r-project.org. PMID:25452684

  40. Anti-inflammatory Activity of Berry Fruits in Mice Model of Inflammation is Based on Oxidative Stress Modulation

    PubMed Central

    Nardi, Geisson Marcos; Farias Januario, Adriana Graziele; Freire, Cassio Geremia; Megiolaro, Fernanda; Schneider, Kétlin; Perazzoli, Marlene Raimunda Andreola; Do Nascimento, Scheley Raap; Gon, Ana Cristina; Mariano, Luísa Nathália Bolda; Wagner, Glauber; Niero, Rivaldo; Locatelli, Claudriana

    2016-01-01

    Background: Many fruits have been used as nutraceuticals because of the presence of bioactive molecules that exert biological activities. Objective: The present study was designed to compare the anti-inflammatory and antioxidant effects of methanolic extracts of Lycium barbarum (GOJI), Vaccinium macrocarpon (CRAN) and Vaccinium myrtillus (BLUE). Materials and Methods: Mice were treated with extracts (50 and 200 mg/kg, p.o.), twice a day for 10 days. Phytochemical analysis was performed by high-performance liquid chromatography. Antioxidant activity was determined by the 2,2-diphenyl-1-picrylhydrazyl (DPPH) assay, reducing power, lipid peroxidation (thiobarbituric acid reactive substances, TBARS), reduced glutathione (GSH) and catalase (CAT) activity. Anti-inflammatory activity was evaluated by paw edema followed by determination of myeloperoxidase (MPO) and TBARS. Results: High amounts of phenolic compounds, including rutin, were identified in all berry extracts. However, quercetin was observed only in BLUE and CRAN. GOJI presented higher DPPH radical scavenging activity and reducing power than BLUE and CRAN. The extracts improved antioxidant status in the liver; BLUE showed the largest reduction (75.3%) in TBARS compared to CRAN (70.7%) and GOJI (65.3%). Nonetheless, CAT activity was lower in the BLUE group. However, hepatic concentrations of GSH were higher in animals treated with GOJI than with CRAN or BLUE. Although all fruits caused a remarkable reduction in paw edema and TBARS, only BLUE and CRAN were able to reduce MPO. Conclusion: These results suggest that quercetin, rutin, or other phenolic compounds found in these berry fruit extracts could produce an anti-inflammatory response based on modulation of oxidative stress in the paw edema model. SUMMARY: Fruits broadly consumed for their nutraceutical properties include Lycium barbarum (goji berry), Vaccinium myrtillus (blueberry or bilberry) and Vaccinium macrocarpon (cranberry). The objectives of this study were to investigate and compare the chemical composition, antioxidant activity in vitro and in vivo, and anti-inflammatory properties of berry fruits bought in dry form. In summary, two main findings can be drawn from this study: (1) berry fruits presented antioxidant and anti-inflammatory activities in vitro and in vivo; (2) the extracts of GOJI, CRAN, and BLUE modulate the inflammatory process by different mechanisms. PMID:27114691

  41. SDN based millimetre wave radio over fiber (RoF) network

    NASA Astrophysics Data System (ADS)

    Amate, Ahmed; Milosavljevic, Milos; Kourtessis, Pandelis; Robinson, Matthew; Senior, John M.

    2015-01-01

    This paper introduces software-defined, millimeter-wave (mm-Wave) networks with Radio over Fiber (RoF) for the delivery of the gigabit connectivity required for fifth-generation (5G) mobile. The network enables an effective open-access system, allowing providers to manage and lease the infrastructure to service providers through new unbundled business models. Exploiting the inherent benefits of RoF, complete base station functionalities are centralized at the edges of the metro and aggregation network, leaving remote radio heads (RRHs) with only tunable filtering and amplification. A Software Defined Network (SDN) Central Controller (SCC) is responsible for managing resources across several mm-Wave Radio Access Networks (RANs), providing a global view of the several network segments. This ensures flexible resource allocation for reduced overall latency and increased throughput. The SDN-based mm-Wave RAN also allows for inter-edge-node communication; therefore, certain packets can be routed between different RANs supported by the same edge node, reducing latency. System-level simulations of the complete network have shown significant improvement of the overall throughput and SINR for wireless users by providing effective resource allocation and coordination among interfering cells. A new Coordinated Multipoint (CoMP) algorithm exploiting the benefits of the SCC's global network view for reduced delay in control message exchange is presented, accounting for a minimum packet delay and limited Channel State Information (CSI) in a Long Term Evolution-Advanced (LTE-A), Cloud RAN (CRAN) configuration. The algorithm does not require detailed CSI feedback from UEs but rather considers UE location (determined by the eNB) as the required parameter. UE throughput in the target sector is represented using a Cumulative Distribution Function (CDF). The resulting characteristics suggest a significant 60% improvement in UE cell-edge throughput following the application, in the coordinating cells, of the new CoMP algorithm. Results also show a further improvement of 36% in cell-edge UE throughput when eNBs are centralized in a CRAN backhaul architecture. The SINR distribution of UEs in the cooperating cells has also been evaluated using a box plot. As expected, UEs with CoMP perform better, demonstrating an increase of over 2 dB at the median between the transmission scenarios.

  42. Investigation of broadband digital predistortion for broadband radio over fiber transmission systems

    NASA Astrophysics Data System (ADS)

    Zhang, Xiupu; Liu, Taijun; Shen, Dongya

    2016-12-01

    In future broadband cloud radio access networks (C-RAN), front-haul transmission systems play a significant role in the performance and cost of C-RAN. Broadband, high-linearity radio over fiber (RoF) transmission systems are considered a promising solution for the front-haul. Digital linearization is one possible solution for the RoF front-haul. In this paper, we investigate RF-domain digital predistortion (DPD) linearization for broadband RoF front-haul. The implemented DPD is first investigated in 2.4 GHz WiFi-over-fiber transmission systems at 36 Mb/s, and more than 8-dB and 5.6-dB improvements in error vector magnitude (EVM) are achieved back to back (BTB) and after 10 km of single-mode fiber (SMF) transmission, respectively. Further, WiFi and ultra-wideband (UWB) wireless signals are transmitted together, with the DPD providing a linearization bandwidth of 2.4 GHz. It is shown that the implemented DPD leads to EVM improvements of 4.5 dB (BTB) and 3.1 dB (10 km SMF) for the WiFi signal, and 4.6 dB (BTB) and 4 dB (10 km SMF) for the broadband UWB signal.

  43. bnstruct: an R package for Bayesian Network structure learning in the presence of missing data.

    PubMed

    Franzin, Alberto; Sambo, Francesco; Di Camillo, Barbara

    2017-04-15

    A Bayesian Network is a probabilistic graphical model that encodes probabilistic dependencies between a set of random variables. We introduce bnstruct, an open source R package to (i) learn the structure and the parameters of a Bayesian Network from data in the presence of missing values and (ii) perform reasoning and inference on the learned Bayesian Networks. To the best of our knowledge, there is no other open source software that provides methods for all of these tasks, particularly the manipulation of missing data, which is a common situation in practice. The software is implemented in R and C and is available on CRAN under a GPL licence. Contact: francesco.sambo@unipd.it. Supplementary data are available at Bioinformatics online.
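
    A hedged sketch (BNDataset(), impute() and learn.network() are recalled from the bnstruct vignette; the tiny discrete dataset, with NA marking missing values, is invented):

      library(bnstruct)

      raw <- matrix(c(1, 2, 1,
                      2, NA, 2,
                      1, 1, NA,
                      2, 2, 2,
                      1, 1, 1), ncol = 3, byrow = TRUE)

      dataset <- BNDataset(data = raw,
                           discreteness = rep("d", 3),
                           variables = c("A", "B", "C"),
                           node.sizes = c(2, 2, 2))

      dataset <- impute(dataset)   # handle the missing values
      net <- learn.network(dataset, use.imputed.data = TRUE)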

  44. Analysis of mobile fronthaul bandwidth and wireless transmission performance in split-PHY processing architecture.

    PubMed

    Miyamoto, Kenji; Kuwano, Shigeru; Terada, Jun; Otaka, Akihiro

    2016-01-25

    We analyze the mobile fronthaul (MFH) bandwidth and the wireless transmission performance of the split-PHY processing (SPP) architecture, which redefines the functional split of centralized/cloud RAN (C-RAN) while preserving high wireless coordinated multi-point (CoMP) transmission/reception performance. The SPP architecture splits the base station (BS) functions between wireless channel coding/decoding and wireless modulation/demodulation, and employs its own CoMP joint transmission and reception schemes. Simulation results show that the SPP architecture reduces the MFH bandwidth by up to 97% relative to conventional C-RAN while matching the wireless bit error rate (BER) performance of conventional C-RAN in uplink joint reception with only a 2-dB signal-to-noise ratio (SNR) penalty.

  45. RSEIS and RFOC: Seismic Analysis in R

    NASA Astrophysics Data System (ADS)

    Lees, J. M.

    2015-12-01

    Open software is essential for reproducible scientific exchange. R packages provide a platform for the development of seismological investigation software that can be properly documented and traced for data processing. A suite of R packages designed for a wide range of seismic analysis is currently available in the free software platform called R. R is a software platform based on the S language developed at Bell Labs decades ago. Routines in R can be run as standalone function calls or developed in object-oriented mode. R comes with a base set of routines and thousands of user-developed packages. The packages developed at UNC include subroutines and interactive codes for processing seismic data, analyzing geographic information (GIS) and inverting data involved in a variety of geophysical applications. Packages related to seismic analysis currently available on CRAN (the Comprehensive R Archive Network, http://www.r-project.org/) are RSEIS, Rquake, GEOmap, RFOC, zoeppritz, RTOMO, geophys, Rwave, PEIP, hht and rFDSN. These include signal processing, data management, mapping, earthquake location, deconvolution, focal mechanisms, wavelet transforms, Hilbert-Huang transforms, tomographic inversion, and Mogi deformation, among other useful functionality. All software in R packages is required to have detailed documentation, making the exchange and modification of existing software easy. In this presentation, I will focus on the packages RSEIS and RFOC, showing examples from a variety of seismic analyses. The R approach has similarities to the popular (and expensive) MATLAB platform, although R is open source and free to download.
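
    A hedged sketch (the GH example dataset and the swig() viewer in RSEIS, and SDRfoc() in RFOC, are recalled from the packages' documentation; the fault angles are illustrative):

      library(RSEIS)

      data(GH)   # assumed bundled example seismic dataset
      swig(GH)   # interactive trace viewer: zoom, filter, pick

      library(RFOC)

      # Focal mechanism ("beachball") from strike, dip and rake
      mec <- SDRfoc(135, 35, 70, PLOT = TRUE)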

  6. Multi-factorial analysis of class prediction error: estimating optimal number of biomarkers for various classification rules.

    PubMed

    Khondoker, Mizanur R; Bachmann, Till T; Mewissen, Muriel; Dickinson, Paul; Dobrzelecki, Bartosz; Campbell, Colin J; Mount, Andrew R; Walton, Anthony J; Crain, Jason; Schulze, Holger; Giraud, Gerard; Ross, Alan J; Ciani, Ilenia; Ember, Stuart W J; Tlili, Chaker; Terry, Jonathan G; Grant, Eilidh; McDonnell, Nicola; Ghazal, Peter

    2010-12-01

Machine learning and statistical model based classifiers have increasingly been used with more complex and high dimensional biological data obtained from high-throughput technologies. Understanding the impact of various factors associated with large and complex microarray datasets on the predictive performance of classifiers is computationally intensive and under-investigated, yet vital in determining the optimal number of biomarkers for classification purposes aimed at improved detection, diagnosis, and therapeutic monitoring of diseases. We investigate the impact of microarray based data characteristics on the predictive performance of various classification rules using simulation studies. Our investigation using Random Forest, Support Vector Machines, Linear Discriminant Analysis and k-Nearest Neighbour shows that the predictive performance of classifiers is strongly influenced by training set size, biological and technical variability, replication, fold change and correlation between biomarkers. The optimal number of biomarkers for a classification problem should therefore be estimated taking account of the impact of all these factors. A database of average generalization errors is built for various combinations of these factors. The database of generalization errors can be used for estimating the optimal number of biomarkers for given levels of predictive accuracy as a function of these factors. Examples show that curves from actual biological data resemble those of simulated data with corresponding levels of data characteristics. An R package optBiomarker implementing the method is freely available for academic use from the Comprehensive R Archive Network (http://www.cran.r-project.org/web/packages/optBiomarker/).
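
    A hypothetical call sketch (the function and argument names are recalled from the package reference manual and should be treated as assumptions; see ?optimiseBiomarker):

        library(optBiomarker)

        # Estimate the optimal number of biomarkers for a target error rate
        # from the package's database of generalization errors.
        res <- optimiseBiomarker(error = 0.05, method = "RF",
                                 nTrain = 100, nRep = 3)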

  7. A studentized permutation test for three-arm trials in the 'gold standard' design.

    PubMed

    Mütze, Tobias; Konietschke, Frank; Munk, Axel; Friede, Tim

    2017-03-15

The 'gold standard' design for three-arm trials refers to trials with an active control and a placebo control in addition to the experimental treatment group. This trial design is recommended, when ethically justifiable, because it allows the simultaneous comparison of experimental treatment, active control and placebo. Parametric testing methods have been studied extensively in recent years. However, these methods often tend to be liberal or conservative when distributional assumptions are not met, particularly with small sample sizes. In this article, we introduce a studentized permutation test for testing non-inferiority and superiority of the experimental treatment compared with the active control in three-arm trials in the 'gold standard' design. The performance of the studentized permutation test for finite sample sizes is assessed in a Monte Carlo simulation study under various parameter constellations. Emphasis is put on whether the studentized permutation test meets the target significance level. For comparison purposes, commonly used Wald-type tests, which do not make any distributional assumptions, are included in the simulation study. The simulation study shows that the presented studentized permutation test for assessing non-inferiority in three-arm trials in the 'gold standard' design outperforms its competitors, for instance the test based on a quasi-Poisson model, for count data. The methods discussed in this paper are implemented in the R package ThreeArmedTrials which is available on the Comprehensive R Archive Network (CRAN). Copyright © 2016 John Wiley & Sons, Ltd.
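
    To illustrate the idea (not the ThreeArmedTrials API; consult the package documentation for its actual functions), a minimal base-R sketch of a two-sample studentized permutation test:

        # Studentized permutation test for a difference of two group means.
        stud_perm_test <- function(x, y, B = 10000) {
          t_stat <- function(a, b)
            (mean(a) - mean(b)) / sqrt(var(a) / length(a) + var(b) / length(b))
          t_obs <- t_stat(x, y)
          z <- c(x, y)
          n <- length(x)
          t_perm <- replicate(B, {
            idx <- sample(length(z), n)   # random reassignment to groups
            t_stat(z[idx], z[-idx])
          })
          mean(abs(t_perm) >= abs(t_obs))  # two-sided permutation p-value
        }

        set.seed(1)
        stud_perm_test(rpois(30, 2), rpois(30, 3))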

  8. BoolNet--an R package for generation, reconstruction and analysis of Boolean networks.

    PubMed

    Müssel, Christoph; Hopfensitz, Martin; Kestler, Hans A

    2010-05-15

    As the study of information processing in living cells moves from individual pathways to complex regulatory networks, mathematical models and simulation become indispensable tools for analyzing the complex behavior of such networks and can provide deep insights into the functioning of cells. The dynamics of gene expression, for example, can be modeled with Boolean networks (BNs). These are mathematical models of low complexity, but have the advantage of being able to capture essential properties of gene-regulatory networks. However, current implementations of BNs only focus on different sub-aspects of this model and do not allow for a seamless integration into existing preprocessing pipelines. BoolNet efficiently integrates methods for synchronous, asynchronous and probabilistic BNs. This includes reconstructing networks from time series, generating random networks, robustness analysis via perturbation, Markov chain simulations, and identification and visualization of attractors. The package BoolNet is freely available from the R project at http://cran.r-project.org/ or http://www.informatik.uni-ulm.de/ni/mitarbeiter/HKestler/boolnet/ under Artistic License 2.0. hans.kestler@uni-ulm.de Supplementary data are available at Bioinformatics online.
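
    A short example of the package's core workflow (based on the mammalian cell cycle network bundled with BoolNet; function names per the package manual):

        library(BoolNet)

        # Load the cell cycle network shipped with the package and identify
        # its synchronous attractors.
        data(cellcycle)
        attr <- getAttractors(cellcycle)
        plotAttractors(attr)

        # Reconstruct a network from simulated binary time series.
        ts <- generateTimeSeries(cellcycle, numSeries = 10, numMeasurements = 10)
        rec <- reconstructNetwork(ts, method = "bestfit", maxK = 3)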

  9. Executable research compendia in geoscience research infrastructures

    NASA Astrophysics Data System (ADS)

    Nüst, Daniel

    2017-04-01

From generation through analysis and collaboration to communication, scientific research requires the right tools. Scientists create their own software using third-party libraries and platforms. Cloud computing, Open Science, public data infrastructures, and Open Source offer scientists unprecedented opportunities, nowadays often in a field "Computational X" (e.g. computational seismology) or X-informatics (e.g. geoinformatics) [0]. This increases complexity and generates more innovation, e.g. Environmental Research Infrastructures (environmental RIs [1]). Researchers in Computational X write their software relying on both source code (e.g. from https://github.com) and binary libraries (e.g. from package managers such as APT, https://wiki.debian.org/Apt, or CRAN, https://cran.r-project.org/). They download data from domain-specific (cf. https://re3data.org) or generic (e.g. https://zenodo.org) data repositories, and deploy computations remotely (e.g. European Open Science Cloud). The results themselves are archived, given persistent identifiers, connected to other works (e.g. using https://orcid.org/), and listed in metadata catalogues. A single researcher, intentionally or not, interacts with all sub-systems of RIs: data acquisition, data access, data processing, data curation, and community support [2]. To preserve computational research, [3] proposes the Executable Research Compendium (ERC), a container format closing the gap of dependency preservation by encapsulating the runtime environment. ERCs and RIs can be integrated for different uses: (i) Coherence: ERC services validate completeness, integrity and results; (ii) Metadata: ERCs connect the different parts of a piece of research and facilitate discovery; (iii) Exchange and Preservation: ERCs as usable building blocks are the shared and archived entity; (iv) Self-consistency: ERCs remove dependence on ephemeral sources; (v) Execution: ERC services create and execute a packaged analysis but integrate with existing platforms for display and control. These integrations are vital for capturing workflows in RIs and for connecting key stakeholders (scientists, publishers, librarians). They are demonstrated using developments by the DFG-funded project Opening Reproducible Research (http://o2r.info). Semi-automatic creation of ERCs based on research workflows is a core goal of the project. References [0] Tony Hey, Stewart Tansley, Kristin Tolle (eds), 2009. The Fourth Paradigm: Data-Intensive Scientific Discovery. Microsoft Research. [1] P. Martin et al., Open Information Linking for Environmental Research Infrastructures, 2015 IEEE 11th International Conference on e-Science, Munich, 2015, pp. 513-520. doi: 10.1109/eScience.2015.66 [2] Y. Chen et al., Analysis of Common Requirements for Environmental Science Research Infrastructures, The International Symposium on Grids and Clouds (ISGC) 2013, Taipei, 2013, http://pos.sissa.it/archive/conferences/179/032/ISGC [3] Opening Reproducible Research, Geophysical Research Abstracts Vol. 18, EGU2016-7396, 2016, http://meetingorganizer.copernicus.org/EGU2016/EGU2016-7396.pdf

  10. Preprocessing of gene expression data by optimally robust estimators

    PubMed Central

    2010-01-01

    Background The preprocessing of gene expression data obtained from several platforms routinely includes the aggregation of multiple raw signal intensities to one expression value. Examples are the computation of a single expression measure based on the perfect match (PM) and mismatch (MM) probes for the Affymetrix technology, the summarization of bead level values to bead summary values for the Illumina technology or the aggregation of replicated measurements in the case of other technologies including real-time quantitative polymerase chain reaction (RT-qPCR) platforms. The summarization of technical replicates is also performed in other "-omics" disciplines like proteomics or metabolomics. Preprocessing methods like MAS 5.0, Illumina's default summarization method, RMA, or VSN show that the use of robust estimators is widely accepted in gene expression analysis. However, the selection of robust methods seems to be mainly driven by their high breakdown point and not by efficiency. Results We describe how optimally robust radius-minimax (rmx) estimators, i.e. estimators that minimize an asymptotic maximum risk on shrinking neighborhoods about an ideal model, can be used for the aggregation of multiple raw signal intensities to one expression value for Affymetrix and Illumina data. With regard to the Affymetrix data, we have implemented an algorithm which is a variant of MAS 5.0. Using datasets from the literature and Monte-Carlo simulations we provide some reasoning for assuming approximate log-normal distributions of the raw signal intensities by means of the Kolmogorov distance, at least for the discussed datasets, and compare the results of our preprocessing algorithms with the results of Affymetrix's MAS 5.0 and Illumina's default method. The numerical results indicate that when using rmx estimators an accuracy improvement of about 10-20% is obtained compared to Affymetrix's MAS 5.0 and about 1-5% compared to Illumina's default method. The improvement is also visible in the analysis of technical replicates where the reproducibility of the values (in terms of Pearson and Spearman correlation) is increased for all Affymetrix and almost all Illumina examples considered. Our algorithms are implemented in the R package named RobLoxBioC which is publicly available via CRAN, The Comprehensive R Archive Network (http://cran.r-project.org/web/packages/RobLoxBioC/). Conclusions Optimally robust rmx estimators have a high breakdown point and are computationally feasible. They can lead to a considerable gain in efficiency for well-established bioinformatics procedures and thus, can increase the reproducibility and power of subsequent statistical analysis. PMID:21118506
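
    A hedged sketch of row-wise rmx estimation on a matrix of raw intensities (the robloxbioc() entry point and its eps arguments are recalled from the package manual; check ?robloxbioc):

        library(RobLoxBioC)

        set.seed(1)
        raw <- matrix(rlnorm(20 * 11), nrow = 20)   # 20 probes, 11 replicates

        # rmx estimates of location and scale per row, for an assumed
        # contamination radius of up to 5%.
        est <- robloxbioc(log2(raw), eps.upper = 0.05)
        head(est)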

  11. Cross-validation and Peeling Strategies for Survival Bump Hunting using Recursive Peeling Methods

    PubMed Central

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J. Sunil

    2015-01-01

We introduce a framework to build a survival/risk bump hunting model with a censored time-to-event response. Our Survival Bump Hunting (SBH) method is based on a recursive peeling procedure that uses a specific survival peeling criterion derived from non/semi-parametric statistics such as the hazard ratio, the log-rank test or the Nelson-Aalen estimator. To optimize the tuning parameter of the model and validate it, we introduce an objective function based on survival or prediction-error statistics, such as the log-rank test and the concordance error rate. We also describe two alternative cross-validation techniques adapted to the joint task of decision-rule making by recursive peeling and survival estimation. Numerical analyses show the importance of replicated cross-validation and the differences between criteria and techniques in both low- and high-dimensional settings. Although several non-parametric survival models exist, none addresses the problem of directly identifying local extrema. We show how SBH efficiently estimates extreme survival/risk subgroups, unlike other models. This provides an insight into the behavior of commonly used models and suggests alternatives to be adopted in practice. Finally, our SBH framework was applied to a clinical dataset. In it, we identified subsets of patients characterized by clinical and demographic covariates with a distinct extreme survival outcome, for which tailored medical interventions could be made. An R package PRIMsrc (Patient Rule Induction Method in Survival, Regression and Classification settings) is available on CRAN (Comprehensive R Archive Network) and GitHub. PMID:27034730
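
    A hypothetical call sketch (sbh() is the package's main entry point per its documentation; the argument names used here are assumptions to be checked against ?sbh):

        library(PRIMsrc)

        # Simulated covariates, follow-up times and event indicator.
        set.seed(1)
        X <- matrix(rnorm(100 * 5), ncol = 5)
        y <- rexp(100)
        delta <- rbinom(100, 1, 0.7)

        # Survival bump hunting fit.
        fit <- sbh(X = X, y = y, delta = delta)
        summary(fit)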

  12. Archives of Transformation: A Case Study of the International Women's Network against Militarism's Archival System

    ERIC Educational Resources Information Center

    Cachola, Ellen-Rae Cabebe

    2014-01-01

    This dissertation describes the International Women's Network Against Militarism's (IWNAM) political epistemology of security from an archival perspective, and how they create community archives to evidence this epistemology. This research examines records created by Women for Genuine Security (WGS) and Women's Voices Women Speak (WVWS), U.S. and…

  13. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network

    NASA Astrophysics Data System (ADS)

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-07-01

Cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, optical network and processing unit cloud have been decoupled from each other, so their resources are controlled independently. Traditional architectures cannot implement the resource optimization and scheduling needed for high-level service guarantees, because the communication obstacles among these domains grow with the number of mobile internet users. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. MDRI can enhance the responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical network and processing resources effectively to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load scenario is also quantitatively evaluated to demonstrate the efficiency of the proposal based on the MDRI architecture in terms of resource utilization, path blocking probability, network cost and path provisioning latency, compared with other provisioning schemes.

  14. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network.

    PubMed

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-07-28

Cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, optical network and processing unit cloud have been decoupled from each other, so their resources are controlled independently. Traditional architectures cannot implement the resource optimization and scheduling needed for high-level service guarantees, because the communication obstacles among these domains grow with the number of mobile internet users. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. MDRI can enhance the responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical network and processing resources effectively to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load scenario is also quantitatively evaluated to demonstrate the efficiency of the proposal based on the MDRI architecture in terms of resource utilization, path blocking probability, network cost and path provisioning latency, compared with other provisioning schemes.

  15. Performance evaluation of multi-stratum resources optimization with network functions virtualization for cloud-based radio over optical fiber networks.

    PubMed

    Yang, Hui; He, Yongqi; Zhang, Jie; Ji, Yuefeng; Bai, Wei; Lee, Young

    2016-04-18

Cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing using cloud BBUs. In our previous work, we implemented cross-stratum optimization of optical network and application stratum resources, which allows services to be accommodated in optical networks. This study extends that work to the multi-dimensional optimization of radio, optical and BBU processing resources in the 5G era. We propose a novel multi-stratum resources optimization (MSRO) architecture with network functions virtualization for cloud-based radio over optical fiber networks (C-RoFN) using software-defined control. A global evaluation scheme (GES) for MSRO in C-RoFN is introduced based on the proposed architecture. MSRO can enhance the responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical and BBU resources effectively to maximize radio coverage. The efficiency and feasibility of the proposed architecture are experimentally demonstrated on an OpenFlow-based enhanced SDN testbed. The performance of GES under a heavy traffic load scenario is also quantitatively evaluated based on the MSRO architecture in terms of resource occupation rate and path provisioning latency, compared with other provisioning schemes.

  16. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network

    PubMed Central

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-01-01

Cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, optical network and processing unit cloud have been decoupled from each other, so their resources are controlled independently. Traditional architectures cannot implement the resource optimization and scheduling needed for high-level service guarantees, because the communication obstacles among these domains grow with the number of mobile internet users. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. MDRI can enhance the responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical network and processing resources effectively to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load scenario is also quantitatively evaluated to demonstrate the efficiency of the proposal based on the MDRI architecture in terms of resource utilization, path blocking probability, network cost and path provisioning latency, compared with other provisioning schemes. PMID:27465296

  17. SYNCSA--R tool for analysis of metacommunities based on functional traits and phylogeny of the community components.

    PubMed

    Debastiani, Vanderlei J; Pillar, Valério D

    2012-08-01

SYNCSA is an R package for the analysis of metacommunities based on functional traits and phylogeny of the community components. It offers tools to calculate several matrix correlations that express trait-convergence assembly patterns, trait-divergence assembly patterns and phylogenetic signal in functional traits at the species pool level and at the metacommunity level. SYNCSA is a package for the R environment, under a GPL-2 open-source license and freely available on the official CRAN web server for R (http://cran.r-project.org). vanderleidebastiani@yahoo.com.br.
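
    A hedged sketch with simulated matrices (the syncsa() arguments follow the package manual; see ?syncsa):

        library(SYNCSA)

        set.seed(1)
        sp   <- paste0("sp", 1:5)
        site <- paste0("site", 1:10)
        comm   <- matrix(rpois(50, 2), 10, 5, dimnames = list(site, sp))
        traits <- matrix(rnorm(10), 5, 2, dimnames = list(sp, c("t1", "t2")))
        phy    <- as.matrix(dist(rnorm(5)))
        dimnames(phy) <- list(sp, sp)
        env    <- matrix(rnorm(10), 10, 1, dimnames = list(site, "env1"))

        # Matrix correlations for trait-convergence/divergence patterns.
        res <- syncsa(comm, traits = traits, phylodist = phy, envir = env)
        res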

  18. pez: phylogenetics for the environmental sciences.

    PubMed

    Pearse, William D; Cadotte, Marc W; Cavender-Bares, Jeannine; Ives, Anthony R; Tucker, Caroline M; Walker, Steve C; Helmus, Matthew R

    2015-09-01

    pez is an R package that permits measurement, modelling and simulation of phylogenetic structure in ecological data. pez contains the first implementation of many methods in R, and aggregates existing data structures and methods into a single, coherent package. pez is released under the GPL v3 open-source license, available on the Internet from CRAN (http://cran.r-project.org). The package is under active development, and the authors welcome contributions (see http://github.com/willpearse/pez). will.pearse@gmail.com. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
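
    A short sketch using the package's example data (object names are recalled from the pez vignette and should be verified after data(laja)):

        library(pez)

        # Bundle phylogeny, community, traits and environment into one object.
        data(laja)
        dat <- comparative.comm(invert.tree, river.sites, invert.traits, river.env)

        # Shape and evenness families of phylogenetic structure metrics.
        head(pez.shape(dat))
        head(pez.evenness(dat))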

  19. Picante: R tools for integrating phylogenies and ecology.

    PubMed

    Kembel, Steven W; Cowan, Peter D; Helmus, Matthew R; Cornwell, William K; Morlon, Helene; Ackerly, David D; Blomberg, Simon P; Webb, Campbell O

    2010-06-01

    Picante is a software package that provides a comprehensive set of tools for analyzing the phylogenetic and trait diversity of ecological communities. The package calculates phylogenetic diversity metrics, performs trait comparative analyses, manipulates phenotypic and phylogenetic data, and performs tests for phylogenetic signal in trait distributions, community structure and species interactions. Picante is a package for the R statistical language and environment written in R and C, released under a GPL v2 open-source license, and freely available on the web (http://picante.r-forge.r-project.org) and from CRAN (http://cran.r-project.org).
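
    A minimal example with the bundled phylocom data (a list holding $phylo and $sample):

        library(picante)

        data(phylocom)

        # Faith's phylogenetic diversity and species richness per community.
        pd(phylocom$sample, phylocom$phylo)

        # Standardized effect size of mean pairwise distance (null model:
        # shuffling taxa labels across the distance matrix).
        ses.mpd(phylocom$sample, cophenetic(phylocom$phylo),
                null.model = "taxa.labels", runs = 99)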

  20. Cooperative Work and Sustainable Scientific Software Practices in R

    NASA Astrophysics Data System (ADS)

    Weber, N.

    2013-12-01

Most scientific software projects depend on the work of many diverse people, institutions and organizations. Incentivizing these actors to cooperatively develop software that is both reliable and sustainable is complicated by the fact that the reward structures of these various actors differ greatly: research scientists want results from a software package or model run in order to publish papers, produce new data, or test a hypothesis; software engineers and research centers want compilable, well-documented code that is refactorable, reusable and reproducible in future research scenarios. While much research has been done on incentives and motivations for participating in open source software projects or cyberinfrastructure development, little work has been done on what motivates or incentivizes developers to maintain scientific software projects beyond their original application. This poster will present early results of research into the incentives and motivation for cooperative scientific software development. In particular, this work focuses on motivations for the maintenance and repair of libraries on the software platform R. Our work here uses a sample of R packages that were created by research centers, or are specific to earth, environmental and climate science applications. We first mined 'check' logs from the Comprehensive R Archive Network (CRAN) to determine the amount of time a package has existed, the number of versions it has gone through over this time, the number of releases, and finally the contact information for each official package 'maintainer'. We then sent a survey to each official maintainer, asking them questions about what role they played in developing the original package, and what their motivations were for sustaining the project over time. We will present early results from this mining and our survey of R maintainers.
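
    A hedged sketch of one way to harvest such CRAN metadata today (not the authors' pipeline; tools::CRAN_package_db() requires a recent R and network access):

        # Current CRAN metadata as a data frame, one row per package.
        db <- tools::CRAN_package_db()

        # Crude keyword sample of geoscience-related packages.
        geo <- db[grepl("seismic|climate|geophys", db$Description,
                        ignore.case = TRUE), ]
        nrow(geo)
        head(geo[, c("Package", "Maintainer")])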

  1. The Comet Halley archive: Summary volume

    NASA Technical Reports Server (NTRS)

    Sekanina, Zdenek (Editor); Fry, Lori (Editor)

    1991-01-01

    The contents are as follows: The Organizational History of the International Halley Watch; Operations of the International Halley Watch from a Lead Center Perspective; The Steering Group; Astrometry Network; Infrared Studies Network; Large-Scale Phenomena Network; Meteor Studies Network; Near-Nucleus Studies Network; Photometry and Polarimetry Network; Radio Science Network; Spectroscopy and Spectrophotometry Network; Amateur Observation Network; Use of the CD-ROM Archive; The 1986 Passage of Comet Halley; and Recent Observations of Comet Halley.

  2. 77 FR 32141 - Privacy Act of 1974, as Amended; System of Records Notices

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-31

    ... records titled ``Internal Collaboration Network''. SUMMARY: The National Archives and Records... 43, the Internal Collaboration Network, which contains files with information on National Archives.... SUPPLEMENTARY INFORMATION: The Internal Collaboration Network is a web- based platform that allows users to...

  3. SchemaOnRead: A Package for Schema-on-Read in R

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    North, Michael J.

Schema-on-read is an agile approach to data storage and retrieval that defers investments in data organization until production queries need to be run by working with data directly in native form. Schema-on-read functions have been implemented in a wide range of analytical systems, most notably Hadoop. SchemaOnRead is a CRAN package that uses R’s flexible data representations to provide transparent and convenient support for the schema-on-read paradigm in R. The schema-on-read tools within the package include a single function call that recursively reads folders with text, comma separated value, raster image, R data, HDF5, NetCDF, spreadsheet, Weka, Epi Info, Pajek network, R network, HTML, SPSS, Systat, and Stata files. The provided tools can be used as-is or easily adapted to implement customized schema-on-read tool chains in R. This paper’s contribution is that it introduces and describes SchemaOnRead, the first R package specifically focused on providing explicit schema-on-read support in R.

  4. dendextend: an R package for visualizing, adjusting and comparing trees of hierarchical clustering

    PubMed Central

    2015-01-01

    Summary: dendextend is an R package for creating and comparing visually appealing tree diagrams. dendextend provides utility functions for manipulating dendrogram objects (their color, shape and content) as well as several advanced methods for comparing trees to one another (both statistically and visually). As such, dendextend offers a flexible framework for enhancing R's rich ecosystem of packages for performing hierarchical clustering of items. Availability and implementation: The dendextend R package (including detailed introductory vignettes) is available under the GPL-2 Open Source license and is freely available to download from CRAN at: (http://cran.r-project.org/package=dendextend) Contact: Tal.Galili@math.tau.ac.il PMID:26209431
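
    A short example of styling and comparing two dendrograms (function names per the dendextend manual):

        library(dendextend)

        # Two hierarchical clusterings of the same data.
        dend1 <- as.dendrogram(hclust(dist(USArrests), method = "average"))
        dend2 <- as.dendrogram(hclust(dist(USArrests), method = "complete"))

        # Color the branches of four clusters and plot.
        dend1 <- set(dend1, "branches_k_color", k = 4)
        plot(dend1)

        # Visual and statistical comparison of the two trees.
        tanglegram(dend1, dend2)
        cor_cophenetic(dend1, dend2)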

  5. The gputools package enables GPU computing in R.

    PubMed

    Buckner, Joshua; Wilson, Justin; Seligman, Mark; Athey, Brian; Watson, Stanley; Meng, Fan

    2010-01-01

    By default, the R statistical environment does not make use of parallelism. Researchers may resort to expensive solutions such as cluster hardware for large analysis tasks. Graphics processing units (GPUs) provide an inexpensive and computationally powerful alternative. Using R and the CUDA toolkit from Nvidia, we have implemented several functions commonly used in microarray gene expression analysis for GPU-equipped computers. R users can take advantage of the better performance provided by an Nvidia GPU. The package is available from CRAN, the R project's repository of packages, at http://cran.r-project.org/web/packages/gputools More information about our gputools R package is available at http://brainarray.mbni.med.umich.edu/brainarray/Rgpgpu
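
    A hedged sketch (calls as recalled from the gputools manual; a CUDA-capable Nvidia GPU is required):

        library(gputools)

        x <- matrix(rnorm(1000 * 50), ncol = 50)

        # Pearson correlations computed on the GPU.
        gpuCor(x, x, method = "pearson")

        # GPU-backed distance matrix, e.g. as input to clustering.
        d <- gpuDist(x, method = "euclidean")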

  6. Ionospheric characteristics for archiving at the World Data Centers. Technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamache, R.R.; Reinisch, B.W.

    1990-12-01

A database structure for archiving ionospheric characteristics at uneven data rates was developed at the July 1989 Ionospheric Informatics Working Group (IIWG) Lowell Workshop in Digital Ionogram Data Formats for World Data Center Archiving. This structure is proposed as a new URSI standard and is being employed by the World Data Center A for solar terrestrial physics for archiving characteristics. Here the database has been slightly refined for the application and programs written to generate these database files using as input Digisonde 256 ARTIST data, post processed by the ULCAR ADEP (ARTIST Data Editing Program) system. The characteristics program as well as supplemental programs developed for this task are described here. The new software will make it possible to archive the ionospheric characteristics from the Geophysics Laboratory high latitude Digisonde network, the AWS DISS and the international Digisonde networks, and other ionospheric sounding networks.

  7. BASiNET-BiologicAl Sequences NETwork: a case study on coding and non-coding RNAs identification.

    PubMed

    Ito, Eric Augusto; Katahira, Isaque; Vicente, Fábio Fernandes da Rocha; Pereira, Luiz Filipe Protasio; Lopes, Fabrício Martins

    2018-06-05

With the emergence of Next Generation Sequencing (NGS) technologies, a large volume of sequence data, in particular from de novo sequencing, is rapidly produced at relatively low cost. In this context, computational tools are increasingly important to assist in the identification of relevant information to understand the functioning of organisms. This work introduces BASiNET, an alignment-free tool for classifying biological sequences based on feature extraction from complex network measurements. The method initially transforms the sequences and represents them as complex networks. It then extracts topological measures and constructs a feature vector that is used to classify the sequences. The method was evaluated in the classification of coding and non-coding RNAs of 13 species and compared to the CNCI, PLEK and CPC2 methods. BASiNET outperformed all compared methods in all adopted organisms and datasets. BASiNET classified sequences in all organisms with high accuracy and low standard deviation, showing that the method is robust and not biased by the organism. The proposed methodology is implemented as open source in the R language and is freely available for download at https://cran.r-project.org/package=BASiNET.
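
    A hypothetical call sketch (the classification() entry point and its FASTA-path arguments are recalled from the package manual; "mRNA.fasta" and "lncRNA.fasta" are placeholder file names):

        library(BASiNET)

        # Classify coding vs. long non-coding sequences from two FASTA files.
        res <- classification(mRNA = "mRNA.fasta", lncRNA = "lncRNA.fasta")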

  8. Integration Of An MR Image Network Into A Clinical PACS

    NASA Astrophysics Data System (ADS)

    Ratib, Osman M.; Mankovich, Nicholas J.; Taira, Ricky K.; Cho, Paul S.; Huang, H. K.

    1988-06-01

A direct link between a clinical pediatric PACS module and a FONAR MRI image network was implemented. The original MR network combines the MR scanner, a remote viewing station and a central archiving station. The pediatric PACS directly connects to the archiving unit through an Ethernet TCP-IP network adhering to FONAR's protocol. The PACS communication software developed supports the transfer of patient studies and patient information directly from the MR archive database to the pediatric PACS. In the first phase of our project we developed a package to transfer data between a VAX-11/750 and the IBM PC/AT-based MR archive database through the Ethernet network. This system served as a model for PACS-to-modality network communication. Once testing was complete on this research network, the software and network hardware were moved to the clinical pediatric VAX for full PACS integration. In parallel with the direct transmission of digital images to the pediatric PACS, a broadband communication system in video format was developed for real-time broadcasting of images originating from the MR console to 8 remote viewing stations distributed in the radiology department. These analog viewing stations allow the radiologists to directly monitor patient positioning and to select the scan levels during a patient examination from remote locations in the radiology department. This paper reports (1) the technical details of this implementation, (2) the merits of this network development scheme, and (3) the performance statistics of the network-to-PACS interface.

  9. Community archiving of imaging studies

    NASA Astrophysics Data System (ADS)

    Fritz, Steven L.; Roys, Steven R.; Munjal, Sunita

    1996-05-01

The quantity of image data created in a large radiology practice has long been a challenge for available archiving technology. Traditional methods of archiving the large quantity of films generated in radiology have relied on warehousing in remote sites, with courier delivery of film files for historical comparisons. A digital community archive, accessible via a wide area network, represents a feasible solution to the problem of archiving digital images from a busy practice. In addition, it affords a physician caring for a patient access to imaging studies performed at a variety of healthcare institutions without the need to repeat studies. Security problems include both network security issues in the WAN environment and access control for patient, physician and imaging center. The key obstacle to developing a community archive is currently political. Reluctance to participate in a community archive can be reduced by appropriate design of the access mechanisms.

  10. Mapping the Socio-Technical Complexity of Australian Science: From Archival Authorities to Networks of Contextual Information

    ERIC Educational Resources Information Center

    McCarthy, Gavan; Evans, Joanne

    2007-01-01

    This article examines the evolution of a national register of the archives of science and technology in Australia and the related development of an archival informatics focused initially on people and their relationships to archival materials. The register was created in 1985 as an in-house tool for the Australian Science Archives Project of the…

  11. LPmerge: an R package for merging genetic maps by linear programming.

    PubMed

    Endelman, Jeffrey B; Plomion, Christophe

    2014-06-01

    Consensus genetic maps constructed from multiple populations are an important resource for both basic and applied research, including genome-wide association analysis, genome sequence assembly and studies of evolution. The LPmerge software uses linear programming to efficiently minimize the mean absolute error between the consensus map and the linkage maps from each population. This minimization is performed subject to linear inequality constraints that ensure the ordering of the markers in the linkage maps is preserved. When marker order is inconsistent between linkage maps, a minimum set of ordinal constraints is deleted to resolve the conflicts. LPmerge is on CRAN at http://cran.r-project.org/web/packages/LPmerge. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
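
    A toy example of merging two linkage maps of one chromosome (input format per the LPmerge manual: one data frame per population, marker names in column 1, positions in cM in column 2):

        library(LPmerge)

        map1 <- data.frame(marker   = c("m1", "m2", "m3", "m4"),
                           position = c(0, 5.2, 11.0, 20.3))
        map2 <- data.frame(marker   = c("m1", "m3", "m4", "m5"),
                           position = c(0, 9.8, 18.5, 25.1))

        # One consensus map per value of max.interval is returned.
        consensus <- LPmerge(list(map1, map2), max.interval = 1:2)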

  12. The Archive of the Amateur Observation Network of the International Halley Watch. Volume 1; Comet Giacobini-Zinner

    NASA Technical Reports Server (NTRS)

    Edberg, Stephen J. (Editor)

    1996-01-01

    The International Halley Watch (IHW) was organized for the purpose of gathering and archiving the most complete record of the apparition of a comet, Comet Halley (1982i = 1986 III = 1P/Halley), ever compiled. The redirection of the International Cometary Explorer (ICE), toward Comet Giacobini-Zinner (1984e = 1985 XIII = 21P/Giacobini-Zinner) prompted the initiation of a formal watch on that comet. All the data collected on P/Giacobini-Zinner and P/Halley have been published on CD-ROM in the Comet Halley Archive. This document contains a printed version of the archive data, collected by amateur astronomers, on these two comets. Volume 1 contains the Comet Giacobini-Zinner data archive and Volume 2 contains the Comet Halley archive. Both volumes include information on how to read the data in both archives, as well as a history of both comet watches (including the organizing of the network of astronomers and lessons learned from that experience).

  13. SchemaOnRead: A Package for Schema-on-Read in R

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    North, Michael J.

Schema-on-read is an agile approach to data storage and retrieval that defers investments in data organization until production queries need to be run by working with data directly in native form. Schema-on-read functions have been implemented in a wide range of analytical systems, most notably Hadoop. SchemaOnRead is a CRAN package that uses R’s flexible data representations to provide transparent and convenient support for the schema-on-read paradigm in R. The schema-on-read tools within the package include a single function call that recursively reads folders with text, comma separated value, raster image, R data, HDF5, NetCDF, spreadsheet, Weka, Epi Info, Pajek network, R network, HTML, SPSS, Systat, and Stata files. The provided tools can be used as-is or easily adapted to implement customized schema-on-read tool chains in R. This paper’s contribution is that it introduces and describes SchemaOnRead, the first R package specifically focused on providing explicit schema-on-read support in R.
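
    A minimal sketch (the schemaOnRead() entry point is recalled from the package paper; "./data" is a placeholder path):

        library(SchemaOnRead)

        # Recursively read a folder tree in native form; the reader
        # dispatches on file type and returns nested R structures.
        contents <- schemaOnRead("./data")
        str(contents, max.level = 2)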

  14. Closet to Cloud: The online archiving of tape-based continuous NCSN seismic data from 1993-2005

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Aranha, M. A.; Kohler, W. M.; Oppenheimer, D.

    2016-12-01

As earthquake monitoring systems in the 1980s moved from analog to digital recording systems, most seismic networks archived only digital waveforms from detected events, due to the lack of affordable online digital storage for continuous high-rate (100 sps) data. The Northern California Earthquake Data Center (NCEDC), established in 1991 by UC Berkeley and the USGS Menlo Park, archived 20 sps continuous data and triggered high-rate data from the sparse Berkeley seismic network, but could not afford the online storage for continuous high-rate data from the 300+ stations of the USGS Northern California Seismic Network (NCSN). The discovery of non-volcanic tremor and the use of continuous waveform correlation techniques for detecting repeating earthquakes, combined with the increase in disk capacity and the significant reduction in disk costs, led the NCEDC to begin archiving continuous high-rate waveforms in 2004-2005. The USGS Menlo Park had backup tapes of continuous high-rate NCSN waveform data from 1993 onward on the shelf, and the USGS and NCEDC embarked on a project to restore and archive all continuous NCSN data from 1993 through 2005. We will discuss the procedures and problems encountered when reading, transcribing, converting data formats, SEED channel naming, and archiving the 1993-2005 continuous NCSN waveforms. We will also illustrate new science enabled by these data. These and other northern California seismic and geophysical data are available via web services at http://service.ncedc.org

  15. The Archive of the Amateur Observation Network of the International Halley Watch. Volume 2; Comet Halley

    NASA Technical Reports Server (NTRS)

    Edberg, Stephen J. (Editor)

    1996-01-01

    The International Halley Watch (IHW) was organized for the purpose of gathering and archiving the most complete record of the apparition of a comet, Halley's Comet (1982i = 1986 III = 1P/Halley), ever compiled. The redirection of the International Sun-Earth Explorer 3 (ISEE-3) spacecraft, subsequently renamed the International Cometary Explorer (ICE), toward Comet Giacobini- Zinner (1984e = 1985 XIII = 21P/Giacobini-Zinner) prompted the initiation of a formal watch on that comet. All the data collected on P/Giacobini-Zinner and P/Halley have been published on CD-ROM in the Comet Halley Archive. This document contains a printed version of the archive data, collected by amateur astronomers, on these two comets. Volume 1 contains the Comet Giacobini-Zinner data archive and Volume 2 contains the Comet Halley archive. Both volumes include information on how to read the data in both archives, as well as a history of both comet watches (including the organizing of the network of astronomers and lessons learned from that experience).

  16. The Archive of the Amateur Observation Network of the International Halley Watch. Volume 1; Comet Giacobini-Zinner

    NASA Technical Reports Server (NTRS)

    Edberg, Stephen J. (Editor)

    1996-01-01

    The International Halley Watch (IHW) was organized for the purpose of gathering and archiving the most complete record of the apparition of a comet, Halley's Comet (1982i = 1986 III = 1P/Halley), ever compiled. The redirection of the International Sun-Earth Explorer 3 (ISEE-3) spacecraft, subsequently renamed the International Cometary Explorer (ICE), toward Comet Giacobini-Zinner (1984e = 1985 XIII = 21P/Giacobini-Zinner) prompted the initiation of a formal watch on that comet. All the data collected on P/Giacobini-Zinner and P/Halley have been published on CD-ROM in the Comet Halley Archive. This document contains a printed version of the archive data, collected by amateur astronomers, on these two comets. Volume 1 contains the Comet Giacobini-Zinner data archive and Volume 2 contains the Comet Halley archive. Both volumes include information on how to read the data in both archives, as well as a history of both comet watches (including the organizing of the network of astronomers and lessons learned from that experience).

  17. Water level ingest, archive and processing system - an integral part of NOAA's tsunami database

    NASA Astrophysics Data System (ADS)

    McLean, S. J.; Mungov, G.; Dunbar, P. K.; Price, D. J.; Mccullough, H.

    2013-12-01

The National Oceanic and Atmospheric Administration (NOAA), National Geophysical Data Center (NGDC) and collocated World Data Service for Geophysics (WDS) provide long-term archive, data management, and access to national and global tsunami data. Archive responsibilities include the NOAA Global Historical Tsunami event and runup database, damage photos, as well as other related hazards data. Beginning in 2008, NGDC was given the responsibility of archiving, processing and distributing all tsunami and hazards-related water level data collected from NOAA observational networks in a coordinated and consistent manner. These data include the Deep-ocean Assessment and Reporting of Tsunami (DART) data provided by the National Data Buoy Center (NDBC), coastal-tide-gauge data from the National Ocean Service (NOS) network, and tide-gauge data from the regional networks of the two National Weather Service (NWS) Tsunami Warning Centers (TWCs). Taken together, this integrated archive supports the tsunami forecast, warning, research, mitigation and education efforts of NOAA and the Nation. Due to the variety of the water level data, the automatic ingest system was redesigned, and the inventory, archive and delivery capabilities were upgraded based on modern digital data archiving practices. The data processing system was also upgraded and redesigned, focusing on data quality assessment in an operational manner. This poster focuses on data availability, highlighting the automation of all steps of data ingest, archive, processing and distribution. Examples are given from recent events such as Hurricane Sandy in October 2012, the February 6, 2013 Solomon Islands tsunami, and the June 13, 2013 meteotsunami along the U.S. East Coast.

  18. The challenge of a data storage hierarchy

    NASA Technical Reports Server (NTRS)

    Ruderman, Michael

    1992-01-01

A discussion of Mesa Archival Systems' data archiving system is presented. This data archiving system is strictly a software system, implemented on a mainframe, that manages data in permanent file storage. Emphasis is placed on the fact that any kind of client system on the network can be connected through the data archiving system's Unix interface.

  19. Archive interoperability in the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Genova, Françoise

    2003-02-01

Main goals of Virtual Observatory projects are to build interoperability between astronomical on-line services, observatory archives, databases and results published in journals, and to develop tools permitting the best scientific usage of the very large data sets stored in observatory archives and produced by large surveys. The different Virtual Observatory projects collaborate to define common exchange standards, which are the key to a truly International Virtual Observatory: for instance, their first common milestone has been a standard allowing the exchange of tabular data, called VOTable. The Interoperability Work Area of the European Astrophysical Virtual Observatory project aims at networking European archives, by building a prototype using the CDS VizieR and Aladin tools, and at defining basic rules to help archive providers with interoperability implementation. The prototype is accessible for scientific usage, to get user feedback (and science results!) at an early stage of the project. The ISO archive participates very actively in this endeavour, and more generally in information networking. The on-going inclusion of the ISO log in SIMBAD will allow higher-level links for users.

  20. The challenges of archiving networked-based multimedia performances (Performance cryogenics)

    NASA Astrophysics Data System (ADS)

    Cohen, Elizabeth; Cooperstock, Jeremy; Kyriakakis, Chris

    2002-11-01

Music archives and libraries have cultural preservation at the core of their charters. New forms of art often race ahead of the preservation infrastructure. The ability to stream multiple synchronized ultra-low-latency streams of audio and video across a continent for a distributed interactive performance, such as music and dance with high-definition video and multichannel audio, raises a series of challenges for the architects of digital libraries and those responsible for cultural preservation. The archiving of such performances presents numerous challenges that go beyond simply recording each stream. Case studies of storage and subsequent retrieval issues for Internet2 collaborative performances are discussed. The development of shared reality and immersive environments raises questions such as: What constitutes an archived performance that occurs across a network (in multiple spaces over time)? What are the families of metadata necessary to reconstruct this virtual world in another venue or era? For example, if the network exhibited changes in latency, the performers most likely adapted; in a future re-creation, the latency will most likely be completely different. We discuss the parameters of immersive environment acquisition and rendering, network architectures, software architecture, musical/choreographic scores, and environmental acoustics that must be considered to address this problem.

  1. European distributed seismological data archives infrastructure: EIDA

    NASA Astrophysics Data System (ADS)

    Clinton, John; Hanka, Winfried; Mazza, Salvatore; Pederson, Helle; Sleeman, Reinoud; Stammler, Klaus; Strollo, Angelo

    2014-05-01

The European Integrated waveform Data Archive (EIDA) is a distributed Data Center system within ORFEUS that (a) securely archives seismic waveform data and related metadata gathered by European research infrastructures, and (b) provides transparent access to the archives for the geosciences research communities. EIDA was founded in 2013 by the ORFEUS Data Center, GFZ, RESIF, ETH, INGV and BGR to ensure the sustainability of a distributed archive system, the implementation of standards (e.g. FDSN StationXML, FDSN web services), and the coordination of new developments. Under the mandate of the ORFEUS Board of Directors and Executive Committee, the founding group is responsible for steering and maintaining the technical developments and organization of the European distributed seismic waveform data archive and its integration within broader multidisciplinary frameworks like EPOS. EIDA currently offers uniform access to unrestricted data from 8 European archives (www.orfeus-eu.org/eida), linked by the Arclink protocol, hosting data from 75 permanent networks (1800+ stations) and 33 temporary networks (1200+ stations). Moreover, each archive may also provide unique, restricted datasets. A web interface, developed at GFZ, offers interactive access to different catalogues (EMSC, GFZ, USGS) and EIDA waveform data. Clients and toolboxes like arclink_fetch and ObsPy can connect directly to any EIDA node to collect data. Current developments are directed toward the implementation of quality parameters and strong-motion parameters.

  2. Service-Based Extensions to an OAIS Archive for Science Data Management

    NASA Astrophysics Data System (ADS)

    Flathers, E.; Seamon, E.; Gessler, P. E.

    2014-12-01

With new data management mandates from major funding sources such as the National Institutes of Health and the National Science Foundation, the architecture of science data archive systems is becoming a critical concern for research institutions. The Consultative Committee for Space Data Systems (CCSDS), in 2002, released the first version of its Reference Model for an Open Archival Information System (OAIS). The CCSDS document (now an ISO standard) was updated in 2012 with additional focus on verifying the authenticity of data and developing concepts of access rights and a security model. The OAIS model is a good fit for research data archives, having been designed to support data collections of heterogeneous types, disciplines, storage formats, etc. for the space sciences. As fast, reliable, persistent Internet connectivity spreads, new network-available resources have been developed that can support the science data archive. A natural extension of an OAIS archive is interconnection with network- or cloud-based services and resources. We use the Service Oriented Architecture (SOA) design paradigm to describe a set of extensions to an OAIS-type archive: the purpose and justification for each extension, where and how each extension connects to the model, and an example of a specific service that meets the purpose.

  3. Security Considerations for Archives: Rare Book, Manuscript, and Other Special Collections.

    ERIC Educational Resources Information Center

    Cupp, Christian M.

    The first of six sections in this guide to security for special collections in archives and libraries discusses the importance of security and the difficulty of preventing theft of archival materials. The second section, which focuses on planning, recommends an inservice training program for staff, a planned communications network between library…

  4. CRANS - CONFIGURABLE REAL-TIME ANALYSIS SYSTEM

    NASA Technical Reports Server (NTRS)

    Mccluney, K.

    1994-01-01

In a real-time environment, the results of changes or failures in a complex, interconnected system need evaluation quickly. Tabulations showing the effects of changes and/or failures of a given item in the system are generally only useful for a single input, and only with regard to that item. Subsequent changes become harder to evaluate as combinations of failures produce a cascade effect. When confronted by multiple indicated failures in the system, it becomes necessary to determine a single cause. In this case, failure tables are not very helpful. CRANS, the Configurable Real-time ANalysis System, can interpret a logic tree, constructed by the user, describing a complex system and determine the effects of changes and failures in it. Items in the tree are related to each other by Boolean operators. The user is then able to change the state of these items (ON/OFF, FAILED/UNFAILED). The program then evaluates the logic tree based on these changes and determines any resultant changes to other items in the tree. CRANS can also search for a common cause for multiple item failures, and allow the user to explore the logic tree from within the program. A "help" mode and a reference check provide the user with a means of exploring an item's underlying logic from within the program. A commonality check determines single-point failures for an item or group of items. Output is in the form of a user-defined matrix or matrices of colored boxes, each box representing an item or set of items from the logic tree. Input is via mouse selection of the matrix boxes, using the mouse buttons to toggle the state of the item. CRANS is written in the C language and requires the MIT X Window System, Version 11 Revision 4 or Revision 5. It requires 78K of RAM for execution and a three-button mouse. It has been successfully implemented on Sun4 workstations running SunOS, HP9000 workstations running HP-UX, and DECstations running ULTRIX. No executable is provided on the distribution medium; however, a sample makefile is included. Sample input files are also included. The standard distribution medium is a .25 inch streaming magnetic tape cartridge (Sun QIC-24) in UNIX tar format. Alternate distribution media and formats are available upon request. This program was developed in 1992.

  5. Detailed description of the Mayo/IBM PACS

    NASA Astrophysics Data System (ADS)

    Gehring, Dale G.; Persons, Kenneth R.; Rothman, Melvyn L.; Salutz, James R.; Morin, Richard L.

    1991-07-01

    The Mayo Clinic and IBM/Rochester have jointly developed a picture archiving system (PACS) for use with Mayo's MRI and Neuro-CT imaging modalities. The system was developed to replace the imaging system's vendor-supplied magnetic tape archiving capability. The system consists of seven MR imagers and nine CT scanners, each interfaced to the PACS via IBM Personal System/2(tm) (PS/2) computers, which act as gateways from the imaging modality to the PACS network. The PAC system operates on the token-ring component of Mayo's city-wide local area network. Also on the PACS network are four optical storage subsystems used for image archival, three optical subsystems used for image retrieval, an IBM Application System/400(tm) (AS/400) computer used for database management and multiple PS/2-based image display systems and their image servers.

  6. JNDMS Task Authorization 2 Report

    DTIC Science & Technology

    2013-10-01

    uses Barnyard to store alarms from all DREnet Snort sensors in a MySQL database. Barnyard is an open source tool designed to work with Snort to take...Technology ITI Information Technology Infrastructure J2EE Java 2 Enterprise Edition JAR Java Archive. This is an archive file format defined by Java ...standards. JDBC Java Database Connectivity JDW JNDMS Data Warehouse JNDMS Joint Network and Defence Management System JNDMS Joint Network Defence and

  7. The SSABLE system - Automated archive, catalog, browse and distribution of satellite data in near-real time

    NASA Technical Reports Server (NTRS)

    Simpson, James J.; Harkins, Daniel N.

    1993-01-01

    Historically, locating and browsing satellite data has been a cumbersome and expensive process. This has impeded the efficient and effective use of satellite data in the geosciences. SSABLE is a new interactive tool for the archive, browse, order, and distribution of satellite data based upon X Window, high bandwidth networks, and digital image rendering techniques. SSABLE provides for automatically constructing relational database queries to archived image datasets based on time, date, geographical location, and other selection criteria. SSABLE also provides a visual representation of the selected archived data for viewing on the user's X terminal. SSABLE is a near real-time system; for example, data are added to SSABLE's database within 10 min after capture. SSABLE is network and machine independent; it will run identically on any machine which satisfies the following three requirements: 1) has a bitmapped display (monochrome or greater); 2) is running the X Window system; and 3) is on a network directly reachable by the SSABLE system. SSABLE has been evaluated at over 100 international sites. Network response time in the United States and Canada varies between 4 and 7 s for browse image updates; reported transmission times to Europe and Australia typically are 20-25 s.

  8. LDCM Ground System. Network Lesson Learned

    NASA Technical Reports Server (NTRS)

    Gal-Edd, Jonathan

    2010-01-01

    This slide presentation reviews the Landsat Data Continuity Mission (LDCM) and the lessons learned in implementing the network that was assembled to allow for the acquisition, archiving and distribution of the data from the Landsat mission. The objective of the LDCM is to continue the acquisition, archiving, and distribution of moderate-resolution multispectral imagery affording global, synoptic, and repetitive coverage of the earth's land surface at a scale where natural and human-induced changes can be detected, differentiated, characterized, and monitored over time. It includes a review of the ground network, including a block diagram of the ground network elements (GNE) and a review of the RF design and testing. Also included is a listing of the lessons learned.

  9. Archive of observations of periodic comet Crommelin made during its 1983-84 apparition

    NASA Technical Reports Server (NTRS)

    Sekanina, Z. (Editor); Aronsson, M.

    1985-01-01

    This is an archive of 680 reduced observations of Periodic Comet Crommelin made during its 1984 apparition. The archive integrates reports by members of the eight networks of the International Halley Watch (IHW) and presents the results of a trial run designed to test the preparedness of the IHW organization for the current apparition of Periodic Comet Halley.

  10. Image acquisition unit for the Mayo/IBM PACS project

    NASA Astrophysics Data System (ADS)

    Reardon, Frank J.; Salutz, James R.

    1991-07-01

    The Mayo Clinic and IBM Rochester, Minnesota, have jointly developed a picture archiving, distribution and viewing system for use with Mayo's CT and MRI imaging modalities. Images are retrieved from the modalities and sent over the Mayo city-wide token ring network to optical storage subsystems for archiving, and to server subsystems for viewing on image review stations. Images may also be retrieved from archive and transmitted back to the modalities. The subsystems that interface to the modalities and communicate with the other components of the system are termed Image Acquisition Units (IAUs). The IAUs are IBM Personal System/2 (PS/2) computers with specially developed software. They operate independently in a network of cooperative subsystems and communicate with the modalities, archive subsystems, image review server subsystems, and a central subsystem that maintains information about the content and location of images. This paper provides a detailed description of the function and design of the Image Acquisition Units.

  11. The LCOGT Science Archive and Data Pipeline

    NASA Astrophysics Data System (ADS)

    Lister, Tim; Walker, Z.; Ciardi, D.; Gelino, C. R.; Good, J.; Laity, A.; Swain, M.

    2013-01-01

    Las Cumbres Observatory Global Telescope (LCOGT) is building and deploying a world-wide network of optical telescopes dedicated to time-domain astronomy. In the past year, we have deployed and commissioned four new 1m telescopes at McDonald Observatory, Texas and at CTIO, Chile, with more to come at SAAO, South Africa and Siding Spring Observatory, Australia. To handle these new data sources coming from the growing LCOGT network, and to serve them to end users, we have constructed a new data pipeline and Science Archive. We describe the new LCOGT pipeline, currently under development and testing, which makes use of the ORAC-DR automated recipe-based data reduction pipeline and illustrate some of the new data products. We also present the new Science Archive, which is being developed in partnership with the Infrared Processing and Analysis Center (IPAC) and show some of the new features the Science Archive provides.

  12. RAPIDR: an analysis package for non-invasive prenatal testing of aneuploidy

    PubMed Central

    Lo, Kitty K.; Boustred, Christopher; Chitty, Lyn S.; Plagnol, Vincent

    2014-01-01

    Non-invasive prenatal testing (NIPT) of fetal aneuploidy using cell-free fetal DNA is becoming part of routine clinical practice. RAPIDR (Reliable Accurate Prenatal non-Invasive Diagnosis R package) is an easy-to-use open-source R package that implements several published NIPT analysis methods. The input to RAPIDR is a set of sequence alignment files in the BAM format, and the outputs are calls for aneuploidy, including trisomies 13, 18, 21 and monosomy X as well as fetal sex. RAPIDR has been extensively tested with a large sample set as part of the RAPID project in the UK. The package contains quality control steps to make it robust for use in the clinical setting. Availability and implementation: RAPIDR is implemented in R and can be freely downloaded via CRAN from here: http://cran.r-project.org/web/packages/RAPIDR/index.html. Contact: kitty.lo@ucl.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24990604
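
    As a sketch of the workflow the abstract describes (BAM files binned into counts, a reference set built from known-outcome samples, then aneuploidy calls), the steps below use function names recalled from the RAPIDR documentation; treat them as assumptions and verify against the current CRAN manual.

        library(RAPIDR)  # sketch only; names below are assumptions, check the manual

        # 1. Bin aligned reads from each BAM file into fixed-width genomic bins.
        makeBinnedCountsFile(bam.file.list = c("case1.bam", "case2.bam"),
                             sampleIDs = c("case1", "case2"),
                             binned.counts.fname = "binned_counts.csv")

        # 2. Build a reference set from samples with known euploid outcomes.
        ref <- createReferenceSetFromCounts("binned_counts.csv", gcCorrect = TRUE)

        # 3. Call trisomies 13/18/21, monosomy X, and fetal sex for unknown samples.
        results <- testUnknowns(ref, "binned_counts.csv")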

  13. SNPassoc: an R package to perform whole genome association studies.

    PubMed

    González, Juan R; Armengol, Lluís; Solé, Xavier; Guinó, Elisabet; Mercader, Josep M; Estivill, Xavier; Moreno, Víctor

    2007-03-01

    The popularization of large-scale genotyping projects has led to the widespread adoption of genetic association studies as the tool of choice in the search for single nucleotide polymorphisms (SNPs) underlying susceptibility to complex diseases. Although the analysis of individual SNPs is a relatively trivial task, when the number of SNPs is large and multiple genetic models need to be explored, a tool to automate the analyses becomes necessary. To address this issue, we developed SNPassoc, an R package to carry out the most common analyses in whole genome association studies. These analyses include descriptive statistics and exploratory analysis of missing values, calculation of Hardy-Weinberg equilibrium, analysis of association based on generalized linear models (for either quantitative or binary traits), and analysis of multiple SNPs (haplotype and epistasis analysis). Package SNPassoc is available on CRAN at http://cran.r-project.org. A tutorial is available at Bioinformatics online and at http://davinci.crg.es/estivill_lab/snpassoc.
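
    A minimal usage sketch based on the example dataset shipped with the package; the column positions are assumptions taken from the package vignette and may differ between versions.

        library(SNPassoc)

        data(SNPs)                                          # example data in the package
        myData <- setupSNP(SNPs, colSNPs = 6:40, sep = "")  # declare the SNP columns

        tableHWE(myData)                              # Hardy-Weinberg equilibrium per SNP
        association(casco ~ snp10001, data = myData)  # one SNP under several genetic models
        WGassociation(casco, data = myData, model = "log-additive")  # whole-set scan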

  14. Using Network Analysis to Characterize Biogeographic Data in a Community Archive

    NASA Astrophysics Data System (ADS)

    Wellman, T. P.; Bristol, S.

    2017-12-01

    Informative measures are needed to evaluate and compare data from multiple providers in a community-driven data archive. This study explores insights from network theory and other descriptive and inferential statistics to examine data content and application across an assemblage of publicly available biogeographic data sets. The data are archived in ScienceBase, a collaborative catalog of scientific data supported by the U.S. Geological Survey to enhance scientific inquiry and acuity. In gaining understanding through this investigation and other scientific venues, our goal is to improve scientific insight and data use across a spectrum of scientific applications. Network analysis is a tool for revealing patterns of non-trivial topological features in data that exhibit neither complete regularity nor complete randomness. In this work, network analyses are used to explore shared events and dependencies between measures of data content and application derived from metadata and catalog information and measures relevant to biogeographic study. Descriptive statistical tools are used to explore relations between network analysis properties, while inferential statistics are used to evaluate the degree of confidence in these assessments. Network analyses have been used successfully in related fields to examine social awareness of scientific issues, taxonomic structures of biological organisms, and ecosystem resilience to environmental change. Use of network analysis also shows promising potential to identify relationships in biogeographic data that inform programmatic goals and scientific interests.

  15. Recommendations for a service framework to access astronomical archives

    NASA Technical Reports Server (NTRS)

    Travisano, J. J.; Pollizzi, J.

    1992-01-01

    There are a large number of astronomical archives and catalogs on-line for network access, with many different user interfaces and features. Some systems are moving towards distributed access, supplying users with client software for their home sites which connects to servers at the archive site. Many of the issues involved in defining a standard framework of services that archive/catalog suppliers can use to achieve a basic level of interoperability are described. Such a framework would simplify the development of client and server programs to access the wide variety of astronomical archive systems. The primary services that are supplied by current systems include: catalog browsing, dataset retrieval, name resolution, and data analysis. The following issues (and probably more) need to be considered in establishing a standard set of client/server interfaces and protocols: Archive Access - dataset retrieval, delivery, file formats, data browsing, analysis, etc.; Catalog Access - database management systems, query languages, data formats, synchronous/asynchronous mode of operation, etc.; Interoperability - transaction/message protocols, distributed processing mechanisms (DCE, ONC/SunRPC, etc), networking protocols, etc.; Security - user registration, authorization/authentication mechanisms, etc.; Service Directory - service registration, lookup, port/task mapping, parameters, etc.; Software - public vs proprietary, client/server software, standard interfaces to client/server functions, software distribution, operating system portability, data portability, etc. Several archive/catalog groups, notably the Astrophysics Data System (ADS), are already working in many of these areas. In the process of developing StarView, which is the user interface to the Space Telescope Data Archive and Distribution Service (ST-DADS), these issues and the work of others were analyzed. A framework of standard interfaces for accessing services on any archive system which would benefit archive user and supplier alike is proposed.

  16. Technical note: The US Dobson station network data record prior to 2015, re-evaluation of NDACC and WOUDC archived records with WinDobson processing software

    NASA Astrophysics Data System (ADS)

    Evans, Robert D.; Petropavlovskikh, Irina; McClure-Begley, Audra; McConville, Glen; Quincy, Dorothy; Miyagawa, Koji

    2017-10-01

    The United States government has operated Dobson ozone spectrophotometers at various sites, starting during the International Geophysical Year (1 July 1957 to 31 December 1958). A network of stations for long-term monitoring of the total column content (thickness of the ozone layer) of the atmosphere was established in the early 1960s and eventually grew to 16 stations, 14 of which are still operational and submit data to the United States of America's National Oceanic and Atmospheric Administration (NOAA). Seven of these sites are also part of the Network for the Detection of Atmospheric Composition Change (NDACC), an organization that maintains its own data archive. Due to recent changes in data processing software the entire dataset was re-evaluated for possible changes. To evaluate and minimize potential changes caused by the new processing software, the reprocessed data record was compared to the original data record archived in the World Ozone and UV Data Center (WOUDC) in Toronto, Canada. The history of the observations at the individual stations, the instruments used for the NOAA network monitoring at the station, the method for reducing zenith-sky observations to total ozone, and calibration procedures were re-evaluated using data quality control tools built into the new software. At the completion of the evaluation, the new datasets are to be published as an update to the WOUDC and NDACC archives, and the entire dataset is to be made available to the scientific community. The procedure for reprocessing Dobson data and the results of the reanalysis on the archived record are presented in this paper. A summary of historical changes to 14 station records is also provided.

  17. Designing Solar Data Archives: Practical Considerations

    NASA Astrophysics Data System (ADS)

    Messerotti, M.

    The variety of new solar observatories in space and on the ground poses the stringent problem of an efficient storage and archiving of huge datasets. We briefly address some typical architectures and consider the key point of data access and distribution through networking.

  18. Ocean Networks Canada's "Big Data" Initiative

    NASA Astrophysics Data System (ADS)

    Dewey, R. K.; Hoeberechts, M.; Moran, K.; Pirenne, B.; Owens, D.

    2013-12-01

    Ocean Networks Canada operates two large undersea observatories that collect, archive, and deliver data in real time over the Internet. These data contribute to our understanding of the complex changes taking place on our ocean planet. Ocean Networks Canada's VENUS was the world's first cabled seafloor observatory to enable researchers anywhere to connect in real time to undersea experiments and observations. Its NEPTUNE observatory is the largest cabled ocean observatory, spanning a wide range of ocean environments. Most recently, we installed a new small observatory in the Arctic. Together, these observatories deliver "Big Data" across many disciplines in a cohesive manner using the Oceans 2.0 data management and archiving system that provides national and international users with open access to real-time and archived data while also supporting a collaborative work environment. Ocean Networks Canada operates these observatories to support science, innovation, and learning in four priority areas: study of the impact of climate change on the ocean; the exploration and understanding the unique life forms in the extreme environments of the deep ocean and below the seafloor; the exchange of heat, fluids, and gases that move throughout the ocean and atmosphere; and the dynamics of earthquakes, tsunamis, and undersea landslides. To date, the Ocean Networks Canada archive contains over 130 TB (collected over 7 years) and the current rate of data acquisition is ~50 TB per year. This data set is complex and diverse. Making these "Big Data" accessible and attractive to users is our priority. In this presentation, we share our experience as a "Big Data" institution where we deliver simple and multi-dimensional calibrated data cubes to a diverse pool of users. Ocean Networks Canada also conducts extensive user testing. Test results guide future tool design and development of "Big Data" products. We strive to bridge the gap between the raw, archived data and the needs and experience of a diverse user community, each requiring tailored data visualization and integrated products. By doing this we aim to design tools that maximize exploitation of the data.

  19. DOSim: an R package for similarity between diseases based on Disease Ontology.

    PubMed

    Li, Jiang; Gong, Binsheng; Chen, Xi; Liu, Tao; Wu, Chao; Zhang, Fan; Li, Chunquan; Li, Xiang; Rao, Shaoqi; Li, Xia

    2011-06-29

    The construction of the Disease Ontology (DO) has helped promote the investigation of diseases and disease risk factors. DO enables researchers to analyse disease similarity by adopting semantic similarity measures, which has expanded our understanding of the relationships between different diseases and made it possible to classify them. Simultaneously, similarities between genes can also be analysed by their associations with similar diseases. As a result, disease heterogeneity is better understood and insights into the molecular pathogenesis of similar diseases have been gained. However, bioinformatics tools that provide easy and straightforward ways to use DO to study disease and gene similarity simultaneously are required. We have developed an R-based software package (DOSim) to compute the similarity between diseases and to measure the similarity between human genes in terms of diseases. DOSim incorporates a DO-based enrichment analysis function that can be used to explore the disease features of an independent gene set. A multilayered enrichment analysis (GO and KEGG) annotation function that helps users explore the biological meaning implied in a newly detected gene module is also part of the DOSim package. We used the disease similarity application to demonstrate the relationships between 128 different DO cancer terms. The hierarchical clustering of these 128 different cancers showed modular characteristics. In another case study, we used the gene similarity application on 361 obesity-related genes. The results revealed the complex pathogenesis of obesity. In addition, the gene module detection and gene module multilayered annotation functions in DOSim, when applied to these 361 obesity-related genes, helped extend our understanding of the complex pathogenesis of obesity risk phenotypes and the heterogeneity of obesity-related diseases. DOSim can be used to detect disease-driven gene modules, and to annotate the modules for functions and pathways. The DOSim package can also be used to visualise the DO structure. DOSim reflects the modular characteristics of disease-related genes and promotes our understanding of the complex pathogenesis of diseases. DOSim is available on the Comprehensive R Archive Network (CRAN) or at http://bioinfo.hrbmu.edu.cn/dosim.
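
    A hypothetical usage sketch of the two similarity applications; the function names below are modeled on the companion GOSim package and are assumptions, not verified calls.

        library(DOSim)  # sketch; getTermSim/getGeneSim are assumed names

        # Semantic similarity between two Disease Ontology terms (IDs are illustrative).
        getTermSim(c("DOID:1612", "DOID:3459"), method = "Resnik")

        # Similarity between two genes, scored through their associated diseases.
        getGeneSim(c("TP53", "BRCA1"))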

  20. Improved Data Access From the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D.; Oppenheimer, D.; Zuzlewski, S.; Klein, F.; Jensen, E.; Gee, L.; Murray, M.; Romanowicz, B.

    2002-12-01

    The NCEDC is a joint project of the UC Berkeley Seismological Laboratory and the USGS Menlo Park to provide a long-term archive and distribution center for geophysical data for northern California. Most data are available via the Web at http://quake.geo.berkeley.edu and research accounts are available for access to specialized datasets. Current efforts continue to expand the available datasets, enhance distribution methods, and provide rapid access to all datasets. The NCEDC archives continuous and event-based seismic and geophysical time-series data from the BDSN, the USGS NCSN, the UNR Seismic Network, the Parkfield HRSN, and the Calpine/Unocal Geysers network. In collaboration with the USGS, the NCEDC has archived a total of 887 channels from 139 sites of the "USGS low-frequency" geophysical network (UL), including data from strainmeters, creep meters, magnetometers, water well levels, and tiltmeters. There are 336 active continuous data channels that are updated at the NCEDC on a daily basis. Geodetic data from the BARD network of over 40 continuously recording GPS sites are archived at the NCEDC in both raw and RINEX format. The NCEDC is the primary archive for survey-mode GPS and other geodetic data collected in northern California by the USGS, universities, and other agencies. All of the BARD data and GPS data archived from USGS Menlo Park surveys are now available through the GPS Seamless Archive Centers (GSAC), and by FTP directly from the NCEDC. Virtually all time-series data at the NCEDC are now available in SEED with complete instrument responses. Assembling, verifying, and maintaining the response information for these networks is a huge task, and is accomplished through the collaborative efforts of the NCEDC and the contributing agencies. Until recently, the NCSN waveform data were available only through research accounts and special request methods due to incomplete instrument responses. In the last year, the USGS compiled the necessary descriptions for both historic and current NCSN instrumentation. The NCEDC and USGS jointly developed a procedure to create and maintain the hardware attributes and instrument responses at the NCEDC for the 3500 NCSN channels. As a result, the NCSN waveform data can now be distributed in SEED format. The NCEDC provides access to waveform data through Web forms, email requests, and programming interfaces. The SeismiQuery Web interface provides information about data holdings. NetDC allows users to retrieve inventory information, instrument responses, and waveforms in SEED format. STP provides both a Web and programming interface to retrieve data in SEED or other user-friendly formats. Through the newly formed California Integrated Seismic Network, we are working with the SCEDC to provide unified access to California earthquake data.

  1. Enabling technologies for millimeter-wave radio-over-fiber systems in next generation heterogeneous mobile access networks

    NASA Astrophysics Data System (ADS)

    Zhang, Junwen; Yu, Jianjun; Wang, Jing; Xu, Mu; Cheng, Lin; Lu, Feng; Shen, Shuyi; Yan, Yan; Cho, Hyunwoo; Guidotti, Daniel; Chang, Gee-kung

    2017-01-01

    The fifth-generation (5G) wireless access network promises higher access data rates, with more than 1,000 times the capacity of current long-term evolution (LTE) systems. New radio-access technologies (RATs) based on carrier frequencies up to the millimeter-wave (MMW) band over radio-over-fiber (RoF) links, together with carrier aggregation (CA) of multi-band resources, are being intensively studied to support high-data-rate access and effective use of frequency resources in heterogeneous mobile networks (Het-Nets). In this paper, we investigate several enabling technologies for MMW RoF systems in the 5G Het-Net. Efficient mobile fronthaul (MFH) solutions for 5G centralized radio access networks (C-RAN) and beyond are proposed, analyzed and experimentally demonstrated based on the analog scheme. Digital predistortion based on memory polynomials for analog MFH linearization is presented, with improved EVM performance and receiver sensitivity. We also propose and experimentally demonstrate a novel inter-/intra-RAT CA scheme for the 5G Het-Net. A real-time standard 4G-LTE signal is carrier-aggregated with three broadband 60 GHz MMW signals using the proposed optical-domain band-mapping method. RATs based on new waveforms have also been studied to achieve higher spectral efficiency (SE) in asynchronous environments. A full-duplex asynchronous quasi-gapless carrier aggregation scheme for MMW RoF inter-/intra-RAT operation based on FBMC is also presented with 4G-LTE signals. Compared with OFDM-based signals with large guard-bands, FBMC achieves higher spectral efficiency with better EVM performance at lower received power and smaller guard-bands.

  2. Technology and the Transformation of Archival Description

    ERIC Educational Resources Information Center

    Pitti, Daniel V.

    2005-01-01

    The emergence of computer and network technologies has presented the archival profession with daunting challenges as well as inspiring opportunities. Archivists have been actively imagining and realizing the application of advanced technologies to their professional functions and activities. Using advanced technologies, archivists have been…

  3. WWLLN Data User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lay, Erin Hoffmann; Wiens, Kyle Cameron; Delapp, Dorothea Marcia

    2016-03-11

    The World Wide Lightning Location Network (WWLLN) provides continuous global lightning monitoring and detection. At LANL we collect and archive these data on a daily basis. This document describes the WWLLN data, how they are collected and archived, and how to use the data at LANL.

  4. A new phycitine moth (Vorapourouma basseti, Lepidoptera: Pyralidae) from Panama feeding on Pourouma Aubl. (Urticaceae)

    USDA-ARS?s Scientific Manuscript database

    A study of the insects associated with the tree Pourouma bicolor Martius (Cecropiaceae) in Panama, resulted in the discovery of a new phycitine moth genus and species, Vorapourouma basseti (Lepidoptera: Pyralidae). The immatures were collected by beating vegetation using the Fort Sherman Canopy Cran...

  5. 32 CFR 2001.50 - Telecommunications automated information systems and network security.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... and network security. 2001.50 Section 2001.50 National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED... network security. Each agency head shall ensure that classified information electronically accessed...

  6. 32 CFR 2001.50 - Telecommunications automated information systems and network security.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... and network security. 2001.50 Section 2001.50 National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED... network security. Each agency head shall ensure that classified information electronically accessed...

  7. 32 CFR 2001.50 - Telecommunications automated information systems and network security.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... and network security. 2001.50 Section 2001.50 National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED... network security. Each agency head shall ensure that classified information electronically accessed...

  8. Keep It Sacred | National Native Network

    Science.gov Websites


  9. 40 CFR 58.16 - Data submittal and archiving requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 5 2011-07-01 2011-07-01 false Data submittal and archiving requirements. 58.16 Section 58.16 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) AMBIENT AIR QUALITY SURVEILLANCE Monitoring Network § 58.16 Data submittal and...

  10. Back to the Future: Long-Term Seismic Archives Revisited

    NASA Astrophysics Data System (ADS)

    Waldhauser, F.; Schaff, D. P.

    2007-12-01

    Archives of digital seismic data recorded by seismometer networks around the world have grown tremendously over the last several decades, helped by the deployment of seismic stations and their continued operation within the framework of monitoring seismic activity. These archives typically consist of waveforms of seismic events and associated parametric data such as phase arrival time picks and the location of hypocenters. Catalogs of earthquake locations are fundamental data in seismology, and even in the Earth sciences in general. Yet these locations have notoriously low spatial resolution because of errors in both the picks and the models commonly used to locate events one at a time. This limits their potential to address fundamental questions concerning the physics of earthquakes, the structure and composition of the Earth's interior, and the seismic hazards associated with active faults. We report on the comprehensive use of modern waveform cross-correlation based methodologies for high-resolution earthquake location, as applied to regional and global long-term seismic databases. By simultaneous re-analysis of two decades of the digital seismic archive of Northern California, reducing pick errors via cross-correlation and model errors via double-differencing, we achieve up to three orders of magnitude resolution improvement over existing hypocenter locations. The relocated events image networks of discrete faults at seismogenic depths across various tectonic settings that until now have been hidden in location uncertainties. Similar location improvements are obtained for earthquakes recorded at global networks by re-processing 40 years of parametric data from the ISC and corresponding waveforms archived at IRIS. Since our methods are scaleable and run on inexpensive Beowulf clusters, periodic re-analysis of entire archives may thus become a routine procedure to continuously improve resolution in existing catalogs. We demonstrate the role of seismic archives in obtaining the precise location of new events in real time. Such information has considerable social and economic impact in the evaluation and mitigation of seismic hazards, for example, and highlights the need for consistent long-term seismic monitoring and archiving of records.

  11. The IRIS DMC: Perspectives on Real-Time Data Management and Open Access From a Large Seismological Archive: Challenges, Tools, and Quality Assurance

    NASA Astrophysics Data System (ADS)

    Benson, R. B.

    2007-05-01

    The IRIS Data Management Center, located in Seattle, WA, is the largest openly accessible geophysical archive in the world, and has a unique perspective on data management and operational practices that get the most out of a network. Networks span broad domains in time and space, from finite deployments that monitor bridges and dams to national and international networks like the GSN and the FDSN that establish a baseline for global monitoring and research; the requirements that go into creating a well-tuned DMC archive treat these the same, building a collaborative network of networks that generations of users rely on and that adds value to the data. Funded by the National Science Foundation through the Division of Earth Sciences, IRIS is operated through member universities and in cooperation with the USGS, and the DMS facility is a bridge between a globally distributed collaboration of seismic networks and an equally distributed network of users that demand a high standard for data quality, completeness, and ease of access. I will describe the role that a perpetual archive has in the life cycle of data, and how hosting real-time data performs a dual role: serving as a hub for continuous data from approximately 59 real-time networks and distributing these (along with other data from the 40-year library of available time-series data) to researchers, while simultaneously providing shared data back to networks in real time to benefit monitoring activities. I will describe aspects of our quality-assurance framework, both passively and actively applied to 1100 seismic stations generating over 6,000 channels of regularly sampled data arriving daily, that data providers can use as aids in operating their network and that users can likewise use when requesting suitable data for research purposes. The goal of the DMC is to eliminate bottlenecks in data discovery and to shorten the steps leading to analysis. This involves many challenges, including keeping metadata current, providing tools for evaluating and viewing them, and measuring and creating databases of other performance metrics; monitoring these closer to real time helps reduce operating costs, creates a richer repository, and eliminates problems over generations of duty cycles of data usage. I will describe a new resource, called the Nominal Response Library, which aims to provide accurate and representative examples of sensor and data logger configurations that are hosted at the DMC and constitute a high-graded subset for crafting your own metadata. Finally, I want to encourage all network operators who do not currently submit SEED format data to an archive to consider these benefits, and briefly discuss how robust transfer mechanisms including Earthworm, LISS, Antelope, NRTS and SeisComp, to name a few, can assist you in contributing your network data and help create this enabling virtual network of networks. In this era of high performance Internet capacity, the process that enables others to share your data and allows you to utilize external sources of data is nearly seamless with your current mission of network operation.

  12. Development and Evaluation of a City-Wide Wireless Weather Sensor Network

    ERIC Educational Resources Information Center

    Chang, Ben; Wang, Hsue-Yie; Peng, Tian-Yin; Hsu, Ying-Shao

    2010-01-01

    This project analyzed the effectiveness of a city-wide wireless weather sensor network, the Taipei Weather Science Learning Network (TWIN), in facilitating elementary and junior high students' study of weather science. The network, composed of sixty school-based weather sensor nodes and a centralized weather data archive server, provides students…

  13. Blueprint for multimedia telemedicine networks in the Rocky Mountain Veterans Integrated Service Network (VISN-19).

    PubMed

    Terreros, D A; Martinez, R

    1997-01-01

    A multimedia telemedicine network is proposed for a VISN-19 test bed and it will include picture archiving and communication systems (PACS). Initial tests have been performed, and the technical feasibility of the basic plan has been demonstrated.

  14. Running R Statistical Computing Environment Software on the Peregrine

    Science.gov Websites

    R is a collaborative, open-source project for the development of new statistical methodologies and enjoys a large user base. It supports multiple programming paradigms that can better leverage modern HPC systems; see the CRAN task view for High Performance Computing for relevant packages.

  15. VALORATE: fast and accurate log-rank test in balanced and unbalanced comparisons of survival curves and cancer genomics.

    PubMed

    Treviño, Victor; Tamez-Pena, Jose

    2017-06-15

    The association of genomic alterations with outcomes in cancer is affected by the problem of unbalanced groups generated by the low frequency of alterations. To address this, an R package (VALORATE) that estimates the null distribution and the P-value of the log-rank test based on a recent reformulation is presented. For a given number of alterations that define the size of the survival groups, the log-rank density is estimated by a weighted sum of conditional distributions depending on a co-occurrence term between mutations and events. The estimations are accelerated, without loss of accuracy, by sampling across co-occurrences, allowing the analysis of large genomic datasets in a few minutes. In conclusion, the proposed VALORATE R package is a valuable tool for survival analysis. The R package is available on CRAN at https://cran.r-project.org and at http://bioinformatica.mty.itesm.mx/valorateR. vtrevino@itesm.mx. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
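
    A hedged sketch of the intended use on an unbalanced comparison; the constructor and test names below are assumptions based on the package description and should be verified against the manual.

        library(valorate)  # sketch; new.valorate/valorate.survdiff are assumed names

        set.seed(1)
        time    <- rexp(200, rate = 0.1)             # follow-up times
        status  <- rbinom(200, 1, 0.7)               # event indicator
        mutated <- sample(c(rep(1, 8), rep(0, 192))) # rare alteration: only 8 carriers

        vo <- new.valorate(time = time, status = status, sampling.size = 1e5)
        valorate.survdiff(vo, mutated)  # estimated log-rank P-value for the small group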

  16. PredictABEL: an R package for the assessment of risk prediction models.

    PubMed

    Kundu, Suman; Aulchenko, Yurii S; van Duijn, Cornelia M; Janssens, A Cecile J W

    2011-04-01

    The rapid identification of genetic markers for multifactorial diseases from genome-wide association studies is fuelling interest in investigating the predictive ability and health care utility of genetic risk models. Various measures are available for the assessment of risk prediction models, each addressing a different aspect of performance and utility. We developed PredictABEL, a package in R that covers descriptive tables, measures and figures that are used in the analysis of risk prediction studies such as measures of model fit, predictive ability and clinical utility, and risk distributions, calibration plot and the receiver operating characteristic plot. Tables and figures are saved as separate files in a user-specified format, which include publication-quality EPS and TIFF formats. All figures are available in a ready-made layout, but they can be customized to the preferences of the user. The package has been developed for the analysis of genetic risk prediction studies, but can also be used for studies that only include non-genetic risk factors. PredictABEL is freely available at the websites of GenABEL ( http://www.genabel.org ) and CRAN ( http://cran.r-project.org/).
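
    A short sketch using the simulated cohort bundled with the package; the column indices and argument names are assumptions recalled from the package examples.

        library(PredictABEL)

        data(ExampleData)                  # simulated cohort shipped with the package
        model <- fitLogRegModel(data = ExampleData, cOutcome = 2,
                                cNonGenPreds = 3:10, cGenPreds = 11:16)
        risks <- predRisk(model)           # predicted risk per individual

        plotROC(data = ExampleData, cOutcome = 2, predrisk = risks)   # discrimination
        plotCalibration(data = ExampleData, cOutcome = 2,
                        predRisk = risks, groups = 10)                # calibration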

  17. compendiumdb: an R package for retrieval and storage of functional genomics data.

    PubMed

    Nandal, Umesh K; van Kampen, Antoine H C; Moerland, Perry D

    2016-09-15

    Currently, the Gene Expression Omnibus (GEO) contains public data of over 1 million samples from more than 40 000 microarray-based functional genomics experiments. This provides a rich source of information for novel biological discoveries. However, unlocking this potential often requires retrieving and storing a large number of expression profiles from a wide range of different studies and platforms. The compendiumdb R package provides an environment for downloading functional genomics data from GEO, parsing the information into a local or remote database and interacting with the database using dedicated R functions, thus enabling seamless integration with other tools available in R/Bioconductor. The compendiumdb package is written in R, MySQL and Perl. Source code and binaries are available from CRAN (http://cran.r-project.org/web/packages/compendiumdb/) for all major platforms (Linux, MS Windows and OS X) under the GPLv3 license. p.d.moerland@amc.uva.nl Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
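
    A hedged outline of the retrieve-store-query cycle the package implements; every function name here is an assumption about the API and should be checked against the reference manual.

        library(compendiumdb)  # sketch; all calls below are assumed names

        # Connect to a prepared local MySQL database.
        conn <- connectDatabase(user = "user", password = "pwd",
                                host = "localhost", dbname = "compendium")

        # Download a GEO series, parse it into the database, then query it back.
        downloadGEOdata("GSE4635", destdir = ".")
        loadDataToCompendium(conn, "GSE4635")
        eset <- GSEinEset(conn, "GSE4635")  # assumed: returns an ExpressionSet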

  18. FIT: statistical modeling tool for transcriptome dynamics under fluctuating field conditions

    PubMed Central

    Iwayama, Koji; Aisaka, Yuri; Kutsuna, Natsumaro

    2017-01-01

    Motivation: Considerable attention has been given to the quantification of environmental effects on organisms. In natural conditions, environmental factors are continuously changing in a complex manner. To reveal the effects of such environmental variations on organisms, transcriptome data in field environments have been collected and analyzed. Nagano et al. proposed a model that describes the relationship between transcriptomic variation and environmental conditions and demonstrated the capability to predict transcriptome variation in rice plants. However, the computational cost of parameter optimization has prevented its wide application. Results: We propose a new statistical model and efficient parameter optimization based on the previous study. We developed and released FIT, an R package that offers functions for parameter optimization and transcriptome prediction. The proposed method achieves comparable or better prediction performance within a shorter computational time than the previous method. The package will facilitate the study of the environmental effects on transcriptomic variation in field conditions. Availability and Implementation: Freely available from CRAN (https://cran.r-project.org/web/packages/FIT/). Contact: anagano@agr.ryukoku.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28158396
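
    The modeling idea, relating transcript abundance to fluctuating environmental records, can be illustrated with a deliberately simplified base-R toy; this is not the FIT API, only a caricature of the kind of relationship it fits.

        # Toy version of the idea only: regress one transcript on field temperature.
        set.seed(42)
        hours <- 1:120
        temp  <- 20 + 5 * sin(2 * pi * hours / 24) + rnorm(120)  # diurnal temperature
        expr  <- 2 + 0.3 * temp + rnorm(120, sd = 0.5)           # simulated expression

        summary(lm(expr ~ temp))  # FIT fits far richer response models, efficiently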

  19. Creating Trading Networks of Digital Archives.

    ERIC Educational Resources Information Center

    Cooper, Brian; Garcia-Molina, Hector

    Digital materials are vulnerable to a number of different kinds of failures, including decay of the digital media, loss due to hackers and viruses, accidental deletions, natural disasters, and bankruptcy of the institution holding the collection. Digital archives can best survive failures if they have made several copies of their collections at…

  20. Products and Services Available from the Southern California Earthquake Data Center (SCEDC) and the Southern California Seismic Network (SCSN)

    NASA Astrophysics Data System (ADS)

    Yu, E.; Bhaskaran, A.; Chen, S.; Chowdhury, F. R.; Meisenhelter, S.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.

    2010-12-01

    Currently the SCEDC archives continuous and triggered data from nearly 5000 data channels from 425 SCSN recorded stations, processing and archiving an average of 12,000 earthquakes each year. The SCEDC provides public access to these earthquake parametric and waveform data through its website www.data.scec.org and through client applications such as STP and DHI. This poster will describe the most significant developments at the SCEDC in the past year. Updated hardware: ● The SCEDC has more than doubled its waveform file storage capacity by migrating to 2 TB disks. New data holdings: ● Waveform data: Beginning Jan 1, 2010 the SCEDC began continuously archiving all high-sample-rate strong-motion channels. All seismic channels recorded by SCSN are now continuously archived and available at the SCEDC. ● Portable data from the El Mayor-Cucapah 7.2 sequence: Seismic waveforms from portable stations installed by researchers (contributed by Elizabeth Cochran, Jamie Steidl, and Octavio Lazaro-Mancilla) have been added to the archive and are accessible through STP, either as continuous data or associated with events in the SCEDC earthquake catalog. These additional data will help SCSN analysts and researchers improve event locations from the sequence. ● Real-time GPS solutions from the El Mayor-Cucapah 7.2 event: Three-component 1 Hz seismograms of California Real Time Network (CRTN) GPS stations from the April 4, 2010, magnitude 7.2 El Mayor-Cucapah earthquake are available in SAC format at the SCEDC. These time series were created by Brendan Crowell, Yehuda Bock, the project PI, and Mindy Squibb at SOPAC using data from the CRTN. The El Mayor-Cucapah earthquake demonstrated definitively the power of real-time high-rate GPS data: they measure dynamic displacements directly, they do not clip, and they are also able to detect the permanent (coseismic) surface deformation. ● Triggered data from the Quake Catcher Network (QCN) and Community Seismic Network (CSN): The SCEDC, in cooperation with QCN and CSN, is exploring ways to archive and distribute data from high-density, low-cost networks. As a starting point the SCEDC will store a dataset from QCN and CSN and distribute it through a separate STP client. New archival methods: ● The SCEDC is exploring the feasibility of archiving and distributing waveform data using cloud computing such as Google Apps. A month of continuous data from the SCEDC archive will be stored in Google Apps and a client developed to access it in a manner similar to STP. XML formats: ● The SCEDC is now distributing earthquake parameter data through web services in QuakeML format. ● The SCEDC, in collaboration with the Northern California Earthquake Data Center (NCEDC) and USGS Golden, has reviewed and revised the StationXML format to produce version 2.0. The new version includes rules on extending the schema, use of named complex types, and greater consistency in naming conventions. Based on this work we plan to develop readers and writers of the StationXML format.

  1. Present status and future directions of the Mayo/IBM PACS project

    NASA Astrophysics Data System (ADS)

    Morin, Richard L.; Forbes, Glenn S.; Gehring, Dale G.; Salutz, James R.; Pavlicek, William

    1991-07-01

    This joint project began in 1988 and was motivated by the need to develop an alternative to the archival process in place at that time (magnetic tape) for magnetic resonance imaging and neurological computed tomography. In addition, the project was felt to be an important step in gaining the necessary clinical experience for the future implementation of various aspects of electronic imaging. The initial phase of the project was conceived and developed to prove the concept, test the fundamental components, and produce performance measurements for future work. The key functions of this phase centered on attachment of imaging equipment (GE Signa) and archival processes using a non-dedicated (institutionally supplied) local area network (LAN). Attachment of imaging equipment to the LAN was performed using commercially available devices (Ethernet, PS/2, Token Ring). Image data were converted to ACR/NEMA format with retention of the vendor-specific header information. Performance measurements were encouraging and led to the design of subsequent projects. The second phase has recently been concluded. The major features of this phase were to greatly expand the network, put the network into clinical use, establish an efficient and useful viewing station, include diagnostic reports in the archive data, provide wide area network (WAN) capability via ISDN, and establish two-way real-time video between remote sites. This phase has heightened both departmental and institutional thought regarding various issues raised by electronic imaging. Much discussion regarding both present and future archival processes has occurred. The use of institutional LAN resources has proven to be adequate for the archival functions examined thus far. Experiments to date have shown that use of dedicated resources will be necessary for retrieval activities at even a basic level. This report presents an overview of the background, present status, and future directions of the project.

  2. Radiologic image communication and archive service: a secure, scalable, shared approach

    NASA Astrophysics Data System (ADS)

    Fellingham, Linda L.; Kohli, Jagdish C.

    1995-11-01

    The Radiologic Image Communication and Archive (RICA) service is designed to provide a shared archive for medical images to the widest possible audience of customers. Images are acquired from a number of different modalities, each available from many different vendors. Images are acquired digitally from those modalities which support direct digital output and by digitizing films for projection x-ray exams. The RICA Central Archive receives standard DICOM 3.0 messages and data streams from the medical imaging devices at customer institutions over the public telecommunication network. RICA represents a completely scalable resource. The user pays only for what he is using today with the full assurance that as the volume of image data that he wishes to send to the archive increases, the capacity will be there to accept it. To provide this seamless scalability imposes several requirements on the RICA architecture: (1) RICA must support the full array of transport services. (2) The Archive Interface must scale cost-effectively to support local networks that range from the very small (one x-ray digitizer in a medical clinic) to the very large and complex (a large hospital with several CTs, MRs, Nuclear medicine devices, ultrasound machines, CRs, and x-ray digitizers). (3) The Archive Server must scale cost-effectively to support rapidly increasing demands for service providing storage for and access to millions of patients and hundreds of millions of images. The architecture must support the incorporation of improved technology as it becomes available to maintain performance and remain cost-effective as demand rises.

  3. Research Capacity Building in Education: The Role of Digital Archives

    ERIC Educational Resources Information Center

    Carmichael, Patrick

    2011-01-01

    Accounts of how research capacity in education can be developed often make reference to electronic networks and online resources. This paper presents a theoretically driven analysis of the role of one such resource, an online archive of educational research studies that includes not only digitised collections of original documents but also videos…

  4. Yucca Mountain licensing support network archive assistant.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunlavy, Daniel M.; Bauer, Travis L.; Verzi, Stephen J.

    2008-03-01

    This report describes the Licensing Support Network (LSN) Assistant--a set of tools for categorizing e-mail messages and documents, and for investigating and correcting existing archives of categorized e-mail messages and documents. The two main tools in the LSN Assistant are the LSN Archive Assistant (LSNAA) tool for recategorizing manually labeled e-mail messages and documents and the LSN Realtime Assistant (LSNRA) tool for categorizing new e-mail messages and documents. This report focuses on the LSNAA tool. There are two main components of the LSNAA tool. The first is the Sandia Categorization Framework, which is responsible for providing categorizations for documents in an archive and storing them in an appropriate Categorization Database. The second is the actual user interface, which primarily interacts with the Categorization Database, providing a way to find and correct categorization errors in the database. A procedure for applying the LSNAA tool and an example use case of the LSNAA tool applied to a set of e-mail messages are provided. Performance results of the categorization model designed for this example use case are presented.

  5. Complementary concept for an image archive and communication system in a cardiological department based on CD-medical, an online archive, and networking facilities

    NASA Astrophysics Data System (ADS)

    Oswald, Helmut; Mueller-Jones, Kay; Builtjes, Jan; Fleck, Eckart

    1998-07-01

    The developments in information technologies -- computer hardware, networking and storage media -- have led to expectations that these advances would make it possible to replace 35 mm film completely by digital techniques in the catheter laboratory. Besides its role as an archival medium, cine film is used as the major image review and exchange medium in cardiology. None of today's technologies can completely fulfill the requirements to replace cine film. One of the major drawbacks of cine film is that it allows access at only a single time and location. For the four catheter laboratories in our institution we have designed a complementary concept combining the CD-R, also called CD-medical, as a single-patient storage and exchange medium, and a digital archive for on-line access and image review of selected frames or short sequences on adequate medical workstations. The image data from various modalities, as well as all digital documents regarding a patient, are part of an electronic patient record. The access, processing and display of documents are supported by an integrated medical application.

  6. Delineating functional principles of the bow tie structure of a kinase-phosphatase network in the budding yeast.

    PubMed

    Abd-Rabbo, Diala; Michnick, Stephen W

    2017-03-16

    Kinases and phosphatases (KP) form complex self-regulating networks essential for cellular signal processing. In spite of a wealth of data about interactions among KPs and their substrates, we have very limited models of the structures of the directed networks they form, and consequently our ability to formulate hypotheses about how their structure determines the flow of information in these networks is restricted. We assembled and studied the largest bona fide kinase-phosphatase network (KP-Net) known to date for the yeast Saccharomyces cerevisiae. Application of the vertex sort (VS) algorithm to the KP-Net allowed us to elucidate its hierarchical structure, in which nodes are sorted into top, core and bottom layers, forming a bow tie structure with a strongly connected core layer. Surprisingly, phosphatases tend to sort into the top layer, implying that they are less regulated by phosphorylation than kinases. Superposition of the widest range of KP biological properties over the KP-Net hierarchy shows that core layer KPs: (i) receive the largest number of inputs; (ii) form bottlenecks implicated in multiple pathways and in decision-making; and (iii) are among the most regulated KPs, both temporally and spatially. Moreover, top layer KPs are more abundant and less noisy than those in the bottom layer. Finally, we showed that the VS algorithm depends on node degrees without biasing the biological results of the sorted network. The VS algorithm is available as an R package ( https://cran.r-project.org/web/packages/VertexSort/index.html ). The KP-Net model we propose possesses a bow tie hierarchical structure in which the top layer appears to ensure highest fidelity and the core layer appears to mediate signal integration and cell state-dependent signal interpretation. Our model of the yeast KP-Net provides both functional insight into its organization as we understand it today and a framework for future investigation of information processing in yeast and eukaryotes in general.
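
    A hedged sketch of applying the released package to a toy directed network; the input format and the signature of vertex.sort are assumptions to be checked against the package manual.

        library(VertexSort)  # sketch; the vertex.sort call below is an assumed signature

        # A toy directed kinase-phosphatase network (regulator -> substrate).
        kp_edges <- data.frame(from = c("KinA", "KinA", "PhoB"),
                               to   = c("KinC", "PhoB", "KinC"))

        # Assumed: sort nodes into top, core and bottom layers of the hierarchy.
        layers <- vertex.sort(kp_edges)
        str(layers)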

  7. Developing Generic Image Search Strategies for Large Astronomical Data Sets and Archives using Convolutional Neural Networks and Transfer Learning

    NASA Astrophysics Data System (ADS)

    Peek, Joshua E. G.; Hargis, Jonathan R.; Jones, Craig K.

    2018-01-01

    Astronomical instruments produce petabytes of images every year, vastly more than can be inspected by a member of the astronomical community in search of a specific population of structures. Fortunately, the sky is mostly black, and source extraction algorithms have been developed to provide searchable catalogs of unconfused sources like stars and galaxies. These tools often fail for studies of more diffuse structures like the interstellar medium and unresolved stellar structures in nearby galaxies, leaving astronomers interested in observations of photodissociation regions, stellar clusters, and diffuse interstellar clouds without the crucial ability to search. In this work we present a new path forward for finding structures in large data sets that are similar to an input structure, using convolutional neural networks, transfer learning, and machine learning clustering techniques. We show applications to archival data in the Mikulski Archive for Space Telescopes (MAST).

  8. 7 CFR 1755.901 - Incorporation by Reference.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., Digital Systems and Networks, Transmission media characteristics—Optical fibre cables, Characteristics of... Systems and Media, Digital Systems and Networks, Transmission media characteristics—Optical fibre cables... National Archives and Records Administration (NARA). For information on the availability of these materials...

  9. Development of a North American paleoclimate pollen-based reconstruction database application

    NASA Astrophysics Data System (ADS)

    Ladd, Matthew; Mosher, Steven; Viau, Andre

    2013-04-01

    Recent efforts in synthesizing paleoclimate records across the globe have prompted an effort to standardize the different paleoclimate archives currently available in order to facilitate data-model comparisons and hence improve our estimates of future climate change. It is often the case that the methodology and programs make it challenging for other researchers to reproduce the results of a reconstruction; there is therefore a need to standardize paleoclimate reconstruction databases in an application specific to proxy data. Here we present a methodology, implemented in the open-source R language and built on North American pollen databases (e.g., NAPD, NEOTOMA), with which new reconstructions can easily be performed and the data quickly analyzed, output, and plotted. The application was developed to easily test methodological and spatial/temporal issues that might affect reconstruction results, and it allows users to spend more time analyzing and interpreting results instead of on data management and processing. Unique features of this R program include two menu-driven modules that make the user feel at ease with the program, the ability to use different pollen sums, a choice among the 70 available climate variables, substitution of an appropriate modern climate dataset, a user-friendly regional target domain, temporal resolution criteria, linear interpolation, and many other features for thorough exploratory data analysis. The application will be available for North American pollen-based reconstructions and will eventually be made available as a package through the CRAN repository by late 2013.

  10. Space data management at the NSSDC (National Space Sciences Data Center): Applications for data compression

    NASA Technical Reports Server (NTRS)

    Green, James L.

    1989-01-01

    The National Space Science Data Center (NSSDC), established in 1966, is the largest archive for processed data from NASA's space and Earth science missions. The NSSDC manages over 120,000 data tapes with over 4,000 data sets. The size of the digital archive is approximately 6,000 gigabytes with all of this data in its original uncompressed form. By 1995 the NSSDC digital archive is expected to more than quadruple in size reaching over 28,000 gigabytes. The NSSDC is beginning several thrusts allowing it to better serve the scientific community and keep up with managing the ever increasing volumes of data. These thrusts involve managing larger and larger amounts of information and data online, employing mass storage techniques, and the use of low rate communications networks to move requested data to remote sites in the United States, Europe and Canada. The success of these thrusts, combined with the tremendous volume of data expected to be archived at the NSSDC, clearly indicates that innovative storage and data management solutions must be sought and implemented. Although not presently used, data compression techniques may be a very important tool for managing a large fraction or all of the NSSDC archive in the future. Some future applications would consist of compressing online data in order to have more data readily available, compress requested data that must be moved over low rate ground networks, and compress all the digital data in the NSSDC archive for a cost effective backup that would be used only in the event of a disaster.

  11. WhopGenome: high-speed access to whole-genome variation and sequence data in R.

    PubMed

    Wittelsbürger, Ulrich; Pfeifer, Bastian; Lercher, Martin J

    2015-02-01

    The statistical programming language R has become a de facto standard for the analysis of many types of biological data, and is well suited for the rapid development of new algorithms. However, variant call data from population-scale resequencing projects are typically too large to be read and processed efficiently with R's built-in I/O capabilities. WhopGenome can efficiently read whole-genome variation data stored in the widely used variant call format (VCF) file format into several R data types. VCF files can be accessed either on local hard drives or on remote servers. WhopGenome can associate variants with annotations such as those available from the UCSC genome browser, and can accelerate the reading process by filtering loci according to user-defined criteria. WhopGenome can also read other Tabix-indexed files and create indices to allow fast selective access to FASTA-formatted sequence files. The WhopGenome R package is available on CRAN at http://cran.r-project.org/web/packages/WhopGenome/. A Bioconductor package has been submitted. lercher@cs.uni-duesseldorf.de. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
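
    A minimal sketch of the selective-access pattern the package is built around: open an indexed VCF, restrict by sample and region, then read only the selected variants. The function names below (vcf_open, vcf_selectsamples, vcf_setregion, vcf_readIntoCodeMatrix, vcf_close) and the file name are taken from our reading of the package and should be treated as assumptions to verify against the current manual.

      library(WhopGenome)

      # Open a Tabix-indexed VCF file (a local path or a remote URL)
      vcf <- vcf_open("chr22.vcf.gz")                 # file name hypothetical

      # Filter loci before reading: restrict to two samples and one window
      vcf_selectsamples(vcf, c("NA12878", "NA12891"))
      vcf_setregion(vcf, "22", 16050000, 16060000)

      # Read only the selected variants into an R matrix, then close the file
      geno <- vcf_readIntoCodeMatrix(vcf)
      vcf_close(vcf)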

  12. cit: hypothesis testing software for mediation analysis in genomic applications.

    PubMed

    Millstein, Joshua; Chen, Gary K; Breton, Carrie V

    2016-08-01

    The challenges of successfully applying causal inference methods include: (i) satisfying underlying assumptions, (ii) limitations in data/models accommodated by the software and (iii) low power of common multiple testing approaches. The causal inference test (CIT) is based on hypothesis testing rather than estimation, allowing the testable assumptions to be evaluated in the determination of statistical significance. A user-friendly software package provides P-values and optionally permutation-based FDR estimates (q-values) for potential mediators. It can handle single and multiple binary and continuous instrumental variables, binary or continuous outcome variables and adjustment covariates. Also, the permutation-based FDR option provides a non-parametric implementation. Simulation studies demonstrate the validity of the cit package and show a substantial advantage of permutation-based FDR over other common multiple testing strategies. The cit open-source R package is freely available from the CRAN website (https://cran.r-project.org/web/packages/cit/index.html) with embedded C ++ code that utilizes the GNU Scientific Library, also freely available (http://www.gnu.org/software/gsl/). joshua.millstein@usc.edu Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
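
    A minimal worked example of the hypothesis-testing interface, assuming the package's cit.cp() function for a continuous outcome with signature cit.cp(L, G, T); the simulated data stand in for a genotype instrument L, a candidate mediator G and an outcome T.

      library(cit)

      set.seed(1)
      n <- 300
      L <- rbinom(n, 2, 0.3)       # instrumental variable, e.g. SNP genotype
      G <- 0.5 * L + rnorm(n)      # candidate mediator influenced by L
      T <- 0.7 * G + rnorm(n)      # continuous outcome influenced by G

      # Component and omnibus p-values for the causal chain L -> G -> T
      cit.cp(L, G, T)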

  13. Improving the Quality of Backup Process for Publishing Houses and Printing Houses

    NASA Astrophysics Data System (ADS)

    Proskuriakov, N. E.; Yakovlev, B. S.; Pries, V. V.

    2018-04-01

    The analysis of main types for data threats, used by print media, and their influence on the vitality and security of information is made. The influence of the programs settings for preparing archive files, the types of file managers on the backup process is analysed. We proposed a simple and economical version of the practical implementation of the backup process consisting of 4 components: the command line interpreter, the 7z archiver, the Robocopy utility, and network storage. We recommend that the best option would be to create backup copies, consisting of three local copies of data and two network copies.

  14. Information Assurance Tasks Supporting the Processing of Electronic Records Archives

    DTIC Science & Technology

    2007-03-01

    Evaluates the operation of necessary security features and compares network performance under OpenVPN (openvpn.net) with network performance under no-VPN operation (non-VPN) in a gigabit network environment. OpenVPN was selected based on the previous findings of Khanvilkar.

  15. The LCOGT Observation Portal, Data Pipeline and Science Archive

    NASA Astrophysics Data System (ADS)

    Lister, Tim; LCOGT Science Archive Team

    2014-01-01

    Las Cumbres Observatory Global Telescope (LCOGT) is building and deploying a world-wide network of optical telescopes dedicated to time-domain astronomy. During 2012-2013, we successfully deployed and commissioned nine new 1m telescopes at McDonald Observatory (Texas), CTIO (Chile), SAAO (South Africa) and Siding Spring Observatory (Australia). New, improved cameras and additional telescopes will be deployed during 2014. To enable LCOGT's diverse community of scientific and educational users to request observations on the LCOGT Network, to follow their progress, and to access their data, we have developed an Observation Portal system. This Observation Portal integrates proposal submission and observation requests with seamless access to the data products from the data pipelines in near-realtime and to long-term products from the Science Archive. We describe the LCOGT Observation Portal and the data pipeline, currently in operation, which makes use of the ORAC-DR automated recipe-based data reduction pipeline, and illustrate some of the new data products. We also present the LCOGT Science Archive, which is being developed in partnership with the Infrared Processing and Analysis Center (IPAC), and show some of the new features the Science Archive provides.

  16. Buckets: Aggregative, Intelligent Agents for Publishing

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Maly, Kurt; Shen, Stewart N. T.; Zubair, Mohammad

    1998-01-01

    Buckets are an aggregative, intelligent construct for publishing in digital libraries. The goal of research projects is to produce information. This information is often instantiated in several forms, differentiated by semantic types (report, software, video, datasets, etc.). A given semantic type can be further differentiated by syntactic representations as well (PostScript version, PDF version, Word version, etc.). Although the information was created together and subtle relationships can exist between them, different semantic instantiations are generally segregated along currently obsolete media boundaries. Reports are placed in report archives, software might go into a software archive, but most of the data and supporting materials are likely to be kept in informal personal archives or discarded altogether. Buckets provide an archive-independent container construct in which all related semantic and syntactic data types and objects can be logically grouped together, archived, and manipulated as a single object. Furthermore, buckets are active archival objects and can communicate with each other, people, or arbitrary network services.

  17. Continuous, Large-Scale Processing of Seismic Archives for High-Resolution Monitoring of Seismic Activity and Seismogenic Properties

    NASA Astrophysics Data System (ADS)

    Waldhauser, F.; Schaff, D. P.

    2012-12-01

    Archives of digital seismic data recorded by seismometer networks around the world have grown tremendously over the last several decades helped by the deployment of seismic stations and their continued operation within the framework of monitoring earthquake activity and verification of the Nuclear Test-Ban Treaty. We show results from our continuing effort in developing efficient waveform cross-correlation and double-difference analysis methods for the large-scale processing of regional and global seismic archives to improve existing earthquake parameter estimates, detect seismic events with magnitudes below current detection thresholds, and improve real-time monitoring procedures. We demonstrate the performance of these algorithms as applied to the 28-year long seismic archive of the Northern California Seismic Network. The tools enable the computation of periodic updates of a high-resolution earthquake catalog of currently over 500,000 earthquakes using simultaneous double-difference inversions, achieving up to three orders of magnitude resolution improvement over existing hypocenter locations. This catalog, together with associated metadata, form the underlying relational database for a real-time double-difference scheme, DDRT, which rapidly computes high-precision correlation times and hypocenter locations of new events with respect to the background archive (http://ddrt.ldeo.columbia.edu). The DDRT system facilitates near-real-time seismicity analysis, including the ability to search at an unprecedented resolution for spatio-temporal changes in seismogenic properties. In areas with continuously recording stations, we show that a detector built around a scaled cross-correlation function can lower the detection threshold by one magnitude unit compared to the STA/LTA based detector employed at the network. This leads to increased event density, which in turn pushes the resolution capability of our location algorithms. On a global scale, we are currently building the computational framework for double-difference processing the combined parametric and waveform archives of the ISC, NEIC, and IRIS with over three million recorded earthquakes worldwide. Since our methods are scalable and run on inexpensive Beowulf clusters, periodic re-analysis of such archives may thus become a routine procedure to continuously improve resolution in existing global earthquake catalogs. Results from subduction zones and aftershock sequences of recent great earthquakes demonstrate the considerable social and economic impact that high-resolution images of active faults, when available in real-time, will have in the prompt evaluation and mitigation of seismic hazards. These results also highlight the need for consistent long-term seismic monitoring and archiving of records.
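
    To make the correlation-detector idea concrete, here is a from-scratch R illustration (not the authors' code) of scanning a continuous trace with a waveform template using a normalized cross-correlation function; correlation peaks near 1 flag candidate repeats of the template even at low signal-to-noise.

      # Slide a waveform template along a trace and compute the Pearson
      # correlation at each offset (normalized cross-correlation)
      match_template <- function(trace, template) {
        nt <- length(template)
        tmpl <- (template - mean(template)) / sd(template)
        sapply(seq_len(length(trace) - nt + 1), function(i) {
          w <- trace[i:(i + nt - 1)]
          sum(((w - mean(w)) / sd(w)) * tmpl) / (nt - 1)
        })
      }

      set.seed(1)
      trace <- rnorm(2000)                            # background noise
      event <- sin(seq(0, 6 * pi, length.out = 100))  # buried waveform
      trace[1001:1100] <- trace[1001:1100] + event
      cc <- match_template(trace, event)
      which.max(cc)                                   # ~1001: detected onset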

  18. A quoi ça CERN? De l'invention du Web à la Grille de calcul, l'informatique pour tous?

    ScienceCinema

    None

    2018-06-28

    Come and debate around the computer that was used to create the world's first Web page, and discover the worldwide computing Grid on the big screen. The video is black for the first 9 minutes.

  19. Fort Leavenworth Ethics Symposium: The Professional Ethic and the State

    DTIC Science & Technology

    2015-04-23

    Table-of-contents excerpt: Chapter 6, "Ethical Paradox, Cultural Incongruence, and the Need for a Code of Ethics in the US Military," by William J. Davis, Jr., PhD; "... of Military Ethics," by Thomas J. Gibbons; a chapter by PJ McCormack MBE, BD, MTh, PhD (QUB), PhD (Cran), CF; and Chapter 14, "Multiple Ethical Loyalties in Guantanamo," by CAPT J. Scott McPherson, USN, and others.

  20. Clinical experience with a high-performance ATM-connected DICOM archive for cardiology

    NASA Astrophysics Data System (ADS)

    Solomon, Harry P.

    1997-05-01

    A system to archive large image sets, such as cardiac cine runs, with near realtime response must address several functional and performance issues, including efficient use of a high performance network connection with standard protocols, an architecture which effectively integrates both short- and long-term mass storage devices, and a flexible data management policy which allows optimization of image distribution and retrieval strategies based on modality and site-specific operational use. Clinical experience with such an archive has allowed evaluation of these systems issues and refinement of a traffic model for cardiac angiography.

  1. [Management and development of the dangerous preparation archive].

    PubMed

    Binetti, Roberto; Longo, Marcello; Scimonelli, Luigia; Costamagna, Francesca

    2006-01-01

    In the year 2000 an archive of dangerous preparations was created at the National Health Institute (Istituto Superiore di Sanità), following a principle included in Directive 88/379/EEC on dangerous preparations, subsequently modified by Directive 1999/45/EC, concerning the creation of a data bank on dangerous preparations in each European country. The information stored in the archive is useful for the protection of consumers' and workers' health and for prevention, particularly in cases of acute poisoning. The archive is fully computerized, so companies can submit their information via the web and authorized Poison Centres can retrieve it via the web. Because each Member State has different procedures in place to comply with Directive 1999/45/EC, international coordination could be useful in order to create a European network of national data banks on dangerous preparations.

  2. [A new concept for integration of image databanks into a comprehensive patient documentation].

    PubMed

    Schöll, E; Holm, J; Eggli, S

    2001-05-01

    Image processing and archiving are of increasing importance in the practice of modern medicine. Particularly due to the introduction of computer-based investigation methods, physicians are dealing with a wide variety of analogue and digital picture archives. On the other hand, clinical information is stored in various text-based information systems without integration of image components. The link between such traditional medical databases and picture archives is a prerequisite for efficient data management as well as for continuous quality control and medical education. At the Department of Orthopedic Surgery, University of Berne, a software program was developed to create a complete multimedia electronic patient record. The client-server system contains all patients' data, questionnaire-based quality control, and a digital picture archive. Different interfaces guarantee the integration into the hospital's data network. This article describes our experiences in the development and introduction of a comprehensive image archiving system at a large orthopedic center.

  3. Cardio-PACs: a new opportunity

    NASA Astrophysics Data System (ADS)

    Heupler, Frederick A., Jr.; Thomas, James D.; Blume, Hartwig R.; Cecil, Robert A.; Heisler, Mary

    2000-05-01

    It is now possible to replace film-based image management in the cardiac catheterization laboratory with a Cardiology Picture Archiving and Communication System (Cardio-PACS) based on digital imaging technology. The first step in the conversion process is installation of a digital image acquisition system that is capable of generating high-quality DICOM-compatible images. The next three steps, which are the subject of this presentation, involve image display, distribution, and storage. Clinical requirements and associated cost considerations for these three steps are listed below: Image display: (1) Image quality equal to film, with DICOM format, lossless compression, image processing, desktop PC-based with color monitor, and physician-friendly imaging software; (2) Performance specifications include: acquire 30 frames/sec; replay 15 frames/sec; access to file server 5 seconds, and to archive 5 minutes; (3) Compatibility of image file, transmission, and processing formats; (4) Image manipulation: brightness, contrast, gray scale, zoom, biplane display, and quantification; (5) User-friendly control of image review. Image distribution: (1) Standard IP-based network between cardiac catheterization laboratories, file server, long-term archive, review stations, and remote sites; (2) Non-proprietary formats; (3) Bidirectional distribution. Image storage: (1) CD-ROM vs disk vs tape; (2) Verification of data integrity; (3) User-designated storage capacity for catheterization laboratory, file server, long-term archive. Costs: (1) Image acquisition equipment, file server, long-term archive; (2) Network infrastructure; (3) Review stations and software; (4) Maintenance and administration; (5) Future upgrades and expansion; (6) Personnel.

  4. Software For Monitoring A Computer Network

    NASA Technical Reports Server (NTRS)

    Lee, Young H.

    1992-01-01

    SNMAT is a rule-based expert-system computer program designed to assist personnel in monitoring the status of a computer network and identifying defective computers, workstations, and other components of the network. It also assists in training network operators. The network for SNMAT is located at the Space Flight Operations Center (SFOC) at NASA's Jet Propulsion Laboratory. It is intended to serve as a data-reduction system providing windows, menus, and graphs, enabling users to focus on relevant information. SNMAT is expected to be adaptable to other computer networks, for example in the management of repair, maintenance, and security, or in the administration of planning systems, billing systems, or archives.

  5. High Performance Computing and Networking for Science--Background Paper.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. Office of Technology Assessment.

    The Office of Technology Assessment is conducting an assessment of the effects of new information technologies--including high performance computing, data networking, and mass data archiving--on research and development. This paper offers a view of the issues and their implications for current discussions about Federal supercomputer initiatives…

  6. PRECISION OF ATMOSPHERIC DRY DEPOSITION DATA FROM THE CLEAN AIR STATUS AND TRENDS NETWORK (CASTNET)

    EPA Science Inventory

    A collocated, dry deposition sampling program was begun in January 1987 by the US Environmental Protection Agency to provide ongoing estimates of the overall precision of dry deposition and supporting data entering the Clean Air Status and Trends Network (CASTNet) archives Duplic...

  7. CD-based image archival and management on a hybrid radiology intranet.

    PubMed

    Cox, R D; Henri, C J; Bret, P M

    1997-08-01

    This article describes the design and implementation of a low-cost image archival and management solution on a radiology network consisting of UNIX, IBM personal computer-compatible (IBM, Purchase, NY) and Macintosh (Apple Computer, Cupertino, CA) workstations. The picture archiving and communications system (PACS) is modular, scalable, and conforms to the Digital Imaging and Communications in Medicine (DICOM) 3.0 standard for image transfer, storage and retrieval. Image data are made available on soft-copy reporting workstations by a work-flow management scheme and on desktop computers through a World Wide Web (WWW) interface. Data archival is based on recordable compact disc (CD) technology and is automated. The project has allowed the radiology department to eliminate the use of film in magnetic resonance (MR) imaging, computed tomography (CT) and ultrasonography.

  8. Turning Archival Tapes into an Online “Cardless” Catalog

    PubMed Central

    Zuckerman, Alan E.; Ewens, Wilma A.; Cannard, Bonnie G.; Broering, Naomi C.

    1982-01-01

    Georgetown University has created an online card catalog based on machine readable cataloging records (MARC) loaded from archival tapes or online via the OCLC network. The system is programmed in MUMPS and uses the medical subject headings (MeSH) authority file created by the National Library of Medicine. The online catalog may be searched directly by library users and has eliminated the need for manual filing of catalog cards.

  9. An Efficient and Reliable Statistical Method for Estimating Functional Connectivity in Large Scale Brain Networks Using Partial Correlation

    PubMed Central

    Wang, Yikai; Kang, Jian; Kemmer, Phebe B.; Guo, Ying

    2016-01-01

    Currently, network-oriented analysis of fMRI data has become an important tool for understanding brain organization and brain networks. Among the range of network modeling methods, partial correlation has shown great promise in accurately detecting true brain network connections. However, the application of partial correlation in investigating brain connectivity, especially in large-scale brain networks, has been limited so far due to the technical challenges in its estimation. In this paper, we propose an efficient and reliable statistical method for estimating partial correlation in large-scale brain network modeling. Our method derives partial correlation based on the precision matrix estimated via the Constrained L1-minimization Approach (CLIME), a recently developed statistical method that is more efficient and demonstrates better performance than existing methods. To help select an appropriate tuning parameter for sparsity control in the network estimation, we propose a new Dens-based selection method that provides a more informative and flexible tool to allow users to select the tuning parameter based on the desired sparsity level. Another appealing feature of the Dens-based method is that it is much faster than existing methods, which provides an important advantage in neuroimaging applications. Simulation studies show that the Dens-based method demonstrates comparable or better performance with respect to existing methods in network estimation. We applied the proposed partial correlation method to investigate resting state functional connectivity using rs-fMRI data from the Philadelphia Neurodevelopmental Cohort (PNC) study. Our results show that partial correlation analysis removed considerable between-module marginal connections identified by full correlation analysis, suggesting these connections were likely caused by global effects or common connection to other nodes. Based on partial correlation, we find that the most significant direct connections are between homologous brain locations in the left and right hemisphere. When comparing partial correlation derived under different sparse tuning parameters, an important finding is that the sparse regularization has more shrinkage effects on negative functional connections than on positive connections, which supports previous findings that many of the negative brain connections are due to non-neurophysiological effects. An R package “DensParcorr” can be downloaded from CRAN for implementing the proposed statistical methods. PMID:27242395
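
    A short sketch of the accompanying package, under the assumption that its main entry point is DensParcorr(data, select = TRUE), which runs CLIME-based precision-matrix estimation with the Dens-based tuning-parameter selection described above; the argument and return-element names are assumptions to check against the package documentation.

      library(DensParcorr)

      # Toy resting-state-like data: 100 time points x 20 nodes
      set.seed(1)
      Y <- matrix(rnorm(2000), nrow = 100, ncol = 20)

      # CLIME precision-matrix estimation with Dens-based sparsity selection
      fit <- DensParcorr(Y, select = TRUE)
      fit$selected.partial.corr[1:4, 1:4]  # element name assumed; see docs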

  11. MDplot: Visualise Molecular Dynamics.

    PubMed

    Margreitter, Christian; Oostenbrink, Chris

    2017-05-10

    The MDplot package provides plotting functions for automated visualisation of molecular dynamics simulation output. It is especially useful in cases where plot generation is rather tedious due to complex file formats or when a large number of plots are generated. The supported graphs range from standard ones, such as RMSD/RMSF (root-mean-square deviation and root-mean-square fluctuation, respectively), to less standard ones, such as thermodynamic integration analysis and hydrogen bond monitoring over time. All told, they address many commonly used analyses. In this article, we set out the MDplot package's functions, give examples of the function calls, and show the associated plots. Plotting and data parsing are separated in all cases, i.e. the respective functions can be used independently, so data manipulation and the integration of additional file formats is fairly easy. Currently, the loading functions support GROMOS, GROMACS, and AMBER file formats. Moreover, we also provide a Bash interface that allows simple embedding of MDplot into Bash scripts as the final analysis step. The package can be obtained in the latest major version from CRAN (https://cran.r-project.org/package=MDplot) or in the most recent version from the project's GitHub page at https://github.com/MDplot/MDplot, where feedback is also most welcome. MDplot is published under the GPL-3 license.
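
    Because parsing and plotting are separated, a typical call chains a load_*() parser into the matching plot function. The sketch below assumes GROMACS-style RMSD output and the load_rmsd()/rmsd() pair named in the package documentation; the input file name is hypothetical.

      library(MDplot)

      # Parse one or more RMSD time series, then hand them to the plot function
      rmsd_data <- load_rmsd(c("protein_rmsd.xvg"))  # hypothetical file name
      rmsd(rmsd_data)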

  12. msap: a tool for the statistical analysis of methylation-sensitive amplified polymorphism data.

    PubMed

    Pérez-Figueroa, A

    2013-05-01

    In this study msap, an R package which analyses methylation-sensitive amplified polymorphism (MSAP or MS-AFLP) data, is presented. The program provides a deep analysis of epigenetic variation starting from a binary data matrix indicating the banding pattern between the isoschizomeric endonucleases HpaII and MspI, which have differential sensitivity to cytosine methylation. After comparing the restriction fragments, the program determines whether each fragment is susceptible to methylation (representative of epigenetic variation) or shows no evidence of methylation (representative of genetic variation). The package provides, in a user-friendly command line interface, a pipeline of different analyses of the variation (genetic and epigenetic) among user-defined groups of samples, as well as the classification of the methylation occurrences in those groups. Statistical testing provides support to the analyses. A comprehensive report of the analyses and several useful plots can help researchers to assess the epigenetic and genetic variation in their MSAP experiments. msap is downloadable from CRAN (http://cran.r-project.org/) and its own webpage (http://msap.r-forge.R-project.org/). The package is intended to be easy to use even for people unfamiliar with the R command line environment. Advanced users may take advantage of the available source code to adapt msap to more complex analyses. © 2013 Blackwell Publishing Ltd.
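
    In use, the whole pipeline is typically driven by a single call on the binary band matrix; the sketch below assumes the package's main msap() entry point applied to a CSV of HpaII/MspI banding patterns, with a hypothetical file name and run label.

      library(msap)

      # Run the full genetic/epigenetic variation analysis on a MSAP band
      # matrix; rows are sample profiles (HpaII and MspI), columns are fragments
      msap("msap_bands.csv", name = "example_run")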

  13. Bayesian mixture analysis for metagenomic community profiling.

    PubMed

    Morfopoulou, Sofia; Plagnol, Vincent

    2015-09-15

    Deep sequencing of clinical samples is now an established tool for the detection of infectious pathogens, with direct medical applications. The large amount of data generated produces an opportunity to detect species even at very low levels, provided that computational tools can effectively profile the relevant metagenomic communities. Data interpretation is complicated by the fact that short sequencing reads can match multiple organisms and by the lack of completeness of existing databases, in particular for viral pathogens. Here we present metaMix, a Bayesian mixture model framework for resolving complex metagenomic mixtures. We show that the use of parallel Markov chain Monte Carlo (MCMC) chains for the exploration of the species space enables the identification of the set of species most likely to contribute to the mixture. We demonstrate the greater accuracy of metaMix compared with relevant methods, particularly for profiling complex communities consisting of several related species. We designed metaMix specifically for the analysis of deep transcriptome sequencing datasets, with a focus on viral pathogen detection; however, the principles are generally applicable to all types of metagenomic mixtures. metaMix is implemented as a user-friendly R package, freely available on CRAN: http://cran.r-project.org/web/packages/metaMix sofia.morfopoulou.10@ucl.ac.uk Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
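
    A sketch of the package's staged pipeline from read-to-reference match probabilities to model-averaged species calls; the four step functions and their argument names are recalled from the package vignette and should be treated as assumptions, and the input file is hypothetical.

      library(metaMix)

      # Step 1: per-read generative probabilities from BLAST output
      step1 <- generative.prob(blast.output.file = "sample_blast.tab")

      # Step 2: reduce the candidate species space before MCMC
      step2 <- reduce.space(step1 = step1)

      # Steps 3-4: parallel-tempering MCMC over species sets, then
      # Bayesian model averaging for the final community profile
      step3 <- parallel.temper(step2 = step2)
      step4 <- bayes.model.aver(step2 = step2, step3 = step3)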

  14. iScreen: Image-Based High-Content RNAi Screening Analysis Tools.

    PubMed

    Zhong, Rui; Dong, Xiaonan; Levine, Beth; Xie, Yang; Xiao, Guanghua

    2015-09-01

    High-throughput RNA interference (RNAi) screening has opened up a path to investigating functional genomics in a genome-wide pattern. However, such studies are often restricted to assays that have a single readout format. Recently, advanced image technologies have been coupled with high-throughput RNAi screening to develop high-content screening, in which one or more cell image(s), instead of a single readout, were generated from each well. This image-based high-content screening technology has led to genome-wide functional annotation in a wider spectrum of biological research studies, as well as in drug and target discovery, so that complex cellular phenotypes can be measured in a multiparametric format. Despite these advances, data analysis and visualization tools are still largely lacking for these types of experiments. Therefore, we developed iScreen (image-Based High-content RNAi Screening Analysis Tool), an R package for the statistical modeling and visualization of image-based high-content RNAi screening. Two case studies were used to demonstrate the capability and efficiency of the iScreen package. iScreen is available for download on CRAN (http://cran.cnr.berkeley.edu/web/packages/iScreen/index.html). The user manual is also available as a supplementary document. © 2014 Society for Laboratory Automation and Screening.

  15. Performance of asynchronous transfer mode (ATM) local area and wide area networks for medical imaging transmission in clinical environment.

    PubMed

    Huang, H K; Wong, A W; Zhu, X

    1997-01-01

    Asynchronous transfer mode (ATM) technology emerges as a leading candidate for medical image transmission in both local area network (LAN) and wide area network (WAN) applications. This paper describes the performance of an ATM LAN and WAN network at the University of California, San Francisco. The measurements were obtained using an intensive care unit (ICU) server connecting to four image workstations (WS) at four different locations of a hospital-integrated picture archiving and communication system (HI-PACS) in a daily regular clinical environment. Four types of performance were evaluated: magnetic disk-to-disk, disk-to-redundant array of inexpensive disks (RAID), RAID-to-memory, and memory-to-memory. Results demonstrate that the transmission rate between two workstations can reach 5-6 Mbytes/s from RAID-to-memory, and 8-10 Mbytes/s from memory-to-memory. When the server has to send images to all four workstations simultaneously, the transmission rate to each WS is about 4 Mbytes/s. Both situations are adequate for radiologic image communications for picture archiving and communication systems (PACS) and teleradiology applications.

  16. Detecting subnetwork-level dynamic correlations.

    PubMed

    Yan, Yan; Qiu, Shangzhao; Jin, Zhuxuan; Gong, Sihong; Bai, Yun; Lu, Jianwei; Yu, Tianwei

    2017-01-15

    The biological regulatory system is highly dynamic. The correlations between many functionally related genes change over different biological conditions. Finding dynamic relations on the existing biological network may reveal important regulatory mechanisms. Currently no method is available to detect subnetwork-level dynamic correlations systematically on the genome-scale network. Two major issues have hampered development. The first is that gene expression profiling data usually do not contain time course measurements to facilitate the analysis of dynamic relations; this can be partially addressed by using certain genes as indicators of biological conditions. The second is that it is unclear how to effectively delineate subnetworks and define dynamic relations between them. Here we propose a new method named LANDD (Liquid Association for Network Dynamics Detection) to find subnetworks that show substantial dynamic correlations, in the sense that subnetwork A is concentrated with Liquid Association scouting genes for subnetwork B. The method produces easily interpretable results because of its focus on subnetworks that tend to comprise functionally related genes. Also, the collective behaviour of genes in a subnetwork is a much more reliable indicator of underlying biological conditions than single genes used as indicators. We conducted extensive simulations to validate the method's ability to detect subnetwork-level dynamic correlations. Using a real gene expression dataset and the human protein-protein interaction network, we demonstrate that the method links subnetworks of distinct biological processes, with both confirmed relations and plausible new functional implications. We also found that signal transduction pathways tend to show extensive dynamic relations with other functional groups. The R package is available at https://cran.r-project.org/web/packages/LANDD Contact: yunba@pcom.edu, jwlu33@hotmail.com or tianwei.yu@emory.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
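
    The scouting statistic underlying LANDD can be illustrated from scratch: for standardized genes X and Y and a normal-scored scouting gene Z, the liquid association score is the mean of the triple product, which measures how the X-Y correlation changes with the level of Z. This is a conceptual sketch of the statistic (Li, 2002), not the package's API.

      # Liquid association score LA(X, Y | Z) = E[X * Y * Z] for standardized
      # X, Y and a normal-score-transformed scouting gene Z
      la_score <- function(x, y, z) {
        x <- as.numeric(scale(x))
        y <- as.numeric(scale(y))
        z <- qnorm(rank(z) / (length(z) + 1))
        mean(x * y * z)
      }

      set.seed(1)
      z <- rnorm(200)                          # candidate scouting gene
      x <- rnorm(200)
      y <- x * tanh(z) + rnorm(200, sd = 0.5)  # x-y coupling varies with z
      la_score(x, y, z)                        # clearly positive score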

  17. Clinical experiences utilizing wireless remote control and an ASP model backup archive for a disaster recovery event

    NASA Astrophysics Data System (ADS)

    Liu, Brent J.; Documet, Luis; Documet, Jorge; Huang, H. K.; Muldoon, Jean

    2004-04-01

    An Application Service Provider (ASP) archive model for disaster recovery for Saint John's Health Center (SJHC) clinical PACS data has been implemented using a Fault-Tolerant Archive Server at the Image Processing and Informatics Laboratory, Marina del Rey, CA (IPIL) since mid-2002. The purpose of this paper is to provide clinical experiences with the implementation of an ASP model backup archive in conjunction with handheld wireless technologies for a particular disaster recovery scenario, an earthquake, in which the local PACS archive and the hospital are destroyed and the patients are moved from one hospital to another. The three sites involved are: (1) SJHC, the simulated disaster site; (2) IPIL, the ASP backup archive site; and (3) University of California, Los Angeles Medical Center (UCLA), the relocated patient site. An ASP backup archive has been established at IPIL to receive clinical PACS images daily using a T1 line from SJHC for backup and disaster recovery storage. Procedures were established to test the network connectivity and data integrity on a regular basis. In a given disaster scenario where the local PACS archive has been destroyed and the patients need to be moved to a second hospital, a wireless handheld device such as a Personal Digital Assistant (PDA) can be utilized to route images to the second hospital site, where a PACS is available and the images can be reviewed by radiologists. To simulate this disaster scenario, a wireless network was implemented within the clinical environment in all three sites: SJHC, IPIL, and UCLA. Upon executing the disaster scenario, the SJHC PACS archive server simulates a downtime disaster event. Using the PDA, the radiologist at UCLA can query the ASP backup archive server at IPIL for PACS images and route them directly to UCLA. Implementation experiences integrating this solution within the three clinical environments as well as the wireless performance are discussed. A clinical downtime disaster scenario was implemented and successfully tested. Radiologists were able to successfully query PACS images utilizing a wireless handheld device from the ASP backup archive at IPIL and route the PACS images directly to a second clinical site at UCLA where they and the patients are located at that time. In a disaster scenario, using a wireless device, radiologists at the disaster health care center can route PACS data from an ASP backup archive server to be reviewed in a live clinical PACS environment at a secondary site. This solution allows radiologists to use a wireless handheld device to control the image workflow and to review PACS images during a major disaster event where patients must be moved to a secondary site.

  18. The design and implementation of the HY-1B Product Archive System

    NASA Astrophysics Data System (ADS)

    Liu, Shibin; Liu, Wei; Peng, Hailong

    2010-11-01

    The Product Archive System (PAS), a background system, is the core part of the Product Archive and Distribution System (PADS), which is the center for data management of the Ground Application System of the HY-1B satellite hosted by the National Satellite Ocean Application Service of China. PAS integrates a series of up-to-date methods and technologies, such as a suitable data transmittal mode, flexible configuration files and log information, in order to give the system several desirable characteristics, such as ease of maintenance, stability and minimal complexity. This paper describes the seven major components of the PAS (the Network Communicator, File Collector, File Copy, Task Collector, Metadata Extractor, Product Data Archive and Metadata Catalogue Import modules) and some of the unique features of the system, as well as the technical problems encountered and resolved.

  19. Image dissemination and archiving.

    PubMed

    Robertson, Ian

    2007-08-01

    Images generated as part of the sonographic examination are an integral part of the medical record and must be retained according to local regulations. The standard medical image format, known as DICOM (Digital Imaging and COmmunications in Medicine), makes it possible for images from many different imaging modalities, including ultrasound, to be distributed via a standard internet network to distant viewing workstations and a central archive in an almost seamless fashion. The DICOM standard is a truly universal standard for the dissemination of medical images. When purchasing an ultrasound unit, the consumer should research the unit's capacity to generate images in a DICOM format, especially if one wishes interconnectivity with viewing workstations and an image archive that stores other medical images. PACS, an acronym for Picture Archive and Communication System, refers to the infrastructure that links modalities, workstations, the image archive, and the medical record information system into an integrated system, allowing for efficient electronic distribution and storage of medical images and access to medical record data.

  20. New Archiving Distributed InfrastructuRe (NADIR): Status and Evolution

    NASA Astrophysics Data System (ADS)

    De Marco, M.; Knapic, C.; Smareglia, R.

    2015-09-01

    The New Archiving Distributed InfrastructuRe (NADIR) has been developed at INAF-OATs IA2 (Italian National Institute for Astrophysics - Astronomical Observatory of Trieste, Italian center of Astronomical Archives) as an evolution of the previous archiving and distribution system used on several telescopes (LBT, TNG, Asiago, etc.), to improve performance, efficiency and reliability. At present, the NADIR system is running on the LBT telescope and Vespa (Italian telescopes network for outreach) Ramella et al. (2014), and will be used on the TNG, Asiago and IRA (Istituto Radio Astronomia) archives of the Medicina, Noto and SRT radio telescopes Zanichelli et al. (2014) as the data models for radio data become ready. This paper discusses the progress status, the architectural choices and the solutions adopted during the development and commissioning phases of the project. Special attention is given to the LBT case, due to some critical aspects of data flow and of compliance with the policies and standards adopted by the LBT organization.

  1. The amino acid's backup bone - storage solutions for proteomics facilities.

    PubMed

    Meckel, Hagen; Stephan, Christian; Bunse, Christian; Krafzik, Michael; Reher, Christopher; Kohl, Michael; Meyer, Helmut Erich; Eisenacher, Martin

    2014-01-01

    Proteomics methods, especially high-throughput mass spectrometry analysis have been continually developed and improved over the years. The analysis of complex biological samples produces large volumes of raw data. Data storage and recovery management pose substantial challenges to biomedical or proteomic facilities regarding backup and archiving concepts as well as hardware requirements. In this article we describe differences between the terms backup and archive with regard to manual and automatic approaches. We also introduce different storage concepts and technologies from transportable media to professional solutions such as redundant array of independent disks (RAID) systems, network attached storages (NAS) and storage area network (SAN). Moreover, we present a software solution, which we developed for the purpose of long-term preservation of large mass spectrometry raw data files on an object storage device (OSD) archiving system. Finally, advantages, disadvantages, and experiences from routine operations of the presented concepts and technologies are evaluated and discussed. This article is part of a Special Issue entitled: Computational Proteomics in the Post-Identification Era. Guest Editors: Martin Eisenacher and Christian Stephan. Copyright © 2013. Published by Elsevier B.V.

  2. The Islamic State Battle Plan: Press Release Natural Language Processing

    DTIC Science & Technology

    2016-06-01

    Keywords: natural language processing, text mining, corpus, generalized linear model, cascade, R Shiny, leaflet, data visualization. Abbreviations include START (Study of Terrorism and Responses to Terrorism), TDM (term-document matrix), TF (term frequency), TF-IDF (term frequency-inverse document frequency) and tm (the R text mining package). Cited software: leaflet (https://cran.r-project.org/package=leaflet) and Feinerer I, Hornik K (2015) Text Mining Package "tm," Version 0.6-2 (https://cran.r-project.org/web/packages/tm/tm.pdf).

  3. Initial Experience With A Prototype Storage System At The University Of North Carolina

    NASA Astrophysics Data System (ADS)

    Creasy, J. L.; Loendorf, D. D.; Hemminger, B. M.

    1986-06-01

    A prototype archiving system manufactured by the 3M Corporation has been in place at the University of North Carolina for approximately 12 months. The system was installed as a result of a collaboration between 3M and UNC, with 3M seeking testing of their system, and UNC realizing the need for an archiving system as an essential part of their PACS test-bed facilities. System hardware includes appropriate network and disk interface devices as well as media for both short and long term storage of images and their associated information. The system software includes those procedures necessary to communicate with the network interface elements (NIEs) as well as those necessary to interpret the ACR-NEMA header blocks and to store the images. A subset of the total ACR-NEMA header is parsed and stored in a relational database system. The entire header is stored on disk with the completed study. Interactive programs have been developed that allow radiologists to easily retrieve information about the archived images and to send the full images to a viewing console. Initial experience with the system has consisted primarily of hardware and software debugging. Although the system is ACR-NEMA compatible, further objective and subjective assessments of system performance await the connection of compatible consoles and acquisition devices to the network.

  4. Transforming War Fighting through the Use of Service Based Architecture (SBA) Technology

    DTIC Science & Technology

    2006-05-04

    near-real-time video & telemetry to users on network using standard web-based protocols – Provides web-based access to archived video files MTI...Target Tracks Service Capabilities – Disseminates near-real-time MTI and Target Tracks to users on network based on consumer specified geographic...filter IBS SIGINT Service Capabilities – Disseminates near-real-time IBS SIGINT data to users on network based on consumer specified geographic filter

  5. Optical Disk Technology and Information.

    ERIC Educational Resources Information Center

    Goldstein, Charles M.

    1982-01-01

    Provides basic information on videodisks and potential applications, including inexpensive online storage, random access graphics to complement online information systems, hybrid network architectures, office automation systems, and archival storage. (JN)

  6. ARM Operations and Engineering Procedure Mobile Facility Site Startup

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Voyles, Jimmy W

    2015-05-01

    This procedure exists to define the key milestones, necessary steps, and process rules required to commission and operate an Atmospheric Radiation Measurement (ARM) Mobile Facility (AMF), with a specific focus toward on-time product delivery to the ARM Data Archive. The overall objective is to have the physical infrastructure, networking and communications, and instrument calibration, grooming, and alignment (CG&A) completed with data products available from the ARM Data Archive by the Operational Start Date milestone.

  7. Expanding the PACS archive to support clinical review, research, and education missions

    NASA Astrophysics Data System (ADS)

    Honeyman-Buck, Janice C.; Frost, Meryll M.; Drane, Walter E.

    1999-07-01

    Designing an image archive and retrieval system that supports multiple users with many different requirements and patterns of use, without compromising the performance and functionality required by diagnostic radiology, is an intellectual and technical challenge. A diagnostic archive, optimized for performance when retrieving diagnostic images for radiologists, needed to be expanded to support a growing clinical review network, the University of Florida Brain Institute's demands for neuro-imaging, Biomedical Engineering's imaging sciences, and an electronic teaching file. Each of the groups presented a different set of problems for the designers of the system. In addition, the radiologists did not want to see any loss of performance as new users were added.

  8. A Sharing Mind Map-Oriented Approach to Enhance Collaborative Mobile Learning with Digital Archiving Systems

    ERIC Educational Resources Information Center

    Chang, Jui-Hung; Chiu, Po-Sheng; Huang, Yueh-Min

    2018-01-01

    With the advances in mobile network technology, the use of portable devices and mobile networks for learning is not limited by time and space. Such use, in combination with appropriate learning strategies, can achieve a better effect. Despite the effectiveness of mobile learning, students' learning direction, progress, and achievement may differ.…

  9. Data systems and computer science programs: Overview

    NASA Technical Reports Server (NTRS)

    Smith, Paul H.; Hunter, Paul

    1991-01-01

    An external review of the Integrated Technology Plan for the Civil Space Program is presented. The topics are presented in viewgraph form and include the following: onboard memory and storage technology; advanced flight computers; special purpose flight processors; onboard networking and testbeds; information archive, access, and retrieval; visualization; neural networks; software engineering; and flight control and operations.

  10. Programs That Work, from the Promising Practices Network on Children, Families and Communities. RAND Tool

    ERIC Educational Resources Information Center

    Kilburn, M. Rebecca, Ed.

    2014-01-01

    The Promising Practices Network (PPN) on Children, Families and Communities (www.promisingpractices.net) began as a partnership between four state-level organizations that help public and private organizations improve the well-being of children and families. The PPN website, archived in June 2014, featured summaries of programs and practices that…

  11. From the archives of scientific diplomacy: science and the shared interests of Samuel Hartlib's London and Frederick Clodius's Gottorf.

    PubMed

    Keller, Vera; Penman, Leigh T I

    2015-03-01

    Many historians have traced the accumulation of scientific archives via communication networks. Engines for communication in early modernity have included trade, the extrapolitical Republic of Letters, religious enthusiasm, and the centralization of large emerging information states. The communication between Samuel Hartlib, John Dury, Duke Friedrich III of Gottorf-Holstein, and his key agent in England, Frederick Clodius, points to a less obvious but no less important impetus--the international negotiations of smaller states. Smaller states shaped communication networks in an international (albeit politically and religiously slanted) direction. Their networks of negotiation contributed to the internationalization of emerging science through a political and religious concept of shared interest. While interest has been central to social studies of science, interest itself has not often been historicized within the history of science. This case study demonstrates the co-production of science and society by tracing how period concepts of interest made science international.

  12. The Biomolecular Interaction Network Database and related tools 2005 update

    PubMed Central

    Alfarano, C.; Andrade, C. E.; Anthony, K.; Bahroos, N.; Bajec, M.; Bantoft, K.; Betel, D.; Bobechko, B.; Boutilier, K.; Burgess, E.; Buzadzija, K.; Cavero, R.; D'Abreo, C.; Donaldson, I.; Dorairajoo, D.; Dumontier, M. J.; Dumontier, M. R.; Earles, V.; Farrall, R.; Feldman, H.; Garderman, E.; Gong, Y.; Gonzaga, R.; Grytsan, V.; Gryz, E.; Gu, V.; Haldorsen, E.; Halupa, A.; Haw, R.; Hrvojic, A.; Hurrell, L.; Isserlin, R.; Jack, F.; Juma, F.; Khan, A.; Kon, T.; Konopinsky, S.; Le, V.; Lee, E.; Ling, S.; Magidin, M.; Moniakis, J.; Montojo, J.; Moore, S.; Muskat, B.; Ng, I.; Paraiso, J. P.; Parker, B.; Pintilie, G.; Pirone, R.; Salama, J. J.; Sgro, S.; Shan, T.; Shu, Y.; Siew, J.; Skinner, D.; Snyder, K.; Stasiuk, R.; Strumpf, D.; Tuekam, B.; Tao, S.; Wang, Z.; White, M.; Willis, R.; Wolting, C.; Wong, S.; Wrong, A.; Xin, C.; Yao, R.; Yates, B.; Zhang, S.; Zheng, K.; Pawson, T.; Ouellette, B. F. F.; Hogue, C. W. V.

    2005-01-01

    The Biomolecular Interaction Network Database (BIND) (http://bind.ca) archives biomolecular interaction, reaction, complex and pathway information. Our aim is to curate the details about molecular interactions that arise from published experimental research and to provide this information, as well as tools to enable data analysis, freely to researchers worldwide. BIND data are curated into a comprehensive machine-readable archive of computable information that provides users with methods to discover interactions and molecular mechanisms. BIND has worked to develop new methods for visualization that amplify the underlying annotation of genes and proteins to facilitate the study of molecular interaction networks. BIND has maintained an open database policy since its inception in 1999. Data growth has proceeded at a tremendous rate, approaching over 100 000 records. New services provided include a new BIND Query and Submission interface, a Simple Object Access Protocol (SOAP) service and the Small Molecule Interaction Database (http://smid.blueprint.org), which allows users to determine probable small molecule binding sites of new sequences and examine conserved binding residues. PMID:15608229

  13. Milk yield and composition from Angus and Angus-cross beef cows raised in southern Brazil.

    PubMed

    Rodrigues, P F; Menezes, L M; Azambuja, R C C; Suñé, R W; Barbosa Silveira, I D; Cardoso, F F

    2014-06-01

    This study assessed milk yield and composition of Angus and Angus-cross beef cows raised in southern Brazil. A total of 128 records were collected in 2 consecutive calving seasons from cows between 3 and 5 yr of age of 4 breed compositions: Angus (ANAN), Caracu × Angus (CRAN), Hereford × Angus (HHAN), and Nelore × Angus (NEAN). These cows were mated to Brangus (BN) or Braford (BO) bulls and managed under extensive grazing conditions in southern Brazil. Milk production of these cows was assessed by 2 procedures: indirectly by the calf weigh-suckle-weigh procedure (WD) and directly by machine milking (MM). Lactation curves were estimated using nonlinear regression and the following related traits were derived: peak yield (PY), peak week (PW), total yield at 210 d (TY210), and lactation persistence (PERS). Milk composition and calf weaning weight adjusted to 210 d (WW210) were also determined. The MM technique was considered more accurate because of lower standard errors of estimated means, greater statistical power, and greater correlation between TY210 and WW210 (0.50) compared to WD (0.36). Considering the more precise evaluation by MM, the CRAN and NEAN cows had greater TY210 (1070 and 1116 kg, respectively) and PY (8.1 and 7.8 kg, respectively) compared to ANAN and HHAN cows, which had 858 and 842 kg for TY210 and 6.6 and 6.3 kg for PY, respectively. The NEAN cows had the latest PW at 10.8 wk. Late-calving cows had 21% lower TY210 compared to cows that calved earlier. Milk composition was influenced by cow genotype, with CRAN and NEAN cows producing milk with greater fat (3.8 and 3.9%, respectively) and protein (3.2 and 3.1%, respectively) content compared to ANAN and HHAN cows. Regardless of the genotype, fat, protein, and total solids increased in concentration from beginning to end of lactation, while lactose content decreased. Crossbreeding of Angus with adapted breeds of taurine or indicine origin can be effective in increasing milk yield and nutrient content and, consequently, producing heavier calves at weaning under extensive grazing in southern Brazil and other similar subtropical climate regions.

  14. GPS data exploration for seismologists and geodesists

    NASA Astrophysics Data System (ADS)

    Webb, F.; Bock, Y.; Kedar, S.; Dong, D.; Jamason, P.; Chang, R.; Prawirodirdjo, L.; MacLeod, I.; Wadsworth, G.

    2007-12-01

    Over the past decade, GPS and seismic networks spanning the western US plate boundaries have produced vast amounts of data that need to be made accessible to both the geodesy and seismology communities. Unlike seismic data, raw geodetic data require significant processing before geophysical interpretations can be made. This requires the generation of data products (time series, velocities and strain maps) and dissemination strategies that bridge these differences and assure efficient use of data across traditionally separate communities. "GPS DATA PRODUCTS FOR SOLID EARTH SCIENCE" (GDPSES) is a multi-year NASA-funded project designed to produce and deliver high-quality GPS time series, velocities, and strain fields derived from multiple GPS networks along the western US plate boundary, and to make these products easily accessible to geophysicists. GDPSES products are disseminated through modern web-based methods: product browsing is facilitated through a web tool known as GPS Explorer, and continuous streams of GPS time series are provided via web services to the seismic archive, where they can be accessed by seismologists using traditional seismic data viewing and manipulation tools. GPS Explorer enables users to efficiently browse several layers of data products, from raw data through time series, velocities and strain, via a web interface that interacts seamlessly with a continuously updated database of these data products through web services. The current archive contains GDPSES data products beginning in 1995, and includes observations from GPS stations in EarthScope's Plate Boundary Observatory (PBO) as well as from real-time CGPS stations. The generic, standards-based approach used in this project enables GDPSES to expand seamlessly and indefinitely to include other space-time-dependent data products from additional GPS networks. The prototype GPS Explorer provides users with a personalized working environment in which they may zoom in and access subsets of the data via web services, along with a variety of interactive web tools interconnected in a portlet environment to explore and save datasets of interest for later sessions. At the same time, the GPS time series are made available through the seismic data archive, where the GPS networks are treated as regular seismic networks whose data are available in formats used by seismic utilities such as SEED readers and SAC. A key challenge, stemming from the fundamental differences between seismic and geodetic time series, is the representation of reprocessed GPS data in the seismic archive: as GPS processing algorithms evolve and their accuracy increases, a periodic complete recreation of the GPS time series archive is necessary.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shoopman, J. D.

    This report documents Livermore Computing (LC) activities in support of ASC L2 milestone 5589: Modernization and Expansion of LLNL Archive Disk Cache, due March 31, 2016. The full text of the milestone is included in Attachment 1. The milestone description is: Configuration of archival disk cache systems will be modernized to reduce fragmentation, and new, higher capacity disk subsystems will be deployed. This will enhance archival disk cache capability for ASC archive users, enabling files written to the archives to remain resident on disk for many (6–12) months, regardless of file size. The milestone was completed in three phases. On August 26, 2015 subsystems with 6 PB of disk cache were deployed for production use in LLNL's unclassified HPSS environment. Following that, on September 23, 2015 subsystems with 9 PB of disk cache were deployed for production use in LLNL's classified HPSS environment. On January 31, 2016, the milestone was fully satisfied when the legacy Data Direct Networks (DDN) archive disk cache subsystems were fully retired from production use in both LLNL's unclassified and classified HPSS environments, and only the newly deployed systems were in use.

  16. Fast simulation of reconstructed phylogenies under global time-dependent birth-death processes.

    PubMed

    Höhna, Sebastian

    2013-06-01

    Diversification rates and patterns may be inferred from reconstructed phylogenies. Both the time-dependent and the diversity-dependent birth-death process can produce the same observed patterns of diversity over time. To develop and test new models describing the macro-evolutionary process of diversification, generic and fast algorithms to simulate under these models are necessary. Simulations are not only important for testing and developing models but play an influential role in the assessment of model fit. In the present article, I consider as the model a global time-dependent birth-death process where each species has the same rates but rates may vary over time. For this model, I derive the likelihood of the speciation times from a reconstructed phylogenetic tree and show that each speciation event is independent and identically distributed. This fact can be used to efficiently simulate reconstructed phylogenetic trees when conditioning on the number of species, the time of the process or both. I show the usability of the simulation by approximating the posterior predictive distribution of a birth-death process with decreasing diversification rates applied to a published bird phylogeny (family Cettiidae). The methods described in this manuscript are implemented in the R package TESS, available from the repository CRAN (http://cran.r-project.org/web/packages/TESS/). Supplementary data are available at Bioinformatics online.
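
    To make the simulation-and-likelihood workflow concrete, here is a minimal sketch using the TESS package named above. The function names (tess.sim.taxa.age, tess.likelihood) follow the package documentation but should be verified against the installed version; the rate values are arbitrary illustrations.

      library(TESS)  # install.packages("TESS"); uses ape for tree handling
      set.seed(1)
      # Simulate 50 reconstructed trees conditioned on both the age of the
      # process (10 time units) and the number of surviving species (20),
      # under constant speciation (lambda) and extinction (mu) rates:
      trees <- tess.sim.taxa.age(n = 50, nTaxa = 20, age = 10,
                                 lambda = 2.0, mu = 1.0)
      # Likelihood of the speciation times of one simulated tree:
      times <- ape::branching.times(trees[[1]])
      tess.likelihood(times, lambda = 2.0, mu = 1.0)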

  17. MDplot: Visualise Molecular Dynamics

    PubMed Central

    Margreitter, Christian; Oostenbrink, Chris

    2017-01-01

    The MDplot package provides plotting functions to allow for automated visualisation of molecular dynamics simulation output. It is especially useful in cases where plot generation is rather tedious due to complex file formats or when a large number of plots are generated. The supported graphs range from standard ones, such as RMSD/RMSF (root-mean-square deviation and root-mean-square fluctuation, respectively), to less standard ones, such as thermodynamic integration analysis and hydrogen bond monitoring over time. All told, they address many commonly used analyses. In this article, we set out the MDplot package's functions, give examples of the function calls, and show the associated plots. Plotting and data parsing are separated in all cases, i.e. the respective functions can be used independently. Thus, data manipulation and the integration of additional file formats is fairly easy. Currently, the loading functions support GROMOS, GROMACS, and AMBER file formats. Moreover, we also provide a Bash interface that allows simple embedding of MDplot into Bash scripts as the final analysis step. Availability: The package can be obtained in the latest major version from CRAN (https://cran.r-project.org/package=MDplot) or in the most recent version from the project's GitHub page at https://github.com/MDplot/MDplot, where feedback is also most welcome. MDplot is published under the GPL-3 license. PMID:28845302
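
    As an illustration of the parse/plot separation described above, a minimal call sequence might look as follows. The load_rmsd()/rmsd() pairing follows the package's naming scheme; the input file name is hypothetical.

      library(MDplot)  # install.packages("MDplot")
      # Parsing is separate from plotting, so the parsed table can be
      # inspected or modified before any graph is drawn:
      rmsd_data <- load_rmsd(c("protein_rmsd.txt"))  # hypothetical input file
      rmsd(rmsd_data)  # draws the RMSD-over-time plot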

  18. Cardiology-oriented PACS

    NASA Astrophysics Data System (ADS)

    Silva, Augusto F. d.; Costa, Carlos; Abrantes, Pedro; Gama, Vasco; Den Boer, Ad

    1998-07-01

    This paper describes an integrated system designed to provide efficient means for DICOM-compliant cardiac imaging archival, transmission and visualization, based on a communications backbone built on recent enabling telematic technologies such as Asynchronous Transfer Mode (ATM) and switched Local Area Networks (LANs). Within a distributed client-server framework, the system was conceived on a modality-based, bottom-up approach, aiming at ultrafast access to short-term archives and seamless retrieval of cardiac video sequences at review stations located in outpatient referral rooms, intensive and intermediate care units and operating theaters.

  19. Optical Fiber Transmission In A Picture Archiving And Communication System For Medical Applications

    NASA Astrophysics Data System (ADS)

    Aaron, Gilles; Bonnard, Rene

    1984-03-01

    In a hospital, the need for an electronic communication network increases along with the digitization of pictures. This local area network is intended to link picture sources such as digital radiography, computed tomography, nuclear magnetic resonance and ultrasound with an archiving system. Interactive displays can be used in examination rooms, physicians' offices and clinics. In such a system, three major requirements must be considered: bit rate, cable length, and number of devices. The bit rate is very important because a maximum response time of a few seconds must be guaranteed for pictures of several megabits. The distance between nodes may be a few kilometers in some large hospitals. The number of devices connected to the network is never greater than a few tens, because picture sources and computers represent substantial hardware and simple displays can be concentrated. All these conditions are fulfilled by optical fiber transmission. Depending on the topology and the access protocol, two solutions are to be considered: an active ring, or an active or passive star. Finally, Thomson-CSF developments of optical transmission devices for large TV-distribution networks provide technological support and mass production that will cut down hardware costs.

  20. A Skilful Marine Sclerochronological Network Based Reconstruction of North Atlantic Subpolar Gyre Dynamics

    NASA Astrophysics Data System (ADS)

    Reynolds, D.; Hall, I. R.; Slater, S. M.; Scourse, J. D.; Wanamaker, A. D.; Halloran, P. R.; Garry, F. K.

    2017-12-01

    Spatial network analyses of precisely dated, annually resolved tree-ring proxy records have facilitated robust reconstructions of past atmospheric climate variability and of the associated mechanisms and forcings that drive it. In contrast, a lack of similarly dated marine archives has constrained the use of such techniques in the marine realm, despite the potential for developing a more robust understanding of the role basin-scale ocean dynamics play in the global climate system. Here we show that a spatial network of marine molluscan sclerochronological oxygen isotope (δ18Oshell) series spanning the North Atlantic region provides a skilful reconstruction of basin-scale North Atlantic sea surface temperatures (SSTs). Our analyses demonstrate that the composite marine series (referred to as δ18Oproxy_PC1) is significantly sensitive to inter-annual variability in North Atlantic SSTs (R = -0.61, P < 0.01) and surface air temperatures (SATs; R = -0.67, P < 0.01) over the 20th century. Subpolar gyre (SPG) SSTs dominate variability in the δ18Oproxy_PC1 series at sub-centennial frequencies (R = -0.51, P < 0.01). Comparison of the δ18Oproxy_PC1 series against variability in the strength of the European Slope Current and the maximum North Atlantic meridional overturning circulation derived from numerical climate models (CMIP5) indicates that variability in the SPG region, associated with the strength of the surface currents of the North Atlantic, is playing a significant role in shaping multi-decadal scale SST variability over the industrial era. These analyses demonstrate that spatial networks developed from sclerochronological archives can provide powerful baseline archives of past ocean variability, facilitating the development of a quantitative understanding of the role the oceans play in the global climate system and constraining uncertainties in numerical climate models.

  1. Quantifying travel time variability in transportation networks.

    DOT National Transportation Integrated Search

    2010-03-01

    Nonrecurring congestion creates significant delay on freeways in urban areas, lending importance to the study of facility reliability. In locations where traffic detectors record and archive data, approximate probability distributions for travel ...

  2. Products and Services Available from the Southern California Earthquake Data Center (SCEDC) and the Southern California Seismic Network (SCSN)

    NASA Astrophysics Data System (ADS)

    Chen, S. E.; Yu, E.; Bhaskaran, A.; Chowdhury, F. R.; Meisenhelter, S.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.

    2011-12-01

    Currently, the SCEDC archives continuous and triggered data from nearly 8400 data channels from 425 SCSN-recorded stations, processing and archiving an average of 6.4 TB of continuous waveforms and 12,000 earthquakes each year. The SCEDC provides public access to these earthquake parametric and waveform data through its website www.data.scec.org and through client applications such as STP and DHI. This poster will describe the most significant developments at the SCEDC during 2011. New website design: the SCEDC has revamped its website, making it easier for users to search the archive and discover updates and new content, and improving our ability to manage and update the site. New data holdings: post-processing of the El Mayor-Cucapah 7.2 sequence continues, with 11,847 events reviewed to date and updates available immediately in the earthquake catalog. A double-difference catalog (Hauksson et al. 2011) spanning 1981 to 6/30/11 will be available for download at www.data.scec.org and via STP. A focal mechanism catalog determined by Yang et al. 2011 is available for distribution at www.data.scec.org. Waveforms from Southern California NetQuake stations are now being stored in the SCEDC archive and are available via STP as event-associated waveforms; amplitudes from these stations are also being stored in the archive and used by ShakeMap. As part of a NASA/AIST project in collaboration with JPL and SIO, the SCEDC will receive real-time 1 sps streams of GPS displacement solutions from the California Real Time Network (http://sopac.ucsd.edu/projects/realtime; Genrich and Bock, 2006, J. Geophys. Res.); these channels will be archived at the SCEDC as miniSEED waveforms, which can then be distributed to the user community via applications such as STP. Improvements in the user tool STP: STP SAC output now includes picks from the SCSN. New archival methods: the SCEDC is exploring the feasibility of archiving and distributing waveform data using cloud computing services such as Google Apps. A month of continuous data from the SCEDC archive will be stored in Google Apps and a client developed to access it in a manner similar to STP. The data are stored in miniSEED format with gzip compression, and time gaps between time series are padded with null values, which substantially increases search efficiency by making the records uniform in length.

  3. The Rondonia Lightning Detection Network: Network Description, Science Objectives, Data Processing/Archival Methodology, and Results

    NASA Technical Reports Server (NTRS)

    Blakeslee, R. J.; Bailey, J. C.; Pinto, O.; Athayde, A.; Renno, N.; Weidman, C. D.

    2003-01-01

    A four station Advanced Lightning Direction Finder (ALDF) network was established in the state of Rondonia in western Brazil in 1999 through a collaboration of U.S. and Brazilian participants from NASA, INPE, INMET, and various universities. The network utilizes ALDF IMPACT (Improved Accuracy from Combined Technology) sensors to provide cloud-to-ground lightning observations (i.e., stroke/flash locations, signal amplitude, and polarity) using both time-of-arrival and magnetic direction finding techniques. The observations are collected, processed and archived at a central site in Brasilia and at the NASA/Marshall Space Flight Center in Huntsville, Alabama. Initial, non-quality-assured quick-look results are made available in near real-time over the Internet. The network, which is still operational, was deployed to provide ground truth data for the Lightning Imaging Sensor (LIS) on the Tropical Rainfall Measuring Mission (TRMM) satellite that was launched in November 1997. The measurements are also being used to investigate the relationship between the electrical, microphysical and kinematic properties of tropical convection. In addition, the long time series of observations produced by this network will help establish a regional lightning climatological database, supplementing other databases in Brazil that already exist or may soon be implemented. Analytic inversion algorithms developed at the NASA/Marshall Space Flight Center have been applied to the Rondonian ALDF lightning observations to obtain site error corrections and improved location retrievals. The data will also be corrected for the network detection efficiency. The processing methodology and the results from the analysis of four years of network operations will be presented.

  4. Sexually Transmitted Diseases (STDs) Prevention

    MedlinePlus


  5. Standardization of the Definitions of Vertical Resolution and Uncertainty in the NDACC-archived Ozone and Temperature Lidar Measurements

    NASA Technical Reports Server (NTRS)

    Leblanc, T.; Godin-Beekmann, S.; Payen, G.; Gabarrot, Franck; van Gijsel, Anne; Bandoro, J.; Sica, R.; Trickl, T.

    2012-01-01

    The international Network for the Detection of Atmospheric Composition Change (NDACC) is a global network of high-quality, remote-sensing research stations for observing and understanding the physical and chemical state of the Earth's atmosphere. As part of NDACC, over 20 ground-based lidar instruments are dedicated to the long-term monitoring of atmospheric composition and to the validation of space-borne measurements of the atmosphere from environmental satellites such as Aura and ENVISAT. One caveat of large networks such as NDACC is the difficulty of archiving measurement and analysis information consistently from one research group (or instrument) to another [1][2][3]. Yet the need for consistent definitions has strengthened as datasets of various origins (e.g., satellite and ground-based) are increasingly used for intercomparisons and validation, and are ingested together in global assimilation systems. In the framework of the 2010 Call for Proposals by the International Space Science Institute (ISSI) in Bern, Switzerland, a team of lidar experts was created to address existing issues in three critical aspects of the NDACC lidar ozone and temperature data retrievals: signal filtering and the vertical filtering of the retrieved profiles, the quantification and propagation of the uncertainties, and the consistent definition and reporting of filtering and uncertainties in the NDACC-archived products. Additional experts from the satellite and global data standards communities complement the team to help address issues specific to the latter aspect.

  6. NOAA tsunami water level archive - scientific perspectives and discoveries

    NASA Astrophysics Data System (ADS)

    Mungov, G.; Eble, M. C.; McLean, S. J.

    2013-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Geophysical Data Center (NGDC) and the co-located World Data Service for Geophysics (WDS) provide long-term archiving, data management, and access for national and global tsunami data. Currently, NGDC archives and processes high-resolution data recorded by the Deep-ocean Assessment and Reporting of Tsunami (DART) network and the coastal tide-gauge network of the National Ocean Service (NOS), as well as tide-gauge data recorded by all gauges in the two National Weather Service (NWS) Tsunami Warning Centers' (TWCs) regional networks. The challenge in processing these data is that the observations from the deep ocean, Pacific Islands, Alaska region, and United States West and East Coasts display commonalities but at the same time differ significantly, especially when extreme events are considered. The focus of this work is on how time integration of raw observations (10-second to 1-minute) can mask extreme water levels. Analysis of the statistical and spectral characteristics obtained from records with different time steps of integration will be presented. Results show the need to precisely calibrate the despiking procedure against raw data, due to the significant differences in the variability of deep-ocean and coastal tide-gauge observations. It is shown that special attention should be paid to the very strong water level declines associated with the passage of North Atlantic cyclones. Strong changes for the deep ocean and for the West Coast have implications for data quality, while these same features are typical of the East Coast regime.

  7. EGU2013 SM1.4/GI1.6 session: "Improving seismic networks performances: from site selection to data integration"

    NASA Astrophysics Data System (ADS)

    Pesaresi, D.; Busby, R.

    2013-08-01

    The number and quality of seismic stations and networks in Europe continually improve; nevertheless, there is always scope to optimize their performance. In this session we welcomed contributions on all aspects of seismic network installation, operation and management. This includes site selection; equipment testing and installation; planning and implementing communication paths; policies for redundancy in data acquisition, processing and archiving; and integration of different datasets, including GPS and OBS.

  8. Integration, acceptance testing, and clinical operation of the Medical Information, Communication and Archive System, phase II.

    PubMed

    Smith, E M; Wandtke, J; Robinson, A

    1999-05-01

    The Medical Information, Communication and Archive System (MICAS) is a multivendor incremental approach to a picture archiving and communications system (PACS). It is a multimodality integrated image management system that is seamlessly integrated with the radiology information system (RIS). Phase II enhancements of MICAS include a permanent archive, automated workflow, study caches, and Microsoft (Redmond, WA) Windows NT diagnostic workstations, with all components adhering to Digital Imaging and Communications in Medicine (DICOM) standards. MICAS is designed as an enterprise-wide PACS to provide images and reports throughout the Strong Health healthcare network. Phase II includes the addition of a Cemax-Icon (Fremont, CA) archive, a PACS broker (Mitra, Waterloo, Canada), and an interface (IDX PACSlink, Burlington, VT) to the RIS (IDXrad), plus the conversion of the UNIX-based redundant array of inexpensive disks (RAID) 5 temporary archives of phase I to NT-based RAID 0 DICOM modality-specific study caches (ImageLabs, Bedford, MA). The phase I acquisition engines and workflow management software were uninstalled, and the Cemax archive manager (AM) assumed these functions. The existing ImageLabs UNIX-based viewing software was enhanced and converted to an NT-based DICOM viewer. Installation of phase II hardware and software and integration with existing components began in July 1998. Phase II of MICAS demonstrates that a multivendor, open-system, incremental approach to PACS is feasible and cost-effective and has significant advantages over a single-vendor implementation.

  9. Cloud archiving and data mining of High-Resolution Rapid Refresh forecast model output

    NASA Astrophysics Data System (ADS)

    Blaylock, Brian K.; Horel, John D.; Liston, Samuel T.

    2017-12-01

    Weather-related research often requires synthesizing vast amounts of data that need archival solutions that are both economical and viable during and past the lifetime of the project. Public cloud computing services (e.g., from Amazon, Microsoft, or Google) or private clouds managed by research institutions are providing object data storage systems potentially appropriate for long-term archives of such large geophysical data sets. We illustrate the use of a private cloud object store developed by the Center for High Performance Computing (CHPC) at the University of Utah. Since early 2015, we have been archiving thousands of two-dimensional gridded fields (each one containing over 1.9 million values over the contiguous United States) from the High-Resolution Rapid Refresh (HRRR) data assimilation and forecast modeling system. The archive is being used for retrospective analyses of meteorological conditions during high-impact weather events, assessing the accuracy of the HRRR forecasts, and providing initial and boundary conditions for research simulations. The archive is accessible interactively and through automated download procedures for researchers at other institutions that can be tailored by the user to extract individual two-dimensional grids from within the highly compressed files. Characteristics of the CHPC object storage system are summarized relative to network file system storage or tape storage solutions. The CHPC storage system is proving to be a scalable, reliable, extensible, affordable, and usable archive solution for our research.
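
    The per-field access pattern described above typically relies on HTTP byte-range requests against the compressed GRIB2 files, with byte offsets taken from a sidecar index file. Below is a hedged R sketch of that pattern; the URL and byte offsets are placeholders, not real archive paths.

      library(httr)
      # Placeholder URL; a real request would target the archive host and take
      # the byte offsets of the desired field from its .idx sidecar file.
      url <- "https://archive.example.edu/hrrr/20170101/hrrr.t00z.wrfsfcf00.grib2"
      resp <- GET(url, add_headers(Range = "bytes=1000000-1999999"))
      if (status_code(resp) == 206) {  # 206 Partial Content = range honoured
        writeBin(content(resp, "raw"), "one_field.grib2")
      }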

  10. The IRIS Data Management Center: An international "network of networks", providing open, automated access to geographically distributed sensors of geophysical and environmental data.

    NASA Astrophysics Data System (ADS)

    Benson, R. B.; Ahern, T. K.; Trabant, C.

    2006-12-01

    The IRIS Data Management System has long supported international collaboration in seismology, both by deploying a global network of seismometers and by creating and maintaining an open and accessible archive in Seattle, WA, known as the Data Management Center (DMC). With sensors distributed on a global scale and spanning more than 30 years of digital data, the DMC provides a rich repository of observations across broad time and space domains. Primary seismological data types include strong motion and broadband seismometers, conventional and superconducting gravimeters, tilt and creep meters, and GPS measurements, along with other similar sensors that record accurate and calibrated ground motion. What may not be as well understood is the volume of environmental data that accompanies typical seismological data these days. This poster will review the types of time-series data that are currently being collected, how they are collected, and how they are made freely available for download at the IRIS DMC. Environmental sensors that are often co-located with geophysical sensors measure temperature, barometric pressure, wind direction and speed, humidity, insolation, and rainfall, and sometimes hydrological variables such as water current, level, temperature and depth. As the primary archival institution of the International Federation of Digital Seismograph Networks (FDSN), the IRIS DMC collects approximately 13,600 channels of real-time data from 69 different networks and close to 1600 individual stations, currently averaging 10 TB per year in total. A major contribution to the IRIS archive currently is the EarthScope project data, a ten-year science undertaking that is collecting data from a high-resolution, multi-variate sensor network. Data types include magnetotellurics, high-sample-rate seismic data from a borehole drilled into the San Andreas fault (SAFOD), and various types of strain data from the Plate Boundary Observatory (PBO). In addition to the DMC, data centers located in other countries are networked seamlessly, providing researchers with access to data from national networks around the world through the IRIS-developed Data Handling Interface (DHI) system. This poster will highlight some of the DHI-enabled clients that allow geophysical information to be transferred directly to the clients. This ability allows one to construct a virtual network of data centers, providing the illusion of a single virtual observatory. Furthermore, some of the features that will be shown include direct connections to MATLAB and the ability to access globally distributed sensor data in real time. We encourage discussion and participation from network operators who would like to leverage existing technology, as well as enabling collaboration.

  11. KMgene: a unified R package for gene-based association analysis for complex traits.

    PubMed

    Yan, Qi; Fang, Zhou; Chen, Wei; Stegle, Oliver

    2018-02-09

    In this report, we introduce an R package KMgene for performing gene-based association tests for familial, multivariate or longitudinal traits using kernel machine (KM) regression under a generalized linear mixed model (GLMM) framework. Extensive simulations were performed to evaluate the validity of the approaches implemented in KMgene. http://cran.r-project.org/web/packages/KMgene. qi.yan@chp.edu or wei.chen@chp.edu. Supplementary data are available at Bioinformatics online. © The Author(s) 2018. Published by Oxford University Press.
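
    For readers unfamiliar with kernel machine tests, the following self-contained sketch illustrates the core idea for a quantitative trait in unrelated samples: a variance-component score statistic on a genotype kernel, with a Satterthwaite approximation to its null distribution. This mirrors the general approach, not KMgene's own API, and the simulated data are purely illustrative.

      set.seed(1)
      n <- 200; p <- 10
      G <- matrix(rbinom(n * p, 2, 0.3), n, p)  # genotypes for one gene (0/1/2)
      X <- cbind(1, rnorm(n))                   # intercept + one covariate
      y <- drop(X %*% c(1, 0.5) + rnorm(n))     # trait simulated under the null
      K <- tcrossprod(G)                        # linear kernel on genotypes
      res <- lm.fit(X, y)$residuals
      s2 <- sum(res^2) / (n - ncol(X))
      Q <- drop(t(res) %*% K %*% res) / (2 * s2)  # score-type statistic
      # Null distribution is a mixture of chi-squares; match the mean and
      # variance of Q with a scaled chi-square (Satterthwaite approximation):
      P  <- diag(n) - X %*% solve(crossprod(X), t(X))
      PK <- P %*% K
      EQ <- sum(diag(PK)) / 2    # approximate E[Q]
      VQ <- sum(PK * t(PK)) / 2  # approximate Var[Q] = tr((PK)^2) / 2
      kappa <- VQ / (2 * EQ); nu <- 2 * EQ^2 / VQ
      pchisq(Q / kappa, df = nu, lower.tail = FALSE)  # p-value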

  12. Assortative model for social networks

    NASA Astrophysics Data System (ADS)

    Catanzaro, Michele; Caldarelli, Guido; Pietronero, Luciano

    2004-09-01

    In this Brief Report we present a version of a network growth model, generalized in order to describe the behavior of social networks. The case study considered is the coauthorship network from the cond-mat preprint archive at arXiv.org. Each node corresponds to a scientist, and a link is present whenever two authors wrote a paper together. This graph is a nice example of a degree-assortative network, that is to say, a network where sites with similar degree are connected to each other. The model presented is one of the few able to reproduce such behavior, giving some insight into the microscopic dynamics at the basis of the graph structure.
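
    Since the abstract does not spell out the growth rules, the sketch below shows one generic way to grow a degree-assortative graph in R. It illustrates the concept (newcomers preferring nodes of similar degree), not the authors' exact model.

      set.seed(42)
      n <- 500
      deg <- c(1, 1)                      # start from a single edge 1-2
      edges <- matrix(c(1, 2), ncol = 2)
      for (v in 3:n) {
        # A new node enters with degree 1, so weight existing nodes by how
        # close their degree is to 1 (similar-degree attachment):
        w <- 1 / (1 + abs(deg - 1))
        u <- sample(seq_along(deg), 1, prob = w)
        edges <- rbind(edges, c(u, v))
        deg[u] <- deg[u] + 1
        deg <- c(deg, 1)
      }
      # With igraph installed, check that degree assortativity is positive:
      # g <- igraph::graph_from_edgelist(edges, directed = FALSE)
      # igraph::assortativity_degree(g)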

  13. Integrating a local database into the StarView distributed user interface

    NASA Technical Reports Server (NTRS)

    Silberberg, D. P.

    1992-01-01

    A distributed user interface to the Space Telescope Data Archive and Distribution Service (DADS) known as StarView is being developed. The DADS architecture consists of the data archive as well as a relational database catalog describing the archive. StarView is a client/server system in which the user interface is the front-end client to the DADS catalog and archive servers. Users query the DADS catalog from the StarView interface. Query commands are transmitted via a network and evaluated by the database. The results are returned via the network and are displayed on StarView forms. Based on the results, users decide which data sets to retrieve from the DADS archive. Archive requests are packaged by StarView and sent to DADS, which returns the requested data sets to the users. The advantages of distributed client/server user interfaces over traditional one-machine systems are well known. Since users run software on machines separate from the database, the overall client response time is much faster. Also, since the server is free to process only database requests, the database response time is much faster. Disadvantages inherent in this architecture are the slow overall database access time due to network delays, the lack of a 'get previous row' command, and the fact that refinements of a previously issued query must be submitted to the database server even though the domain of values has already been returned by the previous query. This architecture also does not allow users to cross-correlate DADS catalog data with other catalogs. Clearly, a distributed user interface would be more powerful if it overcame these disadvantages. A local database is being integrated into StarView to overcome them. When a query is made through a StarView form, which is often composed of fields from multiple tables, it is translated to an SQL query and issued to the DADS catalog. At the same time, a local database table is created to contain the resulting rows of the query. The returned rows are displayed on the form as well as inserted into the local database table. Identical results are produced by reissuing the query to either the DADS catalog or the local table. Relational databases do not provide a 'get previous row' function because of the inherent complexity of retrieving previous rows of multiple-table joins. However, since this function is easily implemented on a single table, StarView uses the local table to retrieve the previous row. Also, StarView issues subsequent query refinements to the local table instead of the DADS catalog, eliminating the network transmission overhead. Finally, other catalogs can be imported into the local database for cross-correlation with local tables. Overall, it is believed that this is a more powerful architecture for distributed database user interfaces.
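
    The local-table caching pattern described above is easy to demonstrate with an in-memory SQLite database. The sketch below mocks the remote catalog result and is not StarView code; the table and column names are hypothetical.

      library(DBI)  # plus the RSQLite backend
      con <- dbConnect(RSQLite::SQLite(), ":memory:")
      # Pretend this data frame is the result set returned by the remote
      # DADS catalog query:
      remote_result <- data.frame(obs_id  = 1:5,
                                  target  = c("M31", "M33", "M51", "M81", "M101"),
                                  exptime = c(300, 600, 120, 900, 450))
      # Mirror the result into a local table as it is displayed:
      dbWriteTable(con, "query_cache", remote_result, overwrite = TRUE)
      # Query refinements now run locally, with no network round trip:
      dbGetQuery(con, "SELECT * FROM query_cache WHERE exptime > 400")
      # 'Get previous row' is trivial on a single local table:
      dbGetQuery(con, "SELECT * FROM query_cache WHERE obs_id = 2")
      dbDisconnect(con)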

  14. Indicators of Suicide Found on Social Networks: Phase 2

    DTIC Science & Technology

    2015-10-01


  15. Building the Joint Battlespace Infosphere. Volume 1: Summary

    DTIC Science & Technology

    1999-12-17


  16. Providing Web Interfaces to the NSF EarthScope USArray Transportable Array

    NASA Astrophysics Data System (ADS)

    Vernon, Frank; Newman, Robert; Lindquist, Kent

    2010-05-01

    Since April 2004 the EarthScope USArray seismic network has grown to over 850 broadband stations that stream multi-channel data in near real-time to the Array Network Facility in San Diego. Providing secure, yet open, access to real-time and archived data for a broad range of audiences is best served by a series of platform-agnostic, low-latency, web-based applications. We present a framework of tools that mediate between the world wide web and the Boulder Real Time Technologies Antelope Environmental Monitoring System data acquisition and archival software. These tools provide comprehensive information to audiences ranging from network operators and geoscience researchers to funding agencies and the general public, covering network-wide and station-specific metadata, state-of-health metrics, event detection rates, archival data, and dynamic report generation over a station's two-year life span. Leveraging open-source web development frameworks on both the server side (Perl, Python and PHP) and the client side (Flickr, Google Maps/Earth and jQuery) facilitates the development of a robust, extensible architecture that can be tailored on a per-user basis, with rapid prototyping and development that adheres to web standards. Typical seismic data warehouses allow online users to query and download data collected from regional networks without the scientist directly assessing data coverage or quality visually. Using a suite of web-based protocols, we have recently developed an online seismic waveform interface that directly queries and displays data from a relational database through a web browser. Using the Python interface to Datascope and the Python-based Twisted network package on the server side, and the jQuery Javascript framework on the client side to send and receive asynchronous waveform queries, we display broadband seismic data using the HTML Canvas element, accessible to anyone using a modern web browser. We are currently creating additional tools to provide a rich-client interface for accessing and displaying seismic data that can be deployed to any system running the Antelope Real Time System. The software is freely available from the Antelope contributed code Git repository (http://www.antelopeusersgroup.org).

  17. Interactive access to LP DAAC satellite data archives through a combination of open-source and custom middleware web services

    USGS Publications Warehouse

    Davis, Brian N.; Werpy, Jason; Friesz, Aaron M.; Impecoven, Kevin; Quenzer, Robert; Maiersperger, Tom; Meyer, David J.

    2015-01-01

    Current methods of searching for and retrieving data from satellite land remote sensing archives do not allow for interactive information extraction. Instead, Earth science data users are required to download files over low-bandwidth networks to local workstations and process the data before science questions can be addressed. Methods of extracting information from data archives need to become more interactive to meet user demands for deriving increasingly complex information from rapidly expanding archives. Moving the tools required for processing data to the computer systems of data providers, and away from the systems of the data consumer, can improve turnaround times for data processing workflows. Middleware services were implemented to provide interactive access to archive data. The goal of this middleware development is to enable Earth science data users to access remote sensing archives for immediate answers to science questions, instead of receiving links to large volumes of data to download and process. Exposing data and metadata to web-based services enables machine-driven queries and data interaction. Product quality information can also be integrated to enable additional filtering and subsetting. Only the reduced content required to complete an analysis is then transferred to the user.

  18. Implementation of a filmless mini picture archiving and communication system in ultrasonography: experience after one year of use.

    PubMed

    Henri, C J; Cox, R D; Bret, P M

    1997-08-01

    This article details our experience in developing and operating an ultrasound mini-picture archiving and communication system (PACS). Using software developed in-house, low-end Macintosh computers (Apple Computer Co. Cupertino, CA) equipped with framegrabbers coordinate the entry of patient demographic information, image acquisition, and viewing on each ultrasound scanner. After each exam, the data are transmitted to a central archive server where they can be accessed from anywhere on the network. The archive server also provides web-based access to the data and manages pre-fetch and other requests for data that may no longer be on-line. Archival is fully automatic and is performed on recordable compact disk (CD) without compression. The system has been filmless now for over 18 months. In the meantime, one film processor has been eliminated and the position of one film clerk has been reallocated. Previously, nine ultrasound machines produced approximately 150 sheets of laser film per day (at 14 images per sheet). The same quantity of data are now archived without compression onto a single CD. Start-up costs were recovered within six months, and the project has been extended to include computed tomography (CT) and magnetic resonance imaging (MRI).

  19. Networking of Bibliographical Information: Lessons learned for the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Genova, Françoise; Egret, Daniel

    Networking of bibliographic information is particularly remarkable in astronomy. On-line journals, the ADS bibliographic database, SIMBAD and NED are everyday tools for research, and provide easy navigation from one resource to another. Tables are published on line, in close collaboration with data centers. Recent developments include links between observatory archives and the ADS, as well as the large-scale prototyping of object links between Astronomy and Astrophysics and SIMBAD, following those implemented a few years ago with New Astronomy and the Information Bulletin on Variable Stars. This networking has been made possible by close collaboration between the ADS, data centers such as the CDS and NED, and the journals, and this partnership is now being extended to observatory archives. Simple, de facto exchange standards, like the bibcode used to refer to a published paper, have been the key to building links and exchanging data. This partnership, in which practitioners from different disciplines agree to link their resources and to work together to define useful and usable standards, has produced a revolution in scientists' practice. It is an excellent model for the Virtual Observatory projects.

  20. The National Institutes of Health Clinical Center Digital Imaging Network, Picture Archival and Communication System, and Radiology Information System.

    PubMed

    Goldszal, A F; Brown, G K; McDonald, H J; Vucich, J J; Staab, E V

    2001-06-01

    In this work, we describe the digital imaging network (DIN), picture archival and communication system (PACS), and radiology information system (RIS) currently being implemented at the Clinical Center, National Institutes of Health (NIH). These systems are presently in clinical operation. The DIN is a redundant meshed network designed to address gigabit density and expected high-bandwidth requirements for image transfer and server aggregation. The PACS projected workload is 5.0 TB of new imaging data per year. Its architecture consists of a central, high-throughput Digital Imaging and Communications in Medicine (DICOM) data repository and distributed redundant array of inexpensive disks (RAID) servers employing fiber-channel technology for immediate delivery of imaging data. On-demand distribution of images and reports to clinicians and researchers is accomplished via a clustered web server. The RIS follows a client-server model and provides tools to order exams, schedule resources, retrieve and review results, and generate management reports. The RIS-hospital information system (HIS) interfaces include admissions, discharges, and transfers (ADTs)/demographics, orders, appointment notifications, doctor updates, and results.

  1. The status of the international Halley watch

    NASA Technical Reports Server (NTRS)

    Newburn, Ray L., Jr.; Rahe, Juergen

    1987-01-01

    More than 1000 professional astronomers worldwide observed Halley's comet from the ground. Preliminary logs from the observers indicate that 20-40 Gbytes of data were acquired in eight professional disciplines, and as much as 5 Gbytes in the amateur network. The latter will be used to fill in gaps in the Archive and to provide a visual light curve. In addition, roughly 400 Mbytes of data were taken on Comet Giacobini-Zinner. Data will be accepted for archiving until early 1989. The permanent archive will consist of a set of CD-ROMs and a set of books, publication of both to be completed by mid-1990. Data from the space missions will be included, but only on the CDs. From every indication, the ground-based effort and the space missions complemented each other beautifully, both directly in the solution of spacecraft navigation problems and indirectly in the solution of scientific problems. The major remaining concern is that scientists submit their data to the Archive before the 1989 deadline.

  2. Landsat International Cooperators and Global Archive Consolidation

    USGS Publications Warehouse


    2016-04-07

    Landsat missions have always been an important component of U.S. foreign policy, as well as science and technology policy. The program's longstanding network of International Cooperators (ICs), which operate numerous International Ground Stations (IGS) around the world, embodies the United States' policy of peaceful use of outer space and the worldwide dissemination of civil space technology for public benefit. Thus, the ICs provide an essential dimension to the Landsat mission. In 2010, the Landsat Global Archive Consolidation (LGAC) effort began, with a goal to consolidate the Landsat data archives of all international ground stations, make the data more accessible to the global Landsat community, and significantly increase the frequency of observations over a given area of interest to improve scientific uses such as change detection and analysis.

  3. The Rondonia Lightning Detection Network: Network Description, Science Objectives, Data Processing/Archival Methodology, and First Results

    NASA Technical Reports Server (NTRS)

    Blakeslee, Richard

    1999-01-01

    A four station Advanced Lightning Direction Finder (ALDF) network was recently established in the state of Rondonia in western Brazil through a collaboration of U.S. and Brazilian participants from NASA, INPE, INMET, and various universities. The network utilizes ALDF IMPACT (Improved Accuracy from Combined Technology) sensors to provide cloud-to-ground lightning observations (i.e., stroke/flash locations, signal amplitude, and polarity) using both time-of-arrival and magnetic direction finding techniques. The observations are collected, processed and archived at a central site in Brasilia and at the NASA/Marshall Space Flight Center (MSFC) in Huntsville, Alabama. Initial, non-quality-assured quick-look results are made available in near real-time over the internet. The network will remain deployed for several years to provide ground truth data for the Lightning Imaging Sensor (LIS) on the Tropical Rainfall Measuring Mission (TRMM) satellite which was launched in November 1997. The measurements will also be used to investigate the relationship between the electrical, microphysical and kinematic properties of tropical convection. In addition, the long-term observations from this network will contribute to establishing a regional lightning climatological database, supplementing other databases in Brazil that already exist or may soon be implemented. Analytic inversion algorithms developed at NASA/MSFC are now being applied to the Rondonian ALDF lightning observations to obtain site error corrections and improved location retrievals. The processing methodology and the initial results from an analysis of the first 6 months of network operations will be presented.

  4. The Rondonia Lightning Detection Network: Network Description, Science Objectives, Data Processing/Archival Methodology, and First Results

    NASA Technical Reports Server (NTRS)

    Blakeslee, Rich; Bailey, Jeff; Koshak, Bill

    1999-01-01

    A four station Advanced Lightning Direction Finder (ALDF) network was recently established in the state of Rondonia in western Brazil through a collaboration of U.S. and Brazilian participants from NASA, INPE, INMET, and various universities. The network utilizes ALDF IMPACT (Improved Accuracy from Combined Technology) sensors to provide cloud-to-ground lightning observations (i.e., stroke/flash locations, signal amplitude, and polarity) using both time-of-arrival and magnetic direction finding techniques. The observations are collected, processed and archived at a central site in Brasilia and at the NASA/Marshall Space Flight Center (MSFC) in Huntsville, Alabama. Initial, non-quality-assured quick-look results are made available in near real-time over the internet. The network will remain deployed for several years to provide ground truth data for the Lightning Imaging Sensor (LIS) on the Tropical Rainfall Measuring Mission (TRMM) satellite which was launched in November 1997. The measurements will also be used to investigate the relationship between the electrical, microphysical and kinematic properties of tropical convection. In addition, the long-term observations from this network will contribute to establishing a regional lightning climatological database, supplementing other databases in Brazil that already exist or may soon be implemented. Analytic inversion algorithms developed at the NASA/Marshall Space Flight Center (MSFC) are now being applied to the Rondonian ALDF lightning observations to obtain site error corrections and improved location retrievals. The processing methodology and the initial results from an analysis of the first 6 months of network operations will be presented.

  5. Digital data preservation for scholarly publications in astronomy

    NASA Astrophysics Data System (ADS)

    Choudhury, Sayeed; di Lauro, Tim; Szalay, Alex; Vishniac, Ethan; Hanisch, Robert; Steffen, Julie; Milkey, Robert; Ehling, Teresa; Plante, Ray

    2007-11-01

    Astronomy is similar to other scientific disciplines in that scholarly publication relies on the presentation and interpretation of data. But although astronomy now has archives for its primary research telescopes and associated surveys, the highly processed data that is presented in the peer-reviewed journals and is the basis for final analysis and interpretation is generally not archived and has no permanent repository. We have initiated a project whose goal is to implement an end-to-end prototype system which, through a partnership of a professional society, that society's scholarly publications/publishers, research libraries, and an information technology substrate provided by the Virtual Observatory, will capture high-level digital data as part of the publication process and establish a distributed network of curated, permanent data repositories. The data in this network will be accessible through the research journals, astronomy data centers, and Virtual Observatory data discovery portals.

  6. speaq 2.0: A complete workflow for high-throughput 1D NMR spectra processing and quantification.

    PubMed

    Beirnaert, Charlie; Meysman, Pieter; Vu, Trung Nghia; Hermans, Nina; Apers, Sandra; Pieters, Luc; Covaci, Adrian; Laukens, Kris

    2018-03-01

    Nuclear Magnetic Resonance (NMR) spectroscopy is, together with liquid chromatography-mass spectrometry (LC-MS), the most established platform for metabolomics. In contrast to LC-MS, however, NMR data are still predominantly processed with commercial software, and their processing remains tedious and dependent on user interventions. As a follow-up to speaq, a previously released workflow for NMR spectral alignment and quantitation, we present speaq 2.0. This completely revised framework for automatically analyzing 1D NMR spectra uses wavelets to efficiently summarize the raw spectra with minimal information loss or user interaction. The tool offers a fast and easy workflow that starts with the common approach of peak picking, followed by grouping, thus avoiding the binning step. This yields a matrix of features, samples and peak values that can be conveniently processed either with the included multivariate statistical functions or with many other recently developed methods for NMR data analysis. speaq 2.0 facilitates robust and high-throughput metabolomics based on 1D NMR but is also compatible with other NMR frameworks or complementary LC-MS workflows. The methods are benchmarked using a simulated dataset and two publicly available datasets. speaq 2.0 is distributed through the existing speaq R package to provide a complete solution for NMR data processing. The package and the code for the presented case studies are freely available on CRAN (https://cran.r-project.org/package=speaq) and GitHub (https://github.com/beirnaert/speaq).
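
    A compressed view of the workflow described above, with function names taken from the package vignette (verify against the installed version). 'spectra' and 'ppm_scale' are hypothetical inputs: a numeric matrix of 1D spectra (one row per sample) and its ppm axis.

      library(speaq)  # install.packages("speaq")
      # 1. Wavelet-based peak detection on the raw spectra:
      peaks <- getWaveletPeaks(Y.spec = spectra, X.ppm = ppm_scale)
      # 2. Group corresponding peaks across samples (no binning step):
      grouped <- PeakGrouper(Y.peaks = peaks)
      # 3. Recover peaks missed in individual samples, then build the
      #    features x samples matrix for statistical analysis:
      filled <- PeakFilling(Y.grouped = grouped, Y.spec = spectra)
      features <- BuildFeatureMatrix(filled)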

  7. speaq 2.0: A complete workflow for high-throughput 1D NMR spectra processing and quantification

    PubMed Central

    Pieters, Luc; Covaci, Adrian

    2018-01-01

    Nuclear Magnetic Resonance (NMR) spectroscopy is, together with liquid chromatography-mass spectrometry (LC-MS), the most established platform for metabolomics. In contrast to LC-MS, however, NMR data are still predominantly processed with commercial software, and their processing remains tedious and dependent on user interventions. As a follow-up to speaq, a previously released workflow for NMR spectral alignment and quantitation, we present speaq 2.0. This completely revised framework for automatically analyzing 1D NMR spectra uses wavelets to efficiently summarize the raw spectra with minimal information loss or user interaction. The tool offers a fast and easy workflow that starts with the common approach of peak picking, followed by grouping, thus avoiding the binning step. This yields a matrix of features, samples and peak values that can be conveniently processed either with the included multivariate statistical functions or with many other recently developed methods for NMR data analysis. speaq 2.0 facilitates robust and high-throughput metabolomics based on 1D NMR but is also compatible with other NMR frameworks or complementary LC-MS workflows. The methods are benchmarked using a simulated dataset and two publicly available datasets. speaq 2.0 is distributed through the existing speaq R package to provide a complete solution for NMR data processing. The package and the code for the presented case studies are freely available on CRAN (https://cran.r-project.org/package=speaq) and GitHub (https://github.com/beirnaert/speaq). PMID:29494588

  8. pulver: an R package for parallel ultra-rapid p-value computation for linear regression interaction terms.

    PubMed

    Molnos, Sophie; Baumbach, Clemens; Wahl, Simone; Müller-Nurasyid, Martina; Strauch, Konstantin; Wang-Sattler, Rui; Waldenberger, Melanie; Meitinger, Thomas; Adamski, Jerzy; Kastenmüller, Gabi; Suhre, Karsten; Peters, Annette; Grallert, Harald; Theis, Fabian J; Gieger, Christian

    2017-09-29

    Genome-wide association studies allow us to understand the genetics of complex diseases. Human metabolism provides information about the disease-causing mechanisms, so it is usual to investigate the associations between genetic variants and metabolite levels. However, considering only genetic variants and their effects on one trait ignores the possible interplay between different "omics" layers. Existing tools only consider single-nucleotide polymorphism (SNP)-SNP interactions, and no practical tool is available for large-scale investigations of the interactions between pairs of arbitrary quantitative variables. We developed an R package called pulver to compute p-values for the interaction term in a very large number of linear regression models. Comparisons based on simulated data showed that pulver is much faster than existing tools. This is achieved by using the correlation coefficient to test the null hypothesis, which avoids the costly computation of matrix inversions. Additional optimizations are a rearrangement of the iteration order through the different "omics" layers and an implementation of the algorithm in the fast programming language C++. Furthermore, we applied our algorithm to data from the German KORA study to investigate a real-world problem involving the interplay among DNA methylation, genetic variants, and metabolite levels. The pulver package is a convenient and rapid tool for screening huge numbers of linear regression models for significant interaction terms in arbitrary pairs of quantitative variables. pulver is written in R and C++, and can be downloaded freely from CRAN at https://cran.r-project.org/web/packages/pulver/.
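
    The correlation shortcut mentioned above can be illustrated without the package: by the Frisch-Waugh theorem, the t-test on the interaction coefficient in y ~ x + z + x:z is equivalent to a t-test on the correlation of residualized quantities. The self-contained sketch below mirrors that idea on simulated data; it is not pulver's internal code.

      set.seed(7)
      n <- 500
      x <- rnorm(n); z <- rnorm(n)
      y <- 0.3 * x - 0.2 * z + 0.4 * x * z + rnorm(n)
      # Residualize the outcome and the product term on the main effects:
      M  <- cbind(1, x, z)
      ry <- lm.fit(M, y)$residuals
      ri <- lm.fit(M, x * z)$residuals
      # t-test on the correlation coefficient; df = n - 4 as in the full model:
      r    <- cor(ry, ri)
      tval <- r * sqrt((n - 4) / (1 - r^2))
      p_corr <- 2 * pt(-abs(tval), df = n - 4)
      # Agrees with the standard lm() interaction p-value:
      p_lm <- summary(lm(y ~ x * z))$coefficients["x:z", 4]
      c(p_corr = p_corr, p_lm = p_lm)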

  9. A new method to study the change of miRNA-mRNA interactions due to environmental exposures.

    PubMed

    Petralia, Francesca; Aushev, Vasily N; Gopalakrishnan, Kalpana; Kappil, Maya; W Khin, Nyan; Chen, Jia; Teitelbaum, Susan L; Wang, Pei

    2017-07-15

    Integrative approaches characterizing the interactions among different types of biological molecules have been demonstrated to be useful for revealing informative biological mechanisms. One such example is the interaction between microRNA (miRNA) and messenger RNA (mRNA), whose deregulation may be sensitive to environmental insult, leading to altered phenotypes. The goal of this work is to develop an effective data integration method to characterize deregulation between miRNA and mRNA due to environmental toxicant exposures. We use data from an animal experiment designed to investigate the effect of low-dose environmental chemical exposure on normal mammary gland development in rats to motivate and evaluate the proposed method. We propose a new network approach, integrative Joint Random Forest (iJRF), which characterizes the regulatory system between miRNAs and mRNAs using a network model. iJRF is designed to work under the high-dimension low-sample-size regime, and can borrow information across different treatment conditions to achieve more accurate network inference. It also effectively takes into account prior information on miRNA-mRNA regulatory relationships from existing databases. When iJRF was applied to the data from the environmental chemical exposure study, we detected a few important miRNAs that regulated a large number of mRNAs in the control group but not in the exposed groups, suggesting the disruption of miRNA activity due to chemical exposure. Effects of chemical exposure on two affected miRNAs were further validated using human breast cancer cell lines. The R package iJRF is available at CRAN. pei.wang@mssm.edu or susan.teitelbaum@mssm.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
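
    iJRF itself should be used through its package interface (see the CRAN documentation); the sketch below only illustrates the general idea it builds on, using plain random forests fitted per condition and comparing variable importance across conditions to flag regulators whose influence disappears after exposure. All data here are simulated.

      library(randomForest)
      set.seed(3)
      n <- 60; p <- 15
      mirs <- paste0("miR", 1:p)
      mirna_ctrl <- matrix(rnorm(n * p), n, p, dimnames = list(NULL, mirs))
      mirna_expo <- matrix(rnorm(n * p), n, p, dimnames = list(NULL, mirs))
      # miR1 suppresses the target mRNA only in the control condition:
      mrna_ctrl <- -1.5 * mirna_ctrl[, 1] + rnorm(n)
      mrna_expo <- rnorm(n)
      imp_ctrl <- importance(randomForest(mirna_ctrl, mrna_ctrl))[, 1]
      imp_expo <- importance(randomForest(mirna_expo, mrna_expo))[, 1]
      # Large drops in importance point to interactions lost after exposure:
      head(sort(imp_ctrl - imp_expo, decreasing = TRUE), 3)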

  10. A medical application integrating remote 3D visualization tools to access picture archiving and communication system on mobile devices.

    PubMed

    He, Longjun; Ming, Xing; Liu, Qian

    2014-04-01

    With computing capability and display size growing, mobile devices have been used as tools to help clinicians view patient information and medical images anywhere and anytime. However, for direct interactive 3D visualization, which plays an important role in radiological diagnosis, the mobile device alone cannot provide a satisfactory quality of experience for radiologists. This paper presents a medical system that retrieves medical images from the picture archiving and communication system (PACS) on a mobile device over a wireless network. In the proposed application, the mobile device obtains patient information and medical images through a proxy server connected to the PACS server. Meanwhile, the proxy server integrates a range of 3D visualization techniques, including maximum intensity projection, multi-planar reconstruction and direct volume rendering, to provide shape, brightness, depth and location information generated from the original sectional images for radiologists. Furthermore, an algorithm that automatically changes remote render parameters to adapt to the network status was employed to improve the quality of experience. Finally, performance issues regarding the remote 3D visualization of medical images over the wireless network are also discussed. The results demonstrate that the proposed medical application can provide a smooth interactive experience over WLAN and 3G networks.

  11. Maritime Aerosol Network optical depth measurements and comparison with satellite retrievals from various sensors

    NASA Astrophysics Data System (ADS)

    Smirnov, Alexander; Petrenko, Maksym; Ichoku, Charles; Holben, Brent N.

    2017-10-01

    The paper reports on the current status of the Maritime Aerosol Network (MAN), which is a component of the Aerosol Robotic Network (AERONET). A public-domain, web-based data archive dedicated to MAN activity can be found at https://aeronet.gsfc.nasa.gov/new_web/maritime_aerosol_network.html. Since 2006, over 450 cruises have been completed, and the data archive comprises more than 6000 measurement days. In this work, we present MAN observations collocated with MODIS Terra, MODIS Aqua, MISR, POLDER, SeaWiFS, OMI, and CALIOP spaceborne aerosol products using a modified version of the Multi-Sensor Aerosol Products Sampling System (MAPSS) framework. Because of the different spatio-temporal characteristics of the analyzed products, the number of MAN data points collocated with spaceborne retrievals varied from about 1500 matchups for MODIS to 39 for CALIOP (as of August 2016). Despite these unavoidable sampling biases, latitudinal dependencies of AOD differences for all satellite sensors, except SeaWiFS and POLDER, showed positive biases against ground truth (i.e., MAN) in the southern latitudes (<50° S), and substantial scatter in the Northern Atlantic "dust belt" (5°-15° N). Our analysis did not intend to determine whether satellite retrievals are within claimed uncertainty boundaries, but rather to show where biases exist and corrections are needed.
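
    At its core, the MAPSS-style collocation of ship-borne and satellite AOD is a spatio-temporal matchup search. The R sketch below is illustrative only, not the MAPSS framework; the 25 km and 30 min criteria and the sample values are assumptions chosen for the example.

      # Illustrative matchup search (not the MAPSS framework). The 25 km and
      # 30 min windows are placeholder criteria.
      haversine_km <- function(lat1, lon1, lat2, lon2) {
        rad <- pi / 180
        a <- sin((lat2 - lat1) * rad / 2)^2 +
             cos(lat1 * rad) * cos(lat2 * rad) * sin((lon2 - lon1) * rad / 2)^2
        2 * 6371 * asin(sqrt(pmin(1, a)))          # great-circle distance
      }

      collocate <- function(ship, sat, max_km = 25, max_min = 30) {
        lapply(seq_len(nrow(ship)), function(i) {
          d  <- haversine_km(ship$lat[i], ship$lon[i], sat$lat, sat$lon)
          dt <- abs(as.numeric(difftime(sat$time, ship$time[i], units = "mins")))
          sat[d <= max_km & dt <= max_min, , drop = FALSE]  # matching retrievals
        })
      }

      ship <- data.frame(lat = 40.0, lon = -70.0,
                         time = as.POSIXct("2016-08-01 12:00", tz = "UTC"))
      sat  <- data.frame(lat = c(40.1, 42.0), lon = c(-70.1, -70.0),
                         aod = c(0.12, 0.30),
                         time = as.POSIXct("2016-08-01 12:10", tz = "UTC"))
      collocate(ship, sat)[[1]]   # only the nearby retrieval (aod = 0.12) matches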

  12. Mass-storage management for distributed image/video archives

    NASA Astrophysics Data System (ADS)

    Franchi, Santina; Guarda, Roberto; Prampolini, Franco

    1993-04-01

    The realization of an image/video database requires a specific design for both database structures and mass storage management. This issue was addressed in the design of the digital image/video database system developed at the IBM SEMEA Scientific & Technical Solution Center. Proper database structures have been defined to catalog image/video coding techniques with their related parameters, and to describe image/video contents. User workstations and servers are distributed along a local area network. Image/video files are not managed directly by the DBMS server: because of their large size, they are stored outside the database on network devices. The database contains pointers to the image/video files and descriptions of the storage devices. The system can use different kinds of storage media, organized in a hierarchical structure. Three levels of functions are available to manage the storage resources. The functions of the lower level provide media management: they allow cataloging of devices and modification of device status and device network location. The medium level manages image/video files on a physical basis, handling file migration between high-capacity media and low-access-time media. The functions of the upper level work on image/video files on a logical basis, as they archive, move and copy image/video data selected by user-defined queries. These functions are used to support the implementation of a storage management strategy. The database information about the characteristics of both storage devices and coding techniques is used by the third-level functions to meet delivery/visualization requirements and to reduce archiving costs.

  13. Exploring the Potential of a Global Emerging Contaminant Early Warning Network through the Use of Retrospective Suspect Screening with High-Resolution Mass Spectrometry.

    PubMed

    Alygizakis, Nikiforos A; Samanipour, Saer; Hollender, Juliane; Ibáñez, María; Kaserzon, Sarit; Kokkali, Varvara; van Leerdam, Jan A; Mueller, Jochen F; Pijnappels, Martijn; Reid, Malcolm J; Schymanski, Emma L; Slobodnik, Jaroslav; Thomaidis, Nikolaos S; Thomas, Kevin V

    2018-05-01

    A key challenge in the environmental and exposure sciences is to establish experimental evidence of the role of chemical exposure in human and environmental systems. High-resolution accurate-mass tandem mass spectrometry (HRMS) is increasingly being used for the analysis of environmental samples. One lauded benefit of HRMS is the possibility of retrospectively processing data for previously omitted compounds, which has led to the archiving of HRMS data. Archived HRMS data afford the possibility of exploiting historical data to rapidly and effectively establish the temporal and spatial occurrence of newly identified contaminants through retrospective suspect screening. We propose to establish a global emerging-contaminant early-warning network to rapidly assess the spatial and temporal distribution of contaminants of emerging concern in environmental samples by performing retrospective analysis on HRMS data. The effectiveness of such a network is demonstrated through a pilot study, in which eight reference laboratories with available archived HRMS data retrospectively screened data acquired from aqueous environmental samples collected in 14 countries on 3 continents. The widespread spatial occurrence of several surfactants (e.g., polyethylene glycols (PEGs) and C12AEO-PEGs), transformation products of selected drugs (e.g., gabapentin-lactam, metoprolol-acid, carbamazepine-10-hydroxy, omeprazole-4-hydroxy-sulfide, and 2-benzothiazole-sulfonic-acid), and industrial chemicals (3-nitrobenzenesulfonate and bisphenol-S) was revealed. Obtaining identifications of increased reliability through retrospective suspect screening is challenging, and recommendations for dealing with issues such as broad chromatographic peaks, data acquisition, and sensitivity are provided.

  14. The Italian National Seismic Network

    NASA Astrophysics Data System (ADS)

    Michelini, Alberto

    2016-04-01

    The Italian National Seismic Network is composed of about 400 stations, mainly broadband, installed in the country and in the surrounding regions. About 110 stations also feature collocated strong-motion instruments. The Centro Nazionale Terremoti (National Earthquake Center, CNT) has installed and operates most of these stations, although a considerable number of stations contributing to INGV surveillance have been installed and are maintained by other INGV sections (Napoli, Catania, Bologna, Milano) or by other Italian or European institutions. The important technological upgrades carried out in recent years have allowed significant improvements in the seismic monitoring of Italy and of the Euro-Mediterranean countries. The adopted data transmission systems include satellite links, wireless connections and wired lines, and the SeedLink protocol has been adopted for data transmission. INGV is a primary node of EIDA (European Integrated Data Archive) for archiving and distributing continuous, quality-checked data. The data acquisition system was designed to accomplish, in near real time, automatic earthquake detection and hypocenter and magnitude determination (moment tensors, shake maps, etc.). Database archiving of all parametric results is closely linked to the existing procedures of the INGV seismic monitoring environment. Overall, the Italian earthquake surveillance service provides, in quasi real time, hypocenter parameters, which are then revised routinely by the analysts of the Bollettino Sismico Nazionale. The results are published on the web page http://cnt.rm.ingv.it/ and are publicly available to both the scientific community and the general public. This presentation will describe the various activities and resulting products of the Centro Nazionale Terremoti, spanning from data acquisition to archiving, distribution and specialised products.

  15. Development and implementation of ultrasound picture archiving and communication system

    NASA Astrophysics Data System (ADS)

    Weinberg, Wolfram S.; Tessler, Franklin N.; Grant, Edward G.; Kangarloo, Hooshang; Huang, H. K.

    1990-08-01

    The Department of Radiological Sciences at the UCLA School of Medicine is developing an archiving and communication system (PACS) for digitized ultrasound images. In its final stage the system will involve the acquisition and archiving of ultrasound studies from four different locations, including the Center for Health Sciences, the Department for Mental Health, and the Outpatient Radiology and Endoscopy Departments, with a total of 200-250 patient studies per week. The concept comprises two stages of image manipulation for each ultrasound work area. The first station is located close to the examination site, accommodates the acquisition of digital images from up to five ultrasound devices, and provides for instantaneous display, primary viewing and image selection. Completed patient studies are transferred to a main workstation for secondary review, further analysis and comparison studies. The review station has an on-line storage capacity of 10,000 images with a resolution of 512×512 at 8 bits, allowing immediate retrieval of active patient studies for up to two weeks. The main workstations are connected through the general network and use one central archive for long-term storage and a film printer for hardcopy output. First-phase development efforts concentrate on the implementation and testing of a system at one location, consisting of a number of ultrasound units with video digitizers and network interfaces, and a microcomputer workstation as host for the display station with two color monitors, each allowing simultaneous display of four 512×512 images. The discussion emphasizes functionality, performance and acceptance of the system in the clinical environment.

  16. Water use demand in the Crans-Montana-Sierre region (Switzerland)

    NASA Astrophysics Data System (ADS)

    Bonriposi, M.; Reynard, E.

    2012-04-01

    Crans-Montana-Sierre is an Alpine tourist region located in the driest area of Switzerland (Rhone River Valley, Canton of Valais), with both winter (ski) and summer (e.g. golf) tourist activities. Climate change as well as societal and economic development will significantly modify the supply and consumption of water in the future and, consequently, may fuel conflicts of interest. Within the framework of the MontanAqua project (www.montanaqua.ch), we are researching more sustainable water management options based on the coordination and adaptation of water demand to water availability under changing biophysical and socioeconomic conditions. This work aims to quantify current water uses in the area and to consider future scenarios (around 2050). We have focused on the temporal and spatial characteristics of resource demand, in order to estimate the spatial footprint of water use (drinking water, hydropower production, irrigation and artificial snowmaking), in terms of system, infrastructure, and organisation of supply. We have then quantified these as precisely as possible (at the monthly temporal scale and at the municipal spatial scale). When the quantity of water was not measurable for practical reasons or for lack of data, as in the case of irrigation or snowmaking, an alternative approach was applied: instead of quantifying how much water was used, emphasis was placed on the water needed to irrigate agricultural land or on the optimal meteorological conditions necessary to produce artificial snow. The current estimate of regional water consumption is characterized by a large summer peak and a smaller winter peak. The summer peak is caused mainly by irrigation and secondarily by drinking water demand. The winter peak is essentially due to drinking water and snowmaking. Other consumption peaks exist at the municipal scale but cannot be observed at the regional scale. The results show major variation in water demand between the 11 municipalities concerned and between the various uses. All this confirms the necessity of modelling future water demand, which would allow prediction of possible future use conflicts. In a second phase of the project, the collected data will be introduced into the WEAP (Water Evaluation And Planning) model, in order to estimate the future water demand of the Crans-Montana-Sierre region. This hydrologic model is distinct from most similar models because of its ability to integrate climate and socio-economic scenarios (Hansen, 1994). Reference: Hansen, E. 1994. WEAP - A system for tackling water resource problems. In Water Management Europe 1993/94: An Annual Review of the European Water and Wastewater Industry. Stockholm Environment Institute: Stockholm.

  17. Digital preservation of a highway photolog film archive in Connecticut.

    DOT National Transportation Integrated Search

    2014-01-28

    The Connecticut Department of Transportation has been photologging their transportation network : for over forty years. Photologging at a minimum refers to the use of an instrumented vehicle, which is : designed to capture successive photographs of t...

  18. The World Radiation Monitoring Center of the Baseline Surface Radiation Network: Status 2017

    NASA Astrophysics Data System (ADS)

    Driemel, Amelie; König-Langlo, Gert; Sieger, Rainer; Long, Charles N.

    2017-04-01

    The World Radiation Monitoring Center (WRMC) is the central archive of the Baseline Surface Radiation Network (BSRN). The BSRN was initiated by the World Climate Research Programme (WCRP) Working Group on Radiative Fluxes and began operations in 1992. One of its aims is to provide short- and long-wave surface radiation fluxes of the best possible quality to support the research projects of the WCRP and other scientific projects. The high-quality, uniform and consistent measurements of the BSRN network can be used to monitor the short- and long-wave radiative components and their changes with the best methods currently available, to validate and evaluate satellite-based estimates of the surface radiative fluxes, and to verify the results of global climate models. In 1992 the BSRN/WRMC started at ETH Zurich, Switzerland with 9 stations. Since 2007 the archive has been hosted by the Alfred-Wegener-Institut (AWI) in Bremerhaven, Germany (http://www.bsrn.awi.de/) and comprises a network of currently 59 stations in contrasting climatic zones, covering a latitude range from 80°N to 90°S. Of the 59 stations, 23 offer the complete radiation budget (down- and upwelling short- and long-wave data). In addition to the ftp-service access instituted at ETH Zurich, the archive at AWI offers data access via PANGAEA - Data Publisher for Earth & Environmental Science (https://www.pangaea.de). PANGAEA guarantees the long-term availability of its content through a commitment of the operating institutions. Within PANGAEA, the metadata of the stations are freely available. To access the data itself, an account is required; scientists who agree to follow the data release guidelines of the archive (http://bsrn.awi.de/data/conditions-of-data-release/) can obtain an account from amelie.driemel@awi.de. Currently, more than 9,400 station-months (>780 years) are available for interested scientists (see also https://dataportals.pangaea.de/bsrn/?q=LR0100 for an overview of available data). After many years of excellent service as director of the WRMC, Gert König-Langlo retires in 2017. He is handing over his duties to the current WRMC data curator, Amelie Driemel, who will continue this important task in the years to come.

  19. The global Landsat archive: Status, consolidation, and direction

    USGS Publications Warehouse

    Wulder, Michael A.; White, Joanne C.; Loveland, Thomas; Woodcock, Curtis; Belward, Alan; Cohen, Warren B.; Fosnight, Eugene A.; Shaw, Jerad; Masek, Jeffery G.; Roy, David P.

    2016-01-01

    New and previously unimaginable Landsat applications have been fostered by a policy change in 2008 that made analysis-ready Landsat data free and open access. Since 1972, Landsat has been collecting images of the Earth, with the early years of the program constrained by onboard satellite and ground systems, as well as limitations across the range of required computing, networking, and storage capabilities. Rather than robust on-satellite storage for transmission via high-bandwidth downlink to a centralized storage and distribution facility, as with Landsat-8, a network of receiving stations was utilized: one operated by the U.S. government, the others by a community of International Cooperators (ICs). ICs paid a fee for the right to receive and distribute Landsat data, and over time more Landsat data was held outside the archive of the United States Geological Survey (USGS) than was held inside, much of it unique. Recognizing the critical value of these data, the USGS began a Landsat Global Archive Consolidation (LGAC) initiative in 2010 to bring these data into a single, universally accessible, centralized global archive, housed at the Earth Resources Observation and Science (EROS) Center in Sioux Falls, South Dakota. The primary LGAC goals are to inventory the data held by ICs, acquire the data, and ingest and apply standard ground station processing to generate an L1T analysis-ready product. As of January 1, 2015 there were 5,532,454 images in the USGS archive. LGAC has contributed approximately 3.2 million of those images, more than doubling the original USGS archive holdings. Moreover, an additional 2.3 million images have been identified to date through the LGAC initiative and are in the process of being added to the archive. The impact of LGAC is significant and, in terms of images in the collection, analogous to having had two additional Landsat-5 missions. As a result of LGAC, there are regions of the globe that now have markedly improved Landsat data coverage, resulting in an enhanced capacity for mapping, monitoring change, and capturing historic conditions. Although future missions can be planned and implemented, the past cannot be revisited, underscoring the value and enhanced significance of historical Landsat data and the LGAC initiative. The aim of this paper is to report the current status of the global USGS Landsat archive, document the existing and anticipated contributions of LGAC to the archive, and characterize the current acquisitions of Landsat-7 and Landsat-8. Landsat-8 is adding data to the archive at an unprecedented rate, as nearly all terrestrial images are now collected. We also offer key lessons learned so far from the LGAC initiative, plus insights regarding other critical elements of the Landsat program looking forward, such as acquisition, continuity, temporal revisit, and the importance of continuing to operationalize the Landsat program.

  20. Communications among data and science centers

    NASA Technical Reports Server (NTRS)

    Green, James L.

    1990-01-01

    The ability to electronically access and query the contents of remote computer archives is of singular importance in the space and earth sciences; the present evaluation of the development status of such on-line information networks foresees swift expansion of their data capabilities and complexity, in view of the volumes of data that will continue to be generated by NASA missions. The U.S. National Space Science Data Center (NSSDC) manages NASA's largest science computer network, the Space Physics Analysis Network; a comprehensive account is given of the structure of NSSDC international access through BITNET, and of connections to the NSSDC available in the Americas via the International X.25 network.

  1. Measurements of CO2 Mole Fraction and δ13C in Archived Air Samples from Cape Meares, Oregon (USA) 1977 - 1998

    NASA Astrophysics Data System (ADS)

    Clark, O.; Rice, A. L.

    2017-12-01

    Carbon dioxide (CO2) is the most abundant anthropogenically forced greenhouse gas (GHG) in the global atmosphere. Emissions of CO2 account for approximately 75% of the world's total GHG emissions. Atmospheric concentrations of CO2 are higher now than they have been at any other time in the past 800,000 years; currently, the global mean concentration exceeds 400 ppm. Today, global networks regularly monitor CO2 concentrations and isotopic composition (δ13C and δ18O), but past data are sparse. Over 200 ambient air samples from Cape Meares, Oregon (45.5°N, 124.0°W), a coastal site in the western United States, were obtained by researchers at the Oregon Graduate Institute of Science and Technology (OGI, now part of Oregon Health & Science University) between 1977 and 1998, as part of a global monitoring program covering six sites in the polar, middle, and tropical latitudes of the Northern and Southern Hemispheres. Air liquefaction was used to compress approximately 1000 L of air (STP) to 30 bar in 33 L electropolished (SUMMA) stainless steel canisters. Select archived air samples from the original network are maintained at the Portland State University (PSU) Department of Physics. These archived samples offer a valuable record of changing atmospheric concentrations of CO2 and δ13C, which can contribute to a better understanding of changes in sources during this period. CO2 concentrations and δ13C of CO2 were measured at PSU with a Picarro model G1101-i cavity ring-down spectrometer. This study presents the analytical methods used, calibration techniques, precision, and reproducibility. Measurements of select samples from the archive show rising CO2 concentrations and falling δ13C over the 1977 to 1998 period, consistent with previous observations and rising anthropogenic sources of CO2. The resulting data set was statistically analyzed in MATLAB. Results of preliminary seasonal and secular trends from the archive samples are presented.
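
    A common way to separate the secular and seasonal components of a record like this is to regress the concentration on a linear trend plus annual harmonics. The R sketch below uses synthetic data (the study itself used MATLAB; the numbers here are illustrative, not the Cape Meares measurements).

      # Hedged sketch with synthetic data: secular trend plus first-order
      # annual harmonics for a monthly CO2 record.
      set.seed(7)
      t   <- seq(1977, 1998, by = 1 / 12)                  # decimal years
      co2 <- 335 + 1.5 * (t - 1977) +                      # secular growth
             3 * sin(2 * pi * t) + 2 * cos(2 * pi * t) +   # seasonal cycle
             rnorm(length(t), sd = 0.5)                    # measurement noise

      fit <- lm(co2 ~ I(t - 1977) + sin(2 * pi * t) + cos(2 * pi * t))
      b <- coef(fit)
      unname(b[2])                  # estimated secular growth rate, ppm per year
      unname(sqrt(b[3]^2 + b[4]^2)) # estimated seasonal amplitude, ppm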

  2. Ensuring long-term reliability of the data storage on optical disc

    NASA Astrophysics Data System (ADS)

    Chen, Ken; Pan, Longfa; Xu, Bin; Liu, Wei

    2008-12-01

    "Quality requirements and handling regulation of archival optical disc for electronic records filing" is released by The State Archives Administration of the People's Republic of China (SAAC) on its network in March 2007. This document established a complete operative managing process for optical disc data storage in archives departments. The quality requirements of the optical disc used in archives departments are stipulated. Quality check of the recorded disc before filing is considered to be necessary and the threshold of the parameter of the qualified filing disc is set down. The handling regulations for the staffs in the archives departments are described. Recommended environment conditions of the disc preservation, recording, accessing and testing are presented. The block error rate of the disc is selected as main monitoring parameter of the lifetime of the filing disc and three classes pre-alarm lines are created for marking of different quality check intervals. The strategy of monitoring the variation of the error rate curve of the filing discs and moving the data to a new disc or a new media when the error rate of the disc reaches the third class pre-alarm line will effectively guarantee the data migration before permanent loss. Only when every step of the procedure is strictly implemented, it is believed that long-term reliability of the data storage on optical disc for archives departments can be effectively ensured.

  3. Preserving the Pyramid of STI Using Buckets

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Maly, Kurt

    2004-01-01

    The product of research projects is information. Through the life cycle of a project, information comes from many sources and takes many forms. Traditionally, this body of information is summarized in a formal publication, typically a journal article. While formal publications enjoy the benefits of peer review and technical editing, they are also often compromises in media format and length. As such, we consider a formal publication to represent an abstract of a larger body of work: a pyramid of scientific and technical information (STI). While this abstract may be sufficient for some applications, an in-depth use or analysis is likely to require the supporting layers of the pyramid. We have developed buckets to preserve this pyramid of STI. Buckets provide an archive- and protocol-independent container construct in which all related information objects can be logically grouped together, archived, and manipulated as a single object. Furthermore, buckets are active archival objects and can communicate with each other, with people, or with arbitrary network services. Buckets are an implementation of the Smart Object, Dumb Archive (SODA) digital library (DL) model. In SODA, data objects are more important than the archives that hold them: much of the functionality traditionally associated with archives is pushed down into the objects, such as enforcing terms and conditions, negotiating display, and content maintenance. In this paper, we discuss the motivation, design, and implications of bucket use in DLs with respect to grey literature.

  4. Project MICAS: a multivendor open-system incremental approach to implementing an integrated enterprise-wide PACS: works in progress

    NASA Astrophysics Data System (ADS)

    Smith, Edward M.; Wright, Jeffrey; Fontaine, Marc T.; Robinson, Arvin E.

    1998-07-01

    The Medical Information, Communication and Archive System (MICAS) is a multi-vendor, incremental approach to PACS. MICAS is a multi-modality integrated image management system that incorporates the radiology information system (RIS) and radiology image database (RID), with future 'hooks' to other hospital databases. Even though this approach to PACS is riskier than a single-vendor turnkey approach, it offers significant advantages. The vendors involved in the initial phase of MICAS are IDX Corp., ImageLabs, Inc. and Digital Equipment Corp. (DEC). The network architecture operates at 100 Mbit/s except between the modalities and the stackable intelligent switch, which is used to segment MICAS by modality. Each modality segment contains the acquisition engine for the modality, a temporary archive and one or more diagnostic workstations. All archived studies are available at all workstations, but there is no permanent archive at this time. At present, the RIS vendor is responsible for study acquisition and workflow as well as maintenance of the temporary archive. Management of study acquisition, workflow and the permanent archive will become the responsibility of the archive vendor when the archive is installed in the second quarter of 1998. The modalities currently interfaced to MICAS are MRI, CT and a Howtek film digitizer, with Nuclear Medicine and computed radiography (CR) to be added when the permanent archive is installed. There are six dual-monitor diagnostic workstations using ImageLabs Shared Vision viewer software, located in the MRI, CT, Nuclear Medicine and musculoskeletal reading areas, with two in Radiology's main reading area. One of the major lessons learned to date is that the permanent archive should have been part of the initial MICAS installation, and that the archive vendor, rather than the RIS vendor, should have been responsible for image acquisition. Currently an archive vendor is being selected who will be responsible for management of the archive plus the HIS/RIS interface, image acquisition, the modality worklist manager and interfacing to the current DICOM viewer software. The next phase of MICAS will include interfacing ultrasound, locating servers outside of the Radiology LAN to support the distribution of images and reports to the clinical floors and physician offices both within and outside of the University of Rochester Medical Center (URMC) campus, and the teaching archive.

  5. Medical information, communication, and archiving system (MICAS): Phase II integration and acceptance testing

    NASA Astrophysics Data System (ADS)

    Smith, Edward M.; Wandtke, John; Robinson, Arvin E.

    1999-07-01

    The Medical Information, Communication and Archive System (MICAS) is a multi-modality integrated image management system that is seamlessly integrated with the Radiology Information System (RIS). The project was initiated in the summer of 1995, with the first phase installed during the first half of 1997 and the second phase during the summer of 1998. Phase II enhancements include a permanent archive, automated workflow including a modality worklist, study caches, and NT diagnostic workstations, with all components adhering to Digital Imaging and Communications in Medicine (DICOM) standards. This multi-vendor, phased approach to PACS implementation is designed as an enterprise-wide PACS providing images and reports throughout our healthcare network. MICAS demonstrates that a multi-vendor, open-system, phased approach to PACS is feasible, cost-effective, and has significant advantages over a single-vendor implementation.

  6. Migration of medical image data archived using mini-PACS to full-PACS.

    PubMed

    Jung, Haijo; Kim, Hee-Joung; Kang, Won-Suk; Lee, Sang-Ho; Kim, Sae-Rome; Ji, Chang Lyong; Kim, Jung-Han; Yoo, Sun Kook; Kim, Ki-Hwang

    2004-06-01

    This study evaluated the migration to full-PACS of medical image data archived using mini-PACS at two hospitals of the Yonsei University Medical Center, Seoul, Korea. A major concern in the migration of medical data is matching the image data from the mini-PACS with the hospital OCS (Order Communication System). Prior to carrying out the actual migration, the principles, methods, and anticipated results of the migration were evaluated with respect to both cost and effectiveness. Migration gateway workstations were established and a migration software tool was developed. The actual migration was performed based on the results of several migration simulations. We conclude that a migration plan should be carefully prepared and tailored to the individual hospital environment, because the server system, archive media, network, OCS, and data management policy may be unique to each institution.

  7. EpiContactTrace: an R-package for contact tracing during livestock disease outbreaks and for risk-based surveillance.

    PubMed

    Nöremark, Maria; Widgren, Stefan

    2014-03-17

    During outbreaks of livestock disease, contact tracing can be an important part of disease control. Animal movements can also be of relevance for risk-based surveillance and sampling, i.e., when assessing either the consequences or the likelihood of introduction. In many countries, animal movement data are collected with enabling contact tracing as one of the major objectives. However, an analytical step is often needed to retrieve appropriate information for contact tracing or surveillance. In this study, an open-source tool was developed to structure livestock movement data to facilitate contact tracing in real time during disease outbreaks and to provide input for risk-based surveillance and sampling. The tool, EpiContactTrace, was written in the R language and uses the network parameters in-degree, out-degree, ingoing contact chain and outgoing contact chain (also called infection chain), which are relevant for backward and forward tracing, respectively. The time frames for backward and forward tracing can be specified independently, and searches can be done on one farm at a time or for all farms within the dataset. Different outputs are available: datasets with network measures, contacts visualised on a map, and automatically generated reports for each farm in either HTML or PDF format, intended for the end users, i.e., the veterinary authorities, regional disease control officers and field veterinarians. EpiContactTrace is available as an R package at the R project website (http://cran.r-project.org/web/packages/EpiContactTrace/). We believe this tool can help in disease control, since it can rapidly structure essential contact information from large datasets. The reproducible reports make the tool robust and independent of manual compilation of data, and the open-source code makes it accessible and easily adaptable to different needs.
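
    The network measures named above are simple to compute on a movement table. The toy sketch below uses base R only and is not the EpiContactTrace API (whose function names and movement data format differ).

      # Toy illustration (not the EpiContactTrace API): in-degree and
      # out-degree of a holding within a tracing time window.
      movements <- data.frame(
        source      = c("A", "B", "B", "C", "D"),
        destination = c("B", "C", "D", "D", "A"),
        t           = as.Date("2024-01-01") + c(1, 3, 5, 8, 9)
      )

      degree <- function(mov, root, t_end, days) {
        win <- mov[mov$t > t_end - days & mov$t <= t_end, ]
        c(in_degree  = length(unique(win$source[win$destination == root])),
          out_degree = length(unique(win$destination[win$source == root])))
      }

      degree(movements, root = "D", t_end = as.Date("2024-01-10"), days = 30)
      # in_degree 2 (from B and C), out_degree 1 (to A)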

  8. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, E. C. (Editor)

    1989-01-01

    Archival reports on developments in programs managed by the Jet Propulsion Laboratory's Office of Telecommunications and Data Acquisition are provided. Space communications, radio navigation, radio science, and ground based radio and radio astronomy are discussed. Deep Space Network projects are also discussed.

  9. COMBINE archive and OMEX format: one file to share all information to reproduce a modeling project.

    PubMed

    Bergmann, Frank T; Adams, Richard; Moodie, Stuart; Cooper, Jonathan; Glont, Mihai; Golebiewski, Martin; Hucka, Michael; Laibe, Camille; Miller, Andrew K; Nickerson, David P; Olivier, Brett G; Rodriguez, Nicolas; Sauro, Herbert M; Scharm, Martin; Soiland-Reyes, Stian; Waltemath, Dagmar; Yvon, Florent; Le Novère, Nicolas

    2014-12-14

    With the ever-increasing use of computational models in the biosciences, the need to share models and reproduce the results of published studies efficiently and easily is becoming more important. To this end, various standards have been proposed that can be used to describe models, simulations, data or other essential information in a consistent fashion. These constitute the various separate components required to reproduce a given published scientific result. We describe the Open Modeling EXchange format (OMEX). Together with the use of other standard formats from the Computational Modeling in Biology Network (COMBINE), OMEX is the basis of the COMBINE Archive, a single file that supports the exchange of all the information necessary for a modeling and simulation experiment in biology. An OMEX file is a ZIP container that includes a manifest file listing the content of the archive, an optional metadata file adding information about the archive and its content, and the files describing the model. The content of a COMBINE Archive consists of files encoded in COMBINE standards whenever possible, but may include additional files defined by an Internet Media Type. Several tools that support the COMBINE Archive are available, either as independent libraries or embedded in modeling software. The COMBINE Archive facilitates the reproduction of modeling and simulation experiments in biology by embedding all the relevant information in one file. Having all the information stored and exchanged at once also helps in building activity logs and audit trails. We anticipate that the COMBINE Archive will become a significant help for modellers, as the domain moves to larger, more complex experiments such as multi-scale models of organs, digital organisms, and bioengineering.
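
    Because an OMEX file is just a ZIP container with a manifest, a minimal archive can be assembled in a few lines of R. The sketch below writes an illustrative placeholder model and manifest (the format URIs follow the COMBINE identifiers.org convention) and relies on utils::zip, which requires a system zip utility on the path.

      # Minimal sketch of an OMEX-style COMBINE Archive: content files plus a
      # manifest.xml listing them. The model file here is a placeholder.
      writeLines("<sbml><!-- placeholder model --></sbml>", "model.xml")

      manifest <- c(
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<omexManifest xmlns="http://identifiers.org/combine.specifications/omex-manifest">',
        '  <content location="." format="http://identifiers.org/combine.specifications/omex"/>',
        '  <content location="./model.xml" format="http://identifiers.org/combine.specifications/sbml"/>',
        '</omexManifest>'
      )
      writeLines(manifest, "manifest.xml")

      zip("example.omex", files = c("manifest.xml", "model.xml"))  # shells out to zip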

  10. MAGNAMWAR: an R package for genome-wide association studies of bacterial orthologs.

    PubMed

    Sexton, Corinne E; Smith, Hayden Z; Newell, Peter D; Douglas, Angela E; Chaston, John M

    2018-06-01

    Here we report on an R package for genome-wide association studies of orthologous genes in bacteria. Before using the software, orthologs from bacterial genomes or metagenomes are defined using local or online implementations of OrthoMCL. These presence-absence patterns are statistically associated with variation in user-collected phenotypes using the Mono-Associated GNotobiotic Animals Metagenome-Wide Association R package (MAGNAMWAR). Genotype-phenotype associations can be performed with several different statistical tests based on the type and distribution of the data. MAGNAMWAR is available on CRAN. john_chaston@byu.edu.
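
    The association step can be illustrated without the package itself. The sketch below is a bare-bones stand-in, not the MAGNAMWAR API: each ortholog's presence/absence pattern is tested against a quantitative phenotype with a Wilcoxon test, followed by Benjamini-Hochberg correction (MAGNAMWAR itself offers several tests chosen by data type).

      # Bare-bones sketch (not the MAGNAMWAR API): one presence/absence test
      # per ortholog, with multiple-testing correction across orthologs.
      set.seed(3)
      n_strains <- 40
      presence  <- matrix(rbinom(n_strains * 5, 1, 0.5), n_strains, 5,
                          dimnames = list(NULL, paste0("OG", 1:5)))
      phenotype <- 2 * presence[, "OG1"] + rnorm(n_strains)  # OG1 drives it

      p <- apply(presence, 2, function(g)
        wilcox.test(phenotype[g == 1], phenotype[g == 0])$p.value)
      sort(p.adjust(p, method = "BH"))   # OG1 should rank first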

  11. Neuroimaging Data Sharing on the Neuroinformatics Database Platform

    PubMed Central

    Book, Gregory A; Stevens, Michael; Assaf, Michal; Glahn, David; Pearlson, Godfrey D

    2015-01-01

    We describe the Neuroinformatics Database (NiDB), an open-source database platform for archiving, analysis, and sharing of neuroimaging data. Data from the multi-site projects Autism Brain Imaging Data Exchange (ABIDE), Bipolar-Schizophrenia Network on Intermediate Phenotypes parts one and two (B-SNIP1, B-SNIP2), and Monetary Incentive Delay task (MID) are available for download from the public instance of NiDB, with more projects to share data as they become available. As demonstrated by making several large datasets available, NiDB is an extensible platform well suited to archiving and distributing shared neuroimaging data. PMID:25888923

  12. ["My first encounter with German urology (1937)". Stefan Wesolowski (1908-2009) - a source in the archives of the German Society for Urology from the oldest corresponding member and promoter of Polish-German relationships].

    PubMed

    Moll, F H; Krischel, M; Zajaczkowski, T; Rathert, P

    2010-10-01

    A source in the archives of the German Society of Urology gives us a vivid insight into the situation in Berlin during the 1930s from the perspective of a young Polish doctor, and presents the situation at one of the leading urology institutions of the time in Germany. Furthermore, we learn about the social situation in hospitals as well as the discourse and networking taking place in the scientific community at that time.

  13. Exploring Digisonde Ionogram Data with SAO-X and DIDBase

    NASA Astrophysics Data System (ADS)

    Khmyrov, Grigori M.; Galkin, Ivan A.; Kozlov, Alexander V.; Reinisch, Bodo W.; McElroy, Jonathan; Dozois, Claude

    2008-02-01

    A comprehensive suite of software tools for ionogram data analysis and archiving has been developed at UMLCAR to support the exploration of raw and processed data from the worldwide network of digisondes in a low-latency, user-friendly environment. Paired with the remotely accessible Digital Ionogram Data Base (DIDBase), the SAO Explorer software serves as an example of how an academic institution conscientiously manages its resident data archive while local experts continue to work on design of new and improved data products, all in the name of free public access to the full roster of acquired ionospheric sounding data.

  14. NASAwide electronic publishing system: Electronic printing and duplicating, stage-2 evaluation report (GSFC)

    NASA Technical Reports Server (NTRS)

    Tuey, Richard C.; Lane, Robert; Hart, Susan V.

    1995-01-01

    The NASA Scientific and Technical Information Office was assigned the responsibility of continuing the expansion of the NASAwide networked electronic duplicating effort by including the Goddard Space Flight Center (GSFC) as an additional node in the existing configuration of networked electronic duplicating systems within NASA. The subject of this report is the evaluation of a networked electronic duplicating system that meets the duplicating requirements and expands electronic publishing capabilities without increasing current operating costs. This report continues the evaluation reported in 'NASA Electronic Publishing System - Electronic Printing and Duplicating Evaluation Report' (NASA TM-106242) and 'NASA Electronic Publishing System - Stage 1 Evaluation Report' (NASA TM-106510). It differs from the previous reports in its inclusion of an external networked desktop editing, archival, and publishing functionality that did not exist in the previous networked electronic duplicating system. Additionally, a two-phase approach to the evaluation was undertaken: the first phase was a paper study justifying a 90-day, on-site evaluation, and the second phase validated, during the 90-day evaluation, the cost benefits and productivity increases that could be achieved in an operational mode. A benchmark of the functionality of the networked electronic publishing system and the external networked desktop editing, archival, and publishing system was performed under a simulated daily production environment. This report can be used to guide others in determining the most cost-effective duplicating/publishing alternative through the use of cost/benefit analysis and return-on-investment techniques. A treatise on the use of these techniques can be found in 'NASA Electronic Publishing System - Cost/Benefit Methodology' (NASA TM-106662).

  15. Tools to manage the enterprise-wide picture archiving and communications system environment.

    PubMed

    Lannum, L M; Gumpf, S; Piraino, D

    2001-06-01

    The presentation will focus on the implementation and utilization of a central picture archiving and communications system (PACS) network-monitoring tool that allows for enterprise-wide operations management and support of the image distribution network. The MagicWatch (Siemens, Iselin, NJ) PACS/radiology information system (RIS) monitoring station has allowed our organization to create a service support structure that gives us proactive control of our environment and meets the service-level performance expectations of the users. The Radiology Help Desk has used the MagicWatch PACS monitoring station as an applications support tool to monitor network activity and individual system performance at each node. Fast and timely recognition of the effects of single events within the PACS/RIS environment has enabled the group to identify possible performance issues proactively and resolve problems. The PACS operations group performs network management control, image storage management, and software distribution management from a single, central point in the enterprise. The MagicWatch station allows for complete automation of the software distribution, installation, and configuration process across all nodes in the system. The tool has allowed for the standardization of the workstations and provides central configuration control for the establishment and maintenance of system standards. This report will describe PACS management and operation prior to the implementation of the MagicWatch PACS monitoring station and will highlight the operational benefits of a centralized network- and system-monitoring tool.

  16. Design and clinical evaluation of a high-capacity digital image archival library and high-speed network for the replacement of cinefilm in the cardiac angiography environment

    NASA Astrophysics Data System (ADS)

    Cusma, Jack T.; Spero, Laurence A.; Groshong, Bennett R.; Cho, Teddy; Bashore, Thomas M.

    1993-09-01

    An economical and practical digital solution for the replacement of 35 mm cine film as the archive medium in the cardiac x-ray imaging environment has so far remained lacking, due to the demanding requirements of high capacity, high acquisition rate, high transfer rate, and the need for application in a distributed environment. A clinical digital image library and network based on the D2 digital video format has been installed in the Duke University Cardiac Catheterization Laboratory. The system architecture includes a central image library with digital video recorders and robotic tape retrieval, three acquisition stations, and remote review stations connected via a serial image network. The library has a capacity of over 20,000 gigabytes of uncompressed image data, equivalent to records for approximately 20,000 patients. Image acquisition in the clinical laboratories is via a real-time digital interface between the digital angiography system and a local digital recorder. Images are transferred to the library over the serial network at a rate of 14.3 Mbytes/sec and permanently stored for later review. The image library and network are currently undergoing a clinical comparison with cine film for visual and quantitative assessment of coronary artery disease. At the conclusion of the evaluation, the configuration will be expanded to include four additional catheterization laboratories and remote review stations throughout the hospital.

  17. System Security Authorization Agreement (SSAA) for the WIRE Archive and Research Facility

    NASA Technical Reports Server (NTRS)

    2002-01-01

    The Wide-Field Infrared Explorer (WIRE) Archive and Research Facility (WARF) is operated and maintained by the Department of Physics, USAF Academy. The lab is located in Fairchild Hall, 2354 Fairchild Dr., Suite 2A103, USAF Academy, CO 80840. The WARF will be used for research and education in support of the NASA Wide-Field Infrared Explorer (WIRE) satellite, and for related high-precision photometry missions and activities. The WARF will also contain the WIRE preliminary and final archives prior to their delivery to the National Space Science Data Center (NSSDC). The WARF consists of a suite of equipment purchased under several NASA grants in support of WIRE research. The core system consists of a Red Hat Linux workstation with twin 933 MHz PIII processors, 1 GB of RAM, 133 GB of hard disk space, and DAT and DLT tape drives. The WARF is also supported by several additional networked Linux workstations. Only one of these (an older 450 MHz PIII computer running Red Hat Linux) is currently running, but the addition of several more is expected over the next year. In addition, a printer will soon be added. The WARF will serve as the primary research facility for the analysis and archiving of data from the WIRE satellite, together with limited quantities of other high-precision astronomical photometry data from both ground- and space-based facilities. However, the archive created here will not be the final archive; rather, the archive will be duplicated at the NSSDC, and public access to the data will generally take place through that site.

  18. Commercial imagery archive product development

    NASA Astrophysics Data System (ADS)

    Sakkas, Alysa

    1999-12-01

    The Lockheed Martin (LM) team had garnered over a decade of operational experience in digital imagery management and analysis for the US Government at numerous worldwide sites. Recently, it set out to create a new commercial product to serve the needs of large-scale imagery archiving and analysis markets worldwide. LM decided to provide a turnkey commercial solution to receive, store, retrieve, process, analyze and disseminate imagery in 'push' or 'pull' modes, combining commercially available components with its own adapted and newly developed algorithms to provide added functionality not commercially available elsewhere. The resultant product, the Intelligent Library System, satisfies requirements for (a) a potentially unbounded data archive; (b) automated workflow management for increased user productivity; (c) automatic tracking and management of files stored on shelves; (d) the ability to ingest, process and disseminate data at bandwidths ranging up to multiple gigabits per second; (e) access through a thin client-to-server network environment; (f) multiple interactive users needing retrieval of files in seconds, from either archived images or real-time feeds; and (g) scalability that maintains information throughput performance as the size of the digital library grows.

  19. Multi-provider architecture for cloud outsourcing of medical imaging repositories.

    PubMed

    Godinho, Tiago Marques; Bastião Silva, Luís A; Costa, Carlos; Oliveira, José Luís

    2014-01-01

    Over the last few years, the extended usage of medical imaging procedures has drawn the medical community's attention to the optimization of imaging workflows. More recently, the federation of multiple institutions into seamless distribution networks has brought hope of higher-quality healthcare services along with more efficient resource management. As a result, medical institutions are constantly looking for the best infrastructure on which to deploy their imaging archives. In this scenario, public cloud infrastructures arise as major candidates, as they offer elastic storage space and optimal data availability without great maintenance costs or IT personnel requirements, in a pay-as-you-go model. However, standard methodologies still do not take full advantage of outsourced archives, namely because their integration with other in-house solutions is troublesome. This document proposes a multi-provider architecture for the integration of outsourced archives with in-house PACS resources, taking advantage of external providers to store medical imaging studies without disregarding security. It enables the retrieval of images from multiple archives simultaneously, improving performance and data availability while avoiding the vendor lock-in problem. Moreover, it enables load balancing and caching techniques.

  20. The Impact of Developing Technology on Media Communications.

    ERIC Educational Resources Information Center

    MacDonald, Lindsay W.

    1997-01-01

    Examines changes in media communications resulting from new information technologies: communications technologies (networks, World Wide Web, digital set-top box); graphic arts (digital photography, CD and digital archives, desktop design and publishing, printing technology); television and video (digital editing, interactive television, news and…

  1. Novel carboxamides as potential mosquito repellents.

    USDA-ARS?s Scientific Manuscript database

    A model was developed using 167 carboxamide compounds, from the US Department of Agriculture archival database, that were tested as arthropod repellents over the past 60 years. An artificial neural network utilizing CODESSA PRO descriptors was used to construct a Quantitative Structure-Activity Re...

  2. 32 CFR 2001.1 - Purpose and scope.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Telecommunications, automated information systems, and network security 4.1, 4.2 2001.51 Technical security 4.1 2001... National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED NATIONAL SECURITY INFORMATION Scope of Part § 2001...

  3. 32 CFR 2001.1 - Purpose and scope.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Telecommunications, automated information systems, and network security 4.1, 4.2 2001.51 Technical security 4.1 2001... National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED NATIONAL SECURITY INFORMATION Scope of Part § 2001...

  4. 32 CFR 2001.1 - Purpose and scope.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Telecommunications, automated information systems, and network security 4.1, 4.2 2001.51 Technical security 4.1 2001... National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED NATIONAL SECURITY INFORMATION Scope of Part § 2001...

  5. 32 CFR 2001.1 - Purpose and scope.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Telecommunications, automated information systems, and network security 4.1, 4.2 2001.51 Technical security 4.1 2001... National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED NATIONAL SECURITY INFORMATION Scope of Part § 2001...

  6. 32 CFR 2001.1 - Purpose and scope.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Telecommunications, automated information systems, and network security 4.1, 4.2 2001.51 Technical security 4.1 2001... National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED NATIONAL SECURITY INFORMATION Scope of Part § 2001...

  7. Towards more Global Coordination of Atmospheric Electricity Measurements (GloCAEM)

    NASA Astrophysics Data System (ADS)

    Nicoll, Keri; Harrison, Giles

    2017-04-01

    Earth's atmospheric electrical environment has been studied since the 1750s, but its more recent applications to science questions around clouds and climate highlight the incompleteness of our understanding, in part due to a lack of suitable global measurements. The Global Electric Circuit (GEC) sustains the near-surface fair weather (FW) electric field, which is present globally in regions that are not strongly electrically disturbed by weather or pollution. It can be measured routinely at the surface using well-established instrumentation such as electric field mills. Despite the central role of lightning as a weather hazard and the potentially widespread importance of charge for atmospheric processes, research is hampered by the fragmented nature of surface atmospheric electricity measurements. This makes anything other than local studies in fortuitous fair weather conditions difficult. In contrast to the detection of global lightning using satellite measurements and ground-based radio networks, the FW electric field and GEC cannot be measured by remote sensing, and no similar measurement networks exist for their study. This presents an opportunity: many researchers worldwide now routinely make high-temporal-resolution measurements of the FW electric field, an effort that is currently neither coordinated nor exploited. The GloCAEM (Global Coordination of Atmospheric Electricity Measurements) project is bringing some of these experts together to make the first steps towards an effective global network for FW atmospheric electricity monitoring. A specific objective of the project is to establish the first modern archive of international FW atmospheric electric field data in close to real time, to allow global studies of atmospheric electricity to be performed straightforwardly and robustly. Data will be archived through the UK Centre for Environmental Data Analysis (CEDA) and will be available for download by users from early 2018. Both 1-second and 1-minute electric field data will be archived, along with meteorological measurements (if available) for ease of interpretation of the electrical measurements. Although the primary aim of the project is to provide a close to real time electric field database, archiving of existing historical electric field datasets is also planned, to extend the range of studies possible. This presentation will provide a summary of progress with the GloCAEM project.

  8. Neural networks: Application to medical imaging

    NASA Technical Reports Server (NTRS)

    Clarke, Laurence P.

    1994-01-01

    The research mission is the development of computer-assisted diagnosis (CAD) methods for improved diagnosis of medical images, including digital x-ray sensors and tomographic imaging modalities. The CAD algorithms include advanced methods for adaptive nonlinear filters for image noise suppression, hybrid wavelet methods for feature segmentation and enhancement, and high-convergence neural networks for feature detection, with VLSI implementation of neural networks for real-time analysis. Other missions include (1) implementation of CAD methods on hospital-based picture archiving and communication systems (PACS) and information networks for central and remote diagnosis, and (2) collaboration with the defense and medical industries, NASA, and federal laboratories in the area of dual-use technology conversion from defense or aerospace to medicine.

  9. A Routing Mechanism for Cloud Outsourcing of Medical Imaging Repositories.

    PubMed

    Godinho, Tiago Marques; Viana-Ferreira, Carlos; Bastião Silva, Luís A; Costa, Carlos

    2016-01-01

    Web-based technologies have been increasingly used in picture archive and communication systems (PACS) for services related to the storage, distribution, and visualization of medical images. Nowadays, many healthcare institutions are outsourcing their repositories to the cloud. However, managing communications between multiple geo-distributed locations is still challenging due to the complexity of dealing with huge volumes of data and the associated bandwidth requirements. Moreover, standard methodologies still do not take full advantage of outsourced archives, namely because their integration with other in-house solutions is troublesome. In order to improve the performance of distributed medical imaging networks, a smart routing mechanism was developed. This includes an innovative cache system based on the splitting and dynamic management of Digital Imaging and Communications in Medicine (DICOM) objects. The proposed solution was successfully deployed in a regional PACS archive. The results showed that it outperforms conventional approaches, reducing both remote access latency and the required cache storage space.

  10. Snowball: resampling combined with distance-based regression to discover transcriptional consequences of a driver mutation

    PubMed Central

    Xu, Yaomin; Guo, Xingyi; Sun, Jiayang; Zhao, Zhongming

    2015-01-01

    Motivation: Large-scale cancer genomic studies, such as The Cancer Genome Atlas (TCGA), have profiled multidimensional genomic data, including mutation and expression profiles, on a variety of cancer cell types to uncover the molecular mechanisms of carcinogenesis. More than a hundred driver mutations have been characterized that confer a cell growth advantage. However, how driver mutations regulate the transcriptome to affect cellular functions remains largely unexplored. Differential analysis of gene expression relative to a driver mutation in patient samples could provide new insights for understanding driver mutation dysregulation in the tumor genome and for developing personalized treatment strategies. Results: Here, we introduce the Snowball approach as a highly sensitive statistical analysis method to identify transcriptional signatures that are affected by a recurrent driver mutation. Snowball utilizes a resampling-based approach combined with a distance-based regression framework to assign a robust ranking index to genes based on their aggregated association with the presence of the mutation, and further selects the top significant genes for downstream data analyses or experiments. In our application of the Snowball approach to both synthesized and TCGA data, we demonstrate that it outperforms standard methods and provides more accurate inferences about the functional effects and transcriptional dysregulation of driver mutations. Availability and implementation: R package and source code are available from CRAN at http://cran.r-project.org/web/packages/DESnowball, and also available at http://bioinfo.mc.vanderbilt.edu/DESnowball/. Contact: zhongming.zhao@vanderbilt.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25192743
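    The core of the approach, stripped of the package's specifics, is to rank genes repeatedly on resampled patient subsets and aggregate the rankings. The following is an illustrative base-R sketch of that resampling-plus-rank-aggregation idea only; it is not the DESnowball API, and it substitutes a plain t statistic for the package's distance-based regression.

      # Toy setup: expression matrix (genes x patients) and a
      # mutation status per patient (all values synthetic).
      set.seed(1)
      expr <- matrix(rnorm(200 * 40), nrow = 200,
                     dimnames = list(paste0("g", 1:200), NULL))
      mut <- factor(rep(c("wt", "mutated"), each = 20))

      # One resampling round: subsample patients, rank genes by the
      # strength of their association with mutation status.
      rank_once <- function() {
        idx <- sample(40, 30)
        stat <- apply(expr[, idx], 1,
                      function(x) abs(t.test(x ~ mut[idx])$statistic))
        rank(-stat)  # rank 1 = strongest association
      }

      # Aggregate ranks over many rounds: a small mean rank marks a
      # gene that is robustly top-ranked across resamples.
      agg <- rowMeans(replicate(50, rank_once()))
      head(sort(agg), 10)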

  11. Coupling Meteorology, Metal Concentrations, and Pb Isotopes for Source Attribution in Archived Precipitation Samples

    EPA Science Inventory

    A technique that couples lead (Pb) isotopes and multi-element concentrations with meteorological analysis was used to assess source contributions to precipitation samples at the Bondville, Illinois USA National Trends Network (NTN) site. Precipitation samples collected over a 16 ...

  12. Education for Australia's Information Future

    ERIC Educational Resources Information Center

    Burford, Sally; Partridge, Helen; Brown, Sarah; Hider, Philip; Ellis, Leonie

    2015-01-01

    Digital disruption and an increasingly networked society drive rapid change in many professions and a corresponding need for change in tertiary education. Across the world, information education has, to date, prepared graduates for employment in discrete professions, such as librarianship, records management, archives and teacher librarianship.…

  13. Data distribution satellite

    NASA Technical Reports Server (NTRS)

    Stevens, Grady H.

    1992-01-01

    The Data Distribution Satellite (DDS), operating in conjunction with the planned space network, the National Research and Education Network, and its commercial derivatives, would play a key role in networking the emerging supercomputing facilities, national archives, and academic, industrial, and government institutions. Centrally located over the United States in geostationary orbit, DDS would carry sophisticated on-board switching and make use of advanced antennas to provide an array of special services. Institutions needing continuous high-data-rate service would be networked together by use of a microwave switching matrix and electronically steered hopping beams. Simultaneously, DDS would use other beams and on-board processing to interconnect other institutions with lesser, low-rate, intermittent needs. Dedicated links to White Sands and other facilities would enable direct access to space payloads and sensor data. Intersatellite links to a second-generation ATDRS, called the Advanced Space Data Acquisition and Communications System (ASDACS), would eliminate one satellite hop and enhance controllability of experimental payloads by reducing path delay. Similarly, direct access would be available to the supercomputing facilities and national data archives. Economies with DDS would be derived from its ability to switch high-rate facilities amongst users as needed. At the same time, having a CONUS view, DDS could interconnect with any institution regardless of how remote; whether one needed high-rate or low-rate service would be immaterial. With the capability to assign resources on demand, DDS would need to carry only a portion of the resources that dedicated facilities would require. Efficiently switching resources to users as needed, DDS would become a very feasible spacecraft, even though it would tie together the space network, the terrestrial network, remote sites, thousands of small users, and those few who need very large data links intermittently.

  14. Archiving and Distributing Seismic Data at the Southern California Earthquake Data Center (SCEDC)

    NASA Astrophysics Data System (ADS)

    Appel, V. L.

    2002-12-01

    The Southern California Earthquake Data Center (SCEDC) archives and provides public access to earthquake parametric and waveform data gathered by the Southern California Seismic Network (SCSN) and, since January 1, 2001, the TriNet seismic network, southern California's earthquake monitoring network. The parametric data in the archive include earthquake locations, magnitudes, moment-tensor solutions and phase picks. The SCEDC waveform archive prior to TriNet consists primarily of short-period, 100-samples-per-second waveforms from the SCSN. The addition of the TriNet array added continuous recordings of 155 broadband stations (20 samples per second or less), and triggered seismograms from 200 accelerometers and 200 short-period instruments. Since the Data Center and TriNet use the same Oracle database system, new earthquake data are available to the seismological community in near real-time. Primary access to the database and waveforms is through the Seismogram Transfer Program (STP) interface. The interface enables users to search the database for earthquake information, phase picks, and continuous and triggered waveform data. Output is available in SAC, miniSEED, and other formats. Both the raw counts format (V0) and the gain-corrected format (V1) of COSMOS (Consortium of Organizations for Strong-Motion Observation Systems) are now supported by STP. EQQuest is an interface to prepackaged waveform data sets for select earthquakes in Southern California stored at the SCEDC. Waveform data for large-magnitude events have been prepared and new data sets will be available for download in near real-time following major events. The parametric data from 1981 to the present have been loaded into the Oracle 9.2.0.1 database system and the waveforms for that time period have been converted to miniSEED format and are accessible through the STP interface. The DISC optical-disk system (the "jukebox") that currently serves as the mass storage for the SCEDC is in the process of being replaced with a series of inexpensive high-capacity (1.6 Tbyte) magnetic-disk RAIDs. These systems are built with PC-technology components, using sixteen 120-Gbyte IDE disks, hot-swappable disk trays, two RAID controllers, dual redundant power supplies and a Linux operating system. The system is configured over a private gigabit network that connects to the two Data Center servers and spans the Seismological Lab and the USGS. To ensure data integrity, each RAID disk system constantly checks itself against its twin and verifies file integrity using 128-bit MD5 file checksums that are stored separately from the system. The final level of data protection is a Sony AIT-3 tape backup of the files. The primary advantage of the magnetic-disk approach is faster data access because magnetic disk drives have almost no latency. This means that the SCEDC can provide better "on-demand" interactive delivery of the seismograms in the archive.
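    The MD5-based integrity audit described above is easy to emulate; a minimal sketch in R using the base tools::md5sum() function (the checksum file, its path, and its column layout are hypothetical, not the SCEDC's):

      # 'checksums.txt' (hypothetical) holds one line per archived
      # file: the stored 128-bit MD5 digest and the file path.
      stored <- read.table("checksums.txt",
                           col.names = c("md5", "filepath"),
                           stringsAsFactors = FALSE)

      # Recompute the MD5 digest of each file as it exists on disk.
      current <- tools::md5sum(stored$filepath)

      # Flag files whose digest is missing or no longer matches.
      bad <- stored$filepath[is.na(current) | current != stored$md5]
      if (length(bad)) warning("Integrity check failed for: ",
                               paste(bad, collapse = ", "))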

  15. National Space Science Data Center and World Data Center A for Rockets and Satellites - Ionospheric data holdings and services

    NASA Technical Reports Server (NTRS)

    Bilitza, D.; King, J. H.

    1988-01-01

    The activities and services of the National Space Science Data Center (NSSDC) and the World Data Center A for Rockets and Satellites (WDC-A-R and S) are described, with special emphasis on ionospheric physics. The present catalog/archive system is explained and future developments are indicated. In addition to the basic data acquisition, archiving, and dissemination functions, ongoing activities include the Central Online Data Directory (CODD), the Coordinated Data Analysis Workshops (CDAW), the Space Physics Analysis Network (SPAN), advanced data management systems (CD/DIS, NCDS, PLDS), and publication of the NSSDC News, the SPACEWARN Bulletin, and several NSSDC reports.

  16. Internet Services for Professional Astronomy

    NASA Astrophysics Data System (ADS)

    Andernach, H.

    A (subjective) overview of Internet resources relevant to professional astronomers is given. Special emphasis is put on databases of astronomical objects and servers providing general information, e.g. on astronomical catalogues, finding charts from sky surveys, bibliographies, directories, browsers through multi-wavelength observational archives, etc. Archives of specific observational data will be discussed in more detail in other chapters of this book, dealing with the corresponding part of the electromagnetic spectrum. About 200 different links are mentioned, and every attempt was made to make this report as up-to-date as possible. As the field is rapidly growing with improved network technology, it will be just a snapshot of the situation in mid-1998.

  17. Satellite image analysis using neural networks

    NASA Technical Reports Server (NTRS)

    Sheldon, Roger A.

    1990-01-01

    The tremendous backlog of unanalyzed satellite data necessitates the development of improved methods for data cataloging and analysis. Ford Aerospace has developed an image analysis system, SIANN (Satellite Image Analysis using Neural Networks), that integrates the technologies necessary to satisfy NASA's science data analysis requirements for the next generation of satellites. SIANN will enable scientists to train a neural network to recognize image data containing scenes of interest and then rapidly search data archives for all such images. The approach combines conventional image processing technology with recent advances in neural networks to provide improved classification capabilities. SIANN allows users to proceed through a four-step process of image classification: filtering and enhancement, creation of neural network training data via application of feature extraction algorithms, configuring and training a neural network model, and classification of images by application of the trained neural network. A prototype experimentation testbed was completed and applied to climatological data.
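    Steps three and four of that pipeline, configuring/training a network and then classifying, reduce to a few lines in a modern environment. A generic sketch in R with the nnet package (the feature vectors and labels are synthetic placeholders, not SIANN's data):

      library(nnet)  # single-hidden-layer feed-forward networks

      # Placeholder training set: one row per image, 8 extracted
      # features, labeled scene-of-interest vs. other.
      set.seed(1)
      df <- data.frame(matrix(runif(100 * 8), ncol = 8))
      df$class <- factor(sample(c("interest", "other"), 100,
                                replace = TRUE))

      # Step 3: configure and train the network (5 hidden units).
      net <- nnet(class ~ ., data = df, size = 5, maxit = 200,
                  trace = FALSE)

      # Step 4: classify images by applying the trained network.
      predict(net, df[1:5, ], type = "class")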

  18. RevEcoR: an R package for the reverse ecology analysis of microbiomes.

    PubMed

    Cao, Yang; Wang, Yuanyuan; Zheng, Xiaofei; Li, Fei; Bo, Xiaochen

    2016-07-29

    All species live in complex ecosystems. The structure and complexity of a microbial community reflect not only diversity and function, but also the environment in which it occurs. However, traditional ecological methods can only be applied on a small scale and for relatively well-understood biological systems. Recently, a graph-theory-based algorithm called the reverse ecology approach has been developed that can analyze the metabolic networks of all the species in a microbial community, and predict the metabolic interface between species and their environment. Here, we present RevEcoR, an R package and a Shiny Web application that implement the reverse ecology algorithm for determining microbe-microbe interactions in microbial communities. This software allows users to obtain large-scale insights into species' ecology directly from high-throughput metagenomic data. The software has great potential for facilitating the study of microbiomes. RevEcoR is open-source software for the study of microbial community ecology. The RevEcoR R package is freely available under the GNU General Public License v. 2.0 at http://cran.r-project.org/web/packages/RevEcoR/ with the vignette and typical usage examples, and the interactive Shiny web application is available at http://yiluheihei.shinyapps.io/shiny-RevEcoR, or can be installed locally with the source code accessed from https://github.com/yiluheihei/shiny-RevEcoR.
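    A sketch of the advertised workflow follows; the three function names are reproduced from the package vignette as best remembered and should be verified against the CRAN reference manual before use:

      library(RevEcoR)  # from CRAN

      # Retrieve metabolic data for two organisms (KEGG organism
      # codes; function names and arguments are assumptions).
      m1 <- getOrgMetabolicData("buc")
      m2 <- getOrgMetabolicData("ype")

      # Reconstruct each genome-scale metabolic network as a graph.
      g1 <- reconstructGsMN(m1)
      g2 <- reconstructGsMN(m2)

      # Predict microbe-microbe interaction (competition and
      # complementarity indices) from the two networks.
      calculateCooperationIndex(g1, g2)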

  19. Hydrological analysis in R: Topmodel and beyond

    NASA Astrophysics Data System (ADS)

    Buytaert, W.; Reusser, D.

    2011-12-01

    R is quickly gaining popularity in the hydrological sciences community. The wide range of statistical and mathematical functionality makes it an excellent tool for data analysis, modelling and uncertainty analysis. Topmodel was one of the first hydrological models to be implemented as an R package and distributed through R's own distribution network, CRAN. This facilitated pre- and postprocessing of data, such as parameter sampling, calculation of prediction bounds, and advanced visualisation. However, apart from these basic functionalities, the package did not use many of the more advanced features of the R environment, especially R's object-oriented functionality. With R's increasing expansion into arenas such as high-performance computing, big data analysis, and cloud services, we revisit the topmodel package and use it as an example of how to build and deploy the next generation of hydrological models. R provides a convenient environment and attractive features to build and couple hydrological - and, by extension, other environmental - models, to develop flexible and effective data assimilation strategies, and to take the model beyond the individual computer by linking into cloud services for both data provision and computing. However, in order to maximise the benefit of these approaches, it will be necessary to adopt standards and ontologies for model interaction and information exchange. Some of these are currently being developed, such as the OGC web processing standards, while others will need to be developed.
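    As a reminder of how compact such an R-based model run is, a minimal sketch following the example shipped with the CRAN topmodel package (the dataset and element names are taken from that example as remembered, and should be checked against the package documentation):

      library(topmodel)   # from CRAN

      data(huagrahuma)    # example catchment data in the package
      attach(huagrahuma)  # parameters, topidx, delay, rain, ET0, Qobs

      # Run TOPMODEL: simulated discharge for the rain/ET series.
      Qsim <- topmodel(parameters, topidx, delay, rain, ET0)

      # Postprocessing in R: compare simulated with observed flow.
      plot(Qobs, type = "l")
      lines(Qsim, col = "red")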

  20. Prediction of Peptide and Protein Propensity for Amyloid Formation

    PubMed Central

    Família, Carlos; Dennison, Sarah R.; Quintas, Alexandre; Phoenix, David A.

    2015-01-01

    Understanding of which peptides and proteins have the potential to undergo amyloid formation, and of what driving forces are responsible for amyloid-like fiber formation and stabilization, remains limited. This is mainly because proteins that can undergo structural changes, which lead to amyloid formation, are quite diverse and share no obvious sequence or structural homology, despite the structural similarity found in the fibrils. To address these issues, a novel approach based on recursive feature selection and feed-forward neural networks was undertaken to identify key features highly correlated with the self-assembly problem. This approach allowed the identification of seven physicochemical and biochemical properties of the amino acids highly associated with the self-assembly of peptides and proteins into amyloid-like fibrils (normalized frequency of β-sheet, normalized frequency of β-sheet from LG, weights for β-sheet at the window position of 1, isoelectric point, atom-based hydrophobic moment, helix termination parameter at position j+1, and ΔG° values for peptides extrapolated in 0 M urea). Moreover, these features enabled the development of a new predictor (available at http://cran.r-project.org/web/packages/appnn/index.html) capable of accurately and reliably predicting the amyloidogenic propensity from the polypeptide sequence alone, with a prediction accuracy of 84.9% against an external validation dataset of sequences with experimental in vitro evidence of amyloid formation. PMID:26241652
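    A minimal usage sketch of the published predictor, assuming the package's main entry point is an appnn() function taking amino acid sequences (the function name and the example sequences are assumptions to check against the package index):

      library(appnn)  # from CRAN

      # Predict amyloidogenic propensity from sequence alone
      # (the sequences are illustrative placeholders).
      pred <- appnn(c("KLVFFAE", "STVIIE"))
      str(pred)  # inspect per-sequence predictions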

  1. Analysis, Design and Implementation of a Networking Proof-of-Concept Prototype to Support Maritime Visit, Board, Search and Seizure Teams

    DTIC Science & Technology

    2014-03-01

    M. Callaghan ( AKR -1001). Retrieved from http://www.navsource.org/archives/09/54/541001.htm Nguyen, H., & Baker, M. (2012). Characteristics of a ...AND IMPLEMENTATION OF A NETWORKING PROOF-OF-CONCEPT PROTOTYPE TO SUPPORT MARITIME VISIT, BOARD, SEARCH AND SEIZURE TEAMS by Van E. Stewart...2. REPORT DATE March 2014 3. REPORT TYPE AND DATES COVERED Master’s Thesis 4. TITLE AND SUBTITLE ANALYSIS, DESIGN AND IMPLEMENTATION OF A

  2. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, E. C. (Editor)

    1990-01-01

    Archival reports on developments in programs managed by the Jet Propulsion Laboratory's (JPL) Office of Telecommunications and Data Acquisition (TDA) are given. Space communications, radio navigation, radio science, and ground-based radio and radar astronomy, activities of the Deep Space Network (DSN) and its associated Ground Communications Facility (GCF) in planning, supporting research and technology, implementation, and operations are reported. Also included is TDA-funded activity at JPL on data and information systems and reimbursable Deep Space Network (DSN) work performed for other space agencies through NASA.

  3. Secure and Privacy-Preserving Distributed Information Brokering

    ERIC Educational Resources Information Center

    Li, Fengjun

    2010-01-01

    As enormous structured, semi-structured and unstructured data are collected and archived by organizations in many realms ranging from business to health networks to government agencies, the needs for efficient yet secure inter-organization information sharing naturally arise. Unlike early information sharing approaches that only involve a small…

  4. TOTAL PRECIPITATION DATA - U.S HISTORICAL CLIMATOLOGY NETWORK (HCN)

    EPA Science Inventory

    The Carbon Dioxide Information Analysis Center, which includes the World Data Center-A for Atmospheric Trace Gases, is the primary global-change data and information analysis center of the U.S. Department of Energy (DOE). More than just an archive of data sets and publications, ...

  5. Unobtrusive Social Network Data From Email

    DTIC Science & Technology

    2008-12-01

    PERSON Standard Form 298 (Rev. 8-98) Prescribed by ANSI Std Z39-18 outlook archived files and stores that data into an SQL - database. Communication...Applications ( VBA ) program was installed on the personal computers (PC) of all participants, in the session window of their Microsoft Outlook. Details of

  6. Development of acylpiperidine and carboxamide repellents using structure-activity models

    USDA-ARS?s Scientific Manuscript database

    The United States Department of Agriculture (USDA) has developed repellents and insecticides for the U.S. military since 1942. Data for over 30,000 compounds are contained within the USDA archive. Repellency data from similarly structured compounds were used to develop artificial neural network (ANN...

  7. Using the Web To Explore the Great Depression.

    ERIC Educational Resources Information Center

    Chamberlin, Paul

    2001-01-01

    Presents an annotated list of Web sites that focus on the Great Depression. Includes the American Experience, American Memory, the National Archives and Records Administration, and the New Deal Network Web sites. Offers additional sites covering topics such as the Jersey homesteads and labor history. (CMK)

  8. A Tight Squeeze

    ERIC Educational Resources Information Center

    Ramaswami, Rama

    2008-01-01

    The Storage Networking Industry Association (SNIA) does not mince words when describing the looming data storage problem. In its 2007 report, "Solving the Coming Archive Crisis--the 100-Year Dilemma," the trade group asserts that the volume of disparate digital information sources being kept online for long-term preservation is overwhelming and…

  9. Deep-Learning Convolutional Neural Networks Accurately Classify Genetic Mutations in Gliomas.

    PubMed

    Chang, P; Grinband, J; Weinberg, B D; Bardis, M; Khy, M; Cadena, G; Su, M-Y; Cha, S; Filippi, C G; Bota, D; Baldi, P; Poisson, L M; Jain, R; Chow, D

    2018-05-10

    The World Health Organization has recently placed new emphasis on the integration of genetic information for gliomas. While tissue sampling remains the criterion standard, noninvasive imaging techniques may provide complementary insight into clinically relevant genetic mutations. Our aim was to train a convolutional neural network to independently predict underlying molecular genetic mutation status in gliomas with high accuracy and identify the most predictive imaging features for each mutation. MR imaging data and molecular information were retrospectively obtained from The Cancer Imaging Archives for 259 patients with either low- or high-grade gliomas. A convolutional neural network was trained to classify isocitrate dehydrogenase 1 (IDH1) mutation status, 1p/19q codeletion, and O6-methylguanine-DNA methyltransferase (MGMT) promoter methylation status. Principal component analysis of the final convolutional neural network layer was used to extract the key imaging features critical for successful classification. Classification had high accuracy: IDH1 mutation status, 94%; 1p/19q codeletion, 92%; and MGMT promoter methylation status, 83%. Each genetic category was also associated with distinctive imaging features such as definition of tumor margins, T1 and FLAIR suppression, extent of edema, extent of necrosis, and textural features. Our results indicate that for The Cancer Imaging Archives dataset, machine-learning approaches allow classification of individual genetic mutations of both low- and high-grade gliomas. We show that relevant MR imaging features acquired from an added dimensionality-reduction technique demonstrate that neural networks are capable of learning key imaging components without prior feature selection or human-directed training. © 2018 by American Journal of Neuroradiology.
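    The dimensionality-reduction step described here is ordinary principal component analysis applied to the final-layer activations. A generic sketch in R with base prcomp() (the activation matrix below is a random placeholder, not the authors' data):

      # 'acts' stands in for final-layer CNN activations:
      # one row per patient, one column per learned feature.
      set.seed(1)
      acts <- matrix(rnorm(259 * 128), nrow = 259)

      # PCA of the activations; the leading components summarize
      # the imaging features driving the classification.
      pc <- prcomp(acts, center = TRUE, scale. = TRUE)
      summary(pc)$importance[, 1:5]  # variance explained, PC1-PC5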

  10. Defining the Transfer Functions of the PCAD Model in North Atlantic Right Whales (Eubalaena glacialis) - Retrospective Analyses of Existing Data

    DTIC Science & Technology

    2012-09-30

    potentially providing information on nutritional state and chronic stress ( Wasser et al., 2010). We tested both T3 and T4 assays for parallelism.The...EconPapers.RePEc.org/RePEc:inn:wpaper:2011-20 Hunt K.E., Rolland R.M., Kraus S.D., Wasser S.K. 2006. Analysis of fecal glucocorticoids in the North Atlantic right...version 1.2.5. http://CRAN.R-project.org/package=doMC Rolland R.M., Hunt K.E., Kraus S.D., Wasser S.K. 2005. Assessing reproductive status of right

  11. Automated extraction of metadata from remotely sensed satellite imagery

    NASA Technical Reports Server (NTRS)

    Cromp, Robert F.

    1991-01-01

    The paper discusses research in the Intelligent Data Management project at the NASA/Goddard Space Flight Center, with emphasis on recent improvements in low-level feature detection algorithms for performing real-time characterization of images. Images, including MSS and TM data, are characterized using neural networks and the interpretation of the neural network output by an expert system for subsequent archiving in an object-oriented data base. The data show the applicability of this approach to different arrangements of low-level remote sensing channels. The technique works well when the neural network is trained on data similar to the data used for testing.

  12. A flexible, open, decentralized system for digital pathology networks.

    PubMed

    Schuler, Robert; Smith, David E; Kumaraguruparan, Gowri; Chervenak, Ann; Lewis, Anne D; Hyde, Dallas M; Kesselman, Carl

    2012-01-01

    High-resolution digital imaging is enabling digital archiving and sharing of digitized microscopy slides and new methods for digital pathology. Collaborative research centers, outsourced medical services, and multi-site organizations stand to benefit from sharing pathology data in a digital pathology network. Yet significant technological challenges remain due to the large size and volume of digitized whole slide images. While information systems do exist for managing local pathology laboratories, they tend to be oriented toward narrow clinical use cases or offer closed ecosystems around proprietary formats. Few solutions exist for networking digital pathology operations. Here we present a system architecture and implementation of a digital pathology network and share results from a production system that federates major research centers.

  14. NNI Supplement to the President's 2013 Budget | Nano

    Science.gov Websites

  15. Information Retrieval Using ADABAS-NATURAL (with Applications for Television and Radio).

    ERIC Educational Resources Information Center

    Silbergeld, I.; Kutok, P.

    1984-01-01

    Describes use of the software ADABAS (general purpose database management system) and NATURAL (interactive programing language) in development and implementation of an information retrieval system for the National Television and Radio Network of Israel. General design considerations, files contained in each archive, search strategies, and keywords…

  16. Secondary Lessons from Indiana's Underground Railroad Institute (July 22-27, 2001).

    ERIC Educational Resources Information Center

    Indiana Univ.-Purdue Univ., Indianapolis. Geography Educators' Network of Indiana.

    The Geography Educators' Network of Indiana's 2001 Exploring and Teaching Institute series led 23 educators from around the state on a six-day traveling adventure. Participants explored art, literature/folklore, historical sites and archives, physical environments, architecture, economics, politics, and cultures associated with the Underground…

  17. 40 CFR 58.16 - Data submittal and archiving requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... Other Federal agencies may request access to filters for purposes of supporting air quality management... PROGRAMS (CONTINUED) AMBIENT AIR QUALITY SURVEILLANCE Monitoring Network § 58.16 Data submittal and..., via AQS all ambient air quality data and associated quality assurance data for SO2; CO; O3; NO2; NO...

  18. Elementary Lessons from Indiana's Underground Railroad Institute (July 22-27, 2001).

    ERIC Educational Resources Information Center

    Indiana Univ.-Purdue Univ., Indianapolis. Geography Educators' Network of Indiana.

    The Geography Educators' Network of Indiana's 2001 Exploring and Teaching Institute led 23 educators from around the state on a six-day traveling adventure. Participants explored art, literature/folklore, historical sites and archives, physical environments, architecture, economics, politics, and cultures associated with the Underground Railroad…

  19. An Archive of Digital Images.

    ERIC Educational Resources Information Center

    Fantini, M.; And Others

    1990-01-01

    Describes the architecture of the prototype of an image management system that has been used to develop an application concerning images of frescoes in the Sistine Chapel in the Vatican. Hardware and software design are described, the use of local area networks (LANs) is discussed, and data organization is explained. (15 references) (LRW)

  20. Apparently, We Disappeared

    ERIC Educational Resources Information Center

    Richerme, Lauren Kapalka

    2011-01-01

    An examination of the 2005-2010 online archives of major American network news stations and newspapers reveals a troubling picture for music education. News stories frequently mention the disappearance of music education. When the media mention the existence of music education, they often promote it as a means of raising standardized test scores…

  1. Giving Life to Data: University-Community Partnerships in Addressing HIV and AIDS through Building Digital Archives

    ERIC Educational Resources Information Center

    de Lange, Naydene; Mnisi, Thoko; Mitchell, Claudia; Park, Eun G.

    2010-01-01

    The partnerships, especially university-community partnerships, that sustain globally networked learning environments often face challenges in mobilizing research to empower local communities to effect change. This article examines these challenges by describing a university-community partnership involving researchers and graduate students in…

  2. Visitor's Computer Guidelines | CTIO

    Science.gov Websites

  3. DICOM-compliant PACS with CD-based image archival

    NASA Astrophysics Data System (ADS)

    Cox, Robert D.; Henri, Christopher J.; Rubin, Richard K.; Bret, Patrice M.

    1998-07-01

    This paper describes the design and implementation of a low-cost PACS conforming to the DICOM 3.0 standard. The goal was to provide an efficient image archival and management solution on a heterogeneous hospital network as a basis for filmless radiology. The system follows a distributed, client/server model and was implemented at a fraction of the cost of a commercial PACS. It provides reliable archiving on recordable CD and allows access to digital images throughout the hospital and on the Internet. Dedicated servers have been designed for short-term storage, CD-based archival, data retrieval and remote data access or teleradiology. The short-term storage devices provide DICOM storage and query/retrieve services to scanners and workstations and approximately twelve weeks of 'on-line' image data. The CD-based archival and data retrieval processes are fully automated with the exception of CD loading and unloading. The system employs lossless compression on both short- and long-term storage devices. All servers communicate via the DICOM protocol in conjunction with both local and 'master' SQL patient databases. Records are transferred from the local to the master database independently, ensuring that storage devices will still function if the master database server cannot be reached. The system features rules-based work-flow management and WWW servers to provide multi-platform remote data access. The WWW server system is distributed on the storage, retrieval and teleradiology servers, allowing viewing of locally stored image data directly in a WWW browser without the need for data transfer to a central WWW server. An independent system monitors disk usage, processes, network and CPU load on each server and reports errors to the image management team via email. The PACS was implemented using a combination of off-the-shelf hardware, freely available software and applications developed in-house. The system has enabled filmless operation in CT, MR and ultrasound within the radiology department and throughout the hospital. The use of WWW technology has enabled the development of an intuitive web-based teleradiology and image management solution that provides complete access to image data.

  4. NASA Earth Observing System Data and Information System (EOSDIS): A U.S. Network of Data Centers Serving Earth Science Data: A Network Member of ICSU WDS

    NASA Technical Reports Server (NTRS)

    Behnke, Jeanne; Ramapriyan, H. K. " Rama"

    2016-01-01

    NASA's Earth Observing System Data and Information System (EOSDIS) has been in operation since August 1994, serving a diverse user community around the world with Earth science data from satellites, aircraft, field campaigns and research investigations. The ESDIS Project, responsible for EOSDIS, is a Network Member of the International Council for Science (ICSU) World Data System (WDS). Nine of the 12 Distributed Active Archive Centers (DAACs), which are part of EOSDIS, are Regular Members of the ICSU WDS. This poster presents the EOSDIS mission objectives, key characteristics of the DAACs that make them world-class Earth science data centers, and successes, challenges and best practices of EOSDIS focusing on the years 2014-2016, and illustrates some highlights of EOSDIS accomplishments. The highlights include: high customer satisfaction, growing archive and distribution volumes, exponential growth in the number of products distributed to users around the world, a unified metadata model and common metadata repository, flexibility provided to users by supporting data transformations to suit their applications, near-real-time capabilities to support various operational and research applications, and full-resolution image browse capabilities to help users select data of interest. The poster also illustrates how the ESDIS Project is actively involved in several US and international data system organizations.

  5. Picture archiving and communication in radiology.

    PubMed

    Napoli, Marzia; Nanni, Marinella; Cimarra, Stefania; Crisafulli, Letizia; Campioni, Paolo; Marano, Pasquale

    2003-01-01

    After more than 80 years of exclusive archiving on radiologic film, digital archiving is now increasingly gaining ground in radiology. Digital archiving allows a considerable reduction in costs and space but, most importantly, makes immediate or remote consultation of all examinations and reports feasible in the hospital's clinical wards. The RIS system is, in this case, the starting point of the electronic archiving process, which is, however, the task of the PACS. The latter can be used as a legally valid radiologic archive provided that it conforms to certain specifications, such as the use of long-term optical storage media or media with an electronic audit trail of changes. The PACS archives, in a hierarchical system, all digital images produced by each diagnostic imaging modality. Images and patient data can be retrieved and used for consultation or remote consultation by the reporting radiologist, who requires images and reports of previous radiologic examinations, or by the referring physician on the ward. Owing to their web servers, modern PACS allow greatly simplified remote access to images and data while still ensuring the required regulations and access protections. Since the PACS enables simpler data communication within the hospital, security and patient privacy must be protected. A secure and reliable PACS should minimize the risk of accidental data destruction and should prevent unauthorized access to the archive, with security measures that are adequate in relation to current knowledge and technological advances. Archiving the data produced by modern digital imaging is a problem now present even in small radiology services. Technology can readily solve problems that were extremely complex up to a few years ago, such as the connection between equipment and the archiving system, owing also to the universal adoption of the DICOM 3.0 standard. The evolution of communication networks and the use of standard protocols such as TCP/IP can minimize problems of remote transmission of data and images within the healthcare enterprise as well as across the territory. However, new problems are appearing, such as that of digital data security profiles and of the different systems that should ensure them. Among these, algorithms for electronic signatures should be mentioned; in Italy they are validated by law and can therefore be used in legally compliant digital archives.

  6. Commercial imagery archive, management, exploitation, and distribution project development

    NASA Astrophysics Data System (ADS)

    Hollinger, Bruce; Sakkas, Alysa

    1999-10-01

    The Lockheed Martin (LM) team had garnered over a decade of operational experience on the U.S. Government's IDEX II (Imagery Dissemination and Exploitation) system. Recently, it set out to create a new commercial product to serve the needs of large-scale imagery archiving and analysis markets worldwide. LM decided to provide a turnkey commercial solution to receive, store, retrieve, process, analyze and disseminate, in 'push' or 'pull' modes, imagery, data and data products using a variety of sources and formats. LM selected 'best of breed' hardware and software components and adapted and developed its own algorithms to provide added functionality not commercially available elsewhere. The resultant product, Intelligent Library System (ILS)™, satisfies requirements for (1) a potentially unbounded data archive (5000-TB range); (2) automated workflow management for increased user productivity; (3) automatic tracking and management of files stored on shelves; (4) the ability to ingest, process and disseminate data volumes with bandwidths ranging up to multi-gigabit per second; (5) access through a thin client-to-server network environment; (6) multiple interactive users needing retrieval of files in seconds, from both archived images and in real time; and (7) scalability that maintains information throughput performance as the size of the digital library grows.

  8. EMMA—mouse mutant resources for the international scientific community

    PubMed Central

    Wilkinson, Phil; Sengerova, Jitka; Matteoni, Raffaele; Chen, Chao-Kung; Soulat, Gaetan; Ureta-Vidal, Abel; Fessele, Sabine; Hagn, Michael; Massimi, Marzia; Pickford, Karen; Butler, Richard H.; Marschall, Susan; Mallon, Ann-Marie; Pickard, Amanda; Raspa, Marcello; Scavizzi, Ferdinando; Fray, Martin; Larrigaldie, Vanessa; Leyritz, Johan; Birney, Ewan; Tocchini-Valentini, Glauco P.; Brown, Steve; Herault, Yann; Montoliu, Lluis; de Angelis, Martin Hrabé; Smedley, Damian

    2010-01-01

    The laboratory mouse is the premier animal model for studying human disease, and thousands of mutants have been identified or produced, most recently through gene-specific mutagenesis approaches. High-throughput strategies by the International Knockout Mouse Consortium (IKMC) are producing mutants for all protein-coding genes. Generating a knock-out line involves huge monetary and time costs, so capturing the data describing each mutant, alongside archiving the line for distribution to future researchers, is critical. The European Mouse Mutant Archive (EMMA) is a leading international network infrastructure for archiving and worldwide provision of mouse mutant strains. It operates in collaboration with the other members of the Federation of International Mouse Resources (FIMRe), EMMA being the European component. Additionally, EMMA is one of four repositories involved in the IKMC, and therefore the current figure of 1700 archived lines will rise markedly. The EMMA database gathers and curates extensive data on each line and presents it through a user-friendly website. A BioMart interface allows advanced searching, including integrated querying with other resources, e.g. Ensembl. Other resources are able to display EMMA data by accessing our Distributed Annotation System server. EMMA database access is publicly available at http://www.emmanet.org. PMID:19783817

  9. Workshop: Western hemisphere network of bird banding programs

    USGS Publications Warehouse

    Celis-Murillo, A.

    2007-01-01

    Purpose: To promote collaboration among banding programs in the Americas. Introduction: Bird banding and marking provide indispensable tools for ornithological research, management, and conservation of migratory birds on migratory routes, breeding and non-breeding grounds. Many countries and organizations in Latin America and the Caribbean are in the process of developing or have expressed interest in developing national banding schemes and databases to support their research and management programs. Coordination of developing and existing banding programs is essential for effective data management, reporting, archiving and security, and most importantly, for gaining a fuller understanding of migratory bird conservation issues and how the banding data can help. Currently, there is a well-established bird-banding program in the U.S.A. and Canada, and programs in other countries are being developed as well. Ornithologists in many Latin American countries and the Caribbean are interested in using banding and marking in their research programs. Many in the ornithological community are interested in establishing banding schemes, and some countries have recently initiated independent banding programs. With the number of long-term collaborative and international initiatives increasing, the time is ripe to discuss and explore opportunities for international collaboration, coordination, and administration of bird banding programs in the Western Hemisphere. We propose the second "Western Hemisphere Network of Bird Banding Programs" workshop, in association with the SCSCB, to be an essential step in the progress to strengthen international partnerships and support migratory bird conservation in the Americas and beyond. This will be the second multi-national meeting to promote collaboration among banding programs in the Americas (the first meeting was held October 8-9, 2006, in La Mancha, Veracruz, Mexico). The second "Western Hemisphere Network of Bird Banding Programs" workshop will continue addressing issues surrounding the coordination of an Americas' approach to bird banding and will review in detail the advances made at the first workshop, such as coordination of bands and markers, coordination in recovery reporting, permit issues, data management, data sharing and archiving, data security, training, etc. Workshop Goals: Build on accomplishments of the network's first workshop (Oct 8-9, 2006). Identify and explore new opportunities for data sharing, data archiving, data access, training, etc. Initiate strategies to support international collaboration and coordination amongst bird banding programs in the Western Hemisphere. Workshop structure: One-day workshop of guided discussions. Participants: Representatives of government agencies, program managers and NGOs.

  10. Finding meaning in social media: content-based social network analysis of QuitNet to identify new opportunities for health promotion.

    PubMed

    Myneni, Sahiti; Cobb, Nathan K; Cohen, Trevor

    2013-01-01

    Unhealthy behaviors increase individual health risks and are a socioeconomic burden. Harnessing social influence is perceived as fundamental for interventions to influence health-related behaviors. However, the mechanisms through which social influence occurs are poorly understood. Online social networks provide the opportunity to understand these mechanisms as they digitally archive communication between members. In this paper, we present a methodology for content-based social network analysis, combining qualitative coding, automated text analysis, and formal network analysis such that network structure is determined by the content of messages exchanged between members. We apply this approach to characterize the communication between members of QuitNet, an online social network for smoking cessation. Results indicate that the method identifies meaningful theme-based social sub-networks. Modeling social network data using this method can provide us with theme-specific insights such as the identities of opinion leaders and sub-community clusters. Implications for design of targeted social interventions are discussed.
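    In generic terms, the method replaces plain who-messaged-whom edges with edges restricted to messages coded under a given theme, so that each theme yields its own sub-network. A hedged sketch with the igraph R package (the messages and theme labels are invented placeholders, not QuitNet data):

      library(igraph)  # from CRAN

      # Hypothetical coded messages: sender, receiver, theme code.
      msgs <- data.frame(
        from  = c("a", "a", "b", "c"),
        to    = c("b", "c", "c", "d"),
        theme = c("craving", "social", "craving", "craving")
      )

      # Build one sub-network per qualitative theme.
      g_craving <- graph_from_data_frame(subset(msgs, theme == "craving"))

      # Degree centrality within a theme points to theme-specific
      # opinion leaders.
      sort(degree(g_craving, mode = "all"), decreasing = TRUE)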

  11. Toward a standardized soil carbon database platform in the US Critical Zone Observatory Network

    NASA Astrophysics Data System (ADS)

    Filley, T. R.; Marini, L.; Todd-Brown, K. E.; Malhotra, A.; Harden, J. W.; Kumar, P.

    2017-12-01

    Within the soil carbon community of the US Critical Zone Observatory (CZO) Network, efforts are underway to promote network-level data syntheses and modeling projects and to identify barriers to data intercomparability. This represents a challenging goal given the diversity of soil carbon sampling methodologies, spatial and vertical resolution, carbon pool isolation protocols, subsequent measurement techniques, and matrix terminology. During the last annual meeting of the CZO SOC Working Group, on Dec 11, 2016, it was decided that integration with, and potentially adoption of, a widely used, active, and mature data aggregation, archival, and visualization platform was the easiest route to achieve this ultimate goal. Additionally, to assess the state of deep and shallow soil C data among the CZO sites, it was recommended that a comprehensive survey be undertaken to identify data gaps and catalog the various soil sampling and analysis methodologies. The International Soil Carbon Network (ISCN) has a long history of leadership in the development of soil C data aggregation, archiving, and visualization tools and currently houses data for over 70,000 soil cores contributed by the international soil carbon community. Over the past year, members of the CZO network and the ISCN have met to discuss the logistics of adopting the ISCN template within the CZO. Collaborative efforts among all of the CZO site data managers, led by the Intensively Managed Landscapes CZO, will evaluate the feasibility of adopting the ISCN template, or some modification thereof, and distributing it to the appropriate soil scientists for data upload and aggregation. Partnering with ISCN also ensures that soil characteristics from the US CZO are placed in a developing global soil context and paves the way for future integration of data from other international CZO networks. This poster will provide an update on this overall effort, along with a summary of data products, partnering networks, and recommendations for the data template language and future CZO APIs.

  12. VETA x ray data acquisition and control system

    NASA Technical Reports Server (NTRS)

    Brissenden, Roger J. V.; Jones, Mark T.; Ljungberg, Malin; Nguyen, Dan T.; Roll, John B., Jr.

    1992-01-01

    We describe the X-ray Data Acquisition and Control System (XDACS) used together with the X-ray Detection System (XDS) to characterize the X-ray image during testing of the AXAF P1/H1 mirror pair at the MSFC X-ray Calibration Facility. A variety of X-ray data were acquired, analyzed and archived during the testing including: mirror alignment, encircled energy, effective area, point spread function, system housekeeping and proportional counter window uniformity data. The system architecture is presented with emphasis placed on key features that include a layered UNIX tool approach, dedicated subsystem controllers, real-time X-window displays, flexibility in combining tools, network connectivity and system extensibility. The VETA test data archive is also described.

  13. Clinical applications of an ATM/Ethernet network in departments of neuroradiology and radiotherapy.

    PubMed

    Cimino, C; Pizzi, R; Fusca, M; Bruzzone, M G; Casolino, D; Sicurello, F

    1997-01-01

    An integrated system for the multimedia management of images and clinical information has been developed at the Istituto Nazionale Neurologico C. Besta in Milan. The Institute's physicians have a daily need to consult images coming from various modalities. The high volume of archived material and the need to retrieve and display new and past images and clinical information motivated the development of a Picture Archiving and Communication System (PACS) for the automatic management of images and clinical data, related not only to the Radiology Department, but also to the Radiotherapy Department for 3D virtual simulation, to remote teleconsulting, and subsequently to all the wards, ambulatories and labs.

  14. Analysis of the eight-year trend in ozone depletion from empirical models of solar backscattered ultraviolet instrument degradation

    NASA Technical Reports Server (NTRS)

    Herman, J. R.; Hudson, R. D.; Serafino, G.

    1990-01-01

    Arguments are presented showing that the basic empirical model of the solar backscatter UV (SBUV) instrument degradation used by Cebula et al. (1988) in their analysis of the SBUV data is likely to lead to an incorrect estimate of the ozone trend. A correction factor is given as a function of time and altitude that brings the SBUV data into approximate agreement with the SAGE, SME, and Dobson network ozone trends. It is suggested that the currently archived SBUV ozone data should be used with caution for periods of analysis exceeding 1 yr, since it is likely that the yearly decreases contained in the archived data are too large.

  15. Development of multi-mission satellite data systems at the German Remote Sensing Data Centre

    NASA Astrophysics Data System (ADS)

    Lotz-Iwen, H. J.; Markwitz, W.; Schreier, G.

    1998-11-01

    This paper focuses on conceptual aspects of access to multi-mission remote sensing data via online catalogue and information systems. The system ISIS of the German Remote Sensing Data Centre is described as an example of a user interface to earth observation data. ISIS has been designed to support international scientific research as well as operational applications by offering online access to the database via public networks. It provides catalogue retrieval, visualisation and transfer of image data, and is integrated into international activities dedicated to catalogue and archive interoperability. Finally, an outlook is given on international projects dealing with access to remote sensing data in distributed archives.

  16. R package to estimate intracluster correlation coefficient with confidence interval for binary data.

    PubMed

    Chakraborty, Hrishikesh; Hossain, Akhtar

    2018-03-01

    The Intracluster Correlation Coefficient (ICC) is a major parameter of interest in cluster randomized trials that measures the degree to which responses within the same cluster are correlated. Several types of ICC estimators and their confidence intervals (CIs) have been suggested in the literature for binary data. Studies have compared the relative weaknesses and advantages of ICC estimators and of their CIs for binary data, and have suggested situations where one is advantageous in practical research. The commonly used statistical computing systems currently facilitate estimation of only a very few variants of the ICC and its CI. To address the limitations of current statistical packages, we developed an R package, ICCbin, to facilitate estimating the ICC and its CI for binary responses using different methods. The ICCbin package is designed to provide estimates of the ICC in 16 different ways, including analysis of variance methods, moments-based estimation, direct probabilistic methods, correlation-based estimation, and a resampling method. The CI of the ICC is estimated using 5 different methods. The package also generates clustered binary data with an exchangeable correlation structure. ICCbin provides two functions for users: rcbin() generates clustered binary data, and iccbin() estimates the ICC and its CI. Users can choose the appropriate ICC and CI estimates from the wide selection in the output. The R package ICCbin presents very flexible and easy-to-use ways to generate clustered binary data and to estimate the ICC and its CI for binary responses using different methods. The package ICCbin is freely available for use with R from the CRAN repository (https://cran.r-project.org/package=ICCbin). We believe that this package can be a very useful tool for researchers designing cluster randomized trials with binary outcomes. Copyright © 2017 Elsevier B.V. All rights reserved.
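    A usage sketch of the two exported functions; the argument names shown are assumptions from memory and should be checked against the package manual before use:

      library(ICCbin)  # from CRAN

      # Simulate clustered binary data: 30 clusters of size 20,
      # success probability 0.3, ICC (rho) of 0.1 (argument names
      # assumed).
      dat <- rcbin(prop = 0.3, noc = 30, csize = 20, rho = 0.1)

      # Estimate the ICC and its CI, here by the ANOVA method.
      iccbin(cid = cid, y = y, data = dat,
             method = "aov", ci.type = "aov")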

  17. Evaluating transition state structures of vanadium-phosphatase protein complexes using shape analysis.

    PubMed

    Sánchez-Lombardo, Irma; Alvarez, Santiago; McLauchlan, Craig C; Crans, Debbie C

    2015-06-01

    Shape analysis of coordination complexes is well suited to evaluating the subtle distortions in the trigonal bipyramidal (TBPY-5) geometry of vanadium coordinated in the active site of phosphatases and characterized by X-ray crystallography. Recent studies using the tau (τ) analysis support the assertion that vanadium is best described as a trigonal bipyramid, because this geometry is the ideal transition state geometry for hydrolysis of the phosphate ester substrate (C.C. McLauchlan, B.J. Peters, G.R. Willsky, D.C. Crans, Coord. Chem. Rev. http://dx.doi.org/10.1016/j.ccr.2014.12.012; D.C. Crans, M.L. Tarlton, C.C. McLauchlan, Eur. J. Inorg. Chem. 2014, 4450-4468). Here we use continuous shape measures (CShM) analysis to investigate the structural space of the five-coordinate vanadium-phosphatase complexes; the mechanistic transformation between the tetrahedral geometry and the five-coordinate, high-energy TBPY-5 geometry is discussed with a focus on the protein tyrosine phosphatase 1B (PTP1B) enzyme. No evidence for square pyramidal geometries was observed in any vanadium-protein complexes. The shape analysis positioned the metal ion and the ligands in the active site in a manner reflecting the mechanism of cleavage of the organic phosphate in a phosphatase. We identified the umbrella distortions as lying directly on the reaction path between tetrahedral phosphate and the TBPY-5-type high-energy species. The umbrella distortions of the trigonal bipyramid are therefore identified as the most relevant types of transition state structures for the phosphoryl group transfer reactions of phosphatases, and this may be related to the possibility that vanadium is an inhibitor for enzymes that support both exploded and five-coordinate transition states. Copyright © 2015 Elsevier Inc. All rights reserved.
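    For reference, the two geometry measures invoked above have standard definitions in the general shape-analysis literature (stated here from that literature, not taken from this paper). In LaTeX,

      \tau_5 = \frac{\beta - \alpha}{60^\circ}, \qquad
      S(Q,P) = \min\left[\frac{\sum_{k=1}^{N} \lVert \mathbf{q}_k - \mathbf{p}_k \rVert^2}{\sum_{k=1}^{N} \lVert \mathbf{q}_k - \mathbf{q}_0 \rVert^2}\right] \times 100

    where β ≥ α are the two largest ligand-metal-ligand angles (τ5 = 0 for an ideal square pyramid, 1 for an ideal trigonal bipyramid), the q_k are the vertex coordinates of the investigated structure Q, q_0 is its geometric centre, and the minimum runs over position, orientation, scale, and vertex pairing of the ideal reference polyhedron P; S = 0 indicates a perfect match to the reference shape.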

  18. The R package 'Luminescence': a history of unexpected complexity and concepts to deal with it

    NASA Astrophysics Data System (ADS)

    Kreutzer, Sebastian; Burow, Christoph; Dietze, Michael; Fuchs, Margret C.; Friedrich, Johannes; Fischer, Manfred; Schmidt, Christoph

    2017-04-01

    Overcoming limitations in the standard software used so far, developing an efficient, lightweight solution for a very specific task, or creating graphs of high quality: the reasons that may have initially led a scientist to work with R are manifold. And as long as the developed solutions, e.g., R scripts, are needed for personal use only, code can remain unstructured and documentation is not compulsory. However, this changes with the first friendly request for help after the code has been reused by others. In contrast to single scripts written without any intention of ever being published, for R packages the CRAN policy demands a more structured and elaborate approach, including a minimum of documentation. Nevertheless, growing projects with thousands of lines of code that need to be maintained can become overwhelming, in particular as researchers are not by definition experts in managing software projects. The R package 'Luminescence' (Kreutzer et al., 2017), a collection of tools for the analysis of luminescence data in a geoscientific, geochronological context, started as one single R script but quickly evolved into a comprehensive solution connected with various other R packages. We present (1) a very brief development history of the package 'Luminescence' before we (2) sketch technical challenges encountered over time and the solutions that have been found to deal with them using various open source tools. Our presentation is intended as a collection of concepts and approaches for setting up R projects in the geosciences. References. Kreutzer, S., Dietze, M., Burow, C., Fuchs, M. C., Schmidt, C., Fischer, M., Friedrich, J., 2017. Luminescence: Comprehensive Luminescence Dating Data Analysis. R package version 0.6.4. https://CRAN.R-project.org/package=Luminescence

  19. Feature selection using a one dimensional naïve Bayes' classifier increases the accuracy of support vector machine classification of CDR3 repertoires.

    PubMed

    Cinelli, Mattia; Sun, Yuxin; Best, Katharine; Heather, James M; Reich-Zeliger, Shlomit; Shifrut, Eric; Friedman, Nir; Shawe-Taylor, John; Chain, Benny

    2017-04-01

    Somatic DNA recombination, the hallmark of vertebrate adaptive immunity, has the potential to generate a vast diversity of antigen receptor sequences. How this diversity captures antigen specificity remains incompletely understood. In this study we use high-throughput sequencing to compare the global changes in T cell receptor β chain complementarity determining region 3 (CDR3β) sequences following immunization with ovalbumin administered with complete Freund's adjuvant (CFA) or CFA alone. The CDR3β sequences were deconstructed into short stretches of overlapping contiguous amino acids (motifs). The motifs were ranked according to a one-dimensional Bayesian classifier score comparing their frequency in the repertoires of the two immunization classes. The top-ranking motifs were selected and used to create feature vectors, which were used to train a support vector machine. The support vector machine achieved high classification scores in a leave-one-out validation test, reaching >90% in some cases. The study describes a novel two-stage classification strategy combining a one-dimensional Bayesian classifier with a support vector machine. Using this approach we demonstrate that the frequencies of a small number of linear motifs three amino acids in length can accurately identify a CD4 T cell response to ovalbumin against a background response to the complex mixture of antigens which characterizes complete Freund's adjuvant. The sequence data are available at www.ncbi.nlm.nih.gov/sra/?term=SRP075893. The Decombinator package is available at github.com/innate2adaptive/Decombinator. The R package e1071 is available at the CRAN repository https://cran.r-project.org/web/packages/e1071/index.html. b.chain@ucl.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
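
    The two-stage strategy described above can be sketched compactly in R with the e1071 package named in the record. The motif score below (absolute difference in class-mean frequency) is a simplified stand-in for the one-dimensional Bayesian classifier, and the inputs (a samples-by-motifs count matrix 'counts' with named columns and a two-level factor 'labels') are assumptions for illustration, not the authors' pipeline.

        # Sketch of a two-stage motif-selection + SVM classifier (illustrative only)
        library(e1071)

        rank_motifs <- function(counts, labels, k = 100) {
          # crude per-motif score: class difference in mean motif frequency
          m1 <- colMeans(counts[labels == levels(labels)[1], , drop = FALSE])
          m2 <- colMeans(counts[labels == levels(labels)[2], , drop = FALSE])
          names(sort(abs(m1 - m2), decreasing = TRUE))[seq_len(k)]
        }

        loo_accuracy <- function(counts, labels, k = 100) {
          hits <- 0
          for (i in seq_len(nrow(counts))) {
            # select features on the training fold only, then train and predict
            top <- rank_motifs(counts[-i, , drop = FALSE], labels[-i], k)
            fit <- svm(counts[-i, top, drop = FALSE], labels[-i], kernel = "linear")
            hits <- hits + (predict(fit, counts[i, top, drop = FALSE]) == labels[i])
          }
          hits / nrow(counts)   # leave-one-out classification accuracy
        }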

  20. The 1990 annual statistics and highlights report

    NASA Technical Reports Server (NTRS)

    Green, James L.

    1991-01-01

    The National Space Science Data Center (NSSDC) has archived over 6 terabytes of space and Earth science data accumulated over nearly 25 years. It now expects these holdings to nearly double every two years. The science user community needs rapid access to this archived data and to information about the data, and the NSSDC has been set on course to provide just that. Five years ago the NSSDC came online, becoming easily reachable for thousands of scientists around the world through electronic networks it managed and other international electronic networks to which it connected. Since that time, the data center has developed and implemented over 15 interactive systems, operational nearly 24 hours per day, and is reachable through DECnet, TCP/IP, X25, and BITnet communication protocols. The NSSDC is a clearinghouse through which the science user can find needed data via the Master Directory system, whether the data are at the NSSDC or deposited in over 50 other archives and data management facilities around the world. Over 13,000 users accessed the NSSDC electronic systems during the past year. Thousands of requests for data have been satisfied, resulting in the NSSDC's sending out a volume of data last year that approached a quarter of its holdings. This document reports on some of the highlights and distribution statistics for most of the basic NSSDC operational services for fiscal year 1990. It is intended to be the first of a series of annual reports on how well the NSSDC is doing in supporting the space and Earth science user communities.

  1. EarthScope's Transportable Array: Status of the Alaska Deployment and Guide to Resources for Lower-48 Deployment

    NASA Astrophysics Data System (ADS)

    Busby, R. W.; Woodward, R.; Aderhold, K.; Frassetto, A.

    2017-12-01

    The Alaska Transportable Array is now completely installed, totaling 280 stations, with 194 new stations and 86 existing stations, 28 of those upgraded with new sensor emplacement. We briefly summarize the deployment of this seismic network, describe the added meteorological instruments and soil temperature gauges, and review our expectations for operation and demobilization. Curation of data from the contiguous Lower-48 States deployment of the Transportable Array (>1800 stations, 2004-2015) has continued, with the few gaps in real-time data replaced by locally archived files as well as minor adjustments in metadata. We highlight station digests that provide more detail on the components and settings of individual stations, documentation of standard procedures used throughout the deployment, and other resources available online. In cooperation with the IRIS DMC, a copy of the complete TA archive for the Lower-48 period has been transferred to a local disk to experiment with data access and software workflows that utilize most or all of the seismic timeseries, in contrast to event segments. Assembling such large datasets reliably - from field stations to a well-managed data archive to a user's workspace - is complex. Sharing a curated and defined data volume with researchers is a potentially straightforward way to make data intensive analyses less difficult. We note that data collection within the Lower-48 continues with 160 stations of the N4 network operating at increased sample rates (100 sps) as part of the CEUSN, as operational support transitions from NSF to USGS.

  2. Observations and modeling of seismic background noise

    USGS Publications Warehouse

    Peterson, Jon R.

    1993-01-01

    The preparation of this report had two purposes. One was to present a catalog of seismic background noise spectra obtained from a worldwide network of seismograph stations. The other purpose was to refine and document models of seismic background noise that have been in use for several years. The second objective was, in fact, the principal reason that this study was initiated and influenced the procedures used in collecting and processing the data. With a single exception, all of the data used in this study were extracted from the digital data archive at the U.S. Geological Survey's Albuquerque Seismological Laboratory (ASL). This archive dates from 1972 when ASL first began deploying digital seismograph systems and collecting and distributing digital data under the sponsorship of the Defense Advanced Research Projects Agency (DARPA). There have been many changes and additions to the global seismograph networks during the past twenty years, but perhaps none as significant as the current deployment of very broadband seismographs by the U.S. Geological Survey (USGS) and the University of California San Diego (UCSD) under the scientific direction of the IRIS consortium. The new data acquisition systems have extended the bandwidth and resolution of seismic recording, and they utilize high-density recording media that permit the continuous recording of broadband data. The data improvements and continuous recording greatly benefit and simplify surveys of seismic background noise. Although there are many other sources of digital data, the ASL archive data were used almost exclusively because of accessibility and because the data systems and their calibration are well documented for the most part. Fortunately, the ASL archive contains high-quality data from other stations in addition to those deployed by the USGS. Included are data from UCSD IRIS/IDA stations, the Regional Seismic Test Network (RSTN) deployed by Sandia National Laboratories (SNL), and the TERRAscope network deployed by the California Institute of Technology in cooperation with other institutions. A map showing the approximate locations of the stations used in this study is provided in Figure 1. One might hope for a better distribution of stations in the southern hemisphere, especially Africa and South America, in order to look for regional variations in seismic noise (apart from the major differences between continental, coastal and island sites). Unfortunately, anyone looking for subtle regional variations in seismic noise is probably going to be disappointed by the spectral data presented in this report because much of the station data appear to be dominated by local disturbances caused by instrumental, environmental, cultural, or surf noise. Better instruments and better instrument siting, or a well-funded field program, will be needed before a global isoseismal noise map can be produced. However, by assembling a composite of background noise from a large network of stations, many of the local station variables are masked, and it is possible to create generalized spectral plots of Earth noise for hypothetical quiet and noisy station sites.

  3. Picture archiving and communication system--Part one: Filmless radiology and distance radiology.

    PubMed

    De Backer, A I; Mortelé, K J; De Keulenaer, B L

    2004-01-01

    Picture archiving and communication system (PACS) is a collection of technologies used to carry out digital medical imaging. PACS is used to digitally acquire medical images from the various modalities, such as computed tomography (CT), magnetic resonance imaging (MRI), ultrasound, and digital projection radiography. The image data and pertinent information are transmitted over networks to other, possibly remote, locations, where they may be displayed on computer workstations for soft copy viewing in multiple locations, thus permitting simultaneous consultations and almost instant reporting from radiologists at a distance. Data are secured and archived on digital media such as optical disks or tape, and may be automatically retrieved as necessary. Close integration with the hospital information system (HIS)--radiology information system (RIS) is critical for system functionality. Medical image management systems are maturing, providing access outside of the radiology department to images throughout the hospital via the Ethernet, at different hospitals, or from a home workstation if teleradiology has been implemented.

  4. Recent advances and plans in processing and geocoding of SAR data at the DFD

    NASA Technical Reports Server (NTRS)

    Noack, W.

    1993-01-01

    Because of the needs of future projects like ENVISAT and the experience gained with the current operational ERS-1 facilities, a radical change in synthetic aperture radar (SAR) processing scenarios can be expected in the coming years. At the German PAF, several new developments were initiated, driven mainly either by user needs or by system and operational constraints ('lessons learned'). The result will be a major simplification and unification of all computer systems used. In particular, the following changes are likely to be implemented at the German PAF: transcription before archiving, processing of all standard products with high throughput directly at the receiving stations, processing of special 'high-valued' products at the PAF, usage of a single type of processor hardware, implementation of a large and fast on-line data archive, and an improved and unified fast data network between the processing and archiving facilities. A short description of the current operational SAR facilities as well as the future implementations is given.

  5. High resolution global gridded data for use in population studies

    NASA Astrophysics Data System (ADS)

    Lloyd, Christopher T.; Sorichetta, Alessandro; Tatem, Andrew J.

    2017-01-01

    Recent years have seen substantial growth in openly available satellite and other geospatial data layers, which represent a range of metrics relevant to global human population mapping at fine spatial scales. The specifications of such data differ widely and therefore the harmonisation of data layers is a prerequisite to constructing detailed and contemporary spatial datasets which accurately describe population distributions. Such datasets are vital to measure impacts of population growth, monitor change, and plan interventions. To this end the WorldPop Project has produced an open access archive of 3 and 30 arc-second resolution gridded data. Four tiled raster datasets form the basis of the archive: (i) Viewfinder Panoramas topography clipped to Global ADMinistrative area (GADM) coastlines; (ii) a matching ISO 3166 country identification grid; (iii) country area; and (iv) a slope layer. Further layers include transport networks, landcover, nightlights, precipitation, travel time to major cities, and waterways. The datasets and production methodology are described here. The archive can be downloaded both from the WorldPop Dataverse Repository and the WorldPop Project website.

  6. High resolution global gridded data for use in population studies.

    PubMed

    Lloyd, Christopher T; Sorichetta, Alessandro; Tatem, Andrew J

    2017-01-31

    Recent years have seen substantial growth in openly available satellite and other geospatial data layers, which represent a range of metrics relevant to global human population mapping at fine spatial scales. The specifications of such data differ widely and therefore the harmonisation of data layers is a prerequisite to constructing detailed and contemporary spatial datasets which accurately describe population distributions. Such datasets are vital to measure impacts of population growth, monitor change, and plan interventions. To this end the WorldPop Project has produced an open access archive of 3 and 30 arc-second resolution gridded data. Four tiled raster datasets form the basis of the archive: (i) Viewfinder Panoramas topography clipped to Global ADMinistrative area (GADM) coastlines; (ii) a matching ISO 3166 country identification grid; (iii) country area; and (iv) a slope layer. Further layers include transport networks, landcover, nightlights, precipitation, travel time to major cities, and waterways. The datasets and production methodology are described here. The archive can be downloaded both from the WorldPop Dataverse Repository and the WorldPop Project website.

  7. TLALOCNet: A Continuous GPS-Met Array in Mexico for Seismotectonic and Atmospheric Research

    NASA Astrophysics Data System (ADS)

    Cabral-Cano, E.; Salazar-Tlaczani, L.; Galetzka, J.; DeMets, C.; Serra, Y. L.; Feaux, K.; Mattioli, G. S.; Miller, M. M.

    2015-12-01

    TLALOCNet is a network of continuous Global Positioning System (cGPS) and meteorology stations in Mexico for the interrogation of the earthquake cycle, tectonic processes, land subsidence, and atmospheric processes of Mexico. Once completed, TLALOCNet will span all of Mexico and will link existing GPS infrastructure in North America and the Caribbean, with the aim of creating a continuous, federated network of networks in the Americas. Phase 1 (2014-2015), funded by NSF and UNAM, is building and upgrading 30+ cGPS-Met sites to the high standard of the EarthScope Plate Boundary Observatory (PBO). Phase 2 (2016) will add ~25 more cGPS-Met stations to be funded through CONACyT. TLALOCNet provides open and freely available raw GPS data, GPS-PWV, surface meteorology measurements, and time series of daily positions, as well as a station velocity field to support a broad range of geoscience investigations. This is accomplished through the development of the TLALOCNet data center (http://tlalocnet.udg.mx), which serves as a collection and distribution point. The data center is based on UNAVCO's Dataworks-GSAC software and can work as part of UNAVCO's seamless archive for discovery, sharing, and access to data. The TLALOCNet data center also contains contributed data from several regional networks in Mexico. By using the same protocols and structure as the UNAVCO and other COCONet regional data centers, the geodetic community has the capability of accessing data from a large number of scientific and academically operated Mexican GPS sites. The archive provides a fully queryable and scriptable GPS and meteorological data retrieval point. Additionally, real-time 1 Hz streams from selected TLALOCNet stations are available in BINEX, RTCM 2.3, and RTCM 3.1 formats via the Networked Transport of RTCM via Internet Protocol (NTRIP).

  8. All Source Sensor Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    - PNNL, Harold Trease

    2012-10-10

    ASSA is a software application that processes binary data into summarized index tables that can be used to organize features contained within the data. ASSA's index tables can also be used to search for user-specified features. ASSA is designed to organize and search for patterns in unstructured binary data streams or archives, such as video, images, audio, and network traffic. In essence, ASSA is a general-purpose search engine for locating arbitrary patterns in binary data streams. It has uses in video analytics, image analysis, audio analysis, searching hard drives, and monitoring network traffic.

  9. Beginning with the Particular: Reimagining Professional Development as a Feminist Practice

    ERIC Educational Resources Information Center

    Schultz, Katherine

    2011-01-01

    This article analyzes the work of a long-term network of teachers, the Philadelphia Teachers Learning Cooperative, with a focus on their descriptive practices. Drawing on three years of ethnographic documentation of weekly meetings and a historical archive of meetings over 30 years, I characterize the teachers' knowledge about teaching and…

  10. Use of MCIDAS as an earth science information systems tool

    NASA Technical Reports Server (NTRS)

    Goodman, H. Michael; Karitani, Shogo; Parker, Karen G.; Stooksbury, Laura M.; Wilson, Gregory S.

    1988-01-01

    The application of the Man computer Interactive Data Access System (MCIDAS) to information processing is examined. The computer systems that interface with MCIDAS are discussed. Consideration is given to the computer networking of MCIDAS, database archival, and the collection and distribution of real-time special sensor microwave/imager data.

  11. Communication Within the Educational Research and Development Community--Suggested Steps Toward Further Structure and Order

    ERIC Educational Resources Information Center

    Nelson, Carnot E.

    1973-01-01

    Identifies two major problems--first, both the formal and informal communication networks are extremely diffuse; and second, the interval from the start of a piece of research until its integration into the archival body of scientific knowledge is long--and presents some suggestions for alleviating them. (Author/JM)

  12. Brooklyn Historical Society and the New York State Historical Documents Inventory, 1985-2007

    ERIC Educational Resources Information Center

    Pettit, Marilyn H.

    2008-01-01

    This article summarizes the New York State Historical Documents Inventory as experienced at Brooklyn Historical Society. The archives and manuscripts, dating from the seventeenth century and surveyed by the Historical Documents Inventory in the 1980s, were cataloged as Historical Documents Inventory/Research Libraries Information Network records…

  13. Research Library Issues: A Quarterly Report from ARL, CNI, and SPARC. RLI 279

    ERIC Educational Resources Information Center

    Baughman, M. Sue, Ed.

    2012-01-01

    "Research Library Issues" ("RLI") is a quarterly report from ARL (Association of Research Libraries), CNI (Coalition of Networked Information), and SPARC (Scholarly Publishing and Academic Resources Coalition). This issue includes the following articles: (1) Digitization of Special Collections and Archives: Legal and Contractual Issues (Peter B.…

  14. Web Camera Use in Developing Biology, Molecular Biology and Biochemistry Laboratories

    ERIC Educational Resources Information Center

    Ogren, Paul J.; Deibel, Michael; Kelly, Ian; Mulnix, Amy B.; Peck, Charlie

    2004-01-01

    The use of a network-ready color camera is described which is primarily marketed as a security device and is used for experiments in developmental biology, genetics and biochemistry laboratories and in special student research projects. Acquiring, analyzing, and archiving project images is very important in microscopy, electrophoresis and…

  15. On-line access to remote sensing data with the satellite-data information system (ISIS)

    NASA Astrophysics Data System (ADS)

    Strunz, G.; Lotz-Iwen, H.-J.

    1994-08-01

    The German Remote Sensing Data Center (DFD) is developing the satellite-data information system ISIS as the central interface for users to access Earth observation data. ISIS has been designed to support international scientific research as well as operational applications by offering online database access via public networks, and it is integrated into the international activities dedicated to catalogue and archive interoperability. A prototype of ISIS is already in use within the German Processing and Archiving Facility for ERS-1 for the storage and retrieval of digital SAR quicklook products and for the Radarmap of Germany. Operational status is envisaged by the launch of ERS-2. This paper describes the underlying concepts of ISIS and the current state of its realization. It explains the overall structure of the system and the functionality of each of its components. Emphasis is put on the description of the advisory system, the catalogue retrieval, and the online access and transfer of image data. Finally, the integration into a future global environmental data network is outlined.

  16. Accessing northern California earthquake data via Internet

    NASA Astrophysics Data System (ADS)

    Romanowicz, Barbara; Neuhauser, Douglas; Bogaert, Barbara; Oppenheimer, David

    The Northern California Earthquake Data Center (NCEDC) provides easy access to central and northern California digital earthquake data. It is located at the University of California, Berkeley, is operated jointly with the U.S. Geological Survey (USGS) in Menlo Park, Calif., and is funded by the University of California and the National Earthquake Hazard Reduction Program. It has been accessible to users in the scientific community through the Internet since mid-1992. The data center provides an on-line archive for parametric and waveform data from two regional networks: the Northern California Seismic Network (NCSN) operated by the USGS and the Berkeley Digital Seismic Network (BDSN) operated by the Seismographic Station at the University of California, Berkeley.

  17. Georgia's Stream-Water-Quality Monitoring Network, 2006

    USGS Publications Warehouse

    Nobles, Patricia L.; ,

    2006-01-01

    The USGS stream-water-quality monitoring network for Georgia is an aggregation of smaller networks and individual monitoring stations that have been established in cooperation with Federal, State, and local agencies. These networks collectively provide data from 130 sites, 62 of which are monitored continuously in real time using specialized equipment that transmits the data via satellite to a centralized location for processing and storage. These data are made available on the Web in near real time at http://waterdata.usgs.gov/ga/nwis/. Ninety-eight stations are sampled periodically for a more extensive suite of chemical and biological constituents that require laboratory analysis. Both the continuous and the periodic water-quality data are archived and maintained in the USGS National Water Information System and are available to cooperators, water-resource managers, and the public.

  18. Web Services and Data Enhancements at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Zuzlewski, S.; Lombard, P. N.; Allen, R. M.

    2013-12-01

    The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through web services. NCEDC web services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real time rather than through batch or email-based requests. Data are returned to the user in the appropriate format, such as XML, RESP, simple text, or MiniSEED, depending on the service and the selected output format. The NCEDC offers the following web services that are compliant with the International Federation of Digital Seismograph Networks (FDSN) web services specifications: (1) fdsn-dataselect: time series data delivered in MiniSEED format, (2) fdsn-station: station and channel metadata and time series availability delivered in StationXML format, (3) fdsn-event: earthquake event information delivered in QuakeML format. In addition, the NCEDC offers the following IRIS-compatible web services: (1) sacpz: provides channel gains, poles, and zeros in SAC format, (2) resp: provides channel response information in RESP format, (3) dataless: provides station and channel metadata in Dataless SEED format. The NCEDC is also developing a web service to deliver time series from pre-assembled event waveform gathers. The NCEDC has waveform gathers for ~750,000 northern and central California events from 1984 to the present, many of which were created by the USGS NCSN prior to the establishment of the joint NCSS (Northern California Seismic System). We are currently adding waveforms to these older event gathers with time series from the UCB networks and other networks with waveforms archived at the NCEDC, and ensuring that each channel in the event gathers has the highest-quality waveform from the archive.
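
    Because the services follow the FDSN specification, they can be exercised with plain HTTP requests. The sketch below uses base R; the host name is an assumption to be checked against current NCEDC documentation, while the query parameters follow the FDSN web service specification.

        # Fetch ten minutes of MiniSEED via fdsn-dataselect (endpoint URL assumed)
        url <- paste0(
          "https://service.ncedc.org/fdsnws/dataselect/1/query",
          "?net=BK&sta=CMB&loc=--&cha=BHZ",
          "&starttime=2013-01-01T00:00:00&endtime=2013-01-01T00:10:00"
        )
        download.file(url, destfile = "cmb.mseed", mode = "wb")

        # Channel-level metadata via fdsn-station, returned as StationXML
        meta <- paste0(
          "https://service.ncedc.org/fdsnws/station/1/query",
          "?net=BK&sta=CMB&level=channel"
        )
        download.file(meta, destfile = "cmb.xml")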

  19. Intelligent Systems Technologies and Utilization of Earth Observation Data

    NASA Technical Reports Server (NTRS)

    Ramapriyan, H. K.; McConaughy, G. R.; Morse, H. S.

    2004-01-01

    The addition of raw data and derived geophysical parameters from several Earth observing satellites over the last decade to the data held by NASA data centers has created a data-rich environment for the Earth science research and applications communities. The data products are being distributed to a large and diverse community of users. Due to advances in computational hardware, networks and communications, information management, and software technologies, significant progress has been made in the last decade in archiving and providing data to users. However, to realize the full potential of the growing data archives, further progress is necessary in the transformation of data into information, and information into knowledge that can be used in particular applications. Sponsored by NASA's Intelligent Systems Project within the Computing, Information and Communication Technology (CICT) Program, a conceptual architecture study has been conducted to examine ideas to improve data utilization through the addition of intelligence into the archives in the context of an overall knowledge building system (KBS). Potential Intelligent Archive concepts include: 1) Mining archived data holdings to improve metadata to facilitate data access and usability; 2) Building intelligence about transformations on data, information, knowledge, and accompanying services; 3) Recognizing the value of results, indexing and formatting them for easy access; 4) Interacting as a cooperative node in a web of distributed systems to perform knowledge building; and 5) Being aware of other nodes in the KBS, participating in open systems interfaces and protocols for virtualization, and achieving collaborative interoperability.

  20. Policies and Procedures for Accessing Archived NASA Lunar Data via the Web

    NASA Technical Reports Server (NTRS)

    James, Nathan L.; Williams, David R.

    2011-01-01

    The National Space Science Data Center (NSSDC) was established by NASA to provide for the preservation and dissemination of scientific data from NASA missions. This paper describes the policies specifically related to lunar science data. NSSDC presently archives 660 lunar data collections. Most of these data (423 units) are stored offline in analog format. The remainder of this collection consists of magnetic tapes and discs containing approximately 1.7 TB of digital lunar data. The active archive for NASA lunar data is the Planetary Data System (PDS). NSSDC has an agreement with the PDS Lunar Data Node to assist in the restoration and preparation of NSSDC-resident lunar data upon request for access and distribution via the PDS archival system. Though much of NSSDC's digital store also resides in PDS, NSSDC has many analog data collections and some digital lunar data sets that are not in PDS. NSSDC stands ready to make these archived lunar data accessible to both the research community and the general public upon request as resources allow. Newly requested offline lunar data are digitized and moved to near-line storage devices called digital linear tape jukeboxes. The data are then packaged and made network-accessible via FTP for the convenience of a growing segment of the user community. This publication will 1) discuss the NSSDC processes and policies that govern how NASA lunar data is preserved, restored, and made accessible via the web and 2) highlight examples of special lunar data requests.

  1. Geodetic Seamless Archive Centers Modernization - Information Technology for Exploiting the Data Explosion

    NASA Astrophysics Data System (ADS)

    Boler, F. M.; Blewitt, G.; Kreemer, C. W.; Bock, Y.; Noll, C. E.; McWhirter, J.; Jamason, P.; Squibb, M. B.

    2010-12-01

    Space geodetic science and other disciplines using geodetic products have benefited immensely from the open sharing of data and metadata from global and regional archives. Ten years ago, Scripps Orbit and Permanent Array Center (SOPAC), the NASA Crustal Dynamics Data Information System (CDDIS), UNAVCO, and other archives collaborated to create the GPS Seamless Archive Centers (GSAC) in an effort to further enable research with the expanding collections of GPS data then becoming available. The GSAC partners share metadata to facilitate data discovery and mining across participating archives and the distribution of data to users. This effort was pioneering, but it was built on technology that has since been rendered obsolete. As the number of geodetic observing technologies has expanded, the variety of data and data products has grown dramatically, exposing limitations in data product sharing. Through a NASA ROSES project, the three archives (CDDIS, SOPAC, and UNAVCO) have been funded to expand the original GSAC capability to multiple geodetic observation types and to simultaneously modernize the underlying technology by implementing web services. The University of Nevada, Reno (UNR) will test the web services implementation by incorporating them into its daily GNSS data processing scheme. The effort will include new methods for quality control of current and legacy data that will be a product of the analysis/testing phase performed by UNR. The quality analysis by UNR will include a report on the stability of station coordinates over time that will enable data users to select sites suitable for their application, for example identifying stations with large seasonal effects. This effort will contribute to an enhanced ability for very large networks to obtain complete data sets for processing.

  2. A fixed-memory moving, expanding window for obtaining scatter corrections in X-ray CT and other stochastic averages

    NASA Astrophysics Data System (ADS)

    Levine, Zachary H.; Pintar, Adam L.

    2015-11-01

    A simple algorithm for averaging a stochastic sequence of 1D arrays in a moving, expanding window is provided. The samples are grouped in bins which increase exponentially in size so that a constant fraction of the samples is retained at any point in the sequence. The algorithm is shown to have particular relevance for a class of Monte Carlo sampling problems which includes one characteristic of iterative reconstruction in computed tomography. The code is available in the CPC program library in both Fortran 95 and C and is also available in R through CRAN.
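
    The record does not reproduce the published code, but the binning idea can be illustrated with a toy R sketch (scalar samples rather than 1D arrays). Everything below is an illustrative assumption rather than the CPC library implementation: bins double in size toward the old end so memory stays logarithmic in the sequence length, and the oldest bin is dropped once it would dominate the retained window, so a roughly constant fraction of the samples is kept.

        make_window <- function() {
          w <- new.env()
          w$sums <- numeric(0)     # per-bin running sums, oldest bin first
          w$counts <- numeric(0)   # per-bin sample counts
          w
        }

        push <- function(w, x) {
          # append the new sample as its own bin of size 1
          w$sums <- c(w$sums, x); w$counts <- c(w$counts, 1)
          # merge adjacent equal-sized bins (binary-counter style), so bin
          # sizes grow as powers of two toward the old end: O(log n) memory
          i <- length(w$counts)
          while (i > 1 && w$counts[i] == w$counts[i - 1]) {
            w$sums[i - 1] <- w$sums[i - 1] + w$sums[i]
            w$counts[i - 1] <- w$counts[i - 1] + w$counts[i]
            w$sums <- w$sums[-i]; w$counts <- w$counts[-i]
            i <- i - 1
          }
          # 'move' the window: drop the oldest bin once it holds more than
          # half of the retained samples, keeping a constant retained fraction
          if (length(w$counts) > 1 && w$counts[1] > sum(w$counts[-1])) {
            w$sums <- w$sums[-1]; w$counts <- w$counts[-1]
          }
          invisible(w)
        }

        window_mean <- function(w) sum(w$sums) / sum(w$counts)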

  3. High-performance mass storage system for workstations

    NASA Technical Reports Server (NTRS)

    Chiang, T.; Tang, Y.; Gupta, L.; Cooperman, S.

    1993-01-01

    Reduced Instruction Set Computer (RISC) workstations and Personal Computers (PCs) are very popular tools for office automation, command and control, scientific analysis, database management, and many other applications. However, when running Input/Output (I/O) intensive applications, RISC workstations and PCs are often overburdened with the tasks of collecting, staging, storing, and distributing data. Even with standard high-performance peripherals and storage devices, the I/O function can still be a common bottleneck. Therefore, the high-performance mass storage system, developed by Loral AeroSys' Independent Research and Development (IR&D) engineers, can offload a RISC workstation from I/O-related functions and provide high-performance I/O functions and external interfaces. The high-performance mass storage system has the capability to ingest high-speed real-time data, perform signal or image processing, and stage, archive, and distribute the data. This mass storage system uses a hierarchical storage structure, thus reducing the total data storage cost while maintaining high I/O performance. The high-performance mass storage system is a network of low-cost parallel processors and storage devices. The nodes in the network have special I/O functions such as: SCSI controller, Ethernet controller, gateway controller, RS232 controller, IEEE488 controller, and digital/analog converter. The nodes are interconnected through high-speed direct memory access links to form a network. The topology of the network is easily reconfigurable to maximize system throughput for various applications. This high-performance mass storage system takes advantage of a 'busless' architecture for maximum expandability. The mass storage system consists of magnetic disks, a WORM optical disk jukebox, and an 8-mm helical-scan tape to form a hierarchical storage structure. Commonly used files are kept on magnetic disk for fast retrieval. The optical disks are used as archive media, and the tapes are used as backup media. The storage system is managed by the UniTree software package, which is based on the IEEE mass storage reference model. UniTree keeps track of all files in the system, automatically migrates lesser-used files to archive media, and stages the files when needed by the system. The user can access files without knowledge of their physical location. The high-performance mass storage system developed by Loral AeroSys significantly boosts system I/O performance and reduces the overall data storage cost. This storage system provides a highly flexible and cost-effective architecture for a variety of applications (e.g., real-time data acquisition with signal and image processing requirements, long-term data archiving and distribution, and image analysis and enhancement).

  4. Monitoring of stability of ASG-EUPOS network coordinates

    NASA Astrophysics Data System (ADS)

    Figurski, M.; Szafranek, K.; Wrona, M.

    2009-04-01

    ASG-EUPOS (Active Geodetic Network - European Position Determination System) is the national system for precise satellite positioning in Poland, which increases the density of regional and global GNSS networks and is widely used by public administration, national institutions, entrepreneurs, and citizens (especially surveyors). In the near future, ASG-EUPOS is to take on the role of the main national network, so control of the proper operation of the stations and of the realization of ETRS'89 is a necessity. Users of the system need to be sure that observation quality and coordinate accuracy are sufficiently high. Coordinates of IGS (International GNSS Service) and EPN (European Permanent Network) stations are precisely determined and any changes are monitored continuously, and observations are verified before they are archived in regional and global databases. The same applies to ASG-EUPOS. This paper concerns the standardization of GNSS observations from different stations (uniform adjustment), the examination of solution correctness according to IGS and EPN standards, and the stability of solutions and site activity.

  5. Analog-to-digital clinical data collection on networked workstations with graphic user interface.

    PubMed

    Lunt, D

    1991-02-01

    An innovative respiratory examination system has been developed that combines physiological response measurement, real-time graphic displays, user-driven operating sequences, and networked file archiving and review into a scientific research and clinical diagnosis tool. This newly constructed computer network is being used to enhance the research center's ability to perform patient pulmonary function examinations. Respiratory data are simultaneously acquired and graphically presented during patient breathing maneuvers and rapidly transformed into graphic and numeric reports, suitable for statistical analysis or database access. The environment consists of the hardware (Macintosh computer, MacADIOS converters, analog amplifiers), the software (HyperCard v2.0, HyperTalk, XCMDs), and the network (AppleTalk, fileservers, printers) as building blocks for data acquisition, analysis, editing, and storage. System operation modules include: Calibration, Examination, Reports, On-line Help Library, Graphic/Data Editing, and Network Storage.

  6. TLALOCNet continuous GPS-Met Array in Mexico supporting the 2017 NAM GPS Hydrometeorological Network.

    NASA Astrophysics Data System (ADS)

    Cabral-Cano, E.; Salazar-Tlaczani, L.; Adams, D. K.; Vivoni, E. R.; Grutter, M.; Serra, Y. L.; DeMets, C.; Galetzka, J.; Feaux, K.; Mattioli, G. S.; Miller, M. M.

    2017-12-01

    TLALOCNet is a network of continuous GPS and meteorology stations in Mexico to study atmospheric and solid earth processes. This recently completed network spans most of Mexico with a strong coverage emphasis on southern and western Mexico. The network, funded by NSF, CONACyT, and UNAM, recently built 40 cGPS-Met sites to EarthScope Plate Boundary Observatory standards and upgraded 25 additional GPS stations. TLALOCNet provides open and freely available raw GPS data, high-frequency surface meteorology measurements, and time series of daily positions. This is accomplished through the development of the TLALOCNet data center (http://tlalocnet.udg.mx), which serves as a collection and distribution point. The data center is based on UNAVCO's Dataworks-GSAC software and also works as part of UNAVCO's seamless archive for discovery, sharing, and access to GPS data. The TLALOCNet data center also contains contributed data from several regional GPS networks in Mexico, for a total of 100+ stations. By using the same protocols and structure as the UNAVCO and other COCONet regional data centers, the scientific community has the capability of accessing data from the largest Mexican GPS network. The archive provides a fully queryable and scriptable GPS and meteorological data retrieval point. In addition, real-time 1 Hz streams from selected TLALOCNet stations are available in BINEX, RTCM 2.3, and RTCM 3.1 formats via the Networked Transport of RTCM via Internet Protocol (NTRIP) for real-time seismic and weather forecasting applications. TLALOCNet served as a GPS-Met backbone for the binational Mexico-US North American Monsoon GPS Hydrometeorological Network 2017 campaign experiment. This innovative experiment attempts to address water vapor source regions and land-surface water vapor flux contributions to precipitation (i.e., moisture recycling) during the 2017 North American Monsoon in Baja California, Sonora, Chihuahua, and Arizona. Models suggest that moisture recycling is a large contributor to summer rainfall. This experiment represents a first attempt to quantify the surface water vapor flux contribution to GPS-derived precipitable water vapor. Preliminary results from this campaign are presented.

  7. Digital information management: a progress report on the National Digital Mammography Archive

    NASA Astrophysics Data System (ADS)

    Beckerman, Barbara G.; Schnall, Mitchell D.

    2002-05-01

    Digital mammography creates very large images, which require new approaches to storage, retrieval, management, and security. The National Digital Mammography Archive (NDMA) project, funded by the National Library of Medicine (NLM), is developing a limited testbed that demonstrates the feasibility of a national breast imaging archive, with access to prior exams; patient information; computer aids for image processing, teaching, and testing tools; and security components to ensure confidentiality of patient information. There will be significant benefits to patients and clinicians in terms of accessible data with which to make a diagnosis and to researchers performing studies on breast cancer. Mammography was chosen for the project, because standards were already available for digital images, report formats, and structures. New standards have been created for communications protocols between devices, front-end portal and archive. NDMA is a distributed computing concept that provides for sharing and access across corporate entities. Privacy, auditing, and patient consent are all integrated into the system. Five sites, Universities of Pennsylvania, Chicago, North Carolina and Toronto, and BWXT Y12, are connected through high-speed networks to demonstrate functionality. We will review progress, including technical challenges, innovative research and development activities, standards and protocols being implemented, and potential benefits to healthcare systems.

  8. RESIF national datacentre : new features and forthcoming evolutions

    NASA Astrophysics Data System (ADS)

    Pequegnat, C.; Volcke, P.; le Tanou, J.; Wolyniec, D.; Lecointre, A.; Guéguen, P.

    2013-12-01

    RESIF is a nationwide French project aimed at building a high-quality system to observe and understand the Earth's interior. The goal is to create a network throughout mainland France comprising 750 seismometers and geodetic measurement instruments, 250 of which will be mobile, enabling the observation network to be focused on specific investigation subjects and geographic locations. The RESIF data distribution centre, part of the global project, is operated by the Université Joseph Fourier (Grenoble, France) and has been under implementation for two years. Data from the French broadband permanent network, the strong-motion permanent network, and mobile seismological antennas are freely accessible as real-time streams and continuous validated data, along with instrumental metadata, delivered using widely known formats and request tools. New features of the datacentre are: - new modern distribution tools: two FDSN web services have been implemented and deliver data and metadata. - new data and datasets: the number of permanent stations rose by over 40% in one year, and the RESIF archive now includes past data (back to 1995) and data from new networks. Moreover, data from mobile experiments prior to 2011 are progressively being released, and data from new mobile experiments in the Alps and in the Pyrenees are progressively being integrated. - new infrastructures: (1) the RESIF databank is about to be connected to the grid storage of the university High Performance Computing (HPC) centre. As a scientific use case of this datacentre facility, a focus is made on intensive exploitation of combined data from permanent and mobile networks. (2) The RESIF databank will be progressively hosted on a new shared storage facility operated by the Université Joseph Fourier. This infrastructure offers high-availability data storage (in both block and file modes) as well as backup and long-term archival capabilities, and will be fully operational at the beginning of 2014.

  9. Telecommunications issues of intelligent database management for ground processing systems in the EOS era

    NASA Technical Reports Server (NTRS)

    Touch, Joseph D.

    1994-01-01

    Future NASA earth science missions, including the Earth Observing System (EOS), will be generating vast amounts of data that must be processed and stored at various locations around the world. Here we present a stepwise refinement of the intelligent database management (IDM) architecture of the distributed active archive center (DAAC, one of seven regionally located EOSDIS archive sites), to showcase the telecommunications issues involved. We develop this architecture into a general overall design. We show that the current evolution of protocols is sufficient to support IDM at Gbps rates over large distances. We also show that the network design can accommodate a flexible data-ingestion storage pipeline and a user extraction and visualization engine, without interference between the two.

  10. dartr: An r package to facilitate analysis of SNP data generated from reduced representation genome sequencing.

    PubMed

    Gruber, Bernd; Unmack, Peter J; Berry, Oliver F; Georges, Arthur

    2018-05-01

    Although vast technological advances have been made and genetic software packages are growing in number, it is not a trivial task to analyse SNP data. We announce a new r package, dartr, enabling the analysis of single nucleotide polymorphism data for population genomic and phylogenomic applications. dartr provides user-friendly functions for data quality control and marker selection, and permits rigorous evaluations of conformance to Hardy-Weinberg equilibrium, gametic-phase disequilibrium and neutrality. The package reports standard descriptive statistics, permits exploration of patterns in the data through principal components analysis, conducts standard F-statistics as well as basic phylogenetic analyses, population assignment, and isolation by distance, and exports data to a variety of commonly used downstream applications (e.g., newhybrids, faststructure and phylogeny applications) outside of the r environment. The package serves two main purposes: first, it offers a user-friendly approach that lowers the hurdle to analysing such data; the package therefore comes with a detailed tutorial targeted at the r beginner to allow data analysis without requiring deep knowledge of r. Second, we use a single, well-established format (genlight from the adegenet package) as input for all our functions to avoid data reformatting. By strictly using the genlight format, we hope to establish this format as the de facto standard of future software developments and hence reduce the format jungle of genetic data sets. The dartr package is available via the r CRAN network and GitHub. © 2017 John Wiley & Sons Ltd.
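
    A skeletal dartr session is sketched below. The function names follow the package documentation, but the exact signatures vary between versions and the input file name is a placeholder, so treat the details as assumptions.

        library(dartr)

        # Read DArT SNP genotypes into a genlight object (file name is a placeholder)
        gl <- gl.read.dart(filename = "SNP_data.csv")

        # Basic quality control: drop loci with call rate below 95%
        gl <- gl.filter.callrate(gl, method = "loc", threshold = 0.95)

        # Explore structure with a principal coordinates analysis
        pc <- gl.pcoa(gl)
        gl.pcoa.plot(pc, gl)

        # Export for a downstream tool outside R (e.g., faststructure)
        gl2faststructure(gl, outfile = "gl.fs")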

  11. Warfighter Visualizations Compilations

    DTIC Science & Technology

    2013-05-01

    list of the user's favorite websites or other textual content, sub-categorized into types, such as blogs, social networking sites, comics, videos...available: The example in the prototype shows a random archived comic from the website. Other options include thumbnail strips of imagery or dynamic...varied, and range from serving as statistical benchmarks, for increasing social consciousness and interaction, for improving educational interactions

  12. Situated Knowledge and Visual Education: Patrick Geddes and Reclus's Geography (1886-1932)

    ERIC Educational Resources Information Center

    Ferretti, Federico

    2017-01-01

    This article addresses Patrick Geddes's relationship with geography and visual education by focusing on his collaboration with the network of the anarchist geographers Élie, Élisée, and Paul Reclus. Drawing on empirical archival research, it contributes to the current debates on geographies of anarchist education and on geographic teaching. The…

  13. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, E. C. (Editor)

    1989-01-01

    Archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA) are presented. Activities of the Deep Space Network (DSN) and its associated Ground Communications Facility (GCF) related to DSN advanced systems, systems implementation, and DSN operations are addressed. In addition, recent developments in the NASA SETI (Search for Extraterrestrial Intelligence) sky survey are summarized.

  14. Faculty Recommendations for Web Tools: Implications for Course Management Systems

    ERIC Educational Resources Information Center

    Oliver, Kevin; Moore, John

    2008-01-01

    A gap analysis of web tools in Engineering was undertaken as one part of the Digital Library Network for Engineering and Technology (DLNET) grant funded by NSF (DUE-0085849). DLNET represents a Web portal and an online review process to archive quality knowledge objects in Engineering and Technology disciplines. The gap analysis coincided with the…

  15. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Yuen, Joseph H. (Editor)

    1994-01-01

    This quarterly publication provides archival reports on developments in programs in space communications, radio navigation, radio science, and ground-based radio and radar astronomy. It reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standardization activities at the Jet Propulsion Laboratory for space data and information systems.

  16. Enterprise-class Digital Imaging and Communications in Medicine (DICOM) image infrastructure.

    PubMed

    York, G; Wortmann, J; Atanasiu, R

    2001-06-01

    Most current picture archiving and communication systems (PACS) are designed for a single department or a single modality. Few PACS installations have been deployed that support the needs of the hospital or the entire Integrated Delivery Network (IDN). The authors propose a new image management architecture that can support a large, distributed enterprise.

  17. Making and Missing Connections: Exploring Twitter Chats as a Learning Tool in a Preservice Teacher Education Course

    ERIC Educational Resources Information Center

    Hsieh, Betina

    2017-01-01

    Research on social media use in education indicates that network-based connections can enable powerful teacher learning opportunities. Using a connectivist theoretical framework (Siemens, 2005), this study focuses on secondary teacher candidates (TCs) who completed, archived, and reflected upon 1-hour Twitter chats (N = 39) to explore the promise…

  18. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, E. C. (Editor)

    1983-01-01

    Archival reports on developments in programs managed by JPL's office of Telecommunications and Data Acquisition (TDA) are presented. In space communications, radio navigation, radio science, and ground-based radio astronomy, it reports on activities of the Deep Space Network (DSN) and its associated Ground Communications Facility (GCF) in planning, in supporting research and technology, in implementation, and in operations.

  19. Power and Energy: Geopolitical Aspects of the Transnational Natural Gas Pipelines from the Caspian Sea Basin to Europe

    DTIC Science & Technology

    2010-06-01

    corridor as the "Silk Road." On this trade-road network, merchants, missionaries, and conquistadors carried silk, gems, pottery, tea, paper, medicines...2010). Atwell, Kyle. "Yanukovich: Ukraine will be a ridge between East and West" Atlantic Review, Feb 19, 2010, http://atlanticreview.org/archives

  20. Observing Ocean Ecosystems with Sonar

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matzner, Shari; Maxwell, Adam R.; Ham, Kenneth D.

    2016-12-01

    We present a real-time processing system for sonar to detect and track animals, and to extract water column biomass statistics in order to facilitate continuous monitoring of an underwater environment. The Nekton Interaction Monitoring System (NIMS) is built to connect to an instrumentation network, where it consumes a real-time stream of sonar data and archives tracking and biomass data.

  1. Data archiving and network system of Bisei Spaceguard center

    NASA Astrophysics Data System (ADS)

    Terazono, J.-Y.; Asami, A.; Asher, D.; Hashimoto, N.; Nakano, S.; Nishiyama, K.; Oshima, Y.; Umehara, H.; Urata, T.; Yoshikawa, M.; Isobe, S.

    2002-09-01

    Bisei Spaceguard Center, Japan's first facility for observations of space debris and Near-Earth Objects (NEOs), will produce large amounts of data. In this paper, we describe details of the data transfer and processing system we are now developing. Also we present a software system devoted to the discovery of asteroids mainly by high school students.

  2. Earthquake Monitoring: SeisComp3 at the Swiss National Seismic Network

    NASA Astrophysics Data System (ADS)

    Clinton, J. F.; Diehl, T.; Cauzzi, C.; Kaestli, P.

    2011-12-01

    The Swiss Seismological Service (SED) has an ongoing responsibility to improve the seismicity monitoring capability for Switzerland. This is a crucial issue for a country with low background seismicity but where a large M6+ earthquake is expected in the next decades. With over 30 stations at a spacing of ~25 km, the SED operates one of the densest broadband networks in the world, which is complemented by ~50 real-time strong-motion stations. The strong-motion network is expected to grow by an additional ~80 stations over the next few years. Furthermore, the backbone of the network is complemented by broadband data from surrounding countries and temporary sub-networks for local monitoring of microseismicity (e.g., at geothermal sites). The variety of seismic monitoring responsibilities as well as the anticipated densification of our network demands highly flexible processing software. We are transitioning all software to the SeisComP3 (SC3) framework. SC3 is a fully featured automated real-time earthquake monitoring software package developed by GeoForschungsZentrum Potsdam in collaboration with its commercial partner, gempa GmbH. It is at its core open source, and it is becoming a community-standard software for earthquake detection and waveform processing for regional and global networks across the globe. SC3 was originally developed for regional and global rapid monitoring of potentially tsunamigenic earthquakes. In order to fulfill the requirements of a local network recording moderate seismicity, the SED has tuned configurations and added several modules. In this contribution, we present our SC3 implementation strategy, focusing on the detection and identification of seismicity on different scales. We operate several parallel processing "pipelines" to detect and locate local, regional, and global seismicity. Additional pipelines with lower detection thresholds can be defined to monitor seismicity within dense subnets of the network. To be consistent with existing processing procedures, the NonLinLoc algorithm was implemented for manual and automatic locations using 1D and 3D velocity models; plugins for improved automatic phase picking and Ml computation were developed; and the graphical user interface for manual review was extended (including pick uncertainty definition, first-motion focal mechanisms, interactive review of station magnitude waveforms, and full inclusion of strong-motion data). SC3 locations are fully compatible with those derived from the existing in-house processing tools and are stored in a database derived from the QuakeML data model. The database is shared with the SED alerting software, which merges origins from both SC3 and external sources in real time and handles the alerting procedure. With the monitoring software transitioned to SeisComP3, acquisition, archival, and dissemination of SED waveform data now conform to the seedlink and ArcLink protocols, and continuous archives can be accessed via the SED and all EIDA (European Integrated Data Archives) web sites. Further, an SC3 module for waveform parameterisation has been developed, allowing rapid computation of peak values of ground motion and other engineering parameters within minutes of a new event. An output of this module is USGS ShakeMap XML.

  3. EpiContactTrace: an R-package for contact tracing during livestock disease outbreaks and for risk-based surveillance

    PubMed Central

    2014-01-01

    Background During outbreaks of livestock disease, contact tracing can be an important part of disease control. Animal movements can also be of relevance for risk-based surveillance and sampling, i.e. both when assessing the consequences of introduction and the likelihood of introduction. In many countries, animal movement data are collected with one of the major objectives being to enable contact tracing. However, an analytical step is often needed to retrieve appropriate information for contact tracing or surveillance. Results In this study, an open source tool was developed to structure livestock movement data to facilitate contact tracing in real time during disease outbreaks and to provide input for risk-based surveillance and sampling. The tool, EpiContactTrace, was written in the R language and uses the network parameters in-degree, out-degree, ingoing contact chain and outgoing contact chain (also called infection chain), which are relevant for forward and backward tracing respectively. The time-frames for backward and forward tracing can be specified independently, and the search can be done for one farm at a time or for all farms within the dataset. Different outputs are available: datasets with network measures, contacts visualised on a map, and automatically generated reports for each farm in either HTML or PDF format intended for the end-users, i.e. the veterinary authorities, regional disease control officers and field veterinarians. EpiContactTrace is available as an R-package at the R-project website (http://cran.r-project.org/web/packages/EpiContactTrace/). Conclusions We believe this tool can help in disease control since it can rapidly structure essential contact information from large datasets. The reproducible reports make this tool robust and independent of manual compilation of data. The open source code makes it accessible and easily adaptable for different needs. PMID:24636731
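
    A minimal usage sketch in R, assuming the interface described in the package documentation (a movements data frame with columns source, destination and t, and the Trace and NetworkSummary functions):

      # Toy movement records: animals moved between four holdings.
      library(EpiContactTrace)
      movements <- data.frame(
        source      = c("A", "B", "B", "C"),
        destination = c("B", "C", "D", "D"),
        t           = as.Date(c("2019-01-02", "2019-01-10",
                                "2019-01-15", "2019-01-20"))
      )
      # Backward and forward tracing from holding "B" over a 90-day
      # window ending at tEnd; the summary reports in/out-degree and
      # the ingoing/outgoing contact chains.
      ct <- Trace(movements, root = "B", tEnd = "2019-03-01", days = 90)
      NetworkSummary(movements, root = "B", tEnd = "2019-03-01", days = 90)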

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Udoeyop, Akaninyene W; Schlicher, Bob G

    This work examines a scientometric model that tracks the emergence of an identified technology from initial discovery (via original scientific and conference literature), through critical discoveries (via original scientific and conference literature and patents), transitioning through Technology Readiness Levels (TRLs) and ultimately on to commercial application. During the period of innovation and technology transfer, the impacts of scholarly works, patents and on-line web news sources are identified. As trends develop, currency of citations, collaboration indicators, and on-line news patterns are identified. The combinations of four distinct and separate searchable on-line networked sources (i.e., scholarly publications and citations, patents, news archives, and online mapping networks) are assembled to become one collective network (a dataset for analysis of relations). This established network becomes the basis from which to quickly analyze the temporal flow of activity (searchable events) for the example subject domain we investigated.

  5. BOREAS AFM-5 Level-1 Upper Air Network Data

    NASA Technical Reports Server (NTRS)

    Barr, Alan; Hrynkiw, Charmaine; Newcomer, Jeffrey A. (Editor); Hall, Forrest G. (Editor); Smith, David E. (Technical Monitor)

    2000-01-01

    The Boreal Ecosystem-Atmosphere Study (BOREAS) Airborne Fluxes and Meteorology (AFM)-5 team collected and processed data from the numerous radiosonde flights during the project. The goals of the AFM-05 team were to provide large-scale definition of the atmosphere by supplementing the existing Atmospheric Environment Service (AES) aerological network, both temporally and spatially. This data set includes basic upper-air parameters collected from the network of upper-air stations during the 1993, 1994, and 1996 field campaigns over the entire study region. The data are contained in tabular ASCII files. The level-1 upper-air network data are available from the Earth Observing System Data and Information System (EOSDIS) Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC). The data files also are available on a CD-ROM (see document number 20010000884).

  6. Data Management and Archiving - a Long Process

    NASA Astrophysics Data System (ADS)

    Gebauer, Petra; Bertelmann, Roland; Hasler, Tim; Kirchner, Ingo; Klump, Jens; Mettig, Nora; Peters-Kottig, Wolfgang; Rusch, Beate; Ulbricht, Damian

    2014-05-01

    Implementing policies for research data management, through to data archiving, at university institutions takes a long time. Even though most scientists, especially in the geosciences, are familiar with analyzing different sorts of data, presenting statistical results and writing publications sometimes based on big data records, only some of them manage their data in a standardized manner. Much more often they have learned how to measure and generate large volumes of data than to document these measurements and preserve them for the future. Changing staff and limited funding make this work more difficult, but it is essential in a progressively digital and networked world. Results from the project EWIG (translates to: developing workflow components for long-term archiving of research data in the geosciences), funded by the Deutsche Forschungsgemeinschaft, will help address these issues. Together with the project partners Deutsches GeoForschungsZentrum Potsdam and Konrad-Zuse-Zentrum für Informationstechnik Berlin, a workflow was developed to transfer continuously recorded data from a meteorological city monitoring network into a long-term archive. This workflow includes quality assurance of the data as well as description of metadata, and uses tools to prepare data packages for long-term archiving. It is intended as an exemplary model for other institutions working with similar data. The development of this workflow is closely intertwined with the educational curriculum at the Institut für Meteorologie. Designing modules to run quality checks on meteorological time series measured every minute and preparing metadata are tasks in current bachelor's theses. Students will also test the usability of the generated working environment. Based on these experiences, a practical guideline for integrating research data management in curricula, for postgraduates as well as for younger students, will be one of the results of this project. Especially at the beginning of a scientific career it is necessary to become familiar with all issues concerning data management. The outcomes of EWIG are intended to be generic enough to be easily adopted by other institutions. University lectures in meteorology have been started to teach future generations of scientists from the start how to deal with all sorts of different data in a transparent way. The progress of the project EWIG can be followed on the web via ewig.gfz-potsdam.de

  7. Recovery and archiving key Arctic Alaska vegetation map and plot data for the Arctic-Boreal Vulnerability Field Experiment (ABoVE)

    NASA Astrophysics Data System (ADS)

    Walker, D. A.; Breen, A. L.; Broderson, D.; Epstein, H. E.; Fisher, W.; Grunblatt, J.; Heinrichs, T.; Raynolds, M. K.; Walker, M. D.; Wirth, L.

    2013-12-01

    Abundant ground-based information will be needed to inform remote-sensing and modeling studies of NASA's Arctic-Boreal Vulnerability Experiment (ABoVE). A large body of plot and map data collected by the Alaska Geobotany Center (AGC) and collaborators from the Arctic regions of Alaska and the circumpolar Arctic over the past several decades is being archived and made accessible to scientists and the public via the Geographic Information Network of Alaska's (GINA's) 'Catalog' display and portal system. We are building two main types of data archives: Vegetation Plot Archive: For the plot information we use a Turboveg database to construct the Alaska portion of the international Arctic Vegetation Archive (AVA) http://www.geobotany.uaf.edu/ava/. High quality plot data and non-digital legacy datasets in danger of being lost have highest priority for entry into the archive. A key aspect of the database is the PanArctic Species List (PASL-1), developed specifically for the AVA to provide a standard of species nomenclature for the entire Arctic biome. A wide variety of reports, documents, and ancillary data are linked to each plot's geographic location. Geoecological Map Archive: This database includes maps and remote sensing products and links to other relevant data associated with the maps, mainly those produced by the Alaska Geobotany Center. Map data include GIS shape files of vegetation, land-cover, soils, landforms and other categorical variables and digital raster data of elevation, multispectral satellite-derived data, and data products and metadata associated with these. The map archive will contain all the information that is currently in the hierarchical Toolik-Arctic Geobotanical Atlas (T-AGA) in Alaska http://www.arcticatlas.org, plus several additions that are in the process of development and will be combined with GINA's already substantial holdings of spatial data from northern Alaska. The Geoecological Atlas Portal uses GINA's Catalog tool to develop a web interface to view and access the plot and map data. The mapping portal allows visualization of GIS data, sample-point locations and imagery and access to the map data. Catalog facilitates the discovery and dissemination of science-based information products in support of analysis and decision-making concerned with development and climate change and is currently used by GINA in several similar archive/distribution portals.

  8. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, E. C. (Editor)

    1992-01-01

    Archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA) are provided. In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, in supporting research and technology, in implementation, and in operations. Also included is standards activity at JPL for space data and information. In the search for extraterrestrial intelligence (SETI), the TDA Progress Report reports on implementation and operations for searching the microwave spectrum. Topics covered include tracking and ground-based navigation; communications, spacecraft-ground; station control and system technology; capabilities for new projects; network upgrade and sustaining; network operations and operations support; and TDA program management and analysis.

  9. BOREAS AFM-5 Level-2 Upper Air Network Standard Pressure Level Data

    NASA Technical Reports Server (NTRS)

    Barr, Alan; Hrynkiw, Charmaine; Hall, Forrest G. (Editor); Newcomer, Jeffrey A. (Editor); Smith, David E. (Technical Monitor)

    2000-01-01

    The BOREAS AFM-5 team collected and processed data from the numerous radiosonde flights during the project. The goals of the AFM-05 team were to provide large-scale definition of the atmosphere by supplementing the existing AES aerological network, both temporally and spatially. This data set includes basic upper-air parameters interpolated at 0.5 kiloPascal increments of atmospheric pressure from data collected from the network of upper-air stations during the 1993, 1994, and 1996 field campaigns over the entire study region. The data are contained in tabular ASCII files. The data files are available on a CD-ROM (see document number 20010000884) or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).

  10. [Research and implementation of the TLS network transport security technology based on DICOM standard].

    PubMed

    Lu, Xiaoqi; Wang, Lei; Zhao, Jianfeng

    2012-02-01

    With the development of medical informatics, Picture Archiving and Communication Systems (PACS), Hospital Information Systems/Radiology Information Systems (HIS/RIS) and other medical information management systems have become popular and well developed, and interoperability between these systems has become more frequent. These closed systems will therefore inevitably be opened up and regionalized by means of networks. As this trend unfolds, the security of information transmission may be the first problem to be solved. Based on this need for network security, we investigated the Digital Imaging and Communications in Medicine (DICOM) standard and the Transport Layer Security (TLS) protocol, and implemented TLS transmission of DICOM medical information with the OpenSSL and DCMTK toolkits.

  11. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Yuen, Joseph H. (Editor)

    1993-01-01

    This quarterly publication provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The papers included in this document cover satellite tracking and ground-based navigation, spacecraft-ground communications, and optical communication systems for the Deep Space Network.

  12. [Clinical pathology on the verge of virtual microscopy].

    PubMed

    Tolonen, Teemu; Näpänkangas, Juha; Isola, Jorma

    2015-01-01

    For more than 100 years, examinations of pathology specimens have relied on the light microscope. The technological progress of the last few years is enabling the digitizing of histologic specimen slides and the application of virtual microscopy in diagnostics. Virtual microscopy will facilitate consultation, and digital image analysis serves to enhance the level of diagnostics. Organizing and monitoring clinicopathological meetings will become easier. A digital archive of histologic specimens and the virtual microscopy network are expected to benefit training and research as well, particularly with regard to the Finnish biobank network, which is currently being established.

  13. Moving towards persistent identification in the seismological community

    NASA Astrophysics Data System (ADS)

    Quinteros, Javier; Evans, Peter; Strollo, Angelo; Ulbricht, Damian; Elger, Kirsten; Bertelmann, Roland

    2016-04-01

    The GEOFON data centre and others in the seismological community have been archiving seismic waveforms for many years. The amount of seismic data available continuously increases due to the use of higher sampling rates and the growing number of stations. In recent years, there has been a trend towards standardization of the protocols and formats to improve and homogenise access to these data [FDSN, 2013]. The seismological community has begun assigning a particular persistent identifier (PID), the Digital Object Identifier (DOI), to seismic networks as a first step towards properly and consistently attributing the use of data from seismic networks in scientific articles [Evans et al., 2015]. This was codified in a recommendation by the International Federation of Digital Seismograph Networks [FDSN, 2014]; DOIs for networks now appear in community web pages. However, our community, in common with other fields of science, still struggles with issues such as supporting reproducibility of results, providing proper attribution (data citation) for data sets, and measuring the impact (by tracking their use) of those data sets. Seismological data sets used for research are frequently created "on-the-fly" based on particular user requirements such as location or time period; users prepare requests to select subsets of the data held in seismic networks, and the data actually provided may even be held at many different data centres [EIDA, 2016]. These subsets also require careful citation. For persistency, a request must receive exactly the same data when repeated at a later time. However, if data are curated between requests, the data set delivered may differ, severely complicating the ability to reproduce a result. Transmission problems or configuration problems may also inadvertently modify the response to a request. With this in mind, our next step is the assignment of additional EPIC-PIDs to daily data files (currently over 28 million in the GEOFON archive) for use within the data centre. These will be used for replication and versioning of the data. This will support reproducible, fine-grained citation of seismic waveform data in a consistent fashion. Moreover, we plan to also create PIDs for collections of PIDs, in order to support the citation of a set of many data files with a single identifier. The technical information describing the instruments used to acquire the data and their location will most probably also be identified with a PID (to a StationXML record) and pointed to from the metadata of the waveform PID. StationXML will also include the DOI of the network for citation purposes. With all these elements, progress is made towards reproducibility and better attribution. References - EIDA (2016): European Integrated Data Archive (EIDA). http://www.orfeus-eu.org/eida/eida.html - Evans, P., Strollo, A., Clark, A., Ahern, T., Newman, R., Clinton, J. F., Pedersen, H., Pequegnat, C. (2015, online): Why Seismic Networks Need Digital Object Identifiers. Eos, Transactions American Geophysical Union, 96. http://doi.org/10.1029/2015EO036971 - International Federation of Digital Seismograph Networks (FDSN) (2013): FDSN Web Service Specifications, Version 1.1b, 2013/10/25. http://www.fdsn.org/webservices/FDSN-WS-Specifications-1.1.pdf - International Federation of Digital Seismograph Networks (FDSN) (2014): FDSN recommendations for seismic network DOIs and related FDSN services [WG3 recommendation]. http://doi.org/10.7914/D11596.
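
    To make the "on-the-fly" subset problem concrete, here is a hedged R sketch of a waveform request against an FDSN dataselect web service (URL pattern per the FDSN Web Service Specifications cited above; the GEOFON endpoint and the GE.APE example station are assumptions):

      # Request one hour of BHZ data for one station as miniSEED.
      url <- paste0(
        "https://geofon.gfz-potsdam.de/fdsnws/dataselect/1/query?",
        "network=GE&station=APE&channel=BHZ&",
        "starttime=2016-01-01T00:00:00&endtime=2016-01-01T01:00:00"
      )
      download.file(url, destfile = "subset.mseed", mode = "wb")
      # The same query may return different bytes after curation; a PID
      # minted for the delivered file set (as proposed above) pins the
      # exact data, not just the query parameters.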

  14. infoRAD: computers for clinical practice and education in radiology. Teleradiology, information transfer, and PACS: implications for diagnostic imaging in the 1990s.

    PubMed

    Schilling, R B

    1993-05-01

    Picture archiving and communication systems (PACS) provide image viewing at diagnostic, reporting, consultation, and remote workstations; archival on magnetic or optical media by means of short- or long-term storage devices; communications by means of local or wide area networks or public communication services; and integrated systems with modality interfaces and gateways to health care facilities and departmental information systems. Research indicates three basic needs for image and report management: (a) improved communication and turnaround time between radiologists and other imaging specialists and referring physicians, (b) fast reliable access to both current and previously obtained images and reports, and (c) space-efficient archival support. Although PACS considerations are much more complex than those associated with single modalities, the same basic purchase criteria apply. These criteria include technical leadership, image quality, throughput, life cost (eg, initial cost, maintenance, upgrades, and depreciation), and total service. Because a PACS takes much longer to implement than a single modality, the customer and manufacturer must develop a closer working relationship than has been necessary in the past.

  15. Astroinformatics as a New Research Field. UkrVO Astroinformation Resources: Tasks and Prospective

    NASA Astrophysics Data System (ADS)

    Vavilova, I. B.

    Data-oriented astronomy has allowed Astroinformatics to be classified as a new academic research field, which covers various multi-disciplinary applications of e-Astronomy. Among them are data modeling, data mining, metadata standards development, data access, digital astronomical databases, image archives and visualization, machine learning, statistics and other computational methods and software for work with astronomical surveys and catalogues with their tera- to peta-scale astroinformation resources. In this review we briefly describe the astroinformatics applications and software/services developed for different astronomical tasks in the framework of the VIrtual Roentgen and Gamma Observatory (VIRGO) and the Ukrainian Virtual Observatory (UkrVO). Among them are projects based on the archival space-borne data of X-ray and gamma space observatories and on the Joint Digitized Archive (JDA) database of astroplate network collections. The UkrVO JDA DR1 deals with star catalogues (FON, Polar zone, open clusters, GRB star fields), while the UkrVO JDA DR2 deals with Solar System bodies (giant and small planets, satellites, astronomical heritage images).

  16. High resolution global gridded data for use in population studies

    PubMed Central

    Lloyd, Christopher T.; Sorichetta, Alessandro; Tatem, Andrew J.

    2017-01-01

    Recent years have seen substantial growth in openly available satellite and other geospatial data layers, which represent a range of metrics relevant to global human population mapping at fine spatial scales. The specifications of such data differ widely, and therefore the harmonisation of data layers is a prerequisite to constructing detailed and contemporary spatial datasets which accurately describe population distributions. Such datasets are vital to measure the impacts of population growth, monitor change, and plan interventions. To this end the WorldPop Project has produced an open access archive of 3 and 30 arc-second resolution gridded data. Four tiled raster datasets form the basis of the archive: (i) Viewfinder Panoramas topography clipped to Global ADMinistrative area (GADM) coastlines; (ii) a matching ISO 3166 country identification grid; (iii) country area; and (iv) a slope layer. Further layers include transport networks, landcover, nightlights, precipitation, travel time to major cities, and waterways. The datasets and production methodology are described here. The archive can be downloaded both from the WorldPop Dataverse Repository and the WorldPop Project website. PMID:28140386

  17. Radio Synthesis Imaging - A High Performance Computing and Communications Project

    NASA Astrophysics Data System (ADS)

    Crutcher, Richard M.

    The National Science Foundation has funded a five-year High Performance Computing and Communications project at the National Center for Supercomputing Applications (NCSA) for the direct implementation of several of the computing recommendations of the Astronomy and Astrophysics Survey Committee (the "Bahcall report"). This paper is a summary of the project goals and a progress report. The project will implement a prototype of the next generation of astronomical telescope systems - remotely located telescopes connected by high-speed networks to very high performance, scalable architecture computers and on-line data archives, which are accessed by astronomers over Gbit/sec networks. Specifically, a data link has been installed between the BIMA millimeter-wave synthesis array at Hat Creek, California and NCSA at Urbana, Illinois for real-time transmission of data to NCSA. Data are automatically archived, and may be browsed and retrieved by astronomers using the NCSA Mosaic software. In addition, an on-line digital library of processed images will be established. BIMA data will be processed on a very high performance distributed computing system, with I/O, user interface, and most of the software system running on the NCSA Convex C3880 supercomputer or Silicon Graphics Onyx workstations connected by HiPPI to the high performance, massively parallel Thinking Machines Corporation CM-5. The very computationally intensive algorithms for calibration and imaging of radio synthesis array observations will be optimized for the CM-5 and new algorithms which utilize the massively parallel architecture will be developed. Code running simultaneously on the distributed computers will communicate using the Data Transport Mechanism developed by NCSA. The project will also use the BLANCA Gbit/s testbed network between Urbana and Madison, Wisconsin to connect an Onyx workstation in the University of Wisconsin Astronomy Department to the NCSA CM-5, for development of long-distance distributed computing. Finally, the project is developing 2D and 3D visualization software as part of the international AIPS++ project. This research and development project is being carried out by a team of experts in radio astronomy, algorithm development for massively parallel architectures, high-speed networking, database management, and Thinking Machines Corporation personnel. The development of this complete software, distributed computing, and data archive and library solution to the radio astronomy computing problem will advance our expertise in high performance computing and communications technology and the application of these techniques to astronomical data processing.

  18. Leveraging Educational, Research and Facility Expertise to Improve Global Seismic Monitoring: Preparing a Guide on Sustainable Networks

    NASA Astrophysics Data System (ADS)

    Nybade, A.; Aster, R.; Beck, S.; Ekstrom, G.; Fischer, K.; Lerner-Lam, A.; Meltzer, A.; Sandvol, E.; Willemann, R. J.

    2008-12-01

    Building a sustainable earthquake monitoring system requires well-informed cooperation between the commercial companies that manufacture components or deliver complete systems and the government or other agencies that will be responsible for operating them. Many nations or regions with significant earthquake hazard lack the financial, technical, and human resources to establish and sustain the permanent observatory networks required to return the data needed for hazard mitigation. Government agencies may not be well informed about the short-term and long-term challenges of managing technologically advanced monitoring systems, much less the details of how they are built and operated. On the relatively compressed time scale of disaster recovery efforts, it can be difficult to find a reliable, disinterested source of information, without which government agencies may be dependent on partial information. If system delivery fails to include sufficient development of indigenous expertise, the performance of local and regional networks may decline quickly, and even data collected during an early high-performance period may be degraded or lost. Drawing on the unsurpassed educational capabilities of its members working in close cooperation with its facility staff, IRIS is well prepared to contribute to sustainability through a wide variety of training and service activities that further promote standards for network installation, data exchange protocols, and free and open access to data. Members of the Consortium and staff of its Core Programs together could write a guide on decisions about network design, installation and operation. The intended primary audience would be government officials seeking to understand system requirements, the acquisition and installation process, and the expertise needed to operate a system. The guide would cover network design, procurement, set-up, data use and archiving. Chapters could include advice on network data processing, archiving data (including information on the value of standards), installing and servicing stations, building a data processing and management center (including information on evaluating bids), using results from earthquake monitoring, and sustaining an earthquake monitoring system. Appendices might include profiles of well-configured and well-run networks and sample RFPs. Establishing permanent networks could provide a foundation for international research and educational collaborations and critical new data for imaging Earth structure, while supporting scientific capacity building and strengthening hazard monitoring around the globe.

  19. Southern California Seismic Network: New Design and Implementation of Redundant and Reliable Real-time Data Acquisition Systems

    NASA Astrophysics Data System (ADS)

    Saleh, T.; Rico, H.; Solanki, K.; Hauksson, E.; Friberg, P.

    2005-12-01

    The Southern California Seismic Network (SCSN) handles more than 2500 high-data-rate channels from more than 380 seismic stations distributed across southern California. These data are imported in real time from dataloggers, earthworm hubs, and partner networks. The SCSN also exports data to eight different partner networks. Both the imported and exported data are critical for emergency response and scientific research. Previous data acquisition systems were complex and difficult to operate because they grew in an ad hoc fashion to meet the increasing needs for distributing real-time waveform data. To maximize reliability and redundancy, we apply best-practice methods from computer science in implementing the software and hardware configurations for import, export, and acquisition of real-time seismic data. Our approach makes use of failover software designs, methods for dividing labor diligently amongst the network nodes, and state-of-the-art networking redundancy technologies. To facilitate maintenance and daily operations we seek to provide some separation between major functions such as data import, export, acquisition, archiving, real-time processing, and alarming. As an example, we make waveform import and export functions independent by operating them on separate servers. Similarly, two independent servers provide waveform export, allowing data recipients to implement their own redundancy. Data import is handled differently, using one primary server and a live backup server. These data import servers run failover software that allows automatic role switching from primary to shadow in case of failure. Similar to the classic earthworm design, all the acquired waveform data are broadcast onto a private network, which allows multiple machines to acquire and process the data. As we separate data import and export away from acquisition, we are also working on new approaches to separate real-time processing and rapid, reliable archiving of real-time data. Further, improved network security is an integral part of the new design. Redundant firewalls will provide secure data imports, exports, and acquisition, as well as DMZ zones for web servers and other publicly available servers. We will present the detailed design of this new configuration, which is currently being implemented by the SCSN at Caltech. The design principles are general enough to be of use to most regional seismic networks.

  20. An Extensible Sensing and Control Platform for Building Energy Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rowe, Anthony; Berges, Mario; Martin, Christopher

    2016-04-03

    The goal of this project is to develop Mortar.io, an open-source building automation system (BAS) platform designed to simplify data collection, archiving, event scheduling and coordination of cross-system interactions. Mortar.io is optimized for (1) robustness to network outages, (2) ease of installation using plug-and-play and (3) scalable support for small to large buildings and campuses.

  1. ShapeSelectForest: a new R package for modeling Landsat time series

    Treesearch

    Mary Meyer; Xiyue Liao; Gretchen Moisen; Elizabeth Freeman

    2015-01-01

    We present a new R package called ShapeSelectForest, recently posted to the Comprehensive R Archive Network. The package was developed to fit nonparametric shape-restricted regression splines to time series of Landsat imagery for the purpose of modeling, mapping, and monitoring annual forest disturbance dynamics over nearly three decades. For each pixel and spectral...
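
    As an illustration of the underlying idea (not the ShapeSelectForest API, which we do not reproduce here), one can select among candidate trajectory shapes for a single pixel's yearly spectral index by an information criterion:

      # Synthetic pixel trajectory: stable forest with an abrupt
      # disturbance in 2000, plus observation noise.
      set.seed(1)
      year  <- 1985:2012
      index <- 0.8 - 0.3 * (year >= 2000) + rnorm(length(year), sd = 0.02)

      fits <- list(
        flat  = lm(index ~ 1),       # no change
        trend = lm(index ~ year),    # steady gain or loss
        jump  = {                    # one-break step, breakpoint by best fit
          cands <- lapply(year[-1], function(b) lm(index ~ I(year >= b)))
          cands[[which.min(sapply(cands, BIC))]]
        }
      )
      sapply(fits, BIC)  # smallest BIC indicates the selected shape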

  2. A PDA study management tool (SMT) utilizing wireless broadband and full DICOM viewing capability

    NASA Astrophysics Data System (ADS)

    Documet, Jorge; Liu, Brent; Zhou, Zheng; Huang, H. K.; Documet, Luis

    2007-03-01

    During the last 4 years, the IPI (Image Processing and Informatics) Laboratory has been developing a web-based Study Management Tool (SMT) application that allows radiologists, film librarians and PACS-related (Picture Archiving and Communication System) users to dynamically and remotely perform Query/Retrieve operations in a PACS network. Using a regular PDA (Personal Digital Assistant), users can remotely query a PACS archive to distribute any study to an existing DICOM (Digital Imaging and Communications in Medicine) node. This application, which has proven convenient for managing the study workflow [1, 2], has been extended to include DICOM viewing capability on the PDA. With this new feature, users can take a quick view of DICOM images, providing them mobility and convenience at the same time. In addition, we are extending this application to metropolitan-area wireless broadband networks. This feature requires smart phones that are capable of working as a PDA and have access to broadband wireless services. With the extension to wireless broadband technology and the preview of DICOM images, the Study Management Tool becomes an even more powerful tool for clinical workflow management.

  3. Time Analyzer for Time Synchronization and Monitor of the Deep Space Network

    NASA Technical Reports Server (NTRS)

    Cole, Steven; Gonzalez, Jorge, Jr.; Calhoun, Malcolm; Tjoelker, Robert

    2003-01-01

    A software package has been developed to measure, monitor, and archive the performance of timing signals distributed in the NASA Deep Space Network. Timing signals are generated from a central master clock and distributed to over 100 users at distances up to 30 kilometers. The time offset due to internal distribution delays and time jitter with respect to the central master clock are critical for successful spacecraft navigation, radio science, and very long baseline interferometry (VLBI) applications. The instrument controller and operator interface software is written in LabView and runs on the Linux operating system. The software controls a commercial multiplexer to switch 120 separate timing signals to measure offset and jitter with a time-interval counter referenced to the master clock. The offset of each channel is displayed in histogram form, and "out of specification" alarms are sent to a central complex monitor and control system. At any time, the measurement cycle of 120 signals can be interrupted for diagnostic tests on an individual channel. The instrument also routinely monitors and archives the long-term stability of all frequency standards or any other 1-pps source compared against the master clock. All data is stored and made available for
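
    A toy R sketch of the offset/jitter bookkeeping described above (synthetic time-interval-counter readings; the channel count matches the record, but the specification limit is an assumption):

      set.seed(3)
      # 100 synthetic counter readings (ns) for each of 120 timing channels.
      readings <- matrix(rnorm(120 * 100, mean = 0, sd = 20), nrow = 120)
      offset <- rowMeans(readings)        # per-channel offset vs master clock
      jitter <- apply(readings, 1, sd)    # per-channel jitter
      hist(offset, main = "Channel offset vs master clock (ns)")
      alarms <- which(abs(offset) > 50)   # hypothetical out-of-spec threshold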

  4. Integrated information systems for translational medicine.

    PubMed

    Winter, A; Funkat, G; Haeber, A; Mauz-Koerholz, C; Pommerening, K; Smers, S; Stausberg, J

    2007-01-01

    Translational medicine research needs a two-way information highway between 'bedside' and 'bench'. Unfortunately there are still weak links between successfully integrated information roads for bench, i.e. research networks, and bedside, i.e. regional or national health information systems. The question arises, what measures have to be taken to overcome the deficiencies. It is examined how patient care-related costs of clinical research can be separated and shared by health insurances, whether quality of patient care data is sufficient for research, how patient identity can be maintained without conflict to privacy, how care and research records can be archived, and how information systems for care and research can be integrated. Since clinical trials improve quality of care, insurers share parts of the costs. Quality of care data has to be improved by introducing minimum basic data sets. Pseudonymization solves the conflict between needs for patient identity and privacy. Archiving patient care records and research records is similar and XML and CDISC can be used. Principles of networking infrastructures for care and research still differ. They have to be bridged first and harmonized later. To link information systems for care (bed) and for research (bench) needs technical infrastructures as well as economic and organizational regulations.

  5. Ernest Solvay's scientific networks. From personal research to academic patronage

    NASA Astrophysics Data System (ADS)

    Coupain, Nicolas

    2015-09-01

    Ernest Solvay was a multifaceted man. A successful captain of industry, he became known in the second part of his life as a magnanimous sponsor of academic science. His most notable achievements in this field are the creation of a series of university institutes in Brussels as well as the co-organization of the conferences on physics and chemistry that bear his name and are still held today. A famous photograph from 1911 depicts this man, who held no university degree, surrounded by the brightest scientists of the time. The often conveyed image of a self-made man leads to an underestimation of his networking and delegation capabilities. Recent investigations in his private archives as well as in "his" company archives shed new light on his organizational skills in the scientific arena. This paper focuses especially on this facet and intends to analyze how Solvay behaved as an organizer of science. Three partially overlapping levels are discussed in sequence: the Solvay Company level, his personal level, and the academic level. The paper identifies the key actors in these areas and evaluates the intensity of control and delegation exerted by Ernest Solvay in each of these spheres.

  6. Boosting productivity: a framework for professional/amateur collaborative teamwork

    NASA Astrophysics Data System (ADS)

    Al-Shedhani, Saleh S.

    2002-11-01

    As technology advances, remote operation of telescopes has paved the way for joint observational projects between astronomy clubs. Equipped with a small telescope, a standard CCD, and a networked computer, an observatory can be set up to carry out several photometric studies. However, most club members lack the basic training and background required for such tasks. A collaborative network between professionals and amateurs is proposed to utilize professional know-how and amateurs' readiness for continuous observations. Working as a team, various long-term observational projects can be carried out using small telescopes. Professionals can play an important role in raising the standards of astronomy clubs via specialized training programs for members on how to use the available technology to search for and observe certain events (e.g. supernovae, comets, etc.). Professionals in return can accumulate a research-relevant database and can set up an early notification scheme based on comparative analyses of recently added images in an online archive. Here we present a framework for the above collaborative teamwork that uses web-based communication tools to establish remote/robotic operation of the telescope, and an online archive and discussion forum, to maximize the interactions between professionals and amateurs and to boost the productivity of small telescope observatories.

  7. The AmericaView Project - Putting the Earth into Your Hands

    USGS Publications Warehouse

    ,

    2005-01-01

    The U.S. Geological Survey (USGS) is a leader in collecting, archiving, and distributing geospatial data and information about the Earth. Providing quick, reliable access to remotely sensed images and geospatial data is the driving principle behind the AmericaView Project. A national not-for-profit organization, AmericaView, Inc. was established and is supported by the USGS to coordinate the activities of a national network of university-led consortia with the primary objective of the advancement of the science of remote sensing. Individual consortia members include academic institutions, as well as state, local, and tribal government agencies. AmericaView's focus is to expand the understanding and use of remote sensing through education and outreach efforts and to provide affordable, integrated remote sensing information access and delivery to the American public. USGS's Landsat and NASA's Earth Observing System (EOS) satellite data are downlinked from satellites or transferred from other facilities to the USGS Center for Earth Resources Observation and Science (EROS) ground receiving station in Sioux Falls, South Dakota. The data can then be transferred over high-speed networks to consortium members, where it is archived and made available for public use.

  8. Paleoclimate networks: a concept meeting central challenges in the reconstruction of paleoclimate dynamics

    NASA Astrophysics Data System (ADS)

    Rehfeld, Kira; Goswami, Bedartha; Marwan, Norbert; Breitenbach, Sebastian; Kurths, Jürgen

    2013-04-01

    Statistical analysis of dependencies amongst paleoclimate data helps to infer the climatic processes they reflect. Three key challenges have to be addressed, however: the datasets are heterogeneous in time (i) and space (ii), and furthermore time itself is a variable that needs to be reconstructed, which (iii) introduces additional uncertainties. To address these issues in a flexible way we developed the paleoclimate network framework, inspired by the increasing application of complex networks in climate research. Each node in a paleoclimate network represents a paleoclimate archive and its associated time series. Links between nodes are assigned if the time series are significantly similar. The base of the paleoclimate network is therefore formed by linear and nonlinear estimators of Pearson correlation, mutual information and event synchronization, which quantify similarity from irregularly sampled time series. Age uncertainties are propagated into the final network analysis using time series ensembles which reflect the uncertainty. We discuss how spatial heterogeneity influences the results obtained from network measures, and demonstrate the power of the approach by inferring teleconnection variability of the Asian summer monsoon for the past 1000 years.
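
    A minimal R sketch of the node/link construction (with simplifying assumptions: the irregular records are first interpolated to a common time axis, Pearson correlation stands in for the framework's irregular-sampling estimators, and links are assigned at 5% significance):

      set.seed(42)
      t_common <- seq(0, 1000, by = 10)            # common time axis (years)
      archives <- replicate(5, {
        t_irr <- sort(runif(80, 0, 1000))          # irregular sampling times
        x     <- cumsum(rnorm(80))                 # synthetic proxy record
        approx(t_irr, x, xout = t_common, rule = 2)$y  # interpolation
      })

      n <- ncol(archives)
      adj <- matrix(0, n, n)                       # adjacency of the network
      for (i in 1:(n - 1)) for (j in (i + 1):n) {
        ct <- cor.test(archives[, i], archives[, j])
        if (ct$p.value < 0.05) adj[i, j] <- adj[j, i] <- 1
      }
      adj                                          # links between archives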

  9. International travel as medical research: architecture and the modern hospital.

    PubMed

    Logan, Cameron; Willis, Julie

    2010-01-01

    The design and development of the modern hospital in Australia had a profound impact on medical practice and research at a variety of levels. Between the late 1920s and the 1950s, hospital architects, administrators, and politicians travelled widely in order to review the latest international developments in the hospital field. They were motivated by Australia's geographic isolation and a growing concern with how to govern the population at the level of physical health. While not 'medical research' in the conventional sense of the term, this travel was a powerful generator of medical thinking in Australia and has left a rich archival legacy. This paper draws on that archive to demonstrate the ways in which architectural research and international networks of hospital specialists profoundly shaped the provision of medical infrastructure in Australia.

  10. Archiving and access systems for remote sensing: Chapter 6

    USGS Publications Warehouse

    Faundeen, John L.; Percivall, George; Baros, Shirley; Baumann, Peter; Becker, Peter H.; Behnke, J.; Benedict, Karl; Colaiacomo, Lucio; Di, Liping; Doescher, Chris; Dominguez, J.; Edberg, Roger; Ferguson, Mark; Foreman, Stephen; Giaretta, David; Hutchison, Vivian; Ip, Alex; James, N.L.; Khalsa, Siri Jodha S.; Lazorchak, B.; Lewis, Adam; Li, Fuqin; Lymburner, Leo; Lynnes, C.S.; Martens, Matt; Melrose, Rachel; Morris, Steve; Mueller, Norman; Navale, Vivek; Navulur, Kumar; Newman, D.J.; Oliver, Simon; Purss, Matthew; Ramapriyan, H.K.; Rew, Russ; Rosen, Michael; Savickas, John; Sixsmith, Joshua; Sohre, Tom; Thau, David; Uhlir, Paul; Wang, Lan-Wei; Young, Jeff

    2016-01-01

    Focuses on major developments inaugurated by the Committee on Earth Observation Satellites, the Group on Earth Observations System of Systems, and the International Council for Science World Data System at the global level; initiatives at national levels to create data centers (e.g. the National Aeronautics and Space Administration (NASA) Distributed Active Archive Centers and other international space agency counterparts), and non-government systems (e.g. Center for International Earth Science Information Network). Other major elements focus on emerging tool sets, requirements for metadata, data storage and refresh methods, the rise of cloud computing, and questions about what and how much data should be saved. The sub-sections of the chapter address topics relevant to the science, engineering and standards used for state-of-the-art operational and experimental systems.

  11. Cost-effective data storage/archival subsystem for functional PACS

    NASA Astrophysics Data System (ADS)

    Chen, Y. P.; Kim, Yongmin

    1993-09-01

    Not the least of the requirements of a workable PACS is the ability to store and archive vast amounts of information. A medium-size hospital will generate between 1 and 2 TBytes of data annually on a fully functional PACS. A high-speed image transmission network coupled with a comparably high-speed central data storage unit can make local memory and magnetic disks in the PACS workstations less critical and, in an extreme case, unnecessary. Under these circumstances, the capacity and performance of the central data storage subsystem and database are critical in determining the response time at the workstations, thus significantly affecting clinical acceptability. The central data storage subsystem not only needs to provide sufficient capacity to store about ten days worth of images (five days worth of new studies and, on average, about one comparison study for each new study), but must also supply images to the requesting workstation in a timely fashion. The database must provide fast retrieval responses upon users' requests for images. This paper analyzes the advantages and disadvantages of multiple parallel transfer disks versus RAID disks for the short-term central data storage subsystem, as well as an optical disk jukebox versus a digital tape subsystem for long-term archive. Furthermore, an example high-performance storage subsystem which integrates both RAID disks and a high-speed digital tape subsystem as a cost-effective PACS data storage/archival unit is presented.
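
    A quick back-of-envelope check of the short-term store implied by the figures above, as an R sketch (using the upper 2 TByte/year estimate; actual study sizes vary widely):

      annual_tb <- 2                       # upper annual-volume estimate
      daily_gb  <- annual_tb * 1024 / 365  # ~5.6 GB of new images per day
      store_gb  <- 10 * daily_gb           # ten days held: new + comparisons
      round(store_gb)                      # ~56 GB short-term capacity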

  12. Revising shortwave and longwave radiation archives in view of possible revisions of the WSG and WISG reference scales: methods and implications

    NASA Astrophysics Data System (ADS)

    Nyeki, Stephan; Wacker, Stefan; Gröbner, Julian; Finsterle, Wolfgang; Wild, Martin

    2017-08-01

    A large number of radiometers are traceable to the World Standard Group (WSG) for shortwave radiation and the interim World Infrared Standard Group (WISG) for longwave radiation, hosted by the Physikalisch-Meteorologisches Observatorium Davos/World Radiation Centre (PMOD/WRC, Davos, Switzerland). The WSG and WISG have recently been found to over- and underestimate radiation values, respectively (Fehlmann et al., 2012; Gröbner et al., 2014), although research is still ongoing. In view of a possible revision of the reference scales of both standard groups, this study discusses the methods involved and the implications for existing archives of radiation time series, such as the Baseline Surface Radiation Network (BSRN). Based on PMOD/WRC calibration archives and BSRN data archives, downward longwave radiation (DLR) time series over the 2006-2015 period were analysed at four stations (polar and mid-latitude locations). DLR was found to increase by up to 3.5 and 5.4 W m-2 for all-sky and clear-sky conditions, respectively, after applying a WISG reference scale correction and a minor correction for the dependence of pyrgeometer sensitivity on atmospheric integrated water vapour content. Similar increases in DLR may be expected at other BSRN stations. Based on our analysis, a number of recommendations are made for future studies.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Udoeyop, Akaninyene W

    This work examines a scientometric model that tracks the emergence of an identified technology from initial discovery (via original scientific and conference literature), through critical discoveries (via original scientific and conference literature and patents), transitioning through Technology Readiness Levels (TRLs) and ultimately on to commercial application. During the period of innovation and technology transfer, the impacts of scholarly works, patents and on-line web news sources are identified. As trends develop, currency of citations, collaboration indicators, and on-line news patterns are identified. The combinations of four distinct and separate searchable on-line networked sources (i.e., scholarly publications and citations, worldwide patents, news archives, and on-line mapping networks) are assembled to become one collective network (a dataset for analysis of relations). This established network becomes the basis from which to quickly analyze the temporal flow of activity (searchable events) for the example subject domain we investigated.

  14. Analysis of context dependence in social interaction networks of a massively multiplayer online role-playing game.

    PubMed

    Son, Seokshin; Kang, Ah Reum; Kim, Hyun-chul; Kwon, Taekyoung; Park, Juyong; Kim, Huy Kang

    2012-01-01

    Rapid advances in modern computing and information technology have enabled millions of people to interact online via various social network and gaming services. The widespread adoption of such online services has made possible the analysis of large-scale archival data containing detailed human interactions, presenting a very promising opportunity to understand the rich and complex human behavior. In collaboration with a leading global provider of Massively Multiplayer Online Role-Playing Games (MMORPGs), here we present a network science-based analysis of the interplay between distinct types of user interaction networks in the virtual world. We find that their properties depend critically on the nature of the context-interdependence of the interactions, highlighting the complex and multilayered nature of human interactions, a robust understanding of which we believe may prove instrumental in the design of more realistic future virtual arenas as well as provide novel insights to the science of collective human behavior.

  15. Subaru Telescope Network III (STN-III): more effective, more operation-oriented, and more inexpensive solutions for the observatory's needs

    NASA Astrophysics Data System (ADS)

    Noumaru, Junichi; Kawai, Jun A.; Schubert, Kiaina; Yagi, Masafumi; Takata, Tadafumi; Winegar, Tom; Scanlon, Tim; Nishida, Takuhiro; Fox, Camron; Hayasaka, James; Forester, Jason; Uchida, Kenji; Nakamura, Isamu; Tom, Richard; Koura, Norikazu; Yamamoto, Tadahiro; Tanoue, Toshiya; Yamada, Toru

    2008-07-01

    Subaru Telescope has recently replaced most of the equipment of Subaru Telescope Network II with new equipment, including a 124 TB RAID system for the data archive. Switching the data storage from tape to RAID enables users to access the data faster. STN-III dropped some important components of STN-II, such as the supercomputers, the development and testing subsystem for the Subaru Observation Control System, and the data processing subsystem. On the other hand, we invested in more computers for the remote operation system. Thanks to IT innovations, our LAN as well as the network between Hilo and the summit were upgraded to gigabit networks at a similar or even reduced cost compared with the previous system. As a result of redesigning the computer system with more focus on observatory operations, we greatly reduced the total cost of computer rental, purchase and maintenance.

  16. The Astrophysical Multimessenger Observatory Network (AMON)

    NASA Technical Reports Server (NTRS)

    Smith, M. W. E.; Fox, D. B.; Cowen, D. F.; Meszaros, P.; Tesic, G.; Fixelle, J.; Bartos, I.; Sommers, P.; Ashtekar, Abhay; Babu, G. Jogesh

    2013-01-01

    We summarize the science opportunity, design elements, current and projected partner observatories, and anticipated science returns of the Astrophysical Multimessenger Observatory Network (AMON). AMON will link multiple current and future high-energy, multimessenger, and follow-up observatories together into a single network, enabling near real-time coincidence searches for multimessenger astrophysical transients and their electromagnetic counterparts. Candidate and high-confidence multimessenger transient events will be identified, characterized, and distributed as AMON alerts within the network and to interested external observers, leading to follow-up observations across the electromagnetic spectrum. In this way, AMON aims to enable the discovery of multimessenger transients within observatory subthreshold data streams and to facilitate the exploitation of these transients for purposes of astronomy and fundamental physics. As a central hub of global multimessenger science, AMON will also enable cross-collaboration analyses of archival datasets in search of rare or exotic astrophysical phenomena.

  17. Site characterization of the national seismic network of Italy

    NASA Astrophysics Data System (ADS)

    Bordoni, Paola; Pacor, Francesca; Cultrera, Giovanna; Casale, Paolo; Cara, Fabrizio; Di Giulio, Giuseppe; Famiani, Daniela; Ladina, Chiara; PIschiutta, Marta; Quintiliani, Matteo

    2017-04-01

    The national seismic network of Italy (Rete Sismica Nazionale, RSN), run by the Istituto Nazionale di Geofisica e Vulcanologia (INGV), consists of more than 400 seismic stations connected in real time to the institute's data center in order to locate earthquakes for civil defense purposes. A critical issue in the performance of a network is the characterization of site conditions at the recording stations. Recently INGV has started addressing this subject through the revision of all available geological and geophysical data, the acquisition of new information by means of ad hoc field measurements, and the analysis of seismic waveforms. The main effort is towards building a database, integrated with the other INGV infrastructures, designed to archive homogeneous parameters across the seismic network for a complete site characterization, including housing, geological, seismological and geotechnical features as well as the site class according to the European and Italian building codes. Here we present the ongoing INGV activities.

  18. Scientometric methods for identifying emerging technologies

    DOEpatents

    Abercrombie, Robert K; Schlicher, Bob G; Sheldon, Frederick T

    2015-11-03

    Provided is a method of generating a scientometric model that tracks the emergence of an identified technology from initial discovery (via original scientific and conference literature), through critical discoveries (via original scientific and conference literature and patents), transitioning through Technology Readiness Levels (TRLs) and ultimately on to commercial application. During the period of innovation and technology transfer, the impacts of scholarly works, patents and on-line web news sources are identified. As trends develop, currency of citations, collaboration indicators, and on-line news patterns are identified. The combinations of four distinct and separate searchable on-line networked sources (i.e., scholarly publications and citations, worldwide patents, news archives, and on-line mapping networks) are assembled to become one collective network (a dataset for analysis of relations). This established network becomes the basis from which to quickly analyze the temporal flow of activity (searchable events) for the example subject domain.
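
    An illustrative R sketch of assembling edge lists from the four source types into one collective network and slicing it in time (igraph is assumed; this is not the patented implementation):

      library(igraph)
      edges <- rbind(
        data.frame(from = "paper1",  to = "paper2",  source = "scholarly", year = 2003),
        data.frame(from = "paper2",  to = "patent1", source = "patent",    year = 2006),
        data.frame(from = "patent1", to = "news1",   source = "news",      year = 2009),
        data.frame(from = "news1",   to = "map1",    source = "mapping",   year = 2011)
      )
      g <- graph_from_data_frame(edges)               # one collective network
      early <- subgraph.edges(g, E(g)[year <= 2006])  # temporal slice
      plot(early)                                     # activity up to 2006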

  19. Planning and cost analysis of digital radiography services for a network of hospitals (the Veterans Integrated Service Network).

    PubMed

    Duerinckx, A J; Kenagy, J J; Grant, E G

    1998-01-01

    This study analysed the design and cost of a picture archiving and communications system (PACS), computed radiography (CR) and a wide-area network for teleradiology. The Desert Pacific Healthcare Network comprises 10 facilities, including four tertiary medical centres and one small hospital. Data were collected on radiologists' workloads and on patient and image flow within and between these medical centres. These were used to estimate the size and cash flows associated with a system-wide implementation of PACS, CR and teleradiology services. A cost analysis model was used to estimate the potential cost savings in a filmless radiology environment. ATM technology was selected as the communications medium between the medical centres. A strategic plan and business plan were successfully developed. The cost model predicted that the proposed PACS/CR configuration would be cost-effective within four to six years, if the base costs were kept low. The experience gained in the design and cost analysis of a PACS/teleradiology network will serve as a model for similar projects.

  20. A pathologist-designed imaging system for anatomic pathology signout, teaching, and research.

    PubMed

    Schubert, E; Gross, W; Siderits, R H; Deckenbaugh, L; He, F; Becich, M J

    1994-11-01

    Pathology images are derived from gross surgical specimens, light microscopy, immunofluorescence, electron microscopy, molecular diagnostic gels, flow cytometry, image analysis data, and clinical laboratory data in graphic form. We have implemented a network of desktop personal computers (PCs) that allows us to easily capture, store, and retrieve gross and microscopic, anatomic, and research pathology images. System architecture involves multiple image acquisition and retrieval sites and a central file server for storage. The digitized images are conveyed via a local area network to and from image capture or display stations. Acquisition sites consist of a high-resolution camera connected to a frame grabber card in a 486-type personal computer, equipped with 16 MB of RAM, a 1.05-gigabyte hard drive, and a 32-bit Ethernet card for access to our anatomic pathology reporting system. We have designed a push-button workstation for acquiring and indexing images that does not significantly interfere with surgical pathology sign-out. Advantages of the system include the following: (1) Improving patient care: the availability of gross images at time of microscopic sign-out, verification of recurrence of malignancy from archived images, monitoring of bone marrow engraftment and immunosuppressive intervention after bone marrow/solid organ transplantation on repeat biopsies, and ability to seek instantaneous consultation with any pathologist on the network; (2) enhancing the teaching environment: building a digital surgical pathology atlas, improving the availability of images for conference support, and sharing cases across the network; (3) enhancing research: case study compilation, metastudy analysis, and availability of digitized images for quantitative analysis and permanent/reusable image records for archival study; and (4) other practical and economic considerations: storing case requisition images and hand-drawn diagrams deters the spread of gross room contaminants and results in considerable cost savings in photographic media for conferences, improved quality assurance by porting control stains across the network, and a multiplicity of other advantages that enhance image and information management in pathology.

  1. A global gridded dataset of daily precipitation going back to 1950, ideal for analysing precipitation extremes

    NASA Astrophysics Data System (ADS)

    Contractor, S.; Donat, M.; Alexander, L. V.

    2017-12-01

    Reliable observations of precipitation are necessary to determine past changes in precipitation and to validate models, allowing for reliable future projections. Existing gauge-based gridded datasets of daily precipitation and satellite-based observations contain artefacts and have a short length of record, making them unsuitable for analysing precipitation extremes. The largest limiting factor for gauge-based datasets is the availability of a dense and reliable station network. Currently, there are two major archives of global in situ daily rainfall data: the Global Historical Climatology Network (GHCN-Daily), hosted by the National Oceanic and Atmospheric Administration (NOAA), and another held by the Global Precipitation Climatology Centre (GPCC), part of the Deutscher Wetterdienst (DWD). We combine the two data archives and use automated quality control techniques to create a reliable long-term network of raw station data, which we then interpolate using block kriging to create a global gridded dataset of daily precipitation going back to 1950. We compare our interpolated dataset with existing global gridded data of daily precipitation: NOAA Climate Prediction Center (CPC) Global V1.0 and GPCC Full Data Daily Version 1.0, as well as various regional datasets. We find that our raw station density is much higher than that of other datasets. To avoid artefacts due to station network variability, we provide multiple versions of our dataset based on various completeness criteria, and we provide the standard deviation, kriging error and number of stations for each grid cell and timestep to encourage responsible use of our dataset. Despite our efforts to increase the raw data density, the in situ station network remains sparse in India after the 1960s and in Africa throughout the timespan of the dataset. Our dataset will allow for more reliable global analyses of rainfall, including its extremes, and pave the way for better global precipitation observations with lower and more transparent uncertainties.
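
    As a hedged illustration of the interpolation idea, the R sketch below performs ordinary kriging of simulated station precipitation onto a grid with the sp and gstat packages; the study's block kriging, quality control and completeness criteria are not reproduced, and all station values are invented.

      library(sp)
      library(gstat)

      set.seed(1)
      stations <- data.frame(lon = runif(100, 0, 10), lat = runif(100, 0, 10),
                             precip = rgamma(100, shape = 2, scale = 3))
      coordinates(stations) <- ~lon + lat

      grid <- expand.grid(lon = seq(0, 10, by = 0.5), lat = seq(0, 10, by = 0.5))
      coordinates(grid) <- ~lon + lat
      gridded(grid) <- TRUE

      vg <- fit.variogram(variogram(precip ~ 1, stations), vgm("Exp"))
      kr <- krige(precip ~ 1, stations, grid, model = vg)
      # kr$var1.pred is the gridded field; kr$var1.var is the kriging variance,
      # analogous to the per-cell uncertainty fields shipped with the dataset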

  2. Analysis of Search Results for the Clarification and Identification of Technology Emergence (AR-CITE)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The Analysis of Search Results for the Clarification and Identification of Technology Emergence (AR-CITE) computer code examines a scientometric model that tracks the emergence of an identified technology from initial discovery (via original scientific and conference literature), through critical discoveries (via original scientific, conference literature and patents), transitioning through Technology Readiness Levels (TRLs) and ultimately on to commercial application. As trends develop, currency of citations, collaboration indicators, and on-line news patterns are identified. The combinations of four distinct and separate searchable on-line networked sources (i.e., scholarly publications and citations, world patents, news archives, and on-line mapping networks) are assembled to become one collective network (a dataset for analysis of relations). This established network becomes the basis from which to quickly analyze the temporal flow of activity (searchable events) for the subject domain to be clarified and identified.

  3. Horticulture in Portugal 1850-1900: The role of science and public utility in shaping knowledge.

    PubMed

    Rodrigues, Ana Duarte; Simões, Ana

    2017-07-01

    In this paper, we address the emergence of horticultural practice, agents, spaces and institutions in the two urban settings of Lisbon and Porto, in Portugal, during the second half of the nineteenth century. We do so by following the networking activities of two players: the self-made horticulturist and entrepreneur José Marques Loureiro, who created, in Porto, a commercial horticultural establishment and founded the Journal of Practical Horticulture; and the agronomist Francisco Simões Margiochi, head of the gardens and green grounds department of the municipality, who created the first course on gardening and horticulture and founded the Royal Horticultural Society, both in Lisbon. Their joint activities were aimed at establishing horticulture as an applied science and, simultaneously, at catering to an extended audience of citizens. They enable us to enrich the narratives on the emergence and development of horticulture in Europe by calling attention to the participation in extended circulatory networks of actors who are often absent from these accounts. Additionally, they allow a comparative assessment of the outcome of their actions at the national level, and an understanding of their results in terms consonant with recent historiographical trends on the co-construction of centres and peripheries. AML - Arquivo Municipal de Lisboa (Municipal Archive of Lisbon); ANTT - Arquivo Nacional da Torre do Tombo (National Archives at Torre do Tombo); AHCPL - Arquivo Histórico da Casa Pia de Lisboa (Historical Archive of the Casa Pia of Lisbon); JHP - Jornal de Horticultura Practica (Journal of Practical Horticulture), online at http://www.fc.up.pt/fa/?p=nav&f=html.fbib-Periodico-oa&item=378 ; BSNHP - Boletim da Sociedade Nacional de Horticultura de Portugal (Bulletin of the National Society of Horticulture of Portugal).

  4. Quality-assurance plan for groundwater activities, U.S. Geological Survey, Washington Water Science Center

    USGS Publications Warehouse

    Kozar, Mark D.; Kahle, Sue C.

    2013-01-01

    This report documents the standard procedures, policies, and field methods used by the U.S. Geological Survey’s (USGS) Washington Water Science Center staff for activities related to the collection, processing, analysis, storage, and publication of groundwater data. This groundwater quality-assurance plan changes through time to accommodate new methods and requirements developed by the Washington Water Science Center and the USGS Office of Groundwater. The plan is based largely on requirements and guidelines provided by the USGS Office of Groundwater, or the USGS Water Mission Area. Regular updates to this plan represent an integral part of the quality-assurance process. Because numerous policy memoranda have been issued by the Office of Groundwater since the previous groundwater quality assurance plan was written, this report is a substantial revision of the previous report, supplants it, and contains significant additional policies not covered in the previous report. This updated plan includes information related to the organization and responsibilities of USGS Washington Water Science Center staff, training, safety, project proposal development, project review procedures, data collection activities, data processing activities, report review procedures, and archiving of field data and interpretative information pertaining to groundwater flow models, borehole aquifer tests, and aquifer tests. Important updates from the previous groundwater quality assurance plan include: (1) procedures for documenting and archiving of groundwater flow models; (2) revisions to procedures and policies for the creation of sites in the Groundwater Site Inventory database; (3) adoption of new water-level forms to be used within the USGS Washington Water Science Center; (4) procedures for future creation of borehole geophysics, surface geophysics, and aquifer-test archives; and (5) use of the USGS Multi Optional Network Key Entry System software for entry of routine water-level data collected as part of long-term water-level monitoring networks.

  5. Archiving of Wideband Plasma Wave Data

    NASA Technical Reports Server (NTRS)

    Kurth, William S.

    1997-01-01

    Beginning with the third year of funding, we began a more ambitious archiving production effort, minimizing work on new software and concentrating on building representative archives of the missions mentioned above, recognizing that only a small percentage of the data from any one mission can be archived with reasonable effort. We concentrated on data from Dynamics Explorer and ISEE 1, archiving orbits or significant fractions of orbits in an attempt to capture the essence of each mission and provide data which will hopefully be sufficient for ongoing and new research, as well as to provide a reference for upcoming and current ISTP missions, which will not fly in the same regions of space as the older missions and which will not have continuous wideband data. We archived approximately 181 gigabytes of data, accounting for some 1582 hours of data. Included in these data are all of the AMPTE chemical releases, all of the Spacelab 2/PDP data obtained during the free-flight portion of its mission, as well as significant portions of the S3, DE-1, Imp-6, Hawkeye, Injun 5, and ISEE 1 and 2 data sets. All of the data archived are summarized in gif-formatted images of frequency-time spectrograms which are directly accessible via the internet. Each of the gif files is identified by year, day, and time as described on the Web page. This gives a user who has a specific date/time in mind a way of determining very quickly whether there are data for the interval in question and, by clicking on the file name, of browsing the data. Alternatively, a user can browse the data for interesting features and events simply by viewing each of the gif files. When a user finds data of interest, he/she can notify us by email of the time period involved. Based on the user's needs, we can provide data on a convenient medium or by ftp, or we can mount the appropriate data and provide access to our analysis tools via the network. We can even produce products such as plots or spectrograms in hardcopy form based on the specific request of the user.

  6. Tools for Integrating Data Access from the IRIS DMC into Research Workflows

    NASA Astrophysics Data System (ADS)

    Reyes, C. G.; Suleiman, Y. Y.; Trabant, C.; Karstens, R.; Weertman, B. R.

    2012-12-01

    Web service interfaces at the IRIS Data Management Center (DMC) provide access to a vast archive of seismological and related geophysical data. These interfaces are designed to easily incorporate data access into data processing workflows. Examples of data that may be accessed include time series data, related metadata, and earthquake information. The DMC has developed command line scripts, MATLAB® interfaces and a Java library to support a wide variety of data access needs. Users of these interfaces do not need to concern themselves with web service details, networking, or even (in most cases) data conversion. Fetch scripts allow access to the DMC archive and are a comfortable fit for command line users. These scripts are written in Perl and are well suited for automation and integration into existing workflows on most operating systems. For metadata and event information, the Fetch scripts even parse the returned data into simple text summaries. The IRIS Java Web Services Library (IRIS-WS Library) allows Java developers to create programs that access the DMC archives seamlessly. By returning the data and information as native Java objects, the Library insulates the developer from data formats, network programming and web service details. The MATLAB interfaces leverage this library to give users access to the DMC archive directly from within MATLAB (r2009b or newer), returning data into variables for immediate use. Data users and research groups are developing other toolkits that use the DMC's web services. Notably, the ObsPy framework developed at LMU Munich is a Python toolbox that allows seamless access to data and information via the DMC services. Another example is the MATLAB-based GISMO and Waveform Suite developments, which can now access data via web services. In summary, there now exist a host of ways that researchers can bring IRIS DMC data directly into their workflows. MATLAB users can use irisFetch.m, command line users can use the various Fetch scripts, Java users can use the IRIS-WS library, and Python users may request data through ObsPy. To learn more about any of these clients see http://www.iris.edu/ws/wsclients/.
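
    R users are not named in the list above, but the same web services can be called from R with base functions alone. A hedged sketch, assuming the publicly documented fdsnws station endpoint and its net/sta/level/format query parameters (verify against the current service documentation before relying on them):

      url <- paste0("https://service.iris.edu/fdsnws/station/1/query?",
                    "net=IU&sta=ANMO&level=station&format=text")
      # The text format is pipe-delimited with a '#'-prefixed header line
      raw <- readLines(url)
      stations <- read.table(text = sub("^#", "", raw), sep = "|",
                             header = TRUE, strip.white = TRUE)
      head(stations)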

  7. EMODnet Physics: open and free marine physical data for science and for society

    NASA Astrophysics Data System (ADS)

    Nolan, G.; Novellino, A.; Gorringe, P.; Manzella, G. M. R., Sr.; Schaap, D.; Pouliquen, S.; Richards, L.

    2016-02-01

    Europe is sustaining a long-term strategy on Blue Growth, looking at seas and oceans as drivers for innovation and growth. A number of weaknesses have been identified, among which are gaps in knowledge and data about the state of our oceans, seabed resources, marine life and risks to habitats and ecosystems. The European Marine Observation and Data Network (EMODnet) has been created to improve the usefulness to European users, for scientific, regulatory and commercial purposes, of the observations and resulting marine data collected and held by European public and private bodies. EMODnet Physics provides access to a catalogue of archived and real-time data on the physical conditions of Europe's seas and oceans. The overall objectives are to provide access to archived and near-real-time data on physical conditions in Europe's seas and oceans by means of a dedicated portal, and to determine how well the data meet the needs of users from industry, public authorities and science. EMODnet Physics contributes to the broader initiative 'Marine Knowledge 2020' and, in particular, to the implementation of the European Copernicus programme, an EU-wide programme that aims to support policymakers, business, and citizens with improved environmental information. In the global context, Copernicus is an integral part of the Global Earth Observation System of Systems. Near-real-time data and metadata are populated by data owners, organized at the EuroGOOS level according to its regional operational systems (ROOSs) infrastructure and conventions, and made available through the EMODnet Physics user interface. The latest 60 days of data are freely viewable and downloadable, while access to older data (monthly archives) requires credentials. Archived data series and metadata are organized in collaboration with the NODC network (SeaDataNet). The data and metadata cover measurements of winds at the sea surface, waves, temperature and salinity, water velocities, light attenuation, sea level and ice coverage. EMODnet Physics has the specific objective of processing physical data into interoperable formats, which includes agreed standards, common baselines or reference conditions, and assessments of their accuracy and precision. The data and metadata are accessible through an ISO-, OGC- and INSPIRE-compliant portal that is operational 24/7.

  8. The Environmental Data Initiative: A broad-use data repository for environmental and ecological data that strives to balance data quality and ease of submission

    NASA Astrophysics Data System (ADS)

    Servilla, M. S.; Brunt, J.; Costa, D.; Gries, C.; Grossman-Clarke, S.; Hanson, P. C.; O'Brien, M.; Smith, C.; Vanderbilt, K.; Waide, R.

    2017-12-01

    In the world of data repositories, there seems to be a never-ending struggle between the generation of high-quality data documentation and the ease of archiving a data product in a repository - the higher the documentation standards, the greater the effort required of the scientist, and the less likely the data will be archived. The Environmental Data Initiative (EDI) attempts to balance the rigor of data documentation against the amount of effort required of a scientist to upload and archive data. As an outgrowth of the LTER Network Information System, the EDI is funded by the US NSF Division of Environmental Biology to support the LTER, LTREB, OBFS, and MSB programs, in addition to providing an open data archive for environmental scientists without a viable archive. EDI uses the PASTA repository software, developed originally by the LTER. PASTA is metadata-driven and documents data with the Ecological Metadata Language (EML), a high-fidelity standard that can describe all types of data in great detail. PASTA incorporates a series of data quality tests to ensure that data are correctly documented with EML, in a process termed "metadata and data congruence"; incongruent data packages are not admitted to the repository. EDI reduces the burden of data documentation on scientists in two ways. First, EDI provides hands-on assistance and tools for data documentation best practices - implemented in R, with Python versions under development - for generating EML. These tools hide the details of EML generation and syntax by providing a more natural and contextual setting for describing data. Second, EDI works closely with community information managers in defining the rules used in PASTA quality tests. Rules deemed too strict can be turned off completely or set to issue only a warning while the community learns how best to handle the situation and improve its documentation practices. Rules can also be added or refined over time to improve the overall quality of archived data. The outcomes of the quality tests are stored as part of the data archive in PASTA and are accessible to all users of the EDI data repository. In summary, EDI's metadata support for scientists and its comprehensive set of quality tests for metadata and data congruence provide an ideal archive for environmental and ecological data.
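
    As a hedged illustration of R-assisted EML generation (a sketch in the spirit of the tooling described above, not EDI's actual code), the fragment below builds and validates a minimal EML document with the EML package from CRAN; the dataset description is hypothetical, and a real document would need further required elements.

      library(EML)

      me <- list(individualName = list(givenName = "Jane", surName = "Doe"))
      doc <- list(dataset = list(
        title   = "Hypothetical stream chemistry observations, 2010-2020",
        creator = me,
        contact = me))

      write_eml(doc, "example-eml.xml")
      eml_validate("example-eml.xml")  # schema validity is a precondition for congruence checks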

  9. BOREAS AFM-04 Twin Otter Aircraft Flux Data

    NASA Technical Reports Server (NTRS)

    MacPherson, J. Ian; Hall, Forrest G. (Editor); Knapp, David E. (Editor); Desjardins, Raymond L.; Smith, David E. (Technical Monitor)

    2000-01-01

    The BOREAS AFM-05 team collected and processed data from numerous radiosonde flights during the project. The goals of the AFM-05 team were to provide large-scale definition of the atmosphere by supplementing the existing AES aerological network, both temporally and spatially. This data set includes basic upper-air parameters collected from the network of upper-air stations during the 1993, 1994, and 1996 field campaigns over the entire study region. The data are contained in tabular ASCII files. The data files are available on a CD-ROM (see document number 20010000884) or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).

  10. Solar-terrestrial data access distribution and archiving

    NASA Technical Reports Server (NTRS)

    1984-01-01

    It is recommended that a central data catalog and data access network (CDC/DAN) for solar-terrestrial research be established, initially as a NASA pilot program. The system is envisioned to be flexible and to evolve as funds permit, starting as a catalog and growing into an access network for high-resolution data. The report describes the various functional requirements for the CDC/DAN, but does not specify the hardware and software architectures, as these are constantly evolving. The importance of a steering committee, working with the CDC/DAN organization, to provide scientific guidelines for the data catalog and for data storage, access, and distribution is also stressed.

  11. Investigation of Magnetic Field Measurements.

    DTIC Science & Technology

    1983-02-28

    Keywords: AFGL Magnetometer Network; Fluxgate Magnetometer; Induction Coil Magnetometer; Temperature Dependency of Fluxgate; Automation of Testing. Abstract (legible fragment): "... data collection platforms. Support was also provided to AFGL to process the fluxgate magnetometer archive tapes in order to make the data available to ..."

  12. Defense Advanced Research Projects Agency (DARPA) Network Archive (DNA)

    DTIC Science & Technology

    2008-12-01

    Abstract (legible fragment): "... therefore decided for an iterative development process even within such a small project. The first iteration consisted of conducting specific ..."

  13. Rtop - an R package for interpolation along the stream network

    NASA Astrophysics Data System (ADS)

    Skøien, J. O.; Laaha, G.; Koffler, D.; Blöschl, G.; Pebesma, E.; Parajka, J.; Viglione, A.

    2012-04-01

    Geostatistical methods have a long tradition within the analysis of data that can be conceptualized as simple point data, such as soil properties, or as regular blocks, such as mining data. However, these methods have been used only to a limited extent for estimation along stream networks. A few exceptions are given by Gottschalk (1993), Sauquet et al. (2000), Gottschalk et al. (2006) and Skøien et al. (2006), and an overview by Laaha and Blöschl (2011). Interpolation of runoff characteristics is more complicated than for the traditional random variables estimated by geostatistical methods, as the measurements have a more complicated support and many catchments are nested. Skøien et al. (2006) presented the model Top-kriging, which takes these effects into account for interpolation of stream flow characteristics (exemplified by the 100-year flood). The method has here been implemented as a package in the open source statistical environment R (R Development Core Team 2011). Taking advantage of the existing methods in R for working with spatial objects, and the extensive possibilities for visualizing the results, this makes it considerably easier to apply the method to new data sets, in comparison to earlier implementations of the method. In addition to user feedback, the package has also been tested by colleagues whose only responsibility has been to search for bugs, inconsistencies and shortcomings in the documentation. The last part is often the part that gets the least attention in small open source projects, and we have solved this by acknowledging their efforts with co-authorship. The package will soon be uploaded to CRAN, but is in the meantime also available from R-Forge and can be installed by: > install.packages("rtop", repos="http://R-Forge.R-project.org") Gottschalk, L., 1993. Interpolation of runoff applying objective methods. Stochastic Hydrology and Hydraulics, 7, 269-281. Gottschalk, L., Krasovskaia, I., Leblois, E. & Sauquet, E., 2006. Mapping mean and variance of runoff in a river basin. Hydrology and Earth System Sciences, 10, 469-484. Laaha, G. & Blöschl, G., 2011. Geostatistics on river networks - a review. EGU General Assembly, Vienna, Austria. R Development Core Team, 2011. R: A language and environment for statistical computing. Vienna, Austria, ISBN 3-900051-07-0. Sauquet, E., Gottschalk, L. & Leblois, E., 2000. Mapping average annual runoff: A hierarchical approach applying a stochastic interpolation scheme. Hydrological Sciences Journal, 45 (6), 799-815. Skøien, J.O., Merz, R. & Blöschl, G., 2006. Top-kriging - geostatistics on stream networks. Hydrology and Earth System Sciences, 10, 277-287.
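
    Once installed, a typical top-kriging run looks roughly like the sketch below, with function names taken from the package documentation; observations and predictionLocations are assumed to be SpatialPolygonsDataFrames of gauged and ungauged catchments, with the observed flow characteristic in the column named in formulaString.

      library(rtop)

      params <- list(gDist = TRUE)  # use Ghosh-distances between catchment areas
      rt <- createRtopObject(observations, predictionLocations,
                             formulaString = obs ~ 1, params = params)
      rt <- rtopVariogram(rt)       # sample variogram of the areal observations
      rt <- rtopFitVariogram(rt)    # fit a point variogram consistent with the supports
      rt <- rtopKrige(rt)           # top-kriging predictions for the ungauged catchments
      summary(rt$predictions)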

  14. Feature selection using a one dimensional naïve Bayes’ classifier increases the accuracy of support vector machine classification of CDR3 repertoires

    PubMed Central

    Cinelli, Mattia; Sun, Yuxin; Best, Katharine; Heather, James M.; Reich-Zeliger, Shlomit; Shifrut, Eric; Friedman, Nir; Shawe-Taylor, John; Chain, Benny

    2017-01-01

    Motivation: Somatic DNA recombination, the hallmark of vertebrate adaptive immunity, has the potential to generate a vast diversity of antigen receptor sequences. How this diversity captures antigen specificity remains incompletely understood. In this study we use high-throughput sequencing to compare the global changes in T cell receptor β chain complementarity determining region 3 (CDR3β) sequences following immunization with ovalbumin administered with complete Freund's adjuvant (CFA) or CFA alone. Results: The CDR3β sequences were deconstructed into short stretches of overlapping contiguous amino acids. The motifs were ranked according to a one-dimensional Bayesian classifier score comparing their frequency in the repertoires of the two immunization classes. The top-ranking motifs were selected and used to create feature vectors, which were used to train a support vector machine. The support vector machine achieved high classification scores in a leave-one-out validation test, reaching >90% in some cases. Summary: The study describes a novel two-stage classification strategy combining a one-dimensional Bayesian classifier with a support vector machine. Using this approach we demonstrate that the frequency of a small number of linear motifs three amino acids in length can accurately identify a CD4 T cell response to ovalbumin against a background response to the complex mixture of antigens which characterize complete Freund's adjuvant. Availability and implementation: The sequence data are available at www.ncbi.nlm.nih.gov/sra/?term=SRP075893. The Decombinator package is available at github.com/innate2adaptive/Decombinator. The R package e1071 is available at the CRAN repository https://cran.r-project.org/web/packages/e1071/index.html. Contact: b.chain@ucl.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28073756
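
    A schematic re-creation of the two-stage pipeline in R with e1071 (the package cited above); the motif-count matrix and class labels are simulated stand-ins for the CDR3β repertoire data, and the motif score here is a simple difference of class means rather than the paper's Bayesian classifier.

      library(e1071)

      set.seed(7)
      n <- 40; p <- 200
      counts <- matrix(rpois(n * p, lambda = 2), n, p,
                       dimnames = list(NULL, paste0("motif", 1:p)))
      class <- factor(rep(c("OVA_CFA", "CFA"), each = n / 2))

      # Stage 1: rank motifs by a one-dimensional score and keep the top 20 as features
      score <- apply(counts, 2, function(x)
        abs(mean(x[class == "OVA_CFA"]) - mean(x[class == "CFA"])))
      top <- names(sort(score, decreasing = TRUE))[1:20]

      # Stage 2: leave-one-out validation of a support vector machine
      pred <- sapply(seq_len(n), function(i) {
        fit <- svm(counts[-i, top], class[-i])
        as.character(predict(fit, counts[i, top, drop = FALSE]))
      })
      mean(pred == as.character(class))  # leave-one-out accuracy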

  15. Taking digital imaging to the next level: challenges and opportunities.

    PubMed

    Hobbs, W Cecyl

    2004-01-01

    New medical imaging technologies, such as multi-detector computed tomography (CT) scanners and positron emission tomography (PET) scanners, are creating new possibilities for non-invasive diagnosis that are leading providers to invest heavily in them. The volume of data produced by such technology is so large that it cannot be "read" using traditional film-based methods, and once in digital form, it creates a massive data integration and archiving challenge. Despite the benefits of digital imaging and archiving, there are several key challenges that healthcare organizations should consider in planning, selecting, and implementing the information technology (IT) infrastructure to support digital imaging. Decisions about storage and image distribution are essentially questions of "where" and "how fast." When planning the digital archiving infrastructure, organizations should think about where they want to store and distribute their images. This is similar to the decisions that organizations have to make in regard to physical film storage and distribution, except that the portability of images is even greater in a digital environment. The principle of "network effects" seems like a simple concept, yet the effect is not always considered when implementing a technology plan. To fully realize the benefits of digital imaging, the radiology department must integrate the archiving solutions throughout the department and, ultimately, with applications across other departments and enterprises. Medical institutions can derive a number of benefits from implementing digital imaging and archiving solutions like PACS. Hospitals and imaging centers can use the transition from film-based imaging as a foundational opportunity to reduce costs, increase competitive advantage, attract talent, and improve service to patients. The key factors in achieving these goals include attention to the means of data storage, distribution, and protection.

  16. Rtop - an R package for interpolation of data with a variable spatial support - examples from river networks

    NASA Astrophysics Data System (ADS)

    Olav Skøien, Jon; Laaha, Gregor; Koffler, Daniel; Blöschl, Günter; Pebesma, Edzer; Parajka, Juraj; Viglione, Alberto

    2013-04-01

    Geostatistical methods have been applied only to a limited extent for spatial interpolation in applications where the observations have an irregular support, such as runoff characteristics or population health data. Several studies have shown the potential of such methods (Gottschalk 1993, Sauquet et al. 2000, Gottschalk et al. 2006, Skøien et al. 2006, Goovaerts 2008), but these developments have so far not led to easily accessible, versatile, easy-to-apply, open-source software. Based on the top-kriging approach suggested by Skøien et al. (2006), we here present the package rtop, which has been implemented in the statistical environment R (R Core Team 2012). Taking advantage of the existing methods in R for the analysis of spatial objects (Bivand et al. 2008), and the extensive possibilities for visualizing the results, rtop makes it easy to apply geostatistical interpolation methods when observations have a non-point spatial support. Although the package is flexible regarding data input, the main application so far has been interpolation along river networks. We present some examples showing how the package can easily be used for such interpolation. The package will soon be uploaded to CRAN, but is in the meantime also available from R-Forge and can be installed by: > install.packages("rtop", repos="http://R-Forge.R-project.org") Bivand, R.S., Pebesma, E.J. & Gómez-Rubio, V., 2008. Applied spatial data analysis with R. Springer. Goovaerts, P., 2008. Kriging and semivariogram deconvolution in the presence of irregular geographical units. Mathematical Geosciences, 40 (1), 101-128. Gottschalk, L., 1993. Interpolation of runoff applying objective methods. Stochastic Hydrology and Hydraulics, 7, 269-281. Gottschalk, L., Krasovskaia, I., Leblois, E. & Sauquet, E., 2006. Mapping mean and variance of runoff in a river basin. Hydrology and Earth System Sciences, 10, 469-484. R Core Team, 2012. R: A language and environment for statistical computing. Vienna, Austria, ISBN 3-900051-07-0. Sauquet, E., Gottschalk, L. & Leblois, E., 2000. Mapping average annual runoff: A hierarchical approach applying a stochastic interpolation scheme. Hydrological Sciences Journal, 45 (6), 799-815. Skøien, J.O., Merz, R. & Blöschl, G., 2006. Top-kriging - geostatistics on stream networks. Hydrology and Earth System Sciences, 10, 277-287.

  17. Seventeenth-century experimenta, magisterial formulae and the ‘animal alkahest’: new documents found in Royal Society archives

    PubMed Central

    Alfonso-Goldfarb, Ana Maria; Ferraz, Márcia Helena Mendes; Rattansi, Piyo M.

    2015-01-01

    In this paper we present three newly rediscovered documents from the Royal Society archives that, together with the four described in our previous publications, constitute a set on a cognate theme. The documents were written by, or addressed to, members of the early Royal Society, and their subject is several magisterial formulae, including J. B. van Helmont's alkahest and Ludus. In addition to the interest in those formulae as medicines for various grave illnesses, our analysis showed that some seventeenth-century scholars sought to explain operations of the animal body by invoking similar but natural substances, while attempting to assimilate the latest anatomical discoveries into a novel framework. The complete set of documents sheds some new light on the interests of seventeenth-century networks of scholars concerned with experimenta. PMID:26665488

  18. Fiber Optic Communication System For Medical Images

    NASA Astrophysics Data System (ADS)

    Arenson, Ronald L.; Morton, Dan E.; London, Jack W.

    1982-01-01

    This paper discusses a fiber optic communication system linking ultrasound devices, computerized tomography scanners, a nuclear medicine computer system, and a digital fluorographic system to a central radiology research computer. These centrally archived images are available for near-instantaneous recall at various display consoles. When a suitable laser optical disk is available for mass storage, more extensive image archiving will be added to the network, including digitized images of standard radiographs for comparison purposes and for remote display in such areas as the intensive care units, the operating room, and selected outpatient departments. This fiber optic system allows for the transfer of high-resolution images in less than a second over distances exceeding 2,000 feet. The advantages of using fiber optic cables instead of typical parallel or serial communication techniques are described. The switching methodology and communication protocols are also discussed.

  19. The philosophy of benchmark testing a standards-based picture archiving and communications system.

    PubMed

    Richardson, N E; Thomas, J A; Lyche, D K; Romlein, J; Norton, G S; Dolecek, Q E

    1999-05-01

    The Department of Defense issued its requirements for a Digital Imaging Network-Picture Archiving and Communications System (DIN-PACS) in a Request for Proposals (RFP) to industry in January 1997, with subsequent contracts being awarded in November 1997 to the Agfa Division of Bayer and IBM Global Government Industry. The Government's technical evaluation process consisted of evaluating a written technical proposal as well as conducting a benchmark test of each proposed system at the vendor's test facility. The purpose of benchmark testing was to evaluate the performance of the fully integrated system in a simulated operational environment. The benchmark test procedures and test equipment were developed through a joint effort between the Government, academic institutions, and private consultants. Herein the authors discuss the resources required and the methods used to benchmark test a standards-based PACS.

  20. Microbe-ID: an open source toolbox for microbial genotyping and species identification.

    PubMed

    Tabima, Javier F; Everhart, Sydney E; Larsen, Meredith M; Weisberg, Alexandra J; Kamvar, Zhian N; Tancos, Matthew A; Smart, Christine D; Chang, Jeff H; Grünwald, Niklaus J

    2016-01-01

    Development of tools to identify species, genotypes, or novel strains of invasive organisms is critical for monitoring emergence and implementing rapid response measures. Molecular markers, although critical to identifying species or genotypes, require bioinformatic tools for analysis. However, user-friendly analytical tools for fast identification are not readily available. To address this need, we created a web-based set of applications called Microbe-ID that allow for customizing a toolbox for rapid species identification and strain genotyping using any genetic markers of choice. Two components of Microbe-ID, named Sequence-ID and Genotype-ID, implement species and genotype identification, respectively. Sequence-ID allows identification of species by using BLAST to query sequences for any locus of interest against a custom reference sequence database. Genotype-ID allows placement of an unknown multilocus marker in either a minimum spanning network or dendrogram with bootstrap support from a user-created reference database. Microbe-ID can be used for identification of any organism based on nucleotide sequences or any molecular marker type and several examples are provided. We created a public website for demonstration purposes called Microbe-ID (microbe-id.org) and provided a working implementation for the genus Phytophthora (phytophthora-id.org). In Phytophthora-ID, the Sequence-ID application allows identification based on ITS or cox spacer sequences. Genotype-ID groups individuals into clonal lineages based on simple sequence repeat (SSR) markers for the two invasive plant pathogen species P. infestans and P. ramorum. All code is open source and available on github and CRAN. Instructions for installation and use are provided at https://github.com/grunwaldlab/Microbe-ID.
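
    A hedged sketch of the Genotype-ID idea in R: place isolates in a bootstrapped dendrogram computed from SSR profiles. It uses base R plus the ape package; the allele matrix is invented, and the plain Euclidean distance is a stand-in for the genetic distances used by Microbe-ID.

      library(ape)

      set.seed(42)
      ssr <- matrix(sample(150:220, 8 * 6, replace = TRUE), nrow = 8,
                    dimnames = list(paste0("isolate", 1:8), paste0("SSR", 1:6)))

      hc <- hclust(dist(ssr), method = "average")  # UPGMA-style dendrogram
      tr <- as.phylo(hc)

      # Bootstrap support by resampling SSR loci
      bs <- boot.phylo(tr, ssr, function(m) as.phylo(hclust(dist(m), "average")), B = 100)
      plot(tr); nodelabels(bs)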

  1. Picture archiving and communication systems (PACS).

    PubMed

    Gamsu, Gordon; Perez, Enrico

    2003-07-01

    Over the past 2 decades, groups of computer scientists, electronic design engineers, and physicians, in universities and industry, have worked to achieve an electronic environment for the practice of medicine and radiology. The radiology component of this revolution is often called PACS (picture archiving and communication systems). More recently it has become evident that the efficiencies and cost savings of PACS are realized when they are part of an enterprise-wide electronic medical record. The installation of PACS requires careful planning by all the various stakeholders over many months prior to installation. All of the users must be aware of the initial disruption that will occur as they become familiar with the systems. A modern fourth-generation PACS is linked to radiology and hospital information systems. The PACS consists of electronic acquisition sites, a robust network intelligently managed by a server, multiple viewing sites, and an archive. The details of how these are linked, and the accompanying workflow analysis, determine the success of PACS. PACS evolves over time and components are frequently replaced, so users must expect continuous learning about new updates and improved functionality. The digital medical revolution is rapidly being adopted in many medical centers, improving patient care and the success of the institution.

  2. Definition and evaluation of the data-link layer of PACnet

    NASA Astrophysics Data System (ADS)

    Alsafadi, Yasser H.; Martinez, Ralph; Sanders, William H.

    1991-07-01

    PACnet is a 200-500 Mbps dual-ring fiber optic network designed to implement a picture archiving and communication system (PACS) in a hospital environment. The network consists of three channels: an image transfer channel, a command and control channel, and a real-time data channel. An initial network interface unit (NIU) design for PACnet consisted of a functional description of the protocols and the NIU's major components. In order to develop a demonstration prototype, additional definition of the protocol algorithms of each channel is necessary. Using the International Standards Organization/Open Systems Interconnection (ISO/OSI) reference model as a guide, the definition of the data link layer is extended. This definition covers interface service specifications for the two constituent sublayers: logical link control (LLC) and medium access control (MAC). Furthermore, it describes procedures for data transfer and mechanisms for error detection and fault recovery. A performance evaluation study was then made to determine how the network performs under various application scenarios. The performance evaluation study was performed using stochastic activity networks, which can formally describe the network behavior. The results of the study demonstrate the feasibility of PACnet as an integrated image, data, and voice network for PACS.

  3. Problem of data quality and the limitations of the infrastructure approach

    NASA Astrophysics Data System (ADS)

    Behlen, Fred M.; Sayre, Richard E.; Rackus, Edward; Ye, Dingzhong

    1998-07-01

    The 'Infrastructure Approach' is a PACS implementation methodology wherein the archive, network and information systems interfaces are acquired first, and workstations are installed later. The approach allows building a history of archived image data, so that most prior examinations are available in digital form when workstations are deployed. A limitation of the Infrastructure Approach is that the deferred use of digital image data defeats many data quality management functions that are provided automatically by human mechanisms when data are immediately used for the completion of clinical tasks. If the digital data are used solely for archiving while reports are interpreted from film, the radiologist serves only as a check against lost films, and another person must be designated as responsible for the quality of the digital data. Data from the Radiology Information System and the PACS were analyzed to assess the nature and frequency of system and data quality errors. The error level was found to be acceptable if supported by auditing and error resolution procedures requiring additional staff time, and in any case was better than the loss rate of a hardcopy film archive. It is concluded that the problem of data quality compromises but does not negate the value of the Infrastructure Approach. The Infrastructure Approach is best employed only to a limited extent; any phased PACS implementation should include a substantial complement of workstations dedicated to softcopy interpretation for at least some applications, with full deployment following not long thereafter.

  4. Quantifiable and objective approach to organizational performance enhancement.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholand, Andrew Joseph; Tausczik, Yla R.

    This report describes a new methodology, social language network analysis (SLNA), that combines tools from social language processing and network analysis to identify socially situated relationships between individuals which, though subtle, are highly influential. Specifically, SLNA aims to identify and characterize the nature of working relationships by processing artifacts generated with computer-mediated communication systems, such as instant message texts or emails. Because social language processing is able to identify psychological, social, and emotional processes that individuals are not able to fully mask, social language network analysis can clarify and highlight complex interdependencies between group members, even when these relationships are latent or unrecognized. This report outlines the philosophical antecedents of SLNA, the mechanics of preprocessing, processing, and post-processing stages, and some example results obtained by applying this approach to a 15-month corporate discussion archive.
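
    SLNA's language-scoring stage cannot be reproduced here, but the network half of the method reduces to building a weighted graph over communicators. An illustrative R sketch with the igraph package, where an invented message log stands in for the corporate discussion archive:

      library(igraph)

      msgs <- data.frame(from = c("ann", "ann", "bob", "carl", "carl"),
                         to   = c("bob", "carl", "carl", "ann", "bob"))

      # Weight each directed tie by message count
      edges <- aggregate(weight ~ from + to, cbind(msgs, weight = 1), sum)
      g <- graph_from_data_frame(edges, directed = TRUE)

      # Simple proxies for influence in the latent relationship structure
      strength(g, mode = "out")
      betweenness(g)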

  5. Perception de la verticale avec Un cadre visuel solidaire de la tete: implications pour la conception des afficheurs de casques en ae’ronauflque (Perception of the Vertical With a Head-Mounted Visual Frame: Implication for the Design of Helmet-Mounted Displays in Aeronautics)

    DTIC Science & Technology

    2003-02-01

    [Translated from French; only fragments of the scanned abstract are legible:] ... than that of the head. Methods: Twelve subjects, 9 men and 3 women, aged 23 to 41 years ... which gives the sensation of viewing a computer screen, centered on the interocular axis, with an angular size of 30 x 22.5 degrees. The virtual screen appeared ... tilted identically with respect to gravity. The first trial was always performed with the head upright. Then, a new orientation of ...

  6. Overview of PACS

    NASA Astrophysics Data System (ADS)

    Vanden Brink, John A.

    1995-08-01

    Development of the DICOM standard and incremental developments in workstation, network, compression, archiving, and digital x-ray technology have produced cost effective image communication possibilities for selected medical applications. The emerging markets include modality PACS, mini PACS, and teleradiology. Military and VA programs lead the way in the move to adopt PACS technology. Commercial markets for PACS components and PAC systems are at LR400 million growing to LR500 million in 1996.

  7. International Federation of Library Associations Annual Conference. Papers of the Management and Technology Division: Information Technology Section (47th, Leipzig, East Germany, August 17-22, 1981).

    ERIC Educational Resources Information Center

    Bradler, Reinhard; And Others

    These seven papers on library management and networks focus on: (1) computerized access to archival and library materials, describing the methodological problems associated with a pilot project in the German Democratic Republic, as well as the efficiency of data bank systems; (2) present and future development of libraries and information centers…

  8. Performance enhancement of a web-based picture archiving and communication system using commercial off-the-shelf server clusters.

    PubMed

    Liu, Yan-Lin; Shih, Cheng-Ting; Chang, Yuan-Jen; Chang, Shu-Jun; Wu, Jay

    2014-01-01

    The rapid development of picture archiving and communication systems (PACSs) has thoroughly changed the way medical information is communicated and managed. However, as the scale of a hospital's operations increases, the large volume of digital images transferred over the network inevitably decreases system efficiency. In this study, a server cluster consisting of two server nodes was constructed. Network load balancing (NLB), distributed file system (DFS), and structured query language (SQL) duplication services were installed. A total of 1 to 16 workstations were used to transfer computed radiography (CR), computed tomography (CT), and magnetic resonance (MR) images simultaneously to simulate the clinical situation. The average transmission rate (ATR) was analyzed between the cluster and noncluster servers. In the download scenario, the ATRs of CR, CT, and MR images increased by 44.3%, 56.6%, and 100.9%, respectively, when using the server cluster, whereas the ATRs increased by 23.0%, 39.2%, and 24.9% in the upload scenario. In the mixed scenario, the transmission performance increased by 45.2% when using eight computer units. The fault tolerance mechanisms of the server cluster maintained system availability and image integrity. The server cluster can improve transmission efficiency while maintaining high reliability and continuous availability in a healthcare environment.
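
    To make the reported gains concrete, the R fragment below applies the download-scenario percentages to invented baseline rates; only the percentage increases come from the study.

      baseline <- c(CR = 4.0, CT = 6.0, MR = 3.0)       # hypothetical MB/s without clustering
      gain     <- c(CR = 0.443, CT = 0.566, MR = 1.009) # reported download-scenario increases
      round(baseline * (1 + gain), 2)                   # e.g. the MR rate roughly doubles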

  9. Performance Enhancement of a Web-Based Picture Archiving and Communication System Using Commercial Off-the-Shelf Server Clusters

    PubMed Central

    Liu, Yan-Lin; Shih, Cheng-Ting; Chang, Yuan-Jen; Chang, Shu-Jun; Wu, Jay

    2014-01-01

    The rapid development of picture archiving and communication systems (PACSs) has thoroughly changed the way medical information is communicated and managed. However, as the scale of a hospital's operations increases, the large volume of digital images transferred over the network inevitably decreases system efficiency. In this study, a server cluster consisting of two server nodes was constructed. Network load balancing (NLB), distributed file system (DFS), and structured query language (SQL) duplication services were installed. A total of 1 to 16 workstations were used to transfer computed radiography (CR), computed tomography (CT), and magnetic resonance (MR) images simultaneously to simulate the clinical situation. The average transmission rate (ATR) was analyzed between the cluster and noncluster servers. In the download scenario, the ATRs of CR, CT, and MR images increased by 44.3%, 56.6%, and 100.9%, respectively, when using the server cluster, whereas the ATRs increased by 23.0%, 39.2%, and 24.9% in the upload scenario. In the mixed scenario, the transmission performance increased by 45.2% when using eight computer units. The fault tolerance mechanisms of the server cluster maintained system availability and image integrity. The server cluster can improve transmission efficiency while maintaining high reliability and continuous availability in a healthcare environment. PMID:24701580

  10. Performance of a distributed superscalar storage server

    NASA Technical Reports Server (NTRS)

    Finestead, Arlan; Yeager, Nancy

    1993-01-01

    The RS/6000 performed well in our test environment. The potential exists for the RS/6000 to act as a departmental server for a small number of users, rather than as a high-speed archival server. Multiple UniTree Disk Servers utilizing one UniTree Name Server could be deployed, which would allow for a cost-effective archival system. Our performance tests were clearly limited by the network bandwidth. The performance gathered by the LibUnix testing shows that UniTree is capable of exceeding Ethernet speeds on an RS/6000 Model 550. The performance of FTP might be significantly faster across a higher-bandwidth network. The UniTree Name Server also showed signs of being a potential bottleneck. UniTree sites that require a high ratio of file creations and deletions to reads and writes would run into this bottleneck. It is possible to improve UniTree Name Server performance by bypassing the UniTree LibUnix library altogether, communicating directly with the UniTree Name Server, and optimizing creations. Although testing was performed in a less-than-ideal environment, the performance statistics stated in this paper will hopefully give end users a realistic idea of what performance they can expect in this type of setup.

  11. Wallops Ship Surveillance System

    NASA Technical Reports Server (NTRS)

    Smith, Donna C.

    2011-01-01

    Approved as a Wallops control center backup system, the Wallops Ship Surveillance Software is a day-of-launch risk analysis tool for spaceport activities. The system calculates impact probabilities and displays ship locations relative to boundary lines. It enables rapid analysis of possible flight paths to preclude the need to cancel launches and to allow execution of launches in a timely manner. Its design is based on low-cost, large-customer-base elements including personal computers, the Windows operating system, C/C++ object-oriented software, and network interfaces. In conformance with the NASA software safety standard, the system is designed to ensure that it does not falsely report a safe-for-launch condition. To improve on the current ship surveillance method, the system is also designed to prevent delay of launch under a safe-for-launch condition. A single workstation is designated the controller of the official ship information and the official risk analysis. Copies of this information are shared with other networked workstations. The program design is divided into five subsystem areas: 1. Communication Link -- threads that control the networking of workstations; 2. Contact List -- a thread that controls a list of protected item (ocean vessel) information; 3. Hazard List -- threads that control a list of hazardous item (debris) information and associated risk calculation information; 4. Display -- threads that control operator inputs and screen display outputs; and 5. Archive -- a thread that controls archive file read and write access. Currently, most of the hazard list thread and parts of other threads are being reused as part of a new ship surveillance system, under the SureTrak project.

  12. Building the European Seismological Research Infrastructure: results from 4 years NERIES EC project

    NASA Astrophysics Data System (ADS)

    van Eck, T.; Giardini, D.

    2010-12-01

    The EC Research Infrastructure (RI) project Network of Research Infrastructures for European Seismology (NERIES) implemented a comprehensive, integrated European RI for earthquake seismological data that is scalable and sustainable. NERIES opened up a significant amount of additional seismological data, integrated different distributed data archives, and implemented and produced advanced analysis tools and software packages. A single seismic data portal provides a single access point and overview for European seismological data available to the earth science research community. Additional data access tools and sites have been implemented to meet user and robustness requirements, notably those at the EMSC and ORFEUS. The datasets compiled in NERIES and available through the portal include, among others: - The expanded Virtual European Broadband Seismic Network (VEBSN) with real-time access to more than 500 stations from > 53 observatories. These data are continuously monitored, quality controlled and archived in the European Integrated Distributed waveform Archive (EIDA). - A unique integration of acceleration datasets from seven networks in seven European or associated countries, centrally accessible in a homogeneous format, thus forming the core of a comprehensive European acceleration database. Standardized parameter analysis and the actual software are included in the database. - A Distributed Archive of Historical Earthquake Data (AHEAD) for research purposes, containing among others a comprehensive European Macroseismic Database and Earthquake Catalogue (1000 - 1963, M ≥ 5.8), including analysis tools. - Data from three one-year OBS deployments at three sites (Atlantic, Ionian and Ligurian Sea) in the general SEED format, thus creating the core integrated database for ocean, sea and land based seismological observatories. Tools to facilitate analysis and data mining of the RI datasets are: - A comprehensive set of European seismological velocity reference models, including a standardized model description, with several visualisation tools currently being adapted to a global scale. - An integrated approach to seismic hazard modelling and forecasting, a community-accepted forecast testing and model validation approach, and the core hazard portal developed with the same technologies as the NERIES data portal. - Homogeneous shakemap estimation tools implemented at several large European observatories and a complementary new loss estimation software tool. - A comprehensive set of new techniques for geotechnical site characterization, with relevant software packages documented and maintained (www.geopsy.org). - A set of software packages for data mining, data reduction, data exchange and information management in seismology, serving as research and observatory analysis tools. NERIES has a long-term impact and is coordinated with the related US initiatives IRIS and EarthScope. The follow-up EC project of NERIES, NERA (2010 - 2014), is funded and will integrate the seismological and earthquake engineering infrastructures. NERIES further provided the proof of concept for the ESFRI 2008 initiative: the European Plate Observing System (EPOS). Its preparatory phase (2010 - 2014) is also funded by the EC.

  13. Fine-Tuning Neural Patient Question Retrieval Model with Generative Adversarial Networks.

    PubMed

    Tang, Guoyu; Ni, Yuan; Wang, Keqiang; Yong, Qin

    2018-01-01

    The online patient question and answering (Q&A) system attracts an increasing number of users in China. Patients post their questions and wait for doctors' responses. To avoid the lag time involved in waiting and to reduce the workload on the doctors, a better method is to automatically retrieve a semantically equivalent question from the archive. We present a Generative Adversarial Network (GAN) based approach to automatically retrieve patient questions. We apply supervised deep learning based approaches to determine the similarity between patient questions. A GAN framework is then used to fine-tune the pre-trained deep learning models. The experiment results show that fine-tuning with GAN can improve performance.

  14. A portal for the ocean biogeographic information system

    USGS Publications Warehouse

    Zhang, Yunqing; Grassle, J. F.

    2002-01-01

    Since its inception in 1999, the Ocean Biogeographic Information System (OBIS) has developed into an international science program as well as a globally distributed network of biogeographic databases. An OBIS portal at Rutgers University provides the links and functional interoperability among member database systems. Protocols and standards have been established to support effective communication between the portal and these functional units. The portal provides distributed data searching, a taxonomy name service, a GIS with access to relevant environmental data, biological modeling, and education modules for mariners, students, environmental managers, and scientists. The portal will integrate Census of Marine Life field projects, national data archives, and other functional modules, and will provide network-wide analysis and modeling tools.

  15. A high-speed network for cardiac image review.

    PubMed

    Elion, J L; Petrocelli, R R

    1994-01-01

    A high-speed fiber-based network for the transmission and display of digitized full-motion cardiac images has been developed. Based on Asynchronous Transfer Mode (ATM), the network is scalable, meaning that the same software and hardware are used for a small local area network or for a large multi-institutional network. The system can handle uncompressed digital angiographic images, considered to be at the "high-end" of the bandwidth requirements. Along with the networking, a general-purpose multi-modality review station has been implemented without specialized hardware. This station can store a full injection sequence in "loop RAM" in a 512 x 512 format, then interpolate to 1024 x 1024 while displaying at 30 frames per second. The network and review stations connect to a central file server that uses a virtual file system to make a large high-speed RAID storage disk and associated off-line storage tapes and cartridges all appear as a single large file system to the software. In addition to supporting archival storage and review, the system can also digitize live video using high-speed Direct Memory Access (DMA) from the frame grabber to present uncompressed data to the network. Fully functional prototypes have provided the proof of concept, with full deployment in the institution planned as the next stage.
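
    As a rough, illustrative check of the abstract's bandwidth claim (assuming 8-bit grayscale pixels, which the abstract does not state), the uncompressed data rate of a 512 x 512, 30 frames-per-second loop can be estimated in a few lines of R:

        # Estimate the uncompressed bandwidth for angiographic playback.
        # Assumption (not stated in the abstract): 8-bit grayscale pixels.
        frame_px     <- 512 * 512   # pixels per frame
        fps          <- 30          # display rate
        bytes_per_px <- 1           # assumed 8-bit grayscale
        bytes_per_s  <- frame_px * fps * bytes_per_px
        sprintf("~%.1f MB/s, ~%.0f Mbit/s",
                bytes_per_s / 2^20, bytes_per_s * 8 / 1e6)
        # ~7.5 MB/s, ~63 Mbit/s: beyond the shared 10 Mbit/s LANs of the
        # era, which is why a scalable ATM fabric was attractive.

    The 1024 x 1024 interpolated display quadruples the pixel count again, underscoring why the authors treated uncompressed angiography as the "high-end" bandwidth case.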

  16. A high-speed network for cardiac image review.

    PubMed Central

    Elion, J. L.; Petrocelli, R. R.

    1994-01-01

    A high-speed fiber-based network for the transmission and display of digitized full-motion cardiac images has been developed. Based on Asynchronous Transfer Mode (ATM), the network is scalable, meaning that the same software and hardware are used for a small local area network or for a large multi-institutional network. The system can handle uncompressed digital angiographic images, considered to be at the "high-end" of the bandwidth requirements. Along with the networking, a general-purpose multi-modality review station has been implemented without specialized hardware. This station can store a full injection sequence in "loop RAM" in a 512 x 512 format, then interpolate to 1024 x 1024 while displaying at 30 frames per second. The network and review stations connect to a central file server that uses a virtual file system to make a large high-speed RAID storage disk and associated off-line storage tapes and cartridges all appear as a single large file system to the software. In addition to supporting archival storage and review, the system can also digitize live video using high-speed Direct Memory Access (DMA) from the frame grabber to present uncompressed data to the network. Fully functional prototypes have provided the proof of concept, with full deployment in the institution planned as the next stage. PMID:7949964

  17. Maritime Aerosol Network as a Component of AERONET - a Useful Tool for Evaluation of the Global Sea-Salt Aerosol Distribution

    NASA Astrophysics Data System (ADS)

    Smirnov, A.; Holben, B. N.; Kinne, S.; Nelson, N. B.; Stenchikov, G. L.; Broccardo, S. P.; Sowers, D.; Lobecker, E.; Ondrusek, M.; Zielinski, T. P.; Gray, L. M.; Frouin, R.; Radionov, V. F.; Smyth, T. J.; Zibordi, G.; Heller, M. I.; Slabakova, V.; Krüger, K.; Reid, E. A.; Istomina, L.; Vandermeulen, R. A.; O'Neill, N. T.; Levy, G.; Giles, D. M.; Slutsker, I.; Sorokin, M. G.; Eck, T. F.

    2016-02-01

    Sea-salt aerosol plays an important role in the radiation balance and chemistry of the marine atmosphere. Sea-salt production depends on various factors, and there is significant uncertainty in the parametrization of sea-salt production and budget. Ship-based aerosol optical depth (AOD) measurements can be used as an important validation tool for various global models and in-situ measurements. The paper presents the current status of the Maritime Aerosol Network (MAN), which is a component of the Aerosol Robotic Network (AERONET). Since 2006, over 300 cruises have been completed, and a data archive of more than 5500 measurement days is accessible at http://aeronet.gsfc.nasa.gov/new_web/maritime_aerosol_network.html . AOD measurements from ships of opportunity complement island-based AERONET measurements and provide important reference points for satellite-retrieved and modelled AOD climatologies over the oceans. The program exemplifies a mutually beneficial international, multi-agency effort in atmospheric aerosol optical studies over the oceans.

  18. Analysis of Context Dependence in Social Interaction Networks of a Massively Multiplayer Online Role-Playing Game

    PubMed Central

    Son, Seokshin; Kang, Ah Reum; Kim, Hyun-chul; Kwon, Taekyoung; Park, Juyong; Kim, Huy Kang

    2012-01-01

    Rapid advances in modern computing and information technology have enabled millions of people to interact online via various social network and gaming services. The widespread adoption of such online services has made possible the analysis of large-scale archival data containing detailed human interactions, presenting a very promising opportunity to understand rich and complex human behavior. In collaboration with a leading global provider of Massively Multiplayer Online Role-Playing Games (MMORPGs), here we present a network science-based analysis of the interplay between distinct types of user interaction networks in the virtual world. We find that their properties depend critically on the nature of the context-interdependence of the interactions, highlighting the complex and multilayered nature of human interactions, a robust understanding of which we believe may prove instrumental in the design of more realistic future virtual arenas, as well as provide novel insights into the science of collective human behavior. PMID:22496771

  19. Mass storage technology in networks

    NASA Astrophysics Data System (ADS)

    Ishii, Katsunori; Takeda, Toru; Itao, Kiyoshi; Kaneko, Reizo

    1990-08-01

    Trends and features of mass storage subsystems in networks are surveyed and their key technologies spotlighted. Storage subsystems are becoming increasingly important in new network systems in which communications and data processing are systematically combined. These systems require a new class of high-performance mass-information storage in order to effectively utilize their processing power. The requirements of high transfer rates, high transaction rates and large storage capacities, coupled with high functionality, fault tolerance and flexibility in configuration, are major challenges for storage subsystems. Recent progress in optical disk technology has improved the performance of on-line external memories based on optical disk drives, which now compete with mid-range magnetic disks. Optical disks are more effective than magnetic disks for low-traffic, random-access files storing multimedia data that require large capacity, such as archive use and information distribution by ROM disks. Finally, image-coded document file servers for local area network use that employ 130 mm rewritable magneto-optical disk subsystems are demonstrated.

  20. Application-driven strategies for efficient transfer of medical images over very high speed networks

    NASA Astrophysics Data System (ADS)

    Alsafadi, Yasser H.; McNeill, Kevin M.; Martinez, Ralph

    1993-09-01

    The American College of Radiology (ACR) and the National Electrical Manufacturers Association (NEMA) formed the ACR-NEMA committee in 1982 to develop a standard enabling equipment from different vendors to communicate and participate in a picture archiving and communications system (PACS). The standard focused mostly on the interconnectivity issues and communication needs of PACS. It was patterned after the International Organization for Standardization Open Systems Interconnection (ISO/OSI) reference model. Three versions of the standard appeared, evolving from a simple point-to-point specification of the connection between two medical devices to a complex standard for a network environment. However, fast changes in network software and hardware technologies make it difficult for the standard to keep pace. This paper compares two versions of the ACR-NEMA standard and then describes a system that is used at the University of Arizona Intensive Care Unit. In this system, the application specifies the interface to network services and the grade of service required. These provisions are suggested to make the application independent of evolving network technology and to support true open systems.

  1. Squish: Near-Optimal Compression for Archival of Relational Datasets

    PubMed Central

    Gao, Yihan; Parameswaran, Aditya

    2017-01-01

    Relational datasets are being generated at an alarmingly rapid rate across organizations and industries. Compressing these datasets could significantly reduce storage and archival costs. Traditional compression algorithms, e.g., gzip, are suboptimal for compressing relational datasets since they ignore the table structure and relationships between attributes. We study compression algorithms that leverage the relational structure to compress datasets to a much greater extent. We develop Squish, a system that uses a combination of Bayesian Networks and Arithmetic Coding to capture multiple kinds of dependencies among attributes and achieve a near-entropy compression rate. Squish also supports user-defined attributes: users can instantiate new data types by simply implementing five functions for a new class interface. We prove the asymptotic optimality of our compression algorithm and conduct experiments to show the effectiveness of our system: Squish achieves a reduction of over 50% in storage size relative to systems developed in prior work on a variety of real datasets. PMID:28180028
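
    The "five functions for a new class interface" suggest a small plug-in contract for user-defined attribute types. The R sketch below renders that idea with invented function names (parse, serialize, n_states, prob, update); the paper's actual interface is not specified in this abstract.

        # Hypothetical five-function contract for a user-defined attribute
        # type; the model here is a Laplace-smoothed frequency table.
        make_attribute_type <- function(parse, serialize, n_states, prob, update) {
          stopifnot(is.function(parse), is.function(serialize),
                    is.function(n_states), is.function(prob), is.function(update))
          list(parse = parse, serialize = serialize,
               n_states = n_states, prob = prob, update = update)
        }

        bool_type <- local({
          counts <- c(`TRUE` = 1, `FALSE` = 1)   # smoothed counts
          make_attribute_type(
            parse     = function(s) as.logical(s),
            serialize = function(v) as.character(v),
            n_states  = function() 2L,
            prob      = function(v) counts[as.character(v)] / sum(counts),
            update    = function(v) {
              counts[as.character(v)] <<- counts[as.character(v)] + 1
            }
          )
        })

        bool_type$prob(TRUE)    # 0.5 before any observations
        bool_type$update(TRUE)
        bool_type$prob(TRUE)    # 2/3 after observing one TRUE

    A compressor of this kind would call prob() to drive the arithmetic coder and update() as values stream through, so the probability model adapts to the data.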

  2. Repurposing traditional instructor-led lectures for continuing education: rewarding instructors as authors and maximizing return on investment.

    PubMed

    Rushinek, Avi; Rushinek, Sara; Lippincott, Christine; Ambrosia, Todd

    2014-04-01

    The aim of this article is to describe the repurposing of classroom video surveillance and on-screen archives (RCVSOSA) model, which is an innovative, technology-enabled approach to continuing education in nursing. The RCVSOSA model leverages networked Internet-protocol high-definition surveillance cameras to record videos of classroom lectures that can be automatically uploaded to the Internet or converted to DVD, either in their entirety or as content-specific modules, with the production work embedded in the technology. The proposed model supports health care continuing education through the use of online assessments for focused education modules, access to archived online recordings and DVD training courses, voice-to-text transcripts, and possibly continuing education modules that may be translated into multiple languages. Potential benefits of this model include increased access to educational modules for students, instant authorship, and financial compensation for instructors and their respective organizations.

  3. CFD Data Sets on the WWW for Education and Testing

    NASA Technical Reports Server (NTRS)

    Globus, Al; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    The Numerical Aerodynamic Simulation (NAS) Systems Division at NASA Ames Research Center has begun the development of a Computational Fluid Dynamics (CFD) data set archive on the World Wide Web (WWW) at URL http://www.nas.nasa.gov/NAS/DataSets/. Data sets are integrated with related information such as research papers, metadata, visualizations, etc. In this paper, four classes of users are identified and discussed: students, visualization developers, CFD practitioners, and management. Bandwidth and security issues are briefly reviewed and the status of the archive as of May 1995 is examined. Routine network distribution of data sets is likely to have profound implications for the conduct of science. The exact nature of these changes is subject to speculation, but the ability for anyone to examine the data, in addition to the investigator's analysis, may well play an important role in the future.

  4. An Approach to Data Center-Based KDD of Remote Sensing Datasets

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Mack, Robert; Wharton, Stephen W. (Technical Monitor)

    2001-01-01

    The data explosion in remote sensing is straining the ability of data centers to deliver the data to the user community, yet many large-volume users actually seek a relatively small information component within the data, which they extract at their sites using Knowledge Discovery in Databases (KDD) techniques. To improve the efficiency of this process, the Goddard Earth Sciences Distributed Active Archive Center (GES DAAC) has implemented a KDD subsystem that supports execution of the user's KDD algorithm at the data center, dramatically reducing the volume that is sent to the user. The data are extracted from the archive in a planned, organized "campaign"; the algorithms are executed, and the output products sent to the users over the network. The first campaign, now complete, has resulted in overall reductions in shipped volume from 3.3 TB to 0.4 TB.

  5. [From planning to realization of an electronic patient record].

    PubMed

    Krämer, T; Rapp, R; Krämer, K L

    1999-03-01

    The highly complex requirements on information and information flow in today's hospitals can only be met by the use of modern Information Systems (IS). To achieve this, the Stiftung Orthopädische Universitätsklinik first carried out the project "Strategic Information System Planning" in 1993, then realized the necessary infrastructure (network; client-server) from 1993 to 1997, and finally started the introduction of modern IS (SAP R/3 and IXOS-Archive) in the clinical area. One of the approved goals was the replacement of the paper medical record by an up-to-date electronic medical record. In this article the following three topics are discussed: the difference between the up-to-date electronic medical record and electronically archived finished cases, the steps performed by our clinic to realize the up-to-date electronic medical record, and the problems that occurred during this process.

  6. From planning to realisation of an electronic patient record.

    PubMed

    Krämer, T; Rapp, R; Krämer, K-L

    1999-03-01

    The highly complex requirements on information and information flow in today's hospitals can only be met by the use of modern Information Systems (IS). To achieve this, the Stiftung Orthopädische Universitätsklinik first carried out the project "Strategic Information System Planning" in 1993, then realized the necessary infrastructure (network; client-server) from 1993 to 1997, and finally started the introduction of modern IS (SAP R/3 and IXOS-Archive) in the clinical area. One of the approved goals was the replacement of the paper medical record by an up-to-date electronic medical record. In this article the following three topics are discussed: the difference between the up-to-date electronic medical record and electronically archived finished cases, the steps performed by our clinic to realize the up-to-date electronic medical record, and the problems that occurred during this process.

  7. Department of Defense picture archiving and communication system acceptance testing: results and identification of problem components.

    PubMed

    Allison, Scott A; Sweet, Clifford F; Beall, Douglas P; Lewis, Thomas E; Monroe, Thomas

    2005-09-01

    The PACS implementation process is complicated, requiring a tremendous amount of time, resources, and planning. The Department of Defense (DOD) has significant experience in developing and refining PACS acceptance testing (AT) protocols that assure contract compliance, clinical safety, and functionality. The DOD's AT experience under the initial Medical Diagnostic Imaging Support System contract led to the current Digital Imaging Network-Picture Archiving and Communications Systems (DIN-PACS) contract AT protocol. To identify the most common system and component deficiencies under the current DIN-PACS AT protocol, 14 tri-service sites were evaluated during 1998-2000. Sixteen system deficiency citations with 154 separate types of limitations were noted, with problems involving the workstation, interfaces, and the Radiology Information System comprising more than 50% of the citations. Larger PACS deployments were associated with a higher number of deficiencies. The most commonly cited system deficiencies involved some of the most expensive components of the PACS.

  8. Creating and Searching a Local Inventory for Data Granules in a Remote Archive

    NASA Astrophysics Data System (ADS)

    Cornillon, P. C.

    2016-12-01

    More often than not, search capabilities for network-accessible data do not exist or do not meet the requirements of the user. For large archives this can make finding data of interest tedious at best. This summer, the author encountered such a problem with regard to the two existing archives of VIIRS L2 sea surface temperature (SST) fields obtained with the new ACSPO retrieval algorithm: one at the Jet Propulsion Laboratory's PO-DAAC and the other at NOAA's National Centers for Environmental Information (NCEI). In both cases the data were available via ftp and OPeNDAP, but there was no search capability at the PO-DAAC and the NCEI archive was incomplete. Furthermore, in order to meet the needs of a broad range of datasets and users, the beta version of the search engine at NCEI was cumbersome for the searches of interest. Although some of these problems have since been resolved (and may be described in other posters/presentations at this meeting), the solution described in this presentation offers the user the ability to develop a search capability for archives lacking one and/or to configure searches more to his or her preferences than the generic searches offered by the data provider. The solution, a Matlab script, used html access to the PO-DAAC web site to locate all VIIRS 10-minute granules and OPeNDAP access to acquire the bounding box for each granule from the metadata bound to the file. This task required several hours of wall time to acquire the data and to write the bounding boxes, with the associated ftp and OPeNDAP urls, to a local file for the 110,000+ granule archive. A second Matlab script searched the local archive, in seconds, for granules falling in a user-defined space-time window, and an ascii file of the associated wget commands was generated. This file was then executed to acquire the data of interest. The wget commands can be configured to acquire entire files via ftp or a subset of each file via OPeNDAP. Furthermore, the search capability, based on bounding boxes and rectangular regions, could easily be modified to further refine the search. Finally, the script that builds the inventory has been designed to update the local inventory in minutes per month rather than hours.
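
    The same inventory-and-search idea can be sketched in a few lines of R (the abstract's scripts were written in Matlab; the column names and URLs below are hypothetical):

        # Toy local inventory of granule bounding boxes (hypothetical data).
        inv <- data.frame(
          start_time = c("2016-07-02T10:00:00", "2016-07-02T10:10:00"),
          lat_min = c(20, 55), lat_max = c(35, 70),
          lon_min = c(-85, 10), lon_max = c(-65, 30),
          ftp_url = c("ftp://example.org/g1.nc", "ftp://example.org/g2.nc"),
          stringsAsFactors = FALSE
        )

        # Keep granules whose bounding box overlaps the search rectangle
        # and whose start time falls in the requested window.
        in_window <- function(inv, t0, t1, lat0, lat1, lon0, lon1) {
          t <- as.POSIXct(inv$start_time, format = "%Y-%m-%dT%H:%M:%S", tz = "UTC")
          keep <- t >= t0 & t <= t1 &
            inv$lat_max >= lat0 & inv$lat_min <= lat1 &
            inv$lon_max >= lon0 & inv$lon_min <= lon1
          inv[keep, ]
        }

        hits <- in_window(inv,
                          as.POSIXct("2016-07-01", tz = "UTC"),
                          as.POSIXct("2016-07-08", tz = "UTC"),
                          30, 45, -80, -60)
        paste("wget", hits$ftp_url)   # write these lines to a download script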

  9. NDSC Lidar Intercomparisons and Validation: OPAL and MLO3 Campaigns in 1995

    NASA Technical Reports Server (NTRS)

    McDermid, Stuart; McGee, Thomas J.; Stuart, Daan P. J.

    1996-01-01

    The Network for the Detection of Stratospheric Change (NDSC) has developed and adopted a Validation Policy in order to ensure that the results submitted and stored in its archives are of a known, high quality. As a part of this validation policy, blind instrument intercomparisons are considered an essential element in the certification of NDSC instruments and a specific format for these campaigns has been recommended by the NDSC-Steering Committee.

  10. UAV Swarm Tactics: An Agent-Based Simulation and Markov Process Analysis

    DTIC Science & Technology

    2013-06-01

    Acronyms listed in the report: CRN, Common Random Numbers; CSV, Comma Separated Values; DoE, Design of Experiment; GLM, Generalized Linear Model; HVT, High Value Target; JAR, Java ARchive; JMF, Java Media Framework; JRE, Java Runtime Environment; Mason, Multi-Agent Simulator Of Networks; MOE, Measure Of Effectiveness; MOP, Measures Of Performance. Excerpt from the report: "...with every set several times, and to write a CSV file with the results. Rather than scripting the agent behavior deterministically, the agents should..."

  11. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, E. C. (Editor)

    1993-01-01

    This quarterly publication provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott, Bari

    SoundVision held a post-workshop teleconference for our 2011 graduates (as we have done for all participants) to consolidate what they'd learned during the workshop. To maximize the Science Literacy Project's impact after it ends, we strengthened and reinforced our alumni's vibrant networking infrastructure so they can continue to connect and support each other, and updated our archive system to ensure all of our science and science journalism resources and presentations will be easy to access and use over time.

  13. The Gaia scientific exploitation networks

    NASA Astrophysics Data System (ADS)

    Figueras, F.; Jordi, C.

    2015-05-01

    In July 2014 the Gaia satellite, placed at L2 since January 2014, finished its commissioning phase and started collecting highly accurate scientific data. New and more realistic estimates of the astrometric, photometric and spectroscopic accuracy expected after five years of mission operations (2014-2019) have recently been published on the Gaia Science Performance web page. Here we present the coordination efforts and the activities being conducted through the two GREAT (Gaia Research for European Astronomy Training) European networks: GREAT-ESF, a programme supported by the European Science Foundation (2010-2015), and the GREAT-ITN network, from the European Union's Seventh Framework Programme (2011-2015). The main research theme of these networks is to unravel the origin and history of our home galaxy. Emphasis is placed on the research projects being conducted by Spanish researchers through these networks, coordinated by the Red Española de Explotación Científica de Gaia (REG network, with more than 140 participants). Members of the REG play an important role in the collection of complementary spectroscopic data from ground-based telescopes, in the development of new tools for an optimal scientific exploitation of Gaia data, and in the preparation tasks to create the Gaia archive.

  14. Digital echocardiography 2002: now is the time

    NASA Technical Reports Server (NTRS)

    Thomas, James D.; Greenberg, Neil L.; Garcia, Mario J.

    2002-01-01

    The ability to acquire echocardiographic images digitally, store and transfer these data using the DICOM standard, and routinely analyze examinations exists today and allows the implementation of a digital echocardiography laboratory. The purpose of this review article is to outline the critical components of a digital echocardiography laboratory, discuss general strategies for implementation, and put forth some of the pitfalls that we have encountered in our own implementation. The major components of the digital laboratory include (1) digital echocardiography machines with network output, (2) a switched high-speed network, (3) a high throughput server with abundant local storage, (4) a reliable low-cost archive, (5) software to manage information, and (6) support mechanisms for software and hardware. Implementation strategies can vary from a complete vendor solution providing all components (hardware, software, support), to a strategy similar to our own where standard computer and networking hardware are used with specialized software for management of image and measurement information.

  15. A reusability and efficiency oriented software design method for mobile land inspection

    NASA Astrophysics Data System (ADS)

    Cai, Wenwen; He, Jun; Wang, Qing

    2008-10-01

    To address requirements from the real-time land inspection domain, a land inspection handset system is presented in this paper. To increase the reusability of the system, a design-pattern-based framework was developed. Encapsulation of command-like actions by applying the COMMAND pattern was proposed to manage complex UI interactions. Integrating several GPS-log parsing engines into a general parsing framework was achieved by introducing the STRATEGY pattern. Network middleware based on a network transmission module was constructed; to mitigate the high coupling of complex network communication programs, the FACTORY pattern was applied to facilitate decoupling. Moreover, to efficiently manipulate huge GIS datasets, a VISITOR-pattern and quad-tree based multi-scale representation method was presented. Practical use has shown that these design patterns reduce the coupling between the subsystems and improve expansibility.
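
    The STRATEGY idea of interchangeable GPS-log parsing engines translates naturally into a registry of parser functions behind one generic entry point. The R sketch below is an illustration only: the engine names are invented, and the paper's implementation targeted a handset platform, not R.

        # STRATEGY pattern sketch: pluggable GPS-log parsing engines.
        parsers <- list(
          nmea = function(lines) {
            gga <- grep("^\\$GPGGA", lines, value = TRUE)  # position sentences
            strsplit(gga, ",", fixed = TRUE)
          },
          csv = function(lines) read.csv(text = lines)
        )

        parse_gps_log <- function(lines, format) {
          engine <- parsers[[format]]
          if (is.null(engine)) stop("no parsing engine registered for: ", format)
          engine(lines)   # delegate to the selected strategy
        }

        # New engines are registered without touching parse_gps_log():
        parsers$tsv <- function(lines) read.delim(text = lines)

        parse_gps_log("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,,",
                      "nmea")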

  16. System Requirement Analyses for Ubiquitous Environment Management System

    NASA Astrophysics Data System (ADS)

    Lim, Sang Boem; Gil, Kyung Jun; Choe, Ho Rim; Eo, Yang Dam

    We are living in a new stage of society. The U-City introduces a new paradigm for the future city that cannot be achieved in the traditional city. Korea is one of the most active countries constructing U-Cities based on advances in IT, especially its nationwide high-speed network [1]. People are realizing that ubiquitous services are a key factor in the success of the U-City. Among these U-services, the U-security service is one of the most important. Nowadays we must be concerned not only with traditional threats but also with personal information. The apartment complex is the most common residence type in Korea, so we are developing security rules and systems based on analyses of apartment complexes and their assets. Based on these analyses, we are developing apartment-complex security using various technologies, including home network systems. We also discuss a basic home network security architecture.

  17. BIND: the Biomolecular Interaction Network Database

    PubMed Central

    Bader, Gary D.; Betel, Doron; Hogue, Christopher W. V.

    2003-01-01

    The Biomolecular Interaction Network Database (BIND: http://bind.ca) archives biomolecular interaction, complex and pathway information. A web-based system is available to query, view and submit records. BIND continues to grow with the addition of individual submissions as well as interaction data from the PDB and a number of large-scale interaction and complex mapping experiments using yeast two hybrid, mass spectrometry, genetic interactions and phage display. We have developed a new graphical analysis tool that provides users with a view of the domain composition of proteins in interaction and complex records to help relate functional domains to protein interactions. An interaction network clustering tool has also been developed to help focus on regions of interest. Continued input from users has helped further mature the BIND data specification, which now includes the ability to store detailed information about genetic interactions. The BIND data specification is available as ASN.1 and XML DTD. PMID:12519993

  18. Performance characteristics of the Mayo/IBM PACS

    NASA Astrophysics Data System (ADS)

    Persons, Kenneth R.; Gehring, Dale G.; Pavicic, Mark J.; Ding, Yingjai

    1991-07-01

    The Mayo Clinic and IBM (at Rochester, Minnesota) have jointly developed a picture archiving system for use with Mayo's MRI and Neuro CT imaging modalities. The communications backbone of the PACS is a portion of the Mayo institutional network: a series of 4-Mbps token rings interconnected by bridges and fiber optic extensions. The performance characteristics of this system are important to understand because they affect the response time a PACS user can expect, and the response time for non-PACS users competing for resources on the institutional network. The performance characteristics of each component and the average load levels of the network were measured for various load distributions. These data were used to quantify the response characteristics of the existing system and to tune a model developed by North Dakota State University Department of Computer Science for predicting response times of more complex topologies.

  19. Prototype development and implementation of picture archiving and communications systems based on ISO-OSI standard

    NASA Astrophysics Data System (ADS)

    Martinez, Ralph; Nam, Jiseung

    1992-07-01

    Picture Archiving and Communication Systems (PACS) is an integration of digital image formation in a hospital, encompassing various imaging equipment, image viewing workstations, image databases, and a high-speed network. The integration requires a standardization of communication protocols to connect devices from different vendors. The American College of Radiology and the National Electrical Manufacturers Association (ACR-NEMA) standard Version 2.0 provides a point-to-point hardware interface, a set of software commands, and a consistent set of data formats for PACS. But it is inadequate for PACS networking environments, because of its point-to-point nature and its inflexibility to allow other services and protocols in the future. Based on previous experience of PACS development at The University of Arizona, a new communication protocol for PACS networks and an implementation approach were proposed to ACR-NEMA Working Group VI. The defined PACS protocol is intended to facilitate the development of PACSs capable of interfacing with other hospital information systems. It is also intended to allow the creation of diagnostic information databases which can be interrogated by a variety of distributed devices. A particularly important goal is to support communications in a multivendor environment. The new protocol specifications are defined primarily as a combination of the International Organization for Standardization/Open Systems Interconnection (ISO/OSI) and TCP/IP protocols, plus the data format portion of the ACR-NEMA standard. This paper addresses the specification and implementation of the ISO-based protocol in a PACS prototype. The protocol specification, which covers the Presentation, Session, Transport, and Network layers, is summarized briefly. The protocol implementation is discussed based on our implementation efforts in the UNIX operating system environment. At the same time, results of a performance comparison between the ISO and TCP/IP implementations are presented to demonstrate the implementation of the defined protocol. Performance testing was done by prototyping PACS on the available platforms: MicroVAX II, DECstation and Sun workstations.

  20. Products and Services Available from the Southern California Earthquake Data Center (SCEDC) and the Southern California Seismic Network (SCSN)

    NASA Astrophysics Data System (ADS)

    Yu, E.; Chen, S.; Chowdhury, F.; Bhaskaran, A.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.

    2009-12-01

    The SCEDC archives continuous and triggered data from nearly 3000 data channels from 375 SCSN-recorded stations. The SCSN and SCEDC process and archive an average of 12,000 earthquakes each year, contributing to the southern California earthquake catalog that spans from 1932 to present. The SCEDC provides public, searchable access to these earthquake parametric and waveform data through its website www.data.scec.org and through client applications such as STP, NETDC and DHI.
    New data products:
    ● The SCEDC is distributing synthetic waveform data from the 2008 ShakeOut scenario (Jones et al., USGS Open File Rep., 2008-1150; Graves et al., 2008, Geophys. Res. Lett.), a M 7.8 earthquake on the southern San Andreas fault. Users can download 40 sps velocity waveforms in SAC format from the SCEDC website. The SCEDC is also distributing synthetic GPS data (Crowell et al., 2009, Seismol. Res. Lett.) for this scenario.
    ● The SCEDC has added a new web page showing the latest tomographic model of Southern California, based on Tape et al., 2009, Science.
    New data services:
    ● The SCEDC is exporting data in QuakeML format, an XML format adopted by the Advanced National Seismic System (ANSS). These data will also be available as a web service.
    ● The SCEDC is exporting data in StationXML format, an XML format created by the SCEDC and adopted by ANSS to fully describe station metadata. These data will also be available as a web service.
    ● The stp 1.6 client can now access both the SCEDC and the Northern California Earthquake Data Center (NCEDC) earthquake and waveform archives.
    In progress - SCEDC to distribute 1 sps GPS data in miniSEED format:
    ● As part of a NASA Advanced Information Systems Technology project in collaboration with the Jet Propulsion Laboratory and Scripps Institution of Oceanography, the SCEDC will receive real-time 1 sps streams of GPS displacement solutions from the California Real Time Network (http://sopac.ucsd.edu/projects/realtime; Genrich and Bock, 2006, J. Geophys. Res.). These channels will be archived at the SCEDC as miniSEED waveforms, which can then be distributed to the user community via applications such as STP.

  1. Valorisation of Como Historical Cadastral Maps Through Modern Web Geoservices

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Minghini, M.; Zamboni, G.

    2012-07-01

    Cartographic cultural heritage preserved in worldwide archives is often stored in the original paper version only, thus restricting both the chances of utilization and the range of possible users. The Web C.A.R.T.E. system addressed this issue with regard to the precious cadastral maps preserved at the State Archive of Como. The aim of the project was to improve the visibility and accessibility of this heritage using the latest free and open source tools for processing, cataloguing and web publishing the maps. The resulting architecture should therefore assist the State Archive of Como in managing its cartographic contents. After a pre-processing phase consisting of digitization and georeferencing steps, the maps were provided with metadata, compiled according to the current Italian standards and managed through an ad hoc version of the GeoNetwork Opensource geocatalog software. A dedicated MapFish-based webGIS client, with a version also optimized for mobile platforms, was built for map publication and 2D navigation. A module for 3D visualization of the cadastral maps was finally developed using the NASA World Wind virtual globe. Thanks to a temporal slidebar, time was also included in the system, producing a 4D graphical user interface. The overall architecture was built entirely with free and open source software and allows a direct and intuitive consultation of the historical maps. Besides the notable advantage of keeping the original paper maps intact, the system greatly simplifies the work of the State Archive of Como's common users and at the same time widens the range of users, thanks to the modernization of the map consultation tools.

  2. Annual and solar cycle dependencies of SuperDARN scatter occurrence and ionospheric convection measurements

    NASA Astrophysics Data System (ADS)

    Lester, M.; Imber, S. M.; Milan, S. E.

    2012-12-01

    The Super Dual Auroral Radar Network (SuperDARN) provides a long-term data series which enables investigations of the coupled magnetosphere-ionosphere system. The network has been in existence essentially since 1995, when 6 radars were operational in the northern hemisphere and 4 in the southern hemisphere. We have been involved in an analysis of the data over the lifetime of the project and present results here from two key studies. In the first study we calculated the amount of ionospheric scatter observed by the radars and see clear annual and solar cycle variations in both hemispheres. The recent extended solar minimum also produces a significant effect in the scatter occurrence. In the second study, we determined the latitude of the Heppner-Maynard Boundary (HMB), which represents the equatorward extent of ionospheric convection, using the northern hemisphere SuperDARN radars for the interval 1996 - 2011. We find that the average latitude of the HMB at midnight is 61° magnetic latitude during the solar maximum of 2003, but it moves significantly poleward during solar minimum, averaging 64° latitude during 1996 and 68° during 2010. This poleward motion is observed despite the increasing number of low-latitude radars built in recent years as part of the StormDARN network, and so is not an artefact of data coverage. We believe that the recent extreme solar minimum led to an average HMB location that was further poleward than in the previous solar cycle. We have also calculated the Open-Closed field line Boundary (OCB) from auroral images during a subset of the interval (2000 - 2002) and find that on average the HMB is located equatorward of the OCB by ~7°. We suggest that the HMB may be a useful proxy for the OCB when global images are not available. The work presented in this paper has been undertaken as part of the European Cluster Assimilation Technology (ECLAT) project, which is funded through the EU FP7 programme and involves groups at Leicester, Helsinki, Uppsala, FMI, Graz and St. Petersburg. The aim of the project is to provide additional data sets, primarily ground-based data, to the Cluster Active Archive, and its successor the Cluster Final Archive, in order to enhance the scientific productivity of the archives.

  3. Ensuring the Quality of Data Packages in the LTER Network Provenance Aware Synthesis Tracking Architecture Data Management System and Archive

    NASA Astrophysics Data System (ADS)

    Servilla, M. S.; O'Brien, M.; Costa, D.

    2013-12-01

    Considerable ecological research performed today occurs through the analysis of data downloaded from various repositories and archives, often resulting in derived or synthetic products generated by automated workflows. These data are only meaningful for research if they are well documented by metadata, lest semantic or data-type errors occur in interpretation or processing. The Long Term Ecological Research (LTER) Network now screens all data packages entering its long-term archive to ensure that each package contains metadata that is complete, of high quality, and accurately describes the structure of its associated data entity, and that the data are structurally congruent with the metadata. Screening occurs prior to the upload of a data package into the Provenance Aware Synthesis Tracking Architecture (PASTA) data management system through a series of quality checks, thus preventing ambiguously or incorrectly documented data packages from entering the system. The quality checks within PASTA are designed to work specifically with the Ecological Metadata Language (EML), the metadata standard adopted by the LTER Network to describe data generated by their 26 research sites. Each quality check is codified in Java as part of the ecological community-supported Data Manager Library, which is a resource of the EML specification and used as a component of the PASTA software stack. Quality checks test for metadata quality, data integrity, or metadata-data congruence. Quality checks are further classified as either conditional or informational. Conditional checks issue a 'valid', 'warning' or 'error' response. Only an 'error' response blocks the data package from upload into PASTA. Informational checks only provide descriptive content pertaining to a particular facet of the data package. Quality checks are designed by a group of LTER information managers and reviewed by the LTER community before deploying into PASTA. A total of 32 quality checks have been deployed to date. Quality checks can be customized through a configurable template, which includes turning checks 'on' or 'off' and setting the severity of conditional checks. This feature is important to other potential users of the Data Manager Library who wish to configure its quality checks in accordance with the standards of their community. Executing the complete set of quality checks produces a report that describes the result of each check. The report is an XML document that is stored by PASTA for future reference.
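
    As a schematic illustration of the conditional checks described above (the check logic and threshold here are invented; PASTA's actual checks live in the Java Data Manager Library), a metadata-data congruence check in R might look like:

        # A conditional check returns "valid", "warning" or "error";
        # only "error" blocks the upload. Logic invented for illustration.
        check_congruence <- function(pkg) {
          if (pkg$data_cols == pkg$meta_cols) "valid"
          else if (pkg$data_cols > pkg$meta_cols) "warning"  # undocumented extras
          else "error"                                       # documented columns missing
        }

        run_checks <- function(pkg, checks) {
          results <- vapply(checks, function(chk) chk(pkg), character(1))
          list(results = results, blocked = any(results == "error"))
        }

        pkg <- list(meta_cols = 12, data_cols = 11)
        run_checks(pkg, list(congruence = check_congruence))
        # $blocked is TRUE: this package would be refused by the archive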

  4. Sedimentology, stratigraphy and chronology of a decantation tank in the sewer network of Orléans (France).

    NASA Astrophysics Data System (ADS)

    Jacob, Jérémy; Thibault, Alexandre; Simonneau, Anaëlle; Le Milbeau, Claude; DiGiovanni, Christian; Sabatier, Pierre; Reyss, Jean-Louis; Ardito, Luigi; Morio, Cédric

    2017-04-01

    Current debates on the status of the Anthropocene are drawing geologists and palaeoenvironmentalists toward new spatial and temporal targets. Among the most emblematic socio-ecosystems of the Anthropocene are urban areas, in which the dynamics of materials are mainly controlled by human activities. This brings unprecedented elemental, molecular and isotopic concentrations and distributions, which led Norra (2009) to propose a new geological sphere: the Astysphere. Here we propose that sediments accumulated in sewer networks can provide original, integrated, multi-thematic archives of the recent history of cities, by considering urban systems as any catchment where materials are produced, transported and sedimented. The study site is a decantation tank that collects stormwater and wastewater from the north of Orléans city, upstream of the wastewater plants of Orléans. Sediments accumulated since 1942 over 17 m depth and were never cleaned out until 2015. Two sedimentary cores, 70 cm (A) and 250 cm long (B), were collected before the clean-out, and a third, 150 cm long (C), after. The sediments are organized into layers of sands and gravels alternating with silts and organic layers. Sharp contacts between these layers indicate event-driven sedimentation, as expected in sewer networks. We formulate the hypothesis that the organic/mineral alternations result from a seasonal dynamic. The presence of 7Be in the topmost sample from core A confirms it was deposited within the last 6 months. In core C, only the upper half of the core, mostly mineral, displays significant 7Be levels, whereas 7Be is absent from the lower half, which is mostly organic. This supports our hypothesis of a seasonal alternation, with organic facies deposited during spring-summer and mineral facies deposited during fall-winter. Thirty 14C dates measured on cores A and B by the post-bomb method are logically distributed with depth, the most ancient (beginning of the eighties) being recorded at 2.5 m depth. This study shows that sediments accumulated in a decantation tank constitute sedimentary archives comparable to more natural ones, thus allowing palaeoenvironmental reconstructions for the Anthropocene. We are currently examining the mineral and organic content of this archive to provide a detailed chronology of the history of man-made materials (drugs, plastics, pesticides…) in urban contexts.

  5. Rich Support for Heterogeneous Polar Data in RAMADDA

    NASA Astrophysics Data System (ADS)

    McWhirter, J.; Crosby, C. J.; Griffith, P. C.; Khalsa, S.; Lazzara, M. A.; Weber, W. J.

    2013-12-01

    Difficult-to-navigate environments, tenuous logistics, strange forms, deeply rooted cultures - these are all experiences shared by Polar scientists in the field as well as by the developers of the underlying data management systems back in the office. Among the key data management challenges that Polar investigations present are the heterogeneity and complexity of the data that are generated. Polar regions are intensely studied across many science domains through a variety of techniques - satellite and aircraft remote sensing, in-situ observation networks, modeling, sociological investigations, and extensive PI-driven field project data collection. While many data management efforts focus on large homogeneous collections of data targeting specific science domains (e.g., satellite, GPS, modeling), multi-disciplinary efforts that focus on Polar data need to address a wide range of data formats, science domains and user communities. There is growing use of the RAMADDA (Repository for Archiving, Managing and Accessing Diverse Data) system to manage and provide services for Polar data. RAMADDA is a freely available, extensible data repository framework that supports a wide range of data types and services to allow the creation, management, discovery and use of data and metadata. The broad range of capabilities provided by RAMADDA and its extensibility make it well suited as an archive solution for Polar data. RAMADDA can run in a number of diverse contexts - as a centralized archive, at local institutions, and even on an investigator's laptop in the field, providing in-situ metadata and data management services. We are actively developing archives and support for a number of Polar initiatives:
    - NASA-Arctic Boreal Vulnerability Experiment (ABoVE): ABoVE is a long-term multi-instrument field campaign that will make use of a wide range of data. We have developed an extensive ontology of program, project and site metadata in RAMADDA, in support of the ABoVE Science Definition Team and Project Office. See: http://above.nasa.gov
    - UNAVCO Terrestrial Laser Scanning (TLS): UNAVCO's Polar program provides support for terrestrial laser scanning field projects. We are using RAMADDA to archive these field projects, with over 40 projects ingested to date.
    - NASA-IceBridge: As part of the NASA LiDAR Access System (NLAS) project, RAMADDA supports numerous airborne and satellite LiDAR data sets - GLAS, LVIS, ATM, Paris, McORDS, etc.
    - Antarctic Meteorological Research Center (AMRC): satellite and surface observation network.
    - Support for numerous other data from AON-ACADIS, Greenland GC-Net, NOAA-GMD, AmeriFlux, etc.
    In this talk we will discuss some of the challenges that Polar data brings to geoinformatics and describe the approaches we have taken to address these challenges in RAMADDA.

  6. On detecting variables using ROTSE-IIId archival data

    NASA Astrophysics Data System (ADS)

    Yesilyaprak, C.; Yerli, S. K.; Aksaker, N.; Gucsav, B. B.; Kiziloglu, U.; Dikicioglu, E.; Coker, D.; Aydin, E.; Ozeren, F. F.

    ROTSE (Robotic Optical Transient Search Experiment) telescopes can also be used for variable star detection. As explained in the system description (Akerlof et al. 2003, PASP, 115, 132), they have good sky coverage and allow fast data acquisition. The optical magnitude range varies between 7^m and 19^m. Thirty percent of the telescope time of the north-eastern leg of the network, namely ROTSE-IIId (located at the TUBITAK National Observatory, Bakirlitepe, Turkey, http://www.tug.tubitak.gov.tr/), is owned by Turkish researchers. Since its first light (May 2004) a considerably large amount of data (around 2 TB) has been collected from the Turkish time, and roughly one million objects have been identified from the reduced data. A robust pipeline has been constructed to discover new variables, transients and planetary nebulae from this archival data. In the detection process, different statistical methods were applied to the archive. We have detected thousands of variable stars by applying roughly four different tests to the light curve of each star. In this work a summary of the pipeline is presented. It uses a high performance computing (HPC) algorithm which performs inhomogeneous ensemble photometry of the data on a 36-core cluster. This study is supported by TUBITAK (Scientific and Technological Research Council of Turkey) under grant number TBAG-108T475.
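
    The abstract does not detail the four tests; as one representative example of such a test, a reduced chi-squared statistic against the constant-magnitude hypothesis flags candidate variables in a light curve. The R sketch below uses simulated data and is illustrative only:

        # Reduced chi-squared of a light curve against a constant model.
        reduced_chisq <- function(mag, mag_err) {
          mbar <- weighted.mean(mag, w = 1 / mag_err^2)
          sum(((mag - mbar) / mag_err)^2) / (length(mag) - 1)
        }

        set.seed(1)
        quiet    <- rnorm(200, mean = 14.0, sd = 0.05)   # noise only
        variable <- quiet + 0.3 * sin(seq(0, 20, length.out = 200))
        reduced_chisq(quiet,    rep(0.05, 200))  # ~1: consistent with constant
        reduced_chisq(variable, rep(0.05, 200))  # >> 1: candidate variable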

  7. A semiparametric Bayesian proportional hazards model for interval censored data with frailty effects.

    PubMed

    Henschel, Volkmar; Engel, Jutta; Hölzel, Dieter; Mansmann, Ulrich

    2009-02-10

    Multivariate analysis of interval-censored event data based on classical likelihood methods is notoriously cumbersome, and likelihood inference for models which additionally include random effects is not available at all. Existing algorithms pose problems for practical users, such as matrix inversion, slow convergence, and no assessment of statistical uncertainty. MCMC procedures combined with imputation are used to implement hierarchical models for interval-censored data within a Bayesian framework. Two examples from clinical practice demonstrate the handling of clustered interval-censored event times as well as multilayer random effects for inter-institutional quality assessment. The software developed is called survBayes and is freely available at CRAN. The proposed software supports the solution of complex analyses in many fields of clinical epidemiology as well as health services research.
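
    The abstract does not show survBayes's calling convention, so the sketch below only illustrates how clustered, interval-censored event times are commonly represented in R, using the standard survival package that such a model would consume:

        # Interval-censored event times with a clustering factor.
        library(survival)

        d <- data.frame(
          left   = c(2, 5, 1, 7),   # last time the subject was event-free
          right  = c(4, 9, 3, NA),  # first time seen with the event; NA = right-censored
          centre = factor(c("A", "A", "B", "B"))  # e.g. institution, for frailty
        )

        # type = "interval2" encodes the (left, right] intervals directly.
        y <- with(d, Surv(left, right, type = "interval2"))
        y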

  8. heatmaply: an R package for creating interactive cluster heatmaps for online publishing.

    PubMed

    Galili, Tal; O'Callaghan, Alan; Sidi, Jonathan; Sievert, Carson

    2018-05-01

    heatmaply is an R package for easily creating interactive cluster heatmaps that can be shared online as a stand-alone HTML file. Interactivity includes a tooltip display of values when hovering over cells, as well as the ability to zoom in to specific sections of the figure from the data matrix, the side dendrograms, or annotated labels. Thanks to the synergistic relationship between heatmaply and other R packages, the user is empowered by a refined control over the statistical and visual aspects of the heatmap layout. The heatmaply package is available under the GPL-2 Open Source license. It comes with a detailed vignette, and is freely available from: http://cran.r-project.org/package=heatmaply. tal.galili@math.tau.ac.il. Supplementary data are available at Bioinformatics online.
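
    A minimal usage sketch (the data set and file name are arbitrary; saving via htmlwidgets is one standard way to produce the stand-alone HTML file the abstract mentions):

        # Interactive cluster heatmap of a scaled numeric matrix,
        # exported as a self-contained HTML file.
        library(heatmaply)

        hm <- heatmaply(scale(mtcars))
        htmlwidgets::saveWidget(hm, "mtcars_heatmap.html", selfcontained = TRUE)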

  9. Integrated data management for clinical studies: automatic transformation of data models with semantic annotations for principal investigators, data managers and statisticians.

    PubMed

    Dugas, Martin; Dugas-Breit, Susanne

    2014-01-01

    Design, execution and analysis of clinical studies involves several stakeholders with different professional backgrounds. Typically, principal investigators are familiar with standard office tools, data managers apply electronic data capture (EDC) systems and statisticians work with statistics software. Case report forms (CRFs) specify the data model of study subjects, evolve over time and consist of hundreds to thousands of data items per study. To avoid erroneous manual transformation work, a conversion tool for different representations of study data models was designed. It can convert between office format, EDC and statistics format. In addition, it supports semantic annotations, which enable precise definitions for data items. A reference implementation is available as the open source package ODMconverter at http://cran.r-project.org.
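
    ODMconverter's own API is not given in the abstract; as a generic illustration of the underlying representation, CDISC ODM study metadata is plain XML, so the data items of a CRF can be listed with the xml2 package (the tiny ODM fragment below is invented):

        # List the ItemDefs (data items) of an ODM study-metadata fragment.
        library(xml2)

        odm_xml <- '<ODM xmlns="http://www.cdisc.org/ns/odm/v1.3">
          <Study OID="S1"><MetaDataVersion OID="MDV1" Name="v1">
            <ItemDef OID="I.AGE" Name="Age" DataType="integer"/>
            <ItemDef OID="I.SEX" Name="Sex" DataType="text"/>
          </MetaDataVersion></Study>
        </ODM>'

        odm   <- read_xml(odm_xml)
        items <- xml_find_all(odm, ".//d1:ItemDef", xml_ns(odm))  # d1 = default namespace
        data.frame(OID      = xml_attr(items, "OID"),
                   Name     = xml_attr(items, "Name"),
                   DataType = xml_attr(items, "DataType"))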

  10. A paleoclimate rainfall reconstruction in the Murray-Darling Basin (MDB), Australia: 1. Evaluation of different paleoclimate archives, rainfall networks, and reconstruction techniques

    NASA Astrophysics Data System (ADS)

    Ho, Michelle; Kiem, Anthony S.; Verdon-Kidd, Danielle C.

    2015-10-01

    From ˜1997 to 2009 the Murray-Darling Basin (MDB), Australia's largest water catchment and reputed "food bowl," experienced a severe drought termed the "Millennium Drought" or "Big Dry" followed by devastating floods in the austral summers of 2010/2011, 2011/2012, and 2012/2013. The magnitude and severity of these extreme events highlight the limitations associated with assessing hydroclimatic risk based on relatively short instrumental records (˜100 years). An option for extending hydroclimatic records is through the use of paleoclimate records. However, there are few in situ proxies of rainfall or streamflow suitable for assessing hydroclimatic risk in Australia and none are available in the MDB. In this paper, available paleoclimate records are reviewed and those of suitable quality for hydroclimatic risk assessments are used to develop preinstrumental information for the MDB. Three different paleoclimate reconstruction techniques are assessed using two instrumental rainfall networks: (1) corresponding to rainfall at locations where rainfall-sensitive Australian paleoclimate archives currently exist and (2) corresponding to rainfall at locations identified as being optimal for explaining MDB rainfall variability. It is shown that the optimized rainfall network results in a more accurate model of MDB rainfall compared to reconstructions based on rainfall at locations where paleoclimate rainfall proxies currently exist. This highlights the importance of first identifying key locations where existing and as yet unrealized paleoclimate records will be most useful in characterizing variability. These results give crucial insight as to where future investment and research into developing paleoclimate proxies for Australia could be most beneficial, with respect to better understanding instrumental, preinstrumental and potential future variability in the MDB.

  11. Seismic Data Archive Quality Assurance -- Analytics Adding Value at Scale

    NASA Astrophysics Data System (ADS)

    Casey, R. E.; Ahern, T. K.; Sharer, G.; Templeton, M. E.; Weertman, B.; Keyson, L.

    2015-12-01

    Since the emergence of real-time delivery of seismic data over the last two decades, solutions for near-real-time quality analysis and station monitoring have been developed by data producers and data stewards. This has allowed a nearly constant awareness of the quality of the incoming data and the general health of the instrumentation around the time of data capture. Modern quality assurance systems are evolving to provide ready access to a large variety of metrics, a rich and self-correcting history of measurements, and, more importantly, the ability to access these quality measurements en masse through a programmatic interface. The MUSTANG project at the IRIS Data Management Center is working to achieve 'total archival data quality', where a large number of standardized metrics, some computationally expensive, are generated and stored for all data from decades past to the near present. To perform this on a 300 TB archive of compressed time series requires considerable resources in network I/O, disk storage, and CPU capacity to achieve scalability, not to mention the technical expertise to develop and maintain it. In addition, staff scientists are necessary to develop the system metrics and employ them to produce comprehensive and timely data quality reports that assist seismic network operators in maintaining their instrumentation. All of these metrics must be available to the scientist 24/7. We will present an overview of the MUSTANG architecture, including the development of its standardized metrics code in R. We will show examples of the metric values that we make publicly available to scientists and educators and show how we are sharing the algorithms used. We will also discuss the development of a capability that will enable scientific researchers to specify data quality constraints on their requests for data, providing only the data best suited to their area of study.
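
    MUSTANG metrics are served over HTTP; the R sketch below assumes the public measurements endpoint and common query parameters (metric, network/station/channel codes, a time window, and a text output format). All of these are assumptions to check against the IRIS web-service documentation before use:

        # Hedged sketch: pull one MUSTANG metric over HTTP. The endpoint,
        # parameter names and output format are assumptions to verify
        # against the IRIS web-service documentation.
        url <- paste0("http://service.iris.edu/mustang/measurements/1/query",
                      "?metric=percent_availability",
                      "&net=IU&sta=ANMO&cha=BHZ",
                      "&timewindow=2015-01-01T00:00:00,2015-02-01T00:00:00",
                      "&format=text")
        meas <- read.csv(url)   # adjust the parser to the service's delimiter
        head(meas)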

  12. Constraint-based scheduling for the Goddard Space Flight Center Distributed Active Archive Center's data archive and distribution system

    NASA Technical Reports Server (NTRS)

    Short, Nick, Jr.; Bedet, Jean-Jacques; Bodden, Lee; Boddy, Mark; White, Jim; Beane, John

    1994-01-01

    The Goddard Space Flight Center (GSFC) Distributed Active Archive Center (DAAC) has been operational since October 1, 1993. Its mission is to support the Earth Observing System (EOS) by providing rapid access to EOS data and analysis products, and to test Earth Observing System Data and Information System (EOSDIS) design concepts. One of the challenges is to ensure quick and easy retrieval of any data archived within the DAAC's Data Archive and Distribution System (DADS). Over the 15-year life of the EOS project, an estimated several petabytes (10^15 bytes) of data will be permanently stored. Accessing that amount of information is a formidable task that will require innovative approaches. As a precursor of the full EOS system, the GSFC DAAC, with a few terabytes of storage, has implemented a prototype of a constraint-based task and resource scheduler to improve the performance of the DADS. This Honeywell Task and Resource Scheduler (HTRS), developed by the Honeywell Technology Center in cooperation with the Information Science and Technology Branch/935, the Code X Operations Technology Program, and the GSFC DAAC, makes better use of limited resources, prevents backlogs of data, and provides information about resource bottlenecks and performance characteristics. The prototype, which was developed concurrently with the GSFC Version 0 (V0) DADS, models DADS activities such as ingestion and distribution with priority, precedence, resource requirements (disk and network bandwidth), and temporal constraints. HTRS supports schedule updates, insertions, and retrieval of task information via an Application Program Interface (API). The prototype has demonstrated, with a few examples, the substantial advantages of using HTRS over scheduling algorithms such as a First In First Out (FIFO) queue. The kernel scheduling engine for HTRS, called Kronos, has been successfully applied to several other domains such as space shuttle mission scheduling, demand flow manufacturing, and avionics communications scheduling.
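
    As a toy illustration of why priority-aware dispatch can beat FIFO on a contended resource (this is not the HTRS algorithm; the tasks, durations, and priorities below are invented), consider five ingest/distribution jobs competing for one drive:

      # Invented example: five jobs, one drive; compare FIFO vs. priority order
      tasks <- data.frame(
        id       = 1:5,
        arrival  = c(0, 1, 2, 3, 4),    # minutes
        duration = c(30, 5, 5, 5, 5),   # minutes of drive time
        priority = c(3, 1, 1, 2, 1)     # 1 = most urgent
      )

      completion_times <- function(ord) {
        t <- 0
        done <- numeric(nrow(tasks))
        for (k in ord) {                # serve jobs in the given order
          t <- max(t, tasks$arrival[k]) + tasks$duration[k]
          done[k] <- t
        }
        done
      }

      mean(completion_times(order(tasks$arrival)))                  # FIFO: 40
      mean(completion_times(order(tasks$priority, tasks$arrival)))  # priority: 21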

  13. Development of a system for transferring images via a network: supporting a regional liaison.

    PubMed

    Mihara, Naoki; Manabe, Shiro; Takeda, Toshihiro; Kitamura, Shinichirou; Murakami, Junichi; Kiso, Kouji; Matsumura, Yasushi

    2013-01-01

    We developed a system that transfers images via a network and began using it in our hospital's PACS (Picture Archiving and Communication System) in 2006. The system has since been redeveloped and is now running in support of a future regional liaison. It is now possible to transfer images automatically simply by selecting a destination hospital registered in advance at the relay server. The gateway of this system can send images to a multi-center relay management server, which receives the images and resends them. This system has the potential to be useful for image exchange and to serve a regional medical liaison.

  14. Enabling Earth Science: The Facilities and People of the NCCS

    NASA Technical Reports Server (NTRS)

    2002-01-01

    The NCCS's mass data storage system allows scientists to store and manage the vast amounts of data generated by these computations, and its high-speed network connections allow the data to be accessed quickly from the NCCS archives. Some NCCS users perform studies whose scope is directly tied to their ability to run computationally expensive and data-intensive simulations. Because the number and type of questions scientists can research are often limited by computing power, the NCCS continually pursues the latest technologies in computing, mass storage, and networking. Just as important as the processors, tapes, and routers of the NCCS are the personnel who administer this hardware, create and manage accounts, maintain security, and assist the scientists, often working one on one with them.

  15. The new space and earth science information systems at NASA's archive

    NASA Technical Reports Server (NTRS)

    Green, James L.

    1990-01-01

    The on-line interactive systems of the National Space Science Data Center (NSSDC) are examined. The worldwide computer network connections that allow access to NSSDC users are outlined. The services offered by the NSSDC new technology on-line systems are presented, including the IUE request system, ozone TOMS data, and data sets on astrophysics, atmospheric science, land sciences, and space plasma physics. Plans for future increases in the NSSDC data holdings are considered.

  16. The new space and Earth science information systems at NASA's archive

    NASA Technical Reports Server (NTRS)

    Green, James L.

    1990-01-01

    The on-line interactive systems of the National Space Science Data Center (NSSDC) are examined. The worldwide computer network connections that allow access to NSSDC users are outlined. The services offered by the NSSDC new technology on-line systems are presented, including the IUE request system, Total Ozone Mapping Spectrometer (TOMS) data, and data sets on astrophysics, atmospheric science, land sciences, and space plasma physics. Plans for future increases in the NSSDC data holdings are considered.

  17. Commission 5: Documentation and Astronomical Data

    NASA Astrophysics Data System (ADS)

    Norris, Raymond P.; Ohishi, Masatoshi; Genova, Françoise; Grothkopf, Uta; Malkov, Oleg Yu.; Pence, William D.; Schmitz, Marion; Hanisch, Robert J.; Zhou, Xu

    IAU Commission 5 deals with data management issues. Its working groups and task groups deal specifically with information handling; with data centres and networks; with technical aspects of the collection, archiving, storage, and dissemination of data; with designations and classification of astronomical objects; with library services, editorial policies, computer communications, and ad hoc methodologies; and with various standards, reference frames, etc. FITS, astronomy's Flexible Image Transport System and the major data exchange format, is controlled, maintained, and updated by the Working Group FITS.
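
    On the practical side, FITS files of the kind the Working Group governs are readable from most languages; a minimal R sketch, assuming the CRAN package FITSio and a hypothetical local file image.fits:

      # Read the primary HDU of a FITS file and inspect it
      library(FITSio)
      fits <- readFITS("image.fits")    # "image.fits" is a placeholder path
      dim(fits$imDat)                   # the image data array
      head(fits$hdr)                    # parsed header keyword/value strings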

  18. Convergence: Illicit Networks and National Security in the Age of Globalization

    DTIC Science & Technology

    2013-01-01

    Fragmentary OCR excerpt; the recoverable text includes a citation to "The FARC Files: Venezuela, Ecuador and the Secret Archives of 'Raúl Reyes'" following a reference to four decades of combat, the section headings "Why Guatemala?" and "Hubs versus Havens," a quotation on ungoverned spaces ("There are so many forgotten places, out of government control, too scary for investors and tourists," attributed to one Antonio Maria ...), and the standard disclaimer that the contents do not represent the views of the Defense Department or any other agency of the federal government (cleared for public release; distribution unlimited).

  19. Problems and Mitigation Strategies for Developing and Validating Statistical Cyber Defenses

    DTIC Science & Technology

    2014-04-01

    Fragmentary OCR excerpt; the recoverable content concerns training datasets for statistical cyber defenses built on clustering and Support Vector Machine (SVM) classification, drawing on: Netflow data for TCP connections; e-mail data from SMTP logs; chat data from XMPP logs; and microtext data from Twitter message archives; plus summary data from Bro and Netflow captured on the BBN network over a period of one month, simulated attacks, and WHOIS domain name records.
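
    A minimal sketch of the SVM-classification step such a pipeline implies, assuming the CRAN package e1071 and entirely simulated per-flow features (the feature names are invented for the example):

      library(e1071)

      set.seed(1)
      n <- 200
      flows <- data.frame(
        bytes_per_pkt = c(rnorm(n, 500, 80), rnorm(n, 1200, 150)),
        duration_s    = c(rexp(n, 1 / 2),    rexp(n, 1 / 20)),
        label         = factor(rep(c("benign", "attack"), each = n))
      )

      # Train a radial-kernel SVM and inspect its confusion matrix
      fit <- svm(label ~ bytes_per_pkt + duration_s, data = flows, kernel = "radial")
      table(predicted = predict(fit, flows), actual = flows$label)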

  20. Anomalous Anticipatory Responses in Networked Random Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Roger D.; Bancel, Peter A.

    2006-10-16

    We examine an 8-year archive of synchronized, parallel time series of random data from a world-spanning network of physical random event generators (REGs). The archive is a publicly accessible matrix of normally distributed 200-bit sums recorded at 1 Hz which extends from August 1998 to the present. The primary question is whether these data show non-random structure associated with major events such as natural or man-made disasters, terrible accidents, or grand celebrations. Secondarily, we examine the time course of apparently correlated responses. Statistical analyses of the data reveal consistent evidence that events which strongly affect people engender small but significant effects. These include suggestions of anticipatory responses in some cases, leading to a series of specialized analyses to assess possible non-random structure preceding precisely timed events. A focused examination of data collected around the time of earthquakes with Richter magnitude 6 and greater reveals non-random structure with a number of intriguing, potentially important features. Anomalous effects in the REG data are seen only when the corresponding earthquakes occur in populated areas. No structure is found if they occur in the oceans. We infer that an important contributor to the effect is the relevance of the earthquake to humans. Epoch averaging reveals evidence for changes in the data some hours prior to the main temblor, suggestive of reverse causation.
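
    The epoch-averaging step is simple to sketch; below, a 1 Hz series and the event times are both simulated stand-ins, and the data are averaged in a +/-1 hour window around each event:

      set.seed(42)
      z <- rnorm(100000)                  # stand-in for the 1 Hz composite
      events <- sample(5000:95000, 20)    # hypothetical event indices (s)

      epoch_average <- function(z, events, before = 3600, after = 3600) {
        win <- -before:after
        rowMeans(sapply(events, function(t0) z[t0 + win]))  # mean at each lag
      }

      avg <- epoch_average(z, events)
      plot(-3600:3600, avg, type = "l",
           xlab = "seconds from event", ylab = "mean z-score")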

  1. Network oriented radiological and medical archive

    NASA Astrophysics Data System (ADS)

    Ferraris, M.; Frixione, P.; Squarcia, S.

    2001-10-01

    In this paper the basic ideas of NORMA (Network Oriented Radiological and Medical Archive) are discussed. NORMA is an original project built by a team of physicists in collaboration with radiologists in order to select the best treatment planning in radiotherapy. It allows physicians and health physicists working in different places to discuss interesting clinical cases, visualizing the same diagnostic images at the same time and highlighting zones of interest (tumors and organs at risk). NORMA has a client/server architecture in order to be platform independent. Applying World Wide Web technologies, it can easily be used by people with no specific computer knowledge, providing verbose help to guide the user through the right steps of execution. The client side is an applet, while the server side is a Java application. In order to optimize execution, the project also includes a proprietary protocol, layered over the TCP/IP suite, that organizes data exchanges and control messages. Diagnostic images are retrieved from a relational database or from a standard DICOM (Digital Imaging and COmmunications in Medicine) PACS through the DICOM-WWW gateway, which allows the Web browsers used by the NORMA system to connect to DICOM applications via the HTTP protocol. Browser requests are sent to the gateway from the Web server through CGI (Common Gateway Interface). DICOM software translates the requests into DICOM messages and organizes the communication with the remote DICOM application.

  2. Volume serving and media management in a networked, distributed client/server environment

    NASA Technical Reports Server (NTRS)

    Herring, Ralph H.; Tefend, Linda L.

    1993-01-01

    The E-Systems Modular Automated Storage System (EMASS) is a family of hierarchical mass storage systems providing complete storage/'file space' management. The EMASS volume server provides the flexibility to work with different clients (file servers), different platforms, and different archives with a 'mix and match' capability. The EMASS design considers all file management programs as clients of the volume server system. System storage capacities are tailored to customer needs ranging from small data centers to large central libraries serving multiple users simultaneously. All EMASS hardware is commercial off the shelf (COTS), selected to provide the performance and reliability needed in current and future mass storage solutions. All interfaces use standard commercial protocols and networks suitable to service multiple hosts. EMASS is designed to efficiently store and retrieve in excess of 10,000 terabytes of data. Current clients include CRAY's YMP Model E based Data Migration Facility (DMF), IBM's RS/6000 based Unitree, and CONVEX based EMASS File Server software. The VolSer software provides the capability to accept client or graphical user interface (GUI) commands from the operator's console and translate them to the commands needed to control any configured archive. The VolSer system offers advanced features to enhance media handling and particularly media mounting such as: automated media migration, preferred media placement, drive load leveling, registered MediaClass groupings, and drive pooling.

  3. An open, interoperable, and scalable prehospital information technology network architecture.

    PubMed

    Landman, Adam B; Rokos, Ivan C; Burns, Kevin; Van Gelder, Carin M; Fisher, Roger M; Dunford, James V; Cone, David C; Bogucki, Sandy

    2011-01-01

    Some of the most intractable challenges in prehospital medicine include response time optimization, inefficiencies at the emergency medical services (EMS)-emergency department (ED) interface, and the ability to correlate field interventions with patient outcomes. Information technology (IT) can address these and other concerns by ensuring that system and patient information is received when and where it is needed, is fully integrated with prior and subsequent patient information, and is securely archived. Some EMS agencies have begun adopting information technologies, such as wireless transmission of 12-lead electrocardiograms, but few agencies have developed a comprehensive plan for management of their prehospital information and integration with other electronic medical records. This perspective article highlights the challenges and limitations of integrating IT elements without a strategic plan, and proposes an open, interoperable, and scalable prehospital information technology (PHIT) architecture. The two core components of this PHIT architecture are 1) routers with broadband network connectivity to share data between ambulance devices and EMS system information services and 2) an electronic patient care report to organize and archive all electronic prehospital data. To successfully implement this comprehensive PHIT architecture, data and technology requirements must be based on best available evidence, and the system must adhere to health data standards as well as privacy and security regulations. Recent federal legislation prioritizing health information technology may position federal agencies to help design and fund PHIT architectures.

  4. Microbe-ID: an open source toolbox for microbial genotyping and species identification

    PubMed Central

    Tabima, Javier F.; Everhart, Sydney E.; Larsen, Meredith M.; Weisberg, Alexandra J.; Kamvar, Zhian N.; Tancos, Matthew A.; Smart, Christine D.; Chang, Jeff H.

    2016-01-01

    Development of tools to identify species, genotypes, or novel strains of invasive organisms is critical for monitoring emergence and implementing rapid response measures. Molecular markers, although critical to identifying species or genotypes, require bioinformatic tools for analysis. However, user-friendly analytical tools for fast identification are not readily available. To address this need, we created a web-based set of applications called Microbe-ID that allows for customizing a toolbox for rapid species identification and strain genotyping using any genetic markers of choice. Two components of Microbe-ID, named Sequence-ID and Genotype-ID, implement species and genotype identification, respectively. Sequence-ID allows identification of species by using BLAST to query sequences for any locus of interest against a custom reference sequence database. Genotype-ID allows placement of an unknown multilocus marker in either a minimum spanning network or a dendrogram with bootstrap support from a user-created reference database. Microbe-ID can be used for identification of any organism based on nucleotide sequences or any molecular marker type, and several examples are provided. We created a public website for demonstration purposes called Microbe-ID (microbe-id.org) and provided a working implementation for the genus Phytophthora (phytophthora-id.org). In Phytophthora-ID, the Sequence-ID application allows identification based on ITS or cox spacer sequences. Genotype-ID groups individuals into clonal lineages based on simple sequence repeat (SSR) markers for the two invasive plant pathogen species P. infestans and P. ramorum. All code is open source and available on GitHub and CRAN. Instructions for installation and use are provided at https://github.com/grunwaldlab/Microbe-ID. PMID:27602267
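
    A sketch of the Genotype-ID idea, assuming the CRAN package ape and a simulated multilocus marker matrix (Microbe-ID's own code, on GitHub and CRAN, differs in detail and uses genetic rather than Euclidean distances):

      library(ape)

      set.seed(7)
      markers <- matrix(sample(1:6, 20 * 10, replace = TRUE),  # 20 isolates, 10 loci
                        nrow = 20,
                        dimnames = list(paste0("isolate", 1:20), NULL))

      build_tree <- function(x) nj(dist(x))   # neighbour-joining dendrogram
      tree <- build_tree(markers)
      support <- boot.phylo(tree, markers, build_tree, B = 200, quiet = TRUE)

      plot(tree)
      nodelabels(support)                     # bootstrap counts out of 200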

  5. A Flexible Spatio-Temporal Model for Air Pollution with Spatial and Spatio-Temporal Covariates.

    PubMed

    Lindström, Johan; Szpiro, Adam A; Sampson, Paul D; Oron, Assaf P; Richards, Mark; Larson, Tim V; Sheppard, Lianne

    2014-09-01

    The development of models that provide accurate spatio-temporal predictions of ambient air pollution at small spatial scales is of great importance for the assessment of potential health effects of air pollution. Here we present a spatio-temporal framework that predicts ambient air pollution by combining data from several different monitoring networks and deterministic air pollution model(s) with geographic information system (GIS) covariates. The model presented in this paper has been implemented in an R package, SpatioTemporal, available on CRAN. The model is used by the EPA-funded Multi-Ethnic Study of Atherosclerosis and Air Pollution (MESA Air) to produce estimates of ambient air pollution; MESA Air uses the estimates to investigate the relationship between chronic exposure to air pollution and cardiovascular disease. In this paper we use the model to predict long-term average concentrations of NOx in the Los Angeles area during a ten-year period. Predictions are based on measurements from the EPA Air Quality System, MESA Air specific monitoring, and output from a source dispersion model for traffic related air pollution (Caline3QHCR). Accuracy in predicting long-term average concentrations is evaluated using an elaborate cross-validation setup that accounts for a sparse spatio-temporal sampling pattern in the data, and adjusts for temporal effects. The predictive ability of the model is good, with a cross-validated R^2 of approximately 0.7 at subject sites. Replacing four geographic covariate indicators of traffic density with the Caline3QHCR dispersion model output resulted in very similar prediction accuracy from a more parsimonious and more interpretable model. Adding traffic-related geographic covariates to the model that included Caline3QHCR did not further improve the prediction accuracy.
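
    The evaluation idea generalizes beyond this package; the sketch below (not the SpatioTemporal API; the covariates and concentrations are simulated) computes a leave-one-site-out cross-validated R^2 for a simple regression:

      set.seed(3)
      sites <- data.frame(site = 1:30, cov1 = rnorm(30), cov2 = rnorm(30))
      sites$conc <- 25 + 4 * sites$cov1 + 3 * sites$cov2 + rnorm(30, sd = 2)

      pred <- sapply(sites$site, function(s) {
        fit <- lm(conc ~ cov1 + cov2, data = sites[sites$site != s, ])
        predict(fit, newdata = sites[sites$site == s, ])   # held-out site
      })

      1 - sum((sites$conc - pred)^2) / sum((sites$conc - mean(sites$conc))^2)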

  6. ICA model order selection of task co-activation networks.

    PubMed

    Ray, Kimberly L; McKay, D Reese; Fox, Peter M; Riedel, Michael C; Uecker, Angela M; Beckmann, Christian F; Smith, Stephen M; Fox, Peter T; Laird, Angela R

    2013-01-01

    Independent component analysis (ICA) has become a widely used method for extracting functional networks in the brain during rest and task. Historically, preferred ICA dimensionality has widely varied within the neuroimaging community, but typically varies between 20 and 100 components. This can be problematic when comparing results across multiple studies because of the impact ICA dimensionality has on the topology of its resultant components. Recent studies have demonstrated that ICA can be applied to peak activation coordinates archived in a large neuroimaging database (i.e., BrainMap Database) to yield whole-brain task-based co-activation networks. A strength of applying ICA to BrainMap data is that the vast amount of metadata in BrainMap can be used to quantitatively assess tasks and cognitive processes contributing to each component. In this study, we investigated the effect of model order on the distribution of functional properties across networks as a method for identifying the most informative decompositions of BrainMap-based ICA components. Our findings suggest dimensionality of 20 for low model order ICA to examine large-scale brain networks, and dimensionality of 70 to provide insight into how large-scale networks fractionate into sub-networks. We also provide a functional and organizational assessment of visual, motor, emotion, and interoceptive task co-activation networks as they fractionate from low to high model-orders.
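
    Model order is simply the number of components requested from the decomposition; a minimal sketch using the CRAN package fastICA on simulated mixtures (not BrainMap data) shows where that choice enters:

      library(fastICA)

      set.seed(11)
      S <- cbind(sin(seq(0, 20, length.out = 1000)),        # two latent sources
                 sign(sin(seq(0, 37, length.out = 1000))))
      A <- matrix(runif(2 * 5), 2, 5)                       # mix into 5 channels
      X <- S %*% A + matrix(rnorm(5000, sd = 0.05), 1000, 5)  # small sensor noise

      ica_low  <- fastICA(X, n.comp = 2)   # model order 2
      ica_high <- fastICA(X, n.comp = 4)   # higher order fractionates components
      cor(ica_low$S, S)                    # recovered vs. true sources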

  7. ICA model order selection of task co-activation networks

    PubMed Central

    Ray, Kimberly L.; McKay, D. Reese; Fox, Peter M.; Riedel, Michael C.; Uecker, Angela M.; Beckmann, Christian F.; Smith, Stephen M.; Fox, Peter T.; Laird, Angela R.

    2013-01-01

    Independent component analysis (ICA) has become a widely used method for extracting functional networks in the brain during rest and task. Historically, preferred ICA dimensionality has widely varied within the neuroimaging community, but typically varies between 20 and 100 components. This can be problematic when comparing results across multiple studies because of the impact ICA dimensionality has on the topology of its resultant components. Recent studies have demonstrated that ICA can be applied to peak activation coordinates archived in a large neuroimaging database (i.e., BrainMap Database) to yield whole-brain task-based co-activation networks. A strength of applying ICA to BrainMap data is that the vast amount of metadata in BrainMap can be used to quantitatively assess tasks and cognitive processes contributing to each component. In this study, we investigated the effect of model order on the distribution of functional properties across networks as a method for identifying the most informative decompositions of BrainMap-based ICA components. Our findings suggest dimensionality of 20 for low model order ICA to examine large-scale brain networks, and dimensionality of 70 to provide insight into how large-scale networks fractionate into sub-networks. We also provide a functional and organizational assessment of visual, motor, emotion, and interoceptive task co-activation networks as they fractionate from low to high model-orders. PMID:24339802

  8. Use of film digitizers to assist radiology image management

    NASA Astrophysics Data System (ADS)

    Honeyman-Buck, Janice C.; Frost, Meryll M.; Staab, Edward V.

    1996-05-01

    The purpose of this development effort was to evaluate the possibility of using digital technologies to solve image management problems in the Department of Radiology at the University of Florida. The three problem areas investigated were local interpretation of images produced in remote locations, distribution of images to areas outside of radiology, and film handling. In all cases the use of a laser film digitizer interfaced to an existing Picture Archiving and Communication System (PACS) was investigated as a solution to the problem. In each case the volume of studies involved was evaluated to estimate the impact of the solution on the network, archive, and workstations. Communication needs were stressed in the analysis of all image transmission. The operational aspects of the solution were examined to determine the needs for training, service, and maintenance. The remote sites requiring local interpretation included a rural hospital needing coverage for after-hours studies, the University of Florida student infirmary, and the emergency room. Distribution of images to the intensive care units was studied to improve image access and patient care. Handling of films originating from remote sites and of those requiring urgent reporting was evaluated to improve management functions. The results of our analysis, and the decisions made based on it, are described below. In the cases where systems were installed, a description of the system and its integration into the PACS is included. For all three problem areas, although we could move images via a digitizer to the archive and a workstation, there was no way to inform the radiologist that a study needed attention. In the case of outside films, the patient did not always have a medical record number that matched one in our Radiology Information System (RIS). In order to incorporate all studies for a patient, we needed common locations for orders, reports, and images. RIS orders were generated for each outside study to be interpreted, and a medical record number was assigned if none existed. All digitized outside films were archived in the PACS archive for later review or comparison use. The request generated by the RIS for a diagnostic interpretation was placed at the PACS workstation to alert the radiologists that unread images had arrived, and a box was added to the workstation user interface that could be checked by the radiologist to indicate that a report had been dictated. The digitizer system solved several problems: unavailable films in the emergency room, teleradiology, and archiving of outside studies that had been read by University of Florida radiologists. In addition to saving time in outside film management, we now store the studies for comparison purposes, no longer lose emergency room films, generate diagnostic reports on emergency room films in a timely manner (important for billing and reimbursement), and can handle the distributed nature of our business. As changes in health care drive management changes, existing tools can be used in new ways to help make the transition easier. In this case, adding digitizers to an existing PACS network helped solve several image management problems.

  9. Translational networks in healthcare? Evidence on the design and initiation of organizational networks for knowledge mobilization.

    PubMed

    Fitzgerald, Louise; Harvey, Gill

    2015-08-01

    International attention has focussed on the variations between research evidence and practice in healthcare. This prompted the creation of formalized translational networks consisting of academic-service partnerships. The English Collaborations for Leadership in Applied Health Research and Care (CLAHRCs) are one example of a translational network. Using longitudinal, archival case study data from one CLAHRC over a 3-year period (2008-11), this article explores the relationship between organizational form and the function(s) of a translational network. The article focuses on the research gaps concerning the effective structures and appropriate governance needed to support a translational network. Data analysis suggested that the policy of setting up translational networks is insufficient in itself to produce positive translational activity. The data indicate that to leverage the benefits of the whole network, attention must be paid to devising a structure which integrates research production and use and facilitates lateral cross-disciplinary and cross-organizational communication. Equally, appropriate governance arrangements are necessary, particularly in large, multi-stakeholder networks, where shared governance may be questionable. Inappropriate network structure and governance inhibit the potential of the translational network. Finally, the case provides insights into the movement of knowledge within and between network organizations. The data demonstrate that knowledge mobilization extends beyond knowledge translation; knowledge mobilization includes the negotiated utilization of knowledge - a balanced-power form of collaboration. Whilst much translational effort is externally focused on the health system, our findings highlight the essential need for the internal negotiation and mobilization of knowledge within academia. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Particular geoscientific perspectives on stable isotope analysis in the arboreal system

    NASA Astrophysics Data System (ADS)

    Helle, Gerhard; Balting, Daniel; Pauly, Maren; Slotta, Franziska

    2017-04-01

    In the geosciences, stable isotopes of carbon, oxygen, and hydrogen from the tree-ring archive have been used for several decades to trace the course of past environmental and climatological fluctuations. In contrast to ice cores, the tree-ring archive is of biological nature (like many other terrestrial archives), but it provides the opportunity to establish site networks with very high resolution in space and time. Many of the basic physical mechanisms of isotope shifts are known, but biologically mediated processes may lead to isotope effects that are poorly understood. The many processes within the arboreal system leading to archived isotope ratios in wood material are governed by a multitude of environmental variables that are tied not only to the isotopic composition of atmospheric source values (precipitation, CO2), but also to seasonally changing metabolic flux rates and pool sizes of photosynthates within the trees. Consequently, the extraction of climate and environmental information is particularly challenging, and reconstructions are still of a rather qualitative nature. Over the last 10 years or so, monitoring studies have been implemented to investigate stable isotope, climate, and environmental signal transfer within the arboreal system, with the goal of developing transfer or response functions that can translate the isotope values extracted from tree rings into climate or other environmental variables. To what extent have these efforts led to a better understanding that helps improve the meaningfulness of tree-ring isotope signals? For example, do monitoring studies help decipher the causes of the age-related trends in tree-ring stable isotope sequences that are being published in a growing number of papers? Are existing monitoring studies detailed enough, or is the effort already too great for the outcome? Based on what we already know, particularly in mesic habitats, tree-ring stable isotopes are much better climate proxies than other tree-ring parameters. However, millennial or multi-millennial high-quality reconstructions from tree-ring isotopes are still rare. This is because of (i) methodological constraints related to mass spectrometric analyses and (ii) the nature of tree-ring chronologies, which are assembled from many trees of various individual ages. In view of this: What is the state of the art in high-throughput tree-ring stable isotope analyses? Is it necessary to advance existing methodologies further to conserve the annual time resolution provided by the tree-ring archive? Other terrestrial archives, like lake sediments and speleothems, rarely provide annually resolved stable isotope data. Furthermore, certain tree species from tropical or subtropical regions cannot be dated properly by dendrochronology and hence demand specific stable isotope measuring strategies. Although the points raised here apply specifically to the tree-ring archive, some of them are important for all proxy archives of organic origin.

  11. Russian women emigrées in psychology: informal Jewish networks.

    PubMed

    Woodward, William R

    2010-05-01

    This paper uses archival sources and autobiographies to give a fuller account of the lives of three Russian women psychologists, each of whom voluntarily emigrated several years before the Third Reich. As such, their stories contribute to gender history, emigration history, and ethnic history. The characteristics of second-generation women in psychology seem to apply to this sample; they accepted applied or secondary positions in psychology or allied fields and came late to tenure-track positions. Some first-generation characteristics fit them also: choosing career over marriage, accepting the "family claim," and living "fractured lives." Emigrée history reveals that these women found careers in the United States that could not have happened in the smaller, more restricted higher education networks of Europe. Female friendships and family ties to the Old World sustained them. All struggled with professional networking and had varying success, depending heavily upon the patronage of sympathetic male psychologists. Ethnic history shows that none identified strongly with Judaism, yet all benefited from Jewish mentors and networks of patronage. Evidence of gendered or racial discrimination in hiring practices is sparse, though it surely existed.

  12. Mimoza: web-based semantic zooming and navigation in metabolic networks.

    PubMed

    Zhukova, Anna; Sherman, David J

    2015-02-26

    The complexity of genome-scale metabolic models makes them quite difficult for human users to read, since they contain thousands of reactions that must be included for accurate computer simulation. Interestingly, hidden similarities between groups of reactions can be discovered and generalized to reveal higher-level patterns. The web-based navigation system Mimoza allows a human expert to explore metabolic network models in a semantically zoomable manner: the most general view represents the compartments of the model; the next view shows the generalized versions of reactions and metabolites in each compartment; and the most detailed view represents the initial network with a generalization-based layout (where similar metabolites and reactions are placed next to each other). This allows a human expert to grasp the general structure of the network and analyze it in a top-down manner. Mimoza can be installed standalone, used on-line at http://mimoza.bordeaux.inria.fr/, or installed in a Galaxy server for use in workflows. Mimoza views can be embedded in web pages or downloaded as COMBINE archives.

  13. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, Edward C. (Editor)

    1991-01-01

    This quarterly publication provides archival reports on developments in programs managed by the Jet Propulsion Laboratory's (JPL's) Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on the activities of the Deep Space Network (DSN) in planning, in supporting research and technology, in implementation, and in operations. Also included is standards activity at JPL for space data, information systems, and reimbursable DSN work performed for other space agencies through NASA.

  14. The Naval Postgraduate School SECURE ARCHIVAL STORAGE SYSTEM. Part II. Segment and Process Management Implementation.

    DTIC Science & Technology

    1981-03-01

    Fragmentary OCR excerpt; the recoverable content is administrative front matter (prepared by a research instructor of computer science; reviewed and released through the Department of Computer Science and the Dean of Research, William M. Tolles), the author list (Lyle A. Cox, Roger R. Schell, and Sonja L. Perdue), and the report keywords: computer networks, operating systems, computer security.

  15. Data base management system and display software for the National Geophysical Data Center geomagnetic CD-ROM's

    NASA Technical Reports Server (NTRS)

    Papitashvili, N. E.; Papitashvili, V. O.; Allen, J. H.; Morris, L. D.

    1995-01-01

    The National Geophysical Data Center has the largest collection of geomagnetic data from the worldwide network of magnetic observatories. A data base management system and retrieval/display software have been developed for the archived geomagnetic data (annual means and monthly, daily, hourly, and 1-minute values) and placed on the center's CD-ROMs to provide users with 'user-oriented' and 'user-friendly' support. This paper describes the system, with a brief outline of the options provided.

  16. A PC-controlled microwave tomographic scanner for breast imaging

    NASA Astrophysics Data System (ADS)

    Padhi, Shantanu; Howard, John; Fhager, A.; Bengtsson, Sebastian

    2011-01-01

    This article presents the design and development of a personal computer based controller for a microwave tomographic system for breast cancer detection. The system uses motorized, dual-polarized antennas and a custom-made GUI interface to control stepper motors, a wideband vector network analyzer (VNA) and to coordinate data acquisition and archival in a local MDSPlus database. Both copolar and cross-polar scattered field components can be measured directly. Experimental results are presented to validate the various functionalities of the scanner.

  17. Reliability issues in PACS

    NASA Astrophysics Data System (ADS)

    Taira, Ricky K.; Chan, Kelby K.; Stewart, Brent K.; Weinberg, Wolfram S.

    1991-07-01

    Reliability is an increasing concern when moving PACS from the experimental laboratory to the clinical environment. Any system downtime may seriously affect patient care. The authors report on several classes of errors encountered during the pre-clinical release of the PACS during the past several months and present the solutions implemented to handle them. The reliability issues discussed include: (1) environmental precautions, (2) database backups, (3) monitoring routines for critical resources and processes, (4) hardware redundancy (networks, archives), and (5) development of a PACS quality control program.

  18. Convolutional Neural Network on Embedded Linux(trademark) System-on-Chip: A Methodology and Performance Benchmark

    DTIC Science & Technology

    2016-05-01

    Fragmentary OCR excerpt; the recoverable content compares CPU power draw (an ARM Cortex-A9 versus 15 W for an i7), notes that the computation can be accelerated by a customized hardware unit called a field-programmable gate array (FPGA) whose fabric permits implementation of custom logic to accelerate computational workloads in addition to standard programmable logic, and preserves the report keywords: system-on-chip; field-programmable gate array.

  19. Convolutional Neural Network on Embedded Linux System-on-Chip: A Methodology and Performance Benchmark

    DTIC Science & Technology

    2016-05-01

    Fragmentary OCR excerpt; the recoverable content compares CPU power draw (an ARM Cortex-A9 versus 15 W for an i7), notes that the computation can be accelerated by a customized hardware unit called a field-programmable gate array (FPGA) whose fabric permits implementation of custom logic to accelerate computational workloads in addition to standard programmable logic, and preserves the report keywords: system-on-chip; field-programmable gate array.

  20. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, Edward C. (Editor)

    1992-01-01

    This quarterly publication provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The preceding work is all performed for NASA's Office of Space Communications (OSC).

  1. Design of an image-distribution service from a clinical PACS

    NASA Astrophysics Data System (ADS)

    Gehring, Dale G.; Persons, Kenneth R.; Rothman, Melvyn L.; Felmlee, Joel P.; Gerhart, D. J.; Hangiandreou, Nicholas J.; Reardon, Frank J.; Shirk, M.; Forbes, Glenn S.; Williamson, Byrn, Jr.

    1994-05-01

    A PACS system has been developed through a multi-phase collaboration between the Mayo Clinic and IBM/Rochester. The current system has been fully integrated into the clinical practice of the Radiology Department for the primary purpose of digital image archival, retrieval, and networked workstation review. Work currently in progress includes the design and implementation of a gateway device for providing digital image data to third-party workstations, laser printers, and other devices, for users both within and outside of the Radiology Department.

  2. Warfighter Physiological and Environmental Monitoring: A Study for the U.S. Army Research Institute in Environmental Medicine and the Soldier Systems Center

    DTIC Science & Technology

    2004-11-01

    Fragmentary OCR excerpt; the recoverable content concerns body-worn physiological monitoring over a wireless personal area network (PAN): peripheral devices such as a heart-rate monitor or oximeter connect over a wireless link; fixed devices can be powered from wall outlets, whereas networks comprising mobile, body-worn sensors must operate on battery power; a complete system (including SpO2 sensing) is noted to cost in excess of $25K; and size, weight, and power of the mobile components (communications link and data archiving), excluding the sensors, are a further constraint.

  3. Experience with PACS in an ATM/Ethernet switched network environment.

    PubMed

    Pelikan, E; Ganser, A; Kotter, E; Schrader, U; Timmermann, U

    1998-03-01

    Legacy local area network (LAN) technologies based on shared-media concepts are not adequate for the growth of a large-scale picture archiving and communication system (PACS) in a client-server architecture. First, the asymmetric network load, due to the requests of a large number of PACS clients directed at only a few main servers, should be compensated by giving the servers communication links with a higher bandwidth than the clients. Second, as the number of PACS nodes increases, network throughput should not measurably cut into production. These requirements can easily be fulfilled using switching technologies. Here asynchronous transfer mode (ATM) is clearly one of the hottest topics in networking, because the ATM architecture provides integrated support for a variety of communication services and supports virtual networking. On the other hand, most imaging modalities are not yet ready for integration into a native ATM network. For the many nodes already joined to an Ethernet, a cost-effective and pragmatic way to benefit from the switching concept is a combined ATM/Ethernet switching environment. This incorporates an incremental migration strategy with the immediate benefits of high-speed, high-capacity ATM (for servers and highly sophisticated display workstations), while preserving elements of the existing network technologies. In addition, Ethernet switching instead of shared-media Ethernet improves performance considerably. The LAN emulation (LANE) specification by the ATM Forum defines mechanisms that allow ATM networks to coexist with legacy systems using any data networking protocol. This paper points out the suitability of this network architecture in accordance with an appropriate system design.

  4. THE EFFECT OF CLOUD FRACTION ON THE RADIATIVE ENERGY BUDGET: The Satellite-Based GEWEX-SRB Data vs. the Ground-Based BSRN Measurements

    NASA Astrophysics Data System (ADS)

    Zhang, T.; Stackhouse, P. W.; Gupta, S. K.; Cox, S. J.; Mikovitz, J. C.; NASA GEWEX SRB

    2011-12-01

    The NASA GEWEX-SRB (Global Energy and Water cycle Experiment - Surface Radiation Budget) project produces and archives shortwave and longwave atmospheric radiation data at the top of the atmosphere (TOA) and the Earth's surface. The archive holds uninterrupted records of shortwave/longwave downward/upward radiative fluxes at 1 degree by 1 degree resolution for the entire globe. The latest version in the archive, Release 3.0, is available as 3-hourly, daily, and monthly means, spanning 24.5 years from July 1983 to December 2007. Primary inputs to the models used to produce the data include: shortwave and longwave radiances from International Satellite Cloud Climatology Project (ISCCP) pixel-level (DX) data, cloud and surface properties derived therefrom, temperature and moisture profiles from the GEOS-4 reanalysis product obtained from the NASA Global Modeling and Assimilation Office (GMAO), and column ozone amounts compiled from Total Ozone Mapping Spectrometer (TOMS) and TIROS Operational Vertical Sounder (TOVS) archives and from the Stratospheric Monitoring-group's Ozone Blended Analysis (SMOBA), an assimilation product from NOAA's Climate Prediction Center. The data in the archive have been validated systematically against ground-based measurements, including the Baseline Surface Radiation Network (BSRN) data, the World Radiation Data Centre (WRDC) data, and the Global Energy Balance Archive (GEBA) data, and generally good agreement has been achieved. In addition to all-sky radiative fluxes, the output data include clear-sky fluxes, cloud optical depth, cloud fraction, and so on. The BSRN archive also includes observations that can be used to derive the cloud fraction, which provides a means for analyzing and explaining the SRB-BSRN flux differences. In this paper, we focus on the effect of cloud fraction on the surface shortwave flux and on the level of agreement between the satellite-based SRB data and the ground-based BSRN data. The satellite and BSRN employ different measuring methodologies and thus yield data representing means on dramatically different spatial scales. Therefore, the satellite-based and ground-based measurements are not expected to agree all the time, especially under skies with clouds. The flux comparisons are made under different cloud fractions, and it is found that the SRB-BSRN radiative flux discrepancies can be explained to a certain extent by the SRB-BSRN cloud fraction discrepancies. Apparently, cloud fraction alone cannot completely define the role of clouds in radiation transfer. Further studies need to incorporate the classification of cloud types, altitudes, cloud optical depths, and so on.
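
    The kind of stratified comparison described can be sketched simply: below, invented co-located fluxes are binned by cloud fraction, and the per-bin bias and RMSE of the satellite-minus-ground difference are computed:

      set.seed(5)
      n <- 500
      cf   <- runif(n)                                  # cloud fraction, 0-1
      bsrn <- 900 * (1 - 0.7 * cf) + rnorm(n, sd = 30)  # ground SW flux, W/m^2
      srb  <- bsrn + rnorm(n, sd = 15 + 60 * cf)        # satellite scatter grows with cloud

      bin <- cut(cf, breaks = seq(0, 1, by = 0.2), include.lowest = TRUE)
      aggregate(srb - bsrn, by = list(cloud_fraction = bin),
                FUN = function(d) c(bias = mean(d), rmse = sqrt(mean(d^2))))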

  5. Improving Access to NASA Earth Science Data through Collaborative Metadata Curation

    NASA Astrophysics Data System (ADS)

    Sisco, A. W.; Bugbee, K.; Shum, D.; Baynes, K.; Dixon, V.; Ramachandran, R.

    2017-12-01

    The NASA-developed Common Metadata Repository (CMR) is a high-performance metadata system that currently catalogs over 375 million Earth science metadata records. It serves as the authoritative metadata management system of NASA's Earth Observing System Data and Information System (EOSDIS), enabling NASA Earth science data to be discovered and accessed by a worldwide user community. The size of the EOSDIS data archive is steadily increasing, and the ability to manage and query this archive depends on the input of high quality metadata to the CMR. Metadata that does not provide adequate descriptive information diminishes the CMR's ability to effectively find and serve data to users. To address this issue, an innovative and collaborative review process is underway to systematically improve the completeness, consistency, and accuracy of metadata for approximately 7,000 data sets archived by NASA's twelve EOSDIS data centers, or Distributed Active Archive Centers (DAACs). The process involves automated and manual metadata assessment of both collection and granule records by a team of Earth science data specialists at NASA Marshall Space Flight Center. The team communicates results to DAAC personnel, who then make revisions and reingest improved metadata into the CMR. Implementation of this process relies on a network of interdisciplinary collaborators leveraging a variety of communication platforms and long-range planning strategies. Curating metadata at this scale and resolving metadata issues through community consensus improves the CMR's ability to serve current and future users and also introduces best practices for stewarding the next generation of Earth Observing System data. This presentation will detail the metadata curation process, its outcomes thus far, and also share the status of ongoing curation activities.

  6. Steps toward a CONUS-wide reanalysis with archived NEXRAD data using National Mosaic and Multisensor Quantitative Precipitation Estimation (NMQ/Q2) algorithms

    NASA Astrophysics Data System (ADS)

    Stevens, S. E.; Nelson, B. R.; Langston, C.; Qi, Y.

    2012-12-01

    The National Mosaic and Multisensor QPE (NMQ/Q2) software suite, developed at NOAA's National Severe Storms Laboratory (NSSL) in Norman, OK, addresses a large deficiency in the resolution of currently archived precipitation datasets. Current standards, both radar- and satellite-based, provide nationwide precipitation data at a spatial resolution of, at best, 4-5 km, with a temporal resolution as fine as one hour. Efforts are ongoing to process archived NEXRAD data for the period of record (1996 - present), producing a continuous dataset with precipitation data at a spatial resolution of 1 km on a timescale of only five minutes. In addition, radar-derived precipitation data are adjusted hourly using a wide variety of automated gauge networks spanning the United States. Applications for such a product range widely, from emergency management and flash flood guidance to hydrological studies and drought monitoring. Results are presented from a subset of the NEXRAD dataset, providing basic statistics on the distribution of rain rates, the relative frequency of precipitation types, and several other variables that demonstrate the variety of output provided by the software. Precipitation data from select case studies are also presented to highlight the increased resolution provided by this reanalysis and the possibilities that arise from the availability of data on such fine scales. A previously completed pilot project and steps toward a nationwide implementation are presented, along with proposed strategies for managing and processing such a large dataset. Reprocessing efforts span several institutions in both North Carolina and Oklahoma, and data/software coordination is key to producing a homogeneous record of precipitation to be archived alongside NOAA's other Climate Data Records. Methods are presented for utilizing supercomputing capability to expedite processing and to accommodate the iterative nature of a reanalysis effort.
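
    Gauge adjustment can take many forms; a common baseline is a mean-field bias correction, sketched below with invented gauge/radar pairs (not necessarily the NMQ/Q2 algorithm):

      set.seed(9)
      truth <- rgamma(50, shape = 2, scale = 2)       # hourly rain at 50 gauges, mm
      radar <- 0.8 * truth * rlnorm(50, sdlog = 0.2)  # radar biased low by ~20%
      gauge <- truth

      bias_factor <- sum(gauge) / sum(radar)          # mean-field bias
      radar_adj   <- radar * bias_factor              # scale the radar field

      c(before = mean(radar - gauge), after = mean(radar_adj - gauge))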

  7. The Kanzelhöhe Observatory

    NASA Astrophysics Data System (ADS)

    Pötzi, Werner; Temmer, Manuela; Veronig, Astrid; Hirtenfellner-Polanec, Wolfgang; Baumgartner, Dietmar

    2013-04-01

    Kanzelhöhe Observatory (KSO; kso.ac.at), located in the south of Austria, is part of the Institute of Physics of the University of Graz. Since the early 1940s, the Sun has been observed there in various layers and wavelengths. Currently, KSO provides high-cadence full-disk observations of the Sun at three wavelengths: the H-alpha line, the Ca II K line, and white light. Real-time images are published online. For scientific use, the data are processed and made available to the scientific community immediately after each observing day via the Kanzelhöhe Online Data Archive (KODA; kanzelhohe.uni-graz.at). KSO is part of the Global H-Alpha Network and is also one of the contributing stations for the international sunspot number. In the frame of ESA's Space Situational Awareness program, methods are currently under development for near-real-time image recognition of solar flares and filaments. These data products will give valuable complementary information on the solar sources of space weather.

  8. Bonneville Power Administration Communication Alarm Processor expert system:

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goeltz, R.; Purucker, S.; Tonn, B.

    This report describes the Communications Alarm Processor (CAP), a prototype expert system developed for the Bonneville Power Administration by Oak Ridge National Laboratory. The system is designed to receive and diagnose alarms from Bonneville's Microwave Communications System (MCS). The prototype encompasses one of seven branches of the communications network and a subset of alarm systems and alarm types from each system. The expert system employs a backward-chaining approach to diagnosing alarms. Alarms are fed into the expert system directly from the communication system via RS232 ports and sophisticated alarm filtering and mailbox software. Alarm diagnoses are presented to operators for their review and concurrence before the diagnoses are archived. Statistical software is incorporated to allow analysis of archived data for report generation and maintenance studies. The delivered system resides on a Digital Equipment Corporation VAX 3200 workstation and utilizes Nexpert Object and SAS for the expert system and statistical analysis, respectively. 11 refs., 23 figs., 7 tabs.

  9. The Centennial Trends Greater Horn of Africa precipitation dataset.

    PubMed

    Funk, Chris; Nicholson, Sharon E; Landsfeld, Martin; Klotter, Douglas; Peterson, Pete; Harrison, Laura

    2015-01-01

    East Africa is a drought prone, food and water insecure region with a highly variable climate. This complexity makes rainfall estimation challenging, and this challenge is compounded by low rain gauge densities and inhomogeneous monitoring networks. The dearth of observations is particularly problematic over the past decade, since the number of records in globally accessible archives has fallen precipitously. This lack of data coincides with an increasing scientific and humanitarian need to place recent seasonal and multi-annual East African precipitation extremes in a deep historic context. To serve this need, scientists from the UC Santa Barbara Climate Hazards Group and Florida State University have pooled their station archives and expertise to produce a high quality gridded 'Centennial Trends' precipitation dataset. Additional observations have been acquired from the national meteorological agencies and augmented with data provided by other universities. Extensive quality control of the data was carried out and seasonal anomalies interpolated using kriging. This paper documents the CenTrends methodology and data.
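
    Interpolating station anomalies by kriging, as described, looks roughly like the sketch below in R, assuming the CRAN packages sp and gstat and simulated station data (longitude/latitude are treated as planar here for simplicity, which a production product would not do):

      library(sp)
      library(gstat)

      set.seed(2)
      stations <- data.frame(lon = runif(80, 30, 45), lat = runif(80, -10, 15))
      stations$anom <- sin(stations$lon / 3) + cos(stations$lat / 4) +
                       rnorm(80, sd = 0.2)              # simulated seasonal anomaly
      coordinates(stations) <- ~ lon + lat

      grid <- expand.grid(lon = seq(30, 45, 0.5), lat = seq(-10, 15, 0.5))
      coordinates(grid) <- ~ lon + lat
      gridded(grid) <- TRUE

      v  <- variogram(anom ~ 1, stations)               # empirical variogram
      vm <- fit.variogram(v, vgm(1, "Exp", 5, 0.05))    # fitted exponential model
      kr <- krige(anom ~ 1, stations, grid, model = vm) # ordinary kriging
      spplot(kr["var1.pred"])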

  10. The Centennial Trends Greater Horn of Africa precipitation dataset

    USGS Publications Warehouse

    Funk, Chris; Nicholson, Sharon E.; Landsfeld, Martin F.; Klotter, Douglas; Peterson, Pete J.; Harrison, Laura

    2015-01-01

    East Africa is a drought prone, food and water insecure region with a highly variable climate. This complexity makes rainfall estimation challenging, and this challenge is compounded by low rain gauge densities and inhomogeneous monitoring networks. The dearth of observations is particularly problematic over the past decade, since the number of records in globally accessible archives has fallen precipitously. This lack of data coincides with an increasing scientific and humanitarian need to place recent seasonal and multi-annual East African precipitation extremes in a deep historic context. To serve this need, scientists from the UC Santa Barbara Climate Hazards Group and Florida State University have pooled their station archives and expertise to produce a high quality gridded ‘Centennial Trends’ precipitation dataset. Additional observations have been acquired from the national meteorological agencies and augmented with data provided by other universities. Extensive quality control of the data was carried out and seasonal anomalies interpolated using kriging. This paper documents the CenTrends methodology and data.

  11. VARSEDIG: an algorithm for morphometric characters selection and statistical validation in morphological taxonomy.

    PubMed

    Guisande, Cástor; Vari, Richard P; Heine, Jürgen; García-Roselló, Emilio; González-Dacosta, Jacinto; Perez-Schofield, Baltasar J García; González-Vilas, Luis; Pelayo-Villamil, Patricia

    2016-09-12

    We present and discuss VARSEDIG, an algorithm which identifies the morphometric features that significantly discriminate two taxa and validates the morphological distinctness between them via a Monte-Carlo test. VARSEDIG is freely available as a function of the RWizard application PlotsR (http://www.ipez.es/RWizard) and as an R package on CRAN. The variables selected by VARSEDIG with the overlap method were very similar to those selected by logistic regression and discriminant analysis, but VARSEDIG overcomes some shortcomings of these methods. VARSEDIG is therefore a good alternative to current classical classification methods for identifying morphometric features that significantly discriminate a taxon and for validating its morphological distinctness from other taxa. As a demonstration of its potential for this purpose, we analyze morphological discrimination among some species of the Neotropical freshwater fish family Characidae.
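
    The Monte-Carlo validation idea is easy to sketch (this is not VARSEDIG's own algorithm; the measurements are simulated): permute group labels to build a null distribution for the between-taxon difference in a morphometric variable:

      set.seed(13)
      taxon_a <- rnorm(30, mean = 12.0, sd = 1)   # e.g., head length, invented
      taxon_b <- rnorm(25, mean = 13.1, sd = 1)

      observed <- mean(taxon_b) - mean(taxon_a)
      pooled   <- c(taxon_a, taxon_b)

      null_diff <- replicate(9999, {              # label-permutation null
        idx <- sample(length(pooled), length(taxon_b))
        mean(pooled[idx]) - mean(pooled[-idx])
      })

      (1 + sum(abs(null_diff) >= abs(observed))) / (1 + length(null_diff))  # p-value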

  12. CADC and CANFAR: Extending the role of the data centre

    NASA Astrophysics Data System (ADS)

    Gaudet, Severin

    2015-12-01

    Over the past six years, the CADC has moved beyond the astronomy archive data centre to a multi-service system for the community. This evolution is based on two major initiatives. The first is the adoption of International Virtual Observatory Alliance (IVOA) standards in both the system and data architecture of the CADC, including a common characterization data model. The second is the Canadian Advanced Network for Astronomical Research (CANFAR), a digital infrastructure combining the Canadian national research network (CANARIE), cloud processing and storage resources (Compute Canada), and a data centre (Canadian Astronomy Data Centre) into a unified ecosystem for storage and processing for the astronomy community. This talk will describe the architecture and integration of IVOA and CANFAR services into CADC operations, the operational experiences, the lessons learned, and future directions.

  13. Turkish meteor surveillance systems and network: Impact craters and meteorites database

    NASA Astrophysics Data System (ADS)

    Unsalan, O.; Ozel, M. E.; Derman, I. E.; Terzioglu, Z.; Kaygisiz, E.; Temel, T.; Topoyan, D.; Solmaz, A.; Yilmaz Kocahan, O.; Esenoglu, H. H.; Emrahoglu, N.; Yilmaz, A.; Yalcinkaya, B. O.

    2014-07-01

    In our project, we aim to construct a Turkish meteor surveillance system and network. For this goal, video observation systems from SonotaCo (Japan) were chosen. Meteors will be observed with these cameras, their orbits will be calculated by the SonotaCo software, and the predicted fall/impact sites will be examined through field trips. The collected meteorites will be investigated by IR-Raman spectroscopic techniques and SEM-EDX analyses in order to set up a database. In addition, our Prime Ministry Ottoman Archives contain a large number of reports of falls from past centuries. To treat these data properly, processing systems must be constructed and developed.

  14. A Geospatial Database that Supports Derivation of Climatological Features of Severe Weather

    NASA Astrophysics Data System (ADS)

    Phillips, M.; Ansari, S.; Del Greco, S.

    2007-12-01

    The Severe Weather Data Inventory (SWDI) at NOAA's National Climatic Data Center (NCDC) provides user access to archives of several datasets critical to the detection and evaluation of severe weather. These datasets include archives of:

    · NEXRAD Level-III point features describing general storm structure, hail, mesocyclone, and tornado signatures
    · National Weather Service Storm Events Database
    · National Weather Service Local Storm Reports collected from storm spotters
    · National Weather Service Warnings
    · Lightning strikes from Vaisala's National Lightning Detection Network (NLDN)

    SWDI archives all of these datasets in a spatial database that allows for convenient searching and subsetting. These data are accessible via the NCDC web site, Web Feature Services (WFS), or automated web services. The results of interactive web page queries may be saved in a variety of formats, including plain text, XML, Google Earth's KMZ, standards-based NetCDF, and Shapefile. NCDC's Storm Risk Assessment Project (SRAP) uses data from the SWDI database to derive gridded climatology products that show the spatial distributions of the frequency of various events. SRAP can also relate SWDI events to other spatial data such as roads, population, watersheds, and other geographic, sociological, or economic data to derive products that are useful in municipal planning, emergency management, the insurance industry, and other areas where there is a need to quantify and qualify how severe weather patterns affect people and property.
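
    For example, the automated web services can be queried directly from R. The endpoint path and dataset code below (swdiws, nx3tvs for NEXRAD tornado vortex signatures) are assumptions and should be checked against the current NCDC/NCEI SWDI documentation.

      # Hedged sketch: fetch tornado vortex signature events for a date
      # range as CSV via the SWDI web services (URL pattern assumed).
      url <- "https://www.ncdc.noaa.gov/swdiws/csv/nx3tvs/20060505:20060510"
      events <- read.csv(url)
      head(events)  # column names depend on the dataset returned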

  15. Queuing Models of Tertiary Storage

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore

    1996-01-01

    Large-scale scientific projects generate and use large amounts of data. For example, the NASA Earth Observation System Data and Information System (EOSDIS) project is expected to archive one petabyte per year of raw satellite data. This data is made automatically available for processing into higher level data products and for dissemination to the scientific community. Such large volumes of data can only be stored in robotic storage libraries (RSLs) for near-line access. A characteristic of RSLs is the use of a robot arm that transfers media between a storage rack and the read/write drives, thus multiplying the capacity of the system. The performance of the RSLs can be a critical limiting factor for the performance of the archive system. However, the many interacting components of an RSL make a performance analysis difficult. In addition, different RSL components can have widely varying performance characteristics. This paper describes our work to develop performance models of an RSL in isolation. Next we show how the RSL model can be incorporated into a queuing network model. We use the models to make some example performance studies of archive systems. The models described in this paper, developed for the NASA EOSDIS project, are implemented in C with a well-defined interface. The source code, accompanying documentation, and sample JAVA applets are available at: http://www.cis.ufl.edu/ted/
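
    The flavor of such a model can be conveyed with a toy single-server calculation in R: an M/M/1 approximation of one tape drive, not the paper's actual C implementation, with rates invented purely for illustration.

      # M/M/1 toy model of a tape drive in a robotic storage library.
      lambda <- 10 / 3600     # arrivals: 10 mount requests per hour (1/s)
      mu     <- 1 / 180       # service: mount + seek + transfer ~ 180 s
      rho    <- lambda / mu               # drive utilization (here 0.5)
      W      <- 1 / (mu - lambda)         # mean time in system: 360 s
      Lq     <- rho^2 / (1 - rho)         # mean number waiting in queue

    Even this toy version shows the nonlinearity that makes RSLs a potential bottleneck: as lambda approaches mu, the mean time in system W grows without bound.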

  16. MyShake: Initial observations from a global smartphone seismic network

    NASA Astrophysics Data System (ADS)

    Kong, Qingkai; Allen, Richard M.; Schreier, Louis

    2016-09-01

    MyShake is a global smartphone seismic network that harnesses the power of crowdsourcing. In the first 6 months since the release of the MyShake app, there were almost 200,000 downloads. On a typical day, about 8000 phones provide acceleration waveform data to the MyShake archive. The on-phone app can detect and trigger on P waves and is capable of recording magnitude 2.5 and larger events. More than 200 seismic events have been recorded so far, including events in Chile, Argentina, Mexico, Morocco, Nepal, New Zealand, Taiwan, Japan, and across North America. The largest number of waveforms from a single earthquake to date comes from the M5.2 Borrego Springs earthquake in Southern California, for which MyShake collected 103 useful three-component waveforms. The network continues to grow with new downloads from the Google Play store every day and expands rapidly when public interest in earthquakes peaks, such as during an earthquake sequence.
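
    A classical way to trigger on a sudden amplitude increase is an STA/LTA ratio; the R sketch below illustrates that generic idea only. MyShake's actual on-phone detector is more sophisticated, and `acc`, a vector of acceleration samples, is a hypothetical input.

      # Generic STA/LTA trigger: short-term over long-term average energy.
      sta_lta <- function(acc, ns = 50, nl = 1000) {
        e   <- acc^2                                   # signal energy
        sta <- stats::filter(e, rep(1 / ns, ns), sides = 1)
        lta <- stats::filter(e, rep(1 / nl, nl), sides = 1)
        as.numeric(sta / lta)
      }
      trigger_at <- which(sta_lta(acc) > 4)[1]  # first sample above ratio 4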

  17. Development of Standard Station Interface for Comprehensive Nuclear Test Ban Treaty Organisation Monitoring Networks

    NASA Astrophysics Data System (ADS)

    Dricker, I. G.; Friberg, P.; Hellman, S.

    2001-12-01

    Under contract with the CTBTO, Instrumental Software Technologies, Inc. (ISTI) has designed and developed a Standard Station Interface (SSI) - a set of executable programs and application programming interface libraries for acquisition, authentication, archiving and telemetry of seismic and infrasound data for stations of the CTBTO nuclear monitoring network. SSI (written in C) is fully supported under both the Solaris and Linux operating systems and will be shipped with fully documented source code. SSI consists of several interconnected modules. The Digitizer Interface Module maintains a near-real-time data flow between multiple digitizers and the SSI. The Disk Buffer Module is responsible for local data archival. The Station Key Management Module is a low-level tool for data authentication and verification of incoming signatures. The Data Transmission Module supports packetized near-real-time data transmission from the primary CTBTO stations to the designated Data Center. The AutoDRM module allows transport of signed seismic and infrasound data via electronic mail (auxiliary station mode). The Command Interface Module is used to pass remote commands to the digitizers and other modules of SSI. A station operator has access to state-of-health information and waveforms via the Operator Interface Module. The modular design of SSI will allow straightforward extension of the software system within and outside the boundaries of CTBTO station requirements. An alpha version of SSI is currently undergoing extensive testing in the lab and on site.

  18. Optical Properties of Aerosols from Long Term Ground-Based Aeronet Measurements

    NASA Technical Reports Server (NTRS)

    Holben, B. N.; Tanre, D.; Smirnov, A.; Eck, T. F.; Slutsker, I.; Dubovik, O.; Lavenu, F.; Abuhassen, N.; Chatenet, B.

    1999-01-01

    AERONET is an optical ground-based aerosol monitoring network and data archive supported by NASA's Earth Observing System and expanded by federation with many non-NASA institutions, including AEROCAN (AERONET CANada) and PHOTON (PHOtométrie pour le Traitement Opérationnel de Normalisation Satellitaire). The network hardware consists of identical automatic sun-sky scanning spectral radiometers owned by national agencies and universities and purchased for their own monitoring and research objectives. Data are transmitted hourly through the data collection system (DCS) on board the geostationary meteorological satellites GMS, GOES and METEOSAT and received in a common archive for daily processing utilizing a peer-reviewed series of algorithms, thus imposing standardization and quality control on the product database. Data from this collaboration provide globally distributed near-real-time observations of aerosol spectral optical depths, aerosol size distributions, and precipitable water in diverse aerosol regimes. Access to the AERONET database has shifted from the interactive program 'demonstrat' (reserved for PIs) to the AERONET homepage, allowing faster access and greater development for GIS object-oriented retrievals and analysis with companion geocoded data sets from, for example, satellites, LIDAR, and solar flux measurements. We feel that a significant yet underutilized component of the AERONET database is the set of inversion products made from hourly principal plane and almucantar measurements. The current inversions have been shown to retrieve aerosol volume size distributions. A significant enhancement to the inversion code has been developed and is presented in these proceedings.

  19. Network Level Carbon Dioxide Emissions From On-road Sources in the Portland, OR (USA) Metropolitan Area

    NASA Astrophysics Data System (ADS)

    Powell, J.; Butenhoff, C. L.; Rice, A. L.

    2014-12-01

    To mitigate climate change, governments at multiple levels are developing policies to decrease anthropogenic carbon dioxide (CO2) emissions. The City of Portland (Oregon) and Multnomah County have adopted a Climate Action Plan with a stated goal of reducing emissions to 80% below 1990 levels by 2050. The transportation sector alone accounts for about 40% of total emissions in the Portland metropolitan area. Here we show a new street-level model of on-road mobile CO2 emissions for the Portland, OR metropolitan region. The model uses hourly traffic counter recordings made by the Portland Bureau of Transportation at 9,352 sites over 21 years (1986-2006), augmented with freeway loop detector data from the Portland Regional Transportation Archive Listing (PORTAL) transportation data archive. We constructed a land use regression model to fill in traffic network gaps with traffic counts as the dependent variable using GIS data such as road class (32 categories) and population density. The Environmental Protection Agency (EPA) MOtor Vehicle Emission Simulator (MOVES) model was used to estimate transportation CO2 emissions. The street-level emissions can be aggregated and gridded and used as input to atmospheric transport models for comparison with atmospheric measurements. This model also provides an independent assessment of top-down inventories that determine emissions from fuel sales, while being an important component of our ongoing effort to assess the effectiveness of emission mitigation strategies at the urban scale.
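
    Schematically, the gap-filling regression might look like the R sketch below; the data frames and column names (`counted`, `gaps`, count, road_class as a 32-level factor, pop_density) are hypothetical stand-ins for the GIS predictors described above, not the study's actual variables.

      # Land-use regression sketch: predict traffic counts at road
      # segments without counters from road class and population density.
      fit <- lm(log(count) ~ road_class + pop_density, data = counted)
      gaps$count_hat <- exp(predict(fit, newdata = gaps))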

  20. Data Delivery Latency Improvements And First Steps Towards The Distributed Computing Of The Caltech/USGS Southern California Seismic Network Earthquake Early Warning System

    NASA Astrophysics Data System (ADS)

    Stubailo, I.; Watkins, M.; Devora, A.; Bhadha, R. J.; Hauksson, E.; Thomas, V. I.

    2016-12-01

    The USGS/Caltech Southern California Seismic Network (SCSN) is a modern digital ground-motion seismic network. It develops and maintains Earthquake Early Warning (EEW) data collection and delivery systems in southern California as well as real-time EEW algorithms. Recently, Behr et al. (SRL, 2016) analyzed data from several regional seismic networks deployed around the globe and showed that the SCSN was the network with the smallest data communication delays, or latency. Since then, we have further reduced the telemetry delays for many of the 330 current sites: the latency has been reduced from 2-6 s to 0.4 s on average by tuning datalogger parameters and/or deploying software upgrades. Recognizing latency as one of the crucial parameters in EEW, we have started archiving the per-packet latencies in miniSEED format for all participating sites, similar to the way seismic waveform data are traditionally archived. The archived latency values enable us to understand and document long-term changes in the performance of the telemetry links, and we can retroactively investigate how latent the waveform data were during a specific event or time period. In addition, the near-real-time latency values are useful for monitoring and displaying real-time station latency, in particular to compare different telemetry technologies. A future step to reduce latency is to deploy the algorithms on the dataloggers at the seismic stations and transmit either the final solutions or intermediate parameters to a central processing center. To implement this approach, we are developing a stand-alone version of the OnSite algorithm to run on the dataloggers in the field. This will increase the resiliency of the SCSN to potential telemetry restrictions in the immediate aftermath of a large earthquake, either by allowing local alarming by a single station or by permitting transmission of lightweight parametric information rather than continuous waveform data to the central processing facility. State-of-the-art development of Internet of Things (IoT) tools and platforms, which can be used to distribute and maintain software on a large number of remote devices, is making this approach to earthquake early warning more feasible.
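
    Once per-packet latencies are archived alongside the waveforms, long-term telemetry performance can be summarized in a few lines of R; the data frame `lat`, with columns station and latency_s, is a hypothetical stand-in for values decoded from the archived records.

      # Per-station latency summary for documenting long-term telemetry
      # performance: median and 95th percentile, in seconds.
      med <- aggregate(latency_s ~ station, data = lat, FUN = median)
      p95 <- aggregate(latency_s ~ station, data = lat,
                       FUN = quantile, probs = 0.95)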
