Download the current and legacy versions of the BenMAP program. Download configuration and aggregation/pooling/valuation files to estimate benefits. BenMAP-CE is free and open source software, and the source code is available upon request.
M2Lite: An Open-source, Light-weight, Pluggable and Fast Proteome Discoverer MSF to mzIdentML Tool.
Aiyetan, Paul; Zhang, Bai; Chen, Lily; Zhang, Zhen; Zhang, Hui
2014-04-28
Proteome Discoverer is one of many tools used for protein database searching and peptide-to-spectrum assignment in mass spectrometry-based proteomics. However, the inadequacy of conversion tools makes it challenging to compare and integrate its results with those of other analytical tools. Here we present M2Lite, an open-source, light-weight, easily pluggable and fast conversion tool. M2Lite converts Proteome Discoverer-derived MSF files to the proteomics community-defined standard, the mzIdentML file format. M2Lite's source code is available as open-source at https://bitbucket.org/paiyetan/m2lite/src and its compiled binaries and documentation can be freely downloaded at https://bitbucket.org/paiyetan/m2lite/downloads.
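As a hedged illustration of what the converted output enables (not part of M2Lite itself), the mzIdentML produced by such a tool can be read with any PSI-compliant parser; the sketch below uses the Python pyteomics library, and the file name is a placeholder.

```python
from pyteomics import mzid

# Field names follow the PSI mzIdentML schema; the file name is a placeholder.
with mzid.read("sample_converted.mzid", retrieve_refs=True) as psms:
    for result in psms:                    # one SpectrumIdentificationResult per iteration
        for item in result.get("SpectrumIdentificationItem", []):
            print(result.get("spectrumID"), item.get("PeptideSequence"), item.get("rank"))
            break                          # show only the top-ranked match per spectrum
```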
A High-Resolution Stopwatch for Cents
ERIC Educational Resources Information Center
Gingl, Z.; Kopasz, K.
2011-01-01
A very low-cost, easy-to-make stopwatch is presented to support various experiments in mechanics. The high-resolution stopwatch is based on two photodetectors connected directly to the microphone input of a sound card. Dedicated free open-source software has been developed and made available to download. The efficiency is demonstrated by a free…
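A minimal sketch of the timing idea described above, assuming the two light-gate pulses have already been recorded through the microphone input into a WAV file; the file name and threshold are illustrative, and this is not the authors' software.

```python
import numpy as np
from scipy.io import wavfile

rate, data = wavfile.read("gate_recording.wav")    # recording of both photodetector pulses
signal = np.abs(data.astype(float))
if signal.ndim > 1:                                # collapse stereo to one channel
    signal = signal.max(axis=1)

threshold = 0.5 * signal.max()
pulses = np.flatnonzero(signal > threshold)

first = pulses[0]
# skip samples belonging to the first pulse (10 ms dead time, illustrative)
second = pulses[pulses > first + int(0.01 * rate)][0]

print(f"elapsed time: {(second - first) / rate:.6f} s "
      f"(timer resolution ~ {1e6 / rate:.1f} microseconds per sample)")
```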
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iversen, C.M.; Powell, A.S.; McCormack, M.L.
The second version of the Fine-Root Ecology Database is available for download! Download the full FRED 2.0 data set, user guidance document, map, and list of data sources here. Prior to downloading the data, please read and follow the Data Use Guidelines, and it's worth checking out some tips for using FRED before you begin your analyses. Also, see here for an updating list of corrections to FRED 2.0.
Share, steal, or buy? A social cognitive perspective of music downloading.
LaRose, Robert; Kim, Junghyun
2007-04-01
The music downloading phenomenon presents a unique opportunity to examine normative influences on media consumption behavior. Downloaders face moral, legal, and ethical quandaries that can be conceptualized as normative influences within the self-regulatory mechanism of social cognitive theory. The music industry hopes to eliminate illegal file sharing and to divert illegal downloaders to pay services by asserting normative influence through selective prosecutions and public information campaigns. However, the deficient self-regulation of downloaders counters these efforts, maintaining file sharing as a persistent habit that defies attempts to establish normative control. The present research tests and extends the social cognitive theory of downloading on a sample of college students. The expected outcomes of downloading behavior and deficient self-regulation of that behavior were found to be important determinants of intentions to continue downloading. Consistent with social cognitive theory but in contrast to the theory of planned behavior, it was found that descriptive and prescriptive norms influenced deficient self-regulation but had no direct impact on behavioral intentions. Downloading intentions also had no direct relationship to either compact disc purchases or subscription to online pay music services.
Boros, L G; Lepow, C; Ruland, F; Starbuck, V; Jones, S; Flancbaum, L; Townsend, M C
1992-07-01
A powerful method of processing MEDLINE and CINAHL source data uploaded to the IBM 3090 mainframe computer through an IBM/PC is described. Data are first downloaded from the CD-ROM's PC devices to floppy disks. These disks are then uploaded to the mainframe computer through an IBM/PC equipped with the WordPerfect text editor and a computer network connection (SONNGATE). Before downloading, keywords specifying the information to be accessed are typed at the FIND prompt of the CD-ROM station. The resulting abstracts are downloaded into a file called DOWNLOAD.DOC. The floppy disks containing the information are simply carried to an IBM/PC which has a terminal emulation (TELNET) connection to the university-wide computer network (SONNET) at the Ohio State University Academic Computing Services (OSU ACS). WordPerfect (5.1) processes and saves the text in DOS format. Using the File Transfer Protocol (FTP, 130,000 bytes/s) of SONNET, the entire text containing the information obtained through the MEDLINE and CINAHL search is transferred to the remote mainframe computer for further processing. At this point, abstracts in the specified area are ready for immediate access and multiple retrieval by any PC having a network switch or dial-in connection after the USER ID, PASSWORD and ACCOUNT NUMBER are specified by the user. The system provides the user an on-line, very powerful and quick method of searching for words specifying diseases, agents, experimental methods, animals, authors, and journals in the research area downloaded. The user can also copy the TItles, AUthors and SOurce, with optional parts of abstracts, into papers being edited. This arrangement serves the special demands of a research laboratory by handling MEDLINE and CINAHL source data resulting after a search is performed with keywords specified for ongoing projects. Since the Ohio State University has a centrally funded mainframe system, the data upload, storage and mainframe operations are free.
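For readers who want to reproduce the transfer step with current tooling, here is a hedged sketch using Python's standard ftplib; the host, credentials and file name are placeholders, and the original workflow used the mainframe utilities described above rather than Python.

```python
from ftplib import FTP

# Placeholder host, credentials and file name; not the original OSU setup.
with FTP("mainframe.example.edu") as ftp:
    ftp.login(user="USERID", passwd="PASSWORD")
    with open("DOWNLOAD.DOC", "rb") as fh:
        ftp.storbinary("STOR DOWNLOAD.DOC", fh)   # upload the exported abstracts
    print(ftp.nlst())                             # confirm the file arrived
```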
Schroedinger’s code: Source code availability and transparency in astrophysics
NASA Astrophysics Data System (ADS)
Ryan, PW; Allen, Alice; Teuben, Peter
2018-01-01
Astronomers use software for their research, but how many of the codes they use are available as source code? We examined a sample of 166 papers from 2015 for clearly identified software use, then searched for source code for the software packages mentioned in these research papers. We categorized the software to indicate whether source code is available for download and whether there are restrictions to accessing it, and if source code was not available, whether some other form of the software, such as a binary, was. Over 40% of the source code for the software used in our sample was not available for download. As URLs have often been used as proxy citations for software, we also extracted URLs from one journal’s 2015 research articles, removed those from certain long-term, reliable domains, and tested the remainder to determine what percentage of these URLs were still accessible in September and October 2017.
Free and open source software for the manipulation of digital images.
Solomon, Robert W
2009-06-01
Free and open source software is a type of software that is nearly as powerful as commercial software but is freely downloadable. This software can do almost everything that the expensive programs can. GIMP (gnu image manipulation program) is the free program that is comparable to Photoshop, and versions are available for Windows, Macintosh, and Linux platforms. This article briefly describes how GIMP can be installed and used to manipulate radiology images. It is no longer necessary to budget large amounts of money for high-quality software to achieve the goals of image processing and document creation because free and open source software is available for the user to download at will.
Aiyetan, Paul; Zhang, Bai; Zhang, Zhen; Zhang, Hui
2014-01-01
Mass spectrometry based glycoproteomics has become a major means of identifying and characterizing previously N-linked glycan attached loci (glycosites). In the bottom-up approach, several factors, which include but are not limited to sample preparation, mass spectrometry analyses, and protein sequence database searches, result in previously N-linked peptide spectrum matches (PSMs) of varying lengths. Given that multiple PSMs can map to a glycosite, we reason that identified PSMs are varying-length peptide species of a unique set of glycosites. Because associated spectra of these PSMs are typically summed separately, true glycosite-associated spectra counts are lost or complicated. Also, these varying-length peptide species complicate protein inference, as smaller peptide sequences are more likely to map to more proteins than larger peptides or actual glycosite sequences. Here, we present XGlycScan. XGlycScan maps varying-length peptide species to glycosites to facilitate an accurate quantification of glycosite-associated spectra counts. We observed that this reduced the variability in reported identifications across mass spectrometry technical replicates of our sample dataset. We also observed that mapping identified peptides to glycosites provided an assessment of search-engine identification. Inherently, XGlycScan-reported glycosites reduce the complexity in protein inference. We implemented XGlycScan in the platform-independent Java programming language and have made it available as open source. XGlycScan's source code is freely available at https://bitbucket.org/paiyetan/xglycscan/src and its compiled binaries and documentation can be freely downloaded at https://bitbucket.org/paiyetan/xglycscan/downloads. The graphical user interface version can be found at https://bitbucket.org/paiyetan/xglycscangui/src and https://bitbucket.org/paiyetan/xglycscangui/downloads, respectively.
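The core mapping idea can be illustrated with a toy sketch (this is not XGlycScan's code): locate each identified peptide in the protein sequence and credit its spectra to every glycosite the peptide covers. The protein sequence, glycosite positions and PSM list below are made up for illustration.

```python
from collections import Counter

protein = "MKTLLNQSAVNKTEEWLNGSAAR"
glycosites = [5, 10, 17]          # 0-based positions of asparagines in N-X-S/T sequons

psms = ["LLNQSAVNK", "NQSAVNKTEEWLNGSAAR", "WLNGSAAR"]  # varying-length peptide species

site_counts = Counter()
for peptide in psms:
    start = protein.find(peptide)
    if start < 0:
        continue                   # peptide does not map to this protein
    covered = range(start, start + len(peptide))
    for site in glycosites:
        if site in covered:
            site_counts[site] += 1  # credit the spectrum to every covered glycosite

print(dict(site_counts))           # spectra counts aggregated at the glycosite level
```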
Scaling to diversity: The DERECHOS distributed infrastructure for analyzing and sharing data
NASA Astrophysics Data System (ADS)
Rilee, M. L.; Kuo, K. S.; Clune, T.; Oloso, A.; Brown, P. G.
2016-12-01
Integrating Earth Science data from diverse sources such as satellite imagery and simulation output can be expensive and time-consuming, limiting scientific inquiry and the quality of our analyses. Reducing these costs will improve innovation and quality in science. The current Earth Science data infrastructure focuses on downloading data based on requests formed from the search and analysis of associated metadata. And while the data products provided by archives may use the best available data sharing technologies, scientist end-users generally do not have such resources (including staff) available to them. Furthermore, only once an end-user has received the data from multiple diverse sources and has integrated them can the actual analysis and synthesis begin. The cost of getting from idea to where synthesis can start dramatically slows progress. In this presentation we discuss a distributed computational and data storage framework that eliminates much of the aforementioned cost. The SciDB distributed array database is central as it is optimized for scientific computing involving very large arrays, performing better than less specialized frameworks like Spark. Adding spatiotemporal functions to the SciDB creates a powerful platform for analyzing and integrating massive, distributed datasets. SciDB allows Big Earth Data analysis to be performed "in place" without the need for expensive downloads and end-user resources. Spatiotemporal indexing technologies such as the hierarchical triangular mesh enable the compute and storage affinity needed to efficiently perform co-located and conditional analyses minimizing data transfers. These technologies automate the integration of diverse data sources using the framework, a critical step beyond current metadata search and analysis. Instead of downloading data into their idiosyncratic local environments, end-users can generate and share data products integrated from diverse multiple sources using a common shared environment, turning distributed active archive centers (DAACs) from warehouses into distributed active analysis centers.
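A greatly simplified stand-in for the indexing idea, assuming only that records carry latitude, longitude and time: bucket records from two datasets by a coarse space-time key so that only co-located cells need to be compared. The real system uses a hierarchical triangular mesh inside SciDB; the cell sizes and records below are illustrative.

```python
from collections import defaultdict

def spacetime_key(lat, lon, t_hours, cell_deg=1.0, cell_hours=6):
    """Coarse index cell: 1 degree x 1 degree x 6 hours (illustrative resolution)."""
    return (int(lat // cell_deg), int(lon // cell_deg), int(t_hours // cell_hours))

def partition(records):
    buckets = defaultdict(list)
    for rec in records:                       # rec = (lat, lon, t_hours, value)
        buckets[spacetime_key(*rec[:3])].append(rec)
    return buckets

satellite = [(40.2, -105.1, 3.0, 271.5), (40.6, -104.8, 4.0, 270.9)]
model     = [(40.4, -105.0, 5.0, 269.8), (10.0,   20.0, 5.0, 300.0)]

sat_parts, mod_parts = partition(satellite), partition(model)
co_located = {k: (sat_parts[k], mod_parts[k]) for k in sat_parts.keys() & mod_parts.keys()}
print(co_located)   # only cells where both datasets have data need to be compared
```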
Weadock, William J; Londy, Frank J; Ellis, James H; Goldman, Edward B
2008-10-01
To determine the prevalence of protected health information (PHI) in PowerPoint presentations available for downloading from the Internet. No institutional review board approval was needed for this project, which involved no patient subjects. Two Google searches, each limited to PowerPoint files, were performed by using the criteria "Cardiac CT" and "Magnetic Resonance Imaging." The first 100 hits of each search were downloaded from the source Web site. The presentations were examined for the PHI contained on any images, links, or notes pages. Two hundred presentations were evaluated. There were 143 presentations with images, image links, or notes, and 52 (36%) of these contained PHI. There were 129 presentations containing radiologic images; 51 (40%) of these contained PHI, and 31 (24%) showed the patient's name. At least 132 (66%) of the 200 presentations originated from the United States. Thirty-five (37%) of 94 presentations with images, image links, or notes contained PHI. Eighty-six U.S. presentations contained radiologic images; 34 (40%) of these contained PHI, and 19 (22%) showed the patient's name. Online or other distributions of PowerPoint presentations that contain radiologic images often contain PHI, and this may violate laws, including the U.S. Health Insurance Portability and Accountability Act. (c) RSNA, 2008.
3D reconstruction software comparison for short sequences
NASA Astrophysics Data System (ADS)
Strupczewski, Adam; Czupryński, Błażej
2014-11-01
Large scale multiview reconstruction has recently become a very popular area of research. There are many open source tools that can be downloaded and run on a personal computer. However, there are few, if any, comparisons between all the available software packages in terms of accuracy on small datasets that a single user can create. The typical datasets for testing the software are archeological sites or cities comprising thousands of images. This paper presents a comparison of currently available open source multiview reconstruction software for small datasets. It also compares the open source solutions with a simple structure from motion pipeline developed by the authors from scratch with the use of the OpenCV and Eigen libraries.
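As a sketch of what such a from-scratch pipeline involves (not the authors' implementation), the following OpenCV snippet performs a single two-view reconstruction step; the image names and camera matrix K are placeholders for a user's own calibrated dataset.

```python
import cv2
import numpy as np

img1 = cv2.imread("view1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view2.jpg", cv2.IMREAD_GRAYSCALE)
K = np.array([[1000.0, 0, 640.0], [0, 1000.0, 360.0], [0, 0, 1.0]])  # assumed intrinsics

# Detect and match ORB features between the two views.
orb = cv2.ORB_create(4000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)

pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# Estimate relative pose and triangulate inlier correspondences.
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
inliers = mask.ravel() > 0
pts4d = cv2.triangulatePoints(P1, P2, pts1[inliers].T, pts2[inliers].T)
cloud = (pts4d[:3] / pts4d[3]).T
print(f"{inliers.sum()} inliers, reconstructed {len(cloud)} 3D points")
```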
An Analysis of Open Source Security Software Products Downloads
ERIC Educational Resources Information Center
Barta, Brian J.
2014-01-01
Despite the continued demand for open source security software, a gap in the identification of success factors related to the success of open source security software persists. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marsh, Amber; Harsch, Tim; Pitt, Julie
2007-08-31
The computer side of the IMAGE project consists of a collection of Perl scripts that perform a variety of tasks; scripts are available to insert, update and delete data from the underlying Oracle database, download data from NCBI's GenBank and other sources, and generate data files for download by interested parties. Web scripts make up the tracking interface, and various tools available on the project web site (image.llnl.gov) provide a search interface to the database.
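The GenBank download step can be sketched in Python with Biopython's Entrez utilities as a rough analogue of the project's Perl scripts; the contact e-mail address and accession number below are placeholders.

```python
from Bio import Entrez, SeqIO

Entrez.email = "you@example.org"                 # NCBI asks for a contact address
handle = Entrez.efetch(db="nucleotide", id="NM_000546", rettype="gb", retmode="text")
record = SeqIO.read(handle, "genbank")           # parse the returned GenBank record
handle.close()

print(record.id, len(record.seq), record.description)
```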
Recent Updates to the System Advisor Model (SAM)
DOE Office of Scientific and Technical Information (OSTI.GOV)
DiOrio, Nicholas A
The System Advisor Model (SAM) is a mature suite of techno-economic models for many renewable energy technologies that can be downloaded for free as a desktop application or software development kit. SAM is used for system-level modeling, including generating performance projections; among the recent updates covered here is the release of the code as an open source project on GitHub. Other additions that will be covered include the ability to download data directly into SAM from the National Solar Radiation Database (NSRDB) and updates to a user-interface macro that assists with PV system sizing. A brief update on SAM's battery model and its integration with the detailed photovoltaic model will also be discussed. Finally, an outline of planned work for the next year will be presented, including the addition of a bifacial model, support for multiple MPPT inputs for detailed inverter modeling, and the addition of a model for inverter thermal behavior.
Neuroimaging, Genetics, and Clinical Data Sharing in Python Using the CubicWeb Framework
Grigis, Antoine; Goyard, David; Cherbonnier, Robin; Gareau, Thomas; Papadopoulos Orfanos, Dimitri; Chauvat, Nicolas; Di Mascio, Adrien; Schumann, Gunter; Spooren, Will; Murphy, Declan; Frouin, Vincent
2017-01-01
In neurosciences or psychiatry, the emergence of large multi-center population imaging studies raises numerous technological challenges. From distributed data collection, across different institutions and countries, to final data publication service, one must handle the massive, heterogeneous, and complex data from genetics, imaging, demographics, or clinical scores. These data must be both efficiently obtained and downloadable. We present a Python solution, based on the CubicWeb open-source semantic framework, aimed at building population imaging study repositories. In addition, we focus on the tools developed around this framework to overcome the challenges associated with data sharing and collaborative requirements. We describe a set of three highly adaptive web services that transform the CubicWeb framework into a (1) multi-center upload platform, (2) collaborative quality assessment platform, and (3) publication platform endowed with massive-download capabilities. Two major European projects, IMAGEN and EU-AIMS, are currently supported by the described framework. We also present a Python package that enables end users to remotely query neuroimaging, genetics, and clinical data from scripts. PMID:28360851
NASA Astrophysics Data System (ADS)
Allen, Alice; Teuben, Peter J.; Ryan, P. Wesley
2018-05-01
We examined software usage in a sample set of astrophysics research articles published in 2015 and searched for the source codes for the software mentioned in these research papers. We categorized the software to indicate whether the source code is available for download and whether there are restrictions to accessing it, and if the source code is not available, whether some other form of the software, such as a binary, is. We also extracted hyperlinks from one journal’s 2015 research articles, as links in articles can serve as an acknowledgment of software use and lead to the data used in the research, and tested them to determine which of these URLs are still accessible. For our sample of 715 software instances in the 166 articles we examined, we were able to categorize 418 records according to whether source code was available and found that 285 unique codes were used, 58% of which offered the source code for download. Of the 2558 hyperlinks extracted from 1669 research articles, at best, 90% of them were available over our testing period.
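The link-checking step can be sketched as follows; the URL list, timeout and success criterion are illustrative rather than the exact procedure used in the study.

```python
import requests

urls = ["https://ascl.net/", "http://example.org/retired-code-page"]   # illustrative list

for url in urls:
    try:
        status = requests.get(url, timeout=10, allow_redirects=True).status_code
    except requests.RequestException as exc:
        status = f"unreachable ({exc.__class__.__name__})"
    print(url, status)
# A link would typically be counted as "available" if it returns a 2xx/3xx status.
```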
Specht, Michael; Kuhlgert, Sebastian; Fufezan, Christian; Hippler, Michael
2011-04-15
We present Proteomatic, an operating system independent and user-friendly platform that enables the construction and execution of MS/MS data evaluation pipelines using free and commercial software. Required external programs, such as those for peptide identification, are downloaded automatically in the case of free software. Due to a strict separation of functionality and presentation, and support for multiple scripting languages, new processing steps can be added easily. Proteomatic is implemented in C++/Qt; scripts are implemented in Ruby, Python and PHP. All source code is released under the LGPL. Source code and installers for Windows, Mac OS X, and Linux are freely available at http://www.proteomatic.org. Contact: michael.specht@uni-muenster.de. Supplementary data are available at Bioinformatics online.
Free for All: Open Source Software
ERIC Educational Resources Information Center
Schneider, Karen
2008-01-01
Open source software has become a catchword in libraryland. Yet many remain unclear about open source's benefits--or even what it is. So what is open source software (OSS)? It's software that is free in every sense of the word: free to download, free to use, and free to view or modify. Most OSS is distributed on the Web and one doesn't need to…
PyCorrFit-generic data evaluation for fluorescence correlation spectroscopy.
Müller, Paul; Schwille, Petra; Weidemann, Thomas
2014-09-01
We present a graphical user interface (PyCorrFit) for the fitting of theoretical model functions to experimental data obtained by fluorescence correlation spectroscopy (FCS). The program supports many data file formats and features a set of tools specialized in FCS data evaluation. The Python source code is freely available for download from the PyCorrFit web page at http://pycorrfit.craban.de. We offer binaries for Ubuntu Linux, Mac OS X and Microsoft Windows. © The Author 2014. Published by Oxford University Press.
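As a generic illustration of the kind of fit such tools perform (not PyCorrFit's own code), the sketch below fits the standard 3D confocal diffusion model to a synthetic autocorrelation curve with SciPy; the lag times, noise level and structure parameter are made up.

```python
import numpy as np
from scipy.optimize import curve_fit

def g_3d(tau, n, tau_d, s=5.0):
    """G(tau) = (1/N) * (1 + tau/tau_D)^-1 * (1 + tau/(S^2 tau_D))^-1/2"""
    return (1.0 / n) / ((1.0 + tau / tau_d) * np.sqrt(1.0 + tau / (s**2 * tau_d)))

tau = np.logspace(-6, 0, 50)                                   # lag times in seconds
g_meas = g_3d(tau, n=2.0, tau_d=1e-3) + np.random.normal(0, 0.002, tau.size)

popt, pcov = curve_fit(g_3d, tau, g_meas, p0=(1.0, 1e-4))      # fit N and tau_D; S held fixed
print(f"N = {popt[0]:.2f}, tau_D = {popt[1]:.2e} s")
```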
A two dimensional study of rotor/airfoil interaction in hover
NASA Technical Reports Server (NTRS)
Lee, Chyang S.
1988-01-01
A two dimensional model for the chordwise flow near the wing tip of the tilt rotor in hover is presented. The airfoil is represented by vortex panels and the rotor is modeled by doublet panels. The rotor slipstream and the airfoil wake are simulated by free point vortices. Calculations on a 20 percent thick elliptical airfoil under a uniform rotor inflow are performed. Variations in rotor size, spacing between the rotor and the airfoil, ground effect, and the influence of upper surface blowing on download reduction are analyzed. Rotor size has only a minor influence on download when it is small. Increasing the rotor/airfoil spacing causes a gradual decrease in download. Proximity to the ground effectively reduces the download and makes the wake unsteady. The surface blowing changes the whole flow structure and significantly reduces the download within the assumption of a potential solution. Improvement of the present model is recommended to estimate the wall-jet-induced suction on the airfoil lower surface.
An All-Sky Portable (ASP) Optical Catalogue
NASA Astrophysics Data System (ADS)
Flesch, Eric Wim
2017-06-01
This optical catalogue combines the all-sky USNO-B1.0/A1.0 and most-sky APM catalogues, plus overlays of SDSS optical data, into a single all-sky map presented in a sparse binary format that is easily downloaded at 9 Gb zipped. Total count is 1 163 237 190 sources and each has J2000 astrometry, red and blue magnitudes with PSFs and variability indicator, and flags for proper motion, epoch, and source survey and catalogue for each of the photometry and astrometry. The catalogue is available on http://quasars.org/asp.html, and additional data for this paper is available at http://dx.doi.org/10.4225/50/5807fbc12595f.
NASA Astrophysics Data System (ADS)
Rajib, A.; Zhao, L.; Merwade, V.; Shin, J.; Smith, J.; Song, C. X.
2017-12-01
Despite the significant potential of remotely sensed earth observations, their application is still not widespread in water resources research, management and education. Inconsistent storage structures, data formats and spatial resolutions among different platforms/sources of earth observations hinder the use of these data. Available web services can help with bulk data downloading and visualization, but they are not sufficiently tailored to meet the degree of interoperability required for direct application of earth observations in hydrologic modeling at user-defined spatio-temporal scales. Similarly, the least ambiguous way for educators and watershed managers is to instantaneously obtain a time series for any watershed of interest without spending time and computational resources on data download and post-processing activities. To address this issue, an open access, online platform named HydroGlobe has been developed that minimizes all these processing tasks and delivers ready-to-use data from different earth observation sources. HydroGlobe provides spatially-averaged time series of earth observations given the following inputs: (i) data source, (ii) temporal extent in the form of start/end dates, and (iii) geographic units (e.g., grid cell or sub-basin boundary) and extent in the form of a GIS shapefile. In its preliminary version, HydroGlobe simultaneously handles five data sources including the surface and root zone soil moisture from SMAP (Soil Moisture Active Passive Mission), actual and potential evapotranspiration from MODIS (Moderate Resolution Imaging Spectroradiometer), and precipitation from GPM (Global Precipitation Measurements). This presentation will demonstrate the HydroGlobe interface and its applicability using a few test cases on watersheds from different parts of the globe.
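Conceptually, the service reduces to computing a spatially averaged time series over a watershed mask; the sketch below shows that reduction on synthetic arrays standing in for SMAP/MODIS/GPM grids and a basin polygon.

```python
import numpy as np

n_time, ny, nx = 10, 50, 60
grid = np.random.rand(n_time, ny, nx)          # synthetic daily fields (e.g. soil moisture)
watershed_mask = np.zeros((ny, nx), dtype=bool)
watershed_mask[10:30, 20:45] = True            # cells inside the basin polygon (illustrative)

masked = np.where(watershed_mask, grid, np.nan)  # broadcast the mask across time steps
series = np.nanmean(masked, axis=(1, 2))         # one spatially averaged value per time step
print(series.round(3))
```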
NASA Astrophysics Data System (ADS)
Fang, H.; Kato, H.; Rodell, M.; Teng, W. L.; Vollmer, B. E.
2008-12-01
The Global Land Data Assimilation System (GLDAS) has been generating a series of land surface state (e.g., soil moisture and surface temperature) and flux (e.g., evaporation and sensible heat flux) products, simulated by four land surface models (CLM, Mosaic, Noah and VIC). These products are now accessible at the Hydrology Data and Information Services Center (HDISC), a component of the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC). Current GLDAS data hosted at HDISC include a set of 1.0° data products, covering 1979 to the present, from the four models and a 0.25° data product, covering 2000 to the present, from the Noah model. In addition to the basic anonymous ftp data downloading, users can avail themselves of several advanced data search and downloading services, such as Mirador and OPeNDAP. Mirador is a Google-based search tool that provides keywords searching, on-the-fly spatial and parameter subsetting of selected data. OPeNDAP (Open-source Project for a Network Data Access Protocol) enables remote OPeNDAP clients to access OPeNDAP served data regardless of local storage format. Additional data services to be available in the near future from HDISC include (1) on-the-fly converter of GLDAS to NetCDF and binary data formats; (2) temporal aggregation of GLDAS files; and (3) Giovanni, an online visualization and analysis tool that provides a simple way to visualize, analyze, and access vast amounts of data without having to download the data.
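A hedged sketch of OPeNDAP access from Python with xarray is shown below; the dataset URL, variable and dimension names are placeholders rather than an exact GLDAS endpoint (real access may also require Earthdata authentication).

```python
import xarray as xr

# Placeholder OPeNDAP URL; real GLDAS granule URLs are listed by HDISC/GES DISC.
url = "https://example.gov/opendap/GLDAS_NOAH025_3H/sample_granule.nc4"
ds = xr.open_dataset(url)                      # reads metadata; values are fetched lazily

soil = ds["SoilMoi0_10cm_inst"]                # assumed variable name
subset = soil.sel(lat=slice(30, 45), lon=slice(-110, -90))   # assumed dimension names
print(float(subset.mean()))                    # only the requested subset is transferred
```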
Astrophysics Source Code Library: Incite to Cite!
NASA Astrophysics Data System (ADS)
DuPrie, K.; Allen, A.; Berriman, B.; Hanisch, R. J.; Mink, J.; Nemiroff, R. J.; Shamir, L.; Shortridge, K.; Taylor, M. B.; Teuben, P.; Wallen, J. F.
2014-05-01
The Astrophysics Source Code Library (ASCL, http://ascl.net/) is an on-line registry of over 700 source codes that are of interest to astrophysicists, with more being added regularly. The ASCL actively seeks out codes as well as accepting submissions from the code authors, and all entries are citable and indexed by ADS. All codes have been used to generate results published in or submitted to a refereed journal and are available either via a download site or from an identified source. In addition to being the largest directory of scientist-written astrophysics programs available, the ASCL is also an active participant in the reproducible research movement, with presentations at various conferences, numerous blog posts and a journal article. This poster provides a description of the ASCL and the changes that we are starting to see in the astrophysics community as a result of the work we are doing.
SEGY to ASCII: Conversion and Plotting Program
Goldman, Mark R.
1999-01-01
This report documents a computer program to convert standard 4-byte, IBM floating point SEGY files to ASCII xyz format. The program then optionally plots the seismic data using the GMT plotting package. The material for this publication is contained in a standard tar file (of99-126.tar) that is uncompressed and 726 K in size. It can be downloaded by any Unix machine. Move the tar file to the directory you wish to use it in, then type 'tar xvf of99-126.tar'. The archive files (and diskette) contain a NOTE file, a README file, a version-history file, source code, a makefile for easy compilation, and an ASCII version of the documentation. The archive files (and diskette) also contain example test files, including a typical SEGY file along with the resulting ASCII xyz and postscript files. Compiling the source code into an executable requires a C++ compiler. The program has been successfully compiled using Gnu's g++ version 2.8.1, and use of other compilers may require modifications to the existing source code. The g++ compiler is a free, high quality C++ compiler and may be downloaded from the ftp site: ftp://ftp.gnu.org/gnu Plotting the seismic data requires the GMT plotting package. The GMT plotting package may be downloaded from the web site: http://www.soest.hawaii.edu/gmt/
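The central step of the conversion is decoding 4-byte IBM floating point samples into ordinary floats. The original program is C++; the following standalone Python sketch shows the same decoding.

```python
import struct

def ibm32_to_float(raw: bytes) -> float:
    """Convert one big-endian 4-byte IBM/360 float to a Python float."""
    (word,) = struct.unpack(">I", raw)
    sign = -1.0 if word >> 31 else 1.0
    exponent = (word >> 24) & 0x7F          # base-16 exponent with bias 64
    fraction = (word & 0x00FFFFFF) / float(1 << 24)
    return sign * fraction * 16.0 ** (exponent - 64)

# Example: 0x42640000 encodes 100.0 in IBM format.
print(ibm32_to_float(bytes.fromhex("42640000")))
```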
caCORE: a common infrastructure for cancer informatics.
Covitz, Peter A; Hartel, Frank; Schaefer, Carl; De Coronado, Sherri; Fragoso, Gilberto; Sahni, Himanso; Gustafson, Scott; Buetow, Kenneth H
2003-12-12
Sites with substantive bioinformatics operations are challenged to build data processing and delivery infrastructure that provides reliable access and enables data integration. Locally generated data must be processed and stored such that relationships to external data sources can be presented. Consistency and comparability across data sets requires annotation with controlled vocabularies and, further, metadata standards for data representation. Programmatic access to the processed data should be supported to ensure the maximum possible value is extracted. Confronted with these challenges at the National Cancer Institute Center for Bioinformatics, we decided to develop a robust infrastructure for data management and integration that supports advanced biomedical applications. We have developed an interconnected set of software and services called caCORE. Enterprise Vocabulary Services (EVS) provide controlled vocabulary, dictionary and thesaurus services. The Cancer Data Standards Repository (caDSR) provides a metadata registry for common data elements. Cancer Bioinformatics Infrastructure Objects (caBIO) implements an object-oriented model of the biomedical domain and provides Java, Simple Object Access Protocol and HTTP-XML application programming interfaces. caCORE has been used to develop scientific applications that bring together data from distinct genomic and clinical science sources. caCORE downloads and web interfaces can be accessed from links on the caCORE web site (http://ncicb.nci.nih.gov/core). caBIO software is distributed under an open source license that permits unrestricted academic and commercial use. Vocabulary and metadata content in the EVS and caDSR, respectively, is similarly unrestricted, and is available through web applications and FTP downloads. http://ncicb.nci.nih.gov/core/publications contains links to the caBIO 1.0 class diagram and the caCORE 1.0 Technical Guide, which provide detailed information on the present caCORE architecture, data sources and APIs. Updated information appears on a regular basis on the caCORE web site (http://ncicb.nci.nih.gov/core).
A Cooperative Downloading Method for VANET Using Distributed Fountain Code.
Liu, Jianhang; Zhang, Wenbin; Wang, Qi; Li, Shibao; Chen, Haihua; Cui, Xuerong; Sun, Yi
2016-10-12
Cooperative downloading is one of the effective methods to improve the amount of downloaded data in vehicular ad hoc networking (VANET). However, the poor channel quality and short encounter time bring about a high packet loss rate, which decreases transmission efficiency and fails to satisfy the requirement of high quality of service (QoS) for some applications. Digital fountain code (DFC) can be utilized in the field of wireless communication to increase transmission efficiency. For cooperative forwarding, however, the processing delay caused by frequent encoding and decoding, as well as the single feedback mechanism of DFC, cannot adapt to the environment of VANET. In this paper, a cooperative downloading method for VANET using concatenated DFC is proposed to solve the problems above. The source vehicle and cooperative vehicles encode the raw data using a hierarchical fountain code before sending it to the client directly or indirectly. Although some packets may be lost, the client can recover the raw data as long as it receives enough encoded packets. The method avoids data retransmission due to packet loss. Furthermore, the concatenated feedback mechanism in the method reduces the transmission delay effectively. Simulation results indicate the benefits of the proposed scheme in terms of increasing the amount of downloaded data and the data receiving rate.
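The recovery property that the method relies on can be illustrated with a toy random-linear fountain code over GF(2): the client can rebuild the original blocks from any sufficiently large set of encoded packets, whichever individual packets were lost. This is a didactic sketch, not the hierarchical/concatenated scheme proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

def encode(blocks):
    """One encoded packet: (coefficient mask, XOR of the selected source blocks)."""
    mask = rng.integers(0, 2, size=len(blocks), dtype=np.uint8)
    if not mask.any():
        mask[rng.integers(len(blocks))] = 1
    payload = np.bitwise_xor.reduce([b for b, m in zip(blocks, mask) if m])
    return mask, payload

def decode(packets, k):
    """Gaussian elimination over GF(2); returns the k source blocks, or None."""
    A = np.array([m for m, _ in packets], dtype=np.uint8)
    B = np.array([p for _, p in packets], dtype=np.uint8)
    for col in range(k):
        pivot = next((r for r in range(col, len(A)) if A[r, col]), None)
        if pivot is None:
            return None                      # not enough independent packets yet
        A[[col, pivot]] = A[[pivot, col]]
        B[[col, pivot]] = B[[pivot, col]]
        for r in range(len(A)):
            if r != col and A[r, col]:
                A[r] ^= A[col]
                B[r] ^= B[col]
    return list(B[:k])

k, block_len = 4, 8
source = [rng.integers(0, 256, block_len, dtype=np.uint8) for _ in range(k)]
received = [encode(source) for _ in range(k + 4)]    # a few extra packets for redundancy
blocks = decode(received, k)
if blocks is None:
    print("not enough independent packets; keep listening")
else:
    print("recovered:", all((a == b).all() for a, b in zip(source, blocks)))
```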
jmzML, an open-source Java API for mzML, the PSI standard for MS data.
Côté, Richard G; Reisinger, Florian; Martens, Lennart
2010-04-01
We here present jmzML, a Java API for the Proteomics Standards Initiative mzML data standard. Based on the Java Architecture for XML Binding (JAXB) and an XPath-based random-access XML indexer, jmzML can handle arbitrarily large files in minimal memory, allowing easy and efficient processing of mzML files using the Java programming language. jmzML also automatically resolves internal XML references on-the-fly. The library (which includes a viewer) can be downloaded from http://jmzml.googlecode.com.
jTraML: an open source Java API for TraML, the PSI standard for sharing SRM transitions.
Helsens, Kenny; Brusniak, Mi-Youn; Deutsch, Eric; Moritz, Robert L; Martens, Lennart
2011-11-04
We here present jTraML, a Java API for the Proteomics Standards Initiative TraML data standard. The library provides fully functional classes for all elements specified in the TraML XSD document, as well as convenient methods to construct controlled vocabulary-based instances required to define SRM transitions. The use of jTraML is demonstrated via a two-way conversion tool between TraML documents and vendor specific files, facilitating the adoption process of this new community standard. The library is released as open source under the permissive Apache2 license and can be downloaded from http://jtraml.googlecode.com . TraML files can also be converted online at http://iomics.ugent.be/jtraml .
LSDCat: Detection and cataloguing of emission-line sources in integral-field spectroscopy datacubes
NASA Astrophysics Data System (ADS)
Herenz, Edmund Christian; Wisotzki, Lutz
2017-06-01
We present a robust, efficient, and user-friendly algorithm for detecting faint emission-line sources in large integral-field spectroscopic datacubes together with the public release of the software package Line Source Detection and Cataloguing (LSDCat). LSDCat uses a three-dimensional matched filter approach, combined with thresholding in signal-to-noise, to build a catalogue of individual line detections. In a second pass, the detected lines are grouped into distinct objects, and positions, spatial extents, and fluxes of the detected lines are determined. LSDCat requires only a small number of input parameters, and we provide guidelines for choosing appropriate values. The software is coded in Python and capable of processing very large datacubes in a short time. We verify the implementation with a source insertion and recovery experiment utilising a real datacube taken with the MUSE instrument at the ESO Very Large Telescope. The LSDCat software is available for download at http://muse-vlt.eu/science/tools and via the Astrophysics Source Code Library at http://ascl.net/1612.002
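A much-simplified illustration of the matched-filter detection idea (not LSDCat itself): cross-correlate the cube with a 3D Gaussian template, form a crude signal-to-noise cube, threshold it and group contiguous voxels into candidates. LSDCat propagates a full variance cube instead of the global noise estimate used here, and the cube below is synthetic.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
cube = rng.normal(0.0, 1.0, size=(40, 64, 64))     # (wavelength, y, x): pure noise
cube[18:23, 30:34, 30:34] += 4.0                    # inject a faint emission-line source

template_sigma = (1.5, 1.2, 1.2)                    # spectral and spatial template widths
filtered = ndimage.gaussian_filter(cube, sigma=template_sigma)

# Crude global noise estimate for unit-variance input; a real analysis would use
# the propagated per-voxel variance instead.
snr = filtered / np.std(filtered)

labels, n_detected = ndimage.label(snr > 8.0)
print(f"{n_detected} candidate emission-line source(s) above the S/N threshold")
```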
National Air Toxic Assessments (NATA) Results
The National Air Toxics Assessment was conducted by EPA in 2002 to assess air toxics emissions in order to identify and prioritize air toxics, emission source types and locations which are of greatest potential concern in terms of contributing to population risk. This data source provides downloadable information on emissions at the state, county and census tract level.
KiT: a MATLAB package for kinetochore tracking.
Armond, Jonathan W; Vladimirou, Elina; McAinsh, Andrew D; Burroughs, Nigel J
2016-06-15
During mitosis, chromosomes are attached to the mitotic spindle via large protein complexes called kinetochores. The motion of kinetochores throughout mitosis is intricate and automated quantitative tracking of their motion has already revealed many surprising facets of their behaviour. Here, we present 'KiT' (Kinetochore Tracking)-an easy-to-use, open-source software package for tracking kinetochores from live-cell fluorescent movies. KiT supports 2D, 3D and multi-colour movies, quantification of fluorescence, integrated deconvolution, parallel execution and multiple algorithms for particle localization. KiT is free, open-source software implemented in MATLAB and runs on all MATLAB supported platforms. KiT can be downloaded as a package from http://www.mechanochemistry.org/mcainsh/software.php The source repository is available at https://bitbucket.org/jarmond/kit and under continuing development. Supplementary data are available at Bioinformatics online. jonathan.armond@warwick.ac.uk. © The Author 2016. Published by Oxford University Press.
Implementation and Testing of Low Cost UAV Platform for Orthophoto Imaging
NASA Astrophysics Data System (ADS)
Brucas, D.; Suziedelyte-Visockiene, J.; Ragauskas, U.; Berteska, E.; Rudinskas, D.
2013-08-01
The implementation of Unmanned Aerial Vehicles for civilian applications is rapidly increasing. Technologies which were expensive and available only for military use have recently spread to the civilian market. A vast number of low-cost, open source components and systems are available for implementation on UAVs. Using low-cost hobby and open source components ensures a considerable decrease in UAV price, though in some cases compromising reliability. At the Space Science and Technology Institute (SSTI), in collaboration with Vilnius Gediminas Technical University (VGTU), research has been performed on constructing and implementing small UAVs composed of low-cost open source components (and in-house developments). The most obvious and simple application of such UAVs is orthophoto imaging, with data download and processing after the flight. The construction and implementation of the UAVs, flight experience, data processing and data implementation are further covered in the paper and presentation.
Simple and detailed conceptual model diagram and associated narrative for ammonia, dissolved oxygen, flow alteration, herbicides, insecticides, ionic strength, metals, nutrients, ph, physical habitat, sediments, temperature, unspecified toxic chemicals.
NASA Astrophysics Data System (ADS)
Takaesu, M.; Horikawa, H.; Sueki, K.; Kamiya, S.; Nakamura, T.; Nakano, M.; Takahashi, N.; Sonoda, A.; Tsuboi, S.
2014-12-01
Mega-thrust earthquakes are anticipated to occur in the Nankai Trough in southwest Japan. In the source areas, we installed the seafloor seismic network DONET (Dense Ocean-floor Network System for Earthquakes and Tsunamis) in 2010 in order to monitor seismicity, crustal deformation, and tsunamis. The DONET system consists of 20 stations in total, each composed of six kinds of sensors: strong-motion and broadband seismometers, quartz and differential pressure gauges, a hydrophone, and a thermometer. The stations are densely distributed with an average spatial interval of 15-20 km and cover near-coastal areas to the trench axis. Observed data are transferred to a land station through a fiber-optic cable and then to the JAMSTEC (Japan Agency for Marine-Earth Science and Technology) data management center through a private network in real time. The data are in WIN32 format in the private network and are finally archived in SEED format in the management center to combine waveform data with related metadata. We are developing a web-based application system to easily download DONET seismic waveform data. In this system, users can select 20 Hz broadband (BH type) and 200 Hz strong-motion (EH type) data and download them in SEED. Users can also search events by time period, magnitude, source area and depth in a GUI platform. Event data are produced referring to event catalogues from the USGS and JMA (Japan Meteorological Agency). The magnitude thresholds for production are M6 for far-field events using the USGS list and M4 for local events using the JMA list. Available data lengths depend on magnitudes and epicentral distances. In this presentation, we briefly introduce the DONET stations and then show our developed application system. We release DONET data through the system and want them to be widely recognized and analyzed by many users. We also discuss plans for further development of the system.
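Once downloaded, SEED/miniSEED waveforms can be inspected with standard Python seismology tooling; the hedged sketch below uses ObsPy (not part of the DONET system), and the file name and trim window are placeholders.

```python
from obspy import read, UTCDateTime

stream = read("donet_event.mseed")            # parses SEED/miniSEED volumes into Traces
print(stream)                                 # one Trace per station/channel

t0 = UTCDateTime("2014-10-11T03:35:00")       # assumed origin time
stream.trim(t0, t0 + 600)                     # cut a 10-minute window
stream.detrend("demean")
stream.plot()                                 # quick-look record section (needs matplotlib)
```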
Expression Atlas: gene and protein expression across multiple studies and organisms
Tang, Y Amy; Bazant, Wojciech; Burke, Melissa; Fuentes, Alfonso Muñoz-Pomer; George, Nancy; Koskinen, Satu; Mohammed, Suhaib; Geniza, Matthew; Preece, Justin; Jarnuczak, Andrew F; Huber, Wolfgang; Stegle, Oliver; Brazma, Alvis; Petryszak, Robert
2018-01-01
Abstract Expression Atlas (http://www.ebi.ac.uk/gxa) is an added value database that provides information about gene and protein expression in different species and contexts, such as tissue, developmental stage, disease or cell type. The available public and controlled access data sets from different sources are curated and re-analysed using standardized, open source pipelines and made available for queries, download and visualization. As of August 2017, Expression Atlas holds data from 3,126 studies across 33 different species, including 731 from plants. Data from large-scale RNA sequencing studies including Blueprint, PCAWG, ENCODE, GTEx and HipSci can be visualized next to each other. In Expression Atlas, users can query genes or gene-sets of interest and explore their expression across or within species, tissues, developmental stages in a constitutive or differential context, representing the effects of diseases, conditions or experimental interventions. All processed data matrices are available for direct download in tab-delimited format or as R-data. In addition to the web interface, data sets can now be searched and downloaded through the Expression Atlas R package. Novel features and visualizations include the on-the-fly analysis of gene set overlaps and the option to view gene co-expression in experiments investigating constitutive gene expression across tissues or other conditions. PMID:29165655
An Adjunct Galilean Satellite Orbiter Using a Small Radioisotope Power Source
NASA Technical Reports Server (NTRS)
Abelson, Robert Dean; Randolph, J.; Alkalai, L.; Collins, D.; Moore, W.
2005-01-01
This is a conceptual mission study intended to demonstrate the range of possible missions and applications that could be enabled were a new generation of Small Radioisotope Power Systems to be developed by NASA and DOE. While such systems are currently being considered by NASA and DOE, they do not currently exist. This study is one of several small RPS-enabled mission concepts that were studied and presented in the NASA/JPL document "Enabling Exploration with Small Radioisotope Power Systems" available at: http://solarsystem.nasa.gov/multimedia/download-detail.cfm?DL_ID=82
Standards-Based Open-Source Planetary Map Server: Lunaserv
NASA Astrophysics Data System (ADS)
Estes, N. M.; Silva, V. H.; Bowley, K. S.; Lanjewar, K. K.; Robinson, M. S.
2018-04-01
Lunaserv is a planetary capable Web Map Service developed by the LROC SOC. It enables researchers to serve their own planetary data to a wide variety of GIS clients without any additional processing or download steps.
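Because Lunaserv speaks the standard WMS protocol, even a GIS-less client can request map images with an ordinary GetMap call; in the sketch below the endpoint URL and layer name are placeholders to be replaced with values from the Lunaserv documentation.

```python
import requests

params = {
    "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
    "LAYERS": "example_basemap",               # assumed layer name
    "STYLES": "",
    "CRS": "EPSG:4326",
    "BBOX": "-45,-90,45,90",                   # lat/lon axis order for WMS 1.3.0 + EPSG:4326
    "WIDTH": "1024", "HEIGHT": "512",
    "FORMAT": "image/png",
}
resp = requests.get("https://wms.example.edu/", params=params, timeout=60)
resp.raise_for_status()
with open("map_tile.png", "wb") as fh:
    fh.write(resp.content)                     # save the rendered map image
```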
The Baghdad that Was: Using Primary Sources to Teach World History
ERIC Educational Resources Information Center
Schur, Joan Brodsky
2009-01-01
That primary source documents have the power to bring the past alive is no news to social studies teachers. What is new in the last 10 years is the number of digitized documents available online that teachers can download and use in their classrooms. Encouraging teachers to utilize this ever-increasing treasure trove of resources was the goal of…
The State of Open Source Electronic Health Record Projects: A Software Anthropology Study
2017-01-01
Background Electronic health records (EHR) are a key tool in managing and storing patients’ information. Currently, there are over 50 open source EHR systems available. Functionality and usability are important factors for determining the success of any system. These factors are often a direct reflection of the domain knowledge and developers’ motivations. However, few published studies have focused on the characteristics of free and open source software (F/OSS) EHR systems and none to date have discussed the motivation, knowledge background, and demographic characteristics of the developers involved in open source EHR projects. Objective This study analyzed the characteristics of prevailing F/OSS EHR systems and aimed to provide an understanding of the motivation, knowledge background, and characteristics of the developers. Methods This study identified F/OSS EHR projects on SourceForge and other websites from May to July 2014. Projects were classified and characterized by license type, downloads, programming languages, spoken languages, project age, development status, supporting materials, top downloads by country, and whether they were “certified” EHRs. Health care F/OSS developers were also surveyed using an online survey. Results At the time of the assessment, we uncovered 54 open source EHR projects, but only four of them had been successfully certified under the Office of the National Coordinator for Health Information Technology (ONC Health IT) Certification Program. In the majority of cases, the open source EHR software was downloaded by users in the United States (64.07%, 148,666/232,034), underscoring that there is a significant interest in EHR open source applications in the United States. A survey of EHR open source developers was conducted and a total of 103 developers responded to the online questionnaire. The majority of EHR F/OSS developers (65.3%, 66/101) are participating in F/OSS projects as part of a paid activity and only 25.7% (26/101) of EHR F/OSS developers are, or have been, health care providers in their careers. In addition, 45% (45/99) of developers do not work in the health care field. Conclusion The research presented in this study highlights some challenges that may be hindering the future of health care F/OSS. A minority of developers have been health care professionals, and only 55% (54/99) work in the health care field. This undoubtedly limits the ability of functional design of F/OSS EHR systems from being a competitive advantage over prevailing commercial EHR systems. Open source software seems to be a significant interest to many; however, given that only four F/OSS EHR systems are ONC-certified, this interest is unlikely to yield significant adoption of these systems in the United States. Although the Health Information Technology for Economic and Clinical Health (HITECH) act was responsible for a substantial infusion of capital into the EHR marketplace, the lack of a corporate entity in most F/OSS EHR projects translates to a marginal capacity to market the respective F/OSS system and to navigate certification. This likely has further disadvantaged F/OSS EHR adoption in the United States. PMID:28235750
This program serves two purposes: (1) as a general-purpose indoor exposure model in buildings with multiple zones, multiple chemicals and multiple sources and sinks, and (2) as a special-purpose concentration model
NASA Astrophysics Data System (ADS)
Neher, Peter F.; Stieltjes, Bram; Reisert, Marco; Reicht, Ignaz; Meinzer, Hans-Peter; Fritzsche, Klaus H.
2012-02-01
Fiber tracking algorithms yield valuable information for neurosurgery as well as automated diagnostic approaches. However, they have not yet arrived in daily clinical practice. In this paper we present an open source integration of the global tractography algorithm proposed by Reisert et al. [1] into the open source Medical Imaging Interaction Toolkit (MITK) developed and maintained by the Division of Medical and Biological Informatics at the German Cancer Research Center (DKFZ). The integration of this algorithm into a standardized and open development environment like MITK enriches accessibility of tractography algorithms for the scientific community and is an important step towards bringing neuronal tractography closer to clinical application. The MITK diffusion imaging application, downloadable from www.mitk.org, combines all the steps necessary for a successful tractography: preprocessing, reconstruction of the images, the actual tracking, live monitoring of intermediate results, postprocessing and visualization of the final tracking results. This paper presents typical tracking results and demonstrates the steps for pre- and post-processing of the images.
Closer Look at Cancer Imaging Tools
... download on the Apple Store and Google Play. SOURCE: National Institute of Biomedical Imaging and Bioengineering: Medical Scans, Spring 2018 Issue: Volume 13, Number 1, Page 11.
Gamma-Insensitive Fast Neutron Detector with Spectral Source Identification Potential
2011-03-01
MedlinePlus produces XML data sets that you are welcome to download and use. If you have questions about the MedlinePlus XML files, please contact us. For additional sources of MedlinePlus data in XML format, visit our Web service page, ...
A Guide to Lowering Test Scores.
ERIC Educational Resources Information Center
Rosenblum, Shelly; Spark, Barbara
2002-01-01
Discusses the adverse impact of poor classroom air quality on student performance and how school officials can eliminate the sources of indoor air pollution. Describes Environmental Protection Agency's "Indoor Air Quality Tools for Schools" program downloadable at www.epa.gov/iaq/schools/index.html. (PKP)
GIS Data Downloads | USDA Plant Hardiness Zone Map
Jackson, M E; Gnadt, J W
1999-03-01
The object-oriented graphical programming language LabView was used to implement the numerical solution to a computational model of saccade generation in primates. The computational model simulates the activity and connectivity of anatomical structures known to be involved in saccadic eye movements. The LabView program provides a graphical user interface to the model that makes it easy to observe and modify the behavior of each element of the model. Essential elements of the source code of the LabView program are presented and explained. A copy of the model is available for download from the internet.
Are YouTube videos accurate and reliable on basic life support and cardiopulmonary resuscitation?
Yaylaci, Serpil; Serinken, Mustafa; Eken, Cenker; Karcioglu, Ozgur; Yilmaz, Atakan; Elicabuk, Hayri; Dal, Onur
2014-10-01
The objective of this study is to investigate the reliability and accuracy of the information in YouTube videos related to CPR and BLS with respect to the 2010 CPR guidelines. YouTube was queried using four search terms, 'CPR', 'cardiopulmonary resuscitation', 'BLS' and 'basic life support', between 2011 and 2013. The sources that uploaded the videos, the recording time, the number of viewers in the study period, and the inclusion of humans or manikins were recorded. The videos were rated on whether they displayed the correct order of resuscitative efforts in full accord with the 2010 CPR guidelines. Two hundred and nine videos meeting the inclusion criteria comprised the study sample subjected to the analysis. The median score of the videos was 5 (IQR: 3.5-6). Only 11.5% (n = 24) of the videos were found to be compatible with the 2010 CPR guidelines with regard to the sequence of interventions. Videos uploaded by 'Guideline bodies' had significantly higher rates of download when compared with the videos uploaded by other sources. The sources of the videos and the date of upload (year) were not shown to have any significant effect on the scores received (P = 0.615 and 0.513, respectively). The number of downloads did not differ between videos compatible and incompatible with the guidelines (P = 0.832). The videos downloaded more than 10,000 times had a higher score than the others (P = 0.001). The majority of YouTube video clips purporting to be about CPR are not relevant educational material. Of those that are focused on teaching CPR, only a small minority optimally meet the 2010 Resuscitation Guidelines. © 2014 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.
VizieR Online Data Catalog: A spectral approach to transit timing variations (Ofir+, 2018)
NASA Astrophysics Data System (ADS)
Ofir, A.; Xie, J.-W.; Jiang, C.-F.; Sari, R.; Aharonson, O.
2018-03-01
We used Kepler Data Release 24 as the source data and the Kepler Objects of Interest (KOIs) table downloaded from the NExSci archive on 2015 December 25 as the source of the list of candidate signals, and processed 4706 objects not dispositioned as "false positive". We removed eclipsing binaries from the candidates list (see section 4.1 for further details). (1 data file).
Performance Assessment of Network Intrusion-Alert Prediction
2012-09-01
In this thesis, we use Snort to generate the intrusion detection alerts. Snort is an open source network intrusion detection and prevention system that has become a de facto standard for IPS (Snort, 2012). We chose Snort because it is an open source product that is free to download and can be deployed cross-platform.
Community Coordinated Modeling Center Support of Science Needs for Integrated Data Environment
NASA Technical Reports Server (NTRS)
Kuznetsova, M. M.; Hesse, M.; Rastatter, L.; Maddox, M.
2007-01-01
Space science models are an essential component of an integrated data environment. They are indispensable tools for facilitating effective use of a wide variety of distributed scientific sources and for placing multi-point local measurements into a global context. The Community Coordinated Modeling Center (CCMC) hosts a set of state-of-the-art space science models ranging from the solar atmosphere to the Earth's upper atmosphere. The majority of models residing at the CCMC are comprehensive, computationally intensive, physics-based models. To allow the models to be driven by data relevant to particular events, the CCMC developed an online data file generation tool that automatically downloads data from data providers and transforms them to the required format. The CCMC provides a tailored web-based visualization interface for the model output, the capability to download simulation output in a portable standard format with comprehensive metadata, and a user-friendly model-output analysis library of routines that can be called from any language supporting C. The CCMC is also developing data interpolation tools that present model output in the same format as observations. The CCMC invites community comments and suggestions to better address science needs for the integrated data environment.
Understanding and Using the Fermi Science Tools
NASA Astrophysics Data System (ADS)
Asercion, Joseph
2018-01-01
The Fermi Science Support Center (FSSC) provides information, documentation, and tools for the analysis of Fermi science data, including both the Large Area Telescope (LAT) and the Gamma-ray Burst Monitor (GBM). Source and binary versions of the Fermi Science Tools can be downloaded from the FSSC website and are supported on multiple platforms. An overview document, the Cicerone, provides details of the Fermi mission, the science instruments and their response functions, the science data preparation and analysis process, and the interpretation of results. Analysis Threads and a reference manual available on the FSSC website provide the user with step-by-step instructions for many different types of data analysis: point source analysis (generating maps, spectra, and light curves), pulsar timing analysis, source identification, and the use of python for scripting customized analysis chains. We present an overview of the structure of the Fermi science tools and documentation and how to acquire them. We also provide examples of standard analyses, including tips and tricks for improving Fermi science analysis.
Crowd-sourcing relative preferences for ecosystem services in the St. Louis River AOC
Analysis of ecosystem service tradeoffs among project scenarios is more reliable when valuation data are available. Empirical valuation data are expensive and difficult to collect. As a possible alternative or supplement to empirical data, we downloaded and classified images from...
This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling:• SDMProjectBuilder (which includes the Microbial Source Module as part...
75 FR 35457 - Draft of the 2010 Causal Analysis/Diagnosis Decision Information System (CADDIS)
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-22
... Causal Analysis/Diagnosis Decision Information System (CADDIS) AGENCY: Environmental Protection Agency... site, ``2010 release of the Causal Analysis/Diagnosis Decision Information System (CADDIS).'' The... analyses, downloadable software tools, and links to outside information sources. II. How to Submit Comments...
Navigating protected genomics data with UCSC Genome Browser in a Box.
Haeussler, Maximilian; Raney, Brian J; Hinrichs, Angie S; Clawson, Hiram; Zweig, Ann S; Karolchik, Donna; Casper, Jonathan; Speir, Matthew L; Haussler, David; Kent, W James
2015-03-01
Genome Browser in a Box (GBiB) is a small virtual machine version of the popular University of California Santa Cruz (UCSC) Genome Browser that can be run on a researcher's own computer. Once GBiB is installed, a standard web browser is used to access the virtual server and add personal data files from the local hard disk. Annotation data are loaded on demand through the Internet from UCSC or can be downloaded to the local computer for faster access. Software downloads and installation instructions are freely available for non-commercial use at https://genome-store.ucsc.edu/. GBiB requires the installation of open-source software VirtualBox, available for all major operating systems, and the UCSC Genome Browser, which is open source and free for non-commercial use. Commercial use of GBiB and the Genome Browser requires a license (http://genome.ucsc.edu/license/). © The Author 2014. Published by Oxford University Press.
The State of Open Source Electronic Health Record Projects: A Software Anthropology Study.
Alsaffar, Mona; Yellowlees, Peter; Odor, Alberto; Hogarth, Michael
2017-02-24
Electronic health records (EHR) are a key tool in managing and storing patients' information. Currently, there are over 50 open source EHR systems available. Functionality and usability are important factors for determining the success of any system, and these factors are often a direct reflection of the developers' domain knowledge and motivations. However, few published studies have focused on the characteristics of free and open source software (F/OSS) EHR systems, and none to date have discussed the motivation, knowledge background, and demographic characteristics of the developers involved in open source EHR projects. This study analyzed the characteristics of prevailing F/OSS EHR systems and aimed to provide an understanding of the motivation, knowledge background, and characteristics of the developers. The study identified F/OSS EHR projects on SourceForge and other websites from May to July 2014. Projects were classified and characterized by license type, downloads, programming languages, spoken languages, project age, development status, supporting materials, top downloads by country, and whether they were "certified" EHRs. Health care F/OSS developers were also surveyed online. At the time of the assessment, we uncovered 54 open source EHR projects, but only four of them had been successfully certified under the Office of the National Coordinator for Health Information Technology (ONC Health IT) Certification Program. In the majority of cases, the open source EHR software was downloaded by users in the United States (64.07%, 148,666/232,034), underscoring a significant interest in open source EHR applications in the United States. A total of 103 developers responded to the online questionnaire. The majority of EHR F/OSS developers (65.3%, 66/101) participate in F/OSS projects as part of a paid activity, and only 25.7% (26/101) of EHR F/OSS developers are, or have been, health care providers in their careers. In addition, 45% (45/99) of developers do not work in the health care field. The research presented in this study highlights some challenges that may be hindering the future of health care F/OSS. A minority of developers have been health care professionals, and only 55% (54/99) work in the health care field. This undoubtedly limits the ability of F/OSS EHR systems to make functional design a competitive advantage over prevailing commercial EHR systems. Open source software appears to be of significant interest to many; however, given that only four F/OSS EHR systems are ONC-certified, this interest is unlikely to yield significant adoption of these systems in the United States. Although the Health Information Technology for Economic and Clinical Health (HITECH) act was responsible for a substantial infusion of capital into the EHR marketplace, the lack of a corporate entity in most F/OSS EHR projects translates to a marginal capacity to market the respective F/OSS system and to navigate certification. This likely has further disadvantaged F/OSS EHR adoption in the United States. ©Mona Alsaffar, Peter Yellowlees, Alberto Odor, Michael Hogarth. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 24.02.2017.
Chandra Source Catalog: User Interfaces
NASA Astrophysics Data System (ADS)
Bonaventura, Nina; Evans, I. N.; Harbo, P. N.; Rots, A. H.; Tibbetts, M. S.; Van Stone, D. W.; Zografou, P.; Anderson, C. S.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, J. D.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Glotfelty, K. J.; Grier, J. D.; Hain, R.; Hall, D. M.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Primini, F. A.; Refsdal, B. L.; Siemiginowska, A. L.; Sundheim, B. A.; Winkelman, S. L.
2010-03-01
The CSCview data mining interface is available for browsing the Chandra Source Catalog (CSC) and downloading tables of quality-assured source properties and data products. Once the desired source properties and search criteria are entered into the CSCview query form, the resulting source matches are returned in a table along with the values of the requested source properties for each source. (The catalog can be searched on any source property, not just position.) At this point, the table of search results may be saved to a text file, and the available data products for each source may be downloaded. CSCview save files are output in RDB-like and VOTable format. The available CSC data products include event files, spectra, lightcurves, and images, all of which are processed with the CIAO software. CSC data may also be accessed non-interactively with Unix command-line tools such as cURL and Wget, using ADQL 2.0 query syntax. In fact, CSCview features a separate ADQL query form for those who wish to specify this type of query within the GUI. Several interfaces are available for learning if a source is included in the catalog (in addition to CSCview): 1) the CSC interface to Sky in Google Earth shows the footprint of each Chandra observation on the sky, along with the CSC footprint for comparison (CSC source properties are also accessible when a source within a Chandra field-of-view is clicked); 2) the CSC Limiting Sensitivity online tool indicates if a source at an input celestial location was too faint for detection; 3) an IVOA Simple Cone Search interface locates all CSC sources within a specified radius of an R.A. and Dec.; and 4) the CSC-SDSS cross-match service returns the list of sources common to the CSC and SDSS, either all such sources or a subset based on search criteria.
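As a rough illustration of the non-interactive access path mentioned above, the sketch below issues an ADQL query over HTTP using Python's requests library rather than cURL; the endpoint URL, parameter names, and the columns selected are placeholders for illustration and would need to be taken from the CSC command-line documentation.

```python
# Hypothetical sketch of a non-interactive CSC query; the endpoint and
# parameter names below are illustrative placeholders, not the documented API.
import requests

CSC_ENDPOINT = "https://example.cfa.harvard.edu/csc/query"  # placeholder URL

adql = (
    "SELECT m.name, m.ra, m.dec, m.flux_aper_b "
    "FROM master_source m "
    "WHERE m.flux_aper_b > 1e-13"
)

response = requests.get(CSC_ENDPOINT, params={"query": adql, "format": "votable"})
response.raise_for_status()

# Save the returned VOTable for later inspection.
with open("csc_results.vot", "wb") as fh:
    fh.write(response.content)
```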
MRMPlus: an open source quality control and assessment tool for SRM/MRM assay development.
Aiyetan, Paul; Thomas, Stefani N; Zhang, Zhen; Zhang, Hui
2015-12-12
Selected and multiple reaction monitoring involves monitoring a multiplexed assay of proteotypic peptides and associated transitions in mass spectrometry runs. To describe peptides and associated transitions as stable, quantifiable, and reproducible representatives of proteins of interest, experimental and analytical validation is required. However, inadequate and disparate analytical tools and validation methods predispose assay performance measures to errors and inconsistencies. Implemented as a freely available, open-source tool in the platform-independent Java programming language, MRMPlus computes analytical measures as recently recommended by the Clinical Proteomics Tumor Analysis Consortium Assay Development Working Group for "Tier 2" assays - that is, non-clinical assays sufficient to measure changes due to both biological and experimental perturbations. Computed measures include: limit of detection, lower limit of quantification, linearity, carry-over, partial validation of specificity, and upper limit of quantification. MRMPlus streamlines the assay development analytical workflow and therefore minimizes predisposition to error. MRMPlus may also be used for performance estimation for targeted assays not described by the Assay Development Working Group. MRMPlus' source code and compiled binaries can be freely downloaded from https://bitbucket.org/paiyetan/mrmplusgui and https://bitbucket.org/paiyetan/mrmplusgui/downloads respectively.
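To make the quoted performance measures concrete, the following minimal sketch (generic Python, not MRMPlus code) estimates a limit of detection and a lower limit of quantification from replicate blank and low-concentration measurements using the common mean-plus-k-standard-deviations convention; the exact definitions adopted by the Assay Development Working Group may differ.

```python
# Illustrative calculation of LOD/LLOQ from replicate blank measurements.
# This is a generic sketch, not the algorithm implemented in MRMPlus.
import statistics

blank_signals = [102.0, 98.5, 105.2, 101.3, 99.8]   # blank-matrix replicates
low_conc_signals = [310.0, 295.4, 322.7, 301.9]      # lowest-calibrator replicates

blank_mean = statistics.mean(blank_signals)
blank_sd = statistics.stdev(blank_signals)

# Common conventions: LOD ~ mean(blank) + 3*SD(blank), LLOQ ~ mean(blank) + 10*SD(blank).
lod = blank_mean + 3 * blank_sd
lloq = blank_mean + 10 * blank_sd

# A simple precision check at the lowest calibrator (CV should be small near the LLOQ).
cv_low = statistics.stdev(low_conc_signals) / statistics.mean(low_conc_signals)

print(f"LOD  (signal units): {lod:.1f}")
print(f"LLOQ (signal units): {lloq:.1f}")
print(f"CV at lowest calibrator: {cv_low:.1%}")
```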
GAMBIT: the global and modular beyond-the-standard-model inference tool
NASA Astrophysics Data System (ADS)
Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Buckley, Andy; Chrząszcz, Marcin; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Dickinson, Hugh; Edsjö, Joakim; Farmer, Ben; Gonzalo, Tomás E.; Jackson, Paul; Krislock, Abram; Kvellestad, Anders; Lundberg, Johan; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Raklev, Are; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Savage, Christopher; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; White, Martin; Wild, Sebastian
2017-11-01
We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily-extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org.
MultiElec: A MATLAB Based Application for MEA Data Analysis.
Georgiadis, Vassilis; Stephanou, Anastasis; Townsend, Paul A; Jackson, Thomas R
2015-01-01
We present MultiElec, an open source MATLAB based application for data analysis of microelectrode array (MEA) recordings. MultiElec displays an extremely user-friendly graphic user interface (GUI) that allows the simultaneous display and analysis of voltage traces for 60 electrodes and includes functions for activation-time determination, the production of activation-time heat maps with activation time and isoline display. Furthermore, local conduction velocities are semi-automatically calculated along with their corresponding vector plots. MultiElec allows ad hoc signal suppression, enabling the user to easily and efficiently handle signal artefacts and for incomplete data sets to be analysed. Voltage traces and heat maps can be simply exported for figure production and presentation. In addition, our platform is able to produce 3D videos of signal progression over all 60 electrodes. Functions are controlled entirely by a single GUI with no need for command line input or any understanding of MATLAB code. MultiElec is open source under the terms of the GNU General Public License as published by the Free Software Foundation, version 3. Both the program and source code are available to download from http://www.cancer.manchester.ac.uk/MultiElec/.
CCDST: A free Canadian climate data scraping tool
NASA Astrophysics Data System (ADS)
Bonifacio, Charmaine; Barchyn, Thomas E.; Hugenholtz, Chris H.; Kienzle, Stefan W.
2015-02-01
In this paper we present a new software tool that automatically fetches, downloads, and consolidates climate data from a Web database in which the data are spread across multiple Web pages. The tool is called the Canadian Climate Data Scraping Tool (CCDST) and was developed to enhance access to, and simplify analysis of, climate data from Canada's National Climate Data and Information Archive (NCDIA). The CCDST deconstructs the URL for a particular climate station in the NCDIA and then iteratively modifies the date parameters to download large volumes of data, removes individual file headers, and merges the data files into one output file. This automated sequence enhances access to climate data by substantially reducing the time needed to manually download data from multiple Web pages. To illustrate, we present a case study of the temporal dynamics of blowing snow events in which the tool saved ~3.1 weeks of manual downloading time. Without the CCDST, the time involved in manually downloading climate data limits access and restrains researchers and students from exploring climate trends. The tool is coded as a Microsoft Excel macro and is available to researchers and students for free. The main concept and structure of the tool can be modified for other Web databases hosting geophysical data.
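The download loop described above can be approximated in a few lines of Python; the sketch below assumes a hypothetical station URL whose query string carries the station identifier and date range, iterates over months, strips each file's header, and concatenates the rows into one output file. The real NCDIA URL structure and CSV layout differ, so this is only an outline of the CCDST's approach, not its VBA implementation.

```python
# Minimal sketch of the "deconstruct URL, vary date parameters, merge files" idea.
# The base URL and parameter names are hypothetical placeholders.
import urllib.request

BASE_URL = ("https://climate.example.ca/climate_data/bulk_data"
            "?stationID={station}&Year={year}&Month={month}&format=csv")

def download_station(station_id, years, outfile="merged_climate.csv"):
    wrote_header = False
    with open(outfile, "w", encoding="utf-8") as out:
        for year in years:
            for month in range(1, 13):
                url = BASE_URL.format(station=station_id, year=year, month=month)
                with urllib.request.urlopen(url) as resp:
                    lines = resp.read().decode("utf-8").splitlines()
                # Keep the header line only once; skip it on subsequent files.
                start = 0 if not wrote_header else 1
                out.write("\n".join(lines[start:]) + "\n")
                wrote_header = True

download_station(station_id=2205, years=range(2000, 2011))
```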
VizieR Online Data Catalog: Swift AGN and Cluster Survey (SACS). I. (Dai+, 2015)
NASA Astrophysics Data System (ADS)
Dai, X.; Griffin, R. D.; Kochanek, C. S.; Nugent, J. M.; Bregman, J. N.
2015-07-01
The Swift XRT observations of GRB fields were downloaded from the HEASARC website. This includes ~9yr of data through 2013 July 27. We matched the sources to the WISE all-sky catalog (see section 3.2). (5 data files).
The International Energy Portal includes a powerful data browser that provides country-level energy data; many countries have at least 30 years of historical data. The data browser provides users the ability to view and download complete datasets for consumption, production, trade, reserves, and carbon dioxide emissions for different fuels and energy sources.
Taxonomy of Spyware and Empirical Study of Network Drive-By-Downloads
2005-09-01
[Thesis excerpt garbled in extraction; recoverable fragments indicate the empirical study sampled site categories including mortgage company homepages, university homepages (500 sites; source: www.utexas.edu/world/univ/state/), and "unsafe" sectors such as adult entertainment (418 sites), followed by reference-list fragments.]
Data Mining Meets HCI: Making Sense of Large Graphs
2012-07-01
graph algorithms, won the Open Source Software World Challenge, Silver Award. We have released Pegasus as free, open-source software, downloaded by...METIS [77], spectral clustering [108], and the parameter-free "Cross-associations" (CA) [26]. Belief Propagation can also be used for clustering, as...number of tools have been developed to support "landscape" views of information. These include WebBook and Web Forager [23], which use a book metaphor
Duley, Aaron R; Janelle, Christopher M; Coombes, Stephen A
2004-11-01
The cardiovascular system has been extensively measured in a variety of research and clinical domains. Despite technological and methodological advances in cardiovascular science, the analysis and evaluation of phasic changes in heart rate persists as a way to assess numerous psychological concomitants. Some researchers, however, have pointed to constraints on data analysis when evaluating cardiac activity indexed by heart rate or heart period. Thus, an off-line application toolkit for heart rate analysis is presented. The program, written with National Instruments' LabVIEW, incorporates a variety of tools for off-line extraction and analysis of heart rate data. Current methods and issues concerning heart rate analysis are highlighted, and how the toolkit provides a flexible environment to ameliorate common problems that typically lead to trial rejection is discussed. Source code for this program may be downloaded from the Psychonomic Society Web archive at www.psychonomic.org/archive/.
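As a simple illustration of the kind of off-line processing the toolkit performs, the sketch below (generic Python, not the LabVIEW code) converts a series of R-peak times into interbeat intervals and second-by-second heart rate, which is the usual starting point for analyzing phasic cardiac changes.

```python
# Generic sketch: derive interbeat intervals (IBIs) and per-second heart rate
# from R-peak times. This illustrates the analysis, not the LabVIEW toolkit itself.
r_peak_times = [0.00, 0.82, 1.65, 2.43, 3.28, 4.10, 4.95, 5.73, 6.60]  # seconds

# Heart period (IBI) in seconds between successive R peaks.
ibis = [t2 - t1 for t1, t2 in zip(r_peak_times, r_peak_times[1:])]

# Instantaneous heart rate in beats per minute for each interval.
inst_hr = [60.0 / ibi for ibi in ibis]

# Second-by-second heart rate: average the instantaneous rate of all beats
# whose interval ends within each one-second bin.
duration = int(r_peak_times[-1]) + 1
hr_per_second = []
for sec in range(duration):
    rates = [hr for end, hr in zip(r_peak_times[1:], inst_hr) if sec <= end < sec + 1]
    hr_per_second.append(sum(rates) / len(rates) if rates else None)

print("IBIs (s):", [round(x, 2) for x in ibis])
print("HR per second (bpm):", hr_per_second)
```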
Donkin, Christopher; Brown, Scott D; Heathcote, Andrew
2009-02-01
Psychological experiments often collect choice responses using buttonpresses. However, spoken responses are useful in many cases-for example, when working with special clinical populations, or when a paradigm demands vocalization, or when accurate response time measurements are desired. In these cases, spoken responses are typically collected using a voice key, which usually involves manual coding by experimenters in a tedious and error-prone manner. We describe ChoiceKey, an open-source speech recognition package for MATLAB. It can be optimized by training for small response sets and different speakers. We show ChoiceKey to be reliable with minimal training for most participants in experiments with two different responses. Problems presented by individual differences, and occasional atypical responses, are examined, and extensions to larger response sets are explored. The ChoiceKey source files and instructions may be downloaded as supplemental materials for this article from brm.psychonomic-journals.org/content/supplemental.
The disturbed geomagnetic field at European observatories. Sources and significance
NASA Astrophysics Data System (ADS)
Greculeasa, Razvan; Dobrica, Venera; Demetrescu, Crisan
2014-05-01
The disturbed geomagnetic field recorded at Earth's surface is produced by electric current systems in the magnetosphere and ionosphere that result from the interaction of the geomagnetic field with the solar wind and the interplanetary magnetic field. In this paper the geomagnetic disturbance recorded at European observatories is investigated with regard to its sources for the time interval August 1-10, 2010, during which a moderate storm (Dst_min = -70 nT) occurred on August 3-4. The disturbance was identified relative to the solar quiet daily variation for each of the 29 observatories with minute data available in that interval. Data were downloaded from the INTERMAGNET web page. The contributions of the magnetospheric ring current and of the auroral electrojet to the observed disturbance field in the X, Z, and D geomagnetic elements are discussed and the corresponding geographical distribution is presented.
XtalOpt version r9: An open-source evolutionary algorithm for crystal structure prediction
Falls, Zackary; Lonie, David C.; Avery, Patrick; ...
2015-10-23
This is a new version of XtalOpt, an evolutionary algorithm for crystal structure prediction, available for download from the CPC library or the XtalOpt website, http://xtalopt.github.io. XtalOpt is published under the GNU Public License (GPL), an open source license recognized by the Open Source Initiative. The new version incorporates many bug fixes and new features, which are detailed here. XtalOpt predicts the crystal structure of a system from its stoichiometry alone, using evolutionary algorithms.
Prior-Based Quantization Bin Matching for Cloud Storage of JPEG Images.
Liu, Xianming; Cheung, Gene; Lin, Chia-Wen; Zhao, Debin; Gao, Wen
2018-07-01
Millions of user-generated images are uploaded to social media sites like Facebook daily, which translate to a large storage cost. However, there exists an asymmetry in upload and download data: only a fraction of the uploaded images are subsequently retrieved for viewing. In this paper, we propose a cloud storage system that reduces the storage cost of all uploaded JPEG photos, at the expense of a controlled increase in computation mainly during download of requested image subset. Specifically, the system first selectively re-encodes code blocks of uploaded JPEG images using coarser quantization parameters for smaller storage sizes. Then during download, the system exploits known signal priors-sparsity prior and graph-signal smoothness prior-for reverse mapping to recover original fine quantization bin indices, with either deterministic guarantee (lossless mode) or statistical guarantee (near-lossless mode). For fast reverse mapping, we use small dictionaries and sparse graphs that are tailored for specific clusters of similar blocks, which are classified via tree-structured vector quantizer. During image upload, cluster indices identifying the appropriate dictionaries and graphs for the re-quantized blocks are encoded as side information using a differential distributed source coding scheme to facilitate reverse mapping during image download. Experimental results show that our system can reap significant storage savings (up to 12.05%) at roughly the same image PSNR (within 0.18 dB).
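A toy example of the storage/recovery trade-off described above: the sketch below re-quantizes DCT coefficient bin indices onto a coarser grid (roughly what the system stores) and then enumerates the fine bins consistent with each coarse bin (the candidate set a prior-based reverse mapping must choose from). It deliberately omits the sparsity and graph-smoothness priors and the distributed side information that the paper actually uses to select the correct candidate.

```python
# Toy illustration of coarse re-quantization and the candidate sets that a
# prior-based reverse mapping would have to resolve. Not the paper's algorithm.

def requantize(fine_bins, q_fine, q_coarse):
    """Map fine-quantization bin indices to coarser bins (lossy storage step)."""
    return [round(b * q_fine / q_coarse) for b in fine_bins]

def candidate_fine_bins(coarse_bin, q_fine, q_coarse):
    """All fine bins whose dequantized value falls inside the coarse bin's cell."""
    center = coarse_bin * q_coarse
    lo, hi = center - q_coarse / 2, center + q_coarse / 2
    # A fine bin b represents the reconstructed value b * q_fine.
    return [b for b in range(-1024, 1025) if lo <= b * q_fine < hi]

q_fine, q_coarse = 4, 12            # coarser quantization step used for storage
fine = [0, 3, -5, 12, 7]            # original fine-quantized DCT bin indices
coarse = requantize(fine, q_fine, q_coarse)

for fb, cb in zip(fine, coarse):
    cands = candidate_fine_bins(cb, q_fine, q_coarse)
    print(f"fine bin {fb:3d} -> coarse bin {cb:2d}, candidates on download: {cands}")
```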
Joyce, Brendan; Lee, Danny; Rubio, Alex; Ogurtsov, Aleksey; Alves, Gelio; Yu, Yi-Kuo
2018-03-15
RAId is a software package that has been actively developed for the past 10 years for computationally and visually analyzing MS/MS data. Founded on rigorous statistical methods, RAId's core program computes accurate E-values for peptides and proteins identified during database searches. Our main goal here is to make this robust tool readily accessible to the proteomics community by developing a graphical user interface (GUI). We have constructed a graphical user interface to facilitate the use of RAId on users' local machines. Written in Java, RAId_GUI not only makes it easy to run RAId but also provides tools for data/spectra visualization, MS-product analysis, molecular isotopic distribution analysis, and graphing retrieval versus the proportion of false discoveries. The results viewer displays the analysis results and allows users to download them. Both the knowledge-integrated organismal databases and the code package (containing the source code, the graphical user interface, and a user manual) are available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads/raid.html.
Geospatial resources for the geologic community: The USGS National Map
Witt, Emitt C.
2015-01-01
Geospatial data are a key component of investigating, interpreting, and communicating the geological sciences. Locating geospatial data can be time-consuming, which detracts from time spent on a study because these data are not obviously placed in central locations or are served from many disparate databases. The National Map of the US Geological Survey is a publicly available resource for accessing the geospatial base map data needs of the geological community from a central location. The National Map data are available through a viewer and download platform providing access to eight primary data themes, plus the US Topo and scanned historical topographic maps. The eight themes are elevation, orthoimagery, hydrography, geographic names, boundaries, transportation, structures, and land cover, and they are being offered for download as predefined tiles in formats supported by leading geographic information system software. Data tiles are periodically refreshed to capture the most current content and are an efficient method for disseminating and receiving geospatial information. Elevation data, for example, are offered as a download from the National Map as 1° × 1° tiles for the 10- and 30- m products and as 15′ × 15′ tiles for the higher-resolution 3-m product. Vector data sets with smaller file sizes are offered at several tile sizes and formats. Partial tiles are not a download option—any prestaged data that intersect the requesting bounding box will be, in their entirety, part of the download order. While there are many options for accessing geospatial data via the Web, the National Map represents authoritative sources of data that are documented and can be referenced for citation and inclusion in scientific publications. Therefore, National Map products and services should be part of a geologist’s first stop for geospatial information and data.
Broadcasting a Lab Measurement over Existing Conductor Networks
ERIC Educational Resources Information Center
Knipp, Peter A.
2009-01-01
Students learn about physical laws and the scientific method when they analyze experimental data in a laboratory setting. Three common sources exist for the experimental data that they analyze: (1) "hands-on" measurements by the students themselves, (2) electronic transfer (by downloading a spreadsheet, video, or computer-aided data-acquisition…
Homology Modeling and Molecular Docking for the Science Curriculum
ERIC Educational Resources Information Center
McDougal, Owen M.; Cornia, Nic; Sambasivarao, S. V.; Remm, Andrew; Mallory, Chris; Oxford, Julia Thom; Maupin, C. Mark; Andersen, Tim
2014-01-01
DockoMatic 2.0 is a powerful open source software program (downloadable from sourceforge.net) that allows users to utilize a readily accessible computational tool to explore biomolecules and their interactions. This manuscript describes a practical tutorial for use in the undergraduate curriculum that introduces students to macromolecular…
No Strings Attached: Open Source Solutions
ERIC Educational Resources Information Center
Fredricks, Kathy
2009-01-01
Imagine downloading a new software application and not having to worry about licensing, finding dollars in the budget, or incurring additional maintenance costs. Imagine finding a Web design tool in the public domain--free for use. Imagine major universities that provide online courses with no strings attached. Imagine online textbooks without a…
Methods and Software for Building Bibliographic Data Bases.
ERIC Educational Resources Information Center
Daehn, Ralph M.
1985-01-01
This in-depth look at database management systems (DBMS) for microcomputers covers data entry, information retrieval, security, DBMS software and design, and downloading of literature search results. The advantages of in-house systems versus online search vendors are discussed, and specifications of three software packages and 14 sources are…
Site Map | USDA Plant Hardiness Zone Map
In Flight Calibration of the Magnetospheric Multiscale Mission Fast Plasma Investigation
NASA Technical Reports Server (NTRS)
Barrie, Alexander C.; Gershman, Daniel J.; Gliese, Ulrik; Dorelli, John C.; Avanov, Levon A.; Rager, Amy C.; Schiff, Conrad; Pollock, Craig J.
2015-01-01
The Fast Plasma Investigation (FPI) on the Magnetospheric Multiscale mission (MMS) combines data from eight spectrometers, each with four deflection states, into a single map of the sky. Any systematic discontinuity, artifact, noise source, etc. present in this map may be incorrectly interpreted as legitimate data and lead to incorrect conclusions. For this reason it is desirable to have all spectrometers return the same output for a given input, and for this output to be low in noise and other errors. While many missions use statistical analyses of data to calibrate instruments in flight, this process is insufficient for FPI for two reasons: (1) only a small fraction of the high-resolution data is downloaded to the ground due to bandwidth limitations, and (2) the data that are downloaded are, by definition, scientifically interesting and therefore not ideal for calibration. FPI uses a suite of new tools to calibrate in flight. A new method for detection-system ground calibration has been developed that sweeps the detection threshold to fully define the pulse height distribution. This method has now been extended for use in flight as a means to calibrate the MCP voltage and threshold (together forming the operating point) of the Dual Electron Spectrometers (DES) and Dual Ion Spectrometers (DIS). A method of comparing higher-energy data (which have a low fractional voltage error) to lower-energy data (which have a higher fractional voltage error) will be used to calibrate the high-voltage outputs. Finally, a comparison of pitch angle distributions will be used to find remaining discrepancies among sensors.
The Chandra Source Catalog: User Interface
NASA Astrophysics Data System (ADS)
Bonaventura, Nina; Evans, I. N.; Harbo, P. N.; Rots, A. H.; Tibbetts, M. S.; Van Stone, D. W.; Zografou, P.; Anderson, C. S.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, J. D.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Glotfelty, K. J.; Grier, J. D.; Hain, R.; Hall, D. M.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Primini, F. A.; Refsdal, B. L.; Siemiginowska, A. L.; Sundheim, B. A.; Winkelman, S. L.
2009-01-01
The Chandra Source Catalog (CSC) is the definitive catalog of all X-ray sources detected by Chandra. The CSC is presented to the user in two tables: the Master Chandra Source Table and the Table of Individual Source Observations. Each distinct X-ray source identified in the CSC is represented by a single master source entry and one or more individual source entries. If a source is unaffected by confusion and pile-up in multiple observations, the individual source observations are merged to produce a master source. In each table, a row represents a source, and each column a quantity that is officially part of the catalog. The CSC contains positions and multi-band fluxes for the sources, as well as derived spatial, spectral, and temporal source properties. The CSC also includes associated source region and full-field data products for each source, including images, photon event lists, light curves, and spectra. The master source properties represent the best estimates of the properties of a source, and are presented in the following categories: Position and Position Errors, Source Flags, Source Extent and Errors, Source Fluxes, Source Significance, Spectral Properties, and Source Variability. The CSC Data Access GUI provides direct access to the source properties and data products contained in the catalog. The user may query the catalog database via a web-style search or an SQL command-line query. Each query returns a table of source properties, along with the option to browse and download associated data products. The GUI is designed to run in a web browser with Java version 1.5 or higher, and may be accessed via a link on the CSC website homepage (http://cxc.harvard.edu/csc/). As an alternative to the GUI, the contents of the CSC may be accessed directly through a URL, using the command-line tool, cURL. Support: NASA contract NAS8-03060 (CXC).
Wing Download Results from a Test of a 0.658-Scale V-22 Rotor and Wing
NASA Technical Reports Server (NTRS)
Felker, Fort F.
1992-01-01
A test of a 0.658-scale V-22 rotor and wing was conducted in the 40- by 80-Foot Wind Tunnel at Ames Research Center. One of the principal objectives of the test was to measure the wing download in hover for a variety of test configurations. The wing download and surface pressures were measured for a wide range of thrust coefficients, with five different flap angles, two nacelle angles, and both directions of rotor rotation. This paper presents these results and describes a new method for interpreting wing surface pressure data in hover. This method shows that the wing flap can produce substantial lift loads in hover.
Automated realtime data import for the i2b2 clinical data warehouse: introducing the HL7 ETL cell.
Majeed, Raphael W; Röhrig, Rainer
2012-01-01
Clinical data warehouses are used to consolidate all available clinical data from one or multiple organizations. They represent an important source for clinical research, quality management, and controlling. Since its introduction, the data warehouse i2b2 has gathered a large user base in the research community. Yet little work has been done on the process of importing clinical data into data warehouses using existing standards. In this article, we present a novel approach that utilizes the clinical integration server, commonly available in most hospitals, as the data source. As information is transmitted through the integration server, the standardized HL7 message is immediately parsed and inserted into the data warehouse. Evaluation of import speeds suggests the feasibility of the provided solution for real-time processing of HL7 messages. By using the presented approach of standardized data import, i2b2 can be used as a plug-and-play data warehouse, without the hurdle of customized import for every clinical information system or electronic medical record. The provided solution is available for download at http://sourceforge.net/projects/histream/.
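As a rough sketch of the HL7-to-warehouse step described above (not the HIStream code itself), the snippet below splits a pipe-delimited HL7 v2 observation segment into fields and turns it into a parameterized SQL insert for an i2b2-style fact table; real messages require a full HL7 parser, code mapping, and patient/encounter resolution.

```python
# Minimal sketch: parse an OBX segment from a pipe-delimited HL7 v2 message and
# prepare an insert for an i2b2-style observation fact table.
# The message content and the concept mapping are simplified placeholders.

raw_message = (
    "MSH|^~\\&|LAB|HOSP|I2B2|HOSP|202301151230||ORU^R01|12345|P|2.3\r"
    "PID|1||0815||Doe^John\r"
    "OBX|1|NM|2345-7^GLUCOSE^LN||98|mg/dL|70-105|N|||F\r"
)

segments = {line.split("|")[0]: line.split("|") for line in raw_message.split("\r") if line}

patient_id = segments["PID"][3]
obx = segments["OBX"]
concept_code = obx[3].split("^")[0]      # e.g. LOINC code from the OBX-3 field
value = float(obx[5])
units = obx[6]

sql = ("INSERT INTO observation_fact "
       "(patient_num, concept_cd, nval_num, units_cd) VALUES (%s, %s, %s, %s)")
params = (patient_id, "LOINC:" + concept_code, value, units)

print(sql)
print(params)
# In a real ETL cell, `sql` and `params` would be executed against the
# i2b2 database as each message arrives from the integration server.
```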
Fractional populations in multiple gene inheritance.
Chung, Myung-Hoon; Kim, Chul Koo; Nahm, Kyun
2003-01-22
With complete knowledge of the human genome sequence, one of the most interesting tasks remaining is to understand the functions of individual genes and how they communicate. Using the information about genes (locus, allele, mutation rate, fitness, etc.), we attempt to explain population demographic data. This population evolution study could complement and enhance biologists' understanding about genes. We present a general approach to study population genetics in complex situations. In the present approach, multiple allele inheritance, multiple loci inheritance, natural selection and mutations are allowed simultaneously in order to consider a more realistic situation. A simulation program is presented so that readers can readily carry out studies with their own parameters. It is shown that the multiplicity of the loci greatly affects the demographic results of fractional population ratios. Furthermore, the study indicates that some high infant mortality rates due to congenital anomalies can be attributed to multiple loci inheritance. The simulation program can be downloaded from http://won.hongik.ac.kr/~mhchung/index_files/yapop.htm. In order to run this program, one needs Visual Studio.NET platform, which can be downloaded from http://msdn.microsoft.com/netframework/downloads/default.asp.
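The kind of simulation described above can be sketched in a few lines; the example below (generic Python, not the authors' program) tracks allele frequencies at several unlinked loci under selection and mutation in a finite population, which is enough to see how multiple loci change the fraction of affected offspring. All parameter values are arbitrary.

```python
# Toy Wright-Fisher-style simulation: several unlinked biallelic loci with
# selection against the recessive allele and recurrent mutation.
# Generic illustration only; parameters are arbitrary.
import random

N = 1000            # diploid population size
LOCI = 3            # number of unlinked loci
MU = 1e-4           # mutation rate toward the deleterious allele
S = 0.3             # selection coefficient against the recessive homozygote
GENERATIONS = 200

freqs = [0.05] * LOCI   # initial deleterious-allele frequency at each locus

for gen in range(GENERATIONS):
    new_freqs = []
    for q in freqs:
        # Deterministic change from selection against q^2 homozygotes,
        # then mutation, then binomial drift over 2N gametes.
        mean_fitness = 1 - S * q * q
        q_sel = q * (1 - S * q) / mean_fitness
        q_mut = q_sel * (1 - MU) + (1 - q_sel) * MU
        q_next = sum(random.random() < q_mut for _ in range(2 * N)) / (2 * N)
        new_freqs.append(q_next)
    freqs = new_freqs

# Fraction of offspring affected at one or more loci (assuming independence).
unaffected = 1.0
for q in freqs:
    unaffected *= 1 - q * q
print("Final allele frequencies:", [round(q, 4) for q in freqs])
print("Fraction affected at >= 1 locus:", round(1 - unaffected, 5))
```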
Multi-modal automatic montaging of adaptive optics retinal images
Chen, Min; Cooper, Robert F.; Han, Grace K.; Gee, James; Brainard, David H.; Morgan, Jessica I. W.
2016-01-01
We present a fully automated adaptive optics (AO) retinal image montaging algorithm using classic scale invariant feature transform with random sample consensus for outlier removal. Our approach is capable of using information from multiple AO modalities (confocal, split detection, and dark field) and can accurately detect discontinuities in the montage. The algorithm output is compared to manual montaging by evaluating the similarity of the overlapping regions after montaging, and calculating the detection rate of discontinuities in the montage. Our results show that the proposed algorithm has high alignment accuracy and a discontinuity detection rate that is comparable (and often superior) to manual montaging. In addition, we analyze and show the benefits of using multiple modalities in the montaging process. We provide the algorithm presented in this paper as open-source and freely available to download. PMID:28018714
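For readers unfamiliar with the feature-matching step, the sketch below shows a standard SIFT-plus-RANSAC pairwise alignment using OpenCV; it estimates a transform between two overlapping AO image tiles. The full montaging pipeline (multi-modal weighting, global placement, discontinuity detection) is not reproduced here, and the file names are placeholders.

```python
# Pairwise alignment of two overlapping image tiles with SIFT features and
# RANSAC outlier rejection (OpenCV). A simplified stand-in for one step of the
# montaging pipeline; file names are placeholders.
import cv2
import numpy as np

img1 = cv2.imread("tile_a.tif", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("tile_b.tif", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Match descriptors and keep only distinctive matches (Lowe's ratio test).
matcher = cv2.BFMatcher()
matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]

src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

# RANSAC discards mismatched features while estimating the tile-to-tile transform.
H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
print("Inlier matches:", int(inlier_mask.sum()), "of", len(good))
```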
A WebGL Tool for Visualizing the Topology of the Sun's Coronal Magnetic Field
NASA Astrophysics Data System (ADS)
Duffy, A.; Cheung, C.; DeRosa, M. L.
2012-12-01
We present a web-based topology-viewing tool that allows users to visualize the geometry and topology of the Sun's 3D coronal magnetic field in an interactive manner. The tool is implemented using open-source, mature, modern web technologies including WebGL, jQuery, HTML 5, and CSS 3, which are compatible with nearly all modern web browsers. As opposed to the traditional method of visualization, which involves downloading and setting up various software packages, proprietary and otherwise, the tool presents a clean interface that allows the user to easily load and manipulate the model, while also offering great power to choose which topological features are displayed. The tool accepts data encoded in the open JSON format, which has libraries available for nearly every major programming language, making it simple to generate the data.
Riva, Giuseppe; Carelli, Laura; Gaggioli, Andrea; Gorini, Alessandra; Vigna, Cinzia; Corsi, Riccardo; Faletti, Gianluca; Vezzadini, Luca
2009-01-01
At MMVR 2007 we presented NeuroVR (http://www.neurovr.org), a free virtual reality platform based on open-source software. The software allows non-expert users to adapt the content of 14 pre-designed virtual environments to the specific needs of the clinical or experimental setting. Following the feedback of the 700 users who downloaded the first version, we developed a new version, NeuroVR 1.5, that improves the therapist's ability to enhance the patient's feeling of familiarity and intimacy with the virtual scene by using external sounds, photos, or videos. Specifically, the new version now includes full sound support and the ability to trigger external sounds and videos using the keyboard. The outcomes of different trials made using NeuroVR will be presented and discussed.
Teachers' Acceptance and Use of an Educational Portal
ERIC Educational Resources Information Center
Pynoo, Bram; Tondeur, Jo; van Braak, Johan; Duyck, Wouter; Sijnave, Bart; Duyck, Philippe
2012-01-01
In this study, teachers' acceptance and use of an educational portal is assessed based on data from two sources: usage data (number of logins, downloads, uploads, reactions and pages viewed) and an online acceptance questionnaire. The usage data is extracted on two occasions from the portal's database: at survey completion (T1) and twenty-two…
How to Keep Your Health Information Private and Secure
... permanently. When Using Mobile Devices · Research mobile apps – software programs that perform one or more specific functions – before you download and install any of them. Be sure to use known app websites or trusted sources. · Read the terms of service and the privacy notice of the mobile app ...
The Astrophysics Source Code Library: An Update
NASA Astrophysics Data System (ADS)
Allen, Alice; Nemiroff, R. J.; Shamir, L.; Teuben, P. J.
2012-01-01
The Astrophysics Source Code Library (ASCL), founded in 1999, takes an active approach to sharing astrophysical source code. The ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments involving the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL moved to a new location in 2010; it now holds over 300 codes and continues to grow. In 2011, the ASCL (http://asterisk.apod.com/viewforum.php?f=35) has added an average of 19 new codes per month; we encourage scientists to submit their codes for inclusion. An advisory committee has been established to provide input and guide the development and expansion of the new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This presentation covers the history of the ASCL and examines its current state and benefits, the means of and requirements for including codes, and its future plans.
MetAMOS: a modular and open source metagenomic assembly and analysis pipeline
2013-01-01
We describe MetAMOS, an open source and modular metagenomic assembly and analysis pipeline. MetAMOS represents an important step towards fully automated metagenomic analysis, starting with next-generation sequencing reads and producing genomic scaffolds, open-reading frames and taxonomic or functional annotations. MetAMOS can aid in reducing assembly errors, commonly encountered when assembling metagenomic samples, and improves taxonomic assignment accuracy while also reducing computational cost. MetAMOS can be downloaded from: https://github.com/treangen/MetAMOS. PMID:23320958
Herbst, Christian T; Oh, Jinook; Vydrová, Jitka; Švec, Jan G
2015-07-01
In this short report we introduce DigitalVHI, a free open-source software application for obtaining Voice Handicap Index (VHI) and other questionnaire data, which can be put on a computer in clinics and used in clinical practice. The software can simplify performing clinical studies since it makes the VHI scores directly available for analysis in a digital form. It can be downloaded from http://www.christian-herbst.org/DigitalVHI/.
pytc: Open-Source Python Software for Global Analyses of Isothermal Titration Calorimetry Data.
Duvvuri, Hiranmayi; Wheeler, Lucas C; Harms, Michael J
2018-05-08
Here we describe pytc, an open-source Python package for global fits of thermodynamic models to multiple isothermal titration calorimetry experiments. Key features include simplicity, the ability to implement new thermodynamic models, a robust maximum likelihood fitter, a fast Bayesian Markov-Chain Monte Carlo sampler, rigorous implementation, extensive documentation, and full cross-platform compatibility. pytc fitting can be done using an application program interface or via a graphical user interface. It is available for download at https://github.com/harmslab/pytc .
2008-01-01
[Report excerpt garbled in extraction; recoverable fragments describe data upload/download flow sensors and reconnaissance alerts (Snort/Dragon signatures), and a prefix-aggregation rule that scores each node as its anomalous traffic divided by total anomalous traffic and merges child nodes into parents by prefix/length with coverage/collateral values (e.g. 0.0.0.0/0 100/100 at depth 0, 0.0.0.0/1 50/30).]
Forde, Arnell S.; Flocks, James G.; Wiese, Dana S.; Fredericks, Jake J.
2016-03-29
The archived trace data are in standard SEG Y rev. 0 format (Barry and others, 1975); the first 3,200 bytes of the card image header are in American Standard Code for Information Interchange (ASCII) format instead of Extended Binary Coded Decimal Interchange Code (EBCDIC) format. The SEG Y files are available on the DVD version of this report or online, downloadable via the USGS Coastal and Marine Geoscience Data System (http://cmgds.marine.usgs.gov). The data are also available for viewing using GeoMapApp (http://www.geomapapp.org) and Virtual Ocean (http://www.virtualocean.org) multi-platform open source software. The Web version of this archive does not contain the SEG Y trace files. To obtain the complete DVD archive, contact USGS Information Services at 1-888-ASK-USGS or infoservices@usgs.gov. The SEG Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2010). See the How To Download SEG Y Data page for download instructions. The printable profiles are provided as Graphics Interchange Format (GIF) images processed and gained using SU software and can be viewed from the Profiles page or by using the links located on the trackline maps; refer to the Software page for links to example SU processing scripts.
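Because the archive notes that the 3,200-byte card-image header is stored as ASCII rather than EBCDIC, it can be inspected with a few lines of standard Python before handing the file to SU or commercial software; the sketch below simply reads and prints that textual header (the file name is a placeholder).

```python
# Read the 3,200-byte textual header of a SEG Y file from this archive.
# The header is ASCII (not EBCDIC), so it can be decoded directly.
# "example_line.sgy" is a placeholder file name.

with open("example_line.sgy", "rb") as fh:
    card_image = fh.read(3200)

# SEG Y textual headers are organized as 40 "card images" of 80 characters each.
for i in range(40):
    line = card_image[i * 80:(i + 1) * 80].decode("ascii", errors="replace")
    print(line.rstrip())
```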
Top of the charts: download versus citations in the International Journal of Cardiology.
Coats, Andrew J S
2005-11-02
The medical literature is growing at an alarming rate. Research assessment exercises, research quality frameworks, league tables and the like have attempted to quantify the volume, quality and impact of research. Yet the established measures (such as citation rates) are being challenged by the sheer number of journals, variability in the "gold standard" of peer-review and the emergence of open-source or web-based journals. In the last few years, we have seen a growth in downloads to individual journal articles that now easily exceeds formal journal subscriptions. We have recorded the 10 top cited articles over a 12-month period and compared them to the 10 most popular articles being downloaded over the same time period. The citation-based listing included basic and applied, observational and interventional original research reports. For downloaded articles, which have shown a dramatic increase for the International Journal of Cardiology from 48,000 in 2002 to 120,000 in 2003 to 200,000 in 2004, the most popular articles over the same period are very different and are dominated by up-to-date reviews of either cutting-edge topics (such as the potential of stem cells) or of the management of rare or unusual conditions. There is no overlap between the two lists despite covering exactly the same 12-month period and using measures of peer esteem. Perhaps the time has come to look at the usage of articles rather than, or in addition to, their referencing.
Community interactive webtool to retrieve Greenland glacier data for 1-D geometry
NASA Astrophysics Data System (ADS)
Perrette, Mahé
2015-04-01
Marine-terminating outlet glaciers are challenging to include in conventional Greenland-wide ice sheet models because of the large difference in scale between the model grid size (typically 10 km) and outlet glacier width (typically 1-5 km), which makes them a subgrid-scale feature. A possible approach to tackle this problem is to use one-dimensional flowline models for the individual glaciers (e.g. Nick et al., 2013, Nature; Enderlin et al. 2013a,b, The Cryosphere). Here we present a python- and javascript-based webtool to prepare the data required to feed or validate a flowline model. It is designed primarily to outline the glacier geometry and returns relevant data averaged over cross-sections. The tool currently allows the user to: (1) visualize 2-D ice sheet data (zoom/pan) and quickly switch between datasets (e.g. ice thickness, bedrock elevation, surface velocity) interpolated/transformed onto a common grid; (2) draw flowlines from user-input seeds on the map, calculated from the surface velocity vector field, as a helpful guide for point 3; (3) interactively draw the glacier outline (side and middle lines) on top of the data; (4) mesh the outlined glacier domain in the horizontal plane; (5) extract the relevant data into a 1-D longitudinal profile; and (6) download the result as a netCDF file. The project is hosted on github to encourage collaboration, under the open-source MIT License. The server side is written in python (open-source) using the flask web framework, and the client side (javascript) makes use of the d3 library for interactive figures. For now it only works locally in a web browser (start the server with "python runserver.py"). Data need to be downloaded separately from the original sources. See the README file in the project for information on how to use it. Github projects: https://github.com/perrette/webglacier1d (main), https://github.com/perrette/dimarray (dependency).
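A minimal sketch of the server side described above, assuming Flask and a helper that extracts a 1-D profile for a named glacier; the helper, the route, and the returned fields are hypothetical stand-ins, not the actual webglacier1d API.

```python
# Minimal Flask sketch of a "return extracted 1-D profile" endpoint.
# extract_profile() and the route are hypothetical stand-ins for the real tool.
from flask import Flask, jsonify, request

app = Flask(__name__)

def extract_profile(glacier_name):
    # Placeholder: in the real tool this would average 2-D fields
    # (ice thickness, bed elevation, velocity) over user-drawn cross-sections.
    return {
        "glacier": glacier_name,
        "distance_km": [0, 5, 10, 15],
        "thickness_m": [900, 750, 600, 450],
        "bed_elevation_m": [-300, -250, -150, -50],
    }

@app.route("/profile")
def profile():
    name = request.args.get("glacier", "helheim")
    return jsonify(extract_profile(name))

if __name__ == "__main__":
    app.run(debug=True)   # analogous to "python runserver.py" in the project
```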
Miller, John J.; von Huene, Roland E.; Ryan, Holly F.
2014-01-01
In 1946 at Unimak Pass, Alaska, a tsunami destroyed the lighthouse at Scotch Cap, Unimak Island, took 159 lives on the Hawaiian Islands, damaged island coastal facilities across the south Pacific, and destroyed a hut in Antarctica. The tsunami magnitude of 9.3 is comparable to the magnitude 9.1 tsunami that devastated the Tohoku coast of Japan in 2011. Both causative earthquake epicenters occurred in shallow reaches of the subduction zone. Contractile tectonism along the Alaska margin presumably generated the far-field tsunami by producing a seafloor elevation change. However, the Scotch Cap lighthouse was destroyed by a near-field tsunami that was probably generated by a coeval large undersea landslide, yet bathymetric surveys showed no fresh large landslide scar. We investigated this problem by reprocessing five seismic lines, presented here as high-resolution graphic images, both uninterpreted and interpreted, and available for the reader to download. In addition, the processed seismic data for each line are available for download as seismic industry-standard SEG-Y files. One line, processed through prestack depth migration, crosses a 10 × 15 kilometer and 800-meter-high hill presumed previously to be basement, but that instead is composed of stratified rock superimposed on the slope sediment. This image and multibeam bathymetry illustrate a slide block that could have sourced the 1946 near-field tsunami because it is positioned within a distance determined by the time between earthquake shaking and the tsunami arrival at Scotch Cap and is consistent with the local extent of high runup of 42 meters along the adjacent Alaskan coast. The Unimak/Scotch Cap margin is structurally similar to the 2011 Tohoku tsunamigenic margin where a large landslide at the trench, coeval with the Tohoku earthquake, has been documented. Further study can improve our understanding of tsunami sources along Alaska’s erosional margins.
NASA Astrophysics Data System (ADS)
Nakagawa, Y.; Kawahara, S.; Araki, F.; Matsuoka, D.; Ishikawa, Y.; Fujita, M.; Sugimoto, S.; Okada, Y.; Kawazoe, S.; Watanabe, S.; Ishii, M.; Mizuta, R.; Murata, A.; Kawase, H.
2017-12-01
Analyses of large ensemble data are quite useful for producing probabilistic projections of climate change effects. Ensemble data from "+2K future climate simulations" are currently produced by the Japanese national project "Social Implementation Program on Climate Change Adaptation Technology (SI-CAT)" as a part of the database for Policy Decision making for Future climate change (d4PDF; Mizuta et al. 2016) produced by the Program for Risk Information on Climate Change. Those data consist of global warming simulations and regional downscaling simulations. Because the data volumes are too large (a few petabytes) to download to users' local computers, a user-friendly system is required to search and download the data that satisfy users' requests. Under SI-CAT, we are developing a "database system for near-future climate change projections" that provides functions for finding the data users need. The database system for near-future climate change projections mainly consists of a relational database, a data download function, and a user interface. The relational database, which uses PostgreSQL, is the key component among them. Temporally and spatially compressed data are registered in the relational database. As a first step, we developed the relational database for precipitation, temperature, and typhoon track data according to requests by SI-CAT members. The data download function, which uses the Open-source Project for a Network Data Access Protocol (OPeNDAP), provides the ability to download temporally and spatially extracted data based on search results obtained from the relational database. We also developed a web-based user interface for using the relational database and the data download function. A prototype of the database system for near-future climate change projections is currently in operational testing on our local server. The database system for near-future climate change projections will be released on the Data Integration and Analysis System Program (DIAS) in fiscal year 2017. The techniques behind the database system for near-future climate change projections might be quite useful for simulation and observational data in other research fields. We report the current status of development and some case studies of the database system for near-future climate change projections.
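A rough sketch of the search step described above, assuming a PostgreSQL table of temporally and spatially compressed ensemble summaries; the table and column names are invented for illustration and the real SI-CAT schema will differ.

```python
# Illustrative query against a hypothetical table of compressed ensemble
# statistics; table/column names are placeholders, not the SI-CAT schema.
import psycopg2

conn = psycopg2.connect(dbname="climate_db", user="reader", host="localhost")
with conn, conn.cursor() as cur:
    cur.execute(
        """
        SELECT ensemble_member, region, AVG(precip_mm_day)
        FROM ensemble_summary
        WHERE scenario = %s
          AND region = %s
          AND year BETWEEN %s AND %s
        GROUP BY ensemble_member, region
        ORDER BY ensemble_member
        """,
        ("plus2K", "Kanto", 2050, 2060),
    )
    for member, region, mean_precip in cur.fetchall():
        print(member, region, round(mean_precip, 2))
conn.close()
```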
The Astrophysics Source Code Library: Supporting software publication and citation
NASA Astrophysics Data System (ADS)
Allen, Alice; Teuben, Peter
2018-01-01
The Astrophysics Source Code Library (ASCL, ascl.net), established in 1999, is a free online registry for source codes used in research that has appeared in, or been submitted to, peer-reviewed publications. The ASCL is indexed by the SAO/NASA Astrophysics Data System (ADS) and Web of Science and is citable by using the unique ascl ID assigned to each code. In addition to registering codes, the ASCL can house archive files for download and assign them DOIs. The ASCL advocates for software citation on par with article citation, participates in multidisciplinary events such as Force11, OpenCon, and the annual Workshop on Sustainable Software for Science, works with journal publishers, and organizes Special Sessions and Birds of a Feather meetings at national and international conferences such as Astronomical Data Analysis Software and Systems (ADASS), the European Week of Astronomy and Space Science, and AAS meetings. In this presentation, I will discuss some of the challenges of gathering credit for publishing software, and ideas and efforts from other disciplines that may be useful to astronomy.
Wireless acquisition of multi-channel seismic data using the Seismobile system
NASA Astrophysics Data System (ADS)
Isakow, Zbigniew
2017-11-01
This paper describes the wireless acquisition of multi-channel seismic data using a specialized mobile system, Seismobile, designed for subsoil diagnostics along transportation routes. The paper presents examples of multi-channel seismic records obtained during system tests in a 96-channel configuration (four 24-channel landstreamers) with various seismic sources. Seismic waves were generated at the same point using different sources: a 5-kg hammer, a Gisco source with a 90-kg pile-driver, and two other pile-drivers of 45 and 70 kg. Particular attention is paid to the synchronization of source timing, the measurement of geometry by autonomous GPS systems, and the repeatability of triggering measurements, constrained by an accelerometer identifying the seismic waveform. The tests were designed to assess the registration, reliability, and range of the wireless transmission of survey signals. The effectiveness of the automatic numbering of measuring modules was tested as the system components were arranged and fixed to the streamers. After the measurements were completed, the accuracy and speed of data downloading from the internal memory (32 GB SDHC WiFi card) were determined. Additionally, the functionality of automatic battery recharging, the maximum survey duration, and the reliability of battery discharge signalling were assessed.
Code of Federal Regulations, 2013 CFR
2013-04-01
... limited to software, files, data, and prize schedules. (2) Downloads must use secure methodologies that... date of the completion of the download; (iii) The Class II gaming system components to which software was downloaded; (iv) The version(s) of download package and any software downloaded. Logging of the...
Code of Federal Regulations, 2014 CFR
2014-04-01
... limited to software, files, data, and prize schedules. (2) Downloads must use secure methodologies that... date of the completion of the download; (iii) The Class II gaming system components to which software was downloaded; (iv) The version(s) of download package and any software downloaded. Logging of the...
Espino, Jeremy U; Wagner, M; Szczepaniak, C; Tsui, F C; Su, H; Olszewski, R; Liu, Z; Chapman, W; Zeng, X; Ma, L; Lu, Z; Dara, J
2004-09-24
Computer-based outbreak and disease surveillance requires high-quality software that is well-supported and affordable. Developing software in an open-source framework, which entails free distribution and use of software and continuous, community-based software development, can produce software with such characteristics, and can do so rapidly. The objective of the Real-Time Outbreak and Disease Surveillance (RODS) Open Source Project is to accelerate the deployment of computer-based outbreak and disease surveillance systems by writing software and catalyzing the formation of a community of users, developers, consultants, and scientists who support its use. The University of Pittsburgh seeded the Open Source Project by releasing the RODS software under the GNU General Public License. An infrastructure was created, consisting of a website, mailing lists for developers and users, designated software developers, and shared code-development tools. These resources are intended to encourage growth of the Open Source Project community. Progress is measured by assessing website usage, number of software downloads, number of inquiries, number of system deployments, and number of new features or modules added to the code base. During September--November 2003, users generated 5,370 page views of the project website, 59 software downloads, 20 inquiries, one new deployment, and addition of four features. Thus far, health departments and companies have been more interested in using the software as is than in customizing or developing new features. The RODS laboratory anticipates that after initial installation has been completed, health departments and companies will begin to customize the software and contribute their enhancements to the public code base.
Alternative Fuels Data Center: Data Downloads
Rich client data exploration and research prototyping for NOAA
NASA Astrophysics Data System (ADS)
Grossberg, Michael; Gladkova, Irina; Guch, Ingrid; Alabi, Paul; Shahriar, Fazlul; Bonev, George; Aizenman, Hannah
2009-08-01
Data from satellites and model simulations are increasing exponentially as observations and model computing power improve rapidly. Not only is technology producing more data, but the data often come from sources all over the world, and the researchers and scientists who must collaborate are likewise distributed globally. This work presents a software design and technologies that make it possible for groups of researchers to explore large data sets visually together without the need to download those data sets locally. The design also makes it possible to exploit high-performance computing remotely and transparently to analyze and explore large data sets. Computer power, high-quality sensing, and data storage capacity have improved at a rate that outstrips our ability to develop software applications that exploit these resources. It is impractical for NOAA scientists to download all of the satellite and model data that may be relevant to a given problem, and the computing environments available to a given researcher range from supercomputers to only a web browser. There are at least 50 multisensor satellite platforms collecting Earth science data; on the ground and in the sea there are sensor networks, as well as networks of ground-based radar stations, producing a rich real-time stream of data. This new wealth of data would have limited use were it not for the arrival of large-scale high-performance computation provided by parallel computers, clusters, grids, and clouds. With these computational resources and vast archives available, it is now possible to analyze subtle relationships that are global, multi-modal and cut across many data sources. Researchers, educators, and even the general public need tools to access, discover, and use vast data center archives and high-performance computing through a simple yet flexible interface.
Making geospatial data in ASF archive readily accessible
NASA Astrophysics Data System (ADS)
Gens, R.; Hogenson, K.; Wolf, V. G.; Drew, L.; Stern, T.; Stoner, M.; Shapran, M.
2015-12-01
The way geospatial data are searched, managed, processed and used has changed significantly in recent years. A data archive such as the one at the Alaska Satellite Facility (ASF), one of NASA's twelve interlinked Distributed Active Archive Centers (DAACs), used to be searched solely via user interfaces that were specifically developed for its particular archive and data sets. ASF then moved to an application programming interface (API) that defined a set of routines, protocols, and tools for distributing the geospatial information stored in the database in real time. This provided more flexible access to the geospatial data, yet it was up to the user to develop the tools needed to gain more tailored access to the data. We present two new approaches for serving data to users. In response to the recent Nepal earthquake, we developed a data feed for distributing ESA's Sentinel data. Users can subscribe to the data feed and are provided with the relevant metadata the moment a new data set is available for download. The second approach is an Open Geospatial Consortium (OGC) web feature service (WFS). The WFS hosts the metadata along with a direct link from which the data can be downloaded. It uses the open-source GeoServer software (Youngblood and Iacovella, 2013) and provides an interface for including the geospatial information in the archive directly in the user's geographic information system (GIS) as an additional data layer. Both services run on top of a geospatial PostGIS database, an open-source geographic extension for the PostgreSQL object-relational database (Marquez, 2015). Marquez, A., 2015. PostGIS essentials. Packt Publishing, 198 p. Youngblood, B. and Iacovella, S., 2013. GeoServer Beginner's Guide, Packt Publishing, 350 p.
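For illustration, the snippet below shows how a client might query an OGC WFS of the kind described and pull granule metadata as GeoJSON. The endpoint URL, feature type, and the download-link attribute are hypothetical; only the request parameters follow the standard WFS 2.0 interface.

```python
# Minimal sketch: querying an OGC Web Feature Service (WFS) for granule metadata.
# The endpoint URL, typeNames value, and "download_url" property are hypothetical placeholders.
import requests

WFS_ENDPOINT = "https://example.org/geoserver/wfs"  # hypothetical GeoServer instance

params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "asf:sentinel_granules",   # hypothetical feature type
    "outputFormat": "application/json",     # GeoJSON, loadable as a GIS layer
    "count": 100,
}

resp = requests.get(WFS_ENDPOINT, params=params, timeout=60)
resp.raise_for_status()
features = resp.json()["features"]
for feature in features[:5]:
    # Each feature carries metadata plus a direct download link (attribute name assumed).
    print(feature["properties"].get("download_url"))
```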
NASA Astrophysics Data System (ADS)
Collins, L.
2014-12-01
Students in the Environmental Studies major at the University of Southern California fulfill their curriculum requirements by taking a broad range of courses in the social and natural sciences. Climate change is often taught in one or two lectures in these courses, with limited examination of this complex topic. Several upper-division elective courses focus on the science, policy, and social impacts of climate change. In an upper-division course focused on the scientific tools used to determine paleoclimate and predict future climate, I have developed a project where students download, manipulate, and analyze data from the National Climatic Data Center. Students are required to download 100 or more years of daily temperature records and use the statistical program R to analyze those data, calculating daily, monthly, and yearly temperature averages along with changes in the number of extreme hot or cold days (≥90°F and ≤30°F, respectively). In parallel, they examine population growth, city expansion, and changes in transportation, looking for correlations between the social data and trends observed in the temperature data. Students examine trends over time to determine correlations with the urban heat island effect. This project exposes students to "real" data, giving them the tools necessary to critically analyze scientific studies without being experts in the field. Utilizing existing, public, online databases provides almost unlimited, free data, and open-source statistical programs provide a cost-free platform for examining the data, although some in-class time is required to help students navigate initial data importation and analysis. Results presented will highlight data compiled over three years of course projects.
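As a rough sketch of the kind of analysis described (the course itself uses R; this analogous example is in Python, and the column names are assumptions about a daily-station export, not the actual NCDC format):

```python
# Minimal sketch: yearly temperature summaries from a daily station record.
# Column names ("DATE", "TMAX", "TMIN", in degrees F) are assumed for illustration.
import pandas as pd

df = pd.read_csv("station_daily.csv", parse_dates=["DATE"]).set_index("DATE")

by_year = df.groupby(df.index.year)
summary = pd.DataFrame({
    "mean_tmax": by_year["TMAX"].mean(),                              # yearly average daily high
    "days_ge_90F": by_year["TMAX"].apply(lambda s: int((s >= 90).sum())),
    "days_le_30F": by_year["TMIN"].apply(lambda s: int((s <= 30).sum())),
})
print(summary.tail(10))
```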
An Open Source Simulation System
NASA Technical Reports Server (NTRS)
Slack, Thomas
2005-01-01
An investigation into the current state of the art of open-source real-time programming practices. This document covers what technologies are available, how easy they are to obtain, configure, and use, and some performance measurements made on the different systems. A matrix of vendors and their products is included as part of this investigation, but it is not an exhaustive list and represents only a snapshot in time in a field that is changing rapidly. Specifically, three approaches are investigated: 1. Completely open source on generic hardware, downloaded from the net. 2. Open source packaged by a vendor and provided as a free evaluation copy. 3. Proprietary hardware with pre-loaded, source-available proprietary software provided by the vendor for our evaluation.
OpenElectrophy: An Electrophysiological Data- and Analysis-Sharing Framework
Garcia, Samuel; Fourcaud-Trocmé, Nicolas
2008-01-01
Progress in experimental tools and design is allowing the acquisition of increasingly large datasets. Storage, manipulation and efficient analysis of such large amounts of data are now a primary issue. We present OpenElectrophy, an electrophysiological data- and analysis-sharing framework developed to fill this niche. It stores all experiment data and metadata in a single central MySQL database, provides a graphical user interface to visualize and explore the data, and offers a library of functions for user analysis scripting in Python. It implements multiple spike-sorting methods and oscillation detection based on the ridge extraction methods of Roux et al. (2007). OpenElectrophy is open source and is freely available for download at http://neuralensemble.org/trac/OpenElectrophy. PMID:19521545
ERIC Educational Resources Information Center
Kroski, Ellyssa
2008-01-01
A widget displays Web content from external sources and can be embedded into a blog, social network, or other Web page, or downloaded to one's desktop. With widgets--sometimes referred to as gadgets--one can insert video into a blog post, display slideshows on MySpace, get the weather delivered to his mobile device, drag-and-drop his Netflix queue…
Twelve example local data support files are automatically downloaded when the SDMProjectBuilder is installed on a computer. They allow the user to modify values to parameters that impact the release, migration, fate, and transport of microbes within a watershed, and control delin...
A communication efficient and scalable distributed data mining for the astronomical data
NASA Astrophysics Data System (ADS)
Govada, A.; Sahay, S. K.
2016-07-01
In 2020, ∼60 PB of archived data will be accessible to astronomers, but analyzing such a vast amount of data will be a challenging task. This is basically due to the computational model in which data are downloaded from complex, geographically distributed archives to a central site and then analyzed on local systems. Because the data have to be downloaded to the central site, network bandwidth limitations will be a hindrance for scientific discoveries, and analyzing petabyte-scale data on local machines in a centralized manner is itself challenging. The virtual observatory (VO) is a step towards addressing this problem; however, it does not provide a data mining model (Zhang et al., 2004). Adding a distributed data mining layer to the VO can be a solution: astronomers download the extracted knowledge instead of the raw data, and can then either reconstruct the data from the downloaded knowledge or use the knowledge directly for further analysis. Therefore, in this paper, we present Distributed Load Balancing Principal Component Analysis for optimally distributing the computation among the available nodes to minimize the transmission cost and downloading cost for the end user. The experimental analysis is done with Fundamental Plane (FP) data, Gadotti data and the complex Mfeat data. In terms of transmission cost, our approach performs better than those of Qi et al. and Yue et al. The analysis shows that with the complex Mfeat data, about 90% of the downloading cost can be reduced for the end user with negligible loss of accuracy.
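To make the "download knowledge instead of raw data" idea concrete, here is a minimal, generic sketch (plain PCA, not the paper's load-balanced distributed algorithm): the archive ships only the mean, a few principal components, and low-dimensional scores, and the user reconstructs an approximation locally.

```python
# Minimal sketch: ship PCA "knowledge" (mean, components, scores) instead of raw data,
# then reconstruct an approximation at the user's end. Plain PCA, not the paper's
# distributed load-balancing algorithm.
import numpy as np

rng = np.random.default_rng(0)
# Synthetic archive-side table with an underlying low-rank structure plus noise.
latent = rng.normal(size=(10_000, 5))
loadings = rng.normal(size=(5, 50))
X = latent @ loadings + 0.1 * rng.normal(size=(10_000, 50))

# --- archive side: compute and transmit the compressed representation ---
mean = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
components = Vt[:5]                  # (k, 50) principal directions -> sent to the user
scores = (X - mean) @ components.T   # (N, k) low-dimensional scores -> sent to the user

# --- user side: reconstruct an approximation from the downloaded "knowledge" ---
X_approx = scores @ components + mean
rel_err = np.linalg.norm(X - X_approx) / np.linalg.norm(X)
print(f"relative reconstruction error: {rel_err:.3f}")   # small, since the data are near rank 5
```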
Downloading from the OPAC: The Innovative Interfaces Environment.
ERIC Educational Resources Information Center
Spore, Stuart
1991-01-01
Discussion of downloading from online public access catalogs focuses on downloading to MS-DOS microcomputers from the INNOPAC online catalog system. Tools for capturing and postprocessing downloaded files are described, technical and institutional constraints on downloading are addressed, and an innovative program for overcoming such constraints…
AstroBlend: An astrophysical visualization package for Blender
NASA Astrophysics Data System (ADS)
Naiman, J. P.
2016-04-01
The rapid growth in scale and complexity of both computational and observational astrophysics over the past decade necessitates efficient and intuitive methods for examining and visualizing large datasets. Here, I present AstroBlend, an open-source Python library for use within the three dimensional modeling software, Blender. While Blender has been a popular open-source software among animators and visual effects artists, in recent years it has also become a tool for visualizing astrophysical datasets. AstroBlend combines the three dimensional capabilities of Blender with the analysis tools of the widely used astrophysical toolset, yt, to afford both computational and observational astrophysicists the ability to simultaneously analyze their data and create informative and appealing visualizations. The introduction of this package includes a description of features, work flow, and various example visualizations. A website - www.astroblend.com - has been developed which includes tutorials, and a gallery of example images and movies, along with links to downloadable data, three dimensional artistic models, and various other resources.
Tsay, Ming-Yueh; Wu, Tai-Luan; Tseng, Ling-Li
2017-01-01
This study examines the completeness and overlap of coverage in physics of six open access scholarly communication systems, including two search engines (Google Scholar and Microsoft Academic), two aggregate institutional repositories (OAIster and OpenDOAR), and two physics-related open sources (arXiv.org and Astrophysics Data System). The 2001-2013 Nobel Laureates in Physics served as the sample. Bibliographic records of their publications were retrieved and downloaded from each system, and a computer program was developed to perform the analytical tasks of sorting, comparison, elimination, aggregation and statistical calculations. Quantitative analyses and cross-referencing were performed to determine the completeness and overlap of the system coverage of the six open access systems. The results may enable scholars to select an appropriate open access system as an efficient scholarly communication channel, and academic institutions may build institutional repositories or independently create citation index systems in the future. Suggestions on indicators and tools for academic assessment are presented based on the comprehensiveness assessment of each system.
An Open Catalog for Supernova Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guillochon, James; Parrent, Jerod; Kelley, Luke Zoltan
We present the Open Supernova Catalog, an online collection of observations and metadata for presently 36,000+ supernovae and related candidates. The catalog is freely available on the web (https://sne.space), with its main interface having been designed to be a user-friendly, rapidly searchable table accessible on desktop and mobile devices. In addition to the primary catalog table containing supernova metadata, an individual page is generated for each supernova, which displays its available metadata, light curves, and spectra spanning X-ray to radio frequencies. The data presented in the catalog is automatically rebuilt on a daily basis and is constructed by parsing several dozen sources, including the data presented in the supernova literature and from secondary sources such as other web-based catalogs. Individual supernova data is stored in the hierarchical, human- and machine-readable JSON format, with the entirety of each supernova’s data being contained within a single JSON file bearing its name. The setup we present here, which is based on open-source software maintained via git repositories hosted on github, enables anyone to download the entirety of the supernova data set to their home computer in minutes, and to make contributions of their own data back to the catalog via git. As the supernova data set continues to grow, especially in the upcoming era of all-sky synoptic telescopes, which will increase the total number of events by orders of magnitude, we hope that the catalog we have designed will be a valuable tool for the community to analyze both historical and contemporary supernovae.
An Open Catalog for Supernova Data
NASA Astrophysics Data System (ADS)
Guillochon, James; Parrent, Jerod; Kelley, Luke Zoltan; Margutti, Raffaella
2017-01-01
We present the Open Supernova Catalog, an online collection of observations and metadata for presently 36,000+ supernovae and related candidates. The catalog is freely available on the web (https://sne.space), with its main interface having been designed to be a user-friendly, rapidly searchable table accessible on desktop and mobile devices. In addition to the primary catalog table containing supernova metadata, an individual page is generated for each supernova, which displays its available metadata, light curves, and spectra spanning X-ray to radio frequencies. The data presented in the catalog is automatically rebuilt on a daily basis and is constructed by parsing several dozen sources, including the data presented in the supernova literature and from secondary sources such as other web-based catalogs. Individual supernova data is stored in the hierarchical, human- and machine-readable JSON format, with the entirety of each supernova’s data being contained within a single JSON file bearing its name. The setup we present here, which is based on open-source software maintained via git repositories hosted on github, enables anyone to download the entirety of the supernova data set to their home computer in minutes, and to make contributions of their own data back to the catalog via git. As the supernova data set continues to grow, especially in the upcoming era of all-sky synoptic telescopes, which will increase the total number of events by orders of magnitude, we hope that the catalog we have designed will be a valuable tool for the community to analyze both historical and contemporary supernovae.
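Since each supernova lives in its own JSON file, a user who has cloned or downloaded the data can inspect a single event with a few lines of code. The sketch below is illustrative only; the file name and the field names accessed ("photometry", "time", "band", "magnitude") are assumptions about the schema, not documented keys.

```python
# Minimal sketch: read one supernova's JSON file from a local copy of the catalog data.
# The field names below ("photometry", "time", "band", "magnitude") are assumptions
# about the schema, shown for illustration only.
import json

with open("SN2011fe.json") as fh:
    record = json.load(fh)

# Each file contains a single top-level entry named after the supernova.
name, data = next(iter(record.items()))
print(name, "->", sorted(data.keys()))

# Example: pull out a few photometry points, if present (keys assumed).
for point in data.get("photometry", [])[:5]:
    print(point.get("time"), point.get("band"), point.get("magnitude"))
```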
Observer's Interface for Solar System Target Specification
NASA Astrophysics Data System (ADS)
Roman, Anthony; Link, Miranda; Moriarty, Christopher; Stansberry, John A.
2016-10-01
When observing an asteroid or comet with HST, it has been necessary for the observer to manually enter the target's orbital elements into the Astronomer's Proposal Tool (APT). This left room for copy/paste transcription errors from the observer's source of orbital-elements data. In order to address this issue, APT has now been improved with the capability to identify targets in, and download orbital elements from, JPL Horizons. The observer first uses a target name resolver to choose the intended target from the Horizons database, and then downloads the orbital elements from Horizons directly into APT. A manual entry option is still retained if the observer does not wish to use elements from Horizons. This new capability is available for HST observing, and it will also be supported for JWST observing. The poster shows examples of this new interface.
Observer's Interface for Solar System Target Specification
NASA Astrophysics Data System (ADS)
Roman, Anthony; Link, Miranda; Moriarty, Christopher; Stansberry, John A.
2016-01-01
When observing an asteroid or comet with HST, it has been necessary for the observer to manually enter the target's orbital elements into the Astronomer's Proposal Tool (APT). This left room for copy/paste transcription errors from the observer's source of orbital-elements data. In order to address this issue, APT has now been improved with the capability to identify targets in, and download orbital elements from, JPL Horizons. The observer first uses a target name resolver to choose the intended target from the Horizons database, and then downloads the orbital elements from Horizons directly into APT. A manual entry option is still retained if the observer does not wish to use elements from Horizons. This new capability is available for HST observing, and it will also be supported for JWST observing. The poster shows examples of this new interface.
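Outside of APT, orbital elements can also be pulled programmatically from JPL Horizons; the sketch below uses astroquery as an analogous route (this is not how APT talks to Horizons, and the target and epoch are arbitrary examples).

```python
# Minimal sketch: pulling osculating orbital elements from JPL Horizons with astroquery.
# This is only an analogous programmatic route; it is not the APT implementation.
from astroquery.jplhorizons import Horizons

# Arbitrary example: asteroid 1 Ceres, heliocentric elements at an arbitrary epoch (JD).
obj = Horizons(id="1", id_type="smallbody", location="500@10", epochs=2459800.5)
elements = obj.elements()

print(elements.colnames)                       # see which element columns are returned
print(elements["a", "e", "incl", "Omega", "w", "M"])
```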
College Students' Moral Evaluations of Illegal Music Downloading
ERIC Educational Resources Information Center
Jambon, Marc M.; Smetana, Judith G.
2012-01-01
Although unauthorized music downloading is illegal, a majority of college students have downloaded music for free online. Evaluations of illegal music downloading and their association with downloading behavior were examined using social domain theory in a sample of 188 ethnically diverse college students (M[subscript age] = 19.80 years, SD =…
Cytoscape.js: a graph theory library for visualisation and analysis.
Franz, Max; Lopes, Christian T; Huck, Gerardo; Dong, Yue; Sumer, Onur; Bader, Gary D
2016-01-15
Cytoscape.js is an open-source JavaScript-based graph library. Its most common use case is as a visualization software component, so it can be used to render interactive graphs in a web browser. It also can be used in a headless manner, useful for graph operations on a server, such as Node.js. Cytoscape.js is implemented in JavaScript. Documentation, downloads and source code are available at http://js.cytoscape.org. gary.bader@utoronto.ca. © The Author 2015. Published by Oxford University Press.
Schmedes, Sarah E; King, Jonathan L; Budowle, Bruce
2015-01-01
Whole-genome data are invaluable for large-scale comparative genomic studies. Current sequencing technologies have made it feasible to sequence entire bacterial genomes with relative ease and speed, and at a substantially reduced cost per nucleotide, hence cost per genome. More than 3,000 bacterial genomes have been sequenced and are available at the finished status. Publicly available genomes can be readily downloaded; however, there are challenges in verifying the specific supporting data contained within the download and in identifying errors and inconsistencies that may be present within the organizational data content and metadata. AutoCurE, an automated tool for bacterial genome database curation in Excel, was developed to facilitate local database curation of the supporting data that accompany genomes downloaded from the National Center for Biotechnology Information. AutoCurE provides an automated approach to curate local genomic databases by flagging inconsistencies or errors, comparing the downloaded supporting data to the genome reports to verify genome names, RefSeq accession numbers, the presence of archaea, BioProject/UIDs, and sequence file descriptions. Flags are generated for nine metadata fields if there are inconsistencies between the downloaded genomes and the genome reports or if erroneous or missing data are evident. AutoCurE is an easy-to-use tool for local database curation of large-scale genome data prior to downstream analyses.
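A generic sketch of the kind of field-by-field consistency check such a curation tool automates is shown below (AutoCurE itself runs in Excel; the input files and column names here are hypothetical).

```python
# Minimal sketch: flag inconsistencies between downloaded genome metadata and a genome report.
# Both file layouts and column names are hypothetical; this only illustrates the idea of
# the cross-checking that a curation tool like AutoCurE automates.
import pandas as pd

downloaded = pd.read_csv("downloaded_genomes.csv")   # e.g. parsed from downloaded files
report = pd.read_csv("genome_report.csv")            # e.g. a genome report export

merged = downloaded.merge(report, on="refseq_accession", how="left",
                          suffixes=("_dl", "_rep"), indicator=True)

flags = pd.DataFrame(index=merged.index)
flags["missing_in_report"] = merged["_merge"] == "left_only"
for field in ["organism_name", "bioproject_uid"]:
    flags[f"mismatch_{field}"] = (
        merged[f"{field}_dl"].astype(str) != merged[f"{field}_rep"].astype(str)
    )

print(flags.sum())   # number of flagged records per check
```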
Current Status of Cardiovascular Disease-Related Smartphone Apps Downloadable in China.
Xiao, Qian; Lu, Sai; Wang, Yanling; Sun, Liu; Wu, Ying
2017-03-01
Smartphone apps present a great opportunity for the management of cardiovascular disease (CVD) as the adoption of apps becomes increasingly popular in China. Yet, little is known about the status of CVD-related Smartphone apps in the country. The aim of this study was to examine the current status of CVD-related smartphone apps available for download in China. Using CVD-related keywords written either in Chinese or English, the top 6 most popular smartphone app online stores in China were searched in September 2015. The information accountability of the selected apps was assessed with the Silberg scale. The key topic areas identified from the European Guidelines on cardiovascular disease prevention served to determine information coverage of the top 5 downloaded apps. The average Silberg score of 151 apps was 2.87 (out of 9) with most apps not revealing authors' qualifications and information references. There was also a lack of sponsorship disclosure and information update. Moreover, none of the top 5 downloaded apps covered all key areas of CVD management as recommended by the European Guidelines on cardiovascular disease prevention. There was little evidence of health professionals' involvement in the formation of the CVD-related apps. This study identified areas for improvement concerning information accountability and the scope of coverage of CVD-related apps downloadable in China. The findings may guide the future advancement of CVD-related apps and benefit CVD management in China.
ERIC Educational Resources Information Center
Kadhim, Kais A.; Habeeb, Luwaytha S.; Sapar, Ahmad Arifin; Hussin, Zaharah; Abdullah, Muhammad Ridhuan Tony Lim
2013-01-01
Nowadays, online Machine Translation (MT) is used widely with translation software, such as Google and Babylon, being easily available and downloadable. This study aims to test the translation quality of these two machine systems in translating Arabic news headlines into English. 40 Arabic news headlines were selected from three online sources,…
Standard Exchange Format (SHEF) is a documented set of rules for coding data in a defined form. [Page links: current SHEF specification, how to install the software, how to use the software, and source code download (.gz).]
77 FR 15369 - Mobility Fund Phase I Auction GIS Data of Potentially Eligible Census Blocks
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-15
....fcc.gov/auctions/901/ , are the following: Downloadable shapefile Web mapping service MapBox map tiles... GIS software allows you to add this service as a layer to your session or project. 6. MapBox map tiles are cached map tiles of the data. With this open source software approach, these image tiles can be...
DLRS: gene tree evolution in light of a species tree.
Sjöstrand, Joel; Sennblad, Bengt; Arvestad, Lars; Lagergren, Jens
2012-11-15
PrIME-DLRS (or colloquially: 'Delirious') is a phylogenetic software tool to simultaneously infer and reconcile a gene tree given a species tree. It accounts for duplication and loss events, a relaxed molecular clock and is intended for the study of homologous gene families, for example in a comparative genomics setting involving multiple species. PrIME-DLRS uses a Bayesian MCMC framework, where the input is a known species tree with divergence times and a multiple sequence alignment, and the output is a posterior distribution over gene trees and model parameters. PrIME-DLRS is available for Java SE 6+ under the New BSD License, and JAR files and source code can be downloaded from http://code.google.com/p/jprime/. There is also a slightly older C++ version available as a binary package for Ubuntu, with download instructions at http://prime.sbc.su.se. The C++ source code is available upon request. joel.sjostrand@scilifelab.se or jens.lagergren@scilifelab.se. PrIME-DLRS is based on a sound probabilistic model (Åkerborg et al., 2009) and has been thoroughly validated on synthetic and biological datasets (Supplementary Material online).
Dynamics of Change and Change in Dynamics
Boker, Steven M.; Staples, Angela D.; Hu, Yueqin
2017-01-01
A framework is presented for building and testing models of dynamic regulation by categorizing sources of differences between theories of dynamics. A distinction is made between the dynamics of change, i.e., how a system self-regulates on a short time scale, and change in dynamics, i.e., how those dynamics may themselves change over a longer time scale. In order to clarify the categories, models are first built to estimate individual differences in equilibrium value and equilibrium change. Next, models are presented in which there are individual differences in parameters of dynamics such as frequency of fluctuations, damping of fluctuations, and amplitude of fluctuations. Finally, models for within-person change in dynamics over time are proposed. Simulations demonstrating the feasibility of these models are presented, and OpenMx scripts for fitting them have been made available in a downloadable archive, along with scripts to simulate data so that a researcher may test a selected model's feasibility within a chosen experimental design. PMID:29046764
Integration of EGA secure data access into Galaxy.
Hoogstrate, Youri; Zhang, Chao; Senf, Alexander; Bijlard, Jochem; Hiltemann, Saskia; van Enckevort, David; Repo, Susanna; Heringa, Jaap; Jenster, Guido; J A Fijneman, Remond; Boiten, Jan-Willem; A Meijer, Gerrit; Stubbs, Andrew; Rambla, Jordi; Spalding, Dylan; Abeln, Sanne
2016-01-01
High-throughput molecular profiling techniques are routinely generating vast amounts of data for translational medicine studies. Secure access-controlled systems are needed to manage, store, transfer and distribute these data because of their personally identifiable nature. The European Genome-phenome Archive (EGA) was created to facilitate access to, and management of, long-term archives of bio-molecular data. Each data provider is responsible for ensuring that a Data Access Committee is in place to grant access to data stored in the EGA. Moreover, the transfer of data during upload and download is encrypted. ELIXIR, a European research infrastructure for life-science data, initiated a project (2016 Human Data Implementation Study) to understand and document the ELIXIR requirements for secure management of controlled-access data. As part of this project, a full ecosystem was designed to connect archived raw experimental molecular profiling data with interpreted data and the computational workflows, using the CTMM Translational Research IT (CTMM-TraIT) infrastructure http://www.ctmm-trait.nl as an example. Here we present the first outcomes of this project: a framework to enable the download of EGA data to a Galaxy server in a secure way. Galaxy provides an intuitive user interface for molecular biologists and bioinformaticians to run and design data analysis workflows. More specifically, we developed a tool, ega_download_streamer, that can download data securely from EGA into a Galaxy server, where it can subsequently be further processed. This tool allows a user within the browser to run an entire analysis containing sensitive data from EGA, and to make this analysis available for other researchers in a reproducible manner, as shown with a proof-of-concept study. The tool ega_download_streamer is available in the Galaxy tool shed: https://toolshed.g2.bx.psu.edu/view/yhoogstrate/ega_download_streamer.
Integration of EGA secure data access into Galaxy
Hoogstrate, Youri; Zhang, Chao; Senf, Alexander; Bijlard, Jochem; Hiltemann, Saskia; van Enckevort, David; Repo, Susanna; Heringa, Jaap; Jenster, Guido; Fijneman, Remond J.A.; Boiten, Jan-Willem; A. Meijer, Gerrit; Stubbs, Andrew; Rambla, Jordi; Spalding, Dylan; Abeln, Sanne
2016-01-01
High-throughput molecular profiling techniques are routinely generating vast amounts of data for translational medicine studies. Secure access-controlled systems are needed to manage, store, transfer and distribute these data because of their personally identifiable nature. The European Genome-phenome Archive (EGA) was created to facilitate access to, and management of, long-term archives of bio-molecular data. Each data provider is responsible for ensuring that a Data Access Committee is in place to grant access to data stored in the EGA. Moreover, the transfer of data during upload and download is encrypted. ELIXIR, a European research infrastructure for life-science data, initiated a project (2016 Human Data Implementation Study) to understand and document the ELIXIR requirements for secure management of controlled-access data. As part of this project, a full ecosystem was designed to connect archived raw experimental molecular profiling data with interpreted data and the computational workflows, using the CTMM Translational Research IT (CTMM-TraIT) infrastructure http://www.ctmm-trait.nl as an example. Here we present the first outcomes of this project: a framework to enable the download of EGA data to a Galaxy server in a secure way. Galaxy provides an intuitive user interface for molecular biologists and bioinformaticians to run and design data analysis workflows. More specifically, we developed a tool, ega_download_streamer, that can download data securely from EGA into a Galaxy server, where it can subsequently be further processed. This tool allows a user within the browser to run an entire analysis containing sensitive data from EGA, and to make this analysis available for other researchers in a reproducible manner, as shown with a proof-of-concept study. The tool ega_download_streamer is available in the Galaxy tool shed: https://toolshed.g2.bx.psu.edu/view/yhoogstrate/ega_download_streamer. PMID:28232859
Automated bond order assignment as an optimization problem.
Dehof, Anna Katharina; Rurainski, Alexander; Bui, Quang Bao Anh; Böcker, Sebastian; Lenhof, Hans-Peter; Hildebrandt, Andreas
2011-03-01
Numerous applications in computational biology process molecular structures and hence strongly rely not only on correct atomic coordinates but also on correct bond order information. For proteins and nucleic acids, bond orders can be easily deduced, but this does not hold for other types of molecules such as ligands. For ligands, bond order information is not always provided in molecular databases, and thus a variety of approaches tackling this problem have been developed. In this work, we extend an ansatz proposed by Wang et al. that assigns connectivity-based penalty scores and tries to heuristically approximate their optimum. We present three efficient and exact solvers for the problem, replacing the heuristic approximation scheme of the original approach: an A* approach, an integer linear programming (ILP) approach and a fixed-parameter tractable (FPT) approach. We implemented and evaluated the original implementation and our A*, ILP and FPT formulations on the MMFF94 validation suite and the KEGG Drug database. We show the benefit of computing exact solutions of the penalty minimization problem and the additional gain from computing all optimal (or even suboptimal) solutions. We close with a detailed comparison of our methods. The A* and ILP solvers are integrated into the open-source C++ LGPL library BALL and the molecular visualization and modelling tool BALLView, and can be downloaded from our homepage www.ball-project.org. The FPT implementation can be downloaded from http://bio.informatik.uni-jena.de/software/.
Atmospheric Science Data Center
2017-10-12
Data can be ordered online via the ASDC Web Ordering Tool, the CERES Subsetter Ordering Page, and the Earthdata Search Tool; data are available for FTP download.
FIT-MART: Quantum Magnetism with a Gentle Learning Curve
NASA Astrophysics Data System (ADS)
Engelhardt, Larry; Garland, Scott C.; Rainey, Cameron; Freeman, Ray A.
We present a new open-source software package, FIT-MART, that allows non-experts to quickly get started simulating quantum magnetism. FIT-MART can be downloaded as a platform-independent executable Java (JAR) file. It allows the user to define (Heisenberg) Hamiltonians by electronically drawing pictures that represent quantum spins and operators. Sliders are automatically generated to control the values of the parameters in the model, and when the values change, several plots are updated in real time to display both the resulting energy spectra and the equilibrium magnetic properties. Several experimental data sets for real magnetic molecules are included in FIT-MART to allow easy comparison between simulated and experimental data, and FIT-MART users can also import their own data for analysis and compare the goodness of fit for different models.
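For orientation, here is a minimal sketch of the kind of calculation such a tool automates: building the Heisenberg Hamiltonian H = J S1·S2 for two spin-1/2 sites and diagonalizing it exactly. FIT-MART itself is Java-based; this standalone Python sketch only illustrates the underlying model.

```python
# Minimal sketch: two-site spin-1/2 Heisenberg model, H = J * S1 . S2,
# built from Pauli matrices and diagonalized exactly.
import numpy as np

# Spin-1/2 operators (hbar = 1): S = sigma / 2.
sx = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
sy = 0.5 * np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)

def two_site(op_a, op_b):
    """Operator op_a acting on site 1, tensored with op_b on site 2."""
    return np.kron(op_a, op_b)

J = 1.0  # antiferromagnetic exchange constant (arbitrary units)
H = J * (two_site(sx, sx) + two_site(sy, sy) + two_site(sz, sz))

energies = np.linalg.eigvalsh(H)
print(energies.round(6))   # expected: a singlet at -0.75*J and a threefold triplet at +0.25*J
```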
Byrska-Bishop, Marta; Wallace, John; Frase, Alexander T; Ritchie, Marylyn D
2018-01-01
Abstract Motivation BioBin is an automated bioinformatics tool for the multi-level biological binning of sequence variants. Herein, we present a significant update to BioBin which expands the software to facilitate a comprehensive rare variant analysis and incorporates novel features and analysis enhancements. Results In BioBin 2.3, we extend our software tool by implementing statistical association testing, updating the binning algorithm, as well as incorporating novel analysis features providing for a robust, highly customizable, and unified rare variant analysis tool. Availability and implementation The BioBin software package is open source and freely available to users at http://www.ritchielab.com/software/biobin-download Contact mdritchie@geisinger.edu Supplementary information Supplementary data are available at Bioinformatics online. PMID:28968757
High Temporal Resolution Mapping of Seismic Noise Sources Using Heterogeneous Supercomputers
NASA Astrophysics Data System (ADS)
Paitz, P.; Gokhberg, A.; Ermert, L. A.; Fichtner, A.
2017-12-01
The time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems like earthquake fault zones, volcanoes, geothermal and hydrocarbon reservoirs. We present results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service providing seismic noise source maps for Central Europe with high temporal resolution. We use source imaging methods based on the cross-correlation of seismic noise records from all seismic stations available in the region of interest. The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept to provide the interested researchers worldwide with regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for collecting, on a periodic basis, raw seismic records from the European seismic networks, (2) high-performance noise source mapping application responsible for the generation of source maps using cross-correlation of seismic records, (3) back-end infrastructure for the coordination of various tasks and computations, (4) front-end Web interface providing the service to the end-users and (5) data repository. The noise source mapping itself rests on the measurement of logarithmic amplitude ratios in suitably pre-processed noise correlations, and the use of simplified sensitivity kernels. During the implementation we addressed various challenges, in particular, selection of data sources and transfer protocols, automation and monitoring of daily data downloads, ensuring the required data processing performance, design of a general service-oriented architecture for coordination of various sub-systems, and engineering an appropriate data storage solution. The present pilot version of the service implements noise source maps for Switzerland. Extension of the solution to Central Europe is planned for the next project phase.
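As a toy illustration of the cross-correlation step that underlies such noise-source imaging (generic processing only, not the project's actual chain, which involves much more pre-processing and sensitivity kernels):

```python
# Minimal sketch: cross-correlation of two synthetic ambient-noise traces to estimate a lag.
# Generic illustration only; the actual service applies far more elaborate processing.
import numpy as np
from scipy import signal

fs = 20.0                        # sampling rate in Hz (arbitrary)
rng = np.random.default_rng(1)
n = int(600 * fs)                # ten minutes of synthetic noise

common = rng.normal(size=n)      # shared noise wavefield seen by both stations
delay = 150                      # samples by which station B lags station A (arbitrary)
trace_a = common + 0.5 * rng.normal(size=n)
trace_b = np.roll(common, delay) + 0.5 * rng.normal(size=n)

xcorr = signal.correlate(trace_a - trace_a.mean(), trace_b - trace_b.mean(), mode="full")
lags = signal.correlation_lags(trace_a.size, trace_b.size, mode="full")
best = lags[np.argmax(xcorr)]
print(f"lag of trace_b relative to trace_a: {-best / fs:.2f} s (expected {delay / fs:.2f} s)")
```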
The Chandra Source Catalog 2.0: Interfaces
NASA Astrophysics Data System (ADS)
D'Abrusco, Raffaele; Zografou, Panagoula; Tibbetts, Michael; Allen, Christopher E.; Anderson, Craig S.; Budynkiewicz, Jamie A.; Burke, Douglas; Chen, Judy C.; Civano, Francesca Maria; Doe, Stephen M.; Evans, Ian N.; Evans, Janet D.; Fabbiano, Giuseppina; Gibbs, Danny G., II; Glotfelty, Kenny J.; Graessle, Dale E.; Grier, John D.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; Houck, John C.; Lauer, Jennifer L.; Laurino, Omar; Lee, Nicholas P.; Martínez-Galarza, Rafael; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph; McLaughlin, Warren; Morgan, Douglas L.; Mossman, Amy E.; Nguyen, Dan T.; Nichols, Joy S.; Nowak, Michael A.; Paxson, Charles; Plummer, David A.; Primini, Francis Anthony; Rots, Arnold H.; Siemiginowska, Aneta; Sundheim, Beth A.; Van Stone, David W.
2018-01-01
Easy-to-use, powerful public interfaces to access the wealth of information contained in any modern, complex astronomical catalog are fundamental to encourage its usage. In this poster, I present the public interfaces of the second Chandra Source Catalog (CSC2). CSC2 is the most comprehensive catalog of X-ray sources detected by Chandra, thanks to the inclusion of Chandra observations made public through the end of 2014 and to methodological advancements. CSC2 provides measured properties for a large number of sources that sample the X-ray sky at fainter levels than the previous versions of the CSC, thanks to the stacking of single overlapping observations within 1' before source detection. Sources from stacks are then cross-matched, if multiple stacks cover the same area of the sky, to create a list of unique, optimal CSC2 sources. The properties of sources detected in each single stack and each single observation are also measured. The layered structure of the CSC2 catalog is mirrored in the organization of the CSC2 database, which consists of three tables containing all properties for the unique stacked sources (“Master Source”), single stack sources (“Stack Source”) and sources in any single observation (“Observation Source”). These tables contain estimates of the position, flags, extent, significances, fluxes, spectral properties and variability (and associated errors) for all classes of sources. The CSC2 also includes source-region and full-field data products for all master sources, stack sources and observation sources: images, photon event lists, light curves and spectra. CSCview, the main interface to the CSC2 source properties and data products, is a GUI tool that allows users to build queries based on the values of all properties contained in CSC2 tables, query the catalog, inspect the returned table of source properties, and browse and download the associated data products. I will also introduce the suite of command-line interfaces to CSC2 that can be used as an alternative to CSCview, and will present the concept for an additional planned cone-search web-based interface. This work has been supported by NASA under contract NAS 8-03060 to the Smithsonian Astrophysical Observatory for operation of the Chandra X-ray Center.
Design of FPGA ICA for hyperspectral imaging processing
NASA Astrophysics Data System (ADS)
Nordin, Anis; Hsu, Charles C.; Szu, Harold H.
2001-03-01
The remote sensing problem that uses hyperspectral imaging can be transformed into a blind source separation problem. Using this model, hyperspectral imagery can be de-mixed into sub-pixel spectra that indicate the different materials present in the pixel. This can be further used to deduce areas that contain forest, water or biomass, without even knowing the sources that constitute the image. This form of remote sensing allows previously blurred images to show the specific terrain involved in that region. The blind source separation problem can be implemented using an Independent Component Analysis (ICA) algorithm. The ICA algorithm has previously been implemented successfully using software packages such as MATLAB, which has a downloadable version of FastICA. The challenge now lies in implementing it in hardware, or firmware, in order to improve its computational speed. Hardware implementation also solves the insufficient-memory problem encountered by software packages like MATLAB when employing ICA for high-resolution images and a large number of channels. Here, a pipelined firmware solution, realized using FPGAs, is drawn out and simulated using C. Since C code can be translated into HDLs or used directly on the FPGAs, it can be used to simulate the actual hardware implementation. The simulated results of the program are presented here, where seven channels are used to model the 200 different channels involved in hyperspectral imaging.
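As a small, software-only illustration of the unmixing idea (using scikit-learn's FastICA on synthetic mixtures; this is not the paper's FPGA design, and the synthetic data are arbitrary):

```python
# Minimal sketch: blind source separation of synthetic "spectral" mixtures with FastICA.
# Software-only illustration; the paper's contribution is an FPGA pipeline, not this code.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_pixels, n_sources, n_bands = 2000, 3, 7

# Synthetic independent source abundances and an unknown mixing matrix.
sources = rng.laplace(size=(n_pixels, n_sources))
mixing = rng.normal(size=(n_sources, n_bands))
observed = sources @ mixing            # what the 7 "channels" would record per pixel

ica = FastICA(n_components=n_sources, random_state=0)
recovered = ica.fit_transform(observed)   # estimated independent components per pixel

print("observed shape:", observed.shape)
print("recovered components shape:", recovered.shape)
```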
ReadyVax: A new mobile vaccine information app.
Bednarczyk, Robert A; Frew, Paula M; Salmon, Daniel A; Whitney, Ellen; Omer, Saad B
2017-05-04
Vaccine information of varying quality is available through many different sources. We describe the creation, release and utilization of ReadyVax, a new mobile smartphone app providing access to trustworthy, evidence-based vaccine information for a target audience of healthcare providers, pharmacists, and patients (including parents of children). We describe the information content and technical development of ReadyVax. Between the hard launch of the app on February 12, 2015 and October 8, 2016, the app has been downloaded by 5,142 unique users, with 6,841 total app sessions initiated, comprising a total of 15,491 screen views (2.3 screens/session on average). ReadyVax has been downloaded by users in 102 different countries; most users (52%) are from the United States. We are continuing outreach efforts to increase app use, and planning for development of an Android-compatible version of ReadyVax, to increase the available market for the app.
Microcomputer software development facilities
NASA Technical Reports Server (NTRS)
Gorman, J. S.; Mathiasen, C.
1980-01-01
A more efficient and cost-effective method for developing microcomputer software is to utilize a host computer with high-speed peripheral support. Application programs such as cross assemblers, loaders, and simulators are implemented in the host computer for each of the microcomputers for which software development is required. The host computer is configured to operate in a time-share mode for multiple users. The remote terminals, printers, and downloading capabilities provided are based on user requirements. With this configuration a user, either local or remote, can use the host computer for microcomputer software development. Once the software is developed (through the code and modular debug stage) it can be downloaded to the development system or emulator in a test area where hardware/software integration functions can proceed. The microcomputer software program sources reside in the host computer and can be edited, assembled, loaded, and then downloaded as required until the software development project has been completed.
ReadyVax: A new mobile vaccine information app
Bednarczyk, Robert A.; Frew, Paula M.; Salmon, Daniel A.; Whitney, Ellen; Omer, Saad B.
2017-01-01
Abstract Vaccine information of varying quality is available through many different sources. We describe the creation, release and utilization of ReadyVax, a new mobile smartphone app providing access to trustworthy, evidence-based vaccine information for a target audience of healthcare providers, pharmacists, and patients (including parents of children). We describe the information content and technical development of ReadyVax. Between the hard launch of the app on February 12, 2015 and October 8, 2016, the app has been downloaded by 5,142 unique users, with 6,841 total app sessions initiated, comprising a total of 15,491 screen views (2.3 screens/session on average). ReadyVax has been downloaded by users in 102 different countries; most users (52%) are from the United States. We are continuing outreach efforts to increase app use, and planning for development of an Android-compatible version of ReadyVax, to increase the available market for the app. PMID:28059610
ChelomEx: Isotope-assisted discovery of metal chelates in complex media using high-resolution LC-MS.
Baars, Oliver; Morel, François M M; Perlman, David H
2014-11-18
Chelating agents can control the speciation and reactivity of trace metals in biological, environmental, and laboratory-derived media. A large number of trace metals (including Fe, Cu, Zn, Hg, and others) show characteristic isotopic fingerprints that can be exploited for the discovery of known and unknown organic metal complexes and related chelating ligands in very complex sample matrices using high-resolution liquid chromatography mass spectrometry (LC-MS). However, there is currently no free open-source software available for this purpose. We present a novel software tool, ChelomEx, which identifies isotope pattern-matched chromatographic features associated with metal complexes along with free ligands and other related adducts in high-resolution LC-MS data. High sensitivity and exclusion of false positives are achieved by evaluation of the chromatographic coherence of the isotope pattern within chromatographic features, which we demonstrate through the analysis of bacterial culture media. A built-in graphical user interface and compound library aid in identification and efficient evaluation of results. ChelomEx is implemented in MatLab. The source code, binaries for MS Windows and MAC OS X as well as test LC-MS data are available for download at SourceForge ( http://sourceforge.net/projects/chelomex ).
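To illustrate the isotope-fingerprint idea in the abstract (a generic sketch, not the ChelomEx algorithm): look for pairs of chromatographic features whose mass difference matches the 56Fe-54Fe spacing and that co-elute. The feature list and tolerances below are hypothetical.

```python
# Minimal sketch: find co-eluting feature pairs separated by the 56Fe - 54Fe mass difference.
# Generic illustration of isotope-assisted screening, not the ChelomEx algorithm itself.
import numpy as np

FE56_FE54_DELTA = 1.9953      # Da, approximate 56Fe - 54Fe mass difference
MZ_TOL = 0.005                # Da, mass tolerance (arbitrary)
RT_TOL = 0.1                  # minutes, co-elution tolerance (arbitrary)

# Hypothetical feature list: (m/z, retention time in minutes, intensity).
features = np.array([
    [540.1210, 12.30, 1.0e6],
    [542.1163, 12.31, 1.5e7],   # candidate 56Fe partner of the first feature
    [300.2001,  8.10, 5.0e5],
])

candidates = []
for i, (mz_i, rt_i, _) in enumerate(features):
    for j, (mz_j, rt_j, _) in enumerate(features):
        if i != j and abs((mz_j - mz_i) - FE56_FE54_DELTA) < MZ_TOL and abs(rt_j - rt_i) < RT_TOL:
            candidates.append((i, j))

print("candidate isotope-pair features:", candidates)
```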
PLUS: open-source toolkit for ultrasound-guided intervention systems.
Lasso, Andras; Heffter, Tamas; Rankin, Adam; Pinter, Csaba; Ungi, Tamas; Fichtinger, Gabor
2014-10-01
A variety of advanced image analysis methods have been under development for ultrasound-guided interventions. Unfortunately, the transition from an image analysis algorithm to clinical feasibility trials as part of an intervention system requires integration of many components, such as imaging and tracking devices, data processing algorithms, and visualization software. The objective of our paper is to provide a freely available open-source software platform, PLUS (Public software Library for Ultrasound), to facilitate rapid prototyping of ultrasound-guided intervention systems for translational clinical research. PLUS provides a variety of methods for interventional tool pose and ultrasound image acquisition from a wide range of tracking and imaging devices, spatial and temporal calibration, volume reconstruction, simulated image generation, and recording and live streaming of the acquired data. This paper introduces PLUS, explains its functionality and architecture, and presents typical uses and performance in ultrasound-guided intervention systems. PLUS fulfills the essential requirements for the development of ultrasound-guided intervention systems and it aspires to become a widely used translational research prototyping platform. PLUS is freely available as open source software under the BSD license and can be downloaded from http://www.plustoolkit.org.
XAssist: A System for the Automation of X-ray Astrophysics Analysis
NASA Astrophysics Data System (ADS)
Ptak, A.; Griffiths, R.
XAssist is a NASA AISR-funded project for the automation of X-ray astrophysics, with emphasis on galaxies. It is nearing completion of its initially funded effort, and is working well for Chandra and ROSAT data. Initial support for XMM-Newton data is present as well. It is capable of data reprocessing, source detection, and preliminary spatial, temporal and spectral analysis for each source with sufficient counts. The bulk of the system is written in Python, which in turn drives underlying software (CIAO for Chandra data, etc.). Future work will include a GUI (mainly for beginners and status monitoring) and the exposure of at least some functionality as web services. The latter will help XAssist to eventually become part of the VO, making advanced queries possible, such as determining the X-ray fluxes of counterparts to HST or SDSS sources (including the use of unpublished X-ray data), and add the ability of ``on-the-fly'' X-ray processing. Pipelines are running on ROSAT, Chandra and now XMM-Newton observations of galaxies to demonstrate XAssist's capabilities, and the results are available online (in real time) at http://www.xassist.org. XAssist itself as well as various associated projects are available for download.
Walkabout the Galaxy: Podcasting for Informal and Accessible Astronomy Outreach and Education
NASA Astrophysics Data System (ADS)
Colwell, J. E.; Dove, A.; Kehoe, A.; Becker, T. M.
2014-12-01
"Walkabout the Galaxy" is a weekly podcast we have been publishing since May 2014 discussing astronomical news that is in the popular media at the time of recording. Episodes are 25-30 minutes in length and are informal in style: we emphasize one or two basic points while engaging in a free-form discussion of the topic with frequent tangential asides. The target audience is the interested layperson rather than a student, professional, or amateur of astronomy. The informal style is deliberately chosen to keep the podcast from sounding like a classroom lesson and to improve the reach of the podcast to a broader public. Guests have included both experts and laypeople. The number of episode downloads varies by nearly a factor of two from episode to episode (~450 to 750). We will present statistics on downloads and subscriptions, and correlations with episode length, subject matter, and style of episode title. The style of the content cannot influence download statistics, however, and it is not possible to track actual listenership data once the episodes are downloaded. We will discuss lessons learned in creating and producing an educational podcast as well as listener feedback.
XTALOPT version r11: An open-source evolutionary algorithm for crystal structure prediction
NASA Astrophysics Data System (ADS)
Avery, Patrick; Falls, Zackary; Zurek, Eva
2018-01-01
Version 11 of XTALOPT, an evolutionary algorithm for crystal structure prediction, has now been made available for download from the CPC library or the XTALOPT website, http://xtalopt.github.io. Whereas the previous versions of XTALOPT were published under the Gnu Public License (GPL), the current version is made available under the 3-Clause BSD License, which is an open source license that is recognized by the Open Source Initiative. Importantly, the new version can be executed via a command line interface (i.e., it does not require the use of a Graphical User Interface). Moreover, the new version is written as a stand-alone program, rather than an extension to AVOGADRO.
ICIS-NPDES Biosolids Annual Report Download Summary ...
ECHO, Enforcement and Compliance History Online, provides compliance and enforcement information for approximately 800,000 EPA-regulated facilities nationwide. ECHO includes permit, inspection, violation, enforcement action, and penalty information about facilities regulated under the Clean Air Act (CAA) Stationary Source Program, Clean Water Act (CWA) National Pollutant Discharge Elimination System (NPDES), and/or Resource Conservation and Recovery Act (RCRA). Information is also provided on surrounding demographics when available.
NPDES Monitoring Data Download | ECHO | US EPA
ECHO, Enforcement and Compliance History Online, provides compliance and enforcement information for approximately 800,000 EPA-regulated facilities nationwide. ECHO includes permit, inspection, violation, enforcement action, and penalty information about facilities regulated under the Clean Air Act (CAA) Stationary Source Program, Clean Water Act (CWA) National Pollutant Discharge Elimination System (NPDES), and/or Resource Conservation and Recovery Act (RCRA). Information is also provided on surrounding demographics when available.
FRS Download Summary and Data Element Dictionary ...
ECHO, Enforcement and Compliance History Online, provides compliance and enforcement information for approximately 800,000 EPA-regulated facilities nationwide. ECHO includes permit, inspection, violation, enforcement action, and penalty information about facilities regulated under the Clean Air Act (CAA) Stationary Source Program, Clean Water Act (CWA) National Pollutant Discharge Elimination System (NPDES), and/or Resource Conservation and Recovery Act (RCRA). Information is also provided on surrounding demographics when available.
ICIS-NPDES Download Summary and Data Element ...
ECHO, Enforcement and Compliance History Online, provides compliance and enforcement information for approximately 800,000 EPA-regulated facilities nationwide. ECHO includes permit, inspection, violation, enforcement action, and penalty information about facilities regulated under the Clean Air Act (CAA) Stationary Source Program, Clean Water Act (CWA) National Pollutant Discharge Elimination System (NPDES), and/or Resource Conservation and Recovery Act (RCRA). Information also is provided on surrounding demographics when available.
RCRAInfo Download Summary and Data Element Dictionary ...
ECHO, Enforcement and Compliance History Online, provides compliance and enforcement information for approximately 800,000 EPA-regulated facilities nationwide. ECHO includes permit, inspection, violation, enforcement action, and penalty information about facilities regulated under the Clean Air Act (CAA) Stationary Source Program, Clean Water Act (CWA) National Pollutant Discharge Elimination System (NPDES), and/or Resource Conservation and Recovery Act (RCRA). Information also is provided on surrounding demographics when available.
Open source software to control Bioflo bioreactors.
Burdge, David A; Libourel, Igor G L
2014-01-01
Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW.
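The abstract notes that simple experimental protocols can be entered as a CSV scripting file, but it does not document the column layout. Below is a minimal sketch of what such a CSV-driven setpoint replay might look like, assuming a hypothetical three-column schema (time_min, parameter, setpoint); it is an illustration only, not the Bioflo software's actual format or code.

```python
# Minimal sketch of a CSV-driven setpoint protocol, assuming a hypothetical
# three-column layout (time_min, parameter, setpoint); the real Bioflo
# software's CSV schema is not documented in the abstract.
import csv
import time

def run_protocol(path, apply_setpoint):
    """Replay a protocol CSV, calling apply_setpoint(parameter, value) at each step."""
    with open(path, newline="") as handle:
        steps = sorted(csv.DictReader(handle), key=lambda r: float(r["time_min"]))
    start = time.monotonic()
    for step in steps:
        # Wait until the scheduled time for this step has elapsed.
        wait = float(step["time_min"]) * 60 - (time.monotonic() - start)
        if wait > 0:
            time.sleep(wait)
        apply_setpoint(step["parameter"], float(step["setpoint"]))

# Example: log the setpoints instead of sending them to a bioreactor.
# run_protocol("protocol.csv", lambda p, v: print(f"{p} -> {v}"))
```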
Open Source Software to Control Bioflo Bioreactors
Burdge, David A.; Libourel, Igor G. L.
2014-01-01
Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW. PMID:24667828
Providing open access data online to advance malaria research and control.
Moyes, Catherine L; Temperley, William H; Henry, Andrew J; Burgert, Clara R; Hay, Simon I
2013-05-16
To advance research on malaria, the outputs from existing studies and the data that fed into them need to be made freely available. This will ensure new studies can build on the work that has gone before. These data and results also need to be made available to groups who are developing public health policies based on up-to-date evidence. The Malaria Atlas Project (MAP) has collated and geopositioned over 50,000 parasite prevalence and vector occurrence survey records contributed by over 3,000 sources including research groups, government agencies and non-governmental organizations worldwide. This paper describes the results of a project set up to release data gathered, used and generated by MAP. Requests for permission to release data online were sent to 236 groups who had contributed unpublished prevalence (parasite rate) surveys. An online explorer tool was developed so that users can visualize the spatial distribution of the vector and parasite survey data before downloading it. In addition, a consultation group was convened to provide advice on the mode and format of release for data generated by MAP's modelling work. New software was developed to produce a suite of publication-quality map images for download from the internet for use in external publications. More than 40,000 survey records can now be visualized on a set of dynamic maps and downloaded from the MAP website on a free and unrestricted basis. As new data are added and new permissions to release existing data come in, the volume of data available for download will increase. The modelled data output from MAP's own analyses are also available online in a range of formats, including image files and GIS surface data, for use in advocacy, education, further research and to help parameterize or validate other mathematical models.
Baek, Sora; Yoon, Dae Young; Lim, Kyoung Ja; Cho, Young Kwon; Seo, Young Lan; Yun, Eun Joo
2018-05-07
To evaluate and compare the characteristics of the most downloaded and most cited articles in radiology journals. We selected 41 radiology journals that provided lists of both the most downloaded and most cited articles on their websites, and identified the 596 most downloaded articles and 596 most cited articles. We compared the following characteristics of the most downloaded and most cited articles: year of publication, journal title, department of the first author, country of origin, publication type, radiologic subspecialty, radiologic technique and accessibility. Compared to the most cited articles, the most downloaded articles were more frequently review articles (36.1% vs 17.1%, p < 0.05), case reports (5.9% vs 3.2%, p < 0.05), guidelines/consensus statements (5.4% vs 2.7%, p < 0.05), editorials/commentaries (3.7% vs 0.7%, p < 0.05) and pictorial essays (2.0% vs 0.2%, p < 0.05). Compared to the most cited articles, the most downloaded articles more frequently originated from the UK (8.7% vs 5.0%, p < 0.05) and were more frequently free-access articles (46.0% vs 39.4%, p < 0.05). Educational and free-access articles are more frequent among the most downloaded articles. • There was only a small overlap between the most downloaded and most cited articles. • Educational articles were more frequent among the most downloaded articles. • Free-access articles are more frequent among the most downloaded articles.
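As a rough illustration of the proportion comparisons quoted above (for example, review articles making up 36.1% of the 596 most downloaded versus 17.1% of the 596 most cited articles), a chi-square test on the corresponding 2x2 table can be sketched as follows; the counts are reconstructed approximately from the reported percentages and are not the authors' exact data.

```python
# Rough sketch: chi-square test comparing the proportion of review articles
# among the most-downloaded vs most-cited articles, using counts
# reconstructed (approximately) from the percentages quoted in the abstract.
from scipy.stats import chi2_contingency

downloaded_reviews, downloaded_total = 215, 596   # ~36.1% of 596
cited_reviews, cited_total = 102, 596             # ~17.1% of 596

table = [
    [downloaded_reviews, downloaded_total - downloaded_reviews],
    [cited_reviews, cited_total - cited_reviews],
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.1f}, p={p:.2g}")  # p well below 0.05, consistent with the reported difference
```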
Lin, Chin-Feng
2008-02-01
This study examined downloaders' cognitive structures toward Web service quality and their ethical attitudes across various levels of participation in a virtual community. Using four types of free downloads as the research subjects, the researcher found that users with different degrees of participation have different perception preferences. Owners of free-download Web sites can use the findings of this study to develop effective Web marketing strategies.
Managing multicentre clinical trials with open source.
Raptis, Dimitri Aristotle; Mettler, Tobias; Fischer, Michael Alexander; Patak, Michael; Lesurtel, Mickael; Eshmuminov, Dilmurodjon; de Rougemont, Olivier; Graf, Rolf; Clavien, Pierre-Alain; Breitenstein, Stefan
2014-03-01
Multicentre clinical trials are challenged by a high administrative burden, data management pitfalls and costs. This leads to reduced enthusiasm and commitment of the physicians involved and thus to a reluctance to conduct multicentre clinical trials. The purpose of this study was to develop a web-based open source platform to support a multi-centre clinical trial. Using the design science research approach, we developed a web-based, multi-centre clinical trial management system on Drupal, an open source platform distributed under the terms of the General Public License. This system was evaluated by user testing, has supported several completed and on-going clinical trials well, and is available for free download. Open source clinical trial management systems are capable of supporting multi-centre clinical trials by enhancing efficiency, quality of data management and collaboration.
NASA Astrophysics Data System (ADS)
Tsuboi, Seiji; Horikawa, Hiroki; Takaesu, Morifumi; Sueki, Kentaro; Araki, Eiichiro; Sonoda, Akira; Takahashi, Narumi
2016-04-01
The Nankai Trough in southwest Japan is one of the most active subduction zones in the world. Great mega-thrust earthquakes have repeatedly occurred in this area every 100 to 150 years, and the next is anticipated in the not-too-distant future. For the purposes of elucidating the history of mega-splay fault activity, the physical properties of the geological strata and the internal structure of the accretionary prism, and of monitoring diastrophism in this area, the Nankai Trough Seismogenic Zone Experiments (NanTroSEIZE) are being carried out as a part of the Integrated Ocean Drilling Program (IODP). Under NanTroSEIZE, borehole observation systems are planned for installation at several locations. This system, called the Long-Term Borehole Monitoring System, consists of various sensors in the borehole, such as a broadband seismometer, a tiltmeter, a strainmeter, geophones, an accelerometer and a thermometer array, as well as pressure ports for pore-fluid pressure monitoring. The signals from the sensors are transmitted to DONET (Dense Ocean-floor Network System for Earthquakes and Tsunamis) in real time. During IODP Exp. 332 in December 2010, the first Long-Term Borehole Monitoring System was installed at the C0002 borehole site, located 80 km off the Kii Peninsula at 1938 m water depth in the Nankai Trough. We have developed a web application for data download, the Long-Term Borehole Monitoring Data Site. Based on the time period and sensors selected on this site, users can download monitoring waveform data (e.g. broadband seismometer, accelerometer, strainmeter and tiltmeter data) in near real time. The downloadable continuous data are provided in SEED format, which includes sensor information. In addition, before downloading, users can check whether data are available using a data-check function. In this presentation, we show our web application system and discuss our future plans for development of the monitoring data download system.
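The continuous data described above are served in SEED format. As a minimal, hedged sketch of working with such a download, the snippet below reads a locally saved waveform file with the ObsPy package; the file name is a placeholder and ObsPy is an assumption, not part of the JAMSTEC system itself.

```python
# Minimal sketch: reading a downloaded SEED/miniSEED waveform file with ObsPy.
# The file name is hypothetical; ObsPy auto-detects the waveform format.
from obspy import read

stream = read("borehole_waveforms.seed")   # returns a Stream of Traces
for trace in stream:
    print(trace.id, trace.stats.starttime, trace.stats.sampling_rate)
stream.plot()  # quick-look plot of the downloaded channels
```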
SIDECACHE: Information access, management and dissemination framework for web services.
Doderer, Mark S; Burkhardt, Cory; Robbins, Kay A
2011-06-14
Many bioinformatics algorithms and data sets are deployed using web services so that the results can be explored via the Internet and easily integrated into other tools and services. These services often include data from other sites that is accessed either dynamically or through file downloads. Developers of these services face several problems because of the dynamic nature of the information from the upstream services. Many publicly available repositories of bioinformatics data frequently update their information. When such an update occurs, the developers of the downstream service may also need to update. For file downloads, this process is typically performed manually followed by web service restart. Requests for information obtained by dynamic access of upstream sources are sometimes subject to rate restrictions. SideCache provides a framework for deploying web services that integrate information extracted from other databases and from web sources that are periodically updated. This situation occurs frequently in biotechnology where new information is being continuously generated and the latest information is important. SideCache provides several types of services including proxy access and rate control, local caching, and automatic web service updating. We have used the SideCache framework to automate the deployment and updating of a number of bioinformatics web services and tools that extract information from remote primary sources such as NCBI, NCIBI, and Ensembl. The SideCache framework also has been used to share research results through the use of a SideCache derived web service.
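SideCache's own code is not reproduced in the abstract, but two of the ideas it names, local caching and rate control of an upstream source, can be illustrated with a generic sketch like the one below; the class and parameters are invented for illustration and are not SideCache's API.

```python
# Generic sketch of local caching plus rate limiting for an upstream web source
# (illustrative only; this is not SideCache's actual implementation or API).
import time
import urllib.request

class CachedRateLimitedFetcher:
    def __init__(self, min_interval_s=1.0):
        self.min_interval_s = min_interval_s   # rate control: at most one upstream hit per interval
        self.cache = {}                        # local cache: url -> response bytes
        self.last_request = 0.0

    def fetch(self, url):
        if url in self.cache:                  # serve repeated requests locally
            return self.cache[url]
        wait = self.min_interval_s - (time.monotonic() - self.last_request)
        if wait > 0:
            time.sleep(wait)                   # respect the upstream rate restriction
        with urllib.request.urlopen(url) as resp:
            data = resp.read()
        self.last_request = time.monotonic()
        self.cache[url] = data
        return data
```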
JHelioviewer: Open-Source Software for Discovery and Image Access in the Petabyte Age (Invited)
NASA Astrophysics Data System (ADS)
Mueller, D.; Dimitoglou, G.; Langenberg, M.; Pagel, S.; Dau, A.; Nuhn, M.; Garcia Ortiz, J. P.; Dietert, H.; Schmidt, L.; Hughitt, V. K.; Ireland, J.; Fleck, B.
2010-12-01
The unprecedented torrent of data returned by the Solar Dynamics Observatory is both a blessing and a barrier: a blessing for making available data with significantly higher spatial and temporal resolution, but a barrier for scientists to access, browse and analyze them. With such staggering data volume, the data is bound to be accessible only from a few repositories and users will have to deal with data sets effectively immobile and practically difficult to download. From a scientist's perspective this poses three challenges: accessing, browsing and finding interesting data while avoiding the proverbial search for a needle in a haystack. To address these challenges, we have developed JHelioviewer, an open-source visualization software that lets users browse large data volumes both as still images and movies. We did so by deploying an efficient image encoding, storage, and dissemination solution using the JPEG 2000 standard. This solution enables users to access remote images at different resolution levels as a single data stream. Users can view, manipulate, pan, zoom, and overlay JPEG 2000 compressed data quickly, without severe network bandwidth penalties. Besides viewing data, the browser provides third-party metadata and event catalog integration to quickly locate data of interest, as well as an interface to the Virtual Solar Observatory to download science-quality data. As part of the Helioviewer Project, JHelioviewer offers intuitive ways to browse large amounts of heterogeneous data remotely and provides an extensible and customizable open-source platform for the scientific community.
Tiered Human Integrated Sequence Search Databases for Shotgun Proteomics.
Deutsch, Eric W; Sun, Zhi; Campbell, David S; Binz, Pierre-Alain; Farrah, Terry; Shteynberg, David; Mendoza, Luis; Omenn, Gilbert S; Moritz, Robert L
2016-11-04
The results of analysis of shotgun proteomics mass spectrometry data can be greatly affected by the selection of the reference protein sequence database against which the spectra are matched. For many species there are multiple sources from which somewhat different sequence sets can be obtained. This can lead to confusion about which database is best in which circumstances-a problem especially acute in human sample analysis. All sequence databases are genome-based, with sequences for the predicted gene and their protein translation products compiled. Our goal is to create a set of primary sequence databases that comprise the union of sequences from many of the different available sources and make the result easily available to the community. We have compiled a set of four sequence databases of varying sizes, from a small database consisting of only the ∼20,000 primary isoforms plus contaminants to a very large database that includes almost all nonredundant protein sequences from several sources. This set of tiered, increasingly complete human protein sequence databases suitable for mass spectrometry proteomics sequence database searching is called the Tiered Human Integrated Search Proteome set. In order to evaluate the utility of these databases, we have analyzed two different data sets, one from the HeLa cell line and the other from normal human liver tissue, with each of the four tiers of database complexity. The result is that approximately 0.8%, 1.1%, and 1.5% additional peptides can be identified for Tiers 2, 3, and 4, respectively, as compared with the Tier 1 database, at substantially increasing computational cost. This increase in computational cost may be worth bearing if the identification of sequence variants or the discovery of sequences that are not present in the reviewed knowledge base entries is an important goal of the study. We find that it is useful to search a data set against a simpler database, and then check the uniqueness of the discovered peptides against a more complex database. We have set up an automated system that downloads all the source databases on the first of each month and automatically generates a new set of search databases and makes them available for download at http://www.peptideatlas.org/thisp/ .
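The last sentence describes an automated system that downloads all source databases on the first of each month and regenerates the search databases. A minimal sketch of that style of scheduled download-and-merge step is shown below; the source URLs, output naming, and the naive redundancy filter are all assumptions for illustration, not the PeptideAtlas pipeline.

```python
# Minimal sketch of a monthly download-and-merge step for protein FASTA sources.
# The source URLs and output naming are hypothetical placeholders.
import datetime
import urllib.request

SOURCES = {
    "source_a": "https://example.org/source_a.fasta",
    "source_b": "https://example.org/source_b.fasta",
}

def rebuild_if_first_of_month(out_prefix="thisp"):
    today = datetime.date.today()
    if today.day != 1:
        return
    seen = set()
    out_path = f"{out_prefix}_{today:%Y-%m}.fasta"
    with open(out_path, "w") as out:
        for name, url in SOURCES.items():
            text = urllib.request.urlopen(url).read().decode()
            # Naive non-redundancy filter: keep each exact sequence once.
            for record in text.split(">")[1:]:
                header, _, seq = record.partition("\n")
                seq = seq.replace("\n", "")
                if seq not in seen:
                    seen.add(seq)
                    out.write(f">{name}|{header}\n{seq}\n")
```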
Tiered Human Integrated Sequence Search Databases for Shotgun Proteomics
Deutsch, Eric W.; Sun, Zhi; Campbell, David S.; Binz, Pierre-Alain; Farrah, Terry; Shteynberg, David; Mendoza, Luis; Omenn, Gilbert S.; Moritz, Robert L.
2016-01-01
The results of analysis of shotgun proteomics mass spectrometry data can be greatly affected by the selection of the reference protein sequence database against which the spectra are matched. For many species there are multiple sources from which somewhat different sequence sets can be obtained. This can lead to confusion about which database is best in which circumstances – a problem especially acute in human sample analysis. All sequence databases are genome-based, with sequences for the predicted gene and their protein translation products compiled. Our goal is to create a set of primary sequence databases that comprise the union of sequences from many of the different available sources and make the result easily available to the community. We have compiled a set of four sequence databases of varying sizes, from a small database consisting of only the ~20,000 primary isoforms plus contaminants to a very large database that includes almost all non-redundant protein sequences from several sources. This set of tiered, increasingly complete human protein sequence databases suitable for mass spectrometry proteomics sequence database searching is called the Tiered Human Integrated Search Proteome set. In order to evaluate the utility of these databases, we have analyzed two different data sets, one from the HeLa cell line and the other from normal human liver tissue, with each of the four tiers of database complexity. The result is that approximately 0.8%, 1.1%, and 1.5% additional peptides can be identified for Tiers 2, 3, and 4, respectively, as compared with the Tier 1 database, at substantially increasing computational cost. This increase in computational cost may be worth bearing if the identification of sequence variants or the discovery of sequences that are not present in the reviewed knowledge base entries is an important goal of the study. We find that it is useful to search a data set against a simpler database, and then check the uniqueness of the discovered peptides against a more complex database. We have set up an automated system that downloads all the source databases on the first of each month and automatically generates a new set of search databases and makes them available for download at http://www.peptideatlas.org/thisp/. PMID:27577934
Chain of Custody Item Monitor Message Viewer v.1.0 Beta
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwartz, Steven Robert; Fielder, Laura; Hymel, Ross W.
The CoCIM Message Viewer software allows users to connect to and download messages from a Chain of Custody Item Monitor (CoCIM) connected to a serial port on the user’s computer. The downloaded messages are authenticated and displayed in a Graphical User Interface that allows the user a limited degree of sorting and filtering of the downloaded messages as well as the ability to save downloaded files or to open previously downloaded message history files.
ICIS-NPDES Data Set Download | ECHO | US EPA
ECHO, Enforcement and Compliance History Online, provides compliance and enforcement information for approximately 800,000 EPA-regulated facilities nationwide. ECHO includes permit, inspection, violation, enforcement action, and penalty information about facilities regulated under the Clean Air Act (CAA) Stationary Source Program, Clean Water Act (CWA) National Pollutant Discharge Elimination System (NPDES), and/or Resource Conservation and Recovery Act (RCRA). Information also is provided on surrounding demographics when available.
Transport Traffic Analysis for Abusive Infrastructure Characterization
2012-12-14
Introduction: Abusive traffic abounds on the Internet, in the form of email, malware, vulnerability scanners, worms, denial-of-service, drive-by-downloads, scam ... insight is two-fold. First, attackers have a basic requirement to source large amounts of data, be it denial-of-service, scam-hosting, spam, or other ... the network core. This paper explores the power of transport-layer traffic analysis to detect and characterize scam hosting infrastructure, including
EPA FRS Facilities Combined File CSV Download for the Marshall Islands
The Facility Registry System (FRS) identifies facilities, sites, or places subject to environmental regulation or of environmental interest to EPA programs or delegated states. Using rigorous verification and data management procedures, FRS integrates facility data from program national systems, state master facility records, tribal partners, and other federal agencies and provides the Agency with a centrally managed, single source of comprehensive and authoritative information on facilities.
EPA FRS Facilities Single File CSV Download for the Marshall Islands
The Facility Registry System (FRS) identifies facilities, sites, or places subject to environmental regulation or of environmental interest to EPA programs or delegated states. Using rigorous verification and data management procedures, FRS integrates facility data from program national systems, state master facility records, tribal partners, and other federal agencies and provides the Agency with a centrally managed, single source of comprehensive and authoritative information on facilities.
A nomenclator of extant and fossil taxa of the Valvatidae (Gastropoda, Ectobranchia)
Haszprunar, Gerhard
2014-01-01
A compilation of all supra- and (infra-)specific taxa of extant and fossil Valvatidae, a group of freshwater operculate snails, is provided, including taxa initially described in this family and subsequently classified in other families, as well as names containing errors or misspellings. The extensive reference list is directly linked to the available electronic source (digital view or PDF download) of the respective papers. PMID:24578604
ICIS-Air Download Summary and Data Element Dictionary ...
ECHO, Enforcement and Compliance History Online, provides compliance and enforcement information for approximately 800,000 EPA-regulated facilities nationwide. ECHO includes permit, inspection, violation, enforcement action, and penalty information about facilities regulated under the Clean Air Act (CAA) Stationary Source Program, Clean Water Act (CWA) National Pollutant Discharge Elimination System (NPDES), and/or Resource Conservation and Recovery Act (RCRA). Information also is provided on surrounding demographics when available.
ICIS-FE&C Download Summary and Data Element Dictionary ...
ECHO, Enforcement and Compliance History Online, provides compliance and enforcement information for approximately 800,000 EPA-regulated facilities nationwide. ECHO includes permit, inspection, violation, enforcement action, and penalty information about facilities regulated under the Clean Air Act (CAA) Stationary Source Program, Clean Water Act (CWA) National Pollutant Discharge Elimination System (NPDES), and/or Resource Conservation and Recovery Act (RCRA). Information also is provided on surrounding demographics when available.
2014 Offshore Wind Market and Economic Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamilton, Bruce
2014-08-25
The objective of this report is to provide a comprehensive annual assessment of the U.S. offshore wind market. This 3rd annual report focuses on new developments that have occurred in 2014. The report provides stakeholders with a reliable and consistent data source addressing entry barriers and U.S. competitiveness in the offshore wind market. Available for download are both the full report and the report's underlying data.
NPDES Monitoring Data Download Help | ECHO | US EPA
ECHO, Enforcement and Compliance History Online, provides compliance and enforcement information for approximately 800,000 EPA-regulated facilities nationwide. ECHO includes permit, inspection, violation, enforcement action, and penalty information about facilities regulated under the Clean Air Act (CAA) Stationary Source Program, Clean Water Act (CWA) National Pollutant Discharge Elimination System (NPDES), and/or Resource Conservation and Recovery Act (RCRA). Information also is provided on surrounding demographics when available.
NPDES eRule Dashboard Download Help | ECHO | US EPA
ECHO, Enforcement and Compliance History Online, provides compliance and enforcement information for approximately 800,000 EPA-regulated facilities nationwide. ECHO includes permit, inspection, violation, enforcement action, and penalty information about facilities regulated under the Clean Air Act (CAA) Stationary Source Program, Clean Water Act (CWA) National Pollutant Discharge Elimination System (NPDES), and/or Resource Conservation and Recovery Act (RCRA). Information also is provided on surrounding demographics when available.
GeneXplorer: an interactive web application for microarray data visualization and analysis.
Rees, Christian A; Demeter, Janos; Matese, John C; Botstein, David; Sherlock, Gavin
2004-10-01
When publishing large-scale microarray datasets, it is of great value to create supplemental websites where either the full data, or selected subsets corresponding to figures within the paper, can be browsed. We set out to create a CGI application containing many of the features of some of the existing standalone software for the visualization of clustered microarray data. We present GeneXplorer, a web application for interactive microarray data visualization and analysis in a web environment. GeneXplorer allows users to browse a microarray dataset in an intuitive fashion. It provides simple access to microarray data over the Internet and uses only HTML and JavaScript to display graphic and annotation information. It provides radar and zoom views of the data, allows display of the nearest neighbors to a gene expression vector based on their Pearson correlations and provides the ability to search gene annotation fields. The software is released under the permissive MIT Open Source license, and the complete documentation and the entire source code are freely available for download from CPAN http://search.cpan.org/dist/Microarray-GeneXplorer/.
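One feature described above is displaying the nearest neighbours of a gene expression vector by Pearson correlation. The following NumPy sketch shows that computation in isolation; it is illustrative only and is not GeneXplorer's own implementation.

```python
# Sketch: rank genes by Pearson correlation with a query expression vector
# (illustrative NumPy computation, not GeneXplorer's own code).
import numpy as np

def nearest_neighbors(expression, query_index, k=5):
    """expression: genes x conditions matrix; returns indices of the k most correlated genes."""
    x = expression - expression.mean(axis=1, keepdims=True)
    x /= np.linalg.norm(x, axis=1, keepdims=True)   # assumes no constant (zero-variance) rows
    correlations = x @ x[query_index]                # Pearson r of every gene with the query
    order = np.argsort(-correlations)
    return [i for i in order if i != query_index][:k]

# Example with random data: 100 genes measured under 20 conditions.
data = np.random.default_rng(0).normal(size=(100, 20))
print(nearest_neighbors(data, query_index=0))
```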
Integrative Genomics Viewer (IGV): high-performance genomics data visualization and exploration
Thorvaldsdóttir, Helga; Mesirov, Jill P.
2013-01-01
Data visualization is an essential component of genomic data analysis. However, the size and diversity of the data sets produced by today’s sequencing and array-based profiling methods present major challenges to visualization tools. The Integrative Genomics Viewer (IGV) is a high-performance viewer that efficiently handles large heterogeneous data sets, while providing a smooth and intuitive user experience at all levels of genome resolution. A key characteristic of IGV is its focus on the integrative nature of genomic studies, with support for both array-based and next-generation sequencing data, and the integration of clinical and phenotypic data. Although IGV is often used to view genomic data from public sources, its primary emphasis is to support researchers who wish to visualize and explore their own data sets or those from colleagues. To that end, IGV supports flexible loading of local and remote data sets, and is optimized to provide high-performance data visualization and exploration on standard desktop systems. IGV is freely available for download from http://www.broadinstitute.org/igv, under a GNU LGPL open-source license. PMID:22517427
Integrative Genomics Viewer (IGV): high-performance genomics data visualization and exploration.
Thorvaldsdóttir, Helga; Robinson, James T; Mesirov, Jill P
2013-03-01
Data visualization is an essential component of genomic data analysis. However, the size and diversity of the data sets produced by today's sequencing and array-based profiling methods present major challenges to visualization tools. The Integrative Genomics Viewer (IGV) is a high-performance viewer that efficiently handles large heterogeneous data sets, while providing a smooth and intuitive user experience at all levels of genome resolution. A key characteristic of IGV is its focus on the integrative nature of genomic studies, with support for both array-based and next-generation sequencing data, and the integration of clinical and phenotypic data. Although IGV is often used to view genomic data from public sources, its primary emphasis is to support researchers who wish to visualize and explore their own data sets or those from colleagues. To that end, IGV supports flexible loading of local and remote data sets, and is optimized to provide high-performance data visualization and exploration on standard desktop systems. IGV is freely available for download from http://www.broadinstitute.org/igv, under a GNU LGPL open-source license.
Wu, Tai-luan; Tseng, Ling-li
2017-01-01
This study examines the completeness and overlap of coverage in physics of six open access scholarly communication systems, including two search engines (Google Scholar and Microsoft Academic), two aggregate institutional repositories (OAIster and OpenDOAR), and two physics-related open sources (arXiv.org and Astrophysics Data System). The 2001–2013 Nobel Laureates in Physics served as the sample. Bibliographic records of their publications were retrieved and downloaded from each system, and a computer program was developed to perform the analytical tasks of sorting, comparison, elimination, aggregation and statistical calculations. Quantitative analyses and cross-referencing were performed to determine the completeness and overlap of the system coverage of the six open access systems. The results may enable scholars to select an appropriate open access system as an efficient scholarly communication channel, and academic institutions may build institutional repositories or independently create citation index systems in the future. Suggestions on indicators and tools for academic assessment are presented based on the comprehensiveness assessment of each system. PMID:29267327
Python for large-scale electrophysiology.
Spacek, Martin; Blanche, Tim; Swindale, Nicholas
2008-01-01
Electrophysiology is increasingly moving towards highly parallel recording techniques which generate large data sets. We record extracellularly in vivo in cat and rat visual cortex with 54-channel silicon polytrodes, under time-locked visual stimulation, from localized neuronal populations within a cortical column. To help deal with the complexity of generating and analysing these data, we used the Python programming language to develop three software projects: one for temporally precise visual stimulus generation ("dimstim"); one for electrophysiological waveform visualization and spike sorting ("spyke"); and one for spike train and stimulus analysis ("neuropy"). All three are open source and available for download (http://swindale.ecc.ubc.ca/code). The requirements and solutions for these projects differed greatly, yet we found Python to be well suited for all three. Here we present our software as a showcase of the extensive capabilities of Python in neuroscience.
The DIMA web resource--exploring the protein domain network.
Pagel, Philipp; Oesterheld, Matthias; Stümpflen, Volker; Frishman, Dmitrij
2006-04-15
Conserved domains represent essential building blocks of most known proteins. Owing to their role as modular components carrying out specific functions they form a network based both on functional relations and direct physical interactions. We have previously shown that domain interaction networks provide substantially novel information with respect to networks built on full-length protein chains. In this work we present a comprehensive web resource for exploring the Domain Interaction MAp (DIMA), interactively. The tool aims at integration of multiple data sources and prediction techniques, two of which have been implemented so far: domain phylogenetic profiling and experimentally demonstrated domain contacts from known three-dimensional structures. A powerful yet simple user interface enables the user to compute, visualize, navigate and download domain networks based on specific search criteria. http://mips.gsf.de/genre/proj/dima
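Domain phylogenetic profiling, one of the two prediction techniques integrated into DIMA, links domains whose presence/absence patterns across genomes are similar. The toy sketch below illustrates that idea with a Jaccard similarity on binary profiles; the profiles and the scoring choice are illustrative assumptions, not DIMA's actual method.

```python
# Illustrative sketch of domain phylogenetic profiling: two domains with
# similar presence/absence patterns across genomes are predicted to be
# functionally linked. This is not the DIMA implementation.
import numpy as np

def profile_similarity(profile_a, profile_b):
    """Jaccard similarity of two binary presence/absence profiles."""
    a, b = np.asarray(profile_a, bool), np.asarray(profile_b, bool)
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 0.0

# Presence (1) / absence (0) of two hypothetical domains across eight genomes.
domain_x = [1, 1, 0, 1, 0, 1, 1, 0]
domain_y = [1, 1, 0, 1, 0, 0, 1, 0]
print(profile_similarity(domain_x, domain_y))  # high similarity -> candidate functional link
```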
MyPMFs: a simple tool for creating statistical potentials to assess protein structural models.
Postic, Guillaume; Hamelryck, Thomas; Chomilier, Jacques; Stratmann, Dirk
2018-05-29
Evaluating the model quality of protein structures that evolve in environments with particular physicochemical properties requires scoring functions that are adapted to their specific residue compositions and/or structural characteristics. Thus, computational methods developed for structures from the cytosol cannot work properly on membrane or secreted proteins. Here, we present MyPMFs, an easy-to-use tool that allows users to train statistical potentials of mean force (PMFs) on the protein structures of their choice, with all parameters being adjustable. We demonstrate its use by creating an accurate statistical potential for transmembrane protein domains. We also show its usefulness to study the influence of the physical environment on residue interactions within protein structures. Our open-source software is freely available for download at https://github.com/bibip-impmc/mypmfs.
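Statistical potentials of mean force of the kind MyPMFs trains are commonly derived by an inverse-Boltzmann relation, E(d) = -kT ln(P_observed(d) / P_reference(d)). The sketch below illustrates that general relation on toy distance data; the binning, pseudocount, and reference state are assumptions, and this is not MyPMFs' code.

```python
# General inverse-Boltzmann sketch for a distance-dependent statistical potential:
# E(d) = -kT * ln(P_observed(d) / P_reference(d)).
# Illustrates the standard idea only; it is not MyPMFs' implementation.
import numpy as np

def potential_of_mean_force(observed_distances, reference_distances,
                            bins=np.linspace(2.0, 15.0, 27), kT=0.593):
    p_obs, _ = np.histogram(observed_distances, bins=bins, density=True)
    p_ref, _ = np.histogram(reference_distances, bins=bins, density=True)
    eps = 1e-6                      # pseudocount to avoid log(0) (an assumption)
    return -kT * np.log((p_obs + eps) / (p_ref + eps))

rng = np.random.default_rng(1)
observed = rng.normal(5.0, 1.0, 5000)     # toy "observed" residue-pair distances (angstroms)
reference = rng.uniform(2.0, 15.0, 5000)  # toy reference state
print(potential_of_mean_force(observed, reference).round(2))
```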
FRBCAT: The Fast Radio Burst Catalogue
NASA Astrophysics Data System (ADS)
Petroff, E.; Barr, E. D.; Jameson, A.; Keane, E. F.; Bailes, M.; Kramer, M.; Morello, V.; Tabbara, D.; van Straten, W.
2016-09-01
Here, we present a catalogue of known Fast Radio Burst sources in the form of an online catalogue, FRBCAT. The catalogue includes information about the instrumentation used for the observations for each detected burst, the measured quantities from each observation, and model-dependent quantities derived from observed quantities. To aid in consistent comparisons of burst properties such as width and signal-to-noise ratios, we have re-processed all the bursts for which we have access to the raw data, with software which we make available. The originally derived properties are also listed for comparison. The catalogue is hosted online as a MySQL database which can also be downloaded in tabular or plain text format for off-line use. This database will be maintained for use by the community for studies of the Fast Radio Burst population as it grows.
ELM: super-resolution analysis of wide-field images of fluorescent shell structures
NASA Astrophysics Data System (ADS)
Manton, James D.; Xiao, Yao; Turner, Robert D.; Christie, Graham; Rees, Eric J.
2018-07-01
It is often necessary to precisely quantify the size of specimens in biological studies. When measuring feature size in fluorescence microscopy, significant biases can arise due to blurring of its edges if the feature is smaller than the diffraction limit of resolution. This problem is avoided if an equation describing the feature’s entire image is fitted to its image data. In this paper we present open-source software, ELM, which uses this approach to measure the size of spheroidal or cylindrical fluorescent shells with a precision of around 10 nm. This has been used to measure coat protein locations in bacterial spores and cell wall diameter in vegetative bacilli, and may also be valuable in microbiological studies of algae, fungi and viruses. ELM is available for download at https://github.com/quantitativeimaging/ELM.
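The key idea above is to fit an equation describing the blurred image of a shell rather than locating its edges directly. As a schematic illustration (not ELM's actual 2D shell model), the sketch below fits a simple model of two Gaussian-blurred edges to a 1D line profile and recovers the radius with sub-pixel precision.

```python
# Schematic 1D illustration of the approach: fit a model of a blurred shell
# (two Gaussian-blurred edges at +/- r along a line profile through the centre)
# to recover the radius r despite the blur. Not ELM's actual 2D model.
import numpy as np
from scipy.optimize import curve_fit

def blurred_shell(x, radius, sigma, amplitude, background):
    return (amplitude * (np.exp(-(x - radius) ** 2 / (2 * sigma ** 2))
                         + np.exp(-(x + radius) ** 2 / (2 * sigma ** 2)))
            + background)

# Simulate a noisy line profile through a shell of true radius 0.4 um,
# blurred well beyond the shell thickness.
x = np.linspace(-2, 2, 200)                      # position along the profile (um)
rng = np.random.default_rng(2)
data = blurred_shell(x, 0.4, 0.25, 1.0, 0.1) + rng.normal(0, 0.02, x.size)

params, _ = curve_fit(blurred_shell, x, data, p0=[0.5, 0.3, 1.0, 0.0])
print(f"fitted radius: {params[0]:.3f} um")      # close to 0.4 despite the blur
```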
ELM: super-resolution analysis of wide-field images of fluorescent shell structures.
Manton, James; Xiao, Yao; Turner, Robert; Christie, Graham; Rees, Eric
2018-05-04
It is often necessary to precisely quantify the size of specimens in biological studies. When measuring feature size in fluorescence microscopy, significant biases can arise due to blurring of its edges if the feature is smaller than the diffraction limit of resolution. This problem is avoided if an equation describing the feature's entire image is fitted to its image data. In this paper we present open-source software, ELM, which uses this approach to measure the size of spheroidal or cylindrical fluorescent shells with a precision of around 10 nm. This has been used to measure coat protein locations in bacterial spores and cell wall diameter in vegetative bacilli, and may also be valuable in microbiological studies of algae, fungi and viruses. ELM is available for download at https://github.com/quantitativeimaging/ELM. Creative Commons Attribution license.
Implementation of the Regulatory Authority Information System in Egypt
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carson, S.D.; Schetnan, R.; Hasan, A.
2006-07-01
As part of the implementation of a bar-code-based system to track radioactive sealed sources (RSS) in Egypt, the Regulatory Authority Information System Personal Digital Assistant (RAIS PDA) Application was developed to extend the functionality of the International Atomic Energy Agency's (IAEA's) RAIS database by allowing users to download RSS data from the database to a portable PDA equipped with a bar-code scanner. [1, 4] The system allows users in the field to verify radioactive sealed source data, gather radioactive sealed source audit information, and upload that data to the RAIS database. This paper describes the development of the RAIS PDA Application, its features, and how it will be implemented in Egypt. (authors)
Genetically improved BarraCUDA.
Langdon, W B; Lam, Brian Yee Hong
2017-01-01
BarraCUDA is an open source C program which uses the BWA algorithm in parallel with nVidia CUDA to align short next generation DNA sequences against a reference genome. Recently its source code was optimised using "Genetic Improvement". The genetically improved (GI) code is up to three times faster on short paired end reads from The 1000 Genomes Project and 60% more accurate on a short BioPlanet.com GCAT alignment benchmark. GPGPU BarraCUDA running on a single K80 Tesla GPU can align short paired end nextGen sequences up to ten times faster than bwa on a 12 core server. The speed up was such that the GI version was adopted and has been regularly downloaded from SourceForge for more than 12 months.
VESL: The Virtual Earth System Laboratory for Ice Sheet Modeling and Visualization
NASA Astrophysics Data System (ADS)
Cheng, D. L. C.; Larour, E. Y.; Quinn, J. D.; Halkides, D. J.
2017-12-01
We present the Virtual Earth System Laboratory (VESL), a scientific modeling and visualization tool delivered through an integrated web portal. This allows for the dissemination of data, simulation of physical processes, and promotion of climate literacy. The current iteration leverages NASA's Ice Sheet System Model (ISSM), a state-of-the-art polar ice sheet dynamics model developed at the Jet Propulsion Lab and UC Irvine. We utilize the Emscripten source-to-source compiler to convert the C/C++ ISSM engine core to JavaScript, and bundle pre/post-processing JS scripts to be compatible with the existing ISSM Python/Matlab API. Researchers using VESL will be able to effectively present their work for public dissemination with little-to-no additional post-processing. Moreover, the portal allows for real time visualization and editing of models, cloud based computational simulation, and downloads of relevant data. This allows for faster publication in peer-reviewed journals and adaptation of results for educational applications. Through application of this concept to multiple aspects of the Earth System, VESL is able to broaden data applications in the geosciences and beyond. At this stage, we still seek feedback from the greater scientific and public outreach communities regarding the ease of use and feature set of VESL. As we plan its expansion, we aim to achieve more rapid communication and presentation of scientific results.
The Wastewater Information System Tool (TWIST) is a downloadable, user-friendly management tool that will allow state and local health departments to effectively inventory and manage small wastewater treatment systems in their jurisdictions.
JHelioviewer. Time-dependent 3D visualisation of solar and heliospheric data
NASA Astrophysics Data System (ADS)
Müller, D.; Nicula, B.; Felix, S.; Verstringe, F.; Bourgoignie, B.; Csillaghy, A.; Berghmans, D.; Jiggens, P.; García-Ortiz, J. P.; Ireland, J.; Zahniy, S.; Fleck, B.
2017-09-01
Context. Solar observatories are providing the world-wide community with a wealth of data, covering wide time ranges (e.g. Solar and Heliospheric Observatory, SOHO), multiple viewpoints (Solar TErrestrial RElations Observatory, STEREO), and returning large amounts of data (Solar Dynamics Observatory, SDO). In particular, the large volume of SDO data presents challenges; the data are available only from a few repositories, and full-disk, full-cadence data for reasonable durations of scientific interest are difficult to download, due to their size and the download rates available to most users. From a scientist's perspective this poses three problems: accessing, browsing, and finding interesting data as efficiently as possible. Aims: To address these challenges, we have developed JHelioviewer, a visualisation tool for solar data based on the JPEG 2000 compression standard and part of the open source ESA/NASA Helioviewer Project. Since the first release of JHelioviewer in 2009, the scientific functionality of the software has been extended significantly, and the objective of this paper is to highlight these improvements. Methods: The JPEG 2000 standard offers useful new features that facilitate the dissemination and analysis of high-resolution image data and offers a solution to the challenge of efficiently browsing petabyte-scale image archives. The JHelioviewer software is open source, platform independent, and extendable via a plug-in architecture. Results: With JHelioviewer, users can visualise the Sun for any time period between September 1991 and today; they can perform basic image processing in real time, track features on the Sun, and interactively overlay magnetic field extrapolations. The software integrates solar event data and a timeline display. Once an interesting event has been identified, science quality data can be accessed for in-depth analysis. As a first step towards supporting science planning of the upcoming Solar Orbiter mission, JHelioviewer offers a virtual camera model that enables users to set the vantage point to the location of a spacecraft or celestial body at any given time.
NASA Astrophysics Data System (ADS)
Barrie, A.; Gliese, U.; Gershman, D. J.; Avanov, L. A.; Rager, A. C.; Pollock, C. J.; Dorelli, J.
2015-12-01
The Fast Plasma Investigation (FPI) on the Magnetospheric Multiscale mission (MMS) combines data from eight spectrometers, each with four deflection states, into a single map of the sky. Any systematic discontinuity, artifact, noise source, etc. present in this map may be incorrectly interpreted as legitimate data and incorrect conclusions reached. For this reason it is desirable to have all spectrometers return the same output for a given input, and for this output to be low in noise sources or other errors. While many missions use statistical analyses of data to calibrate instruments in flight, this process is difficult with FPI for two reasons: 1. Only a small fraction of high resolution data is downloaded to the ground due to bandwidth limitations and 2: The data that is downloaded is, by definition, scientifically interesting and therefore not ideal for calibration. FPI uses a suite of new tools to calibrate in flight. A new method for detection system ground calibration has been developed involving sweeping the detection threshold to fully define the pulse height distribution. This method has now been extended for use in flight as a means to calibrate MCP voltage and threshold (together forming the operating point) of the Dual Electron Spectrometers (DES) and Dual Ion Spectrometers (DIS). A method of comparing higher energy data (which has low fractional voltage error) to lower energy data (which has a higher fractional voltage error) will be used to calibrate the high voltage outputs. Finally, a comparison of pitch angle distributions will be used to find remaining discrepancies among sensors. Initial flight results from the four MMS observatories will be discussed here. Specifically, data from initial commissioning, inter-instrument cross calibration and interference testing, and initial Phase1A routine calibration results. Success and performance of the in flight calibration as well as deviation from the ground calibration will be discussed.
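The threshold-sweep method described above records, at each detection threshold, the number of pulses above that threshold, so the pulse height distribution follows as the negative derivative of that curve. The sketch below demonstrates this relationship on synthetic pulse heights; it is a conceptual illustration, not FPI flight code or data.

```python
# Sketch of recovering a pulse height distribution from a threshold sweep:
# counts(threshold) records pulses above each threshold, so the distribution
# is the negative derivative of that curve. Synthetic data only.
import numpy as np

rng = np.random.default_rng(3)
pulse_heights = rng.normal(100.0, 20.0, 100_000)        # toy detector pulse heights (arb. units)

thresholds = np.linspace(0, 200, 81)
counts_above = np.array([(pulse_heights > t).sum() for t in thresholds])

# Negative finite difference recovers the (binned) pulse height distribution.
phd = -np.diff(counts_above)
centers = 0.5 * (thresholds[:-1] + thresholds[1:])
print(centers[np.argmax(phd)])   # peak of the recovered distribution (~100)
```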
Multiagency Urban Search Experiment Detector and Algorithm Test Bed
NASA Astrophysics Data System (ADS)
Nicholson, Andrew D.; Garishvili, Irakli; Peplow, Douglas E.; Archer, Daniel E.; Ray, William R.; Swinney, Mathew W.; Willis, Michael J.; Davidson, Gregory G.; Cleveland, Steven L.; Patton, Bruce W.; Hornback, Donald E.; Peltz, James J.; McLean, M. S. Lance; Plionis, Alexander A.; Quiter, Brian J.; Bandstra, Mark S.
2017-07-01
In order to provide benchmark data sets for radiation detector and algorithm development, a particle transport test bed has been created using experimental data as model input and validation. A detailed radiation measurement campaign at the Combined Arms Collective Training Facility in Fort Indiantown Gap, PA (FTIG), USA, provides sample background radiation levels for a variety of materials present at the site (including cinder block, gravel, asphalt, and soil) using long dwell high-purity germanium (HPGe) measurements. In addition, detailed light detection and ranging data and ground-truth measurements inform model geometry. This paper describes the collected data and the application of these data to create background and injected source synthetic data for an arbitrary gamma-ray detection system using particle transport model detector response calculations and statistical sampling. In the methodology presented here, HPGe measurements inform model source terms while detector response calculations are validated via long dwell measurements using 2"×4"×16" NaI(Tl) detectors at a variety of measurement points. A collection of responses, along with sampling methods and interpolation, can be used to create data sets to gauge radiation detector and algorithm (including detection, identification, and localization) performance under a variety of scenarios. Data collected at the FTIG site are available for query, filtering, visualization, and download at muse.lbl.gov.
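The workflow described above combines modelled background and injected-source detector responses and then statistically samples them to produce synthetic data sets. The toy sketch below illustrates that sampling step with Poisson statistics; the response shapes and parameters are placeholders, not the FTIG benchmark responses.

```python
# Toy sketch of generating a synthetic gamma-ray spectrum by Poisson-sampling
# a modelled background response plus a scaled injected-source response.
# The response arrays are placeholders, not the FTIG benchmark data.
import numpy as np

rng = np.random.default_rng(4)
n_channels = 1024
energy = np.arange(n_channels)

# Placeholder mean responses (counts per channel per unit live time).
background_response = 50.0 * np.exp(-energy / 300.0)
source_response = 5.0 * np.exp(-0.5 * ((energy - 662) / 8.0) ** 2)   # toy 662 keV peak

def synthetic_spectrum(live_time_s, source_strength=1.0):
    expected = live_time_s * (background_response + source_strength * source_response)
    return rng.poisson(expected)         # statistical sampling of the combined response

spectrum = synthetic_spectrum(live_time_s=10.0, source_strength=0.5)
print(spectrum.sum(), "total counts")
```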
Scoria: a Python module for manipulating 3D molecular data.
Ropp, Patrick; Friedman, Aaron; Durrant, Jacob D
2017-09-18
Third-party packages have transformed the Python programming language into a powerful computational-biology tool. Package installation is easy for experienced users, but novices sometimes struggle with dependencies and compilers. This presents a barrier that can hinder the otherwise broad adoption of new tools. We present Scoria, a Python package for manipulating three-dimensional molecular data. Unlike similar packages, Scoria requires no dependencies, compilation, or system-wide installation. One can incorporate the Scoria source code directly into their own programs. But Scoria is not designed to compete with other similar packages. Rather, it complements them. Our package leverages others (e.g. NumPy, SciPy), if present, to speed and extend its own functionality. To show its utility, we use Scoria to analyze a molecular dynamics trajectory. Our FootPrint script colors the atoms of one chain by the frequency of their contacts with a second chain. We are hopeful that Scoria will be a useful tool for the computational-biology community. A copy is available for download free of charge (Apache License 2.0) at http://durrantlab.com/scoria/.
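The FootPrint script mentioned above colours the atoms of one chain by the frequency of their contacts with a second chain over a trajectory. The sketch below computes such per-atom contact frequencies directly from coordinate arrays with NumPy; it deliberately avoids guessing at Scoria's own API, so the function and array shapes are illustrative assumptions.

```python
# Sketch of a FootPrint-style analysis: for each atom of chain A, the fraction
# of trajectory frames in which it lies within a cutoff of any chain B atom.
# Plain NumPy on coordinate arrays; this does not use (or guess at) Scoria's API.
import numpy as np

def contact_frequencies(chain_a_frames, chain_b_frames, cutoff=4.0):
    """chain_X_frames: arrays of shape (n_frames, n_atoms_X, 3) in angstroms."""
    n_frames = chain_a_frames.shape[0]
    freq = np.zeros(chain_a_frames.shape[1])
    for a, b in zip(chain_a_frames, chain_b_frames):
        dists = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
        freq += (dists.min(axis=1) < cutoff)   # atom i contacts chain B in this frame
    return freq / n_frames

# Toy trajectory: 20 frames, 30 chain A atoms, 50 chain B atoms.
rng = np.random.default_rng(5)
a = rng.uniform(0, 20, (20, 30, 3))
b = rng.uniform(0, 20, (20, 50, 3))
print(contact_frequencies(a, b).round(2))
```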
Electronic Health Record Application Support Service Enablers.
Neofytou, M S; Neokleous, K; Aristodemou, A; Constantinou, I; Antoniou, Z; Schiza, E C; Pattichis, C S; Schizas, C N
2015-08-01
There is a huge need for open source software solutions in the healthcare domain, given the flexibility, interoperability and resource savings characteristics they offer. In this context, this paper presents the development of three open source libraries - Specific Enablers (SEs) for eHealth applications that were developed under the European project titled "Future Internet Social and Technological Alignment Research" (FI-STAR) funded under the "Future Internet Public Private Partnership" (FI-PPP) program. The three SEs developed under the Electronic Health Record Application Support Service Enablers (EHR-EN) correspond to: a) an Electronic Health Record enabler (EHR SE), b) a patient summary enabler based on the EU project "European patient Summary Open Source services" (epSOS SE) supporting patient mobility and the offering of interoperable services, and c) a Picture Archiving and Communications System (PACS) enabler (PACS SE) based on the dcm4che open source system for the support of medical imaging functionality. The EHR SE follows the HL7 Clinical Document Architecture (CDA) V2.0 and supports the Integrating the Healthcare Enterprise (IHE) profiles (recently awarded in Connectathon 2015). These three FI-STAR platform enablers are designed to facilitate the deployment of innovative applications and value added services in the health care sector. They can be downloaded from the FI-STAR catalogue website. Work in progress focuses on the validation and evaluation scenarios for the proving and demonstration of the usability, applicability and adaptability of the proposed enablers.
Code of Federal Regulations, 2010 CFR
2010-04-01
... OF CLASS II GAMES § 547.12 What are the minimum technical standards for downloading on a Class II... software, files, data, and prize schedules. (2) Downloads of software, games, prize schedules, or other... performed in a manner that will not affect game play. (5) Downloads shall not affect the integrity of...
Code of Federal Regulations, 2011 CFR
2011-04-01
... OF CLASS II GAMES § 547.12 What are the minimum technical standards for downloading on a Class II... software, files, data, and prize schedules. (2) Downloads of software, games, prize schedules, or other... performed in a manner that will not affect game play. (5) Downloads shall not affect the integrity of...
Code of Federal Regulations, 2012 CFR
2012-04-01
... OF CLASS II GAMES § 547.12 What are the minimum technical standards for downloading on a Class II... software, files, data, and prize schedules. (2) Downloads of software, games, prize schedules, or other... performed in a manner that will not affect game play. (5) Downloads shall not affect the integrity of...
Technical Support | Division of Cancer Prevention
To view the live webinar, you will need to have the software, Microsoft Live Meeting, downloaded onto your computer before the event. In most cases, the software will automatically download when you open the program on your system. However, in the event that you need to download it manually, you can access the software at the link below: Download the Microsoft Office Live
Saint: a lightweight integration environment for model annotation.
Lister, Allyson L; Pocock, Matthew; Taschuk, Morgan; Wipat, Anil
2009-11-15
Saint is a web application which provides a lightweight annotation integration environment for quantitative biological models. The system enables modellers to rapidly mark up models with biological information derived from a range of data sources. Saint is freely available for use on the web at http://www.cisban.ac.uk/saint. The web application is implemented in Google Web Toolkit and Tomcat, with all major browsers supported. The Java source code is freely available for download at http://saint-annotate.sourceforge.net. The Saint web server requires an installation of libSBML and has been tested on Linux (32-bit Ubuntu 8.10 and 9.04).
Standard Port-Visit Cost Forecasting Model for U.S. Navy Husbanding Contracts
2009-12-01
Protocol (HTTP) server. 2. MySQL. An open-source database. 3. PHP. A common scripting language used for Web development. E. IMPLEMENTATION OF ... Inc. (2009). MySQL Community Server (Version 5.1) [Software]. Available from http://dev.mysql.com/downloads/ ... The PHP Group (2009). PHP (Version ... Logistics Services; MySQL: My Structured Query Language; NAVSUP: Navy Supply Systems Command; NC: Non-Contract Items; NPS: Naval Postgraduate
... Points To Remember About Hip Replacement Surgery: Hip replacement ... This leaves your hands and arms free for balance or to use crutches. Use a long-handled ...
... Points To Remember About Growth Plate Injuries: Injuries to ... Neurological disorders that cause people to lose their balance and fall. Some inherited disorders. Bone infections. Metabolic ...
... Postpartum Depression Facts ... for herself or her family. What is postpartum depression? Postpartum depression is a mood disorder that can ...
Development of an IHE MRRT-compliant open-source web-based reporting platform.
Pinto Dos Santos, Daniel; Klos, G; Kloeckner, R; Oberle, R; Dueber, C; Mildenberger, P
2017-01-01
To develop a platform that uses structured reporting templates according to the IHE Management of Radiology Report Templates (MRRT) profile, and to implement this platform into clinical routine. The reporting platform uses standard web technologies (HTML / JavaScript and PHP / MySQL) only. Several freely available external libraries were used to simplify the programming. The platform runs on a standard web server, connects with the radiology information system (RIS) and PACS, and is easily accessible via a standard web browser. A prototype platform that allows structured reporting to be easily incorporated into the clinical routine was developed and successfully tested. To date, 797 reports were generated using IHE MRRT-compliant templates (many of them downloaded from the RSNA's radreport.org website). Reports are stored in a MySQL database and are easily accessible for further analyses. Development of an IHE MRRT-compliant platform for structured reporting is feasible using only standard web technologies. All source code will be made available upon request under a free license, and the participation of other institutions in further development is welcome. • A platform for structured reporting using IHE MRRT-compliant templates is presented. • Incorporating structured reporting into clinical routine is feasible. • Full source code will be provided upon request under a free license.
NASA Astrophysics Data System (ADS)
Horikawa, H.; Takaesu, M.; Sueki, K.; Araki, E.; Sonoda, A.; Takahashi, N.; Tsuboi, S.
2015-12-01
The Nankai Trough in southwest Japan is one of the most active subduction zones in the world. Great mega-thrust earthquakes have occurred there repeatedly every 100 to 150 years, and the next event is anticipated in the not-too-distant future. To elucidate the history of mega-splay fault activity, the physical properties of the geological strata and the internal structure of the accretionary prism, and to monitor crustal deformation in this area, the Nankai Trough Seismogenic Zone Experiments (NanTroSEIZE) are being carried out as part of the Integrated Ocean Drilling Program (IODP). Under NanTroSEIZE, borehole observation systems are planned at several locations. Each system, called a Long-Term Borehole Monitoring System, consists of various downhole sensors such as a broadband seismometer, a tiltmeter, a strainmeter, geophones, an accelerometer and a thermometer array, as well as pressure ports for pore-fluid pressure monitoring. The signals from the sensors are transmitted in real time to DONET (Dense Ocean-floor Network System for Earthquakes and Tsunamis). During IODP Expedition 332 in December 2010, the first Long-Term Borehole Monitoring System was installed at borehole site C0002, located 80 km off the Kii Peninsula at 1938 m water depth in the Nankai Trough. We have developed a web application for data download, the Long-Term Borehole Monitoring Data Site (*1). Based on the time window and sensors selected on this site, users can download monitoring waveform data (e.g. broadband seismometer, accelerometer, strainmeter and tiltmeter data) in near real time. The system assembles the requested data for the selected window and sensors and makes the download straightforward. Downloadable continuous data are provided in SEED format, which includes sensor information, and a data-check function lets users verify data availability before downloading. In this presentation, we briefly introduce NanTroSEIZE and then show our web application system. We also discuss our future plans for development of the monitoring data download system. *1 Long-Term Borehole Monitoring Data Site http://join-web.jamstec.go.jp/borehole/borehole_top_e.html
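A minimal sketch of how a time window and sensor selection might be fetched programmatically from such a download site is given below, using only the Python standard library; the endpoint path, parameter names and output file name are hypothetical, since the actual site builds requests through its web form.

    from urllib.parse import urlencode
    from urllib.request import urlopen

    # Hypothetical request: one day of broadband seismometer data in SEED format
    params = urlencode({
        "sensor": "broadband_seismometer",
        "start": "2015-01-01T00:00:00",
        "end": "2015-01-02T00:00:00",
        "format": "seed",
    })
    url = "https://join-web.jamstec.go.jp/borehole/download?" + params  # illustrative URL only

    with urlopen(url) as response, open("C0002_broadband.seed", "wb") as out:
        out.write(response.read())  # save the continuous waveform data locally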
AASG Wells Data for the EGS Test Site Planning and Analysis Task
Augustine, Chad
2013-10-09
AASG Wells Data for the EGS Test Site Planning and Analysis Task. Temperature measurement data obtained from boreholes for the Association of American State Geologists (AASG) geothermal data project. Typically, bottomhole temperatures are recorded from log headers, and this information is provided through a borehole temperature observation service for each state. The service includes header records, well logs, temperature measurements, and other information for each borehole. Information presented in Geothermal Prospector was derived from data aggregated from the borehole temperature observations for all states. For each observation, the given well location was recorded and the best available well identifier (name), temperature and depth were chosen. The "Well Name Source," "Temp. Type" and "Depth Type" attributes indicate the field used from the original service. The data were then cleaned and converted to consistent units. The accuracy of each observation's location, name, temperature or depth was not assessed beyond that originally provided by the service.
- AASG bottomhole temperature datasets were downloaded from repository.usgin.org between May 16th and May 24th, 2013.
- Datasets were cleaned to remove "null" and non-real entries, and data were converted into consistent units across all datasets.
- Methodology for selecting the "best" temperature, depth and well-name attributes from column headers in the AASG BHT datasets (an illustrative implementation appears after this record):
  • Temperature: CorrectedTemperature (best), MeasuredTemperature (next best)
  • Depth: DepthOfMeasurement (best), TrueVerticalDepth (next best), DrillerTotalDepth (last option)
  • Well Name/Identifier: APINo (best), WellName (next best), ObservationURI (last option)
The column headers are as follows:
  • gid = internal unique ID
  • src_state = the state from which the well was downloaded (note: the low-temperature wells in Idaho are coded as "ID_LowTemp", while all other wells are simply the two-character state abbreviation)
  • source_url = the URL for the source WFS service or Excel file
  • temp_c = "best" temperature in Celsius
  • temp_type = indicates whether temp_c comes from the corrected or measured temperature header column in the source document
  • depth_m = "best" depth in meters
  • depth_type = indicates whether depth_m comes from the measured, true vertical, or driller total depth header column in the source document
  • well_name = "best" well name or ID
  • name_src = indicates whether well_name came from the apino, wellname, or observationuri header column in the source document
  • lat_wgs84 = latitude in WGS84
  • lon_wgs84 = longitude in WGS84
  • state = state in which the point is located
  • county = county in which the point is located
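The "best available" column-selection rule described above can be illustrated with pandas; in the sketch below the header names come from the dataset description, while the records and values are invented.

    import numpy as np
    import pandas as pd

    # Toy records using the AASG header fields named above (values are invented)
    df = pd.DataFrame({
        "CorrectedTemperature": [np.nan, 88.0, np.nan],
        "MeasuredTemperature":  [65.5, 90.2, 71.0],
        "DepthOfMeasurement":   [np.nan, 2150.0, np.nan],
        "TrueVerticalDepth":    [1800.0, np.nan, np.nan],
        "DrillerTotalDepth":    [1820.0, 2160.0, 1500.0],
    })

    def first_available(frame, columns):
        """Return (values, source_column) taking the first non-null column in preference order."""
        values = frame[columns[0]].copy()
        source = pd.Series(np.where(values.notna(), columns[0], None), index=frame.index)
        for col in columns[1:]:
            source = source.where(values.notna(), np.where(frame[col].notna(), col, None))
            values = values.fillna(frame[col])
        return values, source

    df["temp_c"], df["temp_type"] = first_available(
        df, ["CorrectedTemperature", "MeasuredTemperature"])
    df["depth_m"], df["depth_type"] = first_available(
        df, ["DepthOfMeasurement", "TrueVerticalDepth", "DrillerTotalDepth"])
    print(df[["temp_c", "temp_type", "depth_m", "depth_type"]])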
Mansell, Stephanie K; Cutts, Steven; Hackney, Isobel; Wood, Martin J; Hawksworth, Kevin; Creer, Dean D; Kilbride, Cherry; Mandal, Swapna
2018-01-01
Introduction Ventilation parameter data from patients receiving home mechanical ventilation can be collected via secure data cards and modem technology. This can then be reviewed by clinicians and ventilator prescriptions adjusted. Typically available measures include tidal volume (VT), leak, respiratory rate, minute ventilation, patient-triggered breaths, achieved pressures and patient compliance. This study aimed to assess the potential impact of ventilator data downloads on the management of patients requiring home non-invasive ventilation (NIV). Methods A longitudinal within-group design with repeated measurements was used. Baseline ventilator data were downloaded, reviewed and adjustments made to optimise ventilation. Leak, VT and compliance data were collected for comparison at the first review and 3–7 weeks later. Ventilator data were monitored and amended remotely via a modem by a consultant physiotherapist between the first review and second appointment. Results Analysis of data from 52 patients showed increased patient compliance (% days used >4 hours) from 90% to 96% (p=0.007), increased usage from 6.53 to 6.94 hours (p=0.211) and a change in VT (9.4 vs 8.7 mL/kg ideal body weight, p=0.022). There was no change in leak following review of NIV prescriptions (mean (SD): 43 (23.4) L/min vs 45 (19.9) L/min, p=0.272). Conclusion Ventilator data downloads, via early remote assessment, can help optimise patient ventilation through identification of modifiable factors, in particular interface leak and ventilator prescriptions. However, a prospective study is required to assess whether using ventilator data downloads provides value in terms of patient outcomes and cost-effectiveness. The presented data will help to inform the design of such a study. PMID:29531743
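For readers who want to see the kind of summary such downloads support, the sketch below computes compliance (percentage of days used for more than 4 hours) and a paired comparison across two visits on invented usage values; the paired t-test is only illustrative and is not necessarily the test used in the study.

    import numpy as np
    from scipy import stats

    # Hypothetical daily usage hours for one week before and after prescription review
    usage_before = np.array([5.9, 0.0, 7.1, 6.5, 4.2, 8.0, 3.9])
    usage_after = np.array([6.8, 5.2, 7.4, 6.9, 4.6, 8.3, 7.0])

    def compliance(hours_per_day, threshold=4.0):
        """Percentage of days with usage above the threshold (here >4 hours)."""
        return 100.0 * np.mean(hours_per_day > threshold)

    print(f"compliance before: {compliance(usage_before):.0f}%")
    print(f"compliance after:  {compliance(usage_after):.0f}%")

    # Paired comparison of usage across the two visits
    t_stat, p_value = stats.ttest_rel(usage_before, usage_after)
    print(f"paired t-test: t={t_stat:.2f}, p={p_value:.3f}")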
Patient health topic page (English and Español) offering EPUB and PDF downloads of "Points To Remember About Reactive Arthritis"; reactive arthritis is described as pain or swelling in... (Only fragments of the page text were recoverable.)
Image download page for "Pancreas Anatomy": small (1586x1534) and large (3172x3068) versions are available to view or download. Description: anatomy of the pancreas; drawing shows... (Only fragments of the page were recoverable.)
Patient health topic page (available in English, Español, 繁體中文, 한국어 and tiếng Việt) offering EPUB and PDF downloads of "Points To Remember About Osteoarthritis"; osteoarthritis is described as a disease that damages the slippery... (Only fragments of the page text were recoverable.)
Alternative Fuels Data Center: Hydrogen Drive
Video page for the Alternative Fuels Data Center's "Hydrogen Drive"; the video can be downloaded in QuickTime (.mov) and Windows Media (.wmv) formats, with a text version and download help available. For more information, contact the Greater Washington Region Clean Cities Coalition. (Only navigation fragments of the page were recoverable.)
"Discoveries in Planetary Sciences": Slide Sets Highlighting New Advances for Astronomy Educators
NASA Astrophysics Data System (ADS)
Brain, David; Schneider, N.; Molaverdikhani, K.; Afsharahmadi, F.
2012-10-01
We present two new features of an ongoing effort to bring recent newsworthy advances in planetary science to undergraduate lecture halls. The effort, called 'Discoveries in Planetary Sciences', summarizes selected recently announced discoveries that are 'too new for textbooks' in the form of 3-slide PowerPoint presentations. The first slide describes the discovery, the second slide discusses the underlying planetary science concepts at a level appropriate for students of 'Astronomy 101', and the third presents the big picture implications of the discovery. A fourth slide includes links to associated press releases, images, and primary sources. This effort is generously sponsored by the Division for Planetary Sciences of the American Astronomical Society, and the slide sets are available at http://dps.aas.org/education/dpsdisc/ for download by undergraduate instructors or any interested party. Several new slide sets have just been released, and we summarize the topics covered. The slide sets are also being translated into languages other than English (including Spanish and Farsi), and we will provide an overview of the translation strategy and process. Finally, we will present web statistics on how many people are using the slide sets, as well as individual feedback from educators.
Normal Female Reproductive Anatomy
Image download page for "Reproductive System, Female, Anatomy": small (1500x1575) and large (3000x3150) versions are available to view or download. Description: anatomy of the female reproductive... (Only fragments of the page were recoverable.)
CADDIS Volume 4. Data Analysis: Download Software
Overview of the data analysis tools available for download on CADDIS. Provides instructions for downloading and installing CADStat, access to a Microsoft Excel macro for computing SSDs, and a brief overview of command-line use of R, a statistical software package.
WARCProcessor: An Integrative Tool for Building and Management of Web Spam Corpora.
Callón, Miguel; Fdez-Glez, Jorge; Ruano-Ordás, David; Laza, Rosalía; Pavón, Reyes; Fdez-Riverola, Florentino; Méndez, Jose Ramón
2017-12-22
In this work we present the design and implementation of WARCProcessor, a novel multiplatform integrative tool aimed at building scientific datasets to facilitate experimentation in web spam research. The developed application allows the user to specify multiple criteria that change the way in which new corpora are generated, whilst reducing the number of repetitive and error-prone tasks related to maintaining existing corpora. To this end, WARCProcessor supports up to six commonly used data sources for web spam research and is able to store the output corpus in standard WARC format together with complementary metadata files. Additionally, the application facilitates the automatic and concurrent download of web sites from the Internet, with options for configuring the depth of links to be followed as well as the behaviour when redirected URLs appear. WARCProcessor provides both an interactive GUI and a command-line utility for execution in the background.
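For readers unfamiliar with the WARC format that WARCProcessor writes, the following minimal Python sketch creates a single WARC response record using the third-party warcio library (which is independent of WARCProcessor); the URL and payload are placeholders.

    from io import BytesIO

    from warcio.statusandheaders import StatusAndHeaders
    from warcio.warcwriter import WARCWriter

    # Write one gzipped WARC response record for a placeholder page
    with open("example.warc.gz", "wb") as output:
        writer = WARCWriter(output, gzip=True)
        payload = BytesIO(b"<html><body>placeholder page</body></html>")
        http_headers = StatusAndHeaders("200 OK", [("Content-Type", "text/html")], protocol="HTTP/1.0")
        record = writer.create_warc_record(
            "http://example.com/", "response", payload=payload, http_headers=http_headers)
        writer.write_record(record)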
Python for Large-Scale Electrophysiology
Spacek, Martin; Blanche, Tim; Swindale, Nicholas
2008-01-01
Electrophysiology is increasingly moving towards highly parallel recording techniques which generate large data sets. We record extracellularly in vivo in cat and rat visual cortex with 54-channel silicon polytrodes, under time-locked visual stimulation, from localized neuronal populations within a cortical column. To help deal with the complexity of generating and analysing these data, we used the Python programming language to develop three software projects: one for temporally precise visual stimulus generation (“dimstim”); one for electrophysiological waveform visualization and spike sorting (“spyke”); and one for spike train and stimulus analysis (“neuropy”). All three are open source and available for download (http://swindale.ecc.ubc.ca/code). The requirements and solutions for these projects differed greatly, yet we found Python to be well suited for all three. Here we present our software as a showcase of the extensive capabilities of Python in neuroscience. PMID:19198646
Kinetic Modeling using BioPAX ontology
Ruebenacker, Oliver; Moraru, Ion. I.; Schaff, James C.; Blinov, Michael L.
2010-01-01
Thousands of biochemical interactions are available for download from curated databases such as Reactome, the Pathway Interaction Database and other sources in the Biological Pathways Exchange (BioPAX) format. However, the BioPAX ontology does not encode the information necessary for kinetic modeling and simulation. The current standard for kinetic modeling is the Systems Biology Markup Language (SBML), but only a small number of models are available in SBML format in public repositories. Additionally, reusing and merging SBML models presents a significant challenge, because often each element has a value only in the context of the given model, and information encoding biological meaning is absent. We describe a software system that enables a variety of operations facilitating the use of BioPAX data to create kinetic models that can be visualized, edited, and simulated using the Virtual Cell (VCell), including improved conversion to SBML (for use with other simulation tools that support this format). PMID:20862270
Calculations of lattice vibrational mode lifetimes using Jazz: a Python wrapper for LAMMPS
NASA Astrophysics Data System (ADS)
Gao, Y.; Wang, H.; Daw, M. S.
2015-06-01
Jazz is a new Python wrapper for LAMMPS [1], implemented to calculate the lifetimes of vibrational normal modes based on forces as calculated for any interatomic potential available in that package. The anharmonic character of the normal modes is analyzed via the Monte Carlo-based moments approximation described in Gao and Daw [2]. It is distributed as open-source software and can be downloaded from the website http://jazz.sourceforge.net/.
Häberle, Johannes; Huemer, Martina
2015-01-01
Implementation of guidelines and assessment of their adaptation is not an extensively investigated process in the field of rare diseases. However, whether targeted recipients are reached and willing and able to follow the recommendations has significant impact on the efficacy of guidelines. In 2012, a guideline for the management of urea cycle disorders (UCDs) has been published. We evaluate the efficacy of implementation, adaptation, and use of the UCD guidelines by applying different strategies. (i) Download statistics from online sources were recorded. (ii) Facilities relevant for the implementation of the guidelines were assessed in pediatric units in Germany and Austria. (iii) The guidelines were evaluated by targeted recipients using the AGREE instrument. (iv) A regional networking-based implementation process was evaluated. (i) Download statistics revealed high access with an increase in downloads over time. (ii) In 18% of hospitals ammonia testing was not available 24/7, and emergency drugs were often not available. (iii) Recipient criticism expressed in the AGREE instrument focused on incomplete inclusion of patients' perspectives. (iv) The implementation process improved the availability of ammonia measurements and access to emergency medication, patient care processes, and cooperation between nonspecialists and specialists. Interest in the UCD guidelines is high and sustained, but more precise targeting of the guidelines is advisable. Surprisingly, many hospitals do not possess all facilities necessary to apply the guidelines. Regional network and awareness campaigns result in the improvement of both facilities and knowledge.
GeneLab Phase 2: Integrated Search Data Federation of Space Biology Experimental Data
NASA Technical Reports Server (NTRS)
Tran, P. B.; Berrios, D. C.; Gurram, M. M.; Hashim, J. C. M.; Raghunandan, S.; Lin, S. Y.; Le, T. Q.; Heher, D. M.; Thai, H. T.; Welch, J. D.;
2016-01-01
The GeneLab project is a science initiative to maximize the scientific return of omics data collected from spaceflight and from ground simulations of microgravity and radiation experiments, supported by a data system for a public bioinformatics repository and collaborative analysis tools for these data. The mission of GeneLab is to maximize the utilization of the valuable biological research resources aboard the ISS by collecting genomic, transcriptomic, proteomic and metabolomic (so-called omics) data to enable the exploration of the molecular network responses of terrestrial biology to space environments using a systems biology approach. All GeneLab data are made available to a worldwide network of researchers through its open-access data system. GeneLab is currently being developed by NASA to support Open Science biomedical research in order to enable the human exploration of space and improve life on Earth. Open access to Phase 1 of the GeneLab Data Systems (GLDS) was implemented in April 2015. Download volumes have grown steadily, mirroring the growth in curated space biology research data sets (61 as of June 2016), now exceeding 10 TB/month, with over 10,000 file downloads since the start of Phase 1. For the period April 2015 to May 2016, the most frequently downloaded data were from studies of Mus musculus (39), followed closely by Arabidopsis thaliana (30), with the remaining downloads roughly equally split across 12 other organisms. GLDS Phase 2 is focusing on interoperability, supporting data federation, including integrated search capabilities, of GLDS-housed data sets with external data sources, such as gene expression data from NIH NCBI's Gene Expression Omnibus (GEO), proteomic data from EBI's PRIDE system, and metagenomic data from Argonne National Laboratory's MG-RAST. GEO and MG-RAST employ specifications for investigation metadata that differ from those used by the GLDS and PRIDE (e.g., ISA-Tab). The GLDS Phase 2 system will implement a Google-like, full-text search engine using a Service-Oriented Architecture, utilizing publicly available RESTful web service Application Programming Interfaces (e.g., the GEO Entrez Programming Utilities) and a Common Metadata Model (CMM) in order to accommodate the different metadata formats of the heterogeneous bioinformatics databases. GLDS Phase 2, with fully implemented capabilities, will be made available to the general public in September 2017.
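As a small illustration of the kind of publicly available RESTful interface mentioned above, the sketch below queries NCBI's Entrez E-utilities for GEO DataSets records; the search term is arbitrary and the snippet is independent of the GeneLab system itself.

    import json
    from urllib.parse import urlencode
    from urllib.request import urlopen

    # Full-text style query against the GEO DataSets database (db=gds) via esearch
    params = urlencode({
        "db": "gds",
        "term": "Mus musculus[Organism] AND spaceflight",
        "retmode": "json",
        "retmax": 20,
    })
    url = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?" + params

    with urlopen(url) as response:
        result = json.load(response)
    print(result["esearchresult"]["idlist"])  # list of matching GEO DataSets identifiers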
Cox, B L; Ludwig, K D; Adamson, E B; Eliceiri, K W; Fain, S B
2018-03-01
In medical imaging, clinicians, researchers and technicians have begun to use 3D printing to create specialized phantoms to replace commercial ones due to their customizable and iterative nature. Presented here is the design of a 3D printed, open-source, reusable magnetic resonance imaging (MRI) phantom, capable of flood-filling, with removable samples for measurements of contrast agent solutions and reference standards, and for use in evaluating acquisition techniques and image reconstruction performance. The phantom was designed using SolidWorks, a computer-aided design software package. The phantom consists of custom and off-the-shelf parts and incorporates an air hole and Luer Lock system to aid in flood filling, a marker for orientation of samples in the filled mode, and bolt and tube holes for assembly. The cost of construction for all materials is under $90. All design files are open-source and available for download. To demonstrate utility, B0 field mapping was performed using a series of gadolinium concentrations in both the unfilled and flood-filled mode. An excellent linear agreement (R² > 0.998) was observed between measured relaxation rates (R1/R2) and gadolinium concentration. The phantom provides a reliable setup to test data acquisition and reconstruction methods and verify physical alignment in alternative-nuclei MRI techniques (e.g. carbon-13 and fluorine-19 MRI). A cost-effective, open-source MRI phantom design for repeated quantitative measurement of contrast agents and reference standards in preclinical research is presented. Specifically, the work is an example of how the emerging technology of 3D printing improves flexibility and access for custom phantom design.
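The linear agreement between relaxation rate and gadolinium concentration reported above amounts to an ordinary least-squares fit; the sketch below uses invented concentrations and R1 values purely to illustrate the calculation.

    import numpy as np
    from scipy import stats

    # Hypothetical gadolinium concentrations (mM) and measured longitudinal relaxation rates R1 (1/s)
    conc = np.array([0.0, 0.25, 0.5, 1.0, 2.0])
    r1 = np.array([0.40, 1.35, 2.41, 4.38, 8.35])

    # Linear relaxivity model: R1 = relaxivity * [Gd] + R1_0
    fit = stats.linregress(conc, r1)
    print(f"relaxivity = {fit.slope:.2f} 1/(s*mM), intercept = {fit.intercept:.2f} 1/s")
    print(f"R^2 = {fit.rvalue**2:.4f}")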
Patient health topic page (available in English, Español, 繁體中文, 한국어 and tiếng Việt) offering EPUB and PDF downloads of "Points To Remember About Rheumatoid Arthritis"; rheumatoid arthritis is described as a disease that causes pain... (Only fragments of the page text were recoverable.)
Code of Federal Regulations, 2014 CFR
2014-07-01
37 CFR 385.11 (Copyright Royalty Board, Library of Congress): rates and terms for making and distributing physical and digital phonorecords, covering interactive streaming and limited downloads of musical works; a limited download is defined as a digital... (Only fragments of the regulation text were recoverable.)
The Internet: How Fast Can You Download?
ERIC Educational Resources Information Center
Shearer, Ron
1997-01-01
Discusses faster modems for downloading information from the Internet that may become available through cable or telephone industries. Topics include ADSL (Asymmetric Digital Subscriber Line) modems; Direct PC--downloading satellite transmissions; Land Multichannel Distribution System (LMDS)--a wireless communication device; and Internet…
Installing a Local Copy of the Reactome Web Site and Knowledgebase
McKay, Sheldon J; Weiser, Joel
2015-01-01
The Reactome project builds, maintains, and publishes a knowledgebase of biological pathways. The information in the knowledgebase is gathered from the experts in the field, peer reviewed, and edited by Reactome editorial staff and then published to the Reactome Web site, http://www.reactome.org (see UNIT 8.7; Croft et al., 2013). The Reactome software is open source and builds on top of other open-source or freely available software. Reactome data and code can be freely downloaded in its entirety and the Web site installed locally. This allows for more flexible interrogation of the data and also makes it possible to add one’s own information to the knowledgebase. PMID:26087747
MX: A beamline control system toolkit
NASA Astrophysics Data System (ADS)
Lavender, William M.
2000-06-01
The development of experimental and beamline control systems for two Collaborative Access Teams at the Advanced Photon Source has resulted in the creation of a portable data acquisition and control toolkit called MX. MX consists of a set of servers, application programs and libraries that enable the creation of command line and graphical user interface applications that may be easily retargeted to new and different kinds of motor and device controllers. The source code for MX is written in ANSI C and Tcl/Tk with interprocess communication via TCP/IP. MX is available for several versions of Unix, Windows 95/98/NT and DOS. It may be downloaded from the web site http://www.imca.aps.anl.gov/mx/.
Shuai, Xin; Pepe, Alberto; Bollen, Johan
2012-01-01
We analyze the online response to the preprint publication of a cohort of 4,606 scientific articles submitted to the preprint database arXiv.org between October 2010 and May 2011. We study three forms of responses to these preprints: downloads on the arXiv.org site, mentions on the social media site Twitter, and early citations in the scholarly record. We perform two analyses. First, we analyze the delay and time span of article downloads and Twitter mentions following submission, to understand the temporal configuration of these reactions and whether one precedes or follows the other. Second, we run regression and correlation tests to investigate the relationship between Twitter mentions, arXiv downloads, and article citations. We find that Twitter mentions and arXiv downloads of scholarly articles follow two distinct temporal patterns of activity, with Twitter mentions having shorter delays and narrower time spans than arXiv downloads. We also find that the volume of Twitter mentions is statistically correlated with arXiv downloads and early citations just months after the publication of a preprint, with a possible bias that favors highly mentioned articles. PMID:23133597
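A correlation test of the kind described, here Spearman's rank correlation on synthetic counts rather than the study's actual data, might look like the following sketch.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Synthetic counts in which downloads loosely drive both Twitter mentions and early citations
    downloads = rng.poisson(200, size=500)
    mentions = rng.poisson(downloads * 0.05)
    citations = rng.poisson(downloads * 0.01)

    rho_dm, p_dm = stats.spearmanr(downloads, mentions)
    rho_dc, p_dc = stats.spearmanr(downloads, citations)
    print(f"downloads vs mentions:  rho={rho_dm:.2f} (p={p_dm:.2g})")
    print(f"downloads vs citations: rho={rho_dc:.2f} (p={p_dc:.2g})")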
NASA Astrophysics Data System (ADS)
Krehbiel, C.; Maiersperger, T.; Friesz, A.; Harriman, L.; Quenzer, R.; Impecoven, K.
2016-12-01
Three major obstacles facing big Earth data users include data storage, management, and analysis. As the amount of satellite remote sensing data increases, so does the need for better data storage and management strategies to exploit the plethora of data now available. Standard GIS tools can help big Earth data users who interact with and analyze increasingly large and diverse datasets. In this presentation we highlight how NASA's Land Processes Distributed Active Archive Center (LP DAAC) is tackling these big Earth data challenges. We provide a real-life use case to describe three tools and services provided by the LP DAAC to more efficiently exploit big Earth data in a GIS environment. First, we describe the Open-source Project for a Network Data Access Protocol (OPeNDAP), which allows calls to specific subsets of data, minimizing the amount of data that a user downloads and improving the efficiency of data downloading and processing. Next, we cover the LP DAAC's Application for Extracting and Exploring Analysis Ready Samples (AppEEARS), a web application interface for extracting and analyzing land remote sensing data. From there, we review an ArcPython toolbox that was developed to provide quality control services for land remote sensing data products. Locating and extracting specific subsets of larger big Earth datasets improves data storage and management efficiency for the end user, and quality control services provide a straightforward interpretation of big Earth data. These tools and services are beneficial to the GIS user community in terms of standardizing workflows and improving data storage, management, and analysis tactics.
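To illustrate the OPeNDAP-style subsetting described above, the sketch below opens a remote dataset lazily with xarray and downloads only a small window; the server URL and variable name are placeholders, not an actual LP DAAC endpoint, and an OPeNDAP-capable backend (e.g. netCDF4) is assumed to be installed.

    import xarray as xr

    # Opening an OPeNDAP URL reads only metadata; values are fetched when a subset is accessed
    url = "https://example-opendap-server.org/path/to/granule"  # placeholder endpoint
    ds = xr.open_dataset(url)

    # Request only a small spatial window of one variable; only this slice is transferred
    subset = ds["NDVI"].isel(time=0).sel(lat=slice(40, 45), lon=slice(-105, -100))
    subset.load()  # triggers the remote read for the selected window only
    subset.to_netcdf("ndvi_subset.nc")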
Gisdon, Florian J; Culka, Martin; Ullmann, G Matthias
2016-10-01
Conjugate peak refinement (CPR) is a powerful and robust method to search for transition states on a molecular potential energy surface. Nevertheless, to the best of our knowledge, the method had so far only been implemented in CHARMM. In this paper, we present PyCPR, a new Python-based implementation of the CPR algorithm within the pDynamo framework. We provide a detailed description of the theory underlying our implementation and discuss the different parts of the implementation. The method is applied to two different problems. First, we illustrate the method by analyzing the gauche to anti-periplanar transition of butane using a semiempirical QM method. Second, we reanalyze the mechanism of a glycyl-radical enzyme, namely 4-hydroxyphenylacetate decarboxylase (HPD), using QM/MM calculations. In the end, we suggest a strategy for using our implementation of the CPR algorithm. The integration of PyCPR into the pDynamo framework allows the combination of CPR with the large variety of methods implemented in pDynamo. PyCPR can be used in combination with quantum mechanical and molecular mechanical methods (and hybrid methods) implemented directly in pDynamo, but also in combination with external programs such as ORCA, using pDynamo as an interface. PyCPR is distributed as free, open-source software and can be downloaded from http://www.bisb.uni-bayreuth.de/index.php?page=downloads . Graphical Abstract PyCPR is a search tool for finding saddle points on the potential energy landscape of a molecular system.
The MapApp Virtual Seabed Explorer
NASA Astrophysics Data System (ADS)
Haxby, W. F.; Ryan, W. B.; Carbotte, S. M.
2003-12-01
MapApp is a downloadable, open source, prototype client application running in a desktop personal computing environment with the capability to explore two hundred million years of global ocean floor geology and geochemistry. It accomplishes the exploration and discovery in an integrated data environment of bathymetry, gravity, magnetic anomalies, reflection profiles, crustal ages, sediment composition, bedrock petrology and chemistry. Exploration is undertaken in a single visual interface with spawned windowpanes that communicate with each other. These panes provide the viewport for charting subsea landscapes, the spreadsheet for examination and manipulation of data discovered either by direct encounter or by query, the notebook for recording and downloading either original data or derived products, and dialog boxes to set parameters for models. All data are real measurements and their metadata reside in relational databases. The data come from decades of marine geological and geophysical surveys, coring, dredging, deep-sea drilling, and submersible dives. The lessons learned include the importance of rigorous data management, the need for quality-control of data accuracy, the discipline to keep the interface simple and intuitive, and the requirement to be functional over large scales of variable spatial and temporal resolution. A technical challenge is the programming difficulties presented by continuously changing versions of the PC client operating systems. The greatest scientific challenge is cost-effective mining of published textural data and convincing competitive researchers to contribute their data that is often already many years old. To retain and expand the user community of students, educators and researchers, we are discovering that it is equally as important to grow content as to add functionality.
Keinki, Christian; Rudolph, Ivonne; Ruetters, Dana; Kuenzel, Ulrike; Lobitz, Jessica; Schaefer, Maike; Hanaya, Hani; Huebner, Jutta
2017-05-04
Given the information-seeking behavior of patients, booklets that can be downloaded from the Internet free of charge are an important source of information, notably for patients with cancer. This study investigated whether information booklets for patients with cancer available on German websites are in accordance with the formal and content criteria of evidence-based information. We compiled content and formal criteria by matching different national and international standards for written patient information in a merged instrument. A catalog with a total of 16 items within 4 categories (quality of the publication, quality of information, quality of information representation, and transparency) was created. Patient information booklets for the most frequent tumor types were collected from the Internet. A total of 52 different patient booklets were downloaded and assessed. Overall, no booklet fulfilled all criteria. The quality of the publications was evaluated with an average value of 1.67, while the quality of the information had a mean value of 1.45, and the quality of information presentation had a similar rating (1.39). The transparency criteria were evaluated lowest, with an average of 1.07. In summary, German booklets for cancer patients have some shortcomings concerning formal and content criteria for evidence-based patient information. The applied requirement catalog is suitable for wide use and may help in quality assurance of health information. It may be used as part of an obligatory external evaluation, which could help improve the quality of health information.
PathVisio 3: an extendable pathway analysis toolbox.
Kutmon, Martina; van Iersel, Martijn P; Bohler, Anwesha; Kelder, Thomas; Nunes, Nuno; Pico, Alexander R; Evelo, Chris T
2015-02-01
PathVisio is a commonly used pathway editor, visualization and analysis software package. Biological pathways have been used by biologists for many years to describe the detailed steps in biological processes. Those powerful, visual representations help researchers to better understand, share and discuss knowledge. Since the first publication of PathVisio in 2008, the original paper has been cited more than 170 times and PathVisio has been used in many different biological studies. As an online editor, PathVisio is also integrated into the community-curated pathway database WikiPathways. Here we present the third version of PathVisio with the newest additions and improvements to the application. The core features of PathVisio are pathway drawing, advanced data visualization and pathway statistics. Additionally, PathVisio 3 introduces a new, powerful extension system that allows other developers to contribute additional functionality in the form of plugins without changing the core application. PathVisio can be downloaded from http://www.pathvisio.org and in 2014 PathVisio 3 was downloaded over 5,500 times. There are already more than 15 plugins available in the central plugin repository. PathVisio is a freely available, open-source tool published under the Apache 2.0 license (http://www.apache.org/licenses/LICENSE-2.0). It is implemented in Java and thus runs on all major operating systems. The code repository is available at http://svn.bigcat.unimaas.nl/pathvisio. The support mailing list for users is available at https://groups.google.com/forum/#!forum/wikipathways-discuss and for developers at https://groups.google.com/forum/#!forum/wikipathways-devel.
AAPT "Why Physics" poster download page: normal- and high-resolution JPEG versions and a Spanish version are available, along with the FED newsletter article "Recruiting Physics Students in High School". (Only navigation fragments of the page were recoverable.)
Expert Involvement Predicts mHealth App Downloads: Multivariate Regression Analysis of Urology Apps
Osório, Luís; Cavadas, Vitor; Fraga, Avelino; Carrasquinho, Eduardo; Cardoso de Oliveira, Eduardo; Castelo-Branco, Miguel; Roobol, Monique J
2016-01-01
Background Urological mobile medical (mHealth) apps are gaining popularity with both clinicians and patients. mHealth is a rapidly evolving and heterogeneous field, with some urology apps being downloaded over 10,000 times and others not at all. The factors that contribute to medical app downloads have yet to be identified, including the hypothetical influence of expert involvement in app development. Objective The objective of our study was to identify predictors of the number of urology app downloads. Methods We reviewed urology apps available in the Google Play Store and collected publicly available data. Multivariate ordinal logistic regression evaluated the effect of publicly available app variables on the number of apps being downloaded. Results Of 129 urology apps eligible for study, only 2 (1.6%) had >10,000 downloads, with half having ≤100 downloads and 4 (3.1%) having none at all. Apps developed with expert urologist involvement (P=.003), optional in-app purchases (P=.01), higher user rating (P<.001), and more user reviews (P<.001) were more likely to be installed. App cost was inversely related to the number of downloads (P<.001). Only data from the Google Play Store and the developers’ websites, but not other platforms, were publicly available for analysis, and the level and nature of expert involvement was not documented. Conclusions The explicit participation of urologists in app development is likely to enhance its chances to have a higher number of downloads. This finding should help in the design of better apps and further promote urologist involvement in mHealth. Official certification processes are required to ensure app quality and user safety. PMID:27421338
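An ordinal (proportional-odds) logistic regression of download band on app characteristics, analogous in spirit to the analysis above, can be sketched with statsmodels on synthetic data; all predictor values, effect sizes and band cut-points below are invented.

    import numpy as np
    import pandas as pd
    from statsmodels.miscmodels.ordinal_model import OrderedModel

    rng = np.random.default_rng(1)
    n = 129  # same order of magnitude as the number of apps reviewed

    # Synthetic app-level predictors (all values are hypothetical)
    apps = pd.DataFrame({
        "expert_involved": rng.integers(0, 2, n),
        "rating": rng.uniform(2.0, 5.0, n),
        "cost_usd": rng.exponential(2.0, n),
    })

    # Ordinal outcome: download band (0 = low, 1 = medium, 2 = high), generated from a latent score
    score = 1.2 * apps["expert_involved"] + 0.8 * apps["rating"] - 0.5 * apps["cost_usd"]
    score = score + rng.logistic(size=n)
    band = pd.cut(score, bins=[-np.inf, 2.0, 4.0, np.inf], labels=[0, 1, 2]).astype(int)

    model = OrderedModel(band, apps, distr="logit")
    result = model.fit(method="bfgs", disp=False)
    print(result.summary())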
Brennan, Peter A; Habib, Ahmed
2011-10-01
A large number of papers related to oral and maxillofacial surgery are published in many specialist journals. With the ever-increasing use of the internet it is easy to download them as part of a journal subscription on a fee per paper basis, or in some cases for free. Online access to the British Journal of Oral and Maxillofacial Surgery (BJOMS) is free to British Association (BAOMS) members with a $30 fee per paper download for non-members. Many colleagues use the online version of the journal, and this provides valuable information about downloading trends. Other data on articles that have been cited in subsequent publications are also readily available, and they form the basis for the calculation of a journal's impact factor. We evaluated the top 50 downloaded papers from the BJOMS website in 2010 to ascertain which articles were being read online. We also obtained data on the number of citations for papers published in 2009-2010 to see whether these papers were similar to the articles being downloaded. In 2010 there were over 360000 downloaded articles. The most popular papers were leading articles, reviews, and full length articles; only one short communication featured in the top 50 downloads. The papers most cited in subsequent publications were full length articles and leading articles or reviews, which represent 80% of the total citations of the 50 papers. Ten papers were in both the top 50 downloaded and most cited lists. We discuss the implications of this study for the journal and our readers. Copyright © 2011 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Sarntivijai, Sirarat; Vasant, Drashtti; Jupp, Simon; Saunders, Gary; Bento, A Patrícia; Gonzalez, Daniel; Betts, Joanna; Hasan, Samiul; Koscielny, Gautier; Dunham, Ian; Parkinson, Helen; Malone, James
2016-01-01
The Centre for Therapeutic Target Validation (CTTV - https://www.targetvalidation.org/) was established to generate therapeutic target evidence from genome-scale experiments and analyses. CTTV aims to support the validity of therapeutic targets by integrating existing and newly-generated data. Data integration has been achieved in some resources by mapping metadata such as disease and phenotypes to the Experimental Factor Ontology (EFO). Additionally, the relationship between ontology descriptions of rare and common diseases and their phenotypes can offer insights into shared biological mechanisms and potential drug targets. Ontologies are not ideal for representing the sometimes associated type relationship required. This work addresses two challenges; annotation of diverse big data, and representation of complex, sometimes associated relationships between concepts. Semantic mapping uses a combination of custom scripting, our annotation tool 'Zooma', and expert curation. Disease-phenotype associations were generated using literature mining on Europe PubMed Central abstracts, which were manually verified by experts for validity. Representation of the disease-phenotype association was achieved by the Ontology of Biomedical AssociatioN (OBAN), a generic association representation model. OBAN represents associations between a subject and object i.e., disease and its associated phenotypes and the source of evidence for that association. The indirect disease-to-disease associations are exposed through shared phenotypes. This was applied to the use case of linking rare to common diseases at the CTTV. EFO yields an average of over 80% of mapping coverage in all data sources. A 42% precision is obtained from the manual verification of the text-mined disease-phenotype associations. This results in 1452 and 2810 disease-phenotype pairs for IBD and autoimmune disease and contributes towards 11,338 rare diseases associations (merged with existing published work [Am J Hum Genet 97:111-24, 2015]). An OBAN result file is downloadable at http://sourceforge.net/p/efo/code/HEAD/tree/trunk/src/efoassociations/. Twenty common diseases are linked to 85 rare diseases by shared phenotypes. A generalizable OBAN model for association representation is presented in this study. Here we present solutions to large-scale annotation-ontology mapping in the CTTV knowledge base, a process for disease-phenotype mining, and propose a generic association model, 'OBAN', as a means to integrate disease using shared phenotypes. EFO is released monthly and available for download at http://www.ebi.ac.uk/efo/.
Ligand.Info small-molecule Meta-Database.
von Grotthuss, Marcin; Koczyk, Grzegorz; Pas, Jakub; Wyrwicz, Lucjan S; Rychlewski, Leszek
2004-12-01
Ligand.Info is a compilation of various publicly available databases of small molecules. The total size of the Meta-Database is over 1 million entries. The compound records contain calculated three-dimensional coordinates and sometimes information about biological activity. Some molecules have information about FDA drug approval status or about anti-HIV activity. The Meta-Database can be downloaded from the http://Ligand.Info web page. The database can also be screened using a Java-based tool. The tool can interactively cluster sets of molecules on the user side and automatically download similar molecules from the server. The application requires the Java Runtime Environment 1.4 or higher, which can be automatically downloaded from Sun Microsystems or Apple Computer and installed during the first use of Ligand.Info on desktop systems that support Java (MS Windows, Mac OS, Solaris, and Linux). The Ligand.Info Meta-Database can be used for virtual high-throughput screening of new potential drugs. The presented examples showed that, using a known antiviral drug as a query, the system was able to find other antiviral drugs and inhibitors.
Performance Evaluation of Peer-to-Peer Progressive Download in Broadband Access Networks
NASA Astrophysics Data System (ADS)
Shibuya, Megumi; Ogishi, Tomohiko; Yamamoto, Shu
P2P (Peer-to-Peer) file sharing architectures have scalable and cost-effective features. Hence, the application of P2P architectures to media streaming is attractive and expected to be an alternative to the current video streaming using IP multicast or content delivery systems because the current systems require expensive network infrastructures and large scale centralized cache storage systems. In this paper, we investigate the P2P progressive download enabling Internet video streaming services. We demonstrated the capability of the P2P progressive download in both laboratory test network as well as in the Internet. Through the experiments, we clarified the contribution of the FTTH links to the P2P progressive download in the heterogeneous access networks consisting of FTTH and ADSL links. We analyzed the cause of some download performance degradation occurred in the experiment and discussed about the effective methods to provide the video streaming service using P2P progressive download in the current heterogeneous networks.
icoshift: A versatile tool for the rapid alignment of 1D NMR spectra
NASA Astrophysics Data System (ADS)
Savorani, F.; Tomasi, G.; Engelsen, S. B.
2010-02-01
The increasing scientific and industrial interest in metabonomics benefits from the high qualitative and quantitative information content of nuclear magnetic resonance (NMR) spectroscopy. However, several chemical and physical factors can affect the absolute and relative position of an NMR signal, and it is not always possible or desirable to eliminate these effects a priori. To remove misalignment of NMR signals a posteriori, several algorithms have been proposed in the literature. The icoshift program presented here is an open-source and highly efficient program designed for solving signal alignment problems in metabonomic NMR data analysis. The icoshift algorithm is based on correlation shifting of spectral intervals and employs an FFT engine that aligns all spectra simultaneously. The algorithm is demonstrated to be faster than similar methods found in the literature, making full-resolution alignment of large datasets feasible and thus avoiding down-sampling steps such as binning. The algorithm uses missing values as a filling alternative in order to avoid spectral artifacts at the segment boundaries. The algorithm is made open source and the MATLAB code, including documentation, can be downloaded from www.models.life.ku.dk.
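The core idea of correlation shifting can be illustrated with a small NumPy sketch that estimates the lag between a reference and a displaced segment via FFT-based cross-correlation and fills shifted-in points with missing values; this is a simplified stand-in, not the icoshift implementation itself.

    import numpy as np

    def align_by_xcorr(reference, segment):
        """Shift `segment` so that it best matches `reference`.

        The optimal lag is taken from the maximum of the circular cross-correlation,
        computed with FFTs; points shifted in from the edges are filled with NaN."""
        n = len(reference)
        xcorr = np.fft.ifft(np.fft.fft(reference) * np.conj(np.fft.fft(segment))).real
        lag = int(np.argmax(xcorr))
        if lag > n // 2:       # map circular lags to the range [-n/2, n/2)
            lag -= n
        shifted = np.full(n, np.nan)
        if lag >= 0:
            shifted[lag:] = segment[:n - lag]
        else:
            shifted[:n + lag] = segment[-lag:]
        return shifted, lag

    # Toy example: a Lorentzian-like peak displaced by 5 points relative to the reference
    x = np.arange(256)
    reference = 1.0 / (1.0 + ((x - 100) / 4.0) ** 2)
    segment = 1.0 / (1.0 + ((x - 95) / 4.0) ** 2)
    aligned, lag = align_by_xcorr(reference, segment)
    print("estimated lag:", lag)  # expected: 5 (segment is moved 5 points to the right)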
Sridhar, Vishnu B; Tian, Peifang; Dale, Anders M; Devor, Anna; Saisan, Payam A
2014-01-01
We present a database client software package, Neurovascular Network Explorer 1.0 (NNE 1.0), that uses a MATLAB(®)-based Graphical User Interface (GUI) for interaction with a database of 2-photon single-vessel diameter measurements from our previous publication (Tian et al., 2010). These data are of particular interest for modeling the hemodynamic response. NNE 1.0 is downloaded by the user and then runs either as a MATLAB script or as a standalone program on a Windows platform. The GUI allows browsing the database according to parameters specified by the user, simple manipulation and visualization of the retrieved records (such as averaging and peak-normalization), and export of the results. Further, we provide the NNE 1.0 source code. With this source code, the user can build a database of their own experimental results, given the appropriate data structure and naming conventions, and thus share their data in a user-friendly format with other investigators. NNE 1.0 provides an example of a seamless and low-cost solution for sharing experimental data by a regular-size neuroscience laboratory and may serve as a general template, facilitating dissemination of biological results and accelerating data-driven modeling approaches.
A collection of open source applications for mass spectrometry data mining.
Gallardo, Óscar; Ovelleiro, David; Gay, Marina; Carrascal, Montserrat; Abian, Joaquin
2014-10-01
We present several bioinformatics applications for the identification and quantification of phosphoproteome components by MS. These applications include a front-end graphical user interface that combines several Thermo RAW formats to MASCOT™ Generic Format extractors (EasierMgf), two graphical user interfaces for search engines OMSSA and SEQUEST (OmssaGui and SequestGui), and three applications, one for the management of databases in FASTA format (FastaTools), another for the integration of search results from up to three search engines (Integrator), and another one for the visualization of mass spectra and their corresponding database search results (JsonVisor). These applications were developed to solve some of the common problems found in proteomic and phosphoproteomic data analysis and were integrated in the workflow for data processing and feeding on our LymPHOS database. Applications were designed modularly and can be used standalone. These tools are written in Perl and Python programming languages and are supported on Windows platforms. They are all released under an Open Source Software license and can be freely downloaded from our software repository hosted at GoogleCode. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
IPeak: An open source tool to combine results from multiple MS/MS search engines.
Wen, Bo; Du, Chaoqin; Li, Guilin; Ghali, Fawaz; Jones, Andrew R; Käll, Lukas; Xu, Shaohang; Zhou, Ruo; Ren, Zhe; Feng, Qiang; Xu, Xun; Wang, Jun
2015-09-01
Liquid chromatography coupled tandem mass spectrometry (LC-MS/MS) is an important technique for detecting peptides in proteomics studies. Here, we present an open source software tool, termed IPeak, a peptide identification pipeline that is designed to combine the Percolator post-processing algorithm and multi-search strategy to enhance the sensitivity of peptide identifications without compromising accuracy. IPeak provides a graphical user interface (GUI) as well as a command-line interface, which is implemented in JAVA and can work on all three major operating system platforms: Windows, Linux/Unix and OS X. IPeak has been designed to work with the mzIdentML standard from the Proteomics Standards Initiative (PSI) as an input and output, and also been fully integrated into the associated mzidLibrary project, providing access to the overall pipeline, as well as modules for calling Percolator on individual search engine result files. The integration thus enables IPeak (and Percolator) to be used in conjunction with any software packages implementing the mzIdentML data standard. IPeak is freely available and can be downloaded under an Apache 2.0 license at https://code.google.com/p/mzidentml-lib/. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Marvel, Skylar W; To, Kimberly; Grimm, Fabian A; Wright, Fred A; Rusyn, Ivan; Reif, David M
2018-03-05
Drawing integrated conclusions from diverse source data requires synthesis across multiple types of information. The ToxPi (Toxicological Prioritization Index) is an analytical framework that was developed to enable integration of multiple sources of evidence by transforming data into integrated, visual profiles. Methodological improvements have advanced ToxPi and expanded its applicability, necessitating a new, consolidated software platform to provide functionality, while preserving flexibility for future updates. We detail the implementation of a new graphical user interface for ToxPi (Toxicological Prioritization Index) that provides interactive visualization, analysis, reporting, and portability. The interface is deployed as a stand-alone, platform-independent Java application, with a modular design to accommodate inclusion of future analytics. The new ToxPi interface introduces several features, from flexible data import formats (including legacy formats that permit backward compatibility) to similarity-based clustering to options for high-resolution graphical output. We present the new ToxPi interface for dynamic exploration, visualization, and sharing of integrated data models. The ToxPi interface is freely-available as a single compressed download that includes the main Java executable, all libraries, example data files, and a complete user manual from http://toxpi.org .
NASA Astrophysics Data System (ADS)
Changyong, Dou; Huadong, Guo; Chunming, Han; Ming, Liu
2014-03-01
With more and more Earth observation data available to the community, how to manage and share these valuable remote sensing datasets is becoming an urgent issue. Web-based Geographical Information System (GIS) technology provides a convenient way for users in different locations to share and make use of the same dataset. In order to efficiently use the airborne Synthetic Aperture Radar (SAR) remote sensing data acquired by the Airborne Remote Sensing Center of the Institute of Remote Sensing and Digital Earth (RADI), Chinese Academy of Sciences (CAS), a Web-GIS based platform for airborne SAR data management, distribution and sharing was designed and developed. The major features of the system include a map-based navigation and search interface, full-resolution imagery displayed as an overlay on the map, and a software stack built entirely from open-source software (OSS). The functions of the platform include browsing imagery on the map-based interface, ordering and downloading data online, and image dataset and user management. At present, the system is under testing at RADI and will enter regular operation soon.
Magnetic Fields for All: The GPIPS Community Web-Access Portal
NASA Astrophysics Data System (ADS)
Carveth, Carol; Clemens, D. P.; Pinnick, A.; Pavel, M.; Jameson, K.; Taylor, B.
2007-12-01
The new GPIPS website portal provides community users with an intuitive and powerful interface to query the data products of the Galactic Plane Infrared Polarization Survey. The website, which was built using PHP for the front end and MySQL for the database back end, allows users to issue queries based on galactic or equatorial coordinates, GPIPS-specific identifiers, polarization information, magnitude information, and several other attributes. The returns are presented in HTML tables, with the added option of either downloading or being emailed an ASCII file including the same or more information from the database. Other functionalities of the website include providing details of the status of the Survey (which fields have been observed or are planned to be observed), techniques involved in data collection and analysis, and descriptions of the database contents and names. For this initial launch of the website, users may access the GPIPS polarization point source catalog and the deep coadd photometric point source catalog. Future planned developments include a graphics-based method for querying the database, as well as tools to combine neighboring GPIPS images into larger image files for both polarimetry and photometry. This work is partially supported by NSF grant AST-0607500.
NASA Astrophysics Data System (ADS)
Knörchen, Achim; Ketzler, Gunnar; Schneider, Christoph
2015-01-01
Although Europe has been growing together for the past decades, cross-border information platforms on environmental issues are still scarce. With regard to the establishment of a web-mapping tool on airborne particulate matter (PM) concentration for the Euregio Meuse-Rhine located in the border region of Belgium, Germany and the Netherlands, this article describes the research on methodical and technical backgrounds implementing such a platform. An open-source solution was selected for presenting the data in a Web GIS (OpenLayers/GeoExt; both JavaScript-based), applying other free tools for data handling (Python), data management (PostgreSQL), geo-statistical modelling (Octave), geoprocessing (GRASS GIS/GDAL) and web mapping (MapServer). The multilingual, made-to-order online platform provides access to near-real time data on PM concentration as well as additional background information. In an open data section, commented configuration files for the Web GIS client are being made available for download. Furthermore, all geodata generated by the project is being published under public domain and can be retrieved in various formats or integrated into Desktop GIS as Web Map Services (WMS).
Sybil--efficient constraint-based modelling in R.
Gelius-Dietrich, Gabriel; Desouki, Abdelmoneim Amer; Fritzemeier, Claus Jonathan; Lercher, Martin J
2013-11-13
Constraint-based analyses of metabolic networks are widely used to simulate the properties of genome-scale metabolic networks. Publicly available implementations tend to be slow, impeding large scale analyses such as the genome-wide computation of pairwise gene knock-outs, or the automated search for model improvements. Furthermore, available implementations cannot easily be extended or adapted by users. Here, we present sybil, an open source software library for constraint-based analyses in R; R is a free, platform-independent environment for statistical computing and graphics that is widely used in bioinformatics. Among other functions, sybil currently provides efficient methods for flux-balance analysis (FBA), MOMA, and ROOM that are about ten times faster than previous implementations when calculating the effect of whole-genome single gene deletions in silico on a complete E. coli metabolic model. Due to the object-oriented architecture of sybil, users can easily build analysis pipelines in R or even implement their own constraint-based algorithms. Based on its highly efficient communication with different mathematical optimisation programs, sybil facilitates the exploration of high-dimensional optimisation problems on small time scales. Sybil and all its dependencies are open source. Sybil and its documentation are available for download from the comprehensive R archive network (CRAN).
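For readers new to constraint-based analysis, a flux-balance analysis (FBA) problem is simply a linear program; the toy sketch below, written in Python with SciPy rather than R/sybil, maximizes a biomass flux subject to steady-state mass balance and flux bounds.

    import numpy as np
    from scipy.optimize import linprog

    # Toy network: uptake --> A --(v1)--> B --(v2: biomass)-->, with uptake bounded to 10 units
    # Stoichiometric matrix S: rows are metabolites (A, B), columns are reactions (uptake, v1, v2)
    S = np.array([
        [1, -1,  0],   # A: produced by uptake, consumed by v1
        [0,  1, -1],   # B: produced by v1, consumed by v2 (biomass)
    ])
    bounds = [(0, 10), (0, 1000), (0, 1000)]

    # FBA: maximise the biomass flux v2 subject to S v = 0; linprog minimises, so negate the objective
    c = np.array([0, 0, -1])
    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
    print("optimal flux distribution:", res.x)   # expected: [10, 10, 10]
    print("maximal biomass flux:", -res.fun)     # expected: 10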
ONRASIA Scientific Information Bulletin. Volume 16, Number 4, October - December 1991
1991-12-01
Toxics Release Inventory Chemical Hazard Information Profiles (TRI-CHIP) Dataset
The Toxics Release Inventory (TRI) Chemical Hazard Information Profiles (TRI-CHIP) dataset contains hazard information about the chemicals reported in TRI. Users can use this XML-format dataset to create their own databases and hazard analyses of TRI chemicals. The hazard information is compiled from a series of authoritative sources including the Integrated Risk Information System (IRIS). The dataset is provided as a downloadable .zip file that when extracted provides XML files and schemas for the hazard information tables.
Reconfigurable Software for Mission Operations
NASA Technical Reports Server (NTRS)
Trimble, Jay
2014-01-01
We developed software that provides flexibility to mission organizations through modularity and composability. Modularity enables removal and addition of functionality through the installation of plug-ins. Composability enables users to assemble software from pre-built reusable objects, thus reducing or eliminating the walls associated with traditional application architectures and enabling unique combinations of functionality. We have used composable objects to reduce display build time, create workflows, and build scenarios to test concepts for lunar roving operations. The software is open source, and may be downloaded from https://github.com/nasa/mct.
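The plug-in and composition ideas can be illustrated with a minimal sketch; the class and component names below are invented for illustration and do not come from the mission software itself.

```python
# Minimal sketch of the plug-in idea: functionality is added or removed by
# registering components, and displays are composed from reusable objects.
# This is an illustration of the concept, not code from the actual project.
from typing import Callable, Dict, List

class PluginRegistry:
    def __init__(self) -> None:
        self._plugins: Dict[str, Callable[[dict], str]] = {}

    def install(self, name: str, render: Callable[[dict], str]) -> None:
        self._plugins[name] = render        # adding a plug-in adds capability

    def uninstall(self, name: str) -> None:
        self._plugins.pop(name, None)       # removing it takes the capability away

    def compose(self, layout: List[str], telemetry: dict) -> str:
        # A "display" is just a composition of whatever plug-ins the user chose.
        return "\n".join(self._plugins[name](telemetry)
                         for name in layout if name in self._plugins)

registry = PluginRegistry()
registry.install("battery", lambda t: f"Battery: {t['battery_v']:.1f} V")
registry.install("position", lambda t: f"Rover at {t['lat']}, {t['lon']}")

print(registry.compose(["position", "battery"],
                       {"battery_v": 28.4, "lat": 4.5, "lon": -137.4}))
```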
Presentations - Emond, A.M. and others, 2015 | Alaska Division of
ERIC Educational Resources Information Center
Bieron, Joseph F.; Dinan, Frank J.
2000-01-01
Presents a chemistry report on methamphetamine synthesis downloaded from the Internet and asks students to point out errors and answer questions about the text. Discusses the methods of methamphetamine synthesis and major issues in writing a case study. (YDS)
Development of the prototype data management system of the solar H-alpha full disk observation
NASA Astrophysics Data System (ADS)
Wei, Ka-Ning; Zhao, Shi-Qing; Li, Qiong-Ying; Chen, Dong
2004-06-01
The Solar Chromospheric Telescope at Yunnan Observatory generates about 2 GB of FITS-format data per day, a volume that is inconvenient for users to handle directly, so data searching and sharing are important. A prototype data management system for the solar H-alpha full-disk observations was developed to provide data searching, online browsing, remote access and download, and was improved using workflow technology. Based on the Windows XP operating system and the MySQL database management system, a prototype browser/server system was developed in Java and JSP. Real-time data compression, searching, browsing, authorized deletion and download have been achieved.
JHelioviewer: Open-Source Software for Discovery and Image Access in the Petabyte Age
NASA Astrophysics Data System (ADS)
Mueller, D.; Dimitoglou, G.; Garcia Ortiz, J.; Langenberg, M.; Nuhn, M.; Dau, A.; Pagel, S.; Schmidt, L.; Hughitt, V. K.; Ireland, J.; Fleck, B.
2011-12-01
The unprecedented torrent of data returned by the Solar Dynamics Observatory is both a blessing and a barrier: a blessing for making available data with significantly higher spatial and temporal resolution, but a barrier for scientists to access, browse and analyze them. With such staggering data volume, the data is accessible only from a few repositories and users have to deal with data sets effectively immobile and practically difficult to download. From a scientist's perspective this poses three challenges: accessing, browsing and finding interesting data while avoiding the proverbial search for a needle in a haystack. To address these challenges, we have developed JHelioviewer, an open-source visualization software that lets users browse large data volumes both as still images and movies. We did so by deploying an efficient image encoding, storage, and dissemination solution using the JPEG 2000 standard. This solution enables users to access remote images at different resolution levels as a single data stream. Users can view, manipulate, pan, zoom, and overlay JPEG 2000 compressed data quickly, without severe network bandwidth penalties. Besides viewing data, the browser provides third-party metadata and event catalog integration to quickly locate data of interest, as well as an interface to the Virtual Solar Observatory to download science-quality data. As part of the ESA/NASA Helioviewer Project, JHelioviewer offers intuitive ways to browse large amounts of heterogeneous data remotely and provides an extensible and customizable open-source platform for the scientific community. In addition, the easy-to-use graphical user interface enables the general public and educators to access, enjoy and reuse data from space missions without barriers.
Condor-COPASI: high-throughput computing for biochemical networks
2012-01-01
Background Mathematical modelling has become a standard technique to improve our understanding of complex biological systems. As models become larger and more complex, simulations and analyses require increasing amounts of computational power. Clusters of computers in a high-throughput computing environment can help to provide the resources required for computationally expensive model analysis. However, exploiting such a system can be difficult for users without the necessary expertise. Results We present Condor-COPASI, a server-based software tool that integrates COPASI, a biological pathway simulation tool, with Condor, a high-throughput computing environment. Condor-COPASI provides a web-based interface, which makes it extremely easy for a user to run a number of model simulation and analysis tasks in parallel. Tasks are transparently split into smaller parts, and submitted for execution on a Condor pool. Result output is presented to the user in a number of formats, including tables and interactive graphical displays. Conclusions Condor-COPASI can effectively use a Condor high-throughput computing environment to provide significant gains in performance for a number of model simulation and analysis tasks. Condor-COPASI is free, open source software, released under the Artistic License 2.0, and is suitable for use by any institution with access to a Condor pool. Source code is freely available for download at http://code.google.com/p/condor-copasi/, along with full instructions on deployment and usage. PMID:22834945
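A hedged sketch of the task-splitting idea follows: a large parameter scan is cut into independent chunks, each of which could become one job in a Condor/HTCondor pool. The wrapper script name and submit-description text are illustrative only, not Condor-COPASI's actual job descriptions.

```python
# Sketch of the task-splitting idea: a large parameter scan is cut into
# independent chunks, each of which could become one job in a Condor pool.
import math

def split_scan(values, n_chunks):
    """Split a list of parameter values into roughly equal chunks."""
    size = math.ceil(len(values) / n_chunks)
    return [values[i:i + size] for i in range(0, len(values), size)]

scan_values = [round(0.1 * k, 1) for k in range(1, 101)]   # 100 parameter values
chunks = split_scan(scan_values, n_chunks=10)

for job_id, chunk in enumerate(chunks):
    submit_text = (
        "executable = run_copasi_chunk.sh\n"        # hypothetical wrapper script
        f"arguments  = chunk_{job_id}.txt\n"
        "queue\n"
    )
    # in a real workflow submit_text would be written out and condor_submit called
    print(f"# job {job_id}: {len(chunk)} parameter values "
          f"({chunk[0]}..{chunk[-1]})")
```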
MyOcean Central Information System - Achievements and Perspectives
NASA Astrophysics Data System (ADS)
Claverie, Vincent; Loubrieu, Thomas; Jolibois, Tony; de Dianous, Rémi; Blower, Jon; Romero, Laia; Griffiths, Guy
2013-04-01
Since 2009, MyOcean (http://www.myocean.eu) has provided an operational service for forecasts, analysis and expertise on ocean currents, temperature, salinity, sea level, primary ecosystems and ice coverage. The production of observation and forecasting data is done by 42 Production Units (PU). Product download and visualisation are hosted by 25 Dissemination Units (DU). All these products and associated services are gathered in a single catalogue that hides the intricate distributed organization of PUs and DUs. Beyond applying the INSPIRE directive and OGC recommendations, MyOcean has faced technical choices and challenges of its own. This presentation focuses on three issues met by MyOcean that are relevant for many Spatial Data Infrastructures: user transaction accounting, large-volume download, and streamlining catalogue maintenance. Transaction accounting: set up powerful means of gaining detailed knowledge of system usage in order to improve the products (ocean observations, analysis and forecast datasets) and services (view, download) on offer. This requirement drives the following ones: central authentication management for the distributed web-service implementations (an add-on to the THREDDS Data Server for the WMS and NetCDF sub-setting services, plus a specific FTP service), and user management shared with co-funding projects, which, in addition to MyOcean, also need consolidated information about the use of the co-funded products. A central facility provides users' rights to the geographically distributed services and gathers transaction accounting history from them. Large-volume download: propose a user-friendly web interface for downloading large volumes of data (several gigabytes) that is as robust as basic FTP but intuitive and independent of file and directory layout. This should rely on a web service following the draft INSPIRE specification and OGC recommendations for download, given that an FTP server is not friendly enough (users need to know filenames and directories) and a web page does not allow downloading several files at once. Streamlining maintenance of the central catalogue: the major update for MyOcean v3 (April 2013) is the use of GeoNetwork for catalogue management. This improves the system at several levels: the editing interface is more user-friendly, and catalogue updates are managed in a workflow, which allows greater flexibility for minor updates without giving up the high-level qualification requirements for the catalogue content. The distributed web services (download, view) are automatically harvested from the THREDDS Data Server, so manual editing of the catalogue is reduced, the associated typos are avoided, and the quality of the information is improved.
Smart Grid Educational Series | Energy Systems Integration Facility | NREL
LED-based near infrared sensor for cancer diagnostics
NASA Astrophysics Data System (ADS)
Bogomolov, Andrey; Ageev, Vladimir; Zabarylo, Urszula; Usenov, Iskander; Schulte, Franziska; Kirsanov, Dmitry; Belikova, Valeria; Minet, Olaf; Feliksberger, E.; Meshkovsky, I.; Artyushenko, Viacheslav
2016-03-01
Optical spectroscopic technologies are increasingly used for cancer diagnostics. The feasibility of differentiating between malignant and healthy samples of human kidney using fluorescence, Raman, MIR and NIR spectroscopy has recently been reported. In the present work, a simplification of the NIR spectroscopy method has been studied. Traditional high-resolution NIR spectrometry was replaced by an optical sensor based on a set of light-emitting diodes at selected wavelengths as light sources and a photodiode. Two prototypes of the sensor were developed and tested on 14 in-vitro samples from seven kidney tumor patients. Statistical evaluation of the results using principal component analysis and partial least-squares discriminant analysis was performed. Although the new technique achieved only partial discrimination between tumor and healthy tissue, the results demonstrate the benefits of LED-based near-infrared sensing for oncological diagnostics. Publisher's Note: This paper, originally published on 4 March 2016, was replaced with a corrected/revised version on 7 April 2016. If you downloaded the original PDF but are unable to access the revision, please contact SPIE Digital Library Customer Service for assistance.
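As a rough sketch of the statistical evaluation described (PCA followed by PLS-DA), the following applies scikit-learn to synthetic "LED-channel" intensities; the data, class sizes and channel count are invented, and PLS-DA is approximated here as PLS regression against a 0/1 class label.

```python
# Sketch of PCA followed by PLS-DA on synthetic LED-channel intensities,
# not the study's actual measurements.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_per_class, n_leds = 7, 8                         # invented sample and channel counts
healthy = rng.normal(loc=1.0, scale=0.1, size=(n_per_class, n_leds))
tumor = rng.normal(loc=1.1, scale=0.1, size=(n_per_class, n_leds))
X = np.vstack([healthy, tumor])
y = np.array([0] * n_per_class + [1] * n_per_class)

scores = PCA(n_components=2).fit_transform(X)      # unsupervised overview
print("PC1/PC2 scores of first sample:", scores[0])

plsda = PLSRegression(n_components=2).fit(X, y)    # PLS-DA via 0/1 regression
predicted = (plsda.predict(X).ravel() > 0.5).astype(int)
print("training-set accuracy:", (predicted == y).mean())
```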
Detailed analysis of complex single molecule FRET data with the software MASH
NASA Astrophysics Data System (ADS)
Hadzic, Mélodie C. A. S.; Kowerko, Danny; Börner, Richard; Zelger-Paulus, Susann; Sigel, Roland K. O.
2016-04-01
The processing and analysis of surface-immobilized single molecule FRET (Förster resonance energy transfer) data follows systematic steps (e.g. single molecule localization, removal of different sources of noise, selection of the conformational and kinetic model, etc.) that require solid knowledge of optics, photophysics, signal processing and statistics. The present proceeding aims to standardize and facilitate procedures for single molecule detection by guiding the reader through an optimization protocol for a particular experimental data set. Relevant features were determined from single molecule movies (SMM) of synthetically reconstituted Cy3- and Cy5-labeled Sc.ai5γ group II intron molecules to test the performance of four different detection algorithms. Up to 120 different parameterizations per method were routinely evaluated to establish an optimal detection procedure. The protocol is adaptable to any movie displaying surface-immobilized molecules, and can be easily reproduced with our home-written software MASH (multifunctional analysis software for heterogeneous data) and script routines (both available in the download section of www.chem.uzh.ch/rna).
Academic Software Downloads from Google Code: Useful Usage Indicators?
ERIC Educational Resources Information Center
Thelwall, Mike; Kousha, Kayvan
2016-01-01
Introduction: Computer scientists and other researchers often make their programs freely available online. If this software makes a valuable contribution inside or outside of academia then its creators may want to demonstrate this with a suitable indicator, such as download counts. Methods: Download counts, citation counts, labels and licenses…
Wing download reduction using vortex trapping plates
NASA Technical Reports Server (NTRS)
Light, Jeffrey S.; Stremel, Paul M.; Bilanin, Alan J.
1994-01-01
A download reduction technique using spanwise plates on the upper and lower wing surfaces has been examined. Experimental and analytical techniques were used to determine the download reduction obtained using this technique. Simple two-dimensional wind tunnel testing confirmed the validity of the technique for reducing two-dimensional airfoil drag. Computations using a two-dimensional Navier-Stokes analysis provided insight into the mechanism causing the drag reduction. Finally, the download reduction technique was tested using a rotor and wing to determine the benefits for a semispan configuration representative of a tilt rotor aircraft.
MannDB: A microbial annotation database for protein characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, C; Lam, M; Smith, J
2006-05-19
MannDB was created to meet a need for rapid, comprehensive automated protein sequence analyses to support selection of proteins suitable as targets for driving the development of reagents for pathogen or protein toxin detection. Because a large number of open-source tools were needed, it was necessary to produce a software system to scale the computations for whole-proteome analysis. Thus, we built a fully automated system for executing software tools and for storage, integration, and display of automated protein sequence analysis and annotation data. MannDB is a relational database that organizes data resulting from fully automated, high-throughput protein-sequence analyses using open-source tools. Types of analyses provided include predictions of cleavage, chemical properties, classification, features, functional assignment, post-translational modifications, motifs, antigenicity, and secondary structure. Proteomes (lists of hypothetical and known proteins) are downloaded and parsed from Genbank and then inserted into MannDB, and annotations from SwissProt are downloaded when identifiers are found in the Genbank entry or when identical sequences are identified. Currently 36 open-source tools are run against MannDB protein sequences either on local systems or by means of batch submission to external servers. In addition, BLAST against protein entries in MvirDB, our database of microbial virulence factors, is performed. A web client browser enables viewing of computational results and downloaded annotations, and a query tool enables structured and free-text search capabilities. When available, links to external databases, including MvirDB, are provided. MannDB contains whole-proteome analyses for at least one representative organism from each category of biological threat organism listed by APHIS, CDC, HHS, NIAID, USDA, USFDA, and WHO. MannDB comprises a large number of genomes and comprehensive protein sequence analyses representing organisms listed as high-priority agents on the websites of several governmental organizations concerned with bio-terrorism. MannDB provides the user with a BLAST interface for comparison of native and non-native sequences and a query tool for conveniently selecting proteins of interest. In addition, the user has access to a web-based browser that compiles comprehensive and extensive reports.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blackwell, David D.; Chickering Pace, Cathy; Richards, Maria C.
The National Geothermal Data System (NGDS) is a Department of Energy funded effort to create a single cataloged source for a variety of geothermal information through a distributed network of databases made available via web services. The NGDS will help identify regions suitable for potential development and further scientific data collection and analysis of geothermal resources as a source for clean, renewable energy. A key NGDS repository or ‘node’ is located at Southern Methodist University, developed by a consortium made up of: • SMU Geothermal Laboratory • Siemens Corporate Technology, a division of Siemens Corporation • Bureau of Economic Geology at the University of Texas at Austin • Cornell Energy Institute, Cornell University • Geothermal Resources Council • MLKay Technologies • Texas Tech University • University of North Dakota. The focus of resources and research encompasses the United States, with particular emphasis on the Gulf Coast (on and off shore), the Great Plains, and the Eastern U.S. The data collection includes the thermal, geological and geophysical characteristics of these area resources. Types of data include, but are not limited to, temperature, heat flow, thermal conductivity, radiogenic heat production, porosity, permeability, geological structure, core geophysical logs, well tests, estimated reservoir volume, in situ stress, oil and gas well fluid chemistry, oil and gas well information, and conventional and enhanced geothermal system related resources. Libraries of publications and reports are combined into a unified, accessible catalog with links for downloading non-copyrighted items. Field notes, individual temperature logs, site maps and related resources are included to increase data collection knowledge. Additional research based on legacy data to improve quality increases our understanding of the local and regional geology and geothermal characteristics. The software to enable the integration, analysis, and dissemination of this team’s NGDS contributions was developed by Siemens Corporate Technology. The SMU Node interactive application is accessible at http://geothermal.smu.edu. Additionally, files may be downloaded from either http://geothermal.smu.edu:9000/geoserver/web/ or through http://geothermal.smu.edu/static/DownloadFilesButtonPage.htm. The Geothermal Resources Council Library is available at https://www.geothermal-library.org/.
GAN: a platform of genomics and genetics analysis and application in Nicotiana
Yang, Shuai; Zhang, Xingwei; Li, Huayang; Chen, Yudong
2018-01-01
Abstract Nicotiana is an important Solanaceae genus, and plays a significant role in modern biological research. Massive Nicotiana biological data have emerged from in-depth genomics and genetics studies. From big data to big discovery, large-scale analysis and application with new platforms is critical. Based on data accumulation, a comprehensive platform of Genomics and Genetics Analysis and Application in Nicotiana (GAN) has been developed, and is publicly available at http://biodb.sdau.edu.cn/gan/. GAN consists of four main sections: (i) Sources, a total of 5267 germplasm lines, along with detailed descriptions of associated characteristics, are all available on the Germplasm page, which can be queried using eight different inquiry modes. Seven fully sequenced species with accompanying sequences and detailed genomic annotation are available on the Genomics page. (ii) Genetics, detailed descriptions of 10 genetic linkage maps, constructed by different parents, 2239 KEGG metabolic pathway maps and 209 945 gene families across all catalogued genes, along with two co-linearity maps combining N. tabacum with available tomato and potato linkage maps are available here. Furthermore, 3 963 119 genome-SSRs, 10 621 016 SNPs, 12 388 PIPs and 102 895 reverse transcription-polymerase chain reaction primers, are all available to be used and searched on the Markers page. (iii) Tools, the genome browser JBrowse and five useful online bioinformatics softwares, Blast, Primer3, SSR-detect, Nucl-Protein and E-PCR, are provided on the JBrowse and Tools pages. (iv) Auxiliary, all the datasets are shown on a Statistics page, and are available for download on a Download page. In addition, the user’s manual is provided on a Manual page in English and Chinese languages. GAN provides a user-friendly Web interface for searching, browsing and downloading the genomics and genetics datasets in Nicotiana. As far as we can ascertain, GAN is the most comprehensive source of bio-data available, and the most applicable resource for breeding, gene mapping, gene cloning, the study of the origin and evolution of polyploidy, and related studies in Nicotiana. Database URL: http://biodb.sdau.edu.cn/gan/ PMID:29688356
EPA Facility Registry Service (FRS): Facility Interests Dataset - Intranet Download
This downloadable data package consists of location and facility identification information from EPA's Facility Registry Service (FRS) for all sites that are available in the FRS individual feature layers. The layers comprise the FRS major program databases, including:Assessment Cleanup and Redevelopment Exchange System (ACRES) : brownfields sites ; Air Facility System (AFS) : stationary sources of air pollution ; Air Quality System (AQS) : ambient air pollution data from monitoring stations; Bureau of Indian Affairs (BIA) : schools data on Indian land; Base Realignment and Closure (BRAC) facilities; Clean Air Markets Division Business System (CAMDBS) : market-based air pollution control programs; Comprehensive Environmental Response, Compensation, and Liability Information System (CERCLIS) : hazardous waste sites; Integrated Compliance Information System (ICIS) : integrated enforcement and compliance information; National Compliance Database (NCDB) : Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Toxic Substances Control Act (TSCA); National Pollutant Discharge Elimination System (NPDES) module of ICIS : NPDES surface water permits; Radiation Information Database (RADINFO) : radiation and radioactivity facilities; RACT/BACT/LAER Clearinghouse (RBLC) : best available air pollution technology requirements; Resource Conservation and Recovery Act Information System (RCRAInfo) : tracks generators, transporters, treaters, storers, and disposers
EPA Facility Registry Service (FRS): Facility Interests Dataset Download
This downloadable data package consists of location and facility identification information from EPA's Facility Registry Service (FRS) for all sites that are available in the FRS individual feature layers. The layers comprise the FRS major program databases, including:Assessment Cleanup and Redevelopment Exchange System (ACRES) : brownfields sites ; Air Facility System (AFS) : stationary sources of air pollution ; Air Quality System (AQS) : ambient air pollution data from monitoring stations; Bureau of Indian Affairs (BIA) : schools data on Indian land; Base Realignment and Closure (BRAC) facilities; Clean Air Markets Division Business System (CAMDBS) : market-based air pollution control programs; Comprehensive Environmental Response, Compensation, and Liability Information System (CERCLIS) : hazardous waste sites; Integrated Compliance Information System (ICIS) : integrated enforcement and compliance information; National Compliance Database (NCDB) : Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Toxic Substances Control Act (TSCA); National Pollutant Discharge Elimination System (NPDES) module of ICIS : NPDES surface water permits; Radiation Information Database (RADINFO) : radiation and radioactivity facilities; RACT/BACT/LAER Clearinghouse (RBLC) : best available air pollution technology requirements; Resource Conservation and Recovery Act Information System (RCRAInfo) : tracks generators, transporters, treaters, storers, and disposers
Drive Cycle Data | Transportation Secure Data Center | NREL
Drive cycle data download files are available for individual surveys and studies, including the Greater Fairbanks, Alaska, Transportation Survey drive cycle data by vehicle (24-hour period of operation).
On the Web, a Textbook Proliferation of Piracy
ERIC Educational Resources Information Center
Young, Jeffrey R.
2008-01-01
Book publishers are stepping up efforts to stop college students from downloading illegal copies of textbooks online. One Web site, Textbook Torrents, promises more than 5,000 textbooks for download in PDF format, complete with the original books' layouts and full-color illustrations. Users must simply set up a free account and download a free…
Graphing Online Searches with Lotus 1-2-3.
ERIC Educational Resources Information Center
Persson, Olle
1986-01-01
This article illustrates how Lotus 1-2-3 software can be used to create graphs using downloaded online searches as raw material, notes most commands applied, and outlines three required steps: downloading, importing the downloaded file into the worksheet, and making graphs. An example in bibliometrics and sample graphs are included. (EJS)
Alternative Fuels Data Center: Schwan's Home Service Delivers With
For information about this project, contact Twin Cities Clean Cities Coalition.
Development and Implementation of Dynamic Scripts to Execute Cycled WRF/GSI Forecasts
NASA Technical Reports Server (NTRS)
Zavodsky, Bradley; Srikishen, Jayanthi; Berndt, Emily; Li, Quanli; Watson, Leela
2014-01-01
Automating the coupling of data assimilation (DA) and modeling systems is a unique challenge in the numerical weather prediction (NWP) research community. In recent years, the Development Testbed Center (DTC) has released well-documented tools such as the Weather Research and Forecasting (WRF) model and the Gridpoint Statistical Interpolation (GSI) DA system that can be easily downloaded, installed, and run by researchers on their local systems. However, developing a coupled system in which the various preprocessing, DA, model, and postprocessing capabilities are all integrated can be labor-intensive if one has little experience with any of these individual systems. Additionally, operational modeling entities generally have specific coupling methodologies that can take time to understand and develop code to implement properly. To better enable collaborating researchers to perform modeling and DA experiments with GSI, the Short-term Prediction Research and Transition (SPoRT) Center has developed a set of Perl scripts that couple GSI and WRF in a cycling methodology consistent with the use of real-time, regional observation data from the National Centers for Environmental Prediction (NCEP)/Environmental Modeling Center (EMC). Because Perl is open source, the code can be easily downloaded and executed regardless of the user's native shell environment. This paper will provide a description of this open-source code and descriptions of a number of the use cases that have been performed by SPoRT collaborators using the scripts on different computing systems.
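The cycling logic can be sketched as follows; the SPoRT scripts themselves are written in Perl, and the executable names and arguments below are placeholders that are printed rather than run.

```python
# Sketch of the cycling logic the scripts implement: at each analysis time,
# run GSI on the previous forecast, then launch a WRF forecast from that
# analysis. The command strings are placeholders, printed instead of executed.
from datetime import datetime, timedelta

def cycle(start: datetime, end: datetime, interval_h: int = 6) -> None:
    t = start
    while t <= end:
        stamp = t.strftime("%Y%m%d%H")
        # 1. assimilate observations valid at t into the background forecast
        print(f"[{stamp}] run: gsi.exe -background wrfout_prev_{stamp} "
              f"-obs prepbufr_{stamp}")
        # 2. run the forecast out to the next analysis time
        print(f"[{stamp}] run: wrf.exe -start {stamp} -hours {interval_h}")
        t += timedelta(hours=interval_h)

cycle(datetime(2014, 6, 1, 0), datetime(2014, 6, 2, 0))
```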
Using Open and Interoperable Ways to Publish and Access LANCE AIRS Near-Real Time Data
NASA Astrophysics Data System (ADS)
Zhao, P.; Lynnes, C.; Vollmer, B.; Savtchenko, A. K.; Yang, W.
2011-12-01
Atmospheric Infrared Sounder (AIRS) Near-Real Time (NRT) data from the Land Atmosphere Near real time Capability for EOS (LANCE) provide information on the global and regional atmospheric state with very low latency. An open and interoperable platform is useful to facilitate access to and integration of LANCE AIRS NRT data. This paper discusses the use of open-source software components to build Web services for publishing and accessing AIRS NRT data in the context of Service Oriented Architecture (SOA). The AIRS NRT data have also been made available through an OPeNDAP server. OPeNDAP allows several open-source netCDF-based tools, such as the Integrated Data Viewer, Ferret and Panoply, to directly display the Level 2 data over the network. To enable users to locate swath data files in the OPeNDAP server that lie within a certain geographical area, graphical "granule maps" are being added to show the outline of each file on a map of the Earth. The metadata of the AIRS NRT data and services are then used to implement information advertisement and discovery in catalogue systems. Datacasting, an RSS-based technology for accessing Earth science data that facilitates subscribing to AIRS NRT data availability and filtering, downloading and viewing the data, is also discussed. To provide an easy entry point to AIRS NRT data and services, a Web portal designed for customized data downloading and visualization is introduced.
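A minimal sketch of the OPeNDAP access pattern follows, assuming the netCDF4 Python library and network access; the URL and variable name are placeholders, not actual LANCE AIRS NRT endpoints.

```python
# Sketch of OPeNDAP access: a granule is opened over the network and only the
# requested slice is transferred. The URL and variable name are placeholders.
from netCDF4 import Dataset

url = "https://opendap.example.gov/opendap/AIRS_NRT/AIRS.L2.example.hdf"  # placeholder
try:
    ds = Dataset(url)                  # nothing is downloaded up front
    print(list(ds.variables))          # inspect available fields
    # temp = ds.variables["TAirSup"][0:10, 0:10]   # hypothetical variable name
    ds.close()
except OSError as err:
    print("could not reach the (placeholder) OPeNDAP server:", err)
```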
N/NC team claims three-peat softball championship title > U.S. Northern
The NORAD-USNORTHCOM silver softball team presented their base championship trophy to Lt. Gen. Michael Dubie, deputy commander, United States Northern Command (Feb. 11, 2015).
News from ESO Archive Services: Next Generation Request Handler and Data Access Delegation
NASA Astrophysics Data System (ADS)
Fourniol, N.; Lockhart, J.; Suchar, D.; Tacconi-Garman, L. E.; Moins, C.; Bierwirth, T.; Eglitis, P.; Vuong, M.; Micol, A.; Delmotte, N.; Vera, I.; Dobrzycki, A.; Forchì, V.; Lange, U.; Sogni, F.
2012-09-01
We present the new ESO Archive services which improve the electronic data access via the Download Manager and also provide PIs with the option to delegate data access to their collaborators via the Data Access Control.
Practices in Code Discoverability: Astrophysics Source Code Library
NASA Astrophysics Data System (ADS)
Allen, A.; Teuben, P.; Nemiroff, R. J.; Shamir, L.
2012-09-01
Here we describe the Astrophysics Source Code Library (ASCL), which takes an active approach to sharing astrophysics source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL now contains over 340 codes and continues to grow; in 2011 it added an average of 19 codes per month. An advisory committee has been established to provide input and guide the development and expansion of the new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This paper provides the history and description of the ASCL. It lists the requirements for including codes, examines the advantages of the ASCL, and outlines some of its future plans.
Radiology teaching file cases on the World Wide Web.
Scalzetti, E M
1997-08-01
The presentation of a radiographic teaching file on the World Wide Web can be enhanced by attending to principles of web design. Chief among these are appropriate control of page layout, minimization of the time required to download a page from the remote server, and provision for navigation within and among the web pages that constitute the site. Page layout is easily accomplished by the use of tables; column widths can be fixed to maintain an acceptable line length for text. Downloading time is minimized by rigorous editing and by optimal compression of image files; beyond this, techniques like preloading of images and specification of image width and height are also helpful. Navigation controls should be clear, consistent, and readily available.
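One way to act on this advice programmatically is sketched below with Pillow: the image is resized, saved as a quality-limited JPEG, and its dimensions printed for the img tag. The image, target size and quality setting are arbitrary examples, not recommendations from the article.

```python
# Resize an image, save it as a quality-limited JPEG, and record its width
# and height for the <img> tag. Uses Pillow on a synthetic stand-in image.
from PIL import Image

img = Image.new("RGB", (2048, 2048), color=(200, 200, 200))  # stand-in radiograph
img.thumbnail((800, 800))                       # cap the display resolution
img.save("teaching_case.jpg", "JPEG", quality=70, optimize=True)

width, height = img.size
print(f'<img src="teaching_case.jpg" width="{width}" height="{height}">')
```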
Modes of Access: The Influence of Dissemination Channels on the Use of Open Access Monographs
ERIC Educational Resources Information Center
Snijder, Ronald
2014-01-01
Introduction: This paper studies the effects of several dissemination channels in an open access environment by analysing the download data of the OAPEN Library. Method: Download data were obtained containing the number of downloads and the name of the Internet provider. Based on public information, each Internet provider was categorised. The…
Alternative Fuels Data Center: Maine's Only Biodiesel Manufacturer Powers
For information about this project, contact Maine Clean Communities.
Alternative Fuels Data Center: Texas Taxis Go Hybrid
For information about this project, contact Alamo Area Clean Cities (San Antonio).
Farris, Dominic James; Lichtwark, Glen A
2016-05-01
Dynamic measurements of human muscle fascicle length from sequences of B-mode ultrasound images have become increasingly prevalent in biomedical research. Manual digitisation of these images is time consuming and algorithms for automating the process have been developed. Here we present a freely available software implementation of a previously validated algorithm for semi-automated tracking of muscle fascicle length in dynamic ultrasound image recordings, "UltraTrack". UltraTrack implements an affine extension to an optic flow algorithm to track movement of the muscle fascicle end-points throughout dynamically recorded sequences of images. The underlying algorithm has been previously described and its reliability tested, but here we present the software implementation with features for: tracking multiple fascicles in multiple muscles simultaneously; correcting temporal drift in measurements; manually adjusting tracking results; saving and re-loading of tracking results and loading a range of file formats. Two example runs of the software are presented detailing the tracking of fascicles from several lower limb muscles during a squatting and walking activity. We have presented a software implementation of a validated fascicle-tracking algorithm and made the source code and standalone versions freely available for download. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
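UltraTrack itself is a MATLAB tool built on an affine extension of optic flow; purely as a conceptual illustration, the sketch below tracks two hypothetical end-points between synthetic frames with plain pyramidal Lucas-Kanade flow in OpenCV.

```python
# Conceptual sketch: track two fascicle end-points between consecutive frames
# with pyramidal Lucas-Kanade optic flow (OpenCV) on synthetic images.
import numpy as np
import cv2

rng = np.random.default_rng(1)
frame0 = (rng.random((256, 256)) * 255).astype(np.uint8)
frame1 = np.roll(frame0, shift=(2, 3), axis=(0, 1))   # image shifted by (3, 2) px in x, y

# two hypothetical fascicle end-points (x, y) in the first frame
points0 = np.array([[[60.0, 100.0]], [[180.0, 120.0]]], dtype=np.float32)

points1, status, _ = cv2.calcOpticalFlowPyrLK(
    frame0, frame1, points0, None,
    winSize=(21, 21), maxLevel=2,
)
for p0, p1, ok in zip(points0[:, 0], points1[:, 0], status.ravel()):
    print("tracked" if ok else "lost", p0, "->", p1)

length = np.linalg.norm(points1[0, 0] - points1[1, 0])
print("fascicle length estimate (px):", length)
```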
A Web-based open-source database for the distribution of hyperspectral signatures
NASA Astrophysics Data System (ADS)
Ferwerda, J. G.; Jones, S. D.; Du, Pei-Jun
2006-10-01
With the coming of age of field spectroscopy as a non-destructive means to collect information on the physiology of vegetation, there is a need to store signatures and, more importantly, their metadata. Without proper organisation of the metadata, the signatures themselves are of limited use. To facilitate re-distribution of data, a database for the storage and distribution of hyperspectral signatures and their metadata was designed. The database was built using open-source software and can be used by the hyperspectral community to share data. Data are uploaded through a simple web-based interface. The database recognizes the major file formats from ASD, GER and International Spectronics instruments. The database source code is available for download through the hyperspectral.info web domain, and we invite suggestions for additions and modifications to the database, to be submitted through the online forums on the same website.
EPA'S REPORT ON THE ENVIRONMENT (2003 Draft)
The RoE presents information on environmental indicators in the areas of air, water, land, human health, and ecological condition. The report is available for download and the RoE information is searchable via an on-line database site: www.epa.gov/roe.
visPIG--a web tool for producing multi-region, multi-track, multi-scale plots of genetic data.
Scales, Matthew; Jäger, Roland; Migliorini, Gabriele; Houlston, Richard S; Henrion, Marc Y R
2014-01-01
We present VISual Plotting Interface for Genetics (visPIG; http://vispig.icr.ac.uk), a web application to produce multi-track, multi-scale, multi-region plots of genetic data. visPIG has been designed to allow users not well versed with mathematical software packages and/or programming languages such as R, Matlab®, Python, etc., to integrate data from multiple sources for interpretation and to easily create publication-ready figures. While web tools such as the UCSC Genome Browser or the WashU Epigenome Browser allow custom data uploads, such tools are primarily designed for data exploration. This is also true for the desktop-run Integrative Genomics Viewer (IGV). Other locally run data visualisation software such as Circos requires significant computer skills of the user. The visPIG web application is a menu-based interface that allows users to upload custom data tracks and set track-specific parameters. Figures can be downloaded as PDF or PNG files. For sensitive data, the underlying R code can also be downloaded and run locally. visPIG is multi-track: it can display many different data types (e.g. association, functional annotation, intensity, interaction, heat map data, etc.). It also allows annotation of genes and other custom features in the plotted region(s). Data tracks can be plotted individually or on a single figure. visPIG is multi-region: it supports plotting multiple regions, be they kilo- or megabases apart or even on different chromosomes. Finally, visPIG is multi-scale: a sub-region of particular interest can be 'zoomed' in on. We describe the various features of visPIG and illustrate its utility with examples. visPIG is freely available through http://vispig.icr.ac.uk under a GNU General Public License (GPLv3).
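visPIG is an R-based web tool; as a rough illustration of the multi-track layout idea only, the following matplotlib sketch stacks an association track, a gene track and an intensity track over a shared genomic axis using made-up data.

```python
# Conceptual multi-track figure: association values, a gene annotation track
# and a heat-map-like intensity track over the same genomic axis (made-up data).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
pos = np.linspace(10_000_000, 10_500_000, 200)     # genomic positions (bp)
assoc = rng.exponential(1.0, size=pos.size)         # e.g. -log10 p-values
intensity = rng.random((5, pos.size))               # e.g. chromatin marks

fig, (ax1, ax2, ax3) = plt.subplots(3, 1, sharex=True, figsize=(8, 6))
ax1.scatter(pos, assoc, s=8)
ax1.set_ylabel("association")
ax2.broken_barh([(10_050_000, 80_000), (10_300_000, 120_000)], (0, 1))
ax2.set_ylabel("genes")
ax2.set_yticks([])
ax3.imshow(intensity, aspect="auto",
           extent=(pos[0], pos[-1], 0, intensity.shape[0]))
ax3.set_ylabel("marks")
ax3.set_xlabel("position (bp)")
fig.savefig("multitrack_example.png", dpi=150)
```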
NASA Astrophysics Data System (ADS)
Roach, Colin; Carlsson, Johan; Cary, John R.; Alexander, David A.
2002-11-01
The National Transport Code Collaboration (NTCC) has developed an array of software, including a data client/server. The data server, which is written in C++, serves local data (in the ITER Profile Database format) as well as remote data (by accessing one or several MDS+ servers). The client, a web-invocable Java applet, provides a uniform, intuitive, user-friendly, graphical interface to the data server. The uniformity of the interface relieves the user from the trouble of mastering the differences between different data formats and lets him/her focus on the essentials: plotting and viewing the data. The user runs the client by visiting a web page using any Java capable Web browser. The client is automatically downloaded and run by the browser. A reference to the data server is then retrieved via the standard Web protocol (HTTP). The communication between the client and the server is then handled by the mature, industry-standard CORBA middleware. CORBA has bindings for all common languages and many high-quality implementations are available (both Open Source and commercial). The NTCC data server has been installed at the ITPA International Multi-tokamak Confinement Profile Database, which is hosted by the UKAEA at Culham Science Centre. The installation of the data server is protected by an Internet firewall. To make it accessible to clients outside the firewall some modifications of the server were required. The working version of the ITPA confinement profile database is not open to the public. Authentication of legitimate users is done utilizing built-in Java security features to demand a password to download the client. We present an overview of the NTCC data client/server and some details of how the CORBA firewall-traversal issues were resolved and how the user authentication is implemented.
NASA Astrophysics Data System (ADS)
Hancher, M.; Lieber, A.; Scott, L.
2017-12-01
The volume of satellite and other Earth data is growing rapidly. Combined with information about where people are, these data can inform decisions in a range of areas including food and water security, disease and disaster risk management, biodiversity, and climate adaptation. Google's platform for planetary-scale geospatial data analysis, Earth Engine, grants access to petabytes of continually updating Earth data, programming interfaces for analyzing the data without the need to download and manage it, and mechanisms for sharing the analyses and publishing results for data-driven decision making. In addition to data about the planet, data about the human planet - population, settlement and urban models - are now available for global scale analysis. The Earth Engine APIs enable these data to be joined, combined or visualized with economic or environmental indicators such as nighttime lights trends, global surface water, or climate projections, in the browser without the need to download anything. We will present our newly developed application intended to serve as a resource for government agencies, disaster response and public health programs, or other consumers of these data to quickly visualize the different population models, and compare them to ground truth tabular data to determine which model suits their immediate needs. Users can further tap into the power of Earth Engine and other Google technologies to perform a range of analysis from simple statistics in custom regions to more complex machine learning models. We will highlight case studies in which organizations around the world have used Earth Engine to combine population data with multiple other sources of data, such as water resources and roads data, over deep stacks of temporal imagery to model disease risk and accessibility to inform decisions.
Alternative Fuels Data Center: America's Largest Home Runs on Biodiesel in
Alternative Fuels Data Center: Rhode Island EV Initiative Adds Chargers
Alternative Fuels Data Center: Worcester Regional Transit Authority Drives
Alternative Fuels Data Center: Propane Powers Airport Shuttles in New
Tonia, Thomy; Van Oyen, Herman; Berger, Anke; Schindler, Christian; Künzli, Nino
2016-05-01
We sought to investigate whether exposing scientific papers to social media (SM) has an effect on article downloads and citations. We randomized all International Journal of Public Health (IJPH) original articles published between December 2012 and December 2014 to SM exposure (blog post, Twitter and Facebook) or no exposure at three different time points after first online publication. 130 papers (SM exposure = 65, control = 65) were randomized. The number of downloads did not differ significantly between groups (p = 0.60) nor did the number of citations (p = 0.88). Adjusting for length of observation and paper's geographical origin did not change these results. There was no difference in the number of downloads and citations between the SM exposure and control group when we stratified for open access status. The number of downloads and number of citations were significantly correlated in both groups. SM exposure did not have a significant effect on traditional impact metrics, such as downloads and citations. However, other metrics may measure the added value that social media might offer to a scientific journal, such as wider dissemination.
Simulated single molecule microscopy with SMeagol.
Lindén, Martin; Ćurić, Vladimir; Boucharin, Alexis; Fange, David; Elf, Johan
2016-08-01
SMeagol is a software tool to simulate highly realistic microscopy data based on spatial systems biology models, in order to facilitate development, validation and optimization of advanced analysis methods for live cell single molecule microscopy data. SMeagol runs on Matlab R2014 and later, and uses compiled binaries in C for reaction-diffusion simulations. Documentation, source code and binaries for Mac OS, Windows and Ubuntu Linux can be downloaded from http://smeagol.sourceforge.net johan.elf@icm.uu.se Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
Modeling Passive Propagation of Malwares on the WWW
NASA Astrophysics Data System (ADS)
Chunbo, Liu; Chunfu, Jia
Web-based malware is hosted on fixed websites and is downloaded onto users' computers automatically as they browse. This passive propagation pattern differs from that of traditional viruses and worms. A propagation model based on the reverse web graph is proposed. In this model, the propagation of malware is analyzed by means of a random-jump matrix that combines the ordered and random components of user browsing behavior. Experiments with single and multiple propagation sources demonstrate the validity of the model. Using this model, one can evaluate the hazard posed by specific websites and take corresponding countermeasures.
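A generic sketch of a random-jump formulation on a small web graph is given below; the paper's exact matrix construction may differ, and the adjacency matrix and damping value here are invented.

```python
# Generic random-jump propagation matrix on a small web graph: with
# probability alpha a surfer follows links, otherwise they jump to a random
# page. This only illustrates mixing ordered and random browsing behaviour.
import numpy as np

# adjacency of a tiny web graph: A[i, j] = 1 if page i links to page j
A = np.array([
    [0, 1, 1, 0],
    [0, 0, 1, 0],
    [1, 0, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

out_deg = A.sum(axis=1, keepdims=True)
P_follow = A / out_deg                    # row-stochastic "follow a link" matrix
n = A.shape[0]
alpha = 0.85
P = alpha * P_follow + (1 - alpha) * np.full((n, n), 1.0 / n)

# stationary visit distribution: pages visited often by browsing users are the
# ones from which hosted malware would be downloaded most frequently
p = np.full(n, 1.0 / n)
for _ in range(100):
    p = p @ P
print("stationary visit probabilities:", np.round(p, 3))
```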
Interactive web-based identification and visualization of transcript shared sequences.
Azhir, Alaleh; Merino, Louis-Henri; Nauen, David W
2018-05-12
We have developed TraC (Transcript Consensus), a web-based tool for detecting and visualizing shared sequences among two or more mRNA transcripts such as splice variants. Results including exon-exon boundaries are returned in a highly intuitive, data-rich, interactive plot that permits users to explore the similarities and differences of multiple transcript sequences. The online tool (http://labs.pathology.jhu.edu/nauen/trac/) is free to use. The source code is freely available for download (https://github.com/nauenlab/TraC). Copyright © 2018 Elsevier Inc. All rights reserved.
Using the Astrophysics Source Code Library
NASA Astrophysics Data System (ADS)
Allen, Alice; Teuben, P. J.; Berriman, G. B.; DuPrie, K.; Hanisch, R. J.; Mink, J. D.; Nemiroff, R. J.; Shamir, L.; Wallin, J. F.
2013-01-01
The Astrophysics Source Code Library (ASCL) is a free on-line registry of source codes that are of interest to astrophysicists; with over 500 codes, it is the largest collection of scientist-written astrophysics programs in existence. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are available either via a download site or from an identified source. An advisory committee formed in 2011 provides input and guides the development and expansion of the ASCL, and since January 2012, all accepted ASCL entries are indexed by ADS. Though software is increasingly important for the advancement of science in astrophysics, these methods are still often hidden from view or difficult to find. The ASCL (ascl.net/) seeks to improve the transparency and reproducibility of research by making these vital methods discoverable, and to provide recognition and incentive to those who write and release programs useful for astrophysics research. This poster provides a description of the ASCL, an update on recent additions, and the changes in the astrophysics community we are starting to see because of the ASCL.
Assessment of Department of Defense Basic Research
2005-01-01
Report available for download from the National Academies Press: http://www.nap.edu/catalog/11177.html.
Privacy as Part of the App Decision-Making Process
2013-02-06
concerns, they may not actively consider privacy while downloading apps from smartphone application marketplaces. Currently, Android users have only the Android permissions display, which appears after they have selected an app to download, to help them understand how applications access their information.
MetaPhinder—Identifying Bacteriophage Sequences in Metagenomic Data Sets
Villarroel, Julia; Lund, Ole; Voldby Larsen, Mette; Nielsen, Morten
2016-01-01
Bacteriophages are the most abundant biological entity on the planet, but at the same time do not account for much of the genetic material isolated from most environments due to their small genome sizes. They also show great genetic diversity and mosaic genomes making it challenging to analyze and understand them. Here we present MetaPhinder, a method to identify assembled genomic fragments (i.e.contigs) of phage origin in metagenomic data sets. The method is based on a comparison to a database of whole genome bacteriophage sequences, integrating hits to multiple genomes to accomodate for the mosaic genome structure of many bacteriophages. The method is demonstrated to out-perform both BLAST methods based on single hits and methods based on k-mer comparisons. MetaPhinder is available as a web service at the Center for Genomic Epidemiology https://cge.cbs.dtu.dk/services/MetaPhinder/, while the source code can be downloaded from https://bitbucket.org/genomicepidemiology/metaphinder or https://github.com/vanessajurtz/MetaPhinder. PMID:27684958
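The hit-integration step can be sketched as follows: BLAST hit intervals on a contig are merged and an alignment-length-weighted identity is computed. MetaPhinder's actual scoring may differ in detail, and the hit coordinates below are made up.

```python
# Sketch of integrating BLAST hits from several phage genomes: merge hit
# intervals along the contig and compute a length-weighted identity.
def merged_coverage(hits, contig_len):
    """hits: list of (start, end, percent_identity), 1-based inclusive."""
    intervals = sorted((s, e) for s, e, _ in hits)
    merged = []
    for s, e in intervals:
        if merged and s <= merged[-1][1] + 1:
            merged[-1] = (merged[-1][0], max(merged[-1][1], e))   # overlap: extend
        else:
            merged.append((s, e))
    covered = sum(e - s + 1 for s, e in merged)
    weighted_id = (sum((e - s + 1) * pid for s, e, pid in hits) /
                   sum(e - s + 1 for s, e, _ in hits))
    return covered / contig_len, weighted_id

hits = [(1, 500, 92.0), (400, 900, 88.0), (2000, 2500, 95.0)]  # hits to 3 phage genomes
cov, ani = merged_coverage(hits, contig_len=3000)
print(f"merged coverage: {cov:.2f}, weighted identity: {ani:.1f}%")
```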
MetaPhinder-Identifying Bacteriophage Sequences in Metagenomic Data Sets.
Jurtz, Vanessa Isabell; Villarroel, Julia; Lund, Ole; Voldby Larsen, Mette; Nielsen, Morten
Bacteriophages are the most abundant biological entity on the planet, but at the same time do not account for much of the genetic material isolated from most environments due to their small genome sizes. They also show great genetic diversity and mosaic genomes making it challenging to analyze and understand them. Here we present MetaPhinder, a method to identify assembled genomic fragments (i.e.contigs) of phage origin in metagenomic data sets. The method is based on a comparison to a database of whole genome bacteriophage sequences, integrating hits to multiple genomes to accomodate for the mosaic genome structure of many bacteriophages. The method is demonstrated to out-perform both BLAST methods based on single hits and methods based on k-mer comparisons. MetaPhinder is available as a web service at the Center for Genomic Epidemiology https://cge.cbs.dtu.dk/services/MetaPhinder/, while the source code can be downloaded from https://bitbucket.org/genomicepidemiology/metaphinder or https://github.com/vanessajurtz/MetaPhinder.
LOLAweb: a containerized web server for interactive genomic locus overlap enrichment analysis.
Nagraj, V P; Magee, Neal E; Sheffield, Nathan C
2018-06-06
The past few years have seen an explosion of interest in understanding the role of regulatory DNA. This interest has driven large-scale production of functional genomics data and analytical methods. One popular analysis is to test for enrichment of overlaps between a query set of genomic regions and a database of region sets. In this way, new genomic data can be easily connected to annotations from external data sources. Here, we present an interactive interface for enrichment analysis of genomic locus overlaps using a web server called LOLAweb. LOLAweb accepts a set of genomic ranges from the user and tests it for enrichment against a database of region sets. LOLAweb renders results in an R Shiny application to provide interactive visualization features, enabling users to filter, sort, and explore enrichment results dynamically. LOLAweb is built and deployed in a Linux container, making it scalable to many concurrent users on our servers and also enabling users to download and run LOLAweb locally.
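A minimal sketch of a locus-overlap enrichment test of this general kind is shown below, using Fisher's exact test from SciPy on invented counts; LOLA and LOLAweb perform this against many database region sets at once with additional bookkeeping (including a user-supplied universe).

```python
# Minimal locus-overlap enrichment: count how many query regions overlap a
# database region set versus how many universe regions do, then apply
# Fisher's exact test. The counts here are invented.
from scipy.stats import fisher_exact

n_query, query_hits = 500, 120              # query regions and their overlaps
n_universe, universe_hits = 20_000, 1_500   # background universe and its overlaps

table = [
    [query_hits, n_query - query_hits],
    [universe_hits - query_hits,
     (n_universe - n_query) - (universe_hits - query_hits)],
]
odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2e}")
```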
Estimating population diversity with CatchAll
Bunge, John; Woodard, Linda; Böhning, Dankmar; Foster, James A.; Connolly, Sean; Allen, Heather K.
2012-01-01
Motivation: The massive data produced by next-generation sequencing require advanced statistical tools. We address estimating the total diversity or species richness in a population. To date, only relatively simple methods have been implemented in available software. There is a need for software employing modern, computationally intensive statistical analyses including error, goodness-of-fit and robustness assessments. Results: We present CatchAll, a fast, easy-to-use, platform-independent program that computes maximum likelihood estimates for finite-mixture models, weighted linear regression-based analyses and coverage-based non-parametric methods, along with outlier diagnostics. Given sample ‘frequency count’ data, CatchAll computes 12 different diversity estimates and applies a model-selection algorithm. CatchAll also derives discounted diversity estimates to adjust for possibly uncertain low-frequency counts. It is accompanied by an Excel-based graphics program. Availability: Free executable downloads for Linux, Windows and Mac OS, with manual and source code, at www.northeastern.edu/catchall. Contact: jab18@cornell.edu PMID:22333246
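As a simple illustration of estimating richness from frequency-count data of the kind CatchAll consumes, the sketch below computes the nonparametric Chao1 lower bound; this is not CatchAll's finite-mixture or regression machinery, and the counts are invented.

```python
# Richness estimation from "frequency count" data: f[k] is the number of
# species observed exactly k times. Chao1 is a simple nonparametric lower
# bound, shown here for illustration only.
def chao1(frequency_counts):
    """frequency_counts: dict mapping k -> number of species seen k times."""
    observed = sum(frequency_counts.values())
    f1 = frequency_counts.get(1, 0)     # singletons
    f2 = frequency_counts.get(2, 0)     # doubletons
    if f2 == 0:
        return observed + f1 * (f1 - 1) / 2.0   # bias-corrected form
    return observed + f1 * f1 / (2.0 * f2)

counts = {1: 120, 2: 45, 3: 20, 4: 10, 5: 6}     # made-up frequency counts
print("observed species:", sum(counts.values()))
print("Chao1 estimate of total richness:", round(chao1(counts), 1))
```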
A Mobile Data Application for the Fermi Mission
NASA Astrophysics Data System (ADS)
Stephens, Thomas E.; Science Support Center, Fermi
2014-01-01
With the ever increasing use of smartphones and tablets among scientists and the world at large, it becomes increasingly important for projects and missions to have mobile friendly access to their data. This access could come in the form of mobile friendly websites and/or native mobile applications that allow the users to explore or access the data. The Fermi Gamma-ray Space Telescope mission has begun work along the latter path. In this poster I present the current version of the Fermi Data Portal, a native mobile application for both Android and iOS devices that allows access to various high level public data products from the Fermi Science Support Center (FSSC), the Gamma-ray Coordinate Network (GCN), and other sources. While network access is required to download data, most of the data served by the app are stored locally and are available even when a network connection is not available. This poster discusses the application's features as well as the development experience and lessons learned so far along the way.
A Mobile Data Application for the Fermi Mission
NASA Astrophysics Data System (ADS)
Stephens, T. E.
2013-10-01
With the ever increasing use of smartphones and tablets among scientists and the world at large, it becomes increasingly important for projects and missions to have mobile friendly access to their data. This access could come in the form of mobile friendly websites and/or native mobile applications that allow the users to explore or access the data. The Fermi Gamma-ray Space Telescope Mission has begun work along the latter path. In this poster I present the initial version of the Fermi Mobile Data Portal, a native application for both Android and iOS devices that allows access to various high level public data products from the Fermi Science Support Center (FSSC), the Gamma-ray Coordinate Network (GCN), and other sources. While network access is required to download data, most of the data served by the app are stored locally and are available even when a network connection is not available. This poster discusses the application's features as well as the development experience and lessons learned so far along the way.
SfM with MRFs: discrete-continuous optimization for large-scale structure from motion.
Crandall, David J; Owens, Andrew; Snavely, Noah; Huttenlocher, Daniel P
2013-12-01
Recent work in structure from motion (SfM) has built 3D models from large collections of images downloaded from the Internet. Many approaches to this problem use incremental algorithms that solve progressively larger bundle adjustment problems. These incremental techniques scale poorly as the image collection grows, and can suffer from drift or local minima. We present an alternative framework for SfM based on finding a coarse initial solution using hybrid discrete-continuous optimization and then improving that solution using bundle adjustment. The initial optimization step uses a discrete Markov random field (MRF) formulation, coupled with a continuous Levenberg-Marquardt refinement. The formulation naturally incorporates various sources of information about both the cameras and points, including noisy geotags and vanishing point (VP) estimates. We test our method on several large-scale photo collections, including one with measured camera positions, and show that it produces models that are similar to or better than those produced by incremental bundle adjustment, but more robustly and in a fraction of the time.
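To make the flavor of the coarse initialization concrete, here is a heavily simplified, continuous-only sketch that fuses noisy geotags with pairwise relative-position constraints by linear least squares; the paper's actual method uses a discrete MRF formulation, Levenberg-Marquardt refinement and full bundle adjustment, and all numbers below are invented.

```python
# A drastically simplified, continuous-only sketch of one idea in the paper:
# estimate camera positions from noisy geotags plus pairwise relative-position
# constraints by linear least squares. Weights and data are made up.
import numpy as np

n_cams = 4
geotags = np.array([[0.0, 0.1], [1.2, 0.0], [2.1, 1.0], [2.9, 2.2]])  # noisy GPS
# Pairwise constraints: camera j minus camera i should equal t_ij (from matching).
pairs = [(0, 1, np.array([1.0, 0.0])),
         (1, 2, np.array([1.0, 1.0])),
         (2, 3, np.array([1.0, 1.0]))]

rows, rhs = [], []
w_geo, w_pair = 0.5, 1.0                      # relative weighting of the two terms
for i, g in enumerate(geotags):               # geotag prior: x_i ~ g_i
    r = np.zeros(n_cams); r[i] = w_geo
    rows.append(r); rhs.append(w_geo * g)
for i, j, t in pairs:                         # relative term: x_j - x_i ~ t_ij
    r = np.zeros(n_cams); r[i], r[j] = -w_pair, w_pair
    rows.append(r); rhs.append(w_pair * t)

A, b = np.vstack(rows), np.vstack(rhs)        # solve both coordinates at once
positions, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(positions, 2))
```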
On-the-fly Data Reprocessing and Analysis Capabilities from the XMM-Newton Archive
NASA Astrophysics Data System (ADS)
Ibarra, A.; Sarmiento, M.; Colomo, E.; Loiseau, N.; Salgado, J.; Gabriel, C.
2017-10-01
Since its latest release, the XMM-Newton Science Archive (XSA) offers on-the-fly data processing with SAS through the Remote Interface for Science Analysis (RISA) server, enabling scientists to analyse data without downloading or installing either the data or the software. The analysis options presently available include extraction of spectra and light curves from user-defined EPIC source regions and full reprocessing of data whose archived pipeline products were generated with older SAS versions or calibration files. The current pipeline is fully aligned with the most recent SAS and calibration, while the last full reprocessing of the archive was performed in 2013. The on-the-fly data processing functionality in this release is an experimental version, and we invite the community to test it and report their results. Known issues and workarounds are described in the 'Watchouts' section of the XSA web page. Feedback on how this functionality should evolve will be highly appreciated.
WormQTL—public archive and analysis web portal for natural variation data in Caenorhabditis spp
Snoek, L. Basten; Van der Velde, K. Joeri; Arends, Danny; Li, Yang; Beyer, Antje; Elvin, Mark; Fisher, Jasmin; Hajnal, Alex; Hengartner, Michael O.; Poulin, Gino B.; Rodriguez, Miriam; Schmid, Tobias; Schrimpf, Sabine; Xue, Feng; Jansen, Ritsert C.; Kammenga, Jan E.; Swertz, Morris A.
2013-01-01
Here, we present WormQTL (http://www.wormqtl.org), an easily accessible database enabling search, comparative analysis and meta-analysis of all data on variation in Caenorhabditis spp. Over the past decade, Caenorhabditis elegans has become instrumental for molecular quantitative genetics and the systems biology of natural variation. These efforts have resulted in a valuable amount of phenotypic, high-throughput molecular and genotypic data across different developmental worm stages and environments in hundreds of C. elegans strains. WormQTL provides a workbench of analysis tools for genotype–phenotype linkage and association mapping based on but not limited to R/qtl (http://www.rqtl.org). All data can be uploaded and downloaded using simple delimited text or Excel formats and are accessible via a public web user interface for biologists and R statistic and web service interfaces for bioinformaticians, based on open source MOLGENIS and xQTL workbench software. WormQTL welcomes data submissions from other worm researchers. PMID:23180786
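As a conceptual sketch of the genotype-phenotype mapping that WormQTL's workbench supports, the example below performs the simplest possible single-marker association scan on synthetic strain data; it stands in for, and is not equivalent to, the R/qtl-based tools the portal provides, and all data are synthetic.

```python
# Conceptual sketch (not R/qtl or WormQTL's workbench): the simplest form of
# genotype-phenotype association mapping, a single-marker regression of a trait
# on genotypes coded 0/1, applied marker by marker.
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(7)
n_strains, n_markers = 120, 200
genotypes = rng.integers(0, 2, size=(n_strains, n_markers))     # RIL-like 0/1 calls
causal = 42
phenotype = 1.5 * genotypes[:, causal] + rng.normal(0, 1, n_strains)

profile = []
for m in range(n_markers):
    res = linregress(genotypes[:, m], phenotype)
    profile.append(-np.log10(res.pvalue))          # -log10(p) along the markers

best = int(np.argmax(profile))
print(f"strongest association at marker {best} (-log10 p = {profile[best]:.1f})")
```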
Seismic Noise Characterization in the Northern Mississippi Embayment
NASA Astrophysics Data System (ADS)
Wiley, S.; Deshon, H. R.; Boyd, O. S.
2009-12-01
We present a study of seismic noise sources present within the northern Mississippi embayment near the New Madrid Seismic Zone (NMSZ). The northern embayment contains up to 1 km of unconsolidated coastal plain sediments overlying bedrock, making it an inherently noisy environment for seismic stations. The area is known to display high levels of cultural noise caused by agricultural activity, passing cars, trains, etc. We characterize continuous broadband seismic noise data recorded for the months of March through June 2009 at six stations operated by the Cooperative New Madrid Seismic Network. We examined a single horizontal component of data during nighttime hours, defined as 6:15 PM to 5:45 AM Central Standard Time, which we determined to be the lowest-amplitude noise period for the region. Hourly median amplitudes were compared to daily average wind speeds downloaded from the National Oceanic and Atmospheric Administration. We find a correlation between periods of increased noise and days with high wind speeds, suggesting that wind is likely a prevalent source of seismic noise in the area. The effect of wind on seismic recordings may result from wind-induced tree-root movement, which causes ground motion to be recorded at the vaults located ~3 m below ground. Automated studies utilizing the local network or the EarthScope Transportable Array, scheduled to arrive in the area in 2010-11, should expect to encounter wind-induced noise fluctuations and must account for this in their analysis.
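The wind comparison described above boils down to correlating two daily time series. A minimal sketch of that step, with invented stand-in numbers for the station noise amplitudes and the NOAA wind speeds, is:

```python
# Minimal sketch of the comparison described above: correlate daily median
# night-time noise amplitudes with daily average wind speed. Values are
# hypothetical stand-ins for the station data and NOAA wind records.
import numpy as np

wind_speed = np.array([2.1, 3.4, 7.8, 6.5, 1.9, 8.2, 4.0])      # m/s, daily average
noise_amp = np.array([0.8, 1.1, 2.9, 2.4, 0.7, 3.1, 1.3])       # daily median amplitude

r = np.corrcoef(wind_speed, noise_amp)[0, 1]
print(f"Pearson correlation between wind speed and seismic noise: r = {r:.2f}")
```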
Use of Deception to Improve Client Honeypot Detection of Drive-by-Download Attacks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Popovsky, Barbara; Narvaez Suarez, Julia F.; Seifert, Christian
2009-07-24
This paper presents the application of deception theory to improve the success of client honeypots at detecting malicious web page attacks from infected servers programmed by online criminals to launch drive-by-download attacks. The design of honeypots faces three main challenges: deception, how to design honeypots that appear to be real systems; counter-deception, techniques used to identify honeypots and hence defeat their deceiving nature; and counter-counter-deception, how to design honeypots that deceive attackers. The authors propose the application of a deception model known as the deception planning loop to identify the current status of honeypot research, development and deployment. The analysis leads to a proposal to formulate a landscape of honeypot research and to plan the steps ahead.
NASA Astrophysics Data System (ADS)
Copas, K.; Legind, J. K.; Hahn, A.; Braak, K.; Høftt, M.; Noesgaard, D.; Robertson, T.; Méndez Hernández, F.; Schigel, D.; Ko, C.
2017-12-01
GBIF—the Global Biodiversity Information Facility—has recently demonstrated a system that tracks publications back to individual datasets, giving data providers demonstrable evidence of the benefit and utility of sharing data to support an array of scholarly topics and practical applications. GBIF is an open-data network and research infrastructure funded by the world's governments. Its community consists of more than 90 formal participants and almost 1,000 data-publishing institutions, which currently make tens of thousands of datasets containing nearly 800 million species occurrence records freely and publicly available for discovery, use and reuse across a wide range of biodiversity-related research and policy investigations. Starting in 2015 with the help of DataONE, GBIF introduced DOIs as persistent identifiers for the datasets shared through its network. This enhancement soon extended to the assignment of DOIs to user downloads from GBIF.org, which typically filter the available records with a variety of taxonomic, geographic, temporal and other search terms. Despite the lack of widely accepted standards for citing data among researchers and publications, this technical infrastructure is beginning to take hold and support open, transparent, persistent and repeatable use and reuse of species occurrence data. These 'download DOIs' provide canonical references for the search results researchers process and use in peer-reviewed articles—a practice GBIF encourages by confirming new DOIs with each download and offering guidelines on citation. GBIF has recently started linking these citation results back to dataset and publisher pages, offering more consistent, traceable evidence of the value of sharing data to support others' research. GBIF's experience may be a useful model for other repositories to follow.
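For readers who want to see what a filtered occurrence query looks like programmatically, the sketch below uses GBIF's public occurrence search API (assumed to be reachable at api.gbif.org/v1) with example filters; note that the citable download DOIs described above are minted for downloads requested through GBIF.org or the authenticated download service, not for these preview searches.

```python
# Small sketch (assuming the public GBIF occurrence search API at api.gbif.org):
# preview a filtered occurrence query of the kind that, when executed as a
# download on GBIF.org, is assigned its own citable DOI. Filters are examples.
import requests

params = {
    "scientificName": "Puma concolor",    # taxonomic filter
    "country": "US",                      # geographic filter (ISO country code)
    "year": "2015,2016",                  # temporal filter (range)
    "limit": 5,
}
resp = requests.get("https://api.gbif.org/v1/occurrence/search",
                    params=params, timeout=30)
resp.raise_for_status()
data = resp.json()
print("matching records:", data.get("count"))
for rec in data.get("results", []):
    print(rec.get("species"), rec.get("eventDate"), rec.get("decimalLatitude"))
```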
VizieR Online Data Catalog: Multiwavelength study of HII region S311 (Yadav+, 2016)
NASA Astrophysics Data System (ADS)
Yadav, R. K.; Pandey, A. K.; Sharma, S.; Ojha, D. K.; Samal, M. R.; Mallick, K. K.; Jose, J.; Ogura, K.; Richichi, A.; Irawati, P.; Kobayashi, N.; Eswaraiah, C.
2017-11-01
We observed the HII region S311 (centred on RA(2000)=07:52:24, DE(2000)=-26:24:58.40) in NIR broad-bands J (1.25um), H (1.63um) and Ks (2.14um) on 2010 March 3 using the Infrared Side Port Imager (ISPI) camera mounted on the CTIO Blanco 4-m telescope. We consider only those sources having error <0.1mag in all three bands, resulting in a final catalogue of 2671 point sources. The Spitzer-IRAC observations for the S311 region (PID 20726) were made on 2006 May 3 using the 3.6, 4.5, 5.8 and 8.0um bands and were downloaded from the Spitzer heritage archive (SHA). (4 data files).
D-GENIES: dot plot large genomes in an interactive, efficient and simple way.
Cabanettes, Floréal; Klopp, Christophe
2018-01-01
Dot plots are widely used to quickly compare sequence sets. They provide a synthetic similarity overview, highlighting repetitions, breaks and inversions. Different tools have been developed to easily generate genomic alignment dot plots, but they are often limited in the input sequence size. D-GENIES is a standalone and web application performing large genome alignments using the minimap2 software package and generating interactive dot plots. It enables users to sort query sequences along the reference, zoom in on the plot and download several image, alignment or sequence files. D-GENIES is an easy-to-install, open-source software package (GPL) developed in Python and JavaScript. The source code is available at https://github.com/genotoul-bioinfo/dgenies and it can be tested at http://dgenies.toulouse.inra.fr/.
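The underlying dot-plot idea can be sketched in a few lines: parse the PAF alignments produced by minimap2 and draw each block as a segment. This is a minimal illustration assuming a single query and single target and a hypothetical input file name, not D-GENIES's interactive implementation.

```python
# Minimal sketch of the underlying idea (not D-GENIES itself): read a PAF file
# produced by minimap2 and draw each alignment block as a segment in a dot plot.
import matplotlib.pyplot as plt

segments = []
with open("alignments.paf") as paf:                  # hypothetical minimap2 output
    for line in paf:
        f = line.rstrip("\n").split("\t")
        qstart, qend, strand = int(f[2]), int(f[3]), f[4]
        tstart, tend = int(f[7]), int(f[8])
        if strand == "-":                            # reverse-strand hits slope down
            tstart, tend = tend, tstart
        segments.append(((tstart, tend), (qstart, qend)))

for (x, y) in segments:
    plt.plot(x, y, linewidth=1)
plt.xlabel("target position (bp)")
plt.ylabel("query position (bp)")
plt.title("Dot plot from PAF alignments")
plt.savefig("dotplot.png", dpi=150)
```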
An event database for rotational seismology
NASA Astrophysics Data System (ADS)
Salvermoser, Johannes; Hadziioannou, Celine; Hable, Sarah; Chow, Bryant; Krischer, Lion; Wassermann, Joachim; Igel, Heiner
2016-04-01
The ring laser sensor (G-ring) located at Wettzell, Germany, has routinely observed earthquake-induced rotational ground motions around a vertical axis since its installation in 2003. Here we present results from a recently installed event database, which is the first to provide ring laser event data in an open-access format. Based on the GCMT event catalogue and some search criteria, seismograms from the ring laser and the collocated broadband seismometer are extracted and processed. The ObsPy-based processing scheme generates plots showing waveform fits between rotation rate and transverse acceleration and extracts characteristic wavefield parameters such as peak ground motions, noise levels, Love wave phase velocities and waveform coherence. For each event, these parameters are stored in a text file (a JSON dictionary) that is easily readable and accessible on the website. The database contains >10,000 events starting in 2007 (Mw>4.5). It is updated daily and therefore provides recent events with a maximum delay of 24 hours. The user interface allows users to filter events by epoch, magnitude and source area, whereupon the events are displayed on a zoomable world map. We investigate how well the rotational motions are compatible with the expectations from the surface wave magnitude scale. In addition, the website offers some Python source code examples for downloading and processing the openly accessible waveforms.
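In the spirit of the Python examples the website offers, here is a hedged sketch of fetching openly accessible waveforms with ObsPy's FDSN client; the provider key and the station and channel codes are assumptions (the Wettzell ring laser is commonly referenced as BW.RLAS) and may need adjusting.

```python
# Hedged sketch of fetching openly accessible waveforms with ObsPy's FDSN client.
# The provider key and station/channel codes below are illustrative assumptions
# and may need adjusting for the actual data source.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("LMU")                                # assumed FDSN provider key
t0 = UTCDateTime("2016-01-01T00:00:00")
st = client.get_waveforms(network="BW", station="RLAS", location="",
                          channel="BJZ", starttime=t0, endtime=t0 + 3600)
st.detrend("linear")
st.filter("bandpass", freqmin=0.01, freqmax=1.0)      # keep long-period surface waves
print(st)
st.plot(outfile="rotation_rate.png")
```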
NASA Astrophysics Data System (ADS)
Mede, Kyle; Brandt, Timothy D.
2017-03-01
We present the Exoplanet Simple Orbit Fitting Toolbox (ExoSOFT), a new, open-source suite to fit the orbital elements of planetary or stellar-mass companions to any combination of radial velocity and astrometric data. To explore the parameter space of Keplerian models, ExoSOFT may be operated with its own multistage sampling approach or interfaced with third-party tools such as emcee. In addition, ExoSOFT is packaged with a collection of post-processing tools to analyze and summarize the results. Although only a few systems have been observed with both radial velocity and direct imaging techniques, this number will increase, thanks to upcoming spacecraft and ground-based surveys. Providing both forms of data enables simultaneous fitting that can help break degeneracies in the orbital elements that arise when only one data type is available. The dynamical mass estimates this approach can produce are important when investigating the formation mechanisms and subsequent evolution of substellar companions. ExoSOFT was verified through fitting to artificial data and was implemented using the Python and Cython programming languages; it is available for public download at https://github.com/kylemede/ExoSOFT under GNU General Public License v3.
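As a hedged illustration of the kind of posterior sampling ExoSOFT can interface with through emcee, the toy example below fits a circular-orbit radial-velocity sinusoid to synthetic data; it assumes emcee version 3 and is not ExoSOFT's Keplerian model or its multistage sampler.

```python
# Toy example of sampling an orbit-like model with emcee (not ExoSOFT itself):
# fit v(t) = K sin(2*pi*t/P + phi) + gamma to synthetic radial velocities.
import numpy as np
import emcee

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 300, 40))                       # observation times (days)
true = dict(K=35.0, P=47.0, phi=0.4, gamma=5.0)            # m/s, days, rad, m/s
rv_err = 4.0
rv = true["K"] * np.sin(2 * np.pi * t / true["P"] + true["phi"]) + true["gamma"]
rv += rng.normal(0, rv_err, t.size)

def log_prob(theta):
    K, P, phi, gamma = theta
    if not (0 < K < 200 and 1 < P < 1000 and -np.pi < phi < np.pi):
        return -np.inf                                     # flat priors within bounds
    model = K * np.sin(2 * np.pi * t / P + phi) + gamma
    return -0.5 * np.sum(((rv - model) / rv_err) ** 2)

ndim, nwalkers = 4, 32
p0 = np.array([30.0, 45.0, 0.0, 0.0]) + 1e-3 * rng.normal(size=(nwalkers, ndim))
sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
sampler.run_mcmc(p0, 3000, progress=False)
samples = sampler.get_chain(discard=1000, flat=True)
print("posterior medians (K, P, phi, gamma):",
      np.round(np.median(samples, axis=0), 2))
```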
NASA Astrophysics Data System (ADS)
Brecher, Kenneth
2006-12-01
Project LITE (Light Inquiry Through Experiments) is a materials, software, and curriculum development project. It focuses on light, optics, color and visual perception. According to two recent surveys of college astronomy faculty members, these are among the topics most often included in the large introductory astronomy courses. The project has aimed largely at the design and implementation of hands-on experiences for students. However, it has also included the development of lecture demonstrations that employ novel light sources and materials. In this presentation, we will show some of our new lecture demonstrations concerning geometrical and physical optics, fluorescence, phosphorescence and polarization. We have developed over 200 Flash and Java applets that can be used either by teachers in lecture settings or by students at home. They are all posted on the web at http://lite.bu.edu. For either purpose they can be downloaded directly to the user's computer or run off line. In lecture demonstrations, some of these applets can be used to control the light emitted by video projectors to produce physical effects in materials (e.g. fluorescence). Other applets can be used, for example, to demonstrate that the human percept of color does not have a simple relationship with the physical frequency of the stimulating source of light. Project LITE is supported by Grant #DUE-0125992 from the NSF Division of Undergraduate Education.
Merrill, Matthew D.; Slucher, Ernie R.; Roberts-Ashby, Tina L.; Warwick, Peter D.; Blondes, Madalyn S.; Freeman, P.A.; Cahan, Steven M.; DeVera, Christina A.; Lohr, Celeste D.; Warwick, Peter D.; Corum, Margo D.
2015-01-01
The U.S. Geological Survey has completed an assessment of the potential geologic carbon dioxide storage resource in the onshore areas of the United States. To provide geological context and input data sources for the resources numbers, framework documents are being prepared for all areas that were investigated as part of the national assessment. This report is the geologic framework document for the Permian and Palo Duro Basins, the combined Bend arch-Fort Worth Basin area, and subbasins therein of Texas, New Mexico, and Oklahoma. In addition to a summarization of the geology and petroleum resources of studied basins, the individual storage assessment units (SAUs) within the basins are described and explanations for their selection are presented. Though appendixes in the national assessment publications include the input values used to calculate the available storage resource, this framework document provides only the context and source of inputs selected by the assessment geologists. Spatial files of boundaries for the SAUs herein, as well as maps of the density of known well bores that penetrate the SAU seal, are available for download with the release of this report.
Energy map of southwestern Wyoming, Part A - Coal and wind
Biewick, Laura; Jones, Nicholas R.
2012-01-01
To further advance the objectives of the Wyoming Landscape Conservation Initiative (WLCI) the U.S. Geological Survey (USGS) and the Wyoming State Geological Survey (WSGS) have compiled Part A of the Energy Map of Southwestern Wyoming. Focusing primarily on electrical power sources, Part A of the energy map is a compilation of both published and previously unpublished coal (including coalbed gas) and wind energy resources data, presented in a Geographic Information System (GIS) data package. Energy maps, data, documentation and spatial data processing capabilities are available in a geodatabase, published map file (pmf), ArcMap document (mxd), Adobe Acrobat PDF map (plate 1) and other digital formats that can be downloaded at the USGS website. Accompanying the map (plate 1) and the geospatial data are four additional plates that describe the geology, energy resources, and related infrastructure. These tabular plates include coal mine (plate 2), coal field (plate 3), coalbed gas assessment unit (plate 4), and wind farm (plate 5) information with hyperlinks to source publications and data on the internet. The plates can be printed and examined in hardcopy, or accessed digitally. The data represent decades of research by the USGS, WSGS, BLM and others, and can facilitate landscape-level science assessments, and resource management decisionmaking.
Merrill, Matthew D.; Drake, Ronald M.; Buursink, Marc L.; Craddock, William H.; East, Joseph A.; Slucher, Ernie R.; Warwick, Peter D.; Brennan, Sean T.; Blondes, Madalyn S.; Freeman, Philip A.; Cahan, Steven M.; DeVera, Christina A.; Lohr, Celeste D.; Warwick, Peter D.; Corum, Margo D.
2016-06-02
The U.S. Geological Survey has completed an assessment of the potential geologic carbon dioxide storage resources in the onshore areas of the United States. To provide geological context and input data sources for the resources numbers, framework documents are being prepared for all areas that were investigated as part of the national assessment. This report, chapter M, is the geologic framework document for the Uinta and Piceance, San Juan, Paradox, Raton, Eastern Great, and Black Mesa Basins, and subbasins therein of Arizona, Colorado, Idaho, Nevada, New Mexico, and Utah. In addition to a summary of the geology and petroleum resources of studied basins, the individual storage assessment units (SAUs) within the basins are described and explanations for their selection are presented. Although appendixes in the national assessment publications include the input values used to calculate the available storage resource, this framework document provides only the context and source of the input values selected by the assessment geologists. Spatial-data files of the boundaries for the SAUs, and the well-penetration density of known well bores that penetrate the SAU seal, are available for download with the release of this report.
EPA Enforcement and Compliance History Online
The Environmental Protection Agency's Enforcement and Compliance History Online (ECHO) website provides customizable and downloadable information about environmental inspections, violations, and enforcement actions for EPA-regulated facilities related to the Clean Air Act, Clean Water Act, Resource Conservation and Recovery Act, and Safe Drinking Water Act. These data are updated weekly as part of the ECHO data refresh, and ECHO offers many user-friendly options to explore the data. Facility Search: ECHO information is searchable by varied criteria, including location, facility type, and compliance status; search results are customizable and downloadable. Comparative Maps and State Dashboards: these tools offer aggregated information about facility compliance status, regulatory agency compliance monitoring, and enforcement activity at the national and state levels. Bulk Data Downloads: one of ECHO's most popular features is the ability to work offline by downloading large data sets; users can take advantage of the ECHO Exporter, which provides summary information about each facility in comma-separated values (CSV) file format, or download data sets by program as zip files.
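A hedged sketch of the offline workflow described above: download a bulk zip archive, extract its CSV, and load it with pandas. The URL below is a placeholder rather than a verified ECHO endpoint; substitute the actual link from the ECHO data downloads page.

```python
# Hedged sketch of working offline with an ECHO-style bulk download: fetch a zip
# archive, extract its CSV, and load it with pandas. The URL is a placeholder.
import io
import zipfile
import requests
import pandas as pd

url = "https://example.gov/echo_exporter.zip"          # placeholder download URL
resp = requests.get(url, timeout=120)
resp.raise_for_status()

with zipfile.ZipFile(io.BytesIO(resp.content)) as zf:
    csv_name = next(n for n in zf.namelist() if n.lower().endswith(".csv"))
    with zf.open(csv_name) as fh:
        facilities = pd.read_csv(fh, low_memory=False)

print(facilities.shape)
print(facilities.columns[:10].tolist())                 # peek at available fields
```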
NASA Astrophysics Data System (ADS)
Zhang, Xiao-bo; Wang, Zhi-xue; Li, Jian-xin; Ma, Jian-hui; Li, Yang; Li, Yan-qiang
To support Bluetooth functionality and allow information to be tracked effectively during production, vehicle Bluetooth hands-free devices need key parameters such as the Bluetooth address, CVC license and base plate numbers to be downloaded into them. The aim is therefore to find a simple and effective method for downloading these parameters to each vehicle Bluetooth hands-free device, and for controlling and recording their use. In this paper, a Bluetooth Serial Peripheral Interface (SPI) programmer device is used to switch the parallel port to SPI. The first step in downloading parameters is to simulate SPI with the parallel port: the SPI function is performed by operating the parallel port in accordance with the SPI timing. The next step is to implement the SPI data transmit and receive functions according to the selected programming parameters. With this method, downloading parameters is fast and accurate, fully meeting the production requirements for vehicle Bluetooth hands-free devices, and it has played a significant role on the production line.
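The bit-banging idea described above (driving SPI timing from a general-purpose port) can be sketched as follows; the pin-access helpers are placeholders for real parallel-port I/O, the timing constant is arbitrary, and the parameter block is hypothetical.

```python
# Conceptual sketch of bit-banged SPI (mode 0): drive clock and MOSI following
# SPI timing and sample MISO. Port access is abstracted behind set_pin/get_pin,
# which are placeholders rather than a real parallel-port driver.
import time

CLK, MOSI, MISO, CS = "clk", "mosi", "miso", "cs"
_pins = {CLK: 0, MOSI: 0, MISO: 0, CS: 1}

def set_pin(name, value):        # placeholder for a real parallel-port write
    _pins[name] = value

def get_pin(name):               # placeholder for a real parallel-port read
    return _pins[name]

def spi_transfer_byte(byte_out, half_period=1e-5):
    """Clock one byte out on MOSI (MSB first) and read one byte back on MISO."""
    byte_in = 0
    for bit in range(7, -1, -1):
        set_pin(MOSI, (byte_out >> bit) & 1)    # present data while clock is low
        time.sleep(half_period)
        set_pin(CLK, 1)                         # rising edge: slave samples MOSI
        byte_in = (byte_in << 1) | get_pin(MISO)
        time.sleep(half_period)
        set_pin(CLK, 0)                         # falling edge: shift next bit
    return byte_in

# Example: download a hypothetical 6-byte Bluetooth address parameter block.
set_pin(CS, 0)                                  # select the target device
echoed = [spi_transfer_byte(b) for b in bytes.fromhex("A1B2C3D4E5F6")]
set_pin(CS, 1)
print([hex(b) for b in echoed])
```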
Relative valuation of potentially affected ecosystem benefits can increase the legitimacy and social acceptance of ecosystem restoration projects. As an alternative or supplement to traditional methods of deriving beneficiary preference, we downloaded from social media and classi...
Forest science in the South - 2008
Southern Research Station USDA Forest Service
2008-01-01
It is my pleasure to present the 2008 Forest Science in the South, a summary of accomplishments of the USDA Forest Service Southern Research Station (SRS). This annual report includes a CD with details about our research and products, as well as links for ordering or downloading publications.
NASA Technical Reports Server (NTRS)
Bingham, Andrew W.; McCleese, Sean W.; Deen, Robert G.; Chung, Nga T.; Stough, Timothy M.
2013-01-01
Datacasting V3.0 provides an RSS-based feed mechanism for publishing the availability of Earth science data records in real time. It also provides a utility for subscribing to these feeds and sifting through all the items in an automatic manner to identify and download the data records that are required for a specific application. Datacasting is a method by which multiple data providers can publish the availability of new Earth science data and users download those files that meet a predefined need; for example, to only download data files related to a specific earthquake or region on the globe. Datacasting is a server-client architecture. The server-side software is used by data providers to create and publish the metadata about recently available data according to the Datacasting RSS (Really Simple Syndication) specification. The client software subscribes to the Datacasting RSS and other RSS-based feeds. By configuring filters associated with feeds, data consumers can use the client to identify and automatically download files that meet a specific need. On the client side, a Datacasting feed reader monitors the server for new feeds. The feed reader will be tuned by the user, via a graphical user interface (GUI), to examine the content of the feeds and initiate a data pull after some criteria are satisfied. The criteria might be, for example, to download sea surface temperature data for a particular region that has cloud cover less than 50% and during daylight hours. After the granule is downloaded to the client, the user will have the ability to visualize the data in the GUI. Based on the popular concept of podcasting, which gives listeners the capability to download only those MP3 files that match their preference, Earth science Datacasting will give users a method to download only the Earth science data files that are required for a particular application.
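A generic sketch of the subscribe-filter-download pattern described above, written with the feedparser library against a hypothetical feed URL; the simple keyword filter stands in for Datacasting's richer metadata criteria, and this is not the actual Datacasting client.

```python
# Generic subscribe-filter-download sketch, not the Datacasting client itself:
# parse an RSS feed, keep items matching a keyword, and fetch their enclosures.
import feedparser
import urllib.request

FEED_URL = "https://example.org/datacast/feed.rss"       # hypothetical feed URL
KEYWORD = "sea surface temperature"

feed = feedparser.parse(FEED_URL)
for entry in feed.entries:
    text = f"{entry.get('title', '')} {entry.get('summary', '')}".lower()
    if KEYWORD in text:
        for enc in entry.get("enclosures", []):          # linked data granules
            url = enc.get("href")
            if url:
                filename = url.rsplit("/", 1)[-1]
                print("downloading", filename)
                urllib.request.urlretrieve(url, filename)
```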
Pathogen metadata platform: software for accessing and analyzing pathogen strain information.
Chang, Wenling E; Peterson, Matthew W; Garay, Christopher D; Korves, Tonia
2016-09-15
Pathogen metadata includes information about where and when a pathogen was collected and the type of environment it came from. Along with genomic nucleotide sequence data, this metadata is growing rapidly and becoming a valuable resource not only for research but for biosurveillance and public health. However, current freely available tools for analyzing this data are geared towards bioinformaticians and/or do not provide summaries and visualizations needed to readily interpret results. We designed a platform to easily access and summarize data about pathogen samples. The software includes a PostgreSQL database that captures metadata useful for disease outbreak investigations, and scripts for downloading and parsing data from NCBI BioSample and BioProject into the database. The software provides a user interface to query metadata and obtain standardized results in an exportable, tab-delimited format. To visually summarize results, the user interface provides a 2D histogram for user-selected metadata types and mapping of geolocated entries. The software is built on the LabKey data platform, an open-source data management platform, which enables developers to add functionalities. We demonstrate the use of the software in querying for a pathogen serovar and for genome sequence identifiers. This software enables users to create a local database for pathogen metadata, populate it with data from NCBI, easily query the data, and obtain visual summaries. Some of the components, such as the database, are modular and can be incorporated into other data platforms. The source code is freely available for download at https://github.com/wchangmitre/bioattribution .
NASA Technical Reports Server (NTRS)
Miles, Jeffrey Hilton
2010-01-01
Combustion noise from turbofan engines has become important, as the noise from sources like the fan and jet are reduced. An aligned and un-aligned coherence technique has been developed to determine a threshold level for the coherence and thereby help to separate the coherent combustion noise source from other noise sources measured with far-field microphones. This method is compared with a statistics based coherence threshold estimation method. In addition, the un-aligned coherence procedure at the same time also reveals periodicities, spectral lines, and undamped sinusoids hidden by broadband turbofan engine noise. In calculating the coherence threshold using a statistical method, one may use either the number of independent records or a larger number corresponding to the number of overlapped records used to create the average. Using data from a turbofan engine and a simulation this paper shows that applying the Fisher z-transform to the un-aligned coherence can aid in making the proper selection of samples and produce a reasonable statistics based coherence threshold. Examples are presented showing that the underlying tonal and coherent broad band structure which is buried under random broadband noise and jet noise can be determined. The method also shows the possible presence of indirect combustion noise. Copyright 2011 Acoustical Society of America. This article may be downloaded for personal use only. Any other use requires prior permission of the author and the Acoustical Society of America.
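To make the statistics-based threshold concrete, the sketch below estimates magnitude-squared coherence between two synthetic "microphone" signals sharing a common component and compares it with the commonly used significance level 1 − α^(1/(nd − 1)) for nd independent averaged segments; it illustrates the general idea only, not the aligned/un-aligned procedure of the paper.

```python
# Sketch of a statistics-based coherence threshold: estimate magnitude-squared
# coherence between two noisy signals sharing a common source and compare it
# with 1 - alpha**(1/(nd - 1)) for nd independent segments. Data are synthetic.
import numpy as np
from scipy.signal import coherence

fs, n = 4096, 4096 * 16
rng = np.random.default_rng(0)
common = rng.normal(size=n)                        # shared "combustion noise" part
x = common + rng.normal(size=n)                    # mic 1: common part + local noise
y = np.roll(common, 5) + rng.normal(size=n)        # mic 2: delayed common part + noise

nperseg = 1024
f, cxy = coherence(x, y, fs=fs, nperseg=nperseg, noverlap=0)
nd = n // nperseg                                  # number of independent segments
alpha = 0.05
threshold = 1.0 - alpha ** (1.0 / (nd - 1))        # coherence significance threshold
print(f"segments={nd}, threshold={threshold:.3f}, "
      f"fraction of bins above threshold={np.mean(cxy > threshold):.2f}")
```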
Mod3DMT and EMTF: Free Software for MT Data Processing and Inversion
NASA Astrophysics Data System (ADS)
Egbert, G. D.; Kelbert, A.; Meqbel, N. M.
2017-12-01
"ModEM" was developed at Oregon State University as a modular system for inversion of electromagnetic (EM) geophysical data (Egbert and Kelbert, 2012; Kelbert et al., 2014). Although designed for more general (frequency domain) EM applications, and originally intended as a testbed for exploring inversion search and regularization strategies, our own initial uses of ModEM were for 3-D imaging of the deep crust and upper mantle at large scales. Since 2013 we have offered a version of the source code suitable for 3D magnetotelluric (MT) inversion on an "as is, user beware" basis for free for non-commercial applications. This version, which we refer to as Mod3DMT, has since been widely used by the international MT community. Over 250 users have registered to download the source code, and at least 50 MT studies in the refereed literature, covering locations around the globe at a range of spatial scales, cite use of ModEM for 3D inversion. For over 30 years I have also made MT processing software available for free use. In this presentation, I will discuss my experience with these freely available (but perhaps not truly open-source) computer codes. Although users are allowed to make modifications to the codes (on conditions that they provide a copy of the modified version) only a handful of users have tried to make any modification, and only rarely are modifications even reported, much less provided back to the developers.
Virtual and Printed 3D Models for Teaching Crystal Symmetry and Point Groups
ERIC Educational Resources Information Center
Casas, Lluís; Estop, Euge`nia
2015-01-01
Both, virtual and printed 3D crystal models can help students and teachers deal with chemical education topics such as symmetry and point groups. In the present paper, two freely downloadable tools (interactive PDF files and a mobile app) are presented as examples of the application of 3D design to study point-symmetry. The use of 3D printing to…
NCBI Epigenomics: what's new for 2013.
Fingerman, Ian M; Zhang, Xuan; Ratzat, Walter; Husain, Nora; Cohen, Robert F; Schuler, Gregory D
2013-01-01
The Epigenomics resource at the National Center for Biotechnology Information (NCBI) has been created to serve as a comprehensive public repository for whole-genome epigenetic data sets (www.ncbi.nlm.nih.gov/epigenomics). We have constructed this resource by selecting the subset of epigenetics-specific data from the Gene Expression Omnibus (GEO) database and then subjecting them to further review and annotation. Associated data tracks can be viewed using popular genome browsers or downloaded for local analysis. We have performed extensive user testing throughout the development of this resource, and new features and improvements are continuously being implemented based on the results. We have made substantial usability improvements to user interfaces, enhanced functionality, made identification of data tracks of interest easier and created new tools for preliminary data analyses. Additionally, we have made efforts to enhance the integration between the Epigenomics resource and other NCBI databases, including the Gene database and PubMed. Data holdings have also increased dramatically since the initial publication describing the NCBI Epigenomics resource and currently consist of >3700 viewable and downloadable data tracks from 955 biological sources encompassing five well-studied species. This updated manuscript highlights these changes and improvements.
NASA Astrophysics Data System (ADS)
Bolliet, Timothé; Brockmann, Patrick; Masson-Delmotte, Valérie; Bassinot, Franck; Daux, Valérie; Genty, Dominique; Landais, Amaelle; Lavrieux, Marlène; Michel, Elisabeth; Ortega, Pablo; Risi, Camille; Roche, Didier M.; Vimeux, Françoise; Waelbroeck, Claire
2016-08-01
Past climate is an important benchmark to assess the ability of climate models to simulate key processes and feedbacks. Numerous proxy records exist for stable isotopes of water and/or carbon, which are also implemented in the components of a growing number of Earth system models. Model-data comparisons can help to constrain the uncertainties associated with transfer functions. This motivates the need to produce a comprehensive compilation of different proxy sources. We have put together a global database of proxy records of oxygen (δ18O), hydrogen (δD) and carbon (δ13C) stable isotopes from different archives: ocean and lake sediments, corals, ice cores, speleothems and tree-ring cellulose. Source records were obtained from the georeferenced open access PANGAEA and NOAA libraries, complemented by additional data obtained from a literature survey. About 3000 source records were screened for chronological information and temporal resolution of proxy records. Altogether, this database consists of hundreds of dated δ18O, δ13C and δD records in a standardized simple text format, complemented with a metadata Excel catalog. A quality control flag was implemented to describe age markers and inform on chronological uncertainty. This compilation effort highlights the need to homogenize and structure the format of datasets and chronological information as well as enhance the distribution of published datasets that are currently highly fragmented and scattered. We also provide an online portal based on the records included in this database with an intuitive and interactive platform (http://climateproxiesfinder.ipsl.fr/), allowing one to easily select, visualize and download subsets of the homogeneously formatted records that constitute this database, following a choice of search criteria, and to upload new datasets. In the last part, we illustrate the type of application allowed by our database by comparing several key periods highly investigated by the paleoclimate community. For coherency with the Paleoclimate Modelling Intercomparison Project (PMIP), we focus on records spanning the past 200 years, the mid-Holocene (MH, 5.5-6.5 ka; calendar kiloyears before 1950), the Last Glacial Maximum (LGM, 19-23 ka), and those spanning the last interglacial period (LIG, 115-130 ka). Basic statistics have been applied to characterize anomalies between these different periods. Most changes from the MH to present day and from the LIG to the MH appear statistically insignificant. Significant global differences are reported from the LGM to the MH, with regional discrepancies in signals from different archives and complex patterns.
the-wizz: clustering redshift estimation for everyone
NASA Astrophysics Data System (ADS)
Morrison, C. B.; Hildebrandt, H.; Schmidt, S. J.; Baldry, I. K.; Bilicki, M.; Choi, A.; Erben, T.; Schneider, P.
2017-05-01
We present the-wizz, an open-source, user-friendly software package for estimating the redshift distributions of photometric galaxies with unknown redshifts by spatially cross-correlating them against a reference sample with known redshifts. The main benefit of the-wizz is in separating the angular pair finding and correlation estimation from the computation of the output clustering redshifts, allowing anyone to create a clustering redshift for their sample without the intervention of an 'expert'. It allows the end user of a given survey to select any subsample of photometric galaxies with unknown redshifts, match this sample's catalogue indices into a value-added data file and produce a clustering redshift estimate for this sample in a fraction of the time it would take to run all the angular correlations needed to produce a clustering redshift. We show results with this software using photometric data from the Kilo-Degree Survey (KiDS) and spectroscopic redshifts from the Galaxy and Mass Assembly survey and the Sloan Digital Sky Survey. The results we present for KiDS are consistent with the redshift distributions used in a recent cosmic shear analysis from the survey. We also present results using a hybrid machine learning-clustering redshift analysis that enables the estimation of clustering redshifts for individual galaxies. the-wizz can be downloaded at http://github.com/morriscb/The-wiZZ/.
Waese, Jamie; Fan, Jim; Yu, Hans; Fucile, Geoffrey; Shi, Ruian; Cumming, Matthew; Town, Chris; Stuerzlinger, Wolfgang
2017-01-01
A big challenge in current systems biology research arises when different types of data must be accessed from separate sources and visualized using separate tools. The high cognitive load required to navigate such a workflow is detrimental to hypothesis generation. Accordingly, there is a need for a robust research platform that incorporates all data and provides integrated search, analysis, and visualization features through a single portal. Here, we present ePlant (http://bar.utoronto.ca/eplant), a visual analytic tool for exploring multiple levels of Arabidopsis thaliana data through a zoomable user interface. ePlant connects to several publicly available web services to download genome, proteome, interactome, transcriptome, and 3D molecular structure data for one or more genes or gene products of interest. Data are displayed with a set of visualization tools that are presented using a conceptual hierarchy from big to small, and many of the tools combine information from more than one data type. We describe the development of ePlant in this article and present several examples illustrating its integrative features for hypothesis generation. We also describe the process of deploying ePlant as an “app” on Araport. Building on readily available web services, the code for ePlant is freely available for any other biological species research. PMID:28808136
Data Downloads | ECHO | US EPA
The ECHO website with its facility search features is designed to provide easy access to EPA's compliance and enforcement data with customizable onscreen display and download. For those with larger data needs, ECHO has several types of data sets available. These large data sets may be of particular use to developers, programmers, academics, and analysts. The data available here can be downloaded and used for many different functions and are certain to meet all data retrieval needs.
Ultrasound-Based Guidance for Partial Breast Irradiation Therapy
2012-01-01
Mesa Isochrones and Stellar Tracks (MIST). I. Solar-scaled Models
NASA Astrophysics Data System (ADS)
Choi, Jieun; Dotter, Aaron; Conroy, Charlie; Cantiello, Matteo; Paxton, Bill; Johnson, Benjamin D.
2016-06-01
This is the first of a series of papers presenting the Modules for Experiments in Stellar Astrophysics (MESA) Isochrones and Stellar Tracks (MIST) project, a new comprehensive set of stellar evolutionary tracks and isochrones computed using MESA, a state-of-the-art open-source 1D stellar evolution package. In this work, we present models with solar-scaled abundance ratios covering a wide range of ages (5 ≤ log(Age [yr]) ≤ 10.3), masses (0.1 ≤ M/M⊙ ≤ 300), and metallicities (−2.0 ≤ [Z/H] ≤ 0.5). The models are self-consistently and continuously evolved from the pre-main sequence (PMS) to the end of hydrogen burning, the white dwarf cooling sequence, or the end of carbon burning, depending on the initial mass. We also provide a grid of models evolved from the PMS to the end of core helium burning for −4.0 ≤ [Z/H] < −2.0. We showcase extensive comparisons with observational constraints as well as with some of the most widely used existing models in the literature. The evolutionary tracks and isochrones can be downloaded from the project website at http://waps.cfa.harvard.edu/MIST/.
Biomass Characterization | Bioenergy | NREL
Laboratory analytical methods for biomass characterization are available for download, including biomass compositional analysis methods and molecular beam mass spectrometry. Characterization of biomass: we develop new methods and tools to understand the chemical composition of raw biomass.
Teaching Moments: Opening the Pipeline to Teaching Innovations
ERIC Educational Resources Information Center
Tanner, John F.; Whalen, D. Joel
2013-01-01
This paper demonstrates a strategy to speed teaching innovation transfer between marketing educators. Nine teaching innovations presented at the Society for Marketing Advances 2012 Annual Conference are offered in a brief catalog form. The reader can also download support materials at www.salesleadershipcenter.com/research.html#mer-tm13/.…
Landsat Science Team meeting—first Landsat 8 evaluations
Loveland, Thomas R.; Wulder, Michael A.; Irons, James R.
2014-01-01
The U.S. Geological Survey (USGS)-NASA Landsat Science Team (LST) met at the USGS’ Earth Resources Observation and Science (EROS) Center near Sioux Falls, SD, from October 29-31, 2013. All meeting presentations can be downloaded from landsat.usgs.gov/science_LST_October_29_31_2013.php.
Hugo Destaillats Home page. Presentation.
Laboratory soiling and weathering practice for roofing materials to simulate the effects of natural exposure on the solar reflectance and thermal emittance of cool roofing materials; related publication in Catalysis B: Environmental, 2012, 128, 159-170, available for download.
Point of View: Downloads, Copyright, and the Moral Responsibility Of Universities
ERIC Educational Resources Information Center
Torrey, Kate Douglas
2007-01-01
This essay regards the licensing of digital content in universities. The author states that the academic community has overwhelmingly adopted course-management software systems such as Blackboard to distribute electronic course materials to students. She argues that such use presents a significant problem: the absence of any institutional…
Managing Serious Teacher Misbehaviour
ERIC Educational Resources Information Center
Page, Damien
2014-01-01
This article presents findings from a study of five head teachers who were responsible for the management of serious teacher misbehaviour (TMB) in England. In cases that included the downloading of extreme pornography on a school laptop and a sexual relationship with a pupil, the multiple impacts of TMB were potentially devastating to the…
The EPA collects information about facilities or sites subject to environmental regulation. The EPA Geospatial Data Access Project provides downloadable files of these facilities or sites in KML format.
Calculation of the rotor induced download on airfoils
NASA Technical Reports Server (NTRS)
Lee, C. S.
1989-01-01
Interactions between the rotors and wing of a rotary wing aircraft in hover have a significant detrimental effect on its payload performance. The reduction of payload results from the wake of the lifting rotors impinging on the wing, which is at a 90 deg angle of attack in hover. This vertical drag, often referred to as download, can be as large as 15 percent of the total rotor thrust in hover. The rotor wake is a three-dimensional, unsteady flow with concentrated tip vortices. With the rotor tip vortices impinging on the upper surface of the wing, the flow over the wing is not only three-dimensional and unsteady, but also separated from the leading and trailing edges. A simplified two-dimensional model was developed to demonstrate the stability of the methodology. The flow model combines a panel method to represent the rotor and the wing, and a vortex method to track the wing wake. A parametric study of the download on a 20 percent thick elliptical airfoil below a rotor disk of uniform inflow was performed. Comparisons with experimental data are made where the data are available. This approach is now being extended to three-dimensional flows. Preliminary results for a wing at 90 deg angle of attack in a free stream are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saw, C; Baikadi, M; Peters, C
2015-06-15
Purpose: To use systems engineering to design an HDR skin treatment operation for small lesions using shielded applicators, enhancing patient safety. Methods: Systems engineering is an interdisciplinary field that offers formal methodologies to study, design, implement, and manage complex engineering systems as a whole over their life cycles. The methodologies deal with human work processes, coordination of different teams, optimization, and risk management. The V-model of systems engineering emphasizes two streams: specification and testing. The specification stream consists of user requirements, functional requirements, and design specifications, while the testing stream covers installation, operational, and performance specifications. In applying systems engineering to this project, the user and functional requirements are that (a) HDR unit parameters be downloaded from the treatment planning system, (b) dwell times and positions be generated by the treatment planning system, (c) source decay be computer-calculated, and (d) a double-check system of treatment parameters comply with the NRC regulation. These requirements are intended to reduce human intervention and improve patient safety. Results: A formal investigation indicated that the user requirements can be satisfied. The treatment operation consists of using the treatment planning system to generate a pseudo plan that is adjusted for different shielded applicators to compute the dwell times. The dwell positions, channel numbers, and dwell times are verified by the medical physicist and downloaded into the HDR unit. The decayed source strength is transferred to a spreadsheet that computes the dwell times based on the type of applicator and prescribed dose used. Prior to treatment, the source strength, dwell times, dwell positions, and channel numbers are double-checked by the radiation oncologist. No dosimetric parameters are manually calculated. Conclusion: Systems engineering provides methodologies to effectively design the HDR treatment operation, minimizing human intervention and improving patient safety.
NASA Astrophysics Data System (ADS)
Bagnardi, M.; Hooper, A. J.
2017-12-01
Inversions of geodetic observational data, such as Interferometric Synthetic Aperture Radar (InSAR) and Global Navigation Satellite System (GNSS) measurements, are often performed to obtain information about the source of surface displacements. Inverse problem theory has been applied to study magmatic processes, the earthquake cycle, and other phenomena that cause deformation of the Earth's interior and of its surface. Together with increasing improvements in data resolution, both spatial and temporal, new satellite missions (e.g., the European Commission's Sentinel-1 satellites) are providing the unprecedented opportunity to access space-geodetic data within hours of their acquisition. To truly take advantage of these opportunities, we must become able to interpret geodetic data in a rapid and robust manner. Here we present the open-source Geodetic Bayesian Inversion Software (GBIS; available for download at http://comet.nerc.ac.uk/gbis). GBIS is written in Matlab and offers a series of user-friendly and interactive pre- and post-processing tools. For example, an interactive function has been developed to estimate the characteristics of noise in InSAR data by calculating the experimental semi-variogram. The inversion software uses a Markov-chain Monte Carlo algorithm, incorporating the Metropolis-Hastings algorithm with adaptive step size, to efficiently sample the posterior probability distribution of the different source parameters. The probabilistic Bayesian approach allows the user to retrieve estimates of the optimal (best-fitting) deformation source parameters together with the associated uncertainties produced by errors in the data (and by scaling, errors in the model). The current version of GBIS (V1.0) includes fast analytical forward models for magmatic sources of different geometry (e.g., point source, finite spherical source, prolate spheroid source, penny-shaped sill-like source, and dipping dike with uniform opening) and for dipping faults with uniform slip, embedded in an isotropic elastic half-space. However, the software architecture allows the user to easily add any other analytical or numerical forward models to calculate displacements at the surface. GBIS is delivered with a detailed user manual and three synthetic datasets for testing and practical training.
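As a hedged, much-simplified illustration of the Bayesian sampling approach GBIS implements, the following example runs a basic Metropolis-Hastings sampler (fixed step sizes, unlike GBIS's adaptive step) to recover the depth and volume change of a Mogi point source from synthetic vertical displacements; the forward model is the standard Mogi expression with Poisson's ratio 0.25, and all data are synthetic.

```python
# Toy Metropolis-Hastings example in the spirit of GBIS (not GBIS itself):
# sample depth and volume change of a Mogi point source from synthetic data.
import numpy as np

nu = 0.25                                            # Poisson's ratio
r = np.linspace(100.0, 10_000.0, 60)                 # radial distances (m)

def mogi_uz(depth, dvol):
    """Vertical surface displacement (m) of a Mogi point source."""
    return (1 - nu) * dvol * depth / (np.pi * (r**2 + depth**2) ** 1.5)

true_depth, true_dvol, sigma = 3000.0, 2.0e6, 0.002   # m, m^3, m
rng = np.random.default_rng(42)
data = mogi_uz(true_depth, true_dvol) + rng.normal(0, sigma, r.size)

def log_post(theta):
    depth, dvol = theta
    if not (500 < depth < 10_000 and 1e4 < dvol < 1e8):   # uniform priors
        return -np.inf
    return -0.5 * np.sum((data - mogi_uz(depth, dvol)) ** 2) / sigma**2

theta = np.array([2000.0, 1.0e6])                     # starting model
step = np.array([50.0, 5.0e4])                        # fixed proposal step sizes
samples, lp = [], log_post(theta)
for _ in range(20_000):
    prop = theta + step * rng.normal(size=2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:          # Metropolis acceptance rule
        theta, lp = prop, lp_prop
    samples.append(theta.copy())
samples = np.array(samples[5000:])                    # discard burn-in
print("posterior mean depth (m), dV (m^3):", samples.mean(axis=0).round(1))
```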
NASA Astrophysics Data System (ADS)
Bianchi, Luciana; Conti, A.; Shiao, B.; Keller, G. R.; Thilker, D. A.
2014-01-01
The legacy of the Galaxy Evolution Explorer (GALEX), which imaged the sky at Ultraviolet (UV) wavelengths for about 9 years, is its unprecedented database with more than 200 million source measurements in far-UV (FUV) and near-UV (NUV), as well as wide-field imaging of extended objects. GALEX's data, the first substantial sky surveys at UV wavelengths, offer an unprecedented view of the sky and a unique opportunity for an unbiased characterization of several classes of astrophysical objects, such as hot stars, QSOs at redshift about 1, UV-peculiar QSOs, and star-forming galaxies, among others. Bianchi et al. (2013, J. Adv. Space Res., DOI: http://dx.doi.org/10.1016/j.asr.2013.07.045) have constructed final catalogs of UV sources, with homogeneous quality, eliminating duplicate measurements of the same source ('unique' source catalogs), and excluding rim artifacts and bad photometry. The catalogs are constructed improving on the recipe of Bianchi et al. 2011 (MNRAS, 411, 2770, which presented the earlier version of these catalogs) and include all data for the major surveys, AIS and MIS. Considering the fields where both FUV and NUV detectors were exposed, the catalogs contain about 71 and 16.6 million unique sources respectively. We show several maps illustrating the content of UV sources across the sky, globally, and separately for bright/faint, hot, stellar/extragalactic objects. We matched the UV-source catalogs with optical-IR data from the SDSS, GSC2, and 2MASS surveys. We are also in the process of matching the catalogs with preliminary PanSTARRS1 (PS1) 3pi survey photometry, which already provides twice the sky coverage of SDSS at slightly fainter magnitude limits. The sources' SEDs from FUV to optical wavelengths enable classification, derivation of the objects' physical parameters, and ultimately also a map of the Milky Way extinction. The catalogs will be available on MAST, Vizier (where the previous version already is), and in reduced form (for agile downloading), with related tools, from the author web site (http://dolomiti.pha.jhu.edu/uvsky).
Horvath, Keith J; Alemu, Dawit; Danh, Thu; Baker, Jason V; Carrico, Adam W
2016-04-15
The use of stimulant drugs among men who have sex with men (MSM) with human immunodeficiency virus (HIV) is associated with decreased odds of antiretroviral therapy (ART) adherence and elevated risk of forward HIV transmission. Advancing tailored and innovative mobile phone-based ART adherence app interventions for stimulant-using HIV-positive MSM requires greater understanding of their needs and preferences in this emerging area. The purpose of this study is to (1) assess reasons that stimulant-using HIV-positive MSM download and sustain their use of mobile phone apps in general, and (2) obtain feedback on features and functions that these men prefer in a mobile phone app to optimize their ART adherence. Focus groups were conducted with stimulant-using HIV-positive MSM (24-57 years of age; mostly non-Hispanic white; 42% once a week or more frequent stimulant drug use) in San Francisco and Minneapolis. Our aim was to explore the mobile phone app features and functions that they considered when deciding to download and sustain their use of general apps over time, as well as specific features and functions that they would like to see incorporated into an ART adherence mobile app. Focus groups were audiorecorded and transcribed verbatim. Thematic analysis was applied to transcripts using line-by-line open coding and organizing codes into meaningful themes. Men reported that they currently had a variety of health and wellness, social media and networking, gaming and entertainment, and utility apps on their mobile phones. Downloading apps to their mobile phones was influenced by the cost of the app, recommendations by a trusted source, and the time it takes to download. In addition, downloading and sustained use of apps was more likely to occur when men had control over most features of the app and apps were perceived to be useful, engaging, secure, and credible. Participants suggested that ART adherence mobile phone apps include social networking features, connections to local resources and their medical chart, and breaking HIV news and updates. Although some men expressed concerns about daily self-monitoring of HIV medication doses, many appreciated receiving a summary of their medication adherence over time and suggested that feedback about missed doses be delivered in an encouraging and humorous manner. In this study, we were able to recruit a relatively high proportion (42%) of HIV-positive MSM reporting weekly or more stimulant use. These results suggest critical design elements that may need to be considered during development of ART adherence-related mobile phone apps for this, and possibly other, high-risk groups. In particular, finding the optimal balance of security, engagement, usefulness, control capabilities, and credibility will be critical to sustained used of HIV treatment apps.
To improve public health and the environment, the United States Environmental Protection Agency (USEPA) collects information about facilities, sites, or places subject to environmental regulation or of environmental interest. Through the Geospatial Data Download Service, the public is now able to download the EPA Geodata shapefile containing facility and site information from EPA's national program systems. The file is Internet accessible from the Envirofacts Web site (http://www.epa.gov/enviro). The data may be used with geospatial mapping applications. (Note: The shapefile omits facilities without latitude/longitude coordinates.) The EPA Geospatial Data contains the name, location (latitude/longitude), and EPA program information about specific facilities and sites. In addition, the file contains a Uniform Resource Locator (URL), which allows mapping applications to present an option to users to access additional EPA data resources on a specific facility or site.
US EPA Region 4 RMP Facilities
To improve public health and the environment, the United States Environmental Protection Agency (USEPA) collects information about facilities, sites, or places subject to environmental regulation or of environmental interest. Through the Geospatial Data Download Service, the public is now able to download the EPA Geodata shapefile containing facility and site information from EPA's national program systems. The file is Internet accessible from the Envirofacts Web site (http://www.epa.gov/enviro). The data may be used with geospatial mapping applications. (Note: The shapefile omits facilities without latitude/longitude coordinates.) The EPA Geospatial Data contains the name, location (latitude/longitude), and EPA program information about specific facilities and sites. In addition, the file contains a Uniform Resource Locator (URL), which allows mapping applications to present an option to users to access additional EPA data resources on a specific facility or site.
A Study of Gaps in Defensive Countermeasures for Web Security
2015-10-18
Excerpt of the report's attack-area ranking table (the three numeric columns are rankings drawn from the report's data sources, which are not identified in this excerpt; N/A indicates that a source did not map cleanly to that attack area):
[attack area cut off in excerpt]: 4 / 1 / 1 - High
Session fixation: 3 / 3 / 2 - High
Privilege escalation: 2 / 6 / 7 - Med
Logic vulnerabilities: 2 / 4 / 2,5,7 - Med
File inclusion: N/A / 5 / 4 - Med
Drive-by downloads: N/A / N/A / 10 - Low
Clickjacking: N/A / 7 / N/A - Low
Plug-in attacks: N/A / N/A / 9 - Low
...occurring in each area. Reliable and well-classified data sources are ... cleanly to the attack areas described above, as indicated by N/A in the table. 3.3 GAP DISCOVERY PROCESS: For the purposes of this study, we focus on the ...
NASA Astrophysics Data System (ADS)
Villoria, Nelson B.; Elliott, Joshua; Müller, Christoph; Shin, Jaewoo; Zhao, Lan; Song, Carol
2018-01-01
Access to climate and spatial datasets by non-specialists is restricted by technical barriers involving hardware, software and data formats. We discuss an open-source online tool that facilitates downloading the climate data from the global circulation models used by the Inter-Sectoral Impacts Model Intercomparison Project. The tool also offers temporal and spatial aggregation capabilities for incorporating future climate scenarios in applications where spatial aggregation is important. We hope that streamlined access to these data facilitates analysis of climate related issues while considering the uncertainties derived from future climate projections and temporal aggregation choices.
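To make the aggregation step concrete, the following is a minimal sketch of the kind of temporal and spatial aggregation the tool performs, written here with xarray against a local NetCDF file; the file name, the variable name "tas", and the region bounds are hypothetical placeholders rather than anything taken from the tool itself.

```python
# Minimal sketch of spatial and temporal aggregation of a gridded climate file;
# "tas_rcp45.nc", the variable name "tas", and the lat/lon bounds are
# hypothetical placeholders.
import xarray as xr

ds = xr.open_dataset("tas_rcp45.nc")   # hypothetical GCM output file
tas = ds["tas"]                        # assumed near-surface air temperature variable

# Spatial subset (bounds are illustrative; assumes ascending coordinates),
# followed by an area mean over the region.
region = tas.sel(lat=slice(-10, 10), lon=slice(95, 140))
regional_mean = region.mean(dim=["lat", "lon"])

# Temporal aggregation to annual means, the kind of choice the abstract flags
# as a source of uncertainty.
annual = regional_mean.resample(time="YS").mean()
print(annual.to_series().head())
```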
Neuroimaging Data Sharing on the Neuroinformatics Database Platform
Book, Gregory A; Stevens, Michael; Assaf, Michal; Glahn, David; Pearlson, Godfrey D
2015-01-01
We describe the Neuroinformatics Database (NiDB), an open-source database platform for archiving, analysis, and sharing of neuroimaging data. Data from the multi-site projects Autism Brain Imaging Data Exchange (ABIDE), Bipolar-Schizophrenia Network on Intermediate Phenotypes parts one and two (B-SNIP1, B-SNIP2), and Monetary Incentive Delay task (MID) are available for download from the public instance of NiDB, with more projects sharing data as it becomes available. As demonstrated by making several large datasets available, NiDB is an extensible platform appropriately suited to archive and distribute shared neuroimaging data. PMID:25888923
Monitoring the Future 2017 Survey Results (published December 12, 2017) and Monitoring the Future 2016 Survey Results (published December 13, 2016): infographics of survey results, available to view online or download as PDF.
Downloadable brochures: Vitamins, Minerals, and Herbs in MS: An Introduction (.pdf), a practical guide to diet supplements; and Taming Stress (.pdf).
JBioWH: an open-source Java framework for bioinformatics data integration
Vera, Roberto; Perez-Riverol, Yasset; Perez, Sonia; Ligeti, Balázs; Kertész-Farkas, Attila; Pongor, Sándor
2013-01-01
The Java BioWareHouse (JBioWH) project is an open-source platform-independent programming framework that allows a user to build his/her own integrated database from the most popular data sources. JBioWH can be used for intensive querying of multiple data sources and the creation of streamlined task-specific data sets on local PCs. JBioWH is based on a MySQL relational database scheme and includes JAVA API parser functions for retrieving data from 20 public databases (e.g. NCBI, KEGG, etc.). It also includes a client desktop application for (non-programmer) users to query data. In addition, JBioWH can be tailored for use in specific circumstances, including the handling of massive queries for high-throughput analyses or CPU intensive calculations. The framework is provided with complete documentation and application examples and it can be downloaded from the Project Web site at http://code.google.com/p/jbiowh. A MySQL server is available for demonstration purposes at hydrax.icgeb.trieste.it:3307. Database URL: http://code.google.com/p/jbiowh PMID:23846595
JBioWH: an open-source Java framework for bioinformatics data integration.
Vera, Roberto; Perez-Riverol, Yasset; Perez, Sonia; Ligeti, Balázs; Kertész-Farkas, Attila; Pongor, Sándor
2013-01-01
The Java BioWareHouse (JBioWH) project is an open-source platform-independent programming framework that allows a user to build his/her own integrated database from the most popular data sources. JBioWH can be used for intensive querying of multiple data sources and the creation of streamlined task-specific data sets on local PCs. JBioWH is based on a MySQL relational database scheme and includes JAVA API parser functions for retrieving data from 20 public databases (e.g. NCBI, KEGG, etc.). It also includes a client desktop application for (non-programmer) users to query data. In addition, JBioWH can be tailored for use in specific circumstances, including the handling of massive queries for high-throughput analyses or CPU intensive calculations. The framework is provided with complete documentation and application examples and it can be downloaded from the Project Web site at http://code.google.com/p/jbiowh. A MySQL server is available for demonstration purposes at hydrax.icgeb.trieste.it:3307. Database URL: http://code.google.com/p/jbiowh.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schaffer, K. K.; Crawford, T. M.; Benson, B. A.
2011-12-10
The South Pole Telescope (SPT) has nearly completed a 2500 deg² survey of the southern sky in three frequency bands. Here, we present the first public release of SPT maps and associated data products. We present arcminute-resolution maps at 150 GHz and 220 GHz of an approximately 95 deg² field centered at R.A. 82.7°, decl. −55°. The field was observed to a depth of approximately 17 µK-arcmin at 150 GHz and 41 µK-arcmin at 220 GHz during the 2008 austral winter season. Two variations on map filtering and map projection are presented, one tailored for producing catalogs of galaxy clusters detected through their Sunyaev-Zel'dovich effect signature and one tailored for producing catalogs of emissive sources. We describe the data processing pipeline, and we present instrument response functions, filter transfer functions, and map noise properties. All data products described in this paper are available for download at http://pole.uchicago.edu/public/data/maps/ra5h30dec-55 and from the NASA Legacy Archive for Microwave Background Data Analysis server. This is the first step in the eventual release of data from the full 2500 deg² SPT survey.
MedlinePlus Videos and Cool Tools
Obstructive Sleep Apnea (OSA). Download the ebook for further information.
MedlinePlus Videos and Cool Tools
Anesthesia: Safety and Comfort in the OMS Office. Download the ebook for further information.
Distributed Information System for Dynamic Ocean Data in Indonesia
NASA Astrophysics Data System (ADS)
Romero, Laia; Sala, Joan; Polo, Isabel; Cases, Oscar; López, Alejandro; Jolibois, Tony; Carbou, Jérome
2014-05-01
Information systems are widely used to enable access to scientific data by different user communities. MyOcean information system is a good example of such applications in Europe. The present work describes a specific distributed information system for Ocean Numerical Model (ONM) data in the scope of the INDESO project, a project focused on Infrastructure Development of Space Oceanography in Indonesia. INDESO, as part of the Blue Revolution policy conducted by the Indonesian government for the sustainable development of fisheries and aquaculture, presents challenging service requirements in terms of service performance, reliability, security and overall usability. Following state-of-the-art technologies for scientific data networks, this robust information system provides a high level of interoperability of services to discover, view and access INDESO dynamic ONM scientific data. The entire system is automatically updated four times a day, including dataset metadata, taking into account every new file available in the data repositories. The INDESO system architecture has been designed in great part around the extension and integration of open-source, flexible and mature technologies. It involves three separate modules: web portal, dissemination gateway, and user administration. Supporting different gridded and non-gridded data, the INDESO information system features search-based data discovery, data access by temporal and spatial subset extraction, direct download and ftp, and multiple-layer visualization of datasets. A complex authorization system has been designed and applied throughout all components, in order to enable services authorization at dataset level, according to the different user profiles stated in the data policy. Finally, a web portal has been developed as the single entry point and standardized interface to all data services (discover, view, and access). Apache SOLR has been implemented as the search server, allowing faceted browsing among ocean data products and the connection to an external catalogue of metadata records. ncWMS and Godiva2 have been the basis of the viewing server and client technologies developed; MOTU has been used for data subsetting and intelligent management of data queues, and has allowed the deployment of a centralised download interface applicable to all ONM products. Unidata's Thredds server has been employed to provide file metadata and remote access to ONM data. CAS has been used as the single sign-on protocol for all data services. The user management application developed has been based on GOSA2. Joomla and Bootstrap have been the technologies used for the web portal, compatible with mobile phone and tablet devices. The result is an information system that is scalable and easy to use, operate and maintain. This will facilitate the extensive use of ocean numerical model data by the scientific community in Indonesia. Constituted mostly of open-source solutions, the system is able to meet strict operational requirements and carry out complex functions. It is feasible to adapt this architecture to different static and dynamic oceanographic data sources and large data volumes, in an accessible, fast, and comprehensive manner.
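Since the viewing service is built on ncWMS, a client can request rendered map images with standard WMS 1.3.0 parameters. The sketch below illustrates such a request; the endpoint URL and layer name are hypothetical placeholders, not actual INDESO addresses.

```python
# Hedged sketch of requesting a map image from an ncWMS-style endpoint such as
# the viewing service described above; the base URL and layer id are
# hypothetical placeholders, and the parameters are standard WMS 1.3.0 fields.
import requests

WMS_URL = "http://example.org/ncWMS/wms"  # placeholder endpoint
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "ocean_model/sea_surface_temperature",  # hypothetical layer id
    "STYLES": "",
    "CRS": "CRS:84",
    "BBOX": "94,-12,142,8",          # lon/lat box over the Indonesian seas
    "WIDTH": "800",
    "HEIGHT": "400",
    "FORMAT": "image/png",
    "TIME": "2014-01-15T00:00:00Z",  # optional, for time-dependent layers
}

resp = requests.get(WMS_URL, params=params, timeout=60)
resp.raise_for_status()
with open("sst_20140115.png", "wb") as fh:
    fh.write(resp.content)           # rendered map tile saved locally
```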
Analysis of United States’ Broadband Policy
2007-03-01
compared with the minimum speed the FCC uses in its definition of broadband access. For example, using a 56K modem connection to download a 10...transmission rates multiple times faster than a 56K modem, users can view video or download software and other data-intensive files in a matter of seconds...boast download speeds from 144Kbps (roughly three times faster than a 56K dial-up modem connection) to 2.4Mbps (close to cable-modem speed). Although
2014-12-01
Hypertension is one of the most common co-morbidities associated with DM and substantially contributes to the macrovascular disease that occurs in...After Numera terminated the contract to provide glucometer download support, Estenda activated the patient portal, Diabetes Mellitus Everywhere...several patients. Problem areas include having to download JAVA with first upload and accessing the DME portal. d. PO has downloaded glucose
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gwyn, Stephen D. J., E-mail: Stephen.Gwyn@nrc-cnrc.gc.ca
This paper describes the image stacks and catalogs of the Canada-France-Hawaii Telescope Legacy Survey produced using the MegaPipe data pipeline at the Canadian Astronomy Data Centre. The Legacy Survey is divided into two parts. The Deep Survey consists of four fields each of 1 deg², with magnitude limits (50% completeness for point sources) of u = 27.5, g = 27.9, r = 27.7, i = 27.4, and z = 26.2. It contains 1.6 × 10⁶ sources. The Wide Survey consists of 150 deg² split over four fields, with magnitude limits of u = 26.0, g = 26.5, r = 25.9, i = 25.7, and z = 24.6. It contains 3 × 10⁷ sources. This paper describes the calibration, image stacking, and catalog generation process. The images and catalogs are available on the web through several interfaces: normal image and text file catalog downloads, a 'Google Sky' interface, an image cutout service, and a catalog database query service.
XAssist: A System for the Automation of X-ray Astrophysics Analysis
NASA Astrophysics Data System (ADS)
Ptak, A.
2004-08-01
XAssist is a NASA AISR-funded project for the automation of X-ray astrophysics. It is capable of data reprocessing, source detection, and preliminary spatial, temporal and spectral analysis for each source with sufficient counts. The bulk of the system is written in Python, which in turn drives underlying software (CIAO for Chandra data, etc.). Future work will include a GUI (mainly for beginners and status monitoring) and the exposure of at least some functionality as web services. The latter will help XAssist to eventually become part of the VO, making advanced queries possible, such as determining the X-ray fluxes of counterparts to HST or SDSS sources (including the use of unpublished X-ray data), and adding the ability of "on-the-fly" X-ray processing. Pipelines are running on Chandra and XMM-Newton observations of galaxies to demonstrate XAssist's capabilities, and the results are available online (in real time) at http://www.xassist.org. XAssist itself as well as various associated projects are available for download.
Open source clustering software.
de Hoon, M J L; Imoto, S; Nolan, J; Miyano, S
2004-06-12
We have implemented k-means clustering, hierarchical clustering and self-organizing maps in a single multipurpose open-source library of C routines, callable from other C and C++ programs. Using this library, we have created an improved version of Michael Eisen's well-known Cluster program for Windows, Mac OS X and Linux/Unix. In addition, we generated a Python and a Perl interface to the C Clustering Library, thereby combining the flexibility of a scripting language with the speed of C. The C Clustering Library and the corresponding Python C extension module Pycluster were released under the Python License, while the Perl module Algorithm::Cluster was released under the Artistic License. The GUI code Cluster 3.0 for Windows, Macintosh and Linux/Unix, as well as the corresponding command-line program, were released under the same license as the original Cluster code. The complete source code is available at http://bonsai.ims.u-tokyo.ac.jp/mdehoon/software/cluster. Alternatively, Algorithm::Cluster can be downloaded from CPAN, while Pycluster is also available as part of the Biopython distribution.
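As a usage illustration of the Pycluster binding described above, the sketch below runs k-means on a toy matrix; the data are made up, and the same routine is also exposed through Biopython's Bio.Cluster module.

```python
# Minimal k-means sketch with the Pycluster binding described above; the toy
# expression matrix is made up, and Bio.Cluster exposes the same routine.
import numpy as np
import Pycluster

data = np.array([
    [1.0, 1.2, 0.9],
    [1.1, 1.0, 1.0],
    [5.0, 5.2, 4.8],
    [4.9, 5.1, 5.0],
])

# kcluster returns (cluster assignments, within-cluster error, number of times
# the best solution was found across the npass random restarts).
clusterid, error, nfound = Pycluster.kcluster(data, nclusters=2, npass=10)
print(clusterid, error, nfound)
```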
The IntAct molecular interaction database in 2012
Kerrien, Samuel; Aranda, Bruno; Breuza, Lionel; Bridge, Alan; Broackes-Carter, Fiona; Chen, Carol; Duesbury, Margaret; Dumousseau, Marine; Feuermann, Marc; Hinz, Ursula; Jandrasits, Christine; Jimenez, Rafael C.; Khadake, Jyoti; Mahadevan, Usha; Masson, Patrick; Pedruzzi, Ivo; Pfeiffenberger, Eric; Porras, Pablo; Raghunath, Arathi; Roechert, Bernd; Orchard, Sandra; Hermjakob, Henning
2012-01-01
IntAct is an open-source, open data molecular interaction database populated by data either curated from the literature or from direct data depositions. Two levels of curation are now available within the database, with both IMEx-level annotation and less detailed MIMIx-compatible entries currently supported. As from September 2011, IntAct contains approximately 275 000 curated binary interaction evidences from over 5000 publications. The IntAct website has been improved to enhance the search process and in particular the graphical display of the results. New data download formats are also available, which will facilitate the inclusion of IntAct's data in the Semantic Web. IntAct is an active contributor to the IMEx consortium (http://www.imexconsortium.org). IntAct source code and data are freely available at http://www.ebi.ac.uk/intact. PMID:22121220
Jonnalagadda, Siddhartha; Gonzalez, Graciela
2010-11-13
BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. This tool is optimized for processing biomedical scientific literature such as the abstracts indexed in PubMed. We tested our tool's impact on the task of protein-protein interaction (PPI) extraction; it improved the F-score of the PPI tool by around 7%, with an improvement in recall of around 20%. The BioSimplify tool and test corpus can be downloaded from https://biosimplify.sourceforge.net.
PPIS Download Product Information Data
Downloadable Pesticide Product Information System data for Federal Insecticide, Fungicide, and Rodenticide Act section 3 and section 24(c) (Special Local Need), vocabulary, and U.S. companies and organizations with EPA Company Numbers.
MedlinePlus Videos and Cool Tools
Corrective Jaw Surgery. Download the ebook for further information on corrective jaw, or orthognathic, surgery and the correction of common dentofacial deformities.
MedlinePlus Videos and Cool Tools
Head and Neck Pathology. Download the ebook for further information.
Facts and Stats - Hydrocephalus Association
Downloading and Installing Estuary Data Mapper (EDM)
Estuary Data Mapper is a tool for geospatial data discovery, visualization, and data download for any of the approximately 2,000 estuaries and associated watersheds along the five US coastal regions.
Depression and College Students
Depression and College Students. Available to download as PDF or ePub, or to order in print; answers college students' frequently asked questions about depression.
Engagement for Enhancement: Report of a UK Survey Pilot
ERIC Educational Resources Information Center
Buckley, Alex
2013-01-01
This report presents the findings from a United Kingdom pilot of selected questions from the National Survey of Student Engagement (NSSE). 8,582 responses were gathered from nine institutions in Spring/Summer 2013. Also available to download are a full report of the cognitive testing by researchers from King's College London, a set of…
Delivering Instruction via Streaming Media: A Higher Education Perspective.
ERIC Educational Resources Information Center
Mortensen, Mark; Schlieve, Paul; Young, Jon
2000-01-01
Describes streaming media, an audio/video presentation that is delivered across a network so that it is viewed while being downloaded onto the user's computer, including a continuous stream of video that can be pre-recorded or live. Discusses its use for nontraditional students in higher education and reports on implementation experiences. (LRW)
Differences in STEM Baccalaureate Attainment by Ethnicity
ERIC Educational Resources Information Center
Koledoye, Kimberly; Joyner, Sheila; Slate, John R.
2011-01-01
In this study, we examined the extent to which differences were present in the science, technology, engineering, and math (STEM) baccalaureate attainment of Black students and of Hispanic students at 82 Texas 4-year colleges from 2008 to 2009. A custom download of data files was conducted on the Integrated Postsecondary Education Data System in…
Participation Through Gaze Controlled Computer for Children with Severe Multiple Disabilities.
Holmqvist, Eva; Derbring, Sandra; Wallin, Sofia
2017-01-01
This paper presents work on developing methodology material for use of gaze controlled computers. The target group is families and professionals around children with severe multiple disabilities. The material includes software grids for children at various levels, aimed for communication, leisure and learning and will be available for download.
Practical Downloading to Desktop Publishing: Enhancing the Delivery of Information.
ERIC Educational Resources Information Center
Danziger, Pamela N.
This paper is addressed to librarians and information managers who, as one of the many activities they routinely perform, frequently publish information in such formats as newsletters, manuals, brochures, forms, presentations, or reports. It is argued that desktop publishing--a personal computer-based software package used to generate documents of…
ERIC Educational Resources Information Center
Isakson, Carol
2006-01-01
A podcast is essentially a radio program that can be downloaded for enjoyment. Its content includes radio broadcasts, lectures, walking tours, and student-created audio projects. Most are in the standard MP3 file format that can be played on a computer, MP3 player, PDA, or newer CD or DVD players. This article presents resources for learning about…
Biblio-Link and Pro-Cite: The Searcher's Workstation.
ERIC Educational Resources Information Center
Hoyle, Norman; McNamara, Kathleen
1987-01-01
Describes the Biblio-Link and Pro-Cite software packages, which can be used together to create local databases with downloaded records, or to reorganize and repackage downloaded records for client reports. (CLB)
Sodium 3D COncentration MApping (COMA 3D) using 23Na and proton MRI
NASA Astrophysics Data System (ADS)
Truong, Milton L.; Harrington, Michael G.; Schepkin, Victor D.; Chekmenev, Eduard Y.
2014-10-01
Functional changes of sodium 3D MRI signals were converted into millimolar concentration changes using an open-source fully automated MATLAB toolbox. These concentration changes are visualized via 3D sodium concentration maps, and they are overlaid over conventional 3D proton images to provide high-resolution co-registration for easy correlation of functional changes to anatomical regions. Nearly 5000 concentration maps per hour were generated on a personal computer (ca. 2012) using 21.1 T 3D sodium MRI brain images of live rats with spatial resolution of 0.8 × 0.8 × 0.8 mm³ and imaging matrices of 60 × 60 × 60. The produced concentration maps allowed for non-invasive quantitative measurement of in vivo sodium concentration in the normal rat brain as a functional response to migraine-like conditions. The presented work can also be applied to sodium-associated changes in migraine, cancer, and other metabolic abnormalities that can be sensed by molecular imaging. The MATLAB toolbox allows for automated image analysis of the 3D images acquired on the Bruker platform and can be extended to other imaging platforms. The resulting images are presented in the form of a series of 2D slices in all three dimensions in native MATLAB and PDF formats. The following is provided: (a) MATLAB source code for image processing, (b) the detailed processing procedures, (c) description of the code and all sub-routines, (d) example data sets of initial and processed data. The toolbox can be downloaded at: http://www.vuiis.vanderbilt.edu/truongm/COMA3D/.
Entropic Profiler – detection of conservation in genomes using information theory
Fernandes, Francisco; Freitas, Ana T; Almeida, Jonas S; Vinga, Susana
2009-01-01
Background In the last decades, with the successive availability of whole genome sequences, many research efforts have been made to mathematically model DNA. Entropic Profiles (EP) were proposed recently as a new measure of continuous entropy of genome sequences. EP represent local information plots related to DNA randomness and are based on information theory and statistical concepts. They express the weighted relative abundance of motifs for each position in genomes. Their study is very relevant because under- or over-represented segments are often associated with significant biological meaning. Findings The Entropic Profiler application presented here is a new tool designed to detect and extract under- and over-represented DNA segments in genomes by using EP. It computes EP very efficiently by using improved algorithms and data structures, which include modified suffix trees. Available through a web interface and as downloadable source code, it allows users to study positions and to search for motifs inside the whole sequence or within a specified range. DNA sequences can be entered from different sources, including FASTA files, pre-loaded examples, or by resuming previously saved work. Besides the EP value plots, p-values and z-scores for each motif are also computed, along with the Chaos Game Representation of the sequence. Conclusion EP are directly related to the statistical significance of motifs and can be considered as a new method to extract and classify significant regions in genomes and estimate local scales in DNA. The present implementation establishes an efficient and useful tool for whole genome analysis. PMID:19416538
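For intuition only, the following sketch computes a position-wise over/under-representation score for k-mers under a uniform background; it is not the published EP formula (which uses suffix-tree-based weighting across motif lengths), just a simplified stand-in showing what a per-position motif z-score looks like. The sequence and k are toy choices.

```python
# Simplified illustration of position-wise motif over/under-representation in
# the spirit of Entropic Profiles; NOT the published EP formula, just a plain
# k-mer count with a z-score under a uniform-background Poisson approximation.
from collections import Counter
import math

def kmer_zscores(seq, k):
    n = len(seq) - k + 1
    counts = Counter(seq[i:i + k] for i in range(n))
    expected = n * (0.25 ** k)          # uniform i.i.d. background expectation
    sd = math.sqrt(expected)            # Poisson approximation of the spread
    # z-score of the k-mer ending at each position i (for i >= k - 1)
    return [(i, seq[i - k + 1:i + 1], (counts[seq[i - k + 1:i + 1]] - expected) / sd)
            for i in range(k - 1, len(seq))]

for pos, motif, z in kmer_zscores("ACGTACGTACGTTTTT", 4)[:5]:
    print(pos, motif, round(z, 2))
```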
Yam, Chun-Shan
2007-11-01
The purpose of this article is to describe an alternative for creating scrollable movie loops for electronic presentations including PowerPoint. The alternative provided in this article enables academic radiologists to present scrollable movie loops in PowerPoint. The scrolling capability is created using Flash ActionScript. A Flash template with the required ActionScript code is provided. Users can simply download the template and follow the step-by-step demonstration to create scrollable movie loops. No previous ActionScript programming knowledge is necessary.
Brabb, Earl E.; Roberts, Sebastian; Cotton, William R.; Kropp, Alan L.; Wright, Robert H.; Zinn, Erik N.; Digital database by Roberts, Sebastian; Mills, Suzanne K.; Barnes, Jason B.; Marsolek, Joanna E.
2000-01-01
This publication consists of a digital map database on a geohazards web site, http://kaibab.wr.usgs.gov/geohazweb/intro.htm, this text, and 43 digital map images available for downloading at this site. The report is stored as several digital files, in ARC export (uncompressed) format for the database, and Postscript and PDF formats for the map images. Several of the source data layers for the images have already been released in other publications by the USGS and are available for downloading on the Internet. These source layers are not included in this digital database, but rather a reference is given for the web site where the data can be found in digital format. The exported ARC coverages and grids lie in UTM zone 10 projection. The pamphlet, which only describes the content and character of the digital map database, is included as Postscript, PDF, and ASCII text files and is also available on paper as USGS Open-File Report 00-127. The full versatility of the spatial database is realized by importing the ARC export files into ARC/INFO or an equivalent GIS. Other GIS packages, including MapInfo and ARCVIEW, can also use the ARC export files. The Postscript map image can be used for viewing or plotting in computer systems with sufficient capacity, and the considerably smaller PDF image files can be viewed or plotted in full or in part from Adobe ACROBAT software running on Macintosh, PC, or UNIX platforms.
The Environmental Protection Agency's Enforcement and Compliance History Online (ECHO) website provides customizable and downloadable information about environmental inspections, violations, and enforcement actions for EPA-regulated facilities, like power plants and factories. ECHO advances public information by sharing data related to facility compliance with, and regulatory agency activity related to, air, hazardous waste, clean water, and drinking water regulations. ECHO offers many user-friendly options to explore data, including:
1. Facility Search (http://echo.epa.gov/facilities/facility-search?mediaSelected=all): ECHO information is searchable by varied criteria, including location, facility type, and compliance status related to the Clean Air Act, Clean Water Act, Resource Conservation and Recovery Act, and Safe Drinking Water Act. Search results are customizable and downloadable.
2. Comparative Maps (http://echo.epa.gov/maps/state-comparative-maps) and State Dashboards (http://echo.epa.gov/trends/comparative-maps-dashboards/state-air-dashboard): These tools offer aggregated information about facility compliance status and regulatory agency compliance monitoring and enforcement activity at the national and state level.
3. Bulk Data Downloads (http://echo.epa.gov/resources/echo-data/data-downloads): One of ECHO's most popular features is the ability to work offline by downloading large data sets (see the sketch after this record). Users can take advantage of the ECHO Exporter, which provides su
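A hedged sketch of the offline workflow mentioned under Bulk Data Downloads follows; the download URL and the column name used in the filter are placeholders, not real ECHO identifiers.

```python
# Hedged sketch of the "work offline" pattern described above: download a bulk
# ZIP extract and filter it locally. The file URL is a placeholder, not a real
# ECHO download link, and the column name is hypothetical.
import io
import zipfile

import pandas as pd
import requests

BULK_URL = "https://example.org/echo/exporter.zip"  # placeholder for an ECHO bulk download

resp = requests.get(BULK_URL, timeout=300)
resp.raise_for_status()

with zipfile.ZipFile(io.BytesIO(resp.content)) as zf:
    csv_name = zf.namelist()[0]                      # assume a single CSV inside
    with zf.open(csv_name) as fh:
        facilities = pd.read_csv(fh, low_memory=False)

# "CWA_COMPLIANCE_STATUS" is a hypothetical column name used purely for illustration.
in_violation = facilities[facilities["CWA_COMPLIANCE_STATUS"] == "Violation"]
print(len(facilities), "facilities;", len(in_violation), "flagged")
```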
Cough and Cold Medicine (DXM and Codeine Syrup)
This interactive website provides access to cancer statistics (rates and trends) for a cancer site by gender, race, calendar year, stage, and histology. Users can create custom graphs and tables, download data and images, download SEER*Stat sessions, and share results.
5 Things You Should Know about Stress
5 Things You Should Know About Stress. Available to download as PDF or ePub, or to order as a free hardcopy; covers five things to know about stress, starting with the fact that stress affects everyone.
DefenseLink Special: Doolittle Raid, April 18, 1942
DOOLITTLE RAIDERS: Retired Lt. Col. Chase Nelson, a Doolittle Raider. (Enhanced viewing requires Macromedia Flash Player 8.)
7 CFR 318.13-14 - Movement of processed fruits, vegetables, and other products.
Code of Federal Regulations, 2010 CFR
2010-01-01
.../import_export/plants/manuals/ports/downloads/hawaii.pdf and http://www.aphis.usda.gov/import_export/plants/manuals/ports/downloads/puerto_rico.pdf. (b) Consignments of processed fruits, vegetables, or...
The Teen Brain: 6 Things to Know
The Teen Brain: 6 Things to Know. Available to download as PDF or ePub; describes the big and important changes happening to the brain during adolescence.
NASA Technical Reports Server (NTRS)
Duggan, Brian
2012-01-01
Downloading and organizing large amounts of files is challenging, and often done using ad hoc methods. This software is capable of downloading and organizing files as an OpenSearch client. It can subscribe to RSS (Really Simple Syndication) feeds and Atom feeds containing arbitrary metadata, and maintains a local content addressable data store. It uses existing standards for obtaining the files, and uses efficient techniques for storing the files. Novel features include symbolic links to maintain a sane directory structure, checksums for validating file integrity during transfer and storage, and flexible use of server-provided metadata.
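The following sketch illustrates the storage pattern the description mentions, namely a checksum-validated, content-addressable store with symbolic links providing a readable directory structure; the directory names and example URL are illustrative, not the tool's own layout.

```python
# Sketch of a content-addressable store keyed by checksum, with a
# human-readable symbolic link into it; names and layout are illustrative.
import hashlib
import os
import urllib.request

def fetch_to_store(url, store_dir="store", link_dir="by-name"):
    os.makedirs(store_dir, exist_ok=True)
    os.makedirs(link_dir, exist_ok=True)

    data = urllib.request.urlopen(url).read()
    digest = hashlib.sha256(data).hexdigest()      # checksum validates integrity
    store_path = os.path.join(store_dir, digest)

    if not os.path.exists(store_path):             # content-addressed: duplicates collapse
        with open(store_path, "wb") as fh:
            fh.write(data)

    link_path = os.path.join(link_dir, os.path.basename(url) or digest)
    if not os.path.lexists(link_path):             # symbolic link keeps a sane directory view
        os.symlink(os.path.relpath(store_path, link_dir), link_path)
    return store_path

# Example (URL is a placeholder):
# fetch_to_store("https://example.org/feeds/granule_001.nc")
```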
Berillo, Olga; Régnier, Mireille; Ivashchenko, Anatoly
2014-01-01
microRNAs are small RNA molecules that inhibit the translation of target genes. microRNA binding sites are located in the untranslated regions as well as in the coding domains. We describe TmiRUSite and TmiROSite scripts developed using Python as tools for the extraction of nucleotide sequences for miRNA binding sites with their encoded amino acid residue sequences. The scripts allow for retrieving a set of additional sequences to the left and right of the binding site. The scripts present all received data in table formats that are easy to analyse further. The predicted data find utility in molecular and evolutionary biology studies. They find use in studying miRNA binding sites in animals and plants. TmiRUSite and TmiROSite scripts are available for free from the authors upon request and for download at https://sites.google.com/site/malaheenee/downloads.
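A minimal illustration of the extraction task follows; it is not the authors' code, and the coordinates and toy mRNA sequence are hypothetical.

```python
# Illustrative sketch of the flank-extraction task the scripts perform: pull a
# miRNA binding-site subsequence plus a chosen number of nucleotides on each
# side. Coordinates and the toy mRNA sequence are hypothetical.
def binding_site_with_flanks(mrna, start, end, left=10, right=10):
    """Return (left flank, site, right flank) for a 0-based [start, end) site."""
    left_flank = mrna[max(0, start - left):start]
    site = mrna[start:end]
    right_flank = mrna[end:end + right]
    return left_flank, site, right_flank

mrna = "AUGGCUACGUUAGCCGAUAACGGUUACGGAUCCUAA"
print(binding_site_with_flanks(mrna, 12, 19, left=5, right=5))
```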
To improve public health and the environment, the United States Environmental Protection Agency (USEPA) collects information about facilities, sites, or places subject to environmental regulation or of environmental interest. Through the Geospatial Data Download Service, the public is now able to download the EPA Geodata shapefile containing facility and site information from EPA's national program systems. The file is Internet accessible from the Envirofacts Web site (https://www3.epa.gov/enviro/). The data may be used with geospatial mapping applications. (Note: The shapefile omits facilities without latitude/longitude coordinates.) The EPA Geospatial Data contains the name, location (latitude/longitude), and EPA program information about specific facilities and sites. In addition, the file contains a Uniform Resource Locator (URL), which allows mapping applications to present an option to users to access additional EPA data resources on a specific facility or site. This dataset shows Brownfields listed in the 2012 Facility Registry System.
Surgical simulation software for insertion of pedicle screws.
Eftekhar, Behzad; Ghodsi, Mohammad; Ketabchi, Ebrahim; Rasaee, Saman
2002-01-01
As the first step toward finding noninvasive alternatives to the traditional methods of surgical training, we have developed a small, stand-alone computer program that simulates insertion of pedicle screws in different spinal vertebrae (T10-L5). We used Delphi 5.0 and the DirectX 7.0 extension for Microsoft Windows. This is a stand-alone and portable program. The program can run on most personal computers. It provides the trainee with visual feedback during practice of the technique. At present, it uses predefined three-dimensional images of the vertebrae, but we are attempting to adapt the program to three-dimensional objects based on real computed tomographic scans of the patients. The program can be downloaded at no cost from the web site www.tums.ac.ir/downloads. As a preliminary work, it requires further development, particularly toward better visual, auditory, and even proprioceptive feedback and use of the individual patient's data.
The braingraph.org database of high resolution structural connectomes and the brain graph tools.
Kerepesi, Csaba; Szalkai, Balázs; Varga, Bálint; Grolmusz, Vince
2017-10-01
Based on the data of the NIH-funded Human Connectome Project, we have computed structural connectomes of 426 human subjects in five different resolutions of 83, 129, 234, 463 and 1015 nodes and several edge weights. The graphs are given in anatomically annotated GraphML format that facilitates better further processing and visualization. For 96 subjects, the anatomically classified sub-graphs can also be accessed, formed from the vertices corresponding to distinct lobes or even smaller regions of interests of the brain. For example, one can easily download and study the connectomes, restricted to the frontal lobes or just to the left precuneus of 96 subjects using the data. Partially directed connectomes of 423 subjects are also available for download. We also present a GitHub-deposited set of tools, called the Brain Graph Tools, for several processing tasks of the connectomes on the site http://braingraph.org.
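As an example of further processing, the sketch below loads one of the GraphML connectomes with NetworkX and keeps only nodes whose anatomical annotation mentions the frontal lobe; the file name and the attribute key "region" are assumptions that should be checked against the downloaded files.

```python
# Hedged sketch of loading a GraphML connectome and restricting it to one lobe.
# The file name and the node attribute name ("region") are hypothetical; the
# actual anatomical annotation keys should be checked in the downloaded files.
import networkx as nx

g = nx.read_graphml("subject_83nodes.graphml")   # hypothetical file name

frontal_nodes = [n for n, attrs in g.nodes(data=True)
                 if "frontal" in str(attrs.get("region", "")).lower()]
frontal = g.subgraph(frontal_nodes)

print(g.number_of_nodes(), "nodes total;",
      frontal.number_of_nodes(), "in the frontal sub-graph")
```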
U.S. EPA's Geospatial Data Access Project
To improve public health and the environment, the United States Environmental Protection Agency (EPA) collects information about facilities, sites, or places subject to environmental regulation or of environmental interest. Through the Geospatial Data Download Service, the public is now able to download the EPA Geodata Shapefile, Feature Class or extensible markup language (XML) file containing facility and site information from EPA's national program systems. The files are Internet accessible from the Envirofacts Web site (https://www3.epa.gov/enviro/). The data may be used with geospatial mapping applications. (Note: The files omit facilities without latitude/longitude coordinates.) The EPA Geospatial Data contains the name, location (latitude/longitude), and EPA program information about specific facilities and sites. In addition, the files contain a Uniform Resource Locator (URL), which allows mapping applications to present an option to users to access additional EPA data resources on a specific facility or site.
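A brief sketch of using the downloaded shapefile with a geospatial library follows; the file name and the bounding box are hypothetical, and the first step simply lists the actual attribute columns.

```python
# Hedged sketch of using the downloaded EPA Geodata shapefile with geopandas;
# the file name, column inspection, and bounding box are illustrative only.
import geopandas as gpd

facilities = gpd.read_file("EPA_geodata.shp")   # hypothetical file name
print(facilities.columns.tolist())              # inspect the actual attribute names

# Example spatial filter: facilities inside a rough lon/lat bounding box,
# using geopandas' coordinate-based .cx indexer.
box = facilities.cx[-85.0:-80.0, 30.0:35.0]
print(len(facilities), "facilities total;", len(box), "inside the box")
```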
TCIApathfinder: an R client for The Cancer Imaging Archive REST API.
Russell, Pamela; Fountain, Kelly; Wolverton, Dulcy; Ghosh, Debashis
2018-06-05
The Cancer Imaging Archive (TCIA) hosts publicly available de-identified medical images of cancer from over 25 body sites and over 30,000 patients. Over 400 published studies have utilized freely available TCIA images. Images and metadata are available for download through a web interface or a REST API. Here we present TCIApathfinder, an R client for the TCIA REST API. TCIApathfinder wraps API access in user-friendly R functions that can be called interactively within an R session or easily incorporated into scripts. Functions are provided to explore the contents of the large database and to download image files. TCIApathfinder provides easy access to TCIA resources in the highly popular R programming environment. TCIApathfinder is freely available under the MIT license as a package on CRAN (https://cran.r-project.org/web/packages/TCIApathfinder/index.html) and at https://github.com/pamelarussell/TCIApathfinder. Copyright ©2018, American Association for Cancer Research.
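For readers outside R, the same REST API can be queried directly; the sketch below uses Python and the getCollectionValues endpoint, with the caveat that the base URL and endpoint name reflect the public TCIA API of that period and should be verified against the current documentation.

```python
# Hedged sketch of the underlying TCIA REST API that TCIApathfinder wraps,
# shown in Python rather than R; the base URL and endpoint are assumptions to
# be checked against the current TCIA API documentation.
import requests

BASE = "https://services.cancerimagingarchive.net/services/v4/TCIA/query"

resp = requests.get(f"{BASE}/getCollectionValues", params={"format": "json"}, timeout=60)
resp.raise_for_status()
for entry in resp.json()[:10]:
    print(entry.get("Collection"))   # list the first few image collections
```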
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Benjamin F.; Dalcanton, Julianne J.; Weisz, Daniel R.
We have measured stellar photometry with the Hubble Space Telescope Wide Field Camera 3 (WFC3) and Advanced Camera for Surveys in near ultraviolet (F275W, F336W), optical (F475W, F814W), and near infrared (F110W, F160W) bands for 117 million resolved stars in M31. As part of the Panchromatic Hubble Andromeda Treasury survey, we measured photometry with simultaneous point-spread function (PSF) fitting across all bands and at all source positions after precise astrometric image alignment (<5-10 mas accuracy). In the outer disk, the photometry reaches a completeness-limited depth of F475W ∼ 28, while in the crowded, high surface brightness bulge, the photometry reaches F475W ∼ 25. We find that simultaneous photometry and optimized measurement parameters significantly increase the detection limit of the lowest-resolution filters (WFC3/IR) providing color-magnitude diagrams (CMDs) that are up to 2.5 mag deeper when compared with CMDs from WFC3/IR photometry alone. We present extensive analysis of the data quality including comparisons of luminosity functions and repeat measurements, and we use artificial star tests to quantify photometric completeness, uncertainties and biases. We find that the largest sources of systematic error in the photometry are due to spatial variations in the PSF models and charge transfer efficiency corrections. This stellar catalog is the largest ever produced for equidistant sources, and is publicly available for download by the community.
ERIC Educational Resources Information Center
Kumaran, Maha; Geary, Joe
2011-01-01
Technology has transformed libraries. There are digital libraries, electronic collections, online databases and catalogs, ebooks, downloadable books, and much more. With free technology such as social websites, newspaper collections, downloadable online calendars, clocks and sticky notes, online scheduling, online document sharing, and online…
Code of Federal Regulations, 2014 CFR
2014-07-01
....10 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS... AND DISTRIBUTING OF PHYSICAL AND DIGITAL PHONORECORDS Interactive Streaming and Limited Downloads... interactive streams and limited downloads of musical works by subscription and nonsubscription digital music...
Publications - PR 121 | Alaska Division of Geological & Geophysical Surveys
Download below or see the publication sales page for more information. Quadrangle(s): Philip Smith Mountains. Philip Smith Mountains surficial geology data files are available for download (psm-surficial-geo ...).
Mining collections of compounds with Screening Assistant 2
2012-01-01
Background High-throughput screening assays have become the starting point of many drug discovery programs for large pharmaceutical companies as well as academic organisations. Despite the increasing throughput of screening technologies, the almost infinite chemical space remains out of reach, calling for tools dedicated to the analysis and selection of the compound collections intended to be screened. Results We present Screening Assistant 2 (SA2), an open-source JAVA software dedicated to the storage and analysis of small to very large chemical libraries. SA2 stores unique molecules in a MySQL database, and encapsulates several chemoinformatics methods, among which: providers management, interactive visualisation, scaffold analysis, diverse subset creation, descriptors calculation, sub-structure / SMART search, similarity search and filtering. We illustrate the use of SA2 by analysing the composition of a database of 15 million compounds collected from 73 providers, in terms of scaffolds, frameworks, and undesired properties as defined by recently proposed HTS SMARTS filters. We also show how the software can be used to create diverse libraries based on existing ones. Conclusions Screening Assistant 2 is a user-friendly, open-source software that can be used to manage collections of compounds and perform simple to advanced chemoinformatics analyses. Its modular design and growing documentation facilitate the addition of new functionalities, calling for contributions from the community. The software can be downloaded at http://sa2.sourceforge.net/. PMID:23327565
Mining collections of compounds with Screening Assistant 2.
Guilloux, Vincent Le; Arrault, Alban; Colliandre, Lionel; Bourg, Stéphane; Vayer, Philippe; Morin-Allory, Luc
2012-08-31
High-throughput screening assays have become the starting point of many drug discovery programs for large pharmaceutical companies as well as academic organisations. Despite the increasing throughput of screening technologies, the almost infinite chemical space remains out of reach, calling for tools dedicated to the analysis and selection of the compound collections intended to be screened. We present Screening Assistant 2 (SA2), an open-source JAVA software dedicated to the storage and analysis of small to very large chemical libraries. SA2 stores unique molecules in a MySQL database, and encapsulates several chemoinformatics methods, among which: providers management, interactive visualisation, scaffold analysis, diverse subset creation, descriptors calculation, sub-structure / SMART search, similarity search and filtering. We illustrate the use of SA2 by analysing the composition of a database of 15 million compounds collected from 73 providers, in terms of scaffolds, frameworks, and undesired properties as defined by recently proposed HTS SMARTS filters. We also show how the software can be used to create diverse libraries based on existing ones. Screening Assistant 2 is a user-friendly, open-source software that can be used to manage collections of compounds and perform simple to advanced chemoinformatics analyses. Its modular design and growing documentation facilitate the addition of new functionalities, calling for contributions from the community. The software can be downloaded at http://sa2.sourceforge.net/.
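To illustrate the kind of SMARTS-based filtering both abstracts mention, the sketch below applies a single undesired-substructure filter with RDKit; this is not Screening Assistant 2 itself, and the SMARTS pattern and SMILES strings are toy examples.

```python
# Illustrative SMARTS filtering with RDKit, not Screening Assistant 2 itself;
# the pattern (a simple aldehyde) and the SMILES library are toy examples.
from rdkit import Chem

undesired = Chem.MolFromSmarts("[CX3H1](=O)[#6]")   # aldehyde, a common reactive-group filter
library = ["CCO", "CC=O", "c1ccccc1C=O", "CCN"]

kept = []
for smiles in library:
    mol = Chem.MolFromSmiles(smiles)
    if mol is not None and not mol.HasSubstructMatch(undesired):
        kept.append(smiles)

print("kept:", kept)   # molecules passing the filter
```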
Weiss, Scott T.
2014-01-01
Bayesian Networks (BN) have been a popular predictive modeling formalism in bioinformatics, but their application in modern genomics has been slowed by an inability to cleanly handle domains with mixed discrete and continuous variables. Existing free BN software packages either discretize continuous variables, which can lead to information loss, or do not include inference routines, which makes prediction with the BN impossible. We present CGBayesNets, a BN package focused around prediction of a clinical phenotype from mixed discrete and continuous variables, which fills these gaps. CGBayesNets implements Bayesian likelihood and inference algorithms for the conditional Gaussian Bayesian network (CGBNs) formalism, one appropriate for predicting an outcome of interest from, e.g., multimodal genomic data. We provide four different network learning algorithms, each making a different tradeoff between computational cost and network likelihood. CGBayesNets provides a full suite of functions for model exploration and verification, including cross validation, bootstrapping, and AUC manipulation. We highlight several results obtained previously with CGBayesNets, including predictive models of wood properties from tree genomics, leukemia subtype classification from mixed genomic data, and robust prediction of intensive care unit mortality outcomes from metabolomic profiles. We also provide detailed example analysis on public metabolomic and gene expression datasets. CGBayesNets is implemented in MATLAB and available as MATLAB source code, under an Open Source license and anonymous download at http://www.cgbayesnets.com. PMID:24922310
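For readers unfamiliar with the conditional Gaussian formalism mentioned here, its standard local model can be sketched as follows; the notation is chosen for illustration and is not taken from the package documentation.

```latex
% Standard conditional-Gaussian local model (notation chosen for illustration):
% a continuous node X with discrete parents D and continuous parents Y,
p(x \mid D = d, Y = y) \;=\; \mathcal{N}\!\left(x \;\middle|\; \mu_d + \beta_d^{\top} y,\; \sigma_d^{2}\right),
\qquad
p(D = d \mid \mathrm{pa}(D)) \;=\; \theta_{d \mid \mathrm{pa}(D)} .
```

In words, discrete nodes keep ordinary conditional probability tables, while each continuous node is Gaussian with a mean that is linear in its continuous parents and with parameters that switch with the discrete parent configuration d.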
EVEREST: Pixel Level Decorrelation of K2 Light Curves
NASA Astrophysics Data System (ADS)
Luger, Rodrigo; Agol, Eric; Kruse, Ethan; Barnes, Rory; Becker, Andrew; Foreman-Mackey, Daniel; Deming, Drake
2016-10-01
We present EPIC Variability Extraction and Removal for Exoplanet Science Targets (EVEREST), an open-source pipeline for removing instrumental noise from K2 light curves. EVEREST employs a variant of pixel level decorrelation to remove systematics introduced by the spacecraft’s pointing error and a Gaussian process to capture astrophysical variability. We apply EVEREST to all K2 targets in campaigns 0-7, yielding light curves with precision comparable to that of the original Kepler mission for stars brighter than Kp ≈ 13, and within a factor of two of the Kepler precision for fainter targets. We perform cross-validation and transit injection and recovery tests to validate the pipeline, and compare our light curves to the other de-trended light curves available for download at the MAST High Level Science Products archive. We find that EVEREST achieves the highest average precision of any of these pipelines for unsaturated K2 stars. The improved precision of these light curves will aid in exoplanet detection and characterization, investigations of stellar variability, asteroseismology, and other photometric studies. The EVEREST pipeline can also easily be applied to future surveys, such as the TESS mission, to correct for instrumental systematics and enable the detection of low signal-to-noise transiting exoplanets. The EVEREST light curves and the source code used to generate them are freely available online.
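As a sketch of the idea behind pixel level decorrelation, the first-order term of the model can be written as below; this is the generic PLD form rather than the full EVEREST model, which adds higher-order pixel products and the Gaussian process mentioned in the abstract.

```latex
% First-order pixel level decorrelation (PLD) term, as a sketch of the idea:
% the instrumental signal at cadence i is modeled as a weighted sum of the
% fractional fluxes of the j pixels in the aperture,
m_i \;=\; \sum_{j} w_j \, \frac{p_{ij}}{\sum_{k} p_{ik}},
% where p_{ij} is the flux in pixel j at cadence i and the weights w_j are fit
% to the data.
```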
McGeachie, Michael J; Chang, Hsun-Hsien; Weiss, Scott T
2014-06-01
Bayesian Networks (BN) have been a popular predictive modeling formalism in bioinformatics, but their application in modern genomics has been slowed by an inability to cleanly handle domains with mixed discrete and continuous variables. Existing free BN software packages either discretize continuous variables, which can lead to information loss, or do not include inference routines, which makes prediction with the BN impossible. We present CGBayesNets, a BN package focused around prediction of a clinical phenotype from mixed discrete and continuous variables, which fills these gaps. CGBayesNets implements Bayesian likelihood and inference algorithms for the conditional Gaussian Bayesian network (CGBNs) formalism, one appropriate for predicting an outcome of interest from, e.g., multimodal genomic data. We provide four different network learning algorithms, each making a different tradeoff between computational cost and network likelihood. CGBayesNets provides a full suite of functions for model exploration and verification, including cross validation, bootstrapping, and AUC manipulation. We highlight several results obtained previously with CGBayesNets, including predictive models of wood properties from tree genomics, leukemia subtype classification from mixed genomic data, and robust prediction of intensive care unit mortality outcomes from metabolomic profiles. We also provide detailed example analysis on public metabolomic and gene expression datasets. CGBayesNets is implemented in MATLAB and available as MATLAB source code, under an Open Source license and anonymous download at http://www.cgbayesnets.com.
Forsell, M; Häggström, M; Johansson, O; Sjögren, P
2008-11-08
To develop a personal digital assistant (PDA) application for oral health assessment fieldwork, including back-office and database systems (MobilDent). System design, construction and implementation of PDA, back-office and database systems. System requirements for MobilDent were collected, analysed and translated into system functions. User interfaces were implemented and system architecture was outlined. MobilDent was based on a platform with .NET (Microsoft) components, using an SQL Server 2005 (Microsoft) for data storage with the Windows Mobile (Microsoft) operating system. The PDA devices were Dell Axim. System functions and user interfaces were specified for MobilDent. User interfaces for PDA, back-office and database systems were based on .NET programming. The PDA user interface was based on Windows, suited to a PDA display, whereas the back-office interface was designed for a normal-sized computer screen. A synchronisation module (MS Active Sync, Microsoft) was used to enable download of field data from the PDA to the database. MobilDent is a feasible application for oral health assessment fieldwork, and the oral health assessment database may prove a valuable source for care planning, educational and research purposes. Further development of the MobilDent system will include wireless connectivity with download-on-demand technology.
GenomeVIP: a cloud platform for genomic variant discovery and interpretation
Mashl, R. Jay; Scott, Adam D.; Huang, Kuan-lin; Wyczalkowski, Matthew A.; Yoon, Christopher J.; Niu, Beifang; DeNardo, Erin; Yellapantula, Venkata D.; Handsaker, Robert E.; Chen, Ken; Koboldt, Daniel C.; Ye, Kai; Fenyö, David; Raphael, Benjamin J.; Wendl, Michael C.; Ding, Li
2017-01-01
Identifying genomic variants is a fundamental first step toward the understanding of the role of inherited and acquired variation in disease. The accelerating growth in the corpus of sequencing data that underpins such analysis is making the data-download bottleneck more evident, placing substantial burdens on the research community to keep pace. As a result, the search for alternative approaches to the traditional “download and analyze” paradigm on local computing resources has led to a rapidly growing demand for cloud-computing solutions for genomics analysis. Here, we introduce the Genome Variant Investigation Platform (GenomeVIP), an open-source framework for performing genomics variant discovery and annotation using cloud- or local high-performance computing infrastructure. GenomeVIP orchestrates the analysis of whole-genome and exome sequence data using a set of robust and popular task-specific tools, including VarScan, GATK, Pindel, BreakDancer, Strelka, and Genome STRiP, through a web interface. GenomeVIP has been used for genomic analysis in large-data projects such as the TCGA PanCanAtlas and in other projects, such as the ICGC Pilots, CPTAC, ICGC-TCGA DREAM Challenges, and the 1000 Genomes SV Project. Here, we demonstrate GenomeVIP's ability to provide high-confidence annotated somatic, germline, and de novo variants of potential biological significance using publicly available data sets. PMID:28522612
VIOLIN: vaccine investigation and online information network.
Xiang, Zuoshuang; Todd, Thomas; Ku, Kim P; Kovacic, Bethany L; Larson, Charles B; Chen, Fang; Hodges, Andrew P; Tian, Yuying; Olenzek, Elizabeth A; Zhao, Boyang; Colby, Lesley A; Rush, Howard G; Gilsdorf, Janet R; Jourdian, George W; He, Yongqun
2008-01-01
Vaccines are among the most efficacious and cost-effective tools for reducing morbidity and mortality caused by infectious diseases. The vaccine investigation and online information network (VIOLIN) is a web-based central resource, allowing easy curation, comparison and analysis of vaccine-related research data across various human pathogens (e.g. Haemophilus influenzae, human immunodeficiency virus (HIV) and Plasmodium falciparum) of medical importance and across humans, other natural hosts and laboratory animals. Vaccine-related peer-reviewed literature data have been downloaded into the database from PubMed and are searchable through various literature search programs. Vaccine data are also annotated, edited and submitted to the database through a web-based interactive system that integrates efficient computational literature mining and accurate manual curation. Curated information includes general microbial pathogenesis and host protective immunity, vaccine preparation and characteristics, stimulated host responses after vaccination and protection efficacy after challenge. Vaccine-related pathogen and host genes are also annotated and available for searching through customized BLAST programs. All VIOLIN data are available for download in an eXtensible Markup Language (XML)-based data exchange format. VIOLIN is expected to become a centralized source of vaccine information and to provide investigators in basic and clinical sciences with curated data and bioinformatics tools for vaccine research and development. VIOLIN is publicly available at http://www.violinet.org.
Davis, Brian N.; Werpy, Jason; Friesz, Aaron M.; Impecoven, Kevin; Quenzer, Robert; Maiersperger, Tom; Meyer, David J.
2015-01-01
Current methods of searching for and retrieving data from satellite land remote sensing archives do not allow for interactive information extraction. Instead, Earth science data users are required to download files over low-bandwidth networks to local workstations and process data before science questions can be addressed. New methods of extracting information from data archives need to become more interactive to meet user demands for deriving increasingly complex information from rapidly expanding archives. Moving the tools required for processing data to computer systems of data providers, and away from systems of the data consumer, can improve turnaround times for data processing workflows. The implementation of middleware services was used to provide interactive access to archive data. The goal of this middleware services development is to enable Earth science data users to access remote sensing archives for immediate answers to science questions instead of links to large volumes of data to download and process. Exposing data and metadata to web-based services enables machine-driven queries and data interaction. Also, product quality information can be integrated to enable additional filtering and sub-setting. Only the reduced content required to complete an analysis is then transferred to the user.
Operational thermal remote sensing and lava flow monitoring at the Hawaiian Volcano Observatory
Patrick, Matthew R.; Kauahikaua, James P.; Orr, Tim R.; Davies, Ashley G.; Ramsey, Michael S.
2016-01-01
Hawaiian volcanoes are highly accessible and well monitored by ground instruments. Nevertheless, observational gaps remain and thermal satellite imagery has proven useful in Hawai‘i for providing synoptic views of activity during intervals between field visits. Here we describe the beginning of a thermal remote sensing programme at the US Geological Survey Hawaiian Volcano Observatory (HVO). Whereas expensive receiving stations have been traditionally required to achieve rapid downloading of satellite data, we exploit free, low-latency data sources on the internet for timely access to GOES, MODIS, ASTER and EO-1 ALI imagery. Automated scripts at the observatory download these data and provide a basic display of the images. Satellite data have been extremely useful for monitoring the ongoing lava flow activity on Kīlauea's East Rift Zone at Pu‘u ‘Ō‘ō over the past few years. A recent lava flow, named Kahauale‘a 2, was upslope from residential subdivisions for over a year. Satellite data helped track the slow advance of the flow and contributed to hazard assessments. Ongoing improvement to thermal remote sensing at HVO incorporates automated hotspot detection, effusion rate estimation and lava flow forecasting, as has been done in Italy. These improvements should be useful for monitoring future activity on Mauna Loa.
Station Program Note Pull Automation
NASA Technical Reports Server (NTRS)
Delgado, Ivan
2016-01-01
Upon commencement of my internship, I was in charge of maintaining the CoFR (Certificate of Flight Readiness) Tool. The tool acquires data from existing Excel workbooks on NASA's and Boeing's databases to create a new spreadsheet listing out all the potential safety concerns for upcoming flights and software transitions. Since the application was written in Visual Basic, I had to learn a new programming language and prepare to handle any malfunctions within the program. Shortly afterwards, I was given the assignment to automate the Station Program Note (SPN) Pull process. I developed a Python application that generates a GUI (Graphical User Interface) to be used by the International Space Station Safety & Mission Assurance team here at Johnson Space Center. The application will allow its users to download online files with the click of a button, import SPNs based on three different pulls, instantly manipulate and filter spreadsheets, and compare the three sources to determine which active SPNs (Station Program Notes) must be reviewed for any upcoming flights, missions, and/or software transitions. Initially, to perform the NASA SPN pull (one of three), I had created the program to allow the user to log in to a secure webpage that stores data, input specific parameters, and retrieve the desired SPNs based on their inputs. However, to avoid any conflicts with sustainment, I altered it so that the user may log in and download the NASA file independently. After the user has downloaded the file with the click of a button, the program checks for any outdated or pre-existing files and for successful downloads, acquires the spreadsheet, converts it from a text file to a comma-separated file and finally into an Excel spreadsheet to be filtered and later scrutinized for specific SPN numbers. Once this file has been automatically manipulated to provide only the SPN numbers that are desired, they are stored in a global variable, shown on the GUI, and transferred over to a new Excel worksheet for comparison. I managed to get my application to acquire the CSWG (Computer Safety Working Group) and the SPNWG (Space Station Working Group) SPNs with just two mouse clicks for each pull, as opposed to several from the original process. When all three pulls are performed, an Excel sheet containing all three different results will be generated for the user to compare and determine which SPNs will be presented or reviewed the following month. The experience from this internship has been spectacular. As a high school senior who will begin attending college in the fall, this internship has been both educationally and occupationally beneficial. The internship has given me the opportunity to learn new programming languages, network effectively with NASA personnel from a variety of departments at JSC, and develop new professional skills and etiquette. My internship at NASA's Johnson Space Center has further motivated me to pursue a Master's degree in Software Engineering and strive for a prosperous career with NASA as a civil servant.
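The text-to-spreadsheet conversion and SPN filtering step described above can be illustrated with a short sketch. The code below is not the internship application itself; it assumes a hypothetical tab-delimited export file (spn_export.txt), a hypothetical column name ("SPN Number"), and a hypothetical SPN identifier format, and shows one way such a downloaded text file could be parsed, filtered, and written to a new Excel worksheet with pandas.

```python
import pandas as pd

# Hypothetical input: a tab-delimited text export downloaded from the database.
SOURCE_TXT = "spn_export.txt"
FILTERED_XLSX = "spn_filtered.xlsx"
SPN_PATTERN = r"SPN-\d{5}"  # assumed SPN number format

def pull_spn_numbers(source=SOURCE_TXT, output=FILTERED_XLSX):
    # Read the tab-delimited text file directly into a DataFrame
    # (the equivalent of the text -> CSV -> spreadsheet conversion).
    df = pd.read_csv(source, sep="\t", dtype=str)

    # Keep only rows whose identifier column matches the assumed SPN format.
    mask = df["SPN Number"].str.match(SPN_PATTERN, na=False)  # column name assumed
    spns = df.loc[mask]

    # Write the filtered rows to a new Excel worksheet for comparison.
    spns.to_excel(output, sheet_name="NASA Pull", index=False)
    return spns["SPN Number"].tolist()

if __name__ == "__main__":
    print(pull_spn_numbers())
```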
Navier-Stokes flowfield computation of wing/rotor interaction for a tilt rotor aircraft in hover
NASA Technical Reports Server (NTRS)
Fejtek, Ian G.
1993-01-01
The download on the wing produced by the rotor-induced downwash of a tilt rotor aircraft in hover is of major concern because of its severe impact on payload-carrying capability. A method has been developed to help gain a better understanding of the fundamental fluid dynamics that causes this download, and to help find ways to reduce it. In particular, the method is employed in this work to analyze the effect of a tangential leading edge circulation-control jet on download reduction. Because of the complexities associated with modeling the complete configuration, this work focuses specifically on the wing/rotor interaction of a tilt rotor aircraft in hover. The three-dimensional, unsteady, thin-layer compressible Navier-Stokes equations are solved using a time-accurate, implicit, finite difference scheme that employs LU-ADI factorization. The rotor is modeled as an actuator disk which imparts both a radial and an azimuthal distribution of pressure rise and swirl to the flowfield. A momentum theory blade element analysis of the rotor is incorporated into the Navier-Stokes solution method. Solution blanking at interior points of the mesh has been shown here to be an effective technique in introducing the effects of the rotor and tangential leading edge jet. Results are presented both for a rotor alone and for wing/rotor interaction. The overall mean characteristics of the rotor flowfield are computed including the flow acceleration through the rotor disk, the axial and swirl velocities in the rotor downwash, and the slipstream contraction. Many of the complex tilt rotor flow features are captured including the highly three-dimensional flow over the wing, the recirculation fountain at the plane of symmetry, wing leading and trailing edge separation, and the large region of separated flow beneath the wing. Mean wing surface pressures compare fairly well with available experimental data, but the time-averaged download/thrust ratio is 20-30 percent higher than the measured value. The discrepancy is due to a combination of factors that are discussed. Leading edge tangential blowing, of constant strength along the wing span, is shown to be effective in reducing download. The jet serves primarily to reduce the pressure on the wing upper surface. The computation clearly shows that, because of the three-dimensionality of the flowfield, optimum blowing would involve a spanwise variation in blowing strength.
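For readers unfamiliar with the rotor modeling mentioned above, a momentum-theory estimate gives a feel for the quantities involved. The sketch below is not the paper's Navier-Stokes method; it is a back-of-envelope calculation, under an assumed flat-plate drag coefficient for the wing area immersed in the wake, of the induced velocity at the rotor disk and a crude download-to-thrust ratio. All numerical values are illustrative and are not taken from the paper.

```python
import math

def hover_download_estimate(thrust_n, rotor_radius_m, wing_area_m2,
                            cd_wing=1.2, rho=1.225):
    """Back-of-envelope momentum-theory estimate, not the paper's CFD method.

    thrust_n       rotor thrust [N]
    rotor_radius_m rotor radius [m]
    wing_area_m2   wing area immersed in the rotor wake [m^2] (assumed)
    cd_wing        assumed flat-plate drag coefficient of the wing planform
    rho            air density [kg/m^3]
    """
    disk_area = math.pi * rotor_radius_m ** 2
    v_induced = math.sqrt(thrust_n / (2.0 * rho * disk_area))  # induced velocity at the disk
    v_wake = 2.0 * v_induced                                   # fully contracted wake velocity
    download = 0.5 * rho * v_wake ** 2 * wing_area_m2 * cd_wing
    return download / thrust_n  # download-to-thrust ratio

# Example with illustrative numbers only (not from the paper).
print(f"DL/T ~ {hover_download_estimate(60000.0, 4.0, 10.0):.2f}")
```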
SYRMEP Tomo Project: a graphical user interface for customizing CT reconstruction workflows.
Brun, Francesco; Massimi, Lorenzo; Fratini, Michela; Dreossi, Diego; Billé, Fulvio; Accardo, Agostino; Pugliese, Roberto; Cedola, Alessia
2017-01-01
When considering the acquisition of experimental synchrotron radiation (SR) X-ray CT data, the reconstruction workflow cannot be limited to the essential computational steps of flat fielding and filtered back projection (FBP). More refined image processing is often required, usually to compensate artifacts and enhance the quality of the reconstructed images. In principle, it would be desirable to optimize the reconstruction workflow at the facility during the experiment (beamtime). However, several practical factors affect the image reconstruction part of the experiment and users are likely to conclude the beamtime with sub-optimal reconstructed images. Through an example of application, this article presents SYRMEP Tomo Project (STP), an open-source software tool conceived to let users design custom CT reconstruction workflows. STP has been designed for post-beamtime (off-line use) and for a new reconstruction of past archived data at user's home institution where simple computing resources are available. Releases of the software can be downloaded at the Elettra Scientific Computing group GitHub repository https://github.com/ElettraSciComp/STP-Gui.
NASA Technical Reports Server (NTRS)
Fitzpatrick, Austin J.; Novati, Alexander; Fisher, Diane K.; Leon, Nancy J.; Netting, Ruth
2013-01-01
Space Place Prime is public engagement and education software for use on iPad. It targets a multi-generational audience with news, images, videos, and educational articles from the Space Place Web site and other NASA sources. New content is downloaded daily (or whenever the user accesses the app) via the wireless connection. In addition to the Space Place Web site, several NASA RSS feeds are tapped to provide new content. Content is retained for the previous several days, or some number of editions of each feed. All content is controlled on the server side, so features about the latest news, or changes to any content, can be made without updating the app in the Apple Store. It gathers many popular NASA features into one app. The interface is a boundless, slidable-in-any-direction grid of images, unique for each feature, and iconized as image, video, or article. A tap opens the feature. An alternate list mode presents menus of images, videos, and articles separately. Favorites can be tagged for permanent archive. Facebook, Twitter, and e-mail connections make any feature shareable.
Using Python as a first programming environment for computational physics in developing countries
NASA Astrophysics Data System (ADS)
Akpojotor, Godfrey; Ehwerhemuepha, Louis; Echenim, Myron; Akpojotor, Famous
2011-03-01
Python's unique features, such as its interpreted, multiplatform, and object-oriented nature, as well as its being free and open source software, create the possibility that any user connected to the internet can download the entire package onto any platform, install it, and immediately begin to use it. Thus Python is gaining a reputation as a preferred environment for introducing students and beginners to programming. In Africa, the Python African Tour project has therefore been launched, and we are coordinating its use in computational science. We examine here the challenges and prospects of using Python for computational physics (CP) education in developing countries (DC). We then present our project on using Python to simulate and aid the learning of laboratory experiments, illustrated here by modeling of the simple pendulum, and to visualize phenomena in physics, illustrated here by demonstrating the wave motion of a particle in a varying potential. This project, which trains both teachers and students in CP using Python, can easily be adopted in other developing countries.
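As an illustration of the kind of laboratory simulation described above, the sketch below integrates the simple pendulum equation of motion with the Euler-Cromer method in plain Python. It is not the authors' teaching material, and the parameter values are arbitrary.

```python
import math

def simulate_pendulum(theta0_deg=10.0, length_m=1.0, dt=0.001, t_max=10.0, g=9.81):
    """Euler-Cromer integration of the simple pendulum: d2(theta)/dt2 = -(g/L) sin(theta)."""
    theta = math.radians(theta0_deg)
    omega = 0.0
    trajectory = []
    t = 0.0
    while t <= t_max:
        omega += -(g / length_m) * math.sin(theta) * dt  # update angular velocity
        theta += omega * dt                              # then update angle
        trajectory.append((t, theta))
        t += dt
    return trajectory

# Small-angle check: the period should be close to 2*pi*sqrt(L/g), about 2.0 s for L = 1 m.
data = simulate_pendulum()
```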
Jannovar: a java library for exome annotation.
Jäger, Marten; Wang, Kai; Bauer, Sebastian; Smedley, Damian; Krawitz, Peter; Robinson, Peter N
2014-05-01
Transcript-based annotation and pedigree analysis are two basic steps in the computational analysis of whole-exome sequencing experiments in genetic diagnostics and disease-gene discovery projects. Here, we present Jannovar, a stand-alone Java application as well as a Java library designed to be used in larger software frameworks for exome and genome analysis. Jannovar uses an interval tree to identify all transcripts affected by a given variant, and provides Human Genome Variation Society-compliant annotations both for variants affecting coding sequences and splice junctions as well as untranslated regions and noncoding RNA transcripts. Jannovar can also perform family-based pedigree analysis with Variant Call Format (VCF) files with data from members of a family segregating a Mendelian disorder. Using a desktop computer, Jannovar requires a few seconds to annotate a typical VCF file with exome data. Jannovar is freely available under the BSD2 license. Source code as well as the Java application and library file can be downloaded from http://compbio.charite.de (with tutorial) and https://github.com/charite/jannovar. © 2014 WILEY PERIODICALS, INC.
DIY-style GIS service in mobile navigation system integrated with web and wireless GIS
NASA Astrophysics Data System (ADS)
Yan, Yongbin; Wu, Jianping; Fan, Caiyou; Wang, Minqi; Dai, Sheng
2007-06-01
A mobile navigation system based on a handheld device can not only provide basic GIS services, but can also deliver those services without location limits and with more immediate interaction between users and devices. However, most navigation systems still share common user-experience defects, such as limited map formats, few map resources, and no location sharing. To overcome these defects, we propose a DIY-style GIS service that provides users with a freer software environment and allows them to customize their GIS services. These services include defining the geographical coordinate system of maps, which greatly enlarges the pool of usable map sources; editing vector features, related property information, and hotlink images; customizing the coverage area of downloaded maps via General Packet Radio Service (GPRS); and sharing users' location information via SMS (Short Message Service), which establishes communication between users who need GIS services. The paper introduces the integration of web and wireless GIS services in a mobile navigation system and presents an implementation sample of a DIY-style GIS service in such a system.
ETHERNET BASED EMBEDDED SYSTEM FOR FEL DIAGNOSTICS AND CONTROLS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jianxun Yan; Daniel Sexton; Steven Moore
2006-10-24
An Ethernet-based embedded system has been developed to upgrade the Beam Viewer and Beam Position Monitor (BPM) systems within the free-electron laser (FEL) project at Jefferson Lab. The embedded microcontroller was mounted on the front-end I/O cards, with software packages such as the Experimental Physics and Industrial Control System (EPICS) and the Real-Time Executive for Multiprocessor Systems (RTEMS) running as an Input/Output Controller (IOC). By cross-compiling with EPICS, the RTEMS kernel, IOC device support, and databases can all be downloaded into the microcontroller. The first version of the BPM electronics based on the embedded controller was built and is currently running in our FEL system. A new version of the BPM that will use a Single Board IOC (SBIOC), which integrates a Field Programmable Gate Array (FPGA) and a ColdFire embedded microcontroller, is presently under development. The new system has the features of a low-cost IOC, an open-source real-time operating system, and plug-and-play-like ease of installation and flexibility, and it provides a much more localized solution.
Beryllium10: a free and simple tool for creating and managing group safety data sheets
2014-01-01
Background Countless chemicals and mixtures are used in laboratories today, which all possess their own properties and dangers. Therefore, it is important to brief oneself about possible risks and hazards before doing any experiments. However, this task is laborious and time consuming. Summary Beryllium10 is a program, which supports users by carrying out a large part of the work such as collecting/importing data sets from different providers and compiling most of the information into a single group safety data sheet, which is suitable for having all necessary information at hand while an experiment is in progress. We present here the features of Beryllium10, their implementation, and their design and development criteria and ideas. Conclusion A program for creating and managing of group safety data sheets was developed and released as open source under GPL. The program provides a fast and clear user-interface, and well-conceived design for collecting and managing safety data. It is available for download from the web page http://beryllium.keksecks.de. PMID:24650446
Proof of Concept for a Simple Smartphone Sky Monitor
NASA Astrophysics Data System (ADS)
Kantamneni, Abhilash; Nemiroff, R. J.; Brisbois, C.
2013-01-01
We present a novel approach of obtaining a cloud and bright sky monitor by using a standard smartphone with a downloadable app. The addition of an inexpensive fisheye lens can extend the angular range to the entire sky visible above the device. A preliminary proof of concept image shows an optical limit of about visual magnitude 5 for a 70-second exposure. Support science objectives include cloud monitoring in a manner similar to the more expensive cloud monitors in use at most major astronomical observatories, making expensive observing time at these observatories more efficient. Primary science objectives include bright meteor tracking, bright comet tracking, and monitoring the variability of bright stars. Citizen science objectives include crowd sourcing of many networked sky monitoring smartphones typically in broader support of many of the primary science goals. The deployment of a citizen smartphone array in an active science mode could leverage the sky monitoring data infrastructure to track other non-visual science opportunities, including monitoring the Earth's magnetic field for the effects of solar flares and exhaustive surface coverage for strong seismic events.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-04
... protect the unique and important resources and values of the land for the benefit and enjoyment of present..., paleontological, natural, scientific, recreational, wilderness, wildlife, riparian, historical, educational, and... Uncompahgre Field Offices, or may be downloaded from the following Web site: http://www.blm.gov/co/st/en/nca...
Using Maple to Implement eLearning Integrated with Computer Aided Assessment
ERIC Educational Resources Information Center
Blyth, Bill; Labovic, Aleksandra
2009-01-01
Advanced mathematics courses have been developed and refined by the first author, using an action research methodology, for more than a decade. These courses use the computer algebra system (CAS) Maple in an "immersion mode" where all presentations and student work are done using Maple. Assignments and examinations are Maple files downloaded from…
iDemocracy: Critical Literacy, Civic Engagement, and Podcasting in an Elementary Classroom
ERIC Educational Resources Information Center
Montgomery, Sarah E.
2009-01-01
The present study explored the ways in which the production of digital media, specifically podcasts (i.e., downloadable digital audio files), rooted in the key tenets of critical literacy, can support education for democracy, in addition to the overall benefits and barriers of podcasting in an elementary classroom. The project can be considered a…
A Study of ICT Infrastructure and Access to Educational Information in the Outskirts of Malang
ERIC Educational Resources Information Center
Elmunsyah, Hakkun
2012-01-01
This study aimed to determine the readiness of disadvantaged areas in support of Electronic School Books (BSE), which could be downloaded free of charge by making use of Information Communication Technology (ICT). The present study was descriptive research which was approached quantitatively, and expected to expand the model of development of…
ERIC Educational Resources Information Center
McArthur, Ellen; Kubacki, Krzysztof; Pang, Bo; Alcaraz, Celeste
2017-01-01
This study of job advertisements extends our understanding of how employers, rather than researchers, describe the specific skills and attributes sought in candidates for employment in graduate marketing roles in Australia. The article presents the findings of a content analysis of 359 marketing job advertisements downloaded in 2016, in two…
Agency Video, Audio and Imagery Library
NASA Technical Reports Server (NTRS)
Grubbs, Rodney
2015-01-01
The purpose of this presentation was to inform the ISS International Partners of the new NASA Agency Video, Audio and Imagery Library (AVAIL) website. AVAIL is a new resource for the public to search for and download NASA-related imagery, and is not intended to replace the current process by which the International Partners receive their Space Station imagery products.
Open access publishing, article downloads, and citations: randomised controlled trial
Lewenstein, Bruce V; Simon, Daniel H; Booth, James G; Connolly, Mathew J L
2008-01-01
Objective To measure the effect of free access to the scientific literature on article downloads and citations. Design Randomised controlled trial. Setting 11 journals published by the American Physiological Society. Participants 1619 research articles and reviews. Main outcome measures Article readership (measured as downloads of full text, PDFs, and abstracts) and number of unique visitors (internet protocol addresses). Citations to articles were gathered from the Institute for Scientific Information after one year. Interventions Random assignment on online publication of articles published in 11 scientific journals to open access (treatment) or subscription access (control). Results Articles assigned to open access were associated with 89% more full text downloads (95% confidence interval 76% to 103%), 42% more PDF downloads (32% to 52%), and 23% more unique visitors (16% to 30%), but 24% fewer abstract downloads (−29% to −19%) than subscription access articles in the first six months after publication. Open access articles were no more likely to be cited than subscription access articles in the first year after publication. Fifty nine per cent of open access articles (146 of 247) were cited nine to 12 months after publication compared with 63% (859 of 1372) of subscription access articles. Logistic and negative binomial regression analysis of article citation counts confirmed no citation advantage for open access articles. Conclusions Open access publishing may reach more readers than subscription access publishing. No evidence was found of a citation advantage for open access articles in the first year after publication. The citation advantage from open access reported widely in the literature may be an artefact of other causes. PMID:18669565
30-36 Months: Your Child's Development
Parenting Resource, Feb 10, 2016. Older toddlers are ... go?” Then you two can switch. Downloads: Your Child’s Development: 30–36 Months (PDF, 373 KB).
12-15 Months: Your Child's Development
Parenting Resource, Feb 9, 2016. This is a ... who she is. Downloads: 12–15 Months: Your Child’s Development (PDF, 418 KB).
Generalized Anxiety Disorder: When Worry Gets Out of Control
Download PDF, download ePub, or order a free hardcopy. What Is GAD? Occasional anxiety is a normal part of life. You might worry about things like health, money, or family problems. But people with generalized anxiety ...
Software to Facilitate Remote Sensing Data Access for Disease Early Warning Systems
Liu, Yi; Hu, Jiameng; Snell-Feikema, Isaiah; VanBemmel, Michael S.; Lamsal, Aashis; Wimberly, Michael C.
2015-01-01
Satellite remote sensing produces an abundance of environmental data that can be used in the study of human health. To support the development of early warning systems for mosquito-borne diseases, we developed an open-source, client based software application to enable the Epidemiological Applications of Spatial Technologies (EASTWeb). Two major design decisions were full automation of the discovery, retrieval and processing of remote sensing data from multiple sources, and making the system easily modifiable in response to changes in data availability and user needs. Key innovations that helped to achieve these goals were the implementation of a software framework for data downloading and the design of a scheduler that tracks the complex dependencies among multiple data processing tasks and makes the system resilient to external errors. EASTWeb has been successfully applied to support forecasting of West Nile virus outbreaks in the United States and malaria epidemics in the Ethiopian highlands. PMID:26644779
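The dependency-tracking scheduler described above can be sketched in miniature. The toy code below is not the EASTWeb implementation; it assumes an acyclic dependency graph and shows the general idea of running download and processing tasks in prerequisite order while retrying tasks that fail with external errors.

```python
from collections import deque

def run_tasks(tasks, dependencies, max_retries=3):
    """Run tasks in dependency order with simple retry.

    tasks: {name: callable}; dependencies: {name: [prerequisite names]}.
    Assumes the dependency graph is acyclic.
    """
    done, failed = set(), set()
    queue = deque(tasks)
    while queue:
        name = queue.popleft()
        deps = dependencies.get(name, [])
        if any(dep in failed for dep in deps):
            failed.add(name)          # a prerequisite failed permanently
            continue
        if not all(dep in done for dep in deps):
            queue.append(name)        # prerequisites not finished yet; try again later
            continue
        for _attempt in range(max_retries):
            try:
                tasks[name]()
                done.add(name)
                break
            except Exception:
                continue              # external error: retry
        else:
            failed.add(name)
    return done, failed

# Example: "download" must finish before "process", which must finish before "summarize".
tasks = {"download": lambda: None, "process": lambda: None, "summarize": lambda: None}
deps = {"process": ["download"], "summarize": ["process"]}
print(run_tasks(tasks, deps))
```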
eQuilibrator--the biochemical thermodynamics calculator.
Flamholz, Avi; Noor, Elad; Bar-Even, Arren; Milo, Ron
2012-01-01
The laws of thermodynamics constrain the action of biochemical systems. However, thermodynamic data on biochemical compounds can be difficult to find and is cumbersome to perform calculations with manually. Even simple thermodynamic questions like 'how much Gibbs energy is released by ATP hydrolysis at pH 5?' are complicated excessively by the search for accurate data. To address this problem, eQuilibrator couples a comprehensive and accurate database of thermodynamic properties of biochemical compounds and reactions with a simple and powerful online search and calculation interface. The web interface to eQuilibrator (http://equilibrator.weizmann.ac.il) enables easy calculation of Gibbs energies of compounds and reactions given arbitrary pH, ionic strength and metabolite concentrations. The eQuilibrator code is open-source and all thermodynamic source data are freely downloadable in standard formats. Here we describe the database characteristics and implementation and demonstrate its use.
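The kind of calculation eQuilibrator automates rests on the standard relation Delta_rG' = Delta_rG'° + RT ln Q. The sketch below applies that relation directly; the standard transformed Gibbs energy and the metabolite concentrations are illustrative assumptions, not values taken from the eQuilibrator database.

```python
import math

R = 8.314e-3  # gas constant in kJ mol^-1 K^-1

def reaction_gibbs_energy(delta_g_prime_standard, substrates, products, temp_k=298.15):
    """Delta_rG' = Delta_rG'^o + RT ln(Q), with Q built from molar concentrations.

    substrates/products: dicts mapping metabolite name -> concentration [M].
    """
    ln_q = (sum(math.log(c) for c in products.values())
            - sum(math.log(c) for c in substrates.values()))
    return delta_g_prime_standard + R * temp_k * ln_q

# Illustrative only: assume Delta_rG'^o of about -30 kJ/mol for ATP hydrolysis at a given
# pH and ionic strength, and typical cellular concentrations (not eQuilibrator values).
dg = reaction_gibbs_energy(-30.0,
                           substrates={"ATP": 5e-3},
                           products={"ADP": 0.5e-3, "Pi": 5e-3})
print(f"Delta_rG' ~ {dg:.1f} kJ/mol")
```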
Microsoft Biology Initiative: .NET Bioinformatics Platform and Tools
Diaz Acosta, B.
2011-01-01
The Microsoft Biology Initiative (MBI) is an effort in Microsoft Research to bring new technology and tools to the area of bioinformatics and biology. This initiative is comprised of two primary components, the Microsoft Biology Foundation (MBF) and the Microsoft Biology Tools (MBT). MBF is a language-neutral bioinformatics toolkit built as an extension to the Microsoft .NET Framework—initially aimed at the area of Genomics research. Currently, it implements a range of parsers for common bioinformatics file formats; a range of algorithms for manipulating DNA, RNA, and protein sequences; and a set of connectors to biological web services such as NCBI BLAST. MBF is available under an open source license, and executables, source code, demo applications, documentation and training materials are freely downloadable from http://research.microsoft.com/bio. MBT is a collection of tools that enable biology and bioinformatics researchers to be more productive in making scientific discoveries.
MetaQuant: a tool for the automatic quantification of GC/MS-based metabolome data.
Bunk, Boyke; Kucklick, Martin; Jonas, Rochus; Münch, Richard; Schobert, Max; Jahn, Dieter; Hiller, Karsten
2006-12-01
MetaQuant is a Java-based program for the automatic and accurate quantification of GC/MS-based metabolome data. In contrast to other programs MetaQuant is able to quantify hundreds of substances simultaneously with minimal manual intervention. The integration of a self-acting calibration function allows the parallel and fast calibration for several metabolites simultaneously. Finally, MetaQuant is able to import GC/MS data in the common NetCDF format and to export the results of the quantification into Systems Biology Markup Language (SBML), Comma Separated Values (CSV) or Microsoft Excel (XLS) format. MetaQuant is written in Java and is available under an open source license. Precompiled packages for the installation on Windows or Linux operating systems are freely available for download. The source code as well as the installation packages are available at http://bioinformatics.org/metaquant
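The calibration step mentioned above boils down to fitting a response curve per metabolite and inverting it for unknown samples. The sketch below is a plain-Python stand-in, not MetaQuant's Java implementation; it assumes a simple linear detector response, and the standard concentrations and peak areas are hypothetical.

```python
def fit_linear_calibration(concentrations, peak_areas):
    """Ordinary least-squares fit of area = a * concentration + b for one metabolite."""
    n = len(concentrations)
    mean_x = sum(concentrations) / n
    mean_y = sum(peak_areas) / n
    sxx = sum((x - mean_x) ** 2 for x in concentrations)
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(concentrations, peak_areas))
    a = sxy / sxx
    b = mean_y - a * mean_x
    return a, b

def quantify(peak_area, a, b):
    """Invert the calibration curve to estimate a concentration from a measured peak area."""
    return (peak_area - b) / a

# Hypothetical calibration standards (concentration in mM, arbitrary peak areas).
a, b = fit_linear_calibration([0.1, 0.5, 1.0, 2.0], [1200, 5600, 11000, 22500])
print(quantify(8000, a, b))  # estimated concentration of an unknown sample
```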
KLASS: Kennedy Launch Academy Simulation System
NASA Technical Reports Server (NTRS)
Garner, Lesley C.
2007-01-01
The software provides access to simulations of several sophisticated scientific instruments: a Scanning Electron Microscope (SEM), a Light Microscope, a Scanning Probe Microscope (covering Scanning Tunneling, Atomic Force, and Magnetic Force microscopy), and an Energy Dispersive Spectrometer for the SEM. Flash animation videos explain how each of the instruments works, how they are used at NASA, and how samples are prepared. Measuring and labeling tools are provided with each instrument. Users gain hands-on experience controlling the virtual instruments to conduct investigations, much as real scientists at NASA do. The software has a very open architecture, is open source on SourceForge, and makes extensive use of XML. The target audience is high school and entry-level college students. "Many beginning students never get closer to an electron microscope than the photos in their textbooks. But anyone can get a sense of what the instrument can do by downloading this simulator from NASA's Kennedy Space Center." Science Magazine, April 8th, 2005.
McEntire, Robin; Szalkowski, Debbie; Butler, James; Kuo, Michelle S; Chang, Meiping; Chang, Man; Freeman, Darren; McQuay, Sarah; Patel, Jagruti; McGlashen, Michael; Cornell, Wendy D; Xu, Jinghai James
2016-05-01
External content sources such as MEDLINE(®), National Institutes of Health (NIH) grants and conference websites provide access to the latest breaking biomedical information, which can inform pharmaceutical and biotechnology company pipeline decisions. The value of the sites for industry, however, is limited by the use of the public internet, the limited synonyms, the rarity of batch searching capability and the disconnected nature of the sites. Fortunately, many sites now offer their content for download and we have developed an automated internal workflow that uses text mining and tailored ontologies for programmatic search and knowledge extraction. We believe such an efficient and secure approach provides a competitive advantage to companies needing access to the latest information for a range of use cases and complements manually curated commercial sources. Copyright © 2016. Published by Elsevier Ltd.
Waese, Jamie; Fan, Jim; Pasha, Asher; Yu, Hans; Fucile, Geoffrey; Shi, Ruian; Cumming, Matthew; Kelley, Lawrence A; Sternberg, Michael J; Krishnakumar, Vivek; Ferlanti, Erik; Miller, Jason; Town, Chris; Stuerzlinger, Wolfgang; Provart, Nicholas J
2017-08-01
A big challenge in current systems biology research arises when different types of data must be accessed from separate sources and visualized using separate tools. The high cognitive load required to navigate such a workflow is detrimental to hypothesis generation. Accordingly, there is a need for a robust research platform that incorporates all data and provides integrated search, analysis, and visualization features through a single portal. Here, we present ePlant (http://bar.utoronto.ca/eplant), a visual analytic tool for exploring multiple levels of Arabidopsis thaliana data through a zoomable user interface. ePlant connects to several publicly available web services to download genome, proteome, interactome, transcriptome, and 3D molecular structure data for one or more genes or gene products of interest. Data are displayed with a set of visualization tools that are presented using a conceptual hierarchy from big to small, and many of the tools combine information from more than one data type. We describe the development of ePlant in this article and present several examples illustrating its integrative features for hypothesis generation. We also describe the process of deploying ePlant as an "app" on Araport. Building on readily available web services, the code for ePlant is freely available for any other biological species research. © 2017 American Society of Plant Biologists. All rights reserved.
AMPA: an automated web server for prediction of protein antimicrobial regions.
Torrent, Marc; Di Tommaso, Paolo; Pulido, David; Nogués, M Victòria; Notredame, Cedric; Boix, Ester; Andreu, David
2012-01-01
AMPA is a web application for assessing the antimicrobial domains of proteins, with a focus on the design of new antimicrobial drugs. The application provides fast discovery of antimicrobial patterns in proteins that can be used to develop new peptide-based drugs against pathogens. Results are shown in a user-friendly graphical interface and can be downloaded as raw data for later examination. AMPA is freely available on the web at http://tcoffee.crg.cat/apps/ampa. The source code is also available on the web. Contact: marc.torrent@upf.edu; david.andreu@upf.edu. Supplementary data are available at Bioinformatics online.
Gobe: an interactive, web-based tool for comparative genomic visualization.
Pedersen, Brent S; Tang, Haibao; Freeling, Michael
2011-04-01
Gobe is a web-based tool for viewing comparative genomic data. It supports viewing multiple genomic regions simultaneously. Its simple text format and flash-based rendering make it an interactive, exploratory research tool. Gobe can be used without installation through our web service, or downloaded and customized with stylesheets and javascript callback functions. Gobe is a flash application that runs in all modern web-browsers. The full source-code, including that for the online web application is available under the MIT license at: http://github.com/brentp/gobe. Sample applications are hosted at http://try-gobe.appspot.com/ and http://synteny.cnr.berkeley.edu/gobe-app/.
jSquid: a Java applet for graphical on-line network exploration.
Klammer, Martin; Roopra, Sanjit; Sonnhammer, Erik L L
2008-06-15
jSquid is a graph visualization tool for exploring graphs from protein-protein interaction or functional coupling networks. The tool was designed for the FunCoup web site, but can be used for any similar network exploring purpose. The program offers various visualization and graph manipulation techniques to increase the utility for the user. jSquid is available for direct usage and download at http://jSquid.sbc.su.se including source code under the GPLv3 license, and input examples. It requires Java version 5 or higher to run properly. erik.sonnhammer@sbc.su.se Supplementary data are available at Bioinformatics online.
NetProt: Complex-based Feature Selection.
Goh, Wilson Wen Bin; Wong, Limsoon
2017-08-04
Protein complex-based feature selection (PCBFS) provides unparalleled reproducibility with high phenotypic relevance on proteomics data. Currently, there are five PCBFS paradigms, but not all representative methods have been implemented or made readily available. To allow general users to take advantage of these methods, we developed the R-package NetProt, which provides implementations of representative feature-selection methods. NetProt also provides methods for generating simulated differential data and generating pseudocomplexes for complex-based performance benchmarking. The NetProt open source R package is available for download from https://github.com/gohwils/NetProt/releases/ , and online documentation is available at http://rpubs.com/gohwils/204259 .
A seamless, high-resolution digital elevation model (DEM) of the north-central California coast
Foxgrover, Amy C.; Barnard, Patrick L.
2012-01-01
A seamless, 2-meter resolution digital elevation model (DEM) of the north-central California coast has been created from the most recent high-resolution bathymetric and topographic datasets available. The DEM extends approximately 150 kilometers along the California coastline, from Half Moon Bay north to Bodega Head. Coverage extends inland to an elevation of +20 meters and offshore to at least the 3 nautical mile limit of state waters. This report describes the procedures of DEM construction, details the input data sources, and provides the DEM for download in both ESRI Arc ASCII and GeoTIFF file formats with accompanying metadata.
Smart Location Database - Download
The Smart Location Database (SLD) summarizes over 80 demographic, built environment, transit service, and destination accessibility attributes for every census block group in the United States. Future updates to the SLD will include additional attributes which summarize the relative location efficiency of a block group when compared to other block groups within the same metropolitan region. EPA also plans to periodically update attributes and add new attributes to reflect latest available data. A log of SLD updates is included in the SLD User Guide. See the user guide for a full description of data sources, data currency, and known limitations: https://edg.epa.gov/data/Public/OP/SLD/SLD_userguide.pdf
PathJam: a new service for integrating biological pathway information.
Glez-Peña, Daniel; Reboiro-Jato, Miguel; Domínguez, Rubén; Gómez-López, Gonzalo; Pisano, David G; Fdez-Riverola, Florentino
2010-10-28
Biological pathways are crucial to much of the scientific research today including the study of specific biological processes related with human diseases. PathJam is a new comprehensive and freely accessible web-server application integrating scattered human pathway annotation from several public sources. The tool has been designed for both (i) being intuitive for wet-lab users providing statistical enrichment analysis of pathway annotations and (ii) giving support to the development of new integrative pathway applications. PathJam’s unique features and advantages include interactive graphs linking pathways and genes of interest, downloadable results in fully compatible formats, GSEA compatible output files and a standardized RESTful API.
Lee, Jasper; Zhang, Jianguo; Park, Ryan; Dagliyan, Grant; Liu, Brent; Huang, H K
2012-07-01
A Molecular Imaging Data Grid (MIDG) was developed to address current informatics challenges in the archival, sharing, search, and distribution of preclinical imaging studies between animal imaging facilities and investigator sites. This manuscript presents a 2nd generation MIDG, replacing the Globus Toolkit with a new system architecture that implements the IHE XDS-i integration profile. Implementation and evaluation were conducted using a 3-site interdisciplinary test-bed at the University of Southern California. The 2nd generation MIDG design replaces the initial design's Globus Toolkit with dedicated web services and XML-based messaging for the management and delivery of multi-modality DICOM imaging datasets. The Cross-enterprise Document Sharing for Imaging (XDS-i) integration profile from the field of enterprise radiology informatics was adopted into the MIDG design because streamlined image registration, management, and distribution dataflows are needed in preclinical imaging informatics systems just as they are in enterprise PACS applications. Implementation of the MIDG is demonstrated at the University of Southern California Molecular Imaging Center (MIC) and two other sites with specified hardware, software, and network bandwidth. Evaluation of the MIDG involves data upload, download, and fault-tolerance testing scenarios using multi-modality animal imaging datasets collected at the USC Molecular Imaging Center. The upload, download, and fault-tolerance tests of the MIDG were performed multiple times using 12 collected animal study datasets. Upload and download times demonstrated reproducibility and improved real-world performance. Fault-tolerance tests showed that automated failover between Grid Node Servers has minimal impact on normal download times. Building upon the 1st generation concepts and experiences, the 2nd generation MIDG system improves the accessibility of disparate animal-model molecular imaging datasets to users outside a molecular imaging facility's LAN using a new architecture, dataflow, and dedicated DICOM-based management web services. The productivity and efficiency of preclinical research for translational science investigators has been further streamlined for multi-center study data registration, management, and distribution.
Uptake and Usage of IntelliCare: A Publicly Available Suite of Mental Health and Well-Being Apps.
Lattie, Emily G; Schueller, Stephen M; Sargent, Elizabeth; Stiles-Shields, Colleen; Tomasino, Kathryn Noth; Corden, Marya E; Begale, Mark; Karr, Chris J; Mohr, David C
2016-05-01
Treatments for depression and anxiety have several behavioral and psychological targets and rely on varied strategies. Digital mental health treatments often employ feature-rich approaches addressing several targets and strategies. These treatments, often optimized for desktop computer use, are at odds with the ways people use smartphone applications. Smartphone use tends to focus on singular functions with easy navigation to desired tools. The IntelliCare suite of apps was developed to address the discrepancy between the need for diverse behavioral strategies and the constraints imposed by typical app use. Each app focuses on one strategy for a limited subset of clinical aims, all pertinent to depression and anxiety. This study presents the uptake and usage of apps from the IntelliCare suite following an open deployment on a large app marketplace. Thirteen lightweight apps, including 12 interactive apps and one Hub app that coordinates use across those interactive apps, were developed and made free to download on the Google Play store. De-identified app usage data from the first year of IntelliCare suite deployment were analyzed for this study. In the first year of public availability, 5,210 individuals downloaded one or more of the IntelliCare apps, for a total of 10,131 downloads. Nearly a third of these individuals (31.8%) downloaded more than one of these apps. The modal number of launches for each of the apps was 1; however, the mean number of app launches per app ranged from 3.10 to 16.98, reflecting considerable variability in the use of each app. The use rate of the IntelliCare suite of apps is higher than that reported for public deployments of other comparable digital resources. Our findings suggest that people will use multiple apps and provide support for the concept of app suites as a useful strategy for providing diverse behavioral strategies.
Alemu, Dawit; Danh, Thu; Baker, Jason V; Carrico, Adam W
2016-01-01
Background The use of stimulant drugs among men who have sex with men (MSM) with human immunodeficiency virus (HIV) is associated with decreased odds of antiretroviral therapy (ART) adherence and elevated risk of forward HIV transmission. Advancing tailored and innovative mobile phone–based ART adherence app interventions for stimulant-using HIV-positive MSM requires greater understanding of their needs and preferences in this emerging area. Objective The purpose of this study is to (1) assess reasons that stimulant-using HIV-positive MSM download and sustain their use of mobile phone apps in general, and (2) obtain feedback on features and functions that these men prefer in a mobile phone app to optimize their ART adherence. Methods Focus groups were conducted with stimulant-using HIV-positive MSM (24-57 years of age; mostly non-Hispanic white; 42% once a week or more frequent stimulant drug use) in San Francisco and Minneapolis. Our aim was to explore the mobile phone app features and functions that they considered when deciding to download and sustain their use of general apps over time, as well as specific features and functions that they would like to see incorporated into an ART adherence mobile app. Focus groups were audiorecorded and transcribed verbatim. Thematic analysis was applied to transcripts using line-by-line open coding, with codes organized into meaningful themes. Results Men reported that they currently had a variety of health and wellness, social media and networking, gaming and entertainment, and utility apps on their mobile phones. Downloading apps to their mobile phones was influenced by the cost of the app, recommendations by a trusted source, and the time it takes to download. In addition, downloading and sustained use of apps were more likely to occur when men had control over most features of the app and apps were perceived to be useful, engaging, secure, and credible. Participants suggested that ART adherence mobile phone apps include social networking features, connections to local resources and their medical chart, and breaking HIV news and updates. Although some men expressed concerns about daily self-monitoring of HIV medication doses, many appreciated receiving a summary of their medication adherence over time and suggested that feedback about missed doses be delivered in an encouraging and humorous manner. Conclusions In this study, we were able to recruit a relatively high proportion (42%) of HIV-positive MSM reporting weekly or more stimulant use. These results suggest critical design elements that may need to be considered during development of ART adherence-related mobile phone apps for this, and possibly other, high-risk groups. In particular, finding the optimal balance of security, engagement, usefulness, control capabilities, and credibility will be critical to sustained use of HIV treatment apps. PMID:27084049
The Starchive: An open access, open source archive of nearby and young stars and their planets
NASA Astrophysics Data System (ADS)
Tanner, Angelle; Gelino, Chris; Elfeki, Mario
2015-12-01
Historically, astronomers have utilized a piecemeal set of archives such as SIMBAD, the Washington Double Star Catalog, various exoplanet encyclopedias and electronic tables from the literature to cobble together stellar and exo-planetary parameters in the absence of corresponding images and spectra. As the search for planets around young stars through direct imaging, transits and infrared/optical radial velocity surveys blossoms, there is a void in the resources available for creating comprehensive lists of the stellar parameters of nearby stars, especially for important parameters such as metallicity and stellar activity indicators. For direct imaging surveys, we need better resources for downloading existing high-contrast images to help confirm new discoveries and find ideal target stars. Once we have discovered new planets, we need a uniform database of stellar and planetary parameters from which to look for correlations to better understand the formation and evolution of these systems. As a solution to these issues, we are developing the Starchive - an open access stellar archive in the spirit of the open exoplanet catalog, the Kepler Community Follow-up Program and many others. The archive will allow users to download various datasets, upload new images, spectra and metadata, and will contain multiple plotting tools to use in presentations and data interpretations. While we will highly regulate and constantly validate the data being placed into our archive, the open nature of its design is intended to allow the database to be expanded efficiently and have a level of versatility which is necessary in today's fast-moving, big data community. Finally, the front-end scripts will be placed on GitHub and users will be encouraged to contribute new plotting tools. Here, I will introduce the community to the content and expected capabilities of the archive and query the audience for community feedback.
Full-text, Downloading, & Other Issues.
ERIC Educational Resources Information Center
Tenopir, Carol
1983-01-01
Issues having a possible impact on online search services in libraries are discussed including full text databases, front-end processors which translate user's input into the command language of an appropriate system, downloading to create personal files from commercial databases, and pricing. (EJS)
Code of Federal Regulations, 2014 CFR
2014-07-01
... 385.21 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND... MAKING AND DISTRIBUTING OF PHYSICAL AND DIGITAL PHONORECORDS Limited Offerings, Mixed Service Bundles..., permanent digital downloads, restricted downloads or ringtones; (3) In the case of a music bundle, the...
Clean Air Status and Trends Network (CASTNET) Download Data Module
The CASTNET Download Data module allows users to select, view, and download CASTNET data (Raw, Aggregate, Modeled & Factual Data) based on user selections. CASTNET sites are located in or near rural areas and sensitive ecosystems collecting data on ambient levels of pollutants where urban influences are minimal. CASTNET, which was initiated in 1986, is able to provide data needed to assess and report on geographic patterns and long-term temporal trends in ambient air pollution and dry atmospheric deposition. CASTNET can also be used to track changes in measurements associated with climate change (such as temperature and precipitation).
BrEPS 2.0: Optimization of sequence pattern prediction for enzyme annotation.
Dudek, Christian-Alexander; Dannheim, Henning; Schomburg, Dietmar
2017-01-01
The prediction of gene functions is crucial for a large number of different life science areas. Faster high throughput sequencing techniques generate more and larger datasets. The manual annotation by classical wet-lab experiments is not suitable for these large amounts of data. We showed earlier that the automatic sequence pattern-based BrEPS protocol, based on manually curated sequences, can be used for the prediction of enzymatic functions of genes. The growing sequence databases provide the opportunity for more reliable patterns, but are also a challenge for the implementation of automatic protocols. We reimplemented and optimized the BrEPS pattern generation to be applicable for larger datasets in an acceptable timescale. Primary improvement of the new BrEPS protocol is the enhanced data selection step. Manually curated annotations from Swiss-Prot are used as reliable source for function prediction of enzymes observed on protein level. The pool of sequences is extended by highly similar sequences from TrEMBL and SwissProt. This allows us to restrict the selection of Swiss-Prot entries, without losing the diversity of sequences needed to generate significant patterns. Additionally, a supporting pattern type was introduced by extending the patterns at semi-conserved positions with highly similar amino acids. Extended patterns have an increased complexity, increasing the chance to match more sequences, without losing the essential structural information of the pattern. To enhance the usability of the database, we introduced enzyme function prediction based on consensus EC numbers and IUBMB enzyme nomenclature. BrEPS is part of the Braunschweig Enzyme Database (BRENDA) and is available on a completely redesigned website and as download. The database can be downloaded and used with the BrEPScmd command line tool for large scale sequence analysis. The BrEPS website and downloads for the database creation tool, command line tool and database are freely accessible at http://breps.tu-bs.de.
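The idea of deriving sequence patterns with conserved, semi-conserved, and variable positions can be illustrated compactly. The toy function below is not the BrEPS protocol; it assumes a small set of pre-aligned, equal-length sequences and an arbitrary threshold, and emits a crude regular-expression pattern of the PROSITE flavor.

```python
import re

def pattern_from_alignment(aligned_seqs, min_fraction=0.8):
    """Derive a crude regex pattern from equal-length aligned sequences.

    A column becomes a literal if one residue is fully conserved, a character
    class if the two most common residues cover min_fraction of the column
    (a stand-in for a semi-conserved position), and '.' otherwise.
    """
    parts = []
    n = len(aligned_seqs)
    for col in zip(*aligned_seqs):
        counts = {}
        for aa in col:
            counts[aa] = counts.get(aa, 0) + 1
        ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
        if ranked[0][1] == n:
            parts.append(ranked[0][0])                      # fully conserved position
        elif ranked[0][1] + (ranked[1][1] if len(ranked) > 1 else 0) >= min_fraction * n:
            parts.append("[" + "".join(aa for aa, _ in ranked[:2]) + "]")  # semi-conserved
        else:
            parts.append(".")                                # variable position
    return re.compile("".join(parts))

pat = pattern_from_alignment(["GDSAGG", "GDSVGG", "GDSAGA"])
print(pat.pattern)  # GDS[AV]G[GA]
```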
w4CSeq: software and web application to analyze 4C-seq data.
Cai, Mingyang; Gao, Fan; Lu, Wange; Wang, Kai
2016-11-01
Circularized Chromosome Conformation Capture followed by deep sequencing (4C-Seq) is a powerful technique to identify genome-wide partners interacting with a pre-specified genomic locus. Here, we present a computational and statistical approach to analyze 4C-Seq data generated from both enzyme digestion and sonication fragmentation-based methods. We implemented a command line software tool and a web interface called w4CSeq, which takes in the raw 4C sequencing data (FASTQ files) as input, performs automated statistical analysis and presents results in a user-friendly manner. Besides providing users with the list of candidate interacting sites/regions, w4CSeq generates figures showing genome-wide distribution of interacting regions, and sketches the enrichment of key features such as TSSs, TTSs, CpG sites and DNA replication timing around 4C sites. Users can establish their own web server by downloading the source code at https://github.com/WGLab/w4CSeq. Additionally, a demo web server is available at http://w4cseq.wglab.org. Contact: kaiwang@usc.edu or wangelu@usc.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Thornber, Carl R.; Sherrod, David R.; Siems, David F.; Heliker, Christina C.; Meeker, Gregory P.; Oscarson, Robert L.; Kauahikaua, James P.
2002-01-01
This report presents major-element geochemical data for glasses and whole-rock aliquots among 523 lava samples collected near the vent on Kilauea's east rift zone between September 1994 and October 2001. Information on sample collection, analysis techniques and analytical standard reproducibility is presented as a PDF file, which also includes a detailed explanation of the categories of sample information presented in the database spreadsheet. The sample database is downloadable as a separate Microsoft Excel file.
Text data extraction for a prospective, research-focused data mart: implementation and validation
2012-01-01
Background Translational research typically requires data abstracted from medical records as well as data collected specifically for research. Unfortunately, many data within electronic health records are represented as text that is not amenable to aggregation for analyses. We present a scalable open source SQL Server Integration Services package, called Regextractor, for including regular expression parsers into a classic extract, transform, and load workflow. We have used Regextractor to abstract discrete data from textual reports from a number of ‘machine generated’ sources. To validate this package, we created a pulmonary function test data mart and analyzed the quality of the data mart versus manual chart review. Methods Eleven variables from pulmonary function tests performed closest to the initial clinical evaluation date were studied for 100 randomly selected subjects with scleroderma. One research assistant manually reviewed, abstracted, and entered relevant data into a database. Correlation with data obtained from the automated pulmonary function test data mart within the Northwestern Medical Enterprise Data Warehouse was determined. Results There was a near perfect (99.5%) agreement between results generated from the Regextractor package and those obtained via manual chart abstraction. The pulmonary function test data mart has been used subsequently to monitor disease progression of patients in the Northwestern Scleroderma Registry. In addition to the pulmonary function test example presented in this manuscript, the Regextractor package has been used to create cardiac catheterization and echocardiography data marts. The Regextractor package was released as open source software in October 2009 and has been downloaded 552 times as of 6/1/2012. Conclusions Collaboration between clinical researchers and biomedical informatics experts enabled the development and validation of a tool (Regextractor) to parse, abstract and assemble structured data from text data contained in the electronic health record. Regextractor has been successfully used to create additional data marts in other medical domains and is available to the public. PMID:22970696
Text data extraction for a prospective, research-focused data mart: implementation and validation.
Hinchcliff, Monique; Just, Eric; Podlusky, Sofia; Varga, John; Chang, Rowland W; Kibbe, Warren A
2012-09-13
Translational research typically requires data abstracted from medical records as well as data collected specifically for research. Unfortunately, many data within electronic health records are represented as text that is not amenable to aggregation for analyses. We present a scalable open source SQL Server Integration Services package, called Regextractor, for including regular expression parsers into a classic extract, transform, and load workflow. We have used Regextractor to abstract discrete data from textual reports from a number of 'machine generated' sources. To validate this package, we created a pulmonary function test data mart and analyzed the quality of the data mart versus manual chart review. Eleven variables from pulmonary function tests performed closest to the initial clinical evaluation date were studied for 100 randomly selected subjects with scleroderma. One research assistant manually reviewed, abstracted, and entered relevant data into a database. Correlation with data obtained from the automated pulmonary function test data mart within the Northwestern Medical Enterprise Data Warehouse was determined. There was a near perfect (99.5%) agreement between results generated from the Regextractor package and those obtained via manual chart abstraction. The pulmonary function test data mart has been used subsequently to monitor disease progression of patients in the Northwestern Scleroderma Registry. In addition to the pulmonary function test example presented in this manuscript, the Regextractor package has been used to create cardiac catheterization and echocardiography data marts. The Regextractor package was released as open source software in October 2009 and has been downloaded 552 times as of 6/1/2012. Collaboration between clinical researchers and biomedical informatics experts enabled the development and validation of a tool (Regextractor) to parse, abstract and assemble structured data from text data contained in the electronic health record. Regextractor has been successfully used to create additional data marts in other medical domains and is available to the public.
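As a minimal sketch of the regular-expression extraction approach described above: the report text, field names, and patterns below are hypothetical, since the actual report layouts and Regextractor expressions are not given in the abstract.

```python
import re

# Hypothetical pulmonary function test report text.
report = "FVC: 2.81 L (78% pred)  FEV1: 2.10 L (74% pred)  DLCO: 15.2 mL/min/mmHg"

# Hypothetical field patterns of the kind a regex-based extractor would apply.
patterns = {
    "fvc_liters":  r"FVC:\s*([\d.]+)\s*L",
    "fev1_liters": r"FEV1:\s*([\d.]+)\s*L",
    "dlco":        r"DLCO:\s*([\d.]+)",
}

record = {}
for field, pattern in patterns.items():
    match = re.search(pattern, report)
    record[field] = float(match.group(1)) if match else None

print(record)   # {'fvc_liters': 2.81, 'fev1_liters': 2.1, 'dlco': 15.2}
```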
Semantically optiMize the dAta seRvice operaTion (SMART) system for better data discovery and access
NASA Astrophysics Data System (ADS)
Yang, C.; Huang, T.; Armstrong, E. M.; Moroni, D. F.; Liu, K.; Gui, Z.
2013-12-01
Abstract: We present a Semantically optiMize the dAta seRvice operaTion (SMART) system for better data discovery and access across the NASA data systems, the Global Earth Observation System of Systems (GEOSS) Clearinghouse and Data.gov, helping scientists select Earth observation data that better fit their needs. The work covers the following aspects: 1. Integrating and interfacing the SMART system to include the functionality of a) semantic reasoning based on Jena, an open source semantic reasoning engine, b) semantic similarity calculation, c) recommendation based on spatiotemporal, semantic, and user workflow patterns, and d) ranking results based on similarity between search terms and data ontology. 2. Collaborating with data user communities to a) capture science data ontology and record relevant ontology triple stores, b) analyze and mine user search and download patterns, c) integrate SMART into metadata-centric discovery systems for community-wide usage and feedback, and d) customize the data discovery, search, and access user interface to include the ranked results, recommendation components, and semantics-based navigation. 3. Laying the groundwork to interface the SMART system with other data search and discovery systems as an open source data search and discovery solution. The SMART system leverages NASA, GEO, and FGDC data discovery, search, and access for the Earth science community by enabling scientists to readily discover and access data appropriate to their endeavors, increasing the efficiency of data exploration and decreasing the time that scientists must spend searching, downloading, and processing the datasets most applicable to their research. By incorporating the SMART system, the time devoted to discovering the most applicable dataset should be substantially reduced, thereby reducing the number of user inquiries and, likewise, the time and resources expended by a data center in addressing them. Keywords: EarthCube; ECHO; DAACs; GeoPlatform; Geospatial Cyberinfrastructure
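A toy illustration of similarity-based ranking of datasets against a user query follows; it is a deliberately simplified stand-in for the ontology-driven similarity and ranking described above, and all dataset names and keyword sets are made up.

```python
def jaccard(a: set, b: set) -> float:
    """Overlap-based similarity between a query and a dataset's keyword set."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Hypothetical query terms and dataset keyword sets.
query = {"sea", "surface", "temperature"}
datasets = {
    "ghrsst_l4_sst": {"sea", "surface", "temperature", "sst", "ocean"},
    "modis_chlor_a": {"chlorophyll", "ocean", "color", "surface"},
    "gpm_precip":    {"precipitation", "rain", "rate"},
}

ranked = sorted(datasets, key=lambda name: jaccard(query, datasets[name]), reverse=True)
for name in ranked:
    print(f"{name}\t{jaccard(query, datasets[name]):.2f}")
```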
Proteus - A Free and Open Source Sensor Observation Service (SOS) Client
NASA Astrophysics Data System (ADS)
Henriksson, J.; Satapathy, G.; Bermudez, L. E.
2013-12-01
The Earth's 'electronic skin' is becoming ever more sophisticated with a growing number of sensors measuring everything from seawater salinity levels to atmospheric pressure. To further the scientific application of this data collection effort, it is important to make the data easily available to anyone who wants to use it. Making Earth Science data readily available will allow the data to be used in new and potentially groundbreaking ways. The US National Science and Technology Council made this clear in its most recent National Strategy for Civil Earth Observations report, when it remarked that Earth observations 'are often found to be useful for additional purposes not foreseen during the development of the observation system'. On the road to this goal the Open Geospatial Consortium (OGC) is defining uniform data formats and service interfaces to facilitate the discovery and access of sensor data. This is being done through the Sensor Web Enablement (SWE) stack of standards, which include the Sensor Observation Service (SOS), Sensor Model Language (SensorML), Observations & Measurements (O&M) and Catalog Service for the Web (CSW). End-users do not have to use these standards directly, but can use smart tools that leverage and implement them. We have developed such a tool named Proteus. Proteus is an open-source sensor data discovery client. The goal of Proteus is to be a general-purpose client that can be used by anyone for discovering and accessing sensor data via OGC-based services. Proteus is a desktop client and supports a straightforward workflow for finding sensor data. The workflow takes the user through the process of selecting appropriate services, bounding boxes, observed properties, time periods and other search facets. NASA World Wind is used to display the matching sensor offerings on a map. Data from any sensor offering can be previewed in a time series. The user can download data from a single sensor offering, or download data in bulk from all matching sensor offerings. Proteus leverages NASA World Wind's WMS capabilities and allows overlaying sensor offerings on top of any map. Specific search criteria (i.e. user discoveries) can be saved and later restored. Proteus supports two user types: 1) the researcher/scientist interested in discovering and downloading specific sensor data as input to research processes, and 2) the data manager responsible for maintaining sensor data services (e.g. SOSs) who wants to ensure proper data and metadata delivery, verify sensor data, and receive sensor data alerts. Proteus has a Web-based companion product named the Community Hub that is used to generate sensor data alerts. Alerts can be received via an RSS feed, viewed in a Web browser or displayed directly in Proteus via a Web-based API. To advance the vision of making Earth Science data easily discoverable and accessible to end-users, professionals or laypeople, Proteus is available as open-source on GitHub (https://github.com/intelligentautomation/proteus).
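For orientation, a scripted request in the style of the OGC SOS 2.0 key-value-pair binding might look like the following. The endpoint, offering, and observed-property identifiers are hypothetical, and the exact parameter values a real service accepts should be taken from its GetCapabilities response rather than from this sketch.

```python
import requests

# Hypothetical SOS endpoint and identifiers; parameter names follow the
# OGC SOS 2.0 key-value-pair binding.
endpoint = "https://example.org/sos/kvp"
params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "urn:example:offering:buoy-42",
    "observedProperty": "sea_water_salinity",
    "temporalFilter": "om:phenomenonTime,2013-07-01T00:00:00Z/2013-07-02T00:00:00Z",
}

response = requests.get(endpoint, params=params, timeout=60)
response.raise_for_status()
print(response.text[:500])   # O&M-encoded observations (XML)
```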
ERIC Educational Resources Information Center
Tettegah, Sharon Y., Ed.; Hunter, Richard C, Ed.
2006-01-01
In today's society where most students own MP3 players, engage in constant instant messaging and downloading from the Internet, more than ever school administrators and staff should be aware of issues in administration, policy, and applications. This book provides a comprehensive presentation of current policies and practices of technology in…
Patrick C. Tobin; James L. Frazier
2009-01-01
The continual development of technology opens many new and exciting doors in all walks of life, including science. Undoubtedly, we all have benefited from the ability to rapidly disseminate and acquire scientific information. Published articles can be downloaded from the Internet even prior to their "actual" publication date, requests for pdf reprints of...
Annotated Bibliography of Research in the Teaching of English
ERIC Educational Resources Information Center
Beach, Richard; Brendler, Beth; Dillon, Deborah; Dockter, Jessie; Ernst, Stacy; Frederick, Amy; Galda, Lee; Helman, Lori; Kapoor, Richa; Ngo, Bic; O'Brien, David; Scharber, Cassie; Jorgensen, Karen; Liang, Lauren; Braaksma, Martine; Janssen, Tanja
2010-01-01
This article presents an annotated bibliography of "Research in the Teaching of English" (RTE). The 2010 version of the bibliography involves a major change--the bibliography is available solely as a downloadable pdf file at http://www.ncte.org/journals/rte/issues/v45-2. As the length of the bibliography has grown from 15 pages in 2003 to 88 pages…
Comparison of Online Costs Using 1200-BPS and 2400-BPS Modems.
ERIC Educational Resources Information Center
MacMillan, Donald
1992-01-01
Compares search time and costs of four online searches, identically replicated except for the speed of the modem used to retrieve and download data. Results are presented for 1200-BPS (bits per second) and 2400-BPS modems which demonstrate that searching at 2400-BPS is more efficient and economical. Issues to be resolved before 9600-BPS modems…
The CEBAF Element Database and Related Operational Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larrieu, Theodore; Slominski, Christopher; Keesee, Marie
The newly commissioned 12GeV CEBAF accelerator relies on a flexible, scalable and comprehensive database to define the accelerator. This database delivers the configuration for CEBAF operational tools, including hardware checkout, the downloadable optics model, control screens, and much more. The presentation will describe the flexible design of the CEBAF Element Database (CED), its features and assorted use case examples.
ERIC Educational Resources Information Center
Kim, Ann
2006-01-01
This article discusses the possible answers to most librarians' questions and how advanced and hi-technology help them in the library world. Many of the answers to librarians' questions regarding digital media depend upon community/patron awareness and library resources. In this article, the author presents some quoted statements from electronic…
NASA Technical Reports Server (NTRS)
Jordon, D. E.; Patterson, W.; Sandlin, D. R.
1985-01-01
The XV-15 Tilt Rotor Research Aircraft download phenomenon was analyzed. This phenomenon is a direct result of the two rotor wakes impinging on the wing upper surface when the aircraft is in the hover configuration. For this study the analysis proceeded along two lines. First was a method whereby results from actual hover tests of the XV-15 aircraft were combined with drag coefficient results from wind tunnel tests of a wing that was representative of the aircraft wing. Second, an analytical method was used that modeled the airflow caused by the two rotors. Formulas were developed in such a way that a computer program could be used to calculate the axial velocities at the wing; these velocities were then used in conjunction with the aforementioned wind tunnel drag coefficient results to produce download values. An attempt was made to validate the analytical results by modeling a model rotor system for which direct download values were determined.
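As a back-of-the-envelope illustration of how a rotor-induced axial velocity and a wind-tunnel drag coefficient combine into a download estimate: the velocity sets a dynamic pressure, which is multiplied by the wing reference area and drag coefficient. All numerical values below are assumed for illustration and are not taken from the XV-15 study.

```python
# Back-of-the-envelope vertical download estimate (illustrative values only).
rho = 1.225     # air density, kg/m^3
v   = 25.0      # rotor-induced axial velocity at the wing, m/s (assumed)
S   = 15.0      # wing reference area, m^2 (assumed)
Cd  = 1.1       # vertical-flow drag coefficient from wind tunnel data (assumed)

download = 0.5 * rho * v**2 * S * Cd    # download force, newtons
print(f"Estimated download: {download / 1000:.1f} kN")
```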
NASA Astrophysics Data System (ADS)
Kurtz, Michael J.; Henneken, Edwin A.
2017-03-01
Citation measures, and newer altmetric measures such as downloads, are now commonly used to inform personnel decisions. How well do, or can, these measures measure or predict the past, current, or future scholarly performance of an individual? Using data from the Smithsonian/NASA Astrophysics Data System we analyze the publication, citation, download, and distinction histories of a cohort of 922 individuals who received a U.S. PhD in astronomy in the period 1972-1976. By examining the same and different measures at the same and different times for the same individuals we are able to show the capabilities and limitations of each measure. Because the distributions are lognormal, measurement uncertainties are multiplicative; we show that in order to state with 95% confidence that one person's citations and downloads are significantly higher than another person's, the log difference in the ratio of counts must be at least 0.3 dex, which corresponds to a multiplicative factor of 2.
Loomis, John; Koontz, Steve; Miller, Holly M.; Richardson, Leslie A.
2015-01-01
While the U.S. government does not charge for downloading Landsat images, the images have value to users. This paper demonstrates a method that can value Landsat and other imagery to users. A survey of downloaders of Landsat images found: (a) established US users have a mean value of $912 USD per scene; (b) new US users and users returning when imagery became free have a mean value of $367 USD per scene. Total US user benefits for the 2.38 million scenes downloaded is $1.8 billion USD. While these benefits indicate a high willingness-to-pay among many Landsat downloaders, it would be economically inefficient for the US government to charge for Landsat imagery. Charging a price of $100 USD a scene would result in an efficiency loss of $37.5 million a year. This economic information should be useful to policy-makers who must decide about the future of this and similar remote sensing programs.
Campagnola, Luke; Kratz, Megan B; Manis, Paul B
2014-01-01
The complexity of modern neurophysiology experiments requires specialized software to coordinate multiple acquisition devices and analyze the collected data. We have developed ACQ4, an open-source software platform for performing data acquisition and analysis in experimental neurophysiology. This software integrates the tasks of acquiring, managing, and analyzing experimental data. ACQ4 has been used primarily for standard patch-clamp electrophysiology, laser scanning photostimulation, multiphoton microscopy, intrinsic imaging, and calcium imaging. The system is highly modular, which facilitates the addition of new devices and functionality. The modules included with ACQ4 provide for rapid construction of acquisition protocols, live video display, and customizable analysis tools. Position-aware data collection allows automated construction of image mosaics and registration of images with 3-dimensional anatomical atlases. ACQ4 uses free and open-source tools including Python, NumPy/SciPy for numerical computation, PyQt for the user interface, and PyQtGraph for scientific graphics. Supported hardware includes cameras, patch clamp amplifiers, scanning mirrors, lasers, shutters, Pockels cells, motorized stages, and more. ACQ4 is available for download at http://www.acq4.org.
Dempsey, Patrick G; Pollard, Jonisha; Porter, William L; Mayton, Alan; Heberger, John R; Gallagher, Sean; Reardon, Leanna; Drury, Colin G
2017-12-01
The development and testing of ergonomics and safety audits for small and bulk bag filling, haul truck and maintenance and repair operations in coal preparation and mineral processing plants found at surface mine sites is described. The content for the audits was derived from diverse sources of information on ergonomics and safety deficiencies including: analysis of injury, illness and fatality data and reports; task analysis; empirical laboratory studies of particular tasks; field studies and observations at mine sites; and maintenance records. These diverse sources of information were utilised to establish construct validity of the modular audits that were developed for use by mine safety personnel. User and interrater reliability testing was carried out prior to finalising the audits. The audits can be implemented using downloadable paper versions or with a free mobile NIOSH-developed Android application called ErgoMine. Practitioner Summary: The methodology used to develop ergonomics audits for three types of mining operations is described. Various sources of audit content are compared and contrasted to serve as a guide for developing ergonomics audits for other occupational contexts.
Wang, Lei; Alpert, Kathryn I; Calhoun, Vince D; Cobia, Derin J; Keator, David B; King, Margaret D; Kogan, Alexandr; Landis, Drew; Tallis, Marcelo; Turner, Matthew D; Potkin, Steven G; Turner, Jessica A; Ambite, Jose Luis
2016-01-01
SchizConnect (www.schizconnect.org) is built to address the issues of multiple data repositories in schizophrenia neuroimaging studies. It includes a level of mediation--translating across data sources--so that the user can place one query, e.g. for diffusion images from male individuals with schizophrenia, and find out from across participating data sources how many datasets there are, as well as downloading the imaging and related data. The current version handles the Data Usage Agreements across different studies, as well as interpreting database-specific terminologies into a common framework. New data repositories can also be mediated to bring immediate access to existing datasets. Compared with centralized, upload data sharing models, SchizConnect is a unique, virtual database with a focus on schizophrenia and related disorders that can mediate live data as information is being updated at each data source. It is our hope that SchizConnect can facilitate testing new hypotheses through aggregated datasets, promoting discovery related to the mechanisms underlying schizophrenic dysfunction. Copyright © 2015 Elsevier Inc. All rights reserved.
Unifying cancer and normal RNA sequencing data from different sources
Wang, Qingguo; Armenia, Joshua; Zhang, Chao; Penson, Alexander V.; Reznik, Ed; Zhang, Liguo; Minet, Thais; Ochoa, Angelica; Gross, Benjamin E.; Iacobuzio-Donahue, Christine A.; Betel, Doron; Taylor, Barry S.; Gao, Jianjiong; Schultz, Nikolaus
2018-01-01
Driven by the recent advances of next generation sequencing (NGS) technologies and an urgent need to decode complex human diseases, a multitude of large-scale studies were conducted recently that have resulted in an unprecedented volume of whole transcriptome sequencing (RNA-seq) data, such as the Genotype Tissue Expression project (GTEx) and The Cancer Genome Atlas (TCGA). While these data offer new opportunities to identify the mechanisms underlying disease, the comparison of data from different sources remains challenging, due to differences in sample and data processing. Here, we developed a pipeline that processes and unifies RNA-seq data from different studies, which includes uniform realignment, gene expression quantification, and batch effect removal. We find that uniform alignment and quantification is not sufficient when combining RNA-seq data from different sources and that the removal of other batch effects is essential to facilitate data comparison. We have processed data from GTEx and TCGA and successfully corrected for study-specific biases, enabling comparative analysis between TCGA and GTEx. The normalized datasets are available for download on figshare. PMID:29664468
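A minimal sketch of one simple form of batch-effect removal, per-batch mean-centering of log-scale expression values, follows. It is a deliberately simplified stand-in for the correction step described above; the published pipeline's method is more sophisticated and is not reproduced here.

```python
import numpy as np

def center_by_batch(log_expr: np.ndarray, batches: np.ndarray) -> np.ndarray:
    """Remove per-batch mean shifts from a genes x samples matrix of log-scale
    expression values (simplified illustration of batch-effect removal)."""
    corrected = log_expr.astype(float).copy()
    grand_mean = corrected.mean(axis=1, keepdims=True)
    for batch in np.unique(batches):
        cols = batches == batch
        corrected[:, cols] -= corrected[:, cols].mean(axis=1, keepdims=True)
    return corrected + grand_mean   # restore each gene's overall mean level

# Toy example: 3 genes x 4 samples, two hypothetical batches.
expr = np.log2(np.array([[10, 12, 40, 44],
                         [ 5,  6, 20, 22],
                         [ 8,  9, 30, 33]]) + 1.0)
batches = np.array(["A", "A", "B", "B"])
print(center_by_batch(expr, batches))
```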
NASA Astrophysics Data System (ADS)
Blaschek, Michael; Gerken, Daniel; Ludwig, Ralf; Duttmann, Rainer
2015-04-01
Geoportals are important elements of spatial data infrastructures (SDIs) that are strongly based on GIS-related web services. These services are meant for distributing, documenting and visualizing (spatial) data in a standardized manner; an important but challenging task, especially in large scientific projects with a high number of data suppliers and producers from various countries. This presentation focuses on introducing the free, open-source geoportal solution developed within the research project CLIMB (Climate Induced Changes on the Hydrology of Mediterranean Basins, www.climb-fp7.eu) that serves as the central platform for interchanging project-related spatial data and information. In this collaboration, financed by the EU-FP7-framework and coordinated at the LMU Munich, 21 partner institutions from nine European and non-European countries were involved. The CLIMB Geoportal (lgi-climbsrv.geographie.uni-kiel.de) stores and provides spatially distributed data about the current state and future changes of the hydrological conditions within the seven CLIMB test sites around the Mediterranean. Hydrological modelling output, validated by the CLIMB partners, is offered to the public in the form of Web Map Services (WMS), whereas downloading the underlying data itself through Web Coverage Services (WCS) is possible for registered users only. A selection of common indicators such as discharge and drought index, as well as uncertainty measures, including their changes over time, is provided at different spatial resolutions. Besides map information, the portal enables the graphical display of time series of selected variables calculated by the individual models applied within the CLIMB project. The CLIMB Geoportal is implemented on version 2.0c5 of the open source geospatial content management system GeoNode. It includes a GeoServer instance for providing the OGC-compliant web services and comes with a metadata catalog (pycsw) as well as a built-in WebGIS client based on GeoExt (GeoExplorer). PostgreSQL enhanced by PostGIS in versions 9.2.1/2.0.1 serves as the database backend for all base data of the study sites and for the time series of relevant hydrological indicators. Spatial model results in raster format are stored file-based as GeoTIFFs. Due to the high number of model outputs, the generation of metadata (xml) and graphical rendering instructions (sld) associated with each single layer of the WMS has been done automatically using the statistical software R. Additional applications programmed during the project period include a Java-based interface for convenient download of the climate data initially needed as input for hydrological modeling, as well as a tool for displaying time series of selected risk indicators that is directly integrated into the portal structure and implemented using Python (Django) and JavaScript. The presented CLIMB Geoportal shows that relevant results of even large international research projects involving many partners and varying national standards in data handling can be effectively disseminated to stakeholders, policy makers and other interested parties. Thus, it is a successful example of using free and open-source software for providing long-term visibility and access to data produced within a particular (environmental) research project.
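As a small sketch of how a registered user might pull one of the raster layers through a WCS interface using the OWSLib Python package: the endpoint URL, coverage identifier, bounding box, and resolution below are hypothetical, and a real request would use the portal's advertised service URL and layer names.

```python
from owslib.wcs import WebCoverageService

# Hypothetical WCS endpoint and coverage identifier, for illustration only.
wcs = WebCoverageService("https://example.org/geoserver/wcs", version="1.0.0")

coverage = wcs.getCoverage(
    identifier="climb:drought_index_2050",
    bbox=(8.0, 38.5, 10.0, 41.5),        # lon/lat extent of a test site (assumed)
    crs="EPSG:4326",
    format="GeoTIFF",
    resx=0.01, resy=0.01,
)

with open("drought_index_2050.tif", "wb") as f:
    f.write(coverage.read())
```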
The National Informal STEM Education Network
The National Informal STEM Education Network offers evaluation and research kits for download, including the Explore Science: Earth & Space toolkit and the Building with Biology activity kit, which provides conversations and hands-on activities about synthetic biology, an emerging field. The 2018 digital toolkits are available for download.
Codec-on-Demand Based on User-Level Virtualization
NASA Astrophysics Data System (ADS)
Zhang, Youhui; Zheng, Weimin
At work, at home, and in some public places, a desktop PC is usually available nowadays. Therefore, it is important for users to be able to play various videos on different PCs smoothly, but the diversity of codec types complicates the situation. Although some mainstream media players can try to download the needed codec automatically, this may fail for average users because installing the codec usually requires administrator privileges to complete, while the user may not be the owner of the PC. We believe an ideal solution should work without user intervention and require no special privileges. This paper proposes such a user-friendly, program-transparent solution for Windows-based media players. It runs the media player in a user-mode virtualization environment, and then downloads the needed codec on-the-fly. Because of API (Application Programming Interface) interception, some resource-accessing API calls from the player are redirected to the downloaded codec resources. From the viewpoint of the player, the necessary codec exists locally and it can handle the video smoothly, although neither the system registry nor system folders are modified during this process. Besides convenience, the principle of least privilege is maintained and the host system is left clean. This paper analyzes the technical issues in detail and presents a prototype that works with DirectShow-compatible players. Performance tests show that the overhead is negligible. Moreover, our solution conforms to the Software-as-a-Service (SaaS) model, which is very promising in the Internet era.
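The Windows API interception the paper describes cannot be reproduced in a few lines, but the redirection idea can be sketched at user level in Python by wrapping a call the application already makes. Everything below (cache path, codec repository URL, ".codec" extension) is hypothetical, and this is a conceptual analogy rather than the paper's DirectShow mechanism.

```python
import builtins
import os
import urllib.request

# Hypothetical per-user cache and codec repository.
CODEC_CACHE = os.path.expanduser("~/.codec_cache")
CODEC_REPO = "https://example.org/codecs/"

_real_open = builtins.open

def _redirecting_open(path, *args, **kwargs):
    """If a codec file is missing locally, fetch it into a per-user cache and
    transparently redirect the call there, with no administrator rights needed."""
    name = os.fspath(path)
    if name.endswith(".codec") and not os.path.exists(name):
        os.makedirs(CODEC_CACHE, exist_ok=True)
        cached = os.path.join(CODEC_CACHE, os.path.basename(name))
        if not os.path.exists(cached):
            urllib.request.urlretrieve(CODEC_REPO + os.path.basename(name), cached)
        name = cached
    return _real_open(name, *args, **kwargs)

builtins.open = _redirecting_open   # every subsequent open() goes through the wrapper
```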
The SOOS Data Portal, providing access to Southern Oceans data
NASA Astrophysics Data System (ADS)
Proctor, Roger; Finney, Kim; Blain, Peter; Taylor, Fiona; Newman, Louise; Meredith, Mike; Schofield, Oscar
2013-04-01
The Southern Ocean Observing System (SOOS) is an international initiative to enhance, coordinate and expand the strategic observations of the Southern Oceans that are required to address key scientific and societal challenges. A key component of SOOS will be the creation and maintenance of a Southern Ocean Data Portal to provide improved access to historical and ongoing data (Schofield et al., 2012, Eos, Vol. 93, No. 26, pp 241-243). The scale of this effort will require strong leveraging of existing data centres, new cyberinfrastructure development efforts, and defined data collection, quality control, and archiving procedures across the international community. The task of assembling the SOOS data portal is assigned to the SOOS Data Management Sub-Committee. The information infrastructure chosen for the SOOS data portal is based on the Australian Ocean Data Network (AODN, http://portal.aodn.org.au). The AODN infrastructure is built on open-source tools and the use of international standards ensures efficiency of data exchange and interoperability between contributing systems. OGC standard web services protocols are used for serving of data via the internet. These include Web Map Service (WMS) for visualisation, Web Feature Service (WFS) for data download, and Catalogue Service for Web (CSW) for catalogue exchange. The portal offers a number of tools to access and visualize data: - a Search link to the metadata catalogue enables search and discovery by simple text search, by geographic area, temporal extent, keyword, parameter, organisation, or by any combination of these, allowing users to gain access to further information and/or the data for download. Also, searches can be restricted to items which have either data to download, or attached map layers, or both - a Map interface for discovery and display of data, with the ability to change the style and opacity of layers, add additional data layers via OGC Web Map Services, view animated timeseries datastreams - data can be easily accessed and downloaded including directly from OPeNDAP/THREDDS servers. The SOOS data portal (http://soos.aodn.org.au/soos) aims to make access to Southern Ocean data a simple process and the initial layout classifies data into six themes - Heat and Freshwater; Circulation; Ice-sheets and Sea level; Carbon; Sea-ice; and Ecosystems, with the ability to integrate layers between themes. The portal is in its infancy (pilot launched January 2013) with a limited number of datasets available; however, the number of datasets is expected to grow rapidly as the international community becomes fully engaged.
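As a small sketch of how OGC services of this kind can be consumed programmatically with the OWSLib Python package: the endpoint URL and layer name below are hypothetical, and the portal's catalogue (CSW) is the authoritative place to discover the real service URLs and layer identifiers.

```python
from owslib.wms import WebMapService

# Hypothetical WMS endpoint and layer name, for illustration only.
wms = WebMapService("https://example.org/geoserver/wms", version="1.1.1")

img = wms.getmap(
    layers=["soos:sea_ice_concentration"],
    srs="EPSG:4326",
    bbox=(-180.0, -90.0, 180.0, -40.0),   # Southern Ocean extent
    size=(1024, 512),
    format="image/png",
    transparent=True,
)

with open("sea_ice_concentration.png", "wb") as f:
    f.write(img.read())
```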
Kirsch, Thomas D; Circh, Ryan; Bissell, Richard A; Goldfeder, Matthew
2016-10-01
Personal preparedness is a core activity but has been found to be frequently inadequate. Smart phone applications have many uses for the public, including preparedness. In 2012 the American Red Cross began releasing "disaster" apps for family preparedness and recovery. The Hurricane App was widely used during Hurricane Sandy in 2012. Patterns of download of the application were analyzed by using a download tracking tool by the American Red Cross and Google Analytics. Specific variables included date, time, and location of individual downloads; number of page visits and views; and average time spent on pages. As Hurricane Sandy approached in late October, daily downloads peaked at 152,258 on the day of landfall and by mid-November reached 697,585. Total page views began increasing on October 25 with over 4,000,000 page views during landfall compared to 3.7 million the first 3 weeks of October with a 43,980% increase in views of the "Right Before" page and a 76,275% increase in views of the "During" page. The Hurricane App offered a new type of "just-in-time" training that reached tens of thousands of families in areas affected by Hurricane Sandy. The app allowed these families to access real-time information before and after the storm to help them prepare and recover. (Disaster Med Public Health Preparedness. 2016;page 1 of 6).
WILBER and PyWEED: Event-based Seismic Data Request Tools
NASA Astrophysics Data System (ADS)
Falco, N.; Clark, A.; Trabant, C. M.
2017-12-01
WILBER and PyWEED are two user-friendly tools for requesting event-oriented seismic data. Both tools provide interactive maps and other controls for browsing and filtering event and station catalogs, and downloading data for selected event/station combinations, where the data window for each event/station pair may be defined relative to the arrival time of seismic waves from the event to that particular station. Both tools allow data to be previewed visually, and can download data in standard miniSEED, SAC, and other formats, complete with relevant metadata for performing instrument correction. WILBER is a web application requiring only a modern web browser. Once the user has selected an event, WILBER identifies all data available for that time period, and allows the user to select stations based on criteria such as the station's distance and orientation relative to the event. When the user has finalized their request, the data is collected and packaged on the IRIS server, and when it is ready the user is sent a link to download. PyWEED is a downloadable, cross-platform (Macintosh / Windows / Linux) application written in Python. PyWEED allows a user to select multiple events and stations, and will download data for each event/station combination selected. PyWEED is built around the ObsPy seismic toolkit, and allows direct interaction and control of the application through a Python interactive console.
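Since PyWEED is built on ObsPy, an equivalent scripted request against the IRIS FDSN web services looks roughly like the following. The origin time and station are arbitrary examples, not values taken from the abstract.

```python
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

# Scripted event/station data request (example values only).
client = Client("IRIS")
origin = UTCDateTime("2017-09-08T04:49:19")

stream = client.get_waveforms(
    network="IU", station="ANMO", location="00", channel="BHZ",
    starttime=origin, endtime=origin + 3600,
    attach_response=True,
)
stream.remove_response(output="VEL")    # instrument correction from attached metadata
stream.write("ANMO_BHZ.mseed", format="MSEED")
```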
Taiwan Ascii and Idl_save Data Archives (AIDA) for THEMIS
NASA Astrophysics Data System (ADS)
Lee, B.; Hsieh, W.; Shue, J.; Angelopoulos, V.; Glassmeier, K. H.; McFadden, J. P.; Larson, D.
2008-12-01
THEMIS (Time History of Events and their Macroscopic Interactions during Substorms) is a satellite mission that aims to determine where and how substorms are triggered. The space research team in Taiwan has been involved in data promotion and scientific research. Taiwan Ascii and Idl_save Data Archives (AIDA) for THEMIS is the main work of the data promotion. Taiwan AIDA is developed for those who are not familiar with the Interactive Data Language (IDL) data analysis and visualization software, and those who have some basic IDL concepts and techniques and want more flexibilities in reading and plotting the THEMIS data. Two kinds of data format are stored in Taiwan AIDA: one is ASCII format for most users and the other is IDL SAVE format for IDL users. The public can download THEMIS data in either format through the Taiwan AIDA web site, http://themis.ss.ncu.edu.tw/e_data_download.php. Taiwan AIDA provides (1) plasma data including number density, average temperature, and velocity of ions and electrons, (2) magnetic field data, and (3) state information including the position and velocity of five THEMIS probes. On the Taiwan AIDA web site there are two data-downloading options. The public can download a large amount of data for a particular instrument in the FTP equivalent option; the public can also download all the data for a particular date in the Data Search option.
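IDL SAVE files downloaded from an archive like AIDA can also be read without an IDL license using SciPy's readsav. The file and variable names below are hypothetical and may differ from the actual AIDA archive contents.

```python
from scipy.io import readsav

# Hypothetical IDL SAVE file of fluxgate magnetometer data.
data = readsav("tha_fgm_20080301.sav")

print(sorted(data.keys()))      # e.g. ['bx', 'by', 'bz', 'time'] (assumed names)
bz = data["bz"]
print(bz.shape, bz.mean())
```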
NASA Astrophysics Data System (ADS)
Takahashi, Ryuichi; Hamana, Takashi; Shirasaki, Masato; Namikawa, Toshiya; Nishimichi, Takahiro; Osato, Ken; Shiroyama, Kosei
2017-11-01
We present 108 full-sky gravitational lensing simulation data sets generated by performing multiple-lens plane ray-tracing through high-resolution cosmological N-body simulations. The data sets include full-sky convergence and shear maps from redshifts z = 0.05 to 5.3 at intervals of 150 h⁻¹ Mpc comoving radial distance (corresponding to a redshift interval of Δz ≃ 0.05 in the nearby universe), enabling the construction of a mock shear catalog for an arbitrary source distribution up to z = 5.3. The dark matter halos are identified from the same N-body simulations with enough mass resolution to resolve the host halos of the Sloan Digital Sky Survey (SDSS) CMASS and luminous red galaxies (LRGs). Angular positions and redshifts of the halos are provided by a ray-tracing calculation, enabling the creation of a mock halo catalog to be used for galaxy-galaxy and cluster-galaxy lensing. The simulation also yields maps of gravitational lensing deflections for a source redshift at the last scattering surface, and we provide 108 realizations of lensed cosmic microwave background (CMB) maps in which the post-Born corrections caused by multiple light scattering are included. We present basic statistics of the simulation data, including the angular power spectra of cosmic shear, CMB temperature and polarization anisotropies, galaxy-galaxy lensing signals for halos, and their covariances. The angular power spectra of the cosmic shear and CMB anisotropies agree with theoretical predictions within 5% up to ℓ = 3000 (or at an angular scale θ > 0.5 arcmin). The simulation data sets are generated primarily for the ongoing Subaru Hyper Suprime-Cam survey, but are freely available for download at http://cosmo.phys.hirosaki-u.ac.jp/takahasi/allsky_raytracing/.
Takahashi, Ryuichi; Hamana, Takashi; Shirasaki, Masato; ...
2017-11-14
We present 108 full-sky gravitational lensing simulation data sets generated by performing multiple-lens plane ray-tracing through high-resolution cosmological N-body simulations. The data sets include full-sky convergence and shear maps from redshifts z = 0.05 to 5.3 at intervals of 150 h⁻¹ Mpc comoving radial distance (corresponding to a redshift interval of Δz ≃ 0.05 in the nearby universe), enabling the construction of a mock shear catalog for an arbitrary source distribution up to z = 5.3. The dark matter halos are identified from the same N-body simulations with enough mass resolution to resolve the host halos of the Sloan Digital Sky Survey (SDSS) CMASS and luminous red galaxies (LRGs). Angular positions and redshifts of the halos are provided by a ray-tracing calculation, enabling the creation of a mock halo catalog to be used for galaxy-galaxy and cluster-galaxy lensing. The simulation also yields maps of gravitational lensing deflections for a source redshift at the last scattering surface, and we provide 108 realizations of lensed cosmic microwave background (CMB) maps in which the post-Born corrections caused by multiple light scattering are included. We present basic statistics of the simulation data, including the angular power spectra of cosmic shear, CMB temperature and polarization anisotropies, galaxy-galaxy lensing signals for halos, and their covariances. The angular power spectra of the cosmic shear and CMB anisotropies agree with theoretical predictions within 5% up to ℓ = 3000 (or at an angular scale θ > 0.5 arcmin). The simulation data sets are generated primarily for the ongoing Subaru Hyper Suprime-Cam survey, but are freely available for download at http://cosmo.phys.hirosaki-u.ac.jp/takahasi/allsky_raytracing/.
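Once one of the released convergence maps has been converted to HEALPix FITS format, its angular power spectrum can be estimated with healpy. The file name below is a placeholder and the conversion from the archive's own binary format is not shown; both are assumptions for illustration.

```python
import numpy as np
import healpy as hp

# Hypothetical HEALPix FITS file of a full-sky convergence map.
kappa = hp.read_map("allsky_convergence_zs16.fits")

cl = hp.anafast(kappa, lmax=3000)        # convergence angular power spectrum
ell = np.arange(cl.size)
print(ell[1000], cl[1000])
```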
Cloud-based Jupyter Notebooks for Water Data Analysis
NASA Astrophysics Data System (ADS)
Castronova, A. M.; Brazil, L.; Seul, M.
2017-12-01
The development and adoption of technologies by the water science community to improve our ability to openly collaborate and share workflows will have a transformative impact on how we address the challenges associated with collaborative and reproducible scientific research. Jupyter notebooks offer one solution by providing an open-source platform for creating metadata-rich toolchains for modeling and data analysis applications. Adoption of this technology within the water sciences, coupled with publicly available datasets from agencies such as USGS, NASA, and EPA enables researchers to easily prototype and execute data intensive toolchains. Moreover, implementing this software stack in a cloud-based environment extends its native functionality to provide researchers a mechanism to build and execute toolchains that are too large or computationally demanding for typical desktop computers. Additionally, this cloud-based solution enables scientists to disseminate data processing routines alongside journal publications in an effort to support reproducibility. For example, these data collection and analysis toolchains can be shared, archived, and published using the HydroShare platform or downloaded and executed locally to reproduce scientific analysis. This work presents the design and implementation of a cloud-based Jupyter environment and its application for collecting, aggregating, and munging various datasets in a transparent, sharable, and self-documented manner. The goals of this work are to establish a free and open source platform for domain scientists to (1) conduct data intensive and computationally intensive collaborative research, (2) utilize high performance libraries, models, and routines within a pre-configured cloud environment, and (3) enable dissemination of research products. This presentation will discuss recent efforts towards achieving these goals, and describe the architectural design of the notebook server in an effort to support collaborative and reproducible science.
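A minimal notebook-style sketch of the kind of data collection and munging toolchain described above, using requests and pandas, follows. The URL and column names are hypothetical placeholders standing in for a public agency data service; a real notebook would point at an actual USGS, EPA, or NASA endpoint.

```python
import io

import pandas as pd
import requests

# Hypothetical CSV endpoint and column names, for illustration only.
url = "https://example.org/streamflow/site_01646500.csv"
raw = requests.get(url, timeout=60).text

df = pd.read_csv(io.StringIO(raw), parse_dates=["datetime"])
daily = df.set_index("datetime")["discharge_cfs"].resample("D").mean()
print(daily.describe())
```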
High temporal resolution mapping of seismic noise sources using heterogeneous supercomputers
NASA Astrophysics Data System (ADS)
Gokhberg, Alexey; Ermert, Laura; Paitz, Patrick; Fichtner, Andreas
2017-04-01
Time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems. Significant interest in seismic noise source maps with high temporal resolution (days) is expected to come from a number of domains, including natural resources exploration, analysis of active earthquake fault zones and volcanoes, as well as geothermal and hydrocarbon reservoir monitoring. Currently, knowledge of noise sources is insufficient for high-resolution subsurface monitoring applications. Near-real-time seismic data, as well as advanced imaging methods to constrain seismic noise sources, have recently become available. These methods are based on the massive cross-correlation of seismic noise records from all available seismic stations in the region of interest and are therefore very computationally intensive. Heterogeneous massively parallel supercomputing systems introduced in recent years combine conventional multi-core CPUs with GPU accelerators and provide an opportunity for a manifold increase in computing performance. Therefore, these systems represent an efficient platform for implementation of a noise source mapping solution. We present the first results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service that provides seismic noise source maps for Central Europe with high temporal resolution (days to a few weeks depending on frequency and data availability). The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept in order to provide interested external researchers regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for collecting, on a periodic basis, raw seismic records from the European seismic networks, (2) high-performance noise source mapping application responsible for generation of source maps using cross-correlation of seismic records, (3) back-end infrastructure for the coordination of various tasks and computations, (4) front-end Web interface providing the service to the end-users and (5) data repository. The noise mapping application is composed of four principal modules: (1) pre-processing of raw data, (2) massive cross-correlation, (3) post-processing of correlation data based on computation of logarithmic energy ratio and (4) generation of source maps from post-processed data. Implementation of the solution posed various challenges, in particular, selection of data sources and transfer protocols, automation and monitoring of daily data downloads, ensuring the required data processing performance, design of a general service-oriented architecture for coordination of various sub-systems, and engineering an appropriate data storage solution. The present pilot version of the service implements noise source maps for Switzerland. Extension of the solution to Central Europe is planned for the next project phase.
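The core of module (2) is pairwise cross-correlation of pre-processed station records. A bare-bones NumPy/SciPy version of that step, without the pre-processing, daily stacking, and energy-ratio post-processing the project actually performs, might look like this:

```python
import numpy as np
from scipy.signal import fftconvolve

def noise_cross_correlation(trace_a: np.ndarray, trace_b: np.ndarray,
                            max_lag: int) -> np.ndarray:
    """Cross-correlate two equally sampled, pre-processed noise records and
    return the correlation for lags within +/- max_lag samples."""
    a = (trace_a - trace_a.mean()) / (trace_a.std() or 1.0)
    b = (trace_b - trace_b.mean()) / (trace_b.std() or 1.0)
    full = fftconvolve(a, b[::-1], mode="full")     # correlation via convolution
    mid = len(b) - 1                                # zero-lag index
    return full[mid - max_lag: mid + max_lag + 1] / len(a)

# Toy example with a synthetic record correlated against a shifted copy.
rng = np.random.default_rng(0)
sig = rng.standard_normal(86_400)
cc = noise_cross_correlation(sig, np.roll(sig, 120), max_lag=600)
print(int(np.argmax(cc)) - 600)   # recovered lag (about -120 samples)
```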
NESC can help: call NESC toll free at (304) 293-4191 or e-mail info@mail.nesc.wvu.edu and ask about different situations. The Onsite Technologies for Small Communities poster is available from NESC as a free download; hard copies can also be ordered, with charges applying to printed copies only.
DOE Research and Development Accomplishments Help
The DOE Research and Development Accomplishments Help pages describe how to search, locate, access, and electronically download full-text research and development (R&D) documents. Topics covered include browsing; downloading, viewing, and searching full-text documents and pages; and database search features. Search allows you to query the OCRed full-text documents as well as the bibliographic information.
ERIC Educational Resources Information Center
Siemens, Jennifer Christie; Kopp, Steven W.
2006-01-01
Universities have become sensitized to the potential for students' illegal downloading of copyrighted materials. Education has been advocated as one way to curb downloading of copyrighted digital content. This study investigates the effectiveness of a university-sponsored computing ethics education program. The program positively influenced…
Medical Device Plug-and-Play Interoperability Standards and Technology Leadership
2014-10-01
References an FDA CDRH workshop report (downloads/AboutFDA/CentersOffices/OfficeofMedicalProductsandTobacco/CDRH/CDRHReports/UCM391521.pdf). Dr. Goldman spoke in multiple panels at this workshop. Related authors: Arney D, Plourde J, Schrenker R, Mattegunta P.
An interactive web application for the dissemination of human systems immunology data.
Speake, Cate; Presnell, Scott; Domico, Kelly; Zeitner, Brad; Bjork, Anna; Anderson, David; Mason, Michael J; Whalen, Elizabeth; Vargas, Olivia; Popov, Dimitry; Rinchai, Darawan; Jourde-Chiche, Noemie; Chiche, Laurent; Quinn, Charlie; Chaussabel, Damien
2015-06-19
Systems immunology approaches have proven invaluable in translational research settings. The current rate at which large-scale datasets are generated presents unique challenges and opportunities. Mining aggregates of these datasets could accelerate the pace of discovery, but new solutions are needed to integrate the heterogeneous data types with the contextual information that is necessary for interpretation. In addition, enabling tools and technologies facilitating investigators' interaction with large-scale datasets must be developed in order to promote insight and foster knowledge discovery. State of the art application programming was employed to develop an interactive web application for browsing and visualizing large and complex datasets. A collection of human immune transcriptome datasets were loaded alongside contextual information about the samples. We provide a resource enabling interactive query and navigation of transcriptome datasets relevant to human immunology research. Detailed information about studies and samples are displayed dynamically; if desired the associated data can be downloaded. Custom interactive visualizations of the data can be shared via email or social media. This application can be used to browse context-rich systems-scale data within and across systems immunology studies. This resource is publicly available online at [Gene Expression Browser Landing Page ( https://gxb.benaroyaresearch.org/dm3/landing.gsp )]. The source code is also available openly [Gene Expression Browser Source Code ( https://github.com/BenaroyaResearch/gxbrowser )]. We have developed a data browsing and visualization application capable of navigating increasingly large and complex datasets generated in the context of immunological studies. This intuitive tool ensures that, whether taken individually or as a whole, such datasets generated at great effort and expense remain interpretable and a ready source of insight for years to come.
Slab1.0: A three-dimensional model of global subduction zone geometries
NASA Astrophysics Data System (ADS)
Hayes, Gavin P.; Wald, David J.; Johnson, Rebecca L.
2012-01-01
We describe and present a new model of global subduction zone geometries, called Slab1.0. An extension of previous efforts to constrain the two-dimensional non-planar geometry of subduction zones around the focus of large earthquakes, Slab1.0 describes the detailed, non-planar, three-dimensional geometry of approximately 85% of subduction zones worldwide. While the model focuses on the detailed form of each slab from their trenches through the seismogenic zone, where it combines data sets from active source and passive seismology, it also continues to the limits of their seismic extent in the upper-mid mantle, providing a uniform approach to the definition of the entire seismically active slab geometry. Examples are shown for two well-constrained global locations; models for many other regions are available and can be freely downloaded in several formats from our new Slab1.0 website, http://on.doi.gov/d9ARbS. We describe improvements in our two-dimensional geometry constraint inversion, including the use of `average' active source seismic data profiles in the shallow trench regions where data are otherwise lacking, derived from the interpolation between other active source seismic data along-strike in the same subduction zone. We include several analyses of the uncertainty and robustness of our three-dimensional interpolation methods. In addition, we use the filtered, subduction-related earthquake data sets compiled to build Slab1.0 in a reassessment of previous analyses of the deep limit of the thrust interface seismogenic zone for all subduction zones included in our global model thus far, concluding that the width of these seismogenic zones is on average 30% larger than previous studies have suggested.